Sample records for remote sensing camera

  1. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    USDA-ARS's Scientific Manuscript database

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications has not been well documented in related ...

  2. The integrated design and archive of space-borne signal processing and compression coding

    NASA Astrophysics Data System (ADS)

    He, Qiang-min; Su, Hao-hang; Wu, Wen-bo

    2017-10-01

    With the increasing demand from users for the extraction of remote sensing image information, there is an urgent need to significantly enhance the whole system's imaging quality and imaging capability through an integrated design that achieves a compact structure, low mass and higher attitude maneuverability. At present, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed across different devices. The volume, weight and power consumption of these two units are relatively large, which cannot meet the requirements of a high-mobility remote sensing camera. Based on the technical requirements of the high-mobility remote sensing camera, this paper designs a space-borne integrated signal processing and compression circuit by drawing on several technologies, such as high-speed, high-density analog-digital mixed PCB design, embedded DSP technology and image compression based on special-purpose chips. This circuit lays a solid foundation for research on high-mobility remote sensing cameras.

  3. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    USDA-ARS's Scientific Manuscript database

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  4. Optical registration of spaceborne low light remote sensing camera

    NASA Astrophysics Data System (ADS)

    Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long

    2018-02-01

    To meet the high-precision requirement for optical registration of a spaceborne low-light remote sensing camera, dual-channel optical registration of the CCD and EMCCD is achieved with a high-magnification optical registration system. A system-integration optical registration scheme, together with an analysis of registration accuracy, is proposed for a spaceborne low-light remote sensing camera with short focal depth and wide field of view, including an analysis of the parallel misalignment of the CCD. Actual registration results show that the imaging is clear and that the MTF and the registration accuracy meet requirements, providing an important guarantee for obtaining high-quality image data in orbit.

  5. Remote Sensing Simulation Activities for Earthlings

    ERIC Educational Resources Information Center

    Krockover, Gerald H.; Odden, Thomas D.

    1977-01-01

    Suggested are activities using a Polaroid camera to illustrate the capabilities of remote sensing, along with reading materials from the National Aeronautics and Space Administration (NASA). Methods for (1) finding a camera's focal length, (2) calculating ground dimensions in photograph simulations, and (3) determining limiting size using film resolution are…
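
    For reference, the ground dimension covered in such a photograph simulation follows from the standard photographic scale relation; the symbols and the worked numbers below are generic textbook quantities, not values taken from the article.

    ```latex
    % S = photo scale, f = camera focal length, H = height above the scene,
    % d = dimension measured on the photograph, D = corresponding ground dimension.
    S = \frac{f}{H}, \qquad D = \frac{d}{S} = d\,\frac{H}{f}
    % Example: f = 0.1 m, H = 500 m, d = 0.01 m  =>  D = 0.01 \times 500 / 0.1 = 50 m.
    ```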

  6. Photogrammetry and Remote Sensing: New German Standards (din) Setting Quality Requirements of Products Generated by Digital Cameras, Pan-Sharpening and Classification

    NASA Astrophysics Data System (ADS)

    Reulke, R.; Baltrusch, S.; Brunn, A.; Komp, K.; Kresse, W.; von Schönermark, M.; Spreckels, V.

    2012-08-01

    10 years after the first introduction of a digital airborne mapping camera at the ISPRS Congress 2000 in Amsterdam, several digital cameras are now available. They are well established in the market and have replaced the analogue camera. A general improvement in image quality accompanied the digital camera development. The signal-to-noise ratio and the dynamic range are significantly better than with the analogue cameras. In addition, digital cameras can be spectrally and radiometrically calibrated. The use of these cameras required a rethinking in many places though. New data products were introduced. In recent years, several activities took place that should lead to a better understanding of the cameras and the data produced by them. Several projects, like the projects of the German Society for Photogrammetry, Remote Sensing and Geoinformation (DGPF) or EuroSDR (European Spatial Data Research), were conducted to test and compare the performance of the different cameras. In this paper the current DIN (Deutsches Institut fuer Normung - German Institute for Standardization) standards will be presented. These include the standard for digital cameras, the standard for ortho rectification, the standard for classification, and the standard for pan-sharpening. In addition, standards for the derivation of elevation models, the use of Radar / SAR, and image quality are in preparation. The OGC has indicated its interest in participating in that development. The OGC has already published specifications in the field of photogrammetry and remote sensing. One goal of future joint work could be to merge these formerly independent developments and to jointly develop a suite of implementation specifications for photogrammetry and remote sensing.

  7. REMOTE SENSING IN OCEANOGRAPHY.

    DTIC Science & Technology

    remote sensing from satellites. Sensing of oceanographic variables from aircraft began with the photographing of waves and ice. Since then, remote measurements of sea surface temperature and wave height have become routine. Sensors tested for oceanographic applications include multi-band color cameras, radar scatterometers, infrared spectrometers and scanners, passive microwave radiometers, and radar imagers. Remote sensing has found its greatest application in providing rapid coverage of large oceanographic areas for synoptic analysis and ...

  8. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

    Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight and power constraints suitable for small scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAV) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote controlled plane. It includes an onboard computer with wireless radio, GPS, an inertial measurement unit, a 3-axis electronic compass and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces; as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
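
    As a minimal illustration of the kind of product such a calibrated red/NIR camera pair enables, the sketch below computes a per-pixel NDVI; the array names and example values are hypothetical and are not taken from the testbed software.

    ```python
    import numpy as np

    def ndvi(red: np.ndarray, nir: np.ndarray, eps: float = 1e-6) -> np.ndarray:
        """Normalized Difference Vegetation Index from co-registered red and NIR bands.

        Both inputs are assumed to be radiometrically calibrated (or at least
        linearly related to reflectance) and aligned pixel-for-pixel.
        """
        red = red.astype(np.float64)
        nir = nir.astype(np.float64)
        return (nir - red) / (nir + red + eps)  # eps avoids division by zero

    # Hypothetical 2x2 example: vegetation pixels vs. bare-soil pixels
    red = np.array([[0.05, 0.30], [0.06, 0.28]])
    nir = np.array([[0.45, 0.35], [0.50, 0.33]])
    print(ndvi(red, nir))  # vegetation approaches ~0.8, soil stays near ~0.1
    ```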

  9. Investigation of fugitive emissions from petrochemical transport barges using optical remote sensing

    EPA Science Inventory

    Recent airborne remote sensing survey data acquired with passive gas imaging equipment (PGIE), in this case infrared cameras, have shown potentially significant fugitive volatile organic compound (VOC) emissions from petrochemical transport barges. The experiment found remote sens...

  10. Bundle block adjustment of large-scale remote sensing data with Block-based Sparse Matrix Compression combined with Preconditioned Conjugate Gradient

    NASA Astrophysics Data System (ADS)

    Zheng, Maoteng; Zhang, Yongjun; Zhou, Shunping; Zhu, Junfeng; Xiong, Xiaodong

    2016-07-01

    In recent years, new platforms and sensors in the photogrammetry, remote sensing and computer vision areas have become available, such as Unmanned Aerial Vehicles (UAV), oblique camera systems, common digital cameras and even mobile phone cameras. Images collected by all these kinds of sensors could be used as remote sensing data sources. These sensors can obtain large-scale remote sensing data which consist of a great number of images. Bundle block adjustment of large-scale data with conventional algorithms is very time- and memory-consuming due to the extremely large normal matrix arising from such data. In this paper, an efficient Block-based Sparse Matrix Compression (BSMC) method combined with the Preconditioned Conjugate Gradient (PCG) algorithm is used to develop a stable and efficient bundle block adjustment system in order to deal with large-scale remote sensing data. The main contribution of this work is the BSMC-based PCG algorithm, which is more efficient in time and memory than the traditional algorithm without compromising accuracy. A total of 8 real datasets are used to test the proposed method. Preliminary results have shown that the BSMC method can efficiently decrease the time and memory requirements of large-scale data.
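
    The BSMC storage scheme itself is specific to the paper, but the preconditioned conjugate gradient solver it feeds is standard. Below is a minimal, generic PCG sketch for the reduced normal equations N x = b with a block-Jacobi (block-diagonal) preconditioner; the block layout, names and the synthetic test system are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def block_jacobi_preconditioner(N: np.ndarray, block_size: int):
        """Return a function applying M^-1, where M is the block diagonal of N."""
        n = N.shape[0]
        inv_blocks = [np.linalg.inv(N[s:min(s + block_size, n), s:min(s + block_size, n)])
                      for s in range(0, n, block_size)]

        def apply(r: np.ndarray) -> np.ndarray:
            z = np.empty_like(r)
            for i, s in enumerate(range(0, n, block_size)):
                e = min(s + block_size, n)
                z[s:e] = inv_blocks[i] @ r[s:e]
            return z

        return apply

    def pcg(N, b, apply_Minv, tol=1e-10, max_iter=500):
        """Preconditioned conjugate gradient for the SPD system N x = b."""
        x = np.zeros_like(b)
        r = b - N @ x
        z = apply_Minv(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Np = N @ p
            alpha = rz / (p @ Np)
            x += alpha * p
            r -= alpha * Np
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = apply_Minv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Tiny synthetic SPD system standing in for a reduced normal matrix
    rng = np.random.default_rng(0)
    A = rng.standard_normal((12, 12))
    N = A @ A.T + 12 * np.eye(12)          # symmetric positive definite
    b = rng.standard_normal(12)
    x = pcg(N, b, block_jacobi_preconditioner(N, block_size=3))
    print(np.allclose(N @ x, b, atol=1e-6))  # True
    ```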

  11. Top of Mars Rover Curiosity Remote Sensing Mast

    NASA Image and Video Library

    2011-04-06

    The remote sensing mast on NASA's Mars rover Curiosity holds two science instruments for studying the rover's surroundings and two stereo navigation cameras used for driving the rover and planning rover activities.

  12. Wageningen UR Unmanned Aerial Remote Sensing Facility - Overview of activities

    NASA Astrophysics Data System (ADS)

    Bartholomeus, Harm; Keesstra, Saskia; Kooistra, Lammert; Suomalainen, Juha; Mucher, Sander; Kramer, Henk; Franke, Jappe

    2016-04-01

    To support environmental management there is an increasing need for timely, accurate and detailed information on our land. Unmanned Aerial Systems (UAS) are increasingly used to monitor agricultural crop development, habitat quality or urban heat efficiency. An important reason is that UAS technology is maturing quickly, while the flexible capabilities of UAS fill a gap between satellite-based and ground-based geo-sensing systems. In 2012, different groups within Wageningen University and Research Centre established an Unmanned Airborne Remote Sensing Facility. The objective of this facility is threefold: a) to develop innovation in the field of remote sensing science by providing a platform for dedicated and high-quality experiments; b) to support high-quality UAS services by providing calibration facilities and disseminating processing procedures to the UAS user community; and c) to promote and test the use of UAS in a broad range of application fields like habitat monitoring, precision agriculture and land degradation assessment. The facility is hosted by the Laboratory of Geo-Information Science and Remote Sensing (GRS) and the Department of Soil Physics and Land Management (SLM) of Wageningen University together with the team Earth Informatics (EI) of Alterra. The added value of the Unmanned Aerial Remote Sensing Facility is that, compared to for example satellite-based remote sensing, more dedicated science experiments can be prepared. This includes, for example, more frequent observations in time (e.g., diurnal observations), observations of an object under different viewing angles for characterization of the BRDF, and flexibility in the use of camera and sensor types. In this way, laboratory-type setups can be tested in a field situation and the effects of up-scaling can be assessed. In recent years we have developed and implemented different camera systems (e.g., a hyperspectral pushbroom system and multispectral frame cameras), which we have operated in projects all around the world, while new camera systems such as LiDAR and a full-frame hyperspectral camera are being planned. In the presentation we will give an overview of our activities, ranging from erosion studies, decision support for precision agriculture, and determining leaf biochemistry and canopy structure in tropical forests to the mapping of coastal zones.

  13. Recent developments in space shuttle remote sensing, using hand-held film cameras

    NASA Technical Reports Server (NTRS)

    Amsbury, David L.; Bremer, Jeffrey M.

    1992-01-01

    The authors report on the advantages and disadvantages of a number of camera systems which are currently employed for space shuttle remote sensing operations. Systems discussed include the modified Hasselblad, the Rolleiflex 6008, the Linhof 5-inch format system, and the Nikon F3/F4 systems. Film/filter combinations (color positive films, color infrared films, color negative films and polarization filters) are presented.

  14. Applications of remote sensing to watershed management

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1975-01-01

    Aircraft and satellite remote sensing systems which are capable of contributing to watershed management are described and include: the multispectral scanner subsystem on LANDSAT and the basic multispectral camera array flown on high altitude aircraft such as the U-2. Various aspects of watershed management investigated by remote sensing systems are discussed. Major areas included are: snow mapping, surface water inventories, flood management, hydrologic land use monitoring, and watershed modeling. It is indicated that technological advances in remote sensing of hydrological data must be coupled with an expansion of awareness and training in remote sensing techniques of the watershed management community.

  15. Autonomous Exploration for Gathering Increased Science

    NASA Technical Reports Server (NTRS)

    Bornstein, Benjamin J.; Castano, Rebecca; Estlin, Tara A.; Gaines, Daniel M.; Anderson, Robert C.; Thompson, David R.; DeGranville, Charles K.; Chien, Steve A.; Tang, Benyang; Burl, Michael C.

    2010-01-01

    The Autonomous Exploration for Gathering Increased Science System (AEGIS) provides automated targeting for remote sensing instruments on the Mars Exploration Rover (MER) mission, which at the time of this reporting has had two rovers exploring the surface of Mars (see figure). Currently, targets for rover remote-sensing instruments must be selected manually based on imagery already on the ground with the operations team. AEGIS enables the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. In particular, this technology will be used to automatically acquire sub-framed, high-resolution, targeted images taken with the MER panoramic cameras. This software provides: 1) automatic detection of terrain features in rover camera images, 2) feature extraction for detected terrain targets, 3) prioritization of terrain targets based on a scientist-defined target feature set, and 4) automated re-targeting of rover remote-sensing instruments at the highest priority target.

  16. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation is broadly used to evaluate satellite imaging quality and to test data application systems, but the simulation precision is hard to verify. In this paper, we propose an experimental verification method based on comparing variations of test parameters. For a simulation model based on ray tracing, the experiment verifies the model's precision by changing the types of devices that correspond to the parameters of the model. The experimental results show that the similarity between the ray-tracing imaging model and the experimental image is 91.4%, indicating that the model can simulate the remote sensing satellite imaging system very well.

  17. A real-time MTFC algorithm of space remote-sensing camera based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Liting; Huang, Gang; Lin, Zhe

    2018-01-01

    A real-time MTFC (modulation transfer function compensation) algorithm for a space remote-sensing camera was designed based on an FPGA. The algorithm provides real-time image processing to enhance image clarity while the remote-sensing camera is operating on orbit. The image restoration algorithm adopts a modular design. The on-orbit MTF measurement module computes the edge spread function, the line spread function, the ESF difference operation, the normalized MTF and the MTFC parameters. The MTFC filtering and noise suppression module applies the restoration filter while effectively suppressing noise. System Generator was used to design the image processing algorithms in order to simplify the system design structure and the redesign process. Image gray gradient, dot sharpness, edge contrast and mid-to-high spatial frequencies were enhanced, while the image SNR after restoration decreased by less than 1 dB compared to the original image. The image restoration system can be widely used in various fields.
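
    For orientation, the ESF-to-LSF-to-MTF chain mentioned above is, in its generic ground-processing form, a short computation. The sketch below is an illustrative numpy version of that chain with a synthetic edge profile standing in for real image data; it is not the paper's FPGA implementation.

    ```python
    import numpy as np

    def mtf_from_esf(esf: np.ndarray) -> np.ndarray:
        """Normalized MTF from a 1-D edge spread function (ESF).

        The line spread function (LSF) is the derivative of the ESF; the MTF is
        the magnitude of the LSF's Fourier transform, normalized to 1 at DC.
        """
        lsf = np.diff(esf)                      # ESF -> LSF
        lsf = lsf * np.hanning(lsf.size)        # window to reduce truncation ripple
        mtf = np.abs(np.fft.rfft(lsf))          # LSF -> |FT|
        return mtf / mtf[0]                     # normalize so MTF(0) = 1

    # Synthetic, slightly blurred edge as a stand-in for an on-orbit edge target
    x = np.linspace(-5, 5, 256)
    esf = 0.5 * (1 + np.tanh(x / 0.8))          # smooth step ~ blurred edge
    print(mtf_from_esf(esf)[:5])                # MTF at the lowest spatial frequencies
    ```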

  18. Brazil's remote sensing activities in the Eighties

    NASA Technical Reports Server (NTRS)

    Raupp, M. A.; Pereiradacunha, R.; Novaes, R. A.

    1985-01-01

    Most of the remote sensing activities in Brazil have been conducted by the Institute for Space Research (INPE). This report briefly describes INPE's activities in remote sensing in recent years. INPE has been engaged in research (e.g., radiance studies), development (e.g., CCD scanners, image processing devices) and applications (e.g., crop survey, land use, mineral resources, etc.) of remote sensing. INPE is also responsible for the operation (data reception and processing) of the LANDSATs and meteorological satellites. Data acquisition activities include the development of a CCD camera to be deployed on board the space shuttle and the construction of a remote sensing satellite.

  19. Physics teaching by infrared remote sensing of vegetation

    NASA Astrophysics Data System (ADS)

    Schüttler, Tobias; Maman, Shimrit; Girwidz, Raimund

    2018-05-01

    Context- and project-based teaching has proven to foster different affective and cognitive aspects of learning. As a versatile and multidisciplinary scientific research area with diverse applications for everyday life, satellite remote sensing is an interesting context for physics education. In this paper we give a brief overview of satellite remote sensing of vegetation and how to obtain your own, individual infrared remote sensing data with affordable converted digital cameras. This novel technique provides the opportunity to conduct individual remote sensing measurement projects with students in their respective environment. The data can be compared to real satellite data and is of sufficient accuracy for educational purposes.

  20. The HydroColor App: Above Water Measurements of Remote Sensing Reflectance and Turbidity Using a Smartphone Camera

    PubMed Central

    Leeuw, Thomas; Boss, Emmanuel

    2018-01-01

    HydroColor is a mobile application that utilizes a smartphone’s camera and auxiliary sensors to measure the remote sensing reflectance of natural water bodies. HydroColor uses the smartphone’s digital camera as a three-band radiometer. Users are directed by the application to collect a series of three images. These images are used to calculate the remote sensing reflectance in the red, green, and blue broad wavelength bands. As with satellite measurements, the reflectance can be inverted to estimate the concentration of absorbing and scattering substances in the water, which are predominately composed of suspended sediment, chlorophyll, and dissolved organic matter. This publication describes the measurement method and investigates the precision of HydroColor’s reflectance and turbidity estimates compared to commercial instruments. It is shown that HydroColor can measure the remote sensing reflectance to within 26% of a precision radiometer and turbidity within 24% of a portable turbidimeter. HydroColor distinguishes itself from other water quality camera methods in that its operation is based on radiometric measurements instead of image color. HydroColor is one of the few mobile applications to use a smartphone as a completely objective sensor, as opposed to subjective user observations or color matching using the human eye. This makes HydroColor a powerful tool for crowdsourcing of aquatic optical data. PMID:29337917
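
    As a rough illustration of the radiometric calculation behind such an app, the sketch below converts three relative radiances (water surface, sky, and a reference gray card) into a remote sensing reflectance estimate using the standard above-water relation. The surface reflectance factor, the 18% card reflectance and all numeric values are assumptions chosen for illustration, not values taken from the HydroColor paper.

    ```python
    import math

    def remote_sensing_reflectance(L_water, L_sky, L_card,
                                   rho_surface=0.028, R_card=0.18):
        """Above-water remote sensing reflectance estimate for one band (1/sr).

        L_water: radiance looking at the water surface
        L_sky:   radiance of the sky region reflected into the water view
        L_card:  radiance of a diffuse reference card of known reflectance R_card
        rho_surface: assumed sea-surface reflectance factor for the sky-glint term
        """
        L_w = L_water - rho_surface * L_sky      # remove surface-reflected skylight
        E_d = math.pi * L_card / R_card          # downwelling irradiance from the card
        return L_w / E_d

    # Hypothetical relative radiances for a single band
    print(remote_sensing_reflectance(L_water=0.050, L_sky=0.400, L_card=0.300))
    ```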

  1. The HydroColor App: Above Water Measurements of Remote Sensing Reflectance and Turbidity Using a Smartphone Camera.

    PubMed

    Leeuw, Thomas; Boss, Emmanuel

    2018-01-16

    HydroColor is a mobile application that utilizes a smartphone's camera and auxiliary sensors to measure the remote sensing reflectance of natural water bodies. HydroColor uses the smartphone's digital camera as a three-band radiometer. Users are directed by the application to collect a series of three images. These images are used to calculate the remote sensing reflectance in the red, green, and blue broad wavelength bands. As with satellite measurements, the reflectance can be inverted to estimate the concentration of absorbing and scattering substances in the water, which are predominately composed of suspended sediment, chlorophyll, and dissolved organic matter. This publication describes the measurement method and investigates the precision of HydroColor's reflectance and turbidity estimates compared to commercial instruments. It is shown that HydroColor can measure the remote sensing reflectance to within 26% of a precision radiometer and turbidity within 24% of a portable turbidimeter. HydroColor distinguishes itself from other water quality camera methods in that its operation is based on radiometric measurements instead of image color. HydroColor is one of the few mobile applications to use a smartphone as a completely objective sensor, as opposed to subjective user observations or color matching using the human eye. This makes HydroColor a powerful tool for crowdsourcing of aquatic optical data.

  2. Remote sensing and implications for variable-rate application using agricultural aircraft

    NASA Astrophysics Data System (ADS)

    Thomson, Steven J.; Smith, Lowrey A.; Ray, Jeffrey D.; Zimba, Paul V.

    2004-01-01

    Aircraft routinely used for agricultural spray application are finding utility for remote sensing. Data obtained from remote sensing can be used for prescription application of pesticides, fertilizers, cotton growth regulators, and water (the latter with the assistance of hyperspectral indices and thermal imaging). Digital video was used to detect weeds in early cotton, and preliminary data were obtained to see if nitrogen status could be detected in early soybeans. Weeds were differentiable from early cotton at very low altitudes (65 m), with the aid of supervised classification algorithms in the ENVI image analysis software. The camera was flown at very low altitude for acceptable pixel resolution. Nitrogen status was not detectable by statistical analysis of digital numbers (DNs) obtained from images, but soybean cultivar differences were statistically discernable (F=26, p=0.01). Spectroradiometer data are being analyzed to identify narrow spectral bands that might aid in selecting camera filters for determination of plant nitrogen status. Multiple camera configurations are proposed to allow vegetative indices to be developed more readily. Both remotely sensed field images and ground data are to be used for decision-making in a proposed variable-rate application system for agricultural aircraft. For this system, prescriptions generated from digital imagery and data will be coupled with GPS-based swath guidance and programmable flow control.
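
    The cultivar comparison reported above is a one-way analysis of variance on image digital numbers. A minimal sketch of that kind of test is shown below using scipy; the DN values are invented for illustration and this is not the authors' dataset or exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical mean digital numbers (DNs) extracted from image plots of three
    # soybean cultivars; real values would come from the airborne imagery.
    cultivar_a = np.array([112, 118, 109, 121, 115])
    cultivar_b = np.array([131, 128, 135, 127, 133])
    cultivar_c = np.array([120, 124, 119, 126, 122])

    f_stat, p_value = stats.f_oneway(cultivar_a, cultivar_b, cultivar_c)
    print(f"F = {f_stat:.1f}, p = {p_value:.4f}")
    # A small p-value indicates the cultivars' DNs differ more than expected by chance.
    ```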

  3. Photogrammetric Processing of Planetary Linear Pushbroom Images Based on Approximate Orthophotos

    NASA Astrophysics Data System (ADS)

    Geng, X.; Xu, Q.; Xing, S.; Hou, Y. F.; Lan, C. Z.; Zhang, J. J.

    2018-04-01

    It is still a challenging task to efficiently produce planetary mapping products from orbital remote sensing images. There are many difficulties in photogrammetric processing of planetary stereo images, such as the lack of ground control information and of informative features. Among these, image matching is the most difficult task in planetary photogrammetry. This paper designs a photogrammetric processing framework for planetary remote sensing images based on approximate orthophotos. Both tie point extraction for bundle adjustment and dense image matching for generating a digital terrain model (DTM) are performed on approximate orthophotos. Since most planetary remote sensing images are acquired by linear scanner cameras, we mainly deal with linear pushbroom images. In order to improve the computational efficiency of orthophoto generation and coordinate transformation, a fast back-projection algorithm for linear pushbroom images is introduced. Moreover, an iteratively refined DTM and orthophoto scheme was adopted in the DTM generation process, which helps to reduce the search space of image matching and improve the matching accuracy of conjugate points. With the advantages of approximate orthophotos, the matching results of planetary remote sensing images can be greatly improved. We tested the proposed approach with Mars Express (MEX) High Resolution Stereo Camera (HRSC) and Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) images. The preliminary experimental results demonstrate the feasibility of the proposed approach.
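
    For context, back-projecting a ground point into a linear pushbroom image amounts to finding the scan line whose instantaneous exposure satisfies the collinearity condition. The form below is a generic textbook statement in one common convention (rotation from object space into the camera frame), not necessarily the exact parametrization used in the paper.

    ```latex
    % Ground point (X, Y, Z); perspective center (X_S(t), Y_S(t), Z_S(t)) and
    % rotation matrix R(t) at scan-line time t; f is the focal length.
    \begin{pmatrix} u \\ v \\ w \end{pmatrix}
      = R(t)\begin{pmatrix} X - X_S(t) \\ Y - Y_S(t) \\ Z - Z_S(t) \end{pmatrix},
    \qquad
    0 = -f\,\frac{u}{w}, \qquad y = -f\,\frac{v}{w}
    % The first (along-track) condition implicitly determines the acquisition time t,
    % i.e. the image line; the second gives the across-track image coordinate y.
    ```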

  4. Laboratory and Field Application of River Depth Estimation Techniques Using Remotely Sensed Data: Annual Report Year 1

    DTIC Science & Technology

    2013-09-30

    coordinates locally oriented in the streamwise and cross-stream directions, respectively. To test the expressions and investigate potential errors, we...Survey Geomorphology and Sediment Transport Laboratory (GSTL). The IR camera was mounted on a rack ~1m above the surface of the flow and oriented so that...MD_SWMS, American Society for Photogrammetry and Remote Sensing, Proceedings of the 2008 Annual Conference –PNAMP Special Session: Remote Sensing

  5. Tectonics and Volcanism of East Africa as Seen Using Remote Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Hutt, Duncan John

    1996-01-01

    The East African Rift is the largest area of active continental rifting. The tectonics of this area have been studied with remote sensing data, including AVHRR, Landsat MSS and TM, SPOT, and electronic still camera imagery from the Space Shuttle. Lineation trends have been compared to centers of volcanic and earthquake activity as well as the trends shown on existing geologic maps. Remote sensing data can be used effectively to reveal and analyze significant tectonic features in this area.

  6. Monitoring forests from space: quantifying forest change by using satellite data.

    Treesearch

    Jonathan Thompson

    2006-01-01

    Change is the only constant in forest ecosystems. Quantifying regional-scale forest change is increasingly done with remote sensing, which relies on data sent from digital camera-like sensors mounted to Earth-orbiting satellites. Through remote sensing, changes in forests can be studied comprehensively and uniformly across time and space.

  7. Exploring Remote Sensing Through The Use Of Readily-Available Classroom Technologies

    NASA Astrophysics Data System (ADS)

    Rogers, M. A.

    2013-12-01

    Frontier geoscience research using remotely sensed satellite observations routinely requires sophisticated and novel remote sensing techniques to succeed. Describing these techniques in an educational format presents significant challenges to the science educator, especially with regard to the professional development setting, where a small but competent audience has limited instructor contact time to develop the necessary understanding. In this presentation, we describe the use of simple and cheaply available technologies, including ultrasonic transducers, FLIR detectors, and even simple web cameras, to provide a tangible analogue to sophisticated remote sensing platforms. We also describe methods of curriculum development that leverage these simple devices to teach the fundamentals of remote sensing, resulting in a deeper and more intuitive understanding of the techniques used in modern remote sensing research. Sample workshop itineraries using these techniques are provided as well.

  8. Application of remote sensing for planning purposes

    NASA Technical Reports Server (NTRS)

    Hughes, T. H. (Editor)

    1977-01-01

    Types of remotely sensed data are many and varied, but all are primarily dependent on the sensor platform and the kind of sensing system used. A sensor platform is the type of aircraft or satellite to which a sensing system is attached; each platform has its own inherent advantages and disadvantages. Selected attributes of several current or recently used platforms are outlined. Though sensing systems are highly varied, they may be divided into various operational categories such as cameras, electromechanical scanners, and radars.

  9. Active landslide monitoring using remote sensing data, GPS measurements and cameras on board UAV

    NASA Astrophysics Data System (ADS)

    Nikolakopoulos, Konstantinos G.; Kavoura, Katerina; Depountis, Nikolaos; Argyropoulos, Nikolaos; Koukouvelas, Ioannis; Sabatakakis, Nikolaos

    2015-10-01

    An active landslide can be monitored using many different methods: classical geotechnical measurements like inclinometers, topographic survey measurements with total stations or GPS, and photogrammetric techniques using airphotos or high-resolution satellite images. As aerial photo campaigns and the acquisition of very high resolution satellite data are quite expensive, the use of cameras on board a UAV can be an ideal solution. Small UAVs (Unmanned Aerial Vehicles) started out as expensive toys but have become a very valuable tool for remote sensing monitoring of small areas. The purpose of this work is to demonstrate a cheap but effective solution for active landslide monitoring. We present the first experimental results of the synergistic use of UAV, GPS measurements and remote sensing data. A six-rotor aircraft with a total weight of 6 kg carrying two small cameras has been used. Very accurate digital airphotos, a high-accuracy DSM, DGPS measurements and the data captured from the UAV are combined, and the results are presented in the current study.

  10. Airborne multispectral identification of individual cotton plants using consumer-grade cameras

    USDA-ARS's Scientific Manuscript database

    Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...

  11. Review of oil spill remote sensing.

    PubMed

    Fingas, Merv; Brown, Carl

    2014-06-15

    Remote sensing for oil spills is reviewed. The use of visible techniques is ubiquitous; however, it gives only the same results as visual monitoring. Oil has no particular spectral features that would allow for identification among the many possible background interferences. Cameras are only useful to provide documentation. In daytime, oil absorbs light and re-emits it as thermal energy at temperatures 3-8 K above ambient, which is detectable by infrared (IR) cameras. Laser fluorosensors are useful instruments because of their unique capability to identify oil on backgrounds that include water, soil, weeds, ice and snow. They are the only sensor that can positively discriminate oil on most backgrounds. Radar detects oil on water by the fact that oil dampens water-surface capillary waves under low to moderate wave/wind conditions. Radar offers the only potential for large-area searches, day/night and foul-weather remote sensing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Thematic Conference on Remote Sensing for Exploration Geology, 6th, Houston, TX, May 16-19, 1988, Proceedings. Volumes 1 & 2

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Papers concerning remote sensing applications for exploration geology are presented, covering topics such as remote sensing technology, data availability, frontier exploration, and exploration in mature basins. Other topics include offshore applications, geobotany, mineral exploration, engineering and environmental applications, image processing, and prospects for future developments in remote sensing for exploration geology. Consideration is given to the use of data from Landsat, MSS, TM, SAR, short wavelength IR, the Geophysical Environmental Research Airborne Scanner, gas chromatography, sonar imaging, the Airborne Visible-IR Imaging Spectrometer, field spectrometry, airborne thermal IR scanners, SPOT, AVHRR, SIR, the Large Format Camera, and multi-temporal satellite photographs.

  13. A low-cost dual-camera imaging system for aerial applicators

    USDA-ARS's Scientific Manuscript database

    Agricultural aircraft provide a readily available remote sensing platform as low-cost and easy-to-use consumer-grade cameras are being increasingly used for aerial imaging. In this article, we report on a dual-camera imaging system we recently assembled that can capture RGB and near-infrared (NIR) i...

  14. Gyrocopter-Based Remote Sensing Platform

    NASA Astrophysics Data System (ADS)

    Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.

    2015-04-01

    In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications utilizing a gyrocopter as a carrier platform is described. The current sensor configuration consists of a high-resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover, a custom-developed thermal imaging system composed of a VIS-PAN camera and an LWIR camera is used for aerial recordings in the thermal infrared range. Furthermore, another custom-developed, highly flexible imaging system for high-resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities, and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched into mosaics.

  15. Suppressing the image smear of the vibration modulation transfer function for remote-sensing optical cameras.

    PubMed

    Li, Jin; Liu, Zilong; Liu, Si

    2017-02-20

    In the on-board photographing process of satellite cameras, platform vibration can generate image motion, distortion, and smear, which seriously affect image quality and image positioning. In this paper, we create a mathematical model of the vibration modulation transfer function (VMTF) for a remote-sensing camera. The total MTF of the camera is reduced by the VMTF, which means the image quality is degraded. In order to avoid the degradation of the total MTF caused by vibrations, we use an Mn-20Cu-5Ni-2Fe (M2052) manganese-copper alloy to fabricate a vibration-isolation mechanism (VIM). The VIM transforms platform vibration energy into irreversible thermal energy through its internal twin-crystal structure. Our experiment shows the M2052 manganese-copper alloy is sufficient to suppress image motion below 125 Hz, the vibration frequency of satellite platforms. The camera optical system has a higher MTF after the vibration is suppressed with the M2052 material than before.
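
    For reference, two textbook limiting cases are commonly used when modelling such a vibration MTF; the generic forms below are given for orientation only and are not claimed to be the paper's exact VMTF model.

    ```latex
    % Linear image motion of total smear length d during the exposure:
    \mathrm{MTF}_{\text{linear}}(f) = \left|\frac{\sin(\pi d f)}{\pi d f}\right|
    % High-frequency sinusoidal vibration of amplitude a (many cycles per exposure):
    \mathrm{MTF}_{\text{sine}}(f) = \left|J_0(2\pi a f)\right|
    % f is spatial frequency in the image plane; d and a share the length unit of 1/f.
    ```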

  16. Development of a highly automated system for the remote evaluation of individual tree parameters

    Treesearch

    Richard Pollock

    2000-01-01

    A highly-automated procedure for remotely estimating individual tree location, crown diameter, species class, and height has been developed. This procedure will involve the use of a multimodal airborne sensing system that consists of a digital frame camera, a scanning laser rangefinder, and a position and orientation measurement system. Data from the multimodal sensing...

  17. Quantifying Forest Ground Flora Biomass Using Close-range Remote Sensing

    Treesearch

    Paul F. Doruska; Robert C. Weih; Matthew D. Lane; Don C. Bragg

    2005-01-01

    Close-range remote sensing was used to estimate biomass of forest ground flora in Arkansas. Digital images of a series of 1-m² plots were taken using Kodak DCS760 and Kodak DCS420CIR digital cameras. ESRI ArcGIS™ and ERDAS Imagine® software was used to calculate the Normalized Difference Vegetation Index (NDVI) and the Average Visible...

  18. Crop classification and LAI estimation using original and resolution-reduced images from consumer-grade cameras

    USDA-ARS's Scientific Manuscript database

    Consumer-grade cameras have been increasingly used for remote sensing applications in recent years. However, the performance of this type of camera has not been systematically tested and well documented in the literature. The objective of this research was to evaluate the performance of original an...

  19. Fixed-focus camera objective for small remote sensing satellites

    NASA Astrophysics Data System (ADS)

    Topaz, Jeremy M.; Braun, Ofer; Freiman, Dov

    1993-09-01

    An athermalized objective has been designed for a compact, lightweight push-broom camera which is under development at El-Op Ltd. for use in small remote-sensing satellites. The high performance objective has a fixed focus setting, but maintains focus passively over the full range of temperatures encountered in small satellites. The lens is an F/5.0, 320 mm focal length Tessar type, operating over the range 0.5-0.9 micrometers. It has a 16° field of view and accommodates various state-of-the-art silicon detector arrays. The design and performance of the objective is described in this paper.
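
    Two quick consequences of the quoted numbers, worked out here from the stated focal length, F-number and field of view (simple arithmetic, not figures from the paper itself):

    ```latex
    % Entrance pupil (aperture) diameter from the F-number:
    D = \frac{f}{N} = \frac{320\ \text{mm}}{5.0} = 64\ \text{mm}
    % Image-plane extent needed to cover the full field of view:
    w = 2 f \tan\!\left(\tfrac{\mathrm{FOV}}{2}\right)
      = 2 \times 320\ \text{mm} \times \tan(8^\circ) \approx 90\ \text{mm}
    ```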

  20. Remote sensing operations (multispectral scanner and photographic) in the New York Bight, 22 September 1975

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Hall, J. B., Jr.

    1977-01-01

    Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.

  1. Remote sensing of water quality in reservoirs and lakes in semi-arid climates

    NASA Technical Reports Server (NTRS)

    Anderson, H. M.; Horne, A. J.

    1975-01-01

    Overlake measurements using aerial cameras (remote sensing), combined with water truth collected from boats, were most economically provided by wide-band photographs rather than precise spectra. With the use of false color infrared film (400-950 nm), the reflected spectral signatures seen from hundreds to thousands of meters above the lake merged to produce various color tones. Such colors were easily and inexpensively obtained and could be recognized by lake management personnel without any prior training. The characteristic spectral signatures of various algal types were also recognizable in part by the color tone produced by remote sensing.

  2. Using oblique digital photography for alluvial sandbar monitoring and low-cost change detection

    USGS Publications Warehouse

    Tusso, Robert B.; Buscombe, Daniel D.; Grams, Paul E.

    2015-01-01

    The maintenance of alluvial sandbars is a longstanding management interest along the Colorado River in Grand Canyon. Resource managers are interested in both the long-term trend in sandbar condition and the short-term response to management actions, such as intentional controlled floods released from Glen Canyon Dam. Long-term monitoring is accomplished at a range of scales, by a combination of annual topographic survey at selected sites, daily collection of images from those sites using novel, autonomously operating, digital camera systems (hereafter referred to as 'remote cameras'), and quadrennial remote sensing of sandbars canyonwide. In this paper, we present results from the remote camera images for daily changes in sandbar topography.

  3. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    NASA Astrophysics Data System (ADS)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article: Multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens. 70.6 (2004): 657-661. [3] Mahanti, Prasun et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." Archives, XXIII ISPRS Congress Archives (2016).
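
    For readers unfamiliar with IHS fusion, the additive "fast IHS" form of the basic (non-hybrid) algorithm is sketched below. It is a generic illustration with synthetic arrays, not the LROC processing chain or the hybrid IHS-Wavelet method evaluated above.

    ```python
    import numpy as np

    def fast_ihs_fusion(ms: np.ndarray, pan: np.ndarray) -> np.ndarray:
        """Additive IHS pan-sharpening.

        ms:  (H, W, 3) multispectral image, already upsampled to the pan grid
        pan: (H, W) higher-resolution panchromatic image
        Each band is shifted by (pan - intensity), which substitutes the intensity
        component while preserving the band-to-band differences (hue/saturation).
        """
        ms = ms.astype(np.float64)
        intensity = ms.mean(axis=2)                      # I component of IHS
        # Match the pan band's mean/std to the intensity so the detail is unbiased
        pan = (pan - pan.mean()) / (pan.std() + 1e-12)
        pan = pan * intensity.std() + intensity.mean()
        detail = (pan - intensity)[..., np.newaxis]      # spatial detail to inject
        return ms + detail

    # Synthetic example: smooth 3-band image plus a sharper pan band
    rng = np.random.default_rng(1)
    ms = rng.random((64, 64, 3))
    pan = ms.mean(axis=2) + 0.1 * rng.standard_normal((64, 64))
    print(fast_ihs_fusion(ms, pan).shape)  # (64, 64, 3)
    ```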

  4. History and use of remote sensing for conservation and management of federal lands in Alaska, USA

    USGS Publications Warehouse

    Markon, Carl

    1995-01-01

    Remote sensing has been used to aid land use planning efforts for federal public lands in Alaska since the 1940s. Four federal land management agencies (the U.S. Fish and Wildlife Service, U.S. Bureau of Land Management, U.S. National Park Service, and U.S. Forest Service) have used aerial photography and satellite imagery to document the extent, type, and condition of Alaska's natural resources. Aerial photographs have been used to collect detailed information over small to medium-sized areas. This standard management tool is obtainable using equipment ranging from hand-held 35-mm cameras to precision metric mapping cameras. Satellite data, equally important, provide synoptic views of landscapes, are digitally manipulable, and are easily merged with other digital databases. To date, over 109.2 million ha (72%) of Alaska's land cover has been mapped via remote sensing. This information has provided a base for conservation, management, and planning on federal public lands in Alaska.

  5. USDA/federal user of LANDSAT remote sensing

    NASA Technical Reports Server (NTRS)

    Allen, R.

    1981-01-01

    Developed and potential uses of remote sensing in crop condition and acreage assessment, renewable resources inventories, conservation practices, and water and forest management applications are described. Operational approaches, the adaptation of procedures to needs, and the agency's concerns about data continuity and cost are discussed, as well as support for future technology development for enhanced sensing capability. The use of improved camera systems for soil mapping and conservation monitoring from the space shuttle, and of aerospace radar to improve soil moisture monitoring, is mentioned.

  6. International Space Station Data Collection for Disaster Response

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Evans, Cynthia A.

    2015-01-01

    Remotely sensed data acquired by orbital sensor systems has emerged as a vital tool to identify the extent of damage resulting from a natural disaster, as well as providing near-real time mapping support to response efforts on the ground and humanitarian aid efforts. The International Space Station (ISS) is a unique terrestrial remote sensing platform for acquiring disaster response imagery. Unlike automated remote-sensing platforms it has a human crew; is equipped with both internal and externally-mounted remote sensing instruments; and has an inclined, low-Earth orbit that provides variable views and lighting (day and night) over 95 percent of the inhabited surface of the Earth. As such, it provides a useful complement to autonomous sensor systems in higher altitude polar orbits. NASA remote sensing assets on the station began collecting International Disaster Charter (IDC) response data in May 2012. The initial NASA ISS sensor systems responding to IDC activations included the ISS Agricultural Camera (ISSAC), mounted in the Window Observational Research Facility (WORF); the Crew Earth Observations (CEO) Facility, where the crew collects imagery using off-the-shelf handheld digital cameras; and the Hyperspectral Imager for the Coastal Ocean (HICO), a visible to near-infrared system mounted externally on the Japan Experiment Module Exposed Facility. The ISSAC completed its primary mission in January 2013. It was replaced by the very high resolution ISS SERVIR Environmental Research and Visualization System (ISERV) Pathfinder, a visible-wavelength digital camera, telescope, and pointing system. Since the start of IDC response in 2012 there have been 108 IDC activations; NASA sensor systems have collected data for thirty-two of these events. Of the successful data collections, eight involved two or more ISS sensor systems responding to the same event. Data has also been collected by International Partners in response to natural disasters, most notably JAXA and Roscosmos/Energia through the Urugan program.

  7. Accurate reconstruction of hyperspectral images from compressive sensing measurements

    NASA Astrophysics Data System (ADS)

    Greer, John B.; Flake, J. C.

    2013-05-01

    The emerging field of Compressive Sensing (CS) provides a new way to capture data by shifting the heaviest burden of data collection from the sensor to the computer on the user-end. This new means of sensing requires fewer measurements for a given amount of information than traditional sensors. We investigate the efficacy of CS for capturing HyperSpectral Imagery (HSI) remotely. We also introduce a new family of algorithms for constructing HSI from CS measurements with Split Bregman Iteration [Goldstein and Osher, 2009]. These algorithms combine spatial Total Variation (TV) with smoothing in the spectral dimension. We examine models for three different CS sensors: the Coded Aperture Snapshot Spectral Imager-Single Disperser (CASSI-SD) [Wagadarikar et al., 2008] and Dual Disperser (CASSI-DD) [Gehm et al., 2007] cameras, and a hypothetical random sensing model closer to CS theory, but not necessarily implementable with existing technology. We simulate the capture of remotely sensed images by applying the sensor forward models to well-known HSI scenes: an AVIRIS image of Cuprite, Nevada, and the HYMAP Urban image. To measure accuracy of the CS models, we compare the scenes constructed with our new algorithm to the original AVIRIS and HYMAP cubes. The results demonstrate the possibility of accurately sensing HSI remotely with significantly fewer measurements than standard hyperspectral cameras.
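
    Schematically, reconstructions of this type minimize a spatial total-variation term plus a spectral smoothness term subject to fidelity with the compressive measurements; the generic form below uses notation chosen here for illustration and is not copied from the paper.

    ```latex
    % u: hyperspectral cube with bands u_k; \Phi: sensor forward model; f: CS measurements;
    % D_\lambda: finite-difference operator along the spectral dimension;
    % \gamma, \mu: regularization weights. Split Bregman iteration is one standard
    % way to solve such TV-regularized problems.
    \hat{u} \;=\; \arg\min_{u} \;\sum_{k} \|\nabla u_k\|_{1}
    \;+\; \frac{\gamma}{2}\,\|D_{\lambda} u\|_{2}^{2}
    \;+\; \frac{\mu}{2}\,\|\Phi u - f\|_{2}^{2}
    ```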

  8. Remote identification of individual volunteer cotton plants

    USDA-ARS's Scientific Manuscript database

    Although airborne multispectral remote sensing can identify fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants that can similarly provide habitat for boll weevils. However, when consumer-grade cameras are used, each pix...

  9. [A review of atmospheric aerosol research by using polarization remote sensing].

    PubMed

    Guo, Hong; Gu, Xing-Fa; Xie, Dong-Hai; Yu, Tao; Meng, Qing-Yan

    2014-07-01

    In the present paper, aerosol research using polarization remote sensing over the last two decades (1993-2013) is reviewed, including aerosol research based on POLDER/PARASOL, the APS (Aerosol Polarimetry Sensor), polarized airborne cameras and ground-based measurements. We emphasize the following three aspects: (1) the retrieval algorithms developed for land and marine aerosol using POLDER/PARASOL; the validation and application of POLDER/PARASOL AOD, and cross-comparison with the AOD of other satellites, such as MODIS AOD. (2) The retrieval algorithms developed for land and marine aerosol using MICROPOL and RSP/APS. We also introduce recent progress in aerosol research based on the Directional Polarimetric Camera (DPC), produced by the Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences (CAS). (3) The aerosol retrieval algorithms using measurements from ground-based instruments, such as the CE318-2 and CE318-DP. The retrieval results from spaceborne sensors, airborne cameras and ground-based measurements include total AOD, fine-mode AOD, coarse-mode AOD, size distribution, particle shape, complex refractive indices, single scattering albedo, scattering phase function, polarization phase function and AOD above cloud. Finally, based on this research, the authors present the problems and prospects of atmospheric aerosol research using polarization remote sensing, and provide a valuable reference for future studies of atmospheric aerosol.

  10. Multispectral Photography

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer, such as their Model 75, Spectral Data creates a color image from the black-and-white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.

  11. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization.

    PubMed

    Hakala, Teemu; Markelin, Lauri; Honkavaara, Eija; Scott, Barry; Theocharous, Theo; Nevalainen, Olli; Näsi, Roope; Suomalainen, Juha; Viljanen, Niko; Greenwell, Claire; Fox, Nigel

    2018-05-03

    Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common as they provide more abundant information about the object compared to traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor provides a possibility for physical modelling of the imaging process and enables efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing. It is based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications as it does not require in situ reflectance panels for converting the sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer (FPI) based hyperspectral camera at the National Physical Laboratory NPL (Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of the different channels varied within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels that were due to the multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK).
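
    The "direct reflectance" idea above reduces, per band, to ratioing the calibrated at-sensor radiance against the simultaneously measured downwelling irradiance. A minimal sketch of that conversion is given below; the calibration coefficients and measurement values are placeholders, not the instrument's actual calibration.

    ```python
    import numpy as np

    def dn_to_radiance(dn, gain, offset):
        """Linear radiometric calibration per band: L = gain * DN + offset."""
        return gain * np.asarray(dn, dtype=float) + offset

    def reflectance_factor(radiance, irradiance):
        """Reflectance factor R = pi * L / E_d for a band (Lambertian assumption)."""
        return np.pi * radiance / irradiance

    # Placeholder values for a single spectral band
    dn = np.array([812.0, 905.0, 760.0])            # raw camera digital numbers
    L = dn_to_radiance(dn, gain=0.012, offset=0.3)  # illustrative radiance units
    E_d = 45.0                                      # irradiance from the spectrometer
    print(reflectance_factor(L, E_d))
    ```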

  12. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization

    PubMed Central

    Hakala, Teemu; Scott, Barry; Theocharous, Theo; Näsi, Roope; Suomalainen, Juha; Greenwell, Claire; Fox, Nigel

    2018-01-01

    Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common as they provide more abundant information about the object compared to traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor provides a possibility for physical modelling of the imaging process and enables efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing. It is based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications as it does not require in situ reflectance panels for converting the sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer (FPI) based hyperspectral camera at the National Physical Laboratory NPL (Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of the different channels varied within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels that were due to the multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK). PMID:29751560

  13. Volcano monitoring with an infrared camera: first insights from Villarrica Volcano

    NASA Astrophysics Data System (ADS)

    Rosas Sotomayor, Florencia; Amigo Ramos, Alvaro; Velasquez Vargas, Gabriela; Medina, Roxana; Thomas, Helen; Prata, Fred; Geoffroy, Carolina

    2015-04-01

    This contribution focuses on the first trials of almost 24/7 monitoring of Villarrica volcano with an infrared camera. Results must be compared with other SO2 remote sensing instruments, such as DOAS and the UV camera, for the daytime measurements. Infrared remote sensing of volcanic emissions is a fast and safe method to obtain gas abundances in volcanic plumes, in particular when access to the vent is difficult, during volcanic crises, and at night time. In recent years, a ground-based infrared camera (Nicair) has been developed by Nicarnica Aviation, which quantifies SO2 and ash in volcanic plumes based on the infrared radiance at specific wavelengths through the application of filters. Three Nicair1 (first model) cameras have been acquired by the Geological Survey of Chile in order to study degassing of active volcanoes. Several trials with the instruments have been performed at northern Chilean volcanoes and have shown that the retrieved SO2 concentrations and fluxes fall within the expected ranges. Measurements were also performed at Villarrica volcano, and a location to install a fixed camera, 8 km from the crater, was identified: a coffee house with electrical power, a wifi network, polite and committed owners, and a full view of the volcano summit. The first measurements are being made and processed in order to obtain full-day and full-week records of SO2 emissions, analyze data transfer and storage, improve the remote control of the instrument and notebook in case of breakdown, and set up web-cam/GoPro support, with the goal of the project being to implement a fixed station to monitor and study Villarrica volcano with a Nicair1, integrating and comparing these results with other remote sensing instruments. This work also aims to strengthen bonds with the community by developing teaching material and giving talks to communicate volcanic hazards and other geoscience topics to the people who live "just around the corner" from one of the most active volcanoes in Chile.

  14. NASA Remote Sensing Research as Applied to Archaeology

    NASA Technical Reports Server (NTRS)

    Giardino, Marco J.; Thomas, Michael R.

    2002-01-01

    The use of remotely sensed images is not new to archaeology. Ever since balloons and airplanes first flew cameras over archaeological sites, researchers have taken advantage of the elevated observation platforms to understand sites better. When viewed from above, crop marks, soil anomalies and buried features revealed new information that was not readily visible from ground level. Since 1974 and initially under the leadership of Dr. Tom Sever, NASA's Stennis Space Center, located on the Mississippi Gulf Coast, pioneered and expanded the application of remote sensing to archaeological topics, including cultural resource management. Building on remote sensing activities initiated by the National Park Service, archaeologists increasingly used this technology to study the past in greater depth. By the early 1980s, there were sufficient accomplishments in the application of remote sensing to anthropology and archaeology that a chapter on the subject was included in fundamental remote sensing references. Remote sensing technology and image analysis are currently undergoing a profound shift in emphasis from broad classification to detection, identification and condition of specific materials, both organic and inorganic. In the last few years, remote sensing platforms have grown increasingly capable and sophisticated. Sensors currently in use, or nearing deployment, offer significantly finer spatial and spectral resolutions than were previously available. Paired with new techniques of image analysis, this technology may make the direct detection of archaeological sites a realistic goal.

  15. Quantitative comparison of airborne remote-sensed and in situ Rhodamine WT dye and temperature during RIVET & IB09

    NASA Astrophysics Data System (ADS)

    Lenain, L.; Clark, D. B.; Guza, R. T.; Hally-Rosendahl, K.; Statom, N.; Feddersen, F.

    2012-12-01

    The transport and evolution of temperature, sediment, chlorophyll, fluorescent dye, and other tracers is of significant oceanographic interest, particularly in complex coastal environments such as the nearshore, river mouths, and tidal inlets. Remote sensing improves spatial coverage over in situ observations, and ground truthing remotely sensed observations is critical for their use. Here, we present remotely sensed observations of Rhodamine WT dye and Sea Surface Temperature (SST) using the SIO Modular Aerial Sensing System (MASS) and compare them with in situ observations from the IB09 (0-300 m seaward of the surfzone, Imperial Beach, CA, October 2009) and RIVET (New River Inlet, NC, May 2012) field experiments. Dye concentrations are estimated from a unique multispectral camera system that measures the emission and absorption wavelengths of Rhodamine WT dye. During RIVET, dye is also characterized using a pushbroom hyperspectral imaging system (SPECIM AISAEagle VNIR 400-990 nm), while SST is estimated using a long-wave infrared camera (FLIR SC6000HS) coupled with an infrared pyrometer (Heitronics KT19.85II). Repeated flight passes over the dye plume were conducted approximately every 5 min for durations of up to 4.5 hr, with a swath width ranging from 400 to 2000 m (altitude dependent), and provided a unique spatio-temporal depiction of the plume. A dye proxy is developed using the measured radiance at the emission and absorption wavelengths of the Rhodamine WT dye. During IB09 and RIVET, in situ dye and temperature were measured with two GPS-tracked jet skis, a small boat, and moored observations. The in situ observations are compared with the remotely sensed data in these two complex coastal environments. Funding was provided by the Office of Naval Research.

  16. A telescopic cinema sound camera for observing high altitude aerospace vehicles

    NASA Astrophysics Data System (ADS)

    Slater, Dan

    2014-09-01

    Rockets and other high altitude aerospace vehicles produce interesting visual and aural phenomena that can be remotely observed from long distances. This paper describes a compact, passive and covert remote sensing system that can produce high resolution sound movies at >100 km viewing distances. The telescopic high resolution camera is capable of resolving and quantifying space launch vehicle dynamics including plume formation, staging events and payload fairing jettison. Flight vehicles produce sounds and vibrations that modulate the local electromagnetic environment. These audio frequency modulations can be remotely sensed by passive optical and radio wave detectors. Acousto-optic sensing methods were primarily used, but an experimental radioacoustic sensor using passive micro-Doppler radar techniques was also tested. The synchronized combination of high resolution flight vehicle imagery with the associated vehicle sounds produces a cinema-like experience that is useful in both an aerospace engineering and a Hollywood film production context. Examples of visual, aural and radar observations of the first SpaceX Falcon 9 v1.1 rocket launch are shown and discussed.

  17. Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary

    2006-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of high spatial resolution satellite remote sensing system products from Space Imaging IKONOS, Digital Globe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and its Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near-infrared spectrum and presents examples of calibration results.

  18. DETECTION AND IDENTIFICATION OF TOXIC AIR POLLUTANTS USING FIELD PORTABLE AND AIRBORNE REMOTE IMAGING SYSTEMS

    EPA Science Inventory

    Remote sensing technologies are a class of instrument and sensor systems that include laser imageries, imaging spectrometers, and visible to thermal infrared cameras. These systems have been successfully used for gas phase chemical compound identification in a variety of field e...

  19. American Society for Photogrammetry and Remote Sensing and ACSM, Fall Convention, Reno, NV, Oct. 4-9, 1987, ASPRS Technical Papers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    Recent advances in remote-sensing technology and applications are examined in reviews and reports. Topics addressed include the use of Landsat TM data to assess suspended-sediment dispersion in a coastal lagoon, the use of sun incidence angle and IR reflectance levels in mapping old-growth coniferous forests, information-management systems, Large-Format-Camera soil mapping, and the economic potential of Landsat TM winter-wheat crop-condition assessment. Consideration is given to measurement of ephemeral gully erosion by airborne laser ranging, the creation of a multipurpose cadaster, high-resolution remote sensing and the news media, the role of vegetation in the global carbon cycle, PC applications in analytical photogrammetry, multispectral geological remote sensing of a suspected impact crater, fractional calculus in digital terrain modeling, and automated mapping using GPS-based survey data.

  20. Optical Constituents Along a River Mouth and Inlet: Variability and Signature in Remotely Sensed Reflectance, and: Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

    Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated Niskin bottles. The Niskin bottles were...Eco bb2fl, that measures 3 backscattering at 532 and 650 nm and CDOM fluorescence, a WetLabs WetStar CDOM fluorometer, a Sequoia Scientific flow

  1. An HDR imaging method with DTDI technology for push-broom cameras

    NASA Astrophysics Data System (ADS)

    Sun, Wu; Han, Chengshan; Xue, Xucheng; Lv, Hengyi; Shi, Junxia; Hu, Changhong; Li, Xiangzhi; Fu, Yao; Jiang, Xiaonan; Huang, Liang; Han, Hongyin

    2018-03-01

    Conventionally, high dynamic-range (HDR) imaging is based on taking two or more pictures of the same scene with different exposures. However, due to the high-speed relative motion between the camera and the scene, it is hard to apply this technique to push-broom remote sensing cameras. To enable HDR imaging in push-broom remote sensing applications, the present paper proposes an innovative method which can generate HDR images without redundant image sensors or optical components. Specifically, the method images with an area-array CMOS (complementary metal oxide semiconductor) sensor using digital-domain time-delay-integration (DTDI) technology, instead of adopting more than one row of image sensors, and thereby captures more than one picture with different exposures. A new HDR image is then obtained by fusing the two original images with a simple algorithm. In the experiment, the dynamic range (DR) of the image increased by 26.02 dB. The proposed method is shown to be effective and has potential in other imaging applications where there is relative motion between the camera and the scene.
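
    As a rough illustration of the fusion step, the sketch below blends a short- and a long-exposure frame of the same scene into one higher-dynamic-range image by normalizing each frame by its exposure time and falling back to the short exposure wherever the long one saturates. This is a generic two-exposure fusion under assumed 12-bit data, not the algorithm described in the paper.

    ```python
    import numpy as np

    def fuse_two_exposures(short_img, long_img, t_short, t_long, full_scale=4095):
        """Fuse two co-registered exposures of the same scene into an HDR image.

        Each frame is scaled to a common radiance-like unit (DN per unit exposure
        time); near-saturated pixels of the long exposure fall back to the short one.
        """
        short = short_img.astype(float) / t_short
        long_ = long_img.astype(float) / t_long
        saturated = long_img >= 0.98 * full_scale      # near full scale in the long frame
        return np.where(saturated, short, long_)

    # Synthetic example with a 2x exposure ratio (illustrative only).
    rng = np.random.default_rng(0)
    scene = rng.uniform(0, 6000, size=(4, 4))          # "true" signal, exceeds the 12-bit range
    long_frame = np.clip(scene, 0, 4095)               # long exposure saturates in highlights
    short_frame = np.clip(scene / 2, 0, 4095)          # short exposure preserves highlights
    hdr = fuse_two_exposures(short_frame, long_frame, t_short=1.0, t_long=2.0)
    print(hdr)
    ```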

  2. Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera.

    PubMed

    Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng

    2017-06-20

    The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.

  3. An Efficient Image Compressor for Charge Coupled Devices Camera

    PubMed Central

    Li, Jin; Xing, Fei; You, Zheng

    2014-01-01

    Recently, discrete wavelet transform- (DWT-) based compressors, such as JPEG2000 and CCSDS-IDC, have been widely seen as the state-of-the-art compression schemes for charge coupled device (CCD) cameras. However, CCD images projected onto the DWT basis produce a large number of large-amplitude high-frequency coefficients, because these images contain a large amount of complex texture and contour information, which is a disadvantage for the later coding. In this paper, we propose a low-complexity posttransform coupled with compressive sensing (PT-CS) compression approach for remote sensing images. First, the DWT is applied to the remote sensing image. Then, a posttransform with a pair of bases is applied to the DWT coefficients. The pair of bases are the DCT base and the Hadamard base, which can be used at high and low bit rates, respectively. The best posttransform is selected by an lp-norm-based approach. The posttransform is considered as the sparse representation stage of CS. The posttransform coefficients are resampled by a sensing measurement matrix. Experimental results on on-board CCD camera images show that the proposed approach significantly outperforms the CCSDS-IDC-based coder, and its performance is comparable to that of JPEG2000 at low bit rates without the excessive implementation complexity of JPEG2000. PMID:25114977
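
    The basis-selection idea can be illustrated with a small sketch: apply both candidate posttransforms (DCT and Hadamard) to a block of wavelet coefficients and keep the one whose output has the smaller lp norm, i.e. the sparser representation. The block size, the value of p, and the use of scipy transforms below are assumptions for illustration, not the coder described in the paper.

    ```python
    import numpy as np
    from scipy.fft import dctn
    from scipy.linalg import hadamard

    def lp_norm(x, p=0.7):
        """Sub-quadratic l_p 'norm' used as a sparsity measure (smaller = sparser)."""
        return np.sum(np.abs(x) ** p)

    def select_posttransform(block, p=0.7):
        """Apply DCT and Hadamard posttransforms to a square coefficient block and
        return the name and coefficients of the sparser (smaller l_p) representation."""
        n = block.shape[0]
        H = hadamard(n) / np.sqrt(n)                 # orthonormal Hadamard basis
        candidates = {
            "dct": dctn(block, norm="ortho"),
            "hadamard": H @ block @ H.T,
        }
        best = min(candidates, key=lambda k: lp_norm(candidates[k], p))
        return best, candidates[best]

    rng = np.random.default_rng(1)
    wavelet_block = rng.normal(size=(8, 8))          # stand-in for high-frequency DWT coefficients
    name, coeffs = select_posttransform(wavelet_block)
    print("selected posttransform:", name)
    ```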

  4. QWIP technology for both military and civilian applications

    NASA Astrophysics Data System (ADS)

    Gunapala, Sarath D.; Kukkonen, Carl A.; Sirangelo, Mark N.; McQuiston, Barbara K.; Chehayeb, Riad; Kaufmann, M.

    2001-10-01

    Advanced thermal imaging infrared cameras have been a cost-effective and reliable method to obtain the temperature of objects. Quantum Well Infrared Photodetector (QWIP) based thermal imaging systems have advanced the state of the art and are the most sensitive commercially available thermal systems. QWIP Technologies LLC, under exclusive agreement with Caltech, is currently manufacturing the QWIP-Chip(TM), a 320 x 256 element, bound-to-quasibound QWIP FPA. The camera performance falls within the long-wave IR band, spectrally peaked at 8.5 μm. The camera is equipped with a 32-bit floating-point digital signal processor combined with multi-tasking software, delivering a digital acquisition resolution of 12 bits at a nominal power consumption of less than 50 Watts. With a variety of video interface options, remote control capability via an RS-232 connection, and an integrated control driver circuit to support motorized zoom- and focus-compatible lenses, this camera design has excellent applications in both the military and commercial sectors. In the area of remote sensing, high-performance QWIP systems can be used for high-resolution target recognition as part of a new system of airborne platforms (including UAVs). Such systems also have direct application in law enforcement, surveillance, industrial monitoring and road hazard detection systems. This presentation will cover the current performance of the commercial QWIP cameras, conceptual platform systems and advanced image processing for use in both military remote sensing and civilian applications currently being developed in road hazard monitoring.

  5. The International Space Station: A Unique Platform For Terrestrial Remote Sensing

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Evans, Cynthia A.

    2012-01-01

    The International Space Station (ISS) became operational in November of 2000, and until recently remote sensing activities and operations have focused on handheld astronaut photography of the Earth. This effort builds from earlier NASA and Russian space programs (e.g. Evans et al. 2000; Glazovskiy and Dessinov 2000). To date, astronauts have taken more than 600,000 images of the Earth's land surface, oceans, and atmospheric phenomena from orbit using film and digital cameras as part of two payloads: NASA's Crew Earth Observations experiment (http://eol.jsc.nasa.gov/) and Russia's Uragan experiment (Stefanov et al. 2012). Many of these images have unique attributes - varying look angles, ground resolutions, and illumination - that are not available from other remote sensing platforms. Despite this large volume of imagery and clear capability for Earth remote sensing, the ISS historically has not been perceived as an Earth observations platform by many remote sensing scientists. With the recent installation of new facilities and sophisticated sensor systems, and additional systems manifested and in development, that perception is changing to take advantage of the unique capabilities and viewing opportunities offered by the ISS.

  6. Unmanned aerial systems for photogrammetry and remote sensing: A review

    NASA Astrophysics Data System (ADS)

    Colomina, I.; Molina, P.

    2014-06-01

    We discuss the evolution and state of the art of the use of Unmanned Aerial Systems (UAS) in the field of Photogrammetry and Remote Sensing (PaRS). UAS, Remotely-Piloted Aerial Systems, Unmanned Aerial Vehicles or, simply, drones are a hot topic comprising a diverse array of aspects including technology, privacy rights, safety and regulations, and even war and peace. Modern photogrammetry and remote sensing identified the potential of UAS-sourced imagery more than thirty years ago. In the last five years, these two sister disciplines have developed technology and methods that challenge the current aeronautical regulatory framework and their own traditional acquisition and processing methods. Naivety and ingenuity have combined off-the-shelf, low-cost equipment with sophisticated computer vision, robotics and geomatic engineering. The results are cm-level resolution and accuracy products that can be generated even with cameras costing a few hundred euros. In this review article, following a brief historic background and regulatory status analysis, we review the recent unmanned aircraft, sensing, navigation, orientation and general data processing developments for UAS photogrammetry and remote sensing with emphasis on the nano-micro-mini UAS segment.

  7. Remote sensing: a tool for park planning and management

    USGS Publications Warehouse

    Draeger, William C.; Pettinger, Lawrence R.

    1981-01-01

    Remote sensing may be defined as the science of imaging or measuring objects from a distance. More commonly, however, the term is used in reference to the acquisition and use of photographs, photo-like images, and other data acquired from aircraft and satellites. Thus, remote sensing includes the use of such diverse materials as photographs taken by hand from a light aircraft, conventional aerial photographs obtained with a precision mapping camera, satellite images acquired with sophisticated scanning devices, radar images, and magnetic and gravimetric data that may not even be in image form. Remotely sensed images may be color or black and white, can vary in scale from those that cover only a few hectares of the earth's surface to those that cover tens of thousands of square kilometers, and they may be interpreted visually or with the assistance of computer systems. This article attempts to describe several of the commonly available types of remotely sensed data, to discuss approaches to data analysis, and to demonstrate (with image examples) typical applications that might interest managers of parks and natural areas.

  8. Solar-Powered Airplane with Cameras and WLAN

    NASA Technical Reports Server (NTRS)

    Higgins, Robert G.; Dunagan, Steve E.; Sullivan, Don; Slye, Robert; Brass, James; Leung, Joe G.; Gallmeyer, Bruce; Aoyagi, Michio; Wei, Mei Y.; Herwitz, Stanley R.; hide

    2004-01-01

    An experimental airborne remote sensing system includes a remotely controlled, lightweight, solar-powered airplane (see figure) that carries two digital-output electronic cameras and communicates with a nearby ground control and monitoring station via a wireless local-area network (WLAN). The speed of the airplane -- typically <50 km/h -- is low enough to enable loitering over farm fields, disaster scenes, or other areas of interest to collect high-resolution digital imagery that could be delivered to end users (e.g., farm managers or disaster-relief coordinators) in nearly real time.

  9. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1.5 to 1 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  10. Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone

    NASA Technical Reports Server (NTRS)

    Lemos, G. L.; Salinas, J.; Rebollo, M.

    1977-01-01

    A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed with a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras were used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.

  11. REVIEW OF DEVELOPMENTS IN SPACE REMOTE SENSING FOR MONITORING RESOURCES.

    USGS Publications Warehouse

    Watkins, Allen H.; Lauer, D.T.; Bailey, G.B.; Moore, D.G.; Rohde, W.G.

    1984-01-01

    Space remote sensing systems are compared for suitability in assessing and monitoring the Earth's renewable resources. Systems reviewed include the Landsat Thematic Mapper (TM), the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR), the French Systeme Probatoire d'Observation de la Terre (SPOT), the German Shuttle Pallet Satellite (SPAS) Modular Optoelectronic Multispectral Scanner (MOMS), the European Space Agency (ESA) Spacelab Metric Camera, the National Aeronautics and Space Administration (NASA) Large Format Camera (LFC) and Shuttle Imaging Radar (SIR-A and -B), the Russian Meteor satellite BIK-E and fragment experiments and MKF-6M and KATE-140 camera systems, the ESA Earth Resources Satellite (ERS-1), the Japanese Marine Observation Satellite (MOS-1) and Earth Resources Satellite (JERS-1), the Canadian Radarsat, the Indian Resources Satellite (IRS), and systems proposed or planned by China, Brazil, Indonesia, and others. Also reviewed are the concepts for a 6-channel Shuttle Imaging Spectroradiometer, a 128-channel Shuttle Imaging Spectrometer Experiment (SISEX), and the U. S. Mapsat.

  12. A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere

    NASA Technical Reports Server (NTRS)

    Georgieva, E. M.; Huang, W.; Heaps, W. S.

    2012-01-01

    A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which combines an uncooled InGaAs detector array working up to 1.7 micrometers. Custom software was developed in addition to the basic graphical user interface X-Control provided by the company to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes of the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.

  13. Phenocams bridge the gap between field and satellite observations in an arid grassland ecosystem

    USDA-ARS?s Scientific Manuscript database

    Near surface (i.e., camera) and satellite remote sensing metrics have become widely used indicators of plant growing seasons. While robust linkages have been established between field metrics and ecosystem exchange in many land cover types, assessment of how well remotely-derived season start and en...

  14. Navigation and Remote Sensing Payloads and Methods of the Sarvant Unmanned Aerial System

    NASA Astrophysics Data System (ADS)

    Molina, P.; Fortuny, P.; Colomina, I.; Remy, M.; Macedo, K. A. C.; Zúnigo, Y. R. C.; Vaz, E.; Luebeck, D.; Moreira, J.; Blázquez, M.

    2013-08-01

    In a large number of scenarios and missions, the technical, operational and economical advantages of UAS-based photogrammetry and remote sensing over traditional airborne and satellite platforms are apparent. Airborne Synthetic Aperture Radar (SAR) or combined optical/SAR operation in remote areas might be a case of a typical "dull, dirty, dangerous" mission suitable for unmanned operation - in harsh environments such as, for example, rain forest areas in Brazil, topographic mapping of small to medium sparsely inhabited remote areas with UAS-based photogrammetry and remote sensing seems to be a reasonable paradigm. An example of such a system is the SARVANT platform, a fixed-wing aerial vehicle with a six-meter wingspan and a maximum take-off weight of 140 kilograms, able to carry a fifty-kilogram payload. SARVANT includes a multi-band (X and P) interferometric SAR payload, as the P-band enables the topographic mapping of densely tree-covered areas, providing terrain profile information. Moreover, the combination of X- and P-band measurements can be used to extract biomass estimations. Finally, the long-term plan is to also incorporate surveying capabilities at optical bands and to deliver real-time imagery to a control station. This paper focuses on the remote-sensing concept in SARVANT, composed of the aforementioned SAR sensor and envisioning a double optical camera configuration to cover the visible and the near-infrared spectrum. The flexibility in the optical payload selection, ranging from professional, medium-format cameras to mass-market, small-format cameras, is discussed as a driver in the SARVANT development. The paper also focuses on the navigation and orientation payloads, including the sensors (IMU and GNSS), the measurement acquisition system and the proposed navigation and orientation methods. The latter include the Fast AT procedure, which performs close to traditional Integrated Sensor Orientation (ISO) and better than Direct Sensor Orientation (DiSO), and features the advantage of not requiring the massive image processing load for the generation of tie points, although it does require some Ground Control Points (GCPs). This technique is further supported by the availability of a high quality INS/GNSS trajectory, motivated by single-pass and repeat-pass SAR interferometry requirements.

  15. EROS: A space program for Earth resources

    USGS Publications Warehouse

    Metz, G.G.; Wiepking, P.J.

    1980-01-01

    Within the technology of the space age lies a key to increased knowledge about the resources and environment of the Earth. This key is remote sensing: detecting the nature of an object without actually touching it. Although the photographic camera is the most familiar remote-sensing device, other instrument systems, such as scanning radiometers and radar, also can produce photographs and images. On the basis of the potential of this technology, and in response to the critical need for greater knowledge of the Earth and its resources, the Department of the Interior established the Earth Resources Observation Systems (EROS) Program to gather and use remotely sensed data, collected by satellite and aircraft, of natural and manmade features on the Earth's surface.

  16. Hi-Tech for Archeology

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Remote sensing is the process of acquiring physical information from a distance, obtaining data on Earth features from a satellite or an airplane. Advanced remote sensing instruments detect radiation not visible to the ordinary camera or the human eye in several bands of the spectrum. These data are computer processed to produce multispectral images that can provide enormous amounts of information about Earth objects or phenomena. Since every object on Earth emits or reflects radiation in its own unique signature, remote sensing data can be interpreted to tell the difference between one type of vegetation and another, between densely populated urban areas and lightly populated farmland, between clear and polluted water or, in the archeological application, between rain forest and hidden man-made structures.

  17. Results from the National Aeronautics and Space Administration remote sensing experiments in the New York Bight, 7-17 April 1975

    NASA Technical Reports Server (NTRS)

    Hall, J. B., Jr. (Compiler); Pearson, A. O. (Compiler)

    1977-01-01

    A cooperative operation was conducted in the New York Bight to evaluate the role of remote sensing technology to monitor ocean dumping. Six NASA remote sensing experiments were flown on the C-54, U-2, and C-130 NASA aircraft, while NOAA obtained concurrent sea truth information using helicopters and surface platforms. The experiments included: (1) a Radiometer/Scatterometer (RADSCAT), (2) an Ocean Color Scanner (OCS), (3) a Multichannel Ocean Color Sensor (MOCS), (4) four Hasselblad cameras, (5) an Ebert spectrometer; and (6) a Reconafax IV infrared scanner and a Precision Radiation Thermometer (PRT-5). The results of these experiments relative to the use of remote sensors to detect, quantify, and determine the dispersion of pollutants dumped into the New York Bight are presented.

  18. Ground-based remote sensing with long lens video camera for upper-stem diameter and other tree crown measurements

    Treesearch

    Neil A. Clark; Sang-Mook Lee

    2004-01-01

    This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...

  19. Image quality enhancement method for on-orbit remote sensing cameras using invariable modulation transfer function.

    PubMed

    Li, Jin; Liu, Zilong

    2017-07-24

    Remote sensing cameras in the visible/near-infrared range are essential tools in Earth observation, deep-space exploration, and celestial navigation. Their imaging performance, i.e. image quality here, directly determines the target-observation performance of a spacecraft, and even the successful completion of a space mission. Unfortunately, the camera itself, including its optical system, image sensor, and electronic system, limits the on-orbit imaging performance. Here, we demonstrate an on-orbit high-resolution imaging method based on the invariable modulation transfer function (IMTF) of cameras. The IMTF, which is stable and invariant to changes in the ground targets, atmosphere, and environment on orbit or on the ground, and depends only on the camera itself, is extracted using a pixel optical focal-plane (PFP). The PFP produces multiple spatial frequency targets, which are used to calculate the IMTF at different frequencies. The resulting IMTF, in combination with a constrained least-squares filter, compensates for the imaging effects limited by the camera itself. This method is experimentally confirmed. Experiments on an on-orbit panchromatic camera indicate that the proposed method increases the average gradient by a factor of 6.5, the edge intensity by a factor of 3.3, and the MTF value by a factor of 1.56 compared to the case when the IMTF is not used. This opens a door to pushing past the limitations of the camera itself, enabling high-resolution on-orbit optical imaging.
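
    The compensation step can be sketched as a frequency-domain regularized-inverse (constrained-least-squares-like) filter built from the measured MTF: divide the image spectrum by the MTF while damping frequencies where the MTF is small. The isotropic Gaussian MTF and the regularization constant below are illustrative assumptions; the paper's IMTF is measured from pixel optical focal-plane targets.

    ```python
    import numpy as np

    def mtf_compensate(image, mtf, reg=0.01):
        """Restore an image given the system MTF with a regularized inverse filter.

        F_restored = F_image * MTF / (MTF**2 + reg), which avoids noise blow-up
        at frequencies where the MTF approaches zero.
        """
        spectrum = np.fft.fft2(image)
        restored = spectrum * mtf / (mtf ** 2 + reg)
        return np.real(np.fft.ifft2(restored))

    # Illustrative isotropic Gaussian MTF on a 64x64 grid (assumed, not measured).
    n = 64
    fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
    mtf = np.exp(-(fx ** 2 + fy ** 2) / (2 * 0.15 ** 2))

    rng = np.random.default_rng(2)
    blurred = np.real(np.fft.ifft2(np.fft.fft2(rng.random((n, n))) * mtf))
    sharpened = mtf_compensate(blurred, mtf)
    print(sharpened.shape)
    ```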

  20. Airport Remote Tower Sensor Systems

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Gawdiak, Yuri; Leidichj, Christopher; Papasin, Richard; Tran, Peter B.; Bass, Kevin

    2006-01-01

    Networks of video cameras, meteorological sensors, and ancillary electronic equipment are under development in collaboration among NASA Ames Research Center, the Federal Aviation Administration (FAA), and the National Oceanic and Atmospheric Administration (NOAA). These networks are to be established at and near airports to provide real-time information on local weather conditions that affect aircraft approaches and landings. The prototype network is an airport-approach-zone camera system (AAZCS), which has been deployed at San Francisco International Airport (SFO) and San Carlos Airport (SQL). The AAZCS includes remotely controlled color video cameras located on top of the SFO and SQL air-traffic control towers. The cameras are controlled by the NOAA Center Weather Service Unit located at the Oakland Air Route Traffic Control Center and are accessible via a secure Web site. The AAZCS cameras can be zoomed and can be panned and tilted to cover a field of view 220° wide. The NOAA observer can see the sky condition as it is changing, thereby making possible a real-time evaluation of the conditions along the approach zones of SFO and SQL. The next-generation network, denoted a remote tower sensor system (RTSS), will soon be deployed at the Half Moon Bay Airport, and a version of it will eventually be deployed at Los Angeles International Airport. In addition to remote control of video cameras via secure Web links, the RTSS offers real-time weather observations, remote sensing, portability, and a capability for deployment at remote and uninhabited sites. The RTSS can be used at airports that lack control towers, as well as at major airport hubs, to provide synthetic augmentation of vision for both local and remote operations under what would otherwise be conditions of low or even zero visibility.

  1. Corn and sorghum phenotyping using a fixed-wing UAV-based remote sensing system

    NASA Astrophysics Data System (ADS)

    Shi, Yeyin; Murray, Seth C.; Rooney, William L.; Valasek, John; Olsenholler, Jeff; Pugh, N. Ace; Henrickson, James; Bowden, Ezekiel; Zhang, Dongyan; Thomasson, J. Alex

    2016-05-01

    Recent development of unmanned aerial systems has created opportunities in automation of field-based high-throughput phenotyping by lowering flight operational cost and complexity and allowing flexible re-visit times and higher image resolution than satellite or manned airborne remote sensing. In this study, flights were conducted over corn and sorghum breeding trials in College Station, Texas, with a fixed-wing unmanned aerial vehicle (UAV) carrying two multispectral cameras and a high-resolution digital camera. The objectives were to establish the workflow and investigate the ability of UAV-based remote sensing to automate data collection of plant traits in order to develop genetic and physiological models. Most important among these traits were plant height and number of plants, which are currently collected manually at high labor cost. Vegetation indices were calculated for each breeding cultivar from mosaicked and radiometrically calibrated multi-band imagery in order to be correlated with ground-measured plant heights, populations and yield across high genetic-diversity breeding cultivars. Growth curves were profiled with the aerially measured time-series height and vegetation index data. The next step of this study will be to investigate the correlations between aerial measurements and ground truth measured manually in the field and from lab tests.
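
    As a small illustration of the per-cultivar vegetation-index step, the sketch below computes NDVI from calibrated red and near-infrared bands and averages it over the pixels of one plot. NDVI is used here only as a representative index, and the plot mask and reflectance values are placeholders rather than the study's actual processing chain.

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized Difference Vegetation Index from calibrated reflectance bands."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (nir - red) / (nir + red + eps)

    def plot_mean_index(index_map, plot_mask):
        """Average an index over the pixels of one breeding plot (boolean mask)."""
        return float(index_map[plot_mask].mean())

    # Tiny synthetic mosaic: reflectance in [0, 1] for two bands (illustrative).
    rng = np.random.default_rng(3)
    red_band = rng.uniform(0.02, 0.15, size=(10, 10))
    nir_band = rng.uniform(0.30, 0.60, size=(10, 10))
    index_map = ndvi(nir_band, red_band)
    plot_mask = np.zeros_like(index_map, dtype=bool)
    plot_mask[2:6, 2:6] = True                       # pixels belonging to one plot
    print(round(plot_mean_index(index_map, plot_mask), 3))
    ```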

  2. ARC-2009-ACD09-0218-005

    NASA Image and Video Library

    2009-10-06

    NASA Conducts Airborne Science Aboard Zeppelin Airship: equipped with two imaging instruments enabling remote sensing and atmospheric science measurements not previously practical. Hyperspectral imager and large format camera mounted inside the Zeppelin nose fairing.

  3. Procurement specification color graphic camera system

    NASA Technical Reports Server (NTRS)

    Prow, G. E.

    1980-01-01

    The performance and design requirements for a Color Graphic Camera System are presented. The system is a functional part of the Earth Observation Department Laboratory System (EODLS) and will be interfaced with Image Analysis Stations. It will convert the output of a raster scan computer color terminal into permanent, high resolution photographic prints and transparencies. Images usually displayed will be remotely sensed LANDSAT imager scenes.

  4. Sensible Success

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Commercial remote sensing uses satellite imagery to provide valuable information about the planet's features. By capturing light reflected from the Earth's surface with cameras or sensor systems, usually mounted on an orbiting satellite, data is obtained for business enterprises with an interest in land feature distribution. Remote sensing is practical when applied to large-area coverage, such as agricultural monitoring, regional mapping, environmental assessment, and infrastructure planning. For example, cellular service providers use satellite imagery to select the most ideal location for a communication tower. Crowsey Incorporated has the ability to use remote sensing capabilities to conduct spatial geographic visualizations and other remote-sensing services. Presently, the company has found a demand for these services in the area of litigation support. By using spatial information and analyses, Crowsey helps litigators understand and visualize complex issues and then to communicate a clear argument, with complete indisputable evidence. Crowsey Incorporated is a proud partner in NASA's Mississippi Space Commerce Initiative, with research offices at the John C. Stennis Space Center.

  5. Thermal Remote Sensing with Uav-Based Workflows

    NASA Astrophysics Data System (ADS)

    Boesch, R.

    2017-08-01

    Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but are still often used only for single-image analysis, in an oblique-view manner, or with visual evaluation of video sequences. Remote sensing using a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows allow thermal images to be processed similarly to RGB data. But the low spatial resolution of thermal cameras, significant optical distortion and typically low contrast require an adapted workflow. Temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie point matching and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  6. Integrated remotely sensed datasets for disaster management

    NASA Astrophysics Data System (ADS)

    McCarthy, Timothy; Farrell, Ronan; Curtis, Andrew; Fotheringham, A. Stewart

    2008-10-01

    Video imagery can be acquired from aerial, terrestrial and marine based platforms and has been exploited for a range of remote sensing applications over the past two decades. Examples include coastal surveys using aerial video, route-corridor infrastructure surveys using vehicle-mounted video cameras, aerial surveys over forestry and agriculture, underwater habitat mapping and disaster management. Many of these video systems are based on interlaced television standards such as North America's NTSC and the European SECAM and PAL television systems, which are then recorded using various video formats. This technology has recently been employed as a front-line remote sensing technology for damage assessment post-disaster. This paper traces the development of spatial video as a remote sensing tool from the early 1980s to the present day. The background to a new spatial-video research initiative based at the National University of Ireland, Maynooth (NUIM), is described. New improvements are proposed, including low-cost encoders, easy-to-use software decoders, timing issues and interoperability. These developments will enable specialists and non-specialists to collect, process and integrate these datasets with minimal support. This integrated approach will enable decision makers to access relevant remotely sensed datasets quickly and so carry out rapid damage assessment during and post-disaster.

  7. Construction of an unmanned aerial vehicle remote sensing system for crop monitoring

    NASA Astrophysics Data System (ADS)

    Jeong, Seungtaek; Ko, Jonghan; Kim, Mijeong; Kim, Jongkwon

    2016-04-01

    We constructed a lightweight unmanned aerial vehicle (UAV) remote sensing system and determined the ideal method for equipment setup, image acquisition, and image processing. Fields of rice paddy (Oryza sativa cv. Unkwang) grown under three different nitrogen (N) treatments of 0, 50, or 115 kg/ha were monitored at Chonnam National University, Gwangju, Republic of Korea, in 2013. A multispectral camera was used to acquire UAV images from the study site. Atmospheric correction of these images was completed using the empirical line method, and three-point (black, gray, and white) calibration boards were used as pseudo references. Evaluation of our corrected UAV-based remote sensing data revealed that correction efficiency and root mean square errors ranged from 0.77 to 0.95 and 0.01 to 0.05, respectively. The time series maps of simulated normalized difference vegetation index (NDVI) produced using the UAV images reproduced field variations of NDVI reasonably well, both within and between the different N treatments. We concluded that the UAV-based remote sensing technology utilized in this study is potentially an easy and simple way to quantitatively obtain reliable two-dimensional remote sensing information on crop growth.
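
    The empirical line method used here amounts to fitting, per band, a linear relation between image digital numbers over the calibration boards and their known reflectances, then applying that gain and offset to the whole image. The sketch below shows that fit with numpy least squares; the black/gray/white board values are placeholders, not the study's calibration data.

    ```python
    import numpy as np

    def empirical_line_fit(dn_targets, reflectance_targets):
        """Fit reflectance = gain * DN + offset from reference targets (one band)."""
        A = np.column_stack([dn_targets, np.ones_like(dn_targets, dtype=float)])
        (gain, offset), *_ = np.linalg.lstsq(A, reflectance_targets, rcond=None)
        return gain, offset

    def apply_empirical_line(dn_image, gain, offset):
        """Convert a band of raw digital numbers to surface reflectance."""
        return gain * dn_image.astype(float) + offset

    # Placeholder black/gray/white board values for one band (illustrative only).
    dn_boards = np.array([120.0, 980.0, 2400.0])
    refl_boards = np.array([0.03, 0.35, 0.85])
    gain, offset = empirical_line_fit(dn_boards, refl_boards)

    dn_band = np.array([[300, 1500], [900, 2100]])
    print(apply_empirical_line(dn_band, gain, offset))
    ```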

  8. Earth Observations from the International Space Station: Benefits for Humanity

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.

    2015-01-01

    The International Space Station (ISS) is a unique terrestrial remote sensing platform for observation of the Earth's land surface, oceans, and atmosphere. Unlike automated remote-sensing platforms it has a human crew; is equipped with both internal and externally-mounted active and passive remote sensing instruments; and has an inclined, low-Earth orbit that provides variable views and lighting (day and night) over 95 percent of the inhabited surface of the Earth. As such, it provides a useful complement to autonomous, sun-synchronous sensor systems in higher altitude polar orbits. Beginning in May 2012, NASA ISS sensor systems have been available to respond to requests for data through the International Charter, Space and Major Disasters, also known as the "International Disaster Charter" or IDC. Data from digital handheld cameras, multispectral, and hyperspectral imaging systems has been acquired in response to IDC activations and delivered to requesting agencies through the United States Geological Survey. The characteristics of the ISS for Earth observation will be presented, including past, current, and planned NASA, International Partner, and commercial remote sensing systems. The role and capabilities of the ISS for humanitarian benefit, specifically collection of remotely sensed disaster response data, will be discussed.

  9. Research on enhancing the utilization of digital multispectral data and geographic information systems in global habitability studies

    NASA Technical Reports Server (NTRS)

    Martinko, E. A.; Merchant, J. W.

    1986-01-01

    The University of Kansas Applied Remote Sensing (KARS) program is engaged in a continuing long term research and development effort designed to reveal and facilitate new applications of remote sensing technology for decision makers in governmental agencies and private firms. Some objectives of the program follow: the development of new modes of analyzing multispectral scanner, aerial camera, thermal scanner, and radar data, singly or in concert, in order to more effectively use these systems; the merging of data derived from remote sensing with data derived from conventional sources in geographic information systems to facilitate better environmental planning; stimulation of the application of the products of remote sensing systems to problems of resource management and environmental quality now being addressed in NASA's Global Habitability directive; the application of remote sensing techniques and analysis and geographic information systems technology to the solution of significant concerns of state and local officials and private industry; and the guidance, assistance and stimulation of faculty, staff and students in the utilization of information from the Earth Resources Satellite (LANDSAT) and Aircraft Programs of NASA in research, education, and public service activities carried out at the University of Kansas.

  10. Image processing methods in two and three dimensions used to animate remotely sensed data. [cloud cover

    NASA Technical Reports Server (NTRS)

    Hussey, K. J.; Hall, J. R.; Mortensen, R. A.

    1986-01-01

    Image processing methods and software used to animate nonimaging remotely sensed data on cloud cover are described. Three FORTRAN programs were written in the VICAR2/TAE image processing domain to perform 3D perspective rendering, to interactively select parameters controlling the projection, and to interpolate parameter sets for animation images between key frames. Operation of the 3D programs and transferring the images to film is automated using executive control language and custom hardware to link the computer and camera.

  11. Curiosity on Tilt Table with Mast Up

    NASA Image and Video Library

    2011-03-25

    The Mast Camera (Mastcam) on NASA's Mars rover Curiosity has two rectangular eyes near the top of the rover's remote sensing mast. This image shows Curiosity on a tilt table at NASA's Jet Propulsion Laboratory, Pasadena, California.

  12. Remote sensing systems – Platforms and sensors: Aerial, satellites, UAVs, optical, radar, and LiDAR: Chapter 1

    USGS Publications Warehouse

    Panda, Sudhanshu S.; Rao, Mahesh N.; Thenkabail, Prasad S.; Fitzerald, James E.

    2015-01-01

    The American Society of Photogrammetry and Remote Sensing defined remote sensing as the measurement or acquisition of information of some property of an object or phenomenon, by a recording device that is not in physical or intimate contact with the object or phenomenon under study (Colwell et al., 1983). Environmental Systems Research Institute (ESRI) in its geographic information system (GIS) dictionary defines remote sensing as “collecting and interpreting information about the environment and the surface of the earth from a distance, primarily by sensing radiation that is naturally emitted or reflected by the earth’s surface or from the atmosphere, or by sending signals transmitted from a device and reflected back to it (ESRI, 2014).” The usual source of passive remote sensing data is the measurement of reflected or transmitted electromagnetic radiation (EMR) from the sun across the electromagnetic spectrum (EMS); this can also include acoustic or sound energy, gravity, or the magnetic field from or of the objects under consideration. In this context, the simple act of reading this text is considered remote sensing. In this case, the eye acts as a sensor and senses the light reflected from the object to obtain information about the object. It is the same technology used by a handheld camera to take a photograph of a person or a distant scenic view. Active remote sensing, however, involves sending a pulse of energy and then measuring the returned energy through a sensor (e.g., Radio Detection and Ranging [RADAR], Light Detection and Ranging [LiDAR]). Thermal sensors measure emitted energy by different objects. Thus, in general, passive remote sensing involves the measurement of solar energy reflected from the Earth’s surface, while active remote sensing involves synthetic (man-made) energy pulsed at the environment and the return signals are measured and recorded.

  13. Modeling and Simulation of High Resolution Optical Remote Sensing Satellite Geometric Chain

    NASA Astrophysics Data System (ADS)

    Xia, Z.; Cheng, S.; Huang, Q.; Tian, G.

    2018-04-01

    High resolution satellites with longer focal lengths and larger apertures have been widely used for georeferencing of observed scenes in recent years. A consistent end-to-end model of the high resolution remote sensing satellite geometric chain is presented, which consists of the scene, the three-line-array camera, the platform including attitude and position information, the time system and the processing algorithm. The integrated design of the camera and the star tracker is considered, and a simulation method for the geolocation accuracy is put forward by introducing a new index: the angle between the camera and the star tracker. The model is validated rigorously by geolocation accuracy simulation according to the test method of ZY-3 satellite imagery. The simulation results show that the geolocation accuracy is within 25 m, which is highly consistent with the test results. The geolocation accuracy can be improved by about 7 m with the integrated design. The model, combined with the simulation method, is applicable to geolocation accuracy estimation before satellite launch.

  14. A review of future remote sensing satellite capabilities

    NASA Technical Reports Server (NTRS)

    Calabrese, M. A.

    1980-01-01

    Existing, planned and future NASA capabilities in the field of remote sensing satellites are reviewed in relation to the use of remote sensing techniques for the identification of irrigated lands. The status of the currently operational Landsat 2 and 3 satellites is indicated, and it is noted that Landsat D is scheduled to be in operation in two years. The orbital configuration and instrumentation of Landsat D are discussed, with particular attention given to the thematic mapper, which is expected to improve capabilities for small field identification and crop discrimination and classification. Future possibilities are then considered, including a multi-spectral resource sampler supplying high spatial and temporal resolution data possibly based on push-broom scanning, Shuttle-maintained Landsat follow-on missions, a satellite to obtain high-resolution stereoscopic data, further satellites providing all-weather radar capability and the Large Format Camera.

  15. Intercomparison of phenological transition dates derived from the PhenoCam Dataset V1.0 and MODIS satellite remote sensing.

    PubMed

    Richardson, Andrew D; Hufkens, Koen; Milliman, Tom; Frolking, Steve

    2018-04-09

    Phenology is a valuable diagnostic of ecosystem health, and has applications to environmental monitoring and management. Here, we conduct an intercomparison analysis using phenological transition dates derived from near-surface PhenoCam imagery and MODIS satellite remote sensing. We used approximately 600 site-years of data, from 128 camera sites covering a wide range of vegetation types and climate zones. During both "greenness rising" and "greenness falling" transition phases, we found generally good agreement between PhenoCam and MODIS transition dates for agricultural, deciduous forest, and grassland sites, provided that the vegetation in the camera field of view was representative of the broader landscape. The correlation between PhenoCam and MODIS transition dates was poor for evergreen forest sites. We discuss potential reasons (including sub-pixel spatial heterogeneity, flexibility of the transition date extraction method, vegetation index sensitivity in evergreen systems, and PhenoCam geolocation uncertainty) for varying agreement between time series of vegetation indices derived from PhenoCam and MODIS imagery. This analysis increases our confidence in the ability of satellite remote sensing to accurately characterize seasonal dynamics in a range of ecosystems, and provides a basis for interpreting those dynamics in the context of tangible phenological changes occurring on the ground.
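
    A common way to derive such transition dates from a camera greenness time series is to smooth the index, rescale it to its seasonal amplitude, and record the day it crosses a fixed threshold (for example 50%) during green-up. The sketch below implements that generic threshold approach on a synthetic curve; it illustrates the idea only and is not the PhenoCam or MODIS extraction algorithm used in the paper.

    ```python
    import numpy as np

    def greenup_transition_day(doy, gcc, threshold=0.5, window=7):
        """Estimate the green-up transition day from a greenness (e.g., GCC) series.

        The series is smoothed with a moving average, rescaled to 0-1 using its
        seasonal minimum and maximum, and the first day the rescaled curve crosses
        `threshold` is returned via linear interpolation. Assumes the series starts
        below the threshold and rises through it.
        """
        kernel = np.ones(window) / window
        smooth = np.convolve(gcc, kernel, mode="valid")        # drop edge-affected samples
        days = doy[window // 2 : window // 2 + smooth.size]
        scaled = (smooth - smooth.min()) / (smooth.max() - smooth.min())
        i = int(np.argmax(scaled >= threshold))                # first index at/above threshold
        frac = (threshold - scaled[i - 1]) / (scaled[i] - scaled[i - 1])
        return float(days[i - 1] + frac * (days[i] - days[i - 1]))

    # Synthetic spring green-up curve (illustrative only).
    doy = np.arange(1, 181)
    gcc = 0.33 + 0.08 / (1.0 + np.exp(-(doy - 120) / 8.0))
    print(round(greenup_transition_day(doy, gcc), 1))
    ```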

  16. In-Situ Cameras for Radiometric Correction of Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Kautz, Jess S.

    The atmosphere distorts the spectrum of remotely sensed data, negatively affecting all forms of investigating Earth's surface. To gather reliable data, it is vital that atmospheric corrections are accurate. The current state of the field of atmospheric correction does not account well for the benefits and costs of different correction algorithms. Ground spectral data are required to evaluate these algorithms better. This dissertation explores using cameras as radiometers as a means of gathering ground spectral data. I introduce techniques to implement a camera system for atmospheric correction using off-the-shelf parts. To aid the design of future camera systems for radiometric correction, methods for estimating the system error prior to construction, and the calibration and testing of the resulting camera system, are explored. Simulations are used to investigate the relationship between the reflectance accuracy of the camera system and the quality of atmospheric correction. In the design phase, read noise and filter choice are found to be the strongest sources of system error. I explain the calibration methods for the camera system, showing the problems of pixel-to-angle calibration and of adapting the web camera for scientific work. The camera system is tested in the field to estimate its ability to recover directional reflectance from BRF data. I estimate the error in the system due to the experimental set-up, then explore how the system error changes with different cameras, environmental set-ups and inversions. With these experiments, I learn about the importance of the dynamic range of the camera and the input ranges used for the PROSAIL inversion. Evidence that the camera can perform within the specification set for ELM correction in this dissertation is evaluated. The analysis is concluded by simulating an ELM correction of a scene using various numbers of calibration targets and levels of system error, to find the number of cameras needed for a full-scale implementation.

  17. Testbed for remote telepresence research

    NASA Astrophysics Data System (ADS)

    Adnan, Sarmad; Cheatham, John B., Jr.

    1992-11-01

    Teleoperated robots offer solutions to problems associated with operations in remote and unknown environments, such as space. Teleoperated robots can perform tasks related to inspection, maintenance, and retrieval. A video camera can provide some assistance in teleoperation, but for fine manipulation and control, a telepresence system that gives the operator a sense of actually being at the remote location is more desirable. A telepresence system composed of a head-tracking stereo camera system, a kinematically redundant arm, and an omnidirectional mobile robot has been developed in the mechanical engineering department at Rice University. This paper describes the design and implementation of this system, its control hardware, and software. The mobile omnidirectional robot has three independent degrees of freedom that permit independent control of translation and rotation, thereby simulating a free-flying robot in a plane. The kinematically redundant robot arm has eight degrees of freedom that assist in obstacle and singularity avoidance. The on-board control computers permit control of the robot from the dual hand controllers via a radio modem system. A head-mounted display system provides the user with a stereo view from a pair of cameras attached to the mobile robotic system. The head-tracking camera system moves stereo cameras mounted on a three-degree-of-freedom platform to coordinate with the operator's head movements. This telepresence system provides a framework for research in remote telepresence and teleoperation for space.

  18. Determining wildlife use of wildlife crossing structures under different scenarios.

    DOT National Transportation Integrated Search

    2012-05-01

    This research evaluated Utah's wildlife crossing structures to help UDOT and the Utah Division of Wildlife Resources assess crossing efficacy. In this study, remote motion-sensed cameras were used at 14 designated wildlife crossing culverts and bri...

  19. A mobile device-based imaging spectrometer for environmental monitoring by attaching a lightweight small module to a commercial digital camera.

    PubMed

    Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing

    2017-11-15

    Spatially-explicit data are essential for remote sensing of ecological phenomena. Recent innovations in mobile device platforms have led to an upsurge in on-site rapid detection. For instance, CMOS chips in smart phones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and mounted on a single-lens reflex camera. Utilizing this lightweight module, as well as commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging experiments, including ocean (or lake) water-color sensing and plant reflectance measurement. Based on the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial cameras and underwater cameras. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation in many applications.

  20. Support for the Naval Research Laboratory Environmental Passive Microwave Remote Sensing Program.

    DTIC Science & Technology

    1983-04-29

    L. H. Gesell, Project Manager. ABSTRACT: This document summarizes the data acquisition, reduction, and ... film camera, and other environmental sensors. CSC gradually assumed the bulk of the responsibility for operating this equipment. This included running ... radiometers, and setting up and operating the strip-film camera and other environmental sensors. Also of significant importance to the missions was

  1. High resolution remote sensing missions of a tethered satellite

    NASA Technical Reports Server (NTRS)

    Vetrella, S.; Moccia, A.

    1986-01-01

    The application of the Tethered Satellite (TS) as an operational remote sensing platform is studied. It represents a new platform capable of covering the altitudes between airplanes and free-flying satellites, offering an adequate lifetime, high geometric and radiometric resolution, and improved cartographic accuracy. Two operational remote sensing missions are proposed: one using two linear array systems for along-track stereoscopic observation and one using a synthetic aperture radar combined with an interferometric technique. These missions can significantly improve the accuracy of future real-time cartographic systems from space and, in the case of active microwave systems, also allow Earth observation both in adverse weather and at any time, day or night. Furthermore, a simulation program is described in which, in order to examine carefully the potential of the TS as a new remote sensing platform, the orbital and attitude dynamics description of the Tethered Satellite System is integrated with the sensor viewing geometry, the Earth's ellipsoid, the atmospheric effects, the Sun illumination, and the digital elevation model. A preliminary experiment has been proposed which consists of a metric camera to be deployed downwards during the second Shuttle demonstration flight.

  2. Information recovery through image sequence fusion under wavelet transformation

    NASA Astrophysics Data System (ADS)

    He, Qiang

    2010-04-01

    Remote sensing is widely applied to provide information about areas with limited ground access, with applications such as assessing the destruction from natural disasters and planning relief and recovery operations. However, the collection of aerial digital images is constrained by bad weather, atmospheric conditions, and unstable cameras or camcorders. Therefore, how to recover information from low-quality remote sensing images and how to enhance image quality become very important for many visual understanding tasks, such as feature detection, object segmentation, and object recognition. The quality of remote sensing imagery can be improved through meaningful combination of images captured from different sensors or under different conditions, that is, through information fusion. Here we particularly address information fusion for remote sensing images under multi-resolution analysis of the employed image sequences. Image fusion recovers more complete information by integrating multiple images captured from the same scene. Through image fusion, a new image that is higher-resolution or more interpretable for humans and machines is created from a time series of low-quality images, based on image registration between different video frames.
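
    To make the multi-resolution idea concrete, the sketch below fuses two co-registered grayscale frames in the wavelet domain with PyWavelets, averaging the approximation coefficients and keeping the detail coefficients of larger magnitude. This is a generic fusion rule written for illustration under those assumptions, not the author's specific algorithm.

    ```python
    import numpy as np
    import pywt

    def wavelet_fuse(img_a, img_b, wavelet="db2", level=2):
        """Fuse two co-registered, same-size grayscale images in the wavelet
        domain: average the approximation band, keep the larger-magnitude
        detail coefficients, then reconstruct."""
        ca = pywt.wavedec2(np.asarray(img_a, float), wavelet, level=level)
        cb = pywt.wavedec2(np.asarray(img_b, float), wavelet, level=level)

        fused = [(ca[0] + cb[0]) / 2.0]                  # approximation coefficients
        for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
            fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                               for x, y in ((ha, hb), (va, vb), (da, db))))
        return pywt.waverec2(fused, wavelet)
    ```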

  3. Introduction and Testing of a Monitoring and Colony-Mapping Method for Waterbird Populations That Uses High-Speed and Ultra-Detailed Aerial Remote Sensing

    PubMed Central

    Bakó, Gábor; Tolnai, Márton; Takács, Ádám

    2014-01-01

    Remote sensing is a method that collects data of the Earth's surface without causing disturbances. Thus, it is worthwhile to use remote sensing methods to survey endangered ecosystems, as the studied species will behave naturally while undisturbed. The latest passive optical remote sensing solutions permit surveys from long distances. State-of-the-art highly sensitive sensor systems allow high spatial resolution image acquisition at high altitudes and at high flying speeds, even in low-visibility conditions. As the aerial imagery captured by an airplane covers the entire study area, all the animals present in that area can be recorded. A population assessment is conducted by visual interpretations of an ortho image map. The basic objective of this study is to determine whether small- and medium-sized bird species are recognizable in the ortho images by using high spatial resolution aerial cameras. The spatial resolution needed for identifying the bird species in the ortho image map was studied. The survey was adjusted to determine the number of birds in a colony at a given time. PMID:25046012

  19. Real-Time and Post-Processed Georeferencing for Hyperspectral Drone Remote Sensing

    NASA Astrophysics Data System (ADS)

    Oliveira, R. A.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E.

    2018-05-01

    The use of drones and photogrammetric technologies is increasing rapidly in different applications. Currently, drone processing workflows are in most cases based on sequential image acquisition and post-processing, but there is great interest in real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and in forestry. Recent developments in miniaturized and low-cost inertial measurement systems and GNSS sensors, and real-time kinematic (RTK) positioning, are offering new perspectives for comprehensive remote sensing applications. The combination of these sensors with lightweight and low-cost multi- or hyperspectral frame sensors on drones provides the opportunity to create near real-time or real-time remote sensing data of the target object. We have developed an onboard direct-georeferencing system for drones to be used in combination with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing by comparing it with post-processed solutions. Experimental data sets were captured in agricultural and forested test sites using the system. The accuracy of the onboard georeferencing data was better than 0.5 m. The results showed that real-time remote sensing is promising and feasible in both test sites.

  5. Comparing near-earth and satellite remote sensing based phenophase estimates: an analysis using multiple webcams and MODIS (Invited)

    NASA Astrophysics Data System (ADS)

    Hufkens, K.; Richardson, A. D.; Migliavacca, M.; Frolking, S. E.; Braswell, B. H.; Milliman, T.; Friedl, M. A.

    2010-12-01

    In recent years several studies have used digital cameras and webcams to monitor green leaf phenology. Such "near-surface" remote sensing has been shown to be a cost-effective means of accurately capturing phenology. Specifically, it allows for accurate tracking of intra- and inter-annual phenological dynamics at high temporal frequency and over broad spatial scales compared to visual observations or tower-based fAPAR and broadband NDVI measurements. Near-surface remote sensing measurements therefore show promise for bridging the gap between traditional in-situ measurements of phenology and satellite remote sensing data. For this work, we examined the relationship between phenophase estimates derived from satellite remote sensing (MODIS) and near-earth remote sensing derived from webcams for a select set of sites with high-quality webcam data. A logistic model was used to characterize phenophases for both the webcam and MODIS data. We documented model fit accuracy, phenophase estimates, and model biases for both data sources. Our results show that different vegetation indices (VIs) derived from MODIS produce significantly different phenophase estimates compared to corresponding estimates derived from webcam data. Different VIs showed markedly different radiometric properties and, as a result, influenced phenophase estimates. The study shows that phenophase estimates are not only highly dependent on the algorithm used but also depend on the VI used by the phenology retrieval algorithm. These results highlight the need for a better understanding of how near-earth and satellite remote sensing data relate to eco-physiological and canopy changes during different parts of the growing season.
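
    As an illustration of the kind of logistic phenophase model described here, the sketch below fits a four-parameter sigmoid to a greenness or vegetation index time series with SciPy and reports its inflection point as a green-up date. The functional form and parameter names are assumptions for illustration, not the authors' exact model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, baseline, amplitude, k, t0):
        """Four-parameter logistic greenness model (illustrative form)."""
        return baseline + amplitude / (1.0 + np.exp(-k * (t - t0)))

    def fit_phenophase(doy, vi):
        """Fit the logistic model to a spring VI/greenness series; the
        inflection point t0 serves as a simple green-up date estimate."""
        doy, vi = np.asarray(doy, float), np.asarray(vi, float)
        p0 = [np.nanmin(vi), np.nanmax(vi) - np.nanmin(vi), 0.1, np.nanmedian(doy)]
        params, _ = curve_fit(logistic, doy, vi, p0=p0, maxfev=10000)
        return params[3], params            # (transition date, all parameters)
    ```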

  6. The Short Wave Aerostat-Mounted Imager (SWAMI): A novel platform for acquiring remotely sensed data from a tethered balloon

    USGS Publications Warehouse

    Vierling, L.A.; Fersdahl, M.; Chen, X.; Li, Z.; Zimmerman, P.

    2006-01-01

    We describe a new remote sensing system called the Short Wave Aerostat-Mounted Imager (SWAMI). The SWAMI is designed to acquire co-located video imagery and hyperspectral data to study basic remote sensing questions and to link landscape-level trace gas fluxes with spatially and temporally appropriate spectral observations. The SWAMI can fly at altitudes up to 2 km above ground level to bridge the spatial gap between radiometric measurements collected near the surface and those acquired by other aircraft or satellites. The SWAMI platform consists of a dual channel hyperspectral spectroradiometer, video camera, GPS, thermal infrared sensor, and several meteorological and control sensors. All SWAMI functions (e.g. data acquisition and sensor pointing) can be controlled from the ground via wireless transmission. Sample data from the platform are presented, along with several potential scientific applications of SWAMI data.

  7. Research on airborne infrared leakage detection of natural gas pipeline

    NASA Astrophysics Data System (ADS)

    Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie

    2011-12-01

    An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can map traces of methane on the ground at high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, a camera, a helicopter mount, an electronic unit with a DGPS antenna, a notebook computer, and a pilot monitor, and it is mounted on a helicopter. The principle and the architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne method is suitable for detecting pipeline gas leaks over plains, deserts, and hills, but is unsuitable for areas with large variations in elevation.

  8. Fast and compact internal scanning CMOS-based hyperspectral camera: the Snapscan

    NASA Astrophysics Data System (ADS)

    Pichette, Julien; Charle, Wouter; Lambrechts, Andy

    2017-02-01

    Imec has developed a process for the monolithic integration of optical filters on top of CMOS image sensors, leading to compact, cost-efficient, and faster hyperspectral cameras. Linescan cameras are typically used in remote sensing or for conveyor belt applications, but translation of the target is not always possible for large objects or in many medical applications. Therefore, we introduce a novel camera, the Snapscan (patent pending), which exploits internal movement of a linescan sensor to enable fast and convenient acquisition of high-resolution hyperspectral cubes (up to 2048 x 3652 x 150 over the 475-925 nm spectral range). The Snapscan combines the spectral and spatial resolutions of a linescan system with the convenience of a snapshot camera.

  9. EXPERIMENTS IN LITHOGRAPHY FROM REMOTE SENSOR IMAGERY.

    USGS Publications Warehouse

    Kidwell, R. H.; McSweeney, J.; Warren, A.; Zang, E.; Vickers, E.

    1983-01-01

    Imagery from remote sensing systems such as the Landsat multispectral scanner and return beam vidicon, as well as synthetic aperture radar and conventional optical camera systems, contains information at resolutions far in excess of that which can be reproduced by the lithographic printing process. The data often require special handling to produce both standard and special map products. Some conclusions have been drawn regarding processing techniques, procedures for production, and printing limitations.

  10. Remotely monitoring evaporation rate and soil water status using thermal imaging and "three-temperatures model (3T Model)" under field-scale conditions.

    PubMed

    Qiu, Guo Yu; Zhao, Ming

    2010-03-01

    Remote monitoring of soil evaporation and soil water status is necessary for water resource and environment management. Ground-based remote sensing can be the bridge between satellite remote sensing and ground-based point measurements. The primary objective of this study is to provide an algorithm to estimate evaporation and soil water status by remote sensing and to verify its accuracy. Observations were carried out in a flat field with varied soil water content. High-resolution thermal images were taken with a thermal camera; soil evaporation was measured with a weighing lysimeter; weather data were recorded at a nearby meteorological station. Based on the thermal imaging and the three-temperatures model (3T model), we developed an algorithm to estimate soil evaporation and soil water status. The required parameters of the proposed method were soil surface temperature, air temperature, and solar radiation. Using the proposed method, daily variation in soil evaporation was estimated. Meanwhile, soil water status was remotely monitored by using the soil evaporation transfer coefficient. Results showed that the daily variation trends of measured and estimated evaporation agreed with each other, with a regression line of y = 0.92x and a coefficient of determination R^2 = 0.69. The simplicity of the proposed method makes the 3T model a potentially valuable tool for remote sensing.

  11. Robust and adaptive band-to-band image transform of UAS miniature multi-lens multispectral camera

    NASA Astrophysics Data System (ADS)

    Jhan, Jyun-Ping; Rau, Jiann-Yeou; Haala, Norbert

    2018-03-01

    Utilizing miniature multispectral (MS) or hyperspectral (HS) cameras by mounting them on an Unmanned Aerial System (UAS) has the benefits of convenience and flexibility for collecting remote sensing imagery for precision agriculture, vegetation monitoring, and environmental investigation applications. Most miniature MS cameras adopt a multi-lens structure to record discrete MS bands of visible and invisible information. The differences in lens distortion, mounting positions, and viewing angles among lenses mean that the acquired original MS images have significant band misregistration errors. We have developed a Robust and Adaptive Band-to-Band Image Transform (RABBIT) method for the band co-registration of various types of miniature multi-lens multispectral cameras (Mini-MSCs) to obtain band co-registered MS imagery for remote sensing applications. RABBIT utilizes a modified projective transformation (MPT) to transfer the multiple image geometries of a multi-lens imaging system to one sensor geometry, and combines this with a robust and adaptive correction (RAC) procedure to correct several systematic errors and to obtain sub-pixel accuracy. This study applies three state-of-the-art Mini-MSCs to evaluate the RABBIT method's performance, specifically the Tetracam Miniature Multiple Camera Array (MiniMCA), MicaSense RedEdge, and Parrot Sequoia. Six MS datasets acquired at different target distances, dates, and locations are also used to prove its reliability and applicability. Results prove that RABBIT is feasible for different types of Mini-MSCs, with accurate, robust, and rapid image processing.
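
    RABBIT itself is described above only at a high level; as a loose illustration of the underlying idea of projective band-to-band co-registration, the sketch below estimates a homography between one band and a reference band from matched features and warps the band into the reference geometry using OpenCV. Feature matching across spectral bands is simply assumed to succeed here; this is not the authors' MPT/RAC pipeline.

    ```python
    import cv2
    import numpy as np

    def coregister_band(band, reference):
        """Warp one multispectral band onto a reference band with a projective
        transform (generic sketch; both bands are assumed to be 8-bit images)."""
        orb = cv2.ORB_create(4000)
        k1, d1 = orb.detectAndCompute(band, None)
        k2, d2 = orb.detectAndCompute(reference, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]

        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)   # robust estimate
        h, w = reference.shape[:2]
        return cv2.warpPerspective(band, H, (w, h))            # band in reference geometry
    ```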

  12. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

    Visibility and clarity of remotely sensed images acquired by consumer-grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog, or gaseous smoke particles, caused, for example, by an active bushfire at the time of data capture, will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow bands in the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or to assist search and rescue or similar applications that require high-resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution, using a single modified DSLR camera in conjunction with image processing techniques that effectively improve the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200 nm (R, G, and near-infrared) instead of the standard RGB bands (400-700 nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Data processed using our proposed method show significant visibility improvements compared with other existing solutions.

  13. Assessment of Various Remote Sensing Technologies in Biomass and Nitrogen Content Estimation Using AN Agricultural Test Field

    NASA Astrophysics Data System (ADS)

    Näsi, R.; Viljanen, N.; Kaivosoja, J.; Hakala, T.; Pandžić, M.; Markelin, L.; Honkavaara, E.

    2017-10-01

    Multispectral and hyperspectral imagery is usually acquired from satellite and aircraft platforms. Recently, miniaturized hyperspectral 2D frame cameras have shown great potential for precision agriculture estimation, and they can be combined with lightweight platforms such as drones. The drone platform is a flexible tool for environmental and agricultural remote sensing applications. The assessment and comparison of different platforms, such as satellites, aircraft, and drones, with different sensors, such as hyperspectral and RGB cameras, is an important task in order to understand the potential of the data provided by this equipment and to select the most appropriate option for the user's applications and requirements. In this context, open and permanent test fields are significant and helpful experimental environments, since they provide comparative data for different platforms, sensors, and users, and also allow multi-temporal analyses. The objective of this work was to investigate the feasibility of an open permanent test field in the context of precision agriculture. Satellite (Sentinel-2), aircraft, and drone data with hyperspectral and RGB cameras were assessed in this study to estimate biomass, using linear regression models and in-situ samples. Spectral data and 3D information were used and compared in different combinations to investigate the quality of the models. The biomass estimation accuracies using linear regression models were better than 90 % for the drone-based datasets. The results showed that using spectral and 3D features together improved the estimation model. However, estimation of nitrogen content was less accurate with the evaluated remote sensing sensors. The open and permanent test field proved suitable for providing accurate and reliable reference data for commercial users and farmers.
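
    A minimal sketch of the kind of linear regression modelling described here is given below: per-plot spectral features, 3D (canopy height) features, and their combination are each regressed against in-situ biomass and compared by cross-validated R^2. The arrays are placeholders and the feature names are hypothetical; the sketch only illustrates the comparison, not the study's actual models or data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    # Placeholder per-plot features; replace with real values, e.g. vegetation
    # indices / band means (spectral) and photogrammetric canopy heights (3D).
    rng = np.random.default_rng(0)
    X_spectral = rng.random((40, 4))
    X_3d = rng.random((40, 2))
    y_biomass = rng.random(40)          # in-situ dry biomass samples

    for name, X in [("spectral", X_spectral),
                    ("3D", X_3d),
                    ("spectral + 3D", np.hstack([X_spectral, X_3d]))]:
        r2 = cross_val_score(LinearRegression(), X, y_biomass, cv=5, scoring="r2")
        print(f"{name}: mean cross-validated R^2 = {r2.mean():.2f}")
    ```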

  14. Transferring Knowledge from a Bird's-Eye View - Earth Observation and Space Travels in Schools

    NASA Astrophysics Data System (ADS)

    Rienow, Andreas; Hodam, Henryk; Menz, Gunter; Voß, Kerstin

    2014-05-01

    In spring 2014, four commercial cameras will be transported by a Dragon spacecraft to the International Space Station (ISS) and mounted on the ESA Columbus laboratory. The cameras will deliver live Earth observation data from different angles. The "Columbus-Eye" project aims at distributing the video and image data produced by those cameras through a web portal. It should primarily serve as a learning portal for pupils, comprising teaching material built around the ISS Earth observation imagery. The pupils should be motivated to work with the images in order to learn about curriculum-relevant topics in the natural sciences. The material will be prepared based on the experiences of the FIS (German abbreviation for "Remote Sensing in Schools") project and its learning portal. Recognizing that in-depth use of satellite imagery can only be achieved by means of computer-aided learning methods, a sizeable number of e-learning materials in German and English have been created in the five years since the FIS kickoff. The talk presents the educational valorization of remote sensing data as well as its interactive implementation for teachers and pupils in both learning portals. It will be shown which possibilities the topic of remote sensing holds for teaching the regular curricula of Geography, Biology, Physics, Math, and Informatics. Besides the sequenced implementation into digital and interactive teaching units, examples of a richly illustrated encyclopedia as well as easy-to-use image processing tools are given. The presentation finally addresses the question of how synergies with space travel can be used to enhance the fascination of Earth observation imagery in the light of problem-based learning in everyday school lessons.

  15. Remote sensing of environmental impact of land use activities

    NASA Technical Reports Server (NTRS)

    Paul, C. K.

    1977-01-01

    The capability to monitor land cover, associated in the past with aerial film cameras and radar systems, was discussed in regard to aircraft and spacecraft multispectral scanning sensors. A proposed thematic mapper with greater spectral and spatial resolutions for the fourth LANDSAT is expected to usher in new environmental monitoring capability. In addition, continuing improvements in image classification by supervised and unsupervised computer techniques are being operationally verified for discriminating environmental impacts of human activities on the land. The benefits of employing remote sensing for this discrimination were shown to far outweigh the incremental costs of converting to an aircraft-satellite multistage system.

  16. Comparison of polarimetric cameras

    DTIC Science & Technology

    2017-03-01

    polarimetry field of science. Maxwell’s differential equations based on Faraday’s concepts put EM waves into transverse wave solutions. His theory of the...Dennis L. Goldstein, David B. Chenault, and Joseph A. Shaw. “Review of Passive Imaging Polarimetry for Remote Sensing Applications.” Applied Optics 45

  17. Low-cost multispectral imaging for remote sensing of lettuce health

    NASA Astrophysics Data System (ADS)

    Ren, David D. W.; Tripathi, Siddhant; Li, Larry K. B.

    2017-01-01

    In agricultural remote sensing, unmanned aerial vehicle (UAV) platforms offer many advantages over conventional satellite and full-scale airborne platforms. One of the most important advantages is their ability to capture high spatial resolution images (1-10 cm) on demand and at different viewing angles. However, UAV platforms typically rely on the use of multiple cameras, which can be costly and difficult to operate. We present the development of a simple low-cost imaging system for remote sensing of crop health and demonstrate it on lettuce (Lactuca sativa) grown in Hong Kong. To identify the optimal vegetation index, we recorded images of both healthy and unhealthy lettuce and used them as input to an expectation-maximization cluster analysis with a Gaussian mixture model. Results from unsupervised and supervised clustering show that, among four widely used vegetation indices, the blue wide-dynamic range vegetation index is the most accurate. This study shows that it is readily possible to design and build a remote sensing system capable of determining the health status of lettuce at a reasonably low cost.
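
    For illustration, the sketch below computes a blue wide-dynamic-range vegetation index from NIR and blue bands (the weighting coefficient of 0.1 is an assumed, commonly used value) and then clusters the per-pixel index values with a Gaussian mixture model fitted by expectation maximization, loosely mirroring the clustering approach described above. It is a sketch under those assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def bwdrvi(nir, blue, alpha=0.1):
        """Blue wide-dynamic-range vegetation index:
        (alpha*NIR - Blue) / (alpha*NIR + Blue); alpha ~ 0.1 is assumed here."""
        nir = np.asarray(nir, float)
        blue = np.asarray(blue, float)
        return (alpha * nir - blue) / (alpha * nir + blue + 1e-9)  # avoid /0

    def cluster_health(index_image, n_classes=2):
        """EM clustering (Gaussian mixture) of per-pixel index values into
        e.g. healthy/unhealthy classes; returns a label image."""
        values = index_image.reshape(-1, 1)
        gmm = GaussianMixture(n_components=n_classes, random_state=0).fit(values)
        return gmm.predict(values).reshape(index_image.shape)
    ```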

  18. A low-cost single-camera imaging system for aerial applicators

    USDA-ARS?s Scientific Manuscript database

    Agricultural aircraft provide a readily available and versatile platform for airborne remote sensing. Although various airborne imaging systems are available, most of these systems are either too expensive or too complex to be of practical use for aerial applicators. The objective of this study was ...

  19. Use of remote-sensing techniques to survey the physical habitat of large rivers

    USGS Publications Warehouse

    Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffery W.; Kennedy, Gregory W.; Smith, Stephen B.; Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffrey W.; Kennedy, Gregory W.; Smith, Stephen B.

    1997-01-01

    Remote-sensing techniques can be used to quantitatively characterize the physical habitat of large rivers in the United States, where traditional survey approaches typically used in small- and medium-sized streams and rivers would be ineffective or impossible to apply. The state-of-the-art remote-sensing technologies that we discuss here include side-scan sonar, RoxAnn, acoustic Doppler current profilers, remotely operated vehicles and camera systems, global positioning systems, and laser level survey systems. The use of these technologies will permit the collection of information needed to create computer visualizations and hard-copy maps and to generate quantitative databases that can be used in real time in the field to characterize the physical habitat at a study location of interest and to guide the distribution of sampling effort needed to address other habitat-related study objectives. This report augments habitat sampling and characterization guidance provided by Meador et al. (1993) and is intended for use primarily by U.S. Geological Survey National Water Quality Assessment program managers and scientists who are documenting water quality in streams and rivers of the United States.

  20. A Search-and-Rescue Robot System for Remotely Sensing the Underground Coal Mine Environment

    PubMed Central

    Gao, Junyao; Zhao, Fangzhou; Liu, Yi

    2017-01-01

    This paper introduces a search-and-rescue robot system used for remote sensing of the underground coal mine environment, which is composed of an operating control unit and two mobile robots with explosion-proof and waterproof functions. This robot system is designed to observe and collect information about the coal mine environment through remote control. Thus, the system can be regarded as a multifunction sensor that realizes remote sensing. When the robot system detects danger, it sends out signals to warn rescuers to keep away. Each robot consists of two gas sensors, two cameras, a two-way audio system, a 1-km-long fiber-optic cable for communication, and a mechanical explosion-proof manipulator. In particular, the manipulator is a novel explosion-proof design for clearing obstacles; it has three degrees of freedom but is driven by only two motors. Furthermore, the two robots, connected in series, can communicate with the operating control unit over a distance of 2 km. The development of the robot system may provide a reference for developing future search-and-rescue systems. PMID:29065560

  1. Sensing our Environment: Remote sensing in a physics classroom

    NASA Astrophysics Data System (ADS)

    Isaacson, Sivan; Schüttler, Tobias; Cohen-Zada, Aviv L.; Blumberg, Dan G.; Girwidz, Raimund; Maman, Shimrit

    2017-04-01

    Remote sensing is defined as data acquisition about an object without physical contact. Most remote sensing applications refer to the use of satellite- or aircraft-based sensor technologies to detect and classify objects, mainly on Earth or other planets. In recent years there have been efforts to bring the important subject of remote sensing into schools; however, most of these attempts focused on geography disciplines, restricting themselves to the applications of remote sensing and, to a lesser extent, the technique itself and the physics behind it. Optical remote sensing is based on physical principles and technical devices, which are very meaningful from a theoretical point of view as well as for hands-on teaching. Some main subjects are radiation, atomic and molecular physics, spectroscopy, as well as optics and the semiconductor technology used in modern digital cameras. Thus two objectives were outlined for this project: 1) to investigate the possibilities of using remote sensing techniques in physics teaching, and 2) to identify its impact on pupils' interest in the field of natural sciences. This joint project of the DLR_School_Lab Oberpfaffenhofen of the German Aerospace Center (DLR) and the Earth and Planetary Image Facility (EPIF) at BGU was conducted in 2016. Thirty teenagers (ages 16-18) participated in the project and were exposed to cutting-edge methods of Earth observation. The pupils on both sides participated in the project voluntarily, knowing that at least some of the project's work had to be done in their leisure time. The pupils' project started with a day at EPIF and DLR respectively, where the project task was explained to the participants and an introduction to remote sensing of vegetation was given. This was realized in lectures and in experimental workshops. During the following two months both groups took several measurements with modern optical remote sensing systems in their home region, with a special focus on flora. The teams then processed their data and presented it to their foreign partners for evaluation in a video conference call. Alongside exciting insights about their respective environments and living conditions, the young scientists had daily access to live satellite sensors and remote sensing through the DLR_School_Lab in Germany and the Earth and Planetary Image Facility in Israel. This paper provides an overview of the project, the techniques used, and the evaluation results following a pre-post questionnaire design, and above all demonstrates the use of remote sensing as an application for physics teaching in a significant learning environment.

  2. Application of EREP, LANDSAT, and aircraft image data to environmental problems related to coal mining

    NASA Technical Reports Server (NTRS)

    Amato, R. V.; Russell, O. R.; Martin, K. R.; Wier, C. E.

    1975-01-01

    Remote sensing techniques were used to study coal mining sites within the Eastern Interior Coal Basin (Indiana, Illinois, and western Kentucky), the Appalachian Coal Basin (Ohio, West Virginia, and Pennsylvania), and the anthracite coal basins of northeastern Pennsylvania. Remote sensor data evaluated during these studies were acquired by LANDSAT, Skylab, and both high- and low-altitude aircraft. Airborne sensors included multispectral scanners, multiband cameras, and standard mapping cameras loaded with panchromatic, color, and color infrared films. The research conducted in these areas is a useful prerequisite to the development of an operational monitoring system that can be periodically employed to supply state and federal regulatory agencies with supportive data. Further research, however, must be undertaken to systematically examine those mining processes and features that can be monitored cost-effectively using remote sensors and to determine what combination of sensors and ground sampling processes provides the optimum combination for an operational system.

  3. Optimization of spectral bands for hyperspectral remote sensing of forest vegetation

    NASA Astrophysics Data System (ADS)

    Dmitriev, Egor V.; Kozoderov, Vladimir V.

    2013-10-01

    Optimizing the choice of the most informative spectral channels in hyperspectral remote sensing data processing serves to enhance the efficiency of the high-performance computers employed. The problem of pattern recognition of remotely sensed land surface objects, with an emphasis on forests, is outlined from the point of view of optimizing the spectral channels used in processing hyperspectral images. The relevant computational procedures are tested using images obtained by a Russian-made hyperspectral camera installed on a gyro-stabilized platform for airborne flight campaigns. A Bayesian classifier is used for the pattern recognition of forests of different tree species and ages. The probabilistically optimal algorithm, constructed on the basis of the maximum likelihood principle, is described; it minimizes the probability of misclassification given by this classifier. The classification error is the main quantity used to estimate the accuracy of the applied algorithm, computed with the well-known holdout cross-validation method. Details of the related techniques are presented. Results are shown for selecting the spectral channels of the camera while processing the images, taking into account radiometric distortions that diminish the classification accuracy. Spectral channels are selected for the subclasses extracted using the proposed validation techniques, and confusion matrices are constructed that characterize the age composition of the classified pine species as well as broad age-class recognition for pine and birch species with fully illuminated crown parts.
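
    The channel-selection idea can be illustrated with a simple greedy search wrapped around a Gaussian (quadratic discriminant) classifier, scored by cross-validated accuracy, as sketched below. This is a generic stand-in for the Bayesian maximum-likelihood procedure described above, with hypothetical function names, not the authors' algorithm.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    def greedy_band_selection(X, y, n_bands):
        """Greedy forward selection of spectral channels for a Gaussian (QDA)
        classifier, scored by cross-validated accuracy.

        X: (n_pixels, n_channels) spectra; y: class labels (species/age groups).
        """
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < n_bands and remaining:
            scores = {b: cross_val_score(QuadraticDiscriminantAnalysis(),
                                         X[:, selected + [b]], y, cv=5).mean()
                      for b in remaining}
            best = max(scores, key=scores.get)   # channel giving the largest gain
            selected.append(best)
            remaining.remove(best)
        return selected
    ```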

  4. Near-field Oblique Remote Sensing of Stream Water-surface Elevation, Slope, and Surface Velocity

    NASA Astrophysics Data System (ADS)

    Minear, J. T.; Kinzel, P. J.; Nelson, J. M.; McDonald, R.; Wright, S. A.

    2014-12-01

    A major challenge for estimating discharges during flood events or in steep channels is the difficulty and hazard inherent in obtaining in-stream measurements. One possible solution is to use near-field remote sensing to obtain simultaneous water-surface elevations, slope, and surface velocities. In this test case, we utilized Terrestrial Laser Scanning (TLS) to remotely measure water-surface elevations and slope in combination with surface velocities estimated from particle image velocimetry (PIV) obtained by video camera and/or infrared camera. We tested this method at several sites in New Mexico and Colorado using independent validation data consisting of in-channel measurements from survey-grade GPS and Acoustic Doppler Current Profiler (ADCP) instruments. Preliminary results indicate that for relatively turbid or steep streams, TLS collects tens of thousands of water-surface elevations and slopes in minutes, much faster than conventional means and at relatively high precision, at least as good as continuous survey-grade GPS measurements. Estimated surface velocities from this technique are within 15% of measured velocity magnitudes and within 10 degrees of the measured velocity direction (using extrapolation from the shallowest bin of the ADCP measurements). Accurately aligning the PIV results into Cartesian coordinates appears to be one of the main sources of error, primarily due to the sensitivity at these shallow oblique look angles and the low numbers of stationary objects for rectification. Combining remotely sensed water-surface elevations, slope, and surface velocities produces simultaneous velocity measurements from a large number of locations in the channel and is more spatially extensive than traditional velocity measurements. These factors make this technique useful for improving estimates of flow measurements during flood flows and in steep channels while also decreasing the difficulty and hazard associated with making measurements in these conditions.
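
    As a rough illustration of the PIV step, the sketch below finds the displacement of a small interrogation window between two frames from the peak of their cross-correlation surface; scaling by the ground sampling distance and frame interval then gives a surface velocity. This is a textbook cross-correlation PIV sketch, not the processing chain used in the study.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def piv_displacement(win_a, win_b):
        """Pixel displacement (dy, dx) of an interrogation window between two
        frames, taken from the peak of the cross-correlation surface.
        Both windows are assumed to have the same shape."""
        a = win_a - win_a.mean()
        b = win_b - win_b.mean()
        corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # cross-correlation
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        dy = peak[0] - (win_a.shape[0] - 1)                  # zero lag at center
        dx = peak[1] - (win_a.shape[1] - 1)
        return dy, dx

    # Surface velocity = displacement * ground sampling distance / frame interval,
    # after rectifying the oblique view into Cartesian (map) coordinates.
    ```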

  5. Ultraviolet Imaging with Low Cost Smartphone Sensors: Development and Application of a Raspberry Pi-Based UV Camera.

    PubMed

    Wilkes, Thomas C; McGonigle, Andrew J S; Pering, Tom D; Taggart, Angus J; White, Benjamin S; Bryant, Robert G; Willmott, Jon R

    2016-10-06

    Here, we report, for what we believe to be the first time, on the modification of a low cost sensor, designed for the smartphone camera market, to develop an ultraviolet (UV) camera system. This was achieved via adaptation of Raspberry Pi cameras, which are based on back-illuminated complementary metal-oxide semiconductor (CMOS) sensors, and we demonstrated the utility of these devices for applications at wavelengths as low as 310 nm, by remotely sensing power station smokestack emissions in this spectral region. Given the very low cost of these units, ≈ USD 25, they are suitable for widespread proliferation in a variety of UV imaging applications, e.g., in atmospheric science, volcanology, forensics and surface smoothness measurements.

  6. Application of airborne hyperspectral remote sensing for the retrieval of forest inventory parameters

    NASA Astrophysics Data System (ADS)

    Dmitriev, Yegor V.; Kozoderov, Vladimir V.; Sokolov, Anton A.

    2016-04-01

    Collecting and updating forest inventory data play an important part in forest management. The data can be obtained directly by using accurate but inefficient ground-based methods, as well as from remote sensing measurements. We present applications of airborne hyperspectral remote sensing for the retrieval of such important inventory parameters as forest species and age composition. The hyperspectral images of the test region were obtained from an airplane equipped with a Russian-made lightweight airborne imaging spectrometer covering the visible and near-infrared spectral range and a high-resolution photo camera on the same gyro-stabilized platform. The quality of the thematic processing depends on many factors, such as the atmospheric conditions, the characteristics of the measuring instruments, and the corrections and preprocessing methods. The construction of the classifier, together with methods for reducing the feature space, also plays an important role. The performance of different spectral classification methods is analyzed for the problem of hyperspectral remote sensing of soil and vegetation. For the reduction of the feature space we used a previously proposed stable feature selection method. We demonstrate classification results for airborne hyperspectral images using a multiclass Support Vector Machine with a Gaussian kernel and a parametric Bayesian classifier based on a Gaussian mixture model, together with their comparative analysis.
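
    A minimal sketch of the kind of kernel SVM classification mentioned here is shown below: per-pixel spectra are standardized and classified with a multiclass RBF-kernel SVM, with accuracy reported on a held-out split. Hyperparameters and the train/test protocol are illustrative assumptions, not those of the paper.

    ```python
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def classify_spectra(X, y):
        """Multiclass SVM with Gaussian (RBF) kernel on per-pixel spectra.

        X: (n_pixels, n_channels) spectra; y: class labels (e.g. species/age).
        Returns the fitted pipeline and held-out classification accuracy."""
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  stratify=y, random_state=0)
        model = make_pipeline(StandardScaler(),
                              SVC(kernel="rbf", C=10.0, gamma="scale"))
        model.fit(X_tr, y_tr)
        return model, model.score(X_te, y_te)
    ```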

  7. Physical Characteristics of Arctic Clouds from Ground-based Remote-sensing with a Polarized Micro-Pulse Lidar and a 95-GHz Cloud Radar in Ny-Ålesund, Svalbard

    NASA Astrophysics Data System (ADS)

    Shiobara, M.; Takano, T.; Okamoto, H.; Yabuki, M.

    2015-12-01

    Clouds and aerosols are key elements with the potential to change climate through their radiative effects on the energy balance of the global climate system. In the Arctic, we have been carrying out ground-based remote-sensing measurements of clouds and aerosols using a sky radiometer, a micro-pulse lidar (MPL), and an all-sky camera in Ny-Ålesund (78.9N, 11.9E), Svalbard, since the early 2000s. In addition to these regular operations, several new measurements have been performed with a polarization MPL since August 2013, a 95 GHz Doppler cloud radar since September 2013, and a dual-frequency microwave radiometer since June 2014. An intensive field experiment for the study of cloud-aerosol-radiation interaction, named A-CARE (PI: J. Ukita), was conducted for water clouds during 23 June - 13 July 2014 and for mixed-phase clouds during 30 March - 23 April 2015 in Ny-Ålesund. The experiment consisted of ground-based remote-sensing and in-situ cloud microphysics measurements. In this paper, preliminary results from these remote-sensing measurements will be presented, particularly in regard to the physical characteristics of Arctic clouds based on collocated radar-lidar observations in Ny-Ålesund.

  8. Long-term monitoring on environmental disasters using multi-source remote sensing technique

    NASA Astrophysics Data System (ADS)

    Kuo, Y. C.; Chen, C. F.

    2017-12-01

    Environmental disasters are extreme events within the Earth system that cause deaths and injuries to humans, as well as damage and loss of valuable assets such as buildings, communication systems, farmland, and forests. In disaster management, a large amount of multi-temporal spatial data is required. Multi-source remote sensing data with different spatial, spectral, and temporal resolutions are widely applied to environmental disaster monitoring. With multi-source and multi-temporal high-resolution images, we conduct rapid, systematic, and serial observations of economic damage and environmental disasters on Earth, based on three monitoring platforms: remote sensing, UAS (Unmanned Aircraft Systems), and ground investigation. The advantages of UAS technology include great mobility, rapid real-time availability, and operation under more flexible weather conditions. The system can produce long-term spatial distribution information on environmental disasters, obtaining high-resolution remote sensing data and field verification data in key monitoring areas. It also supports the prevention and control of ocean pollution, illegally disposed waste, and pine pests at different scales. Meanwhile, digital photogrammetry can be applied, using the camera's interior and exterior orientation parameters, to produce Digital Surface Model (DSM) data. The latest terrain information is derived from the DSM data and can be used as a reference for future disaster recovery.

  9. Effects of Regolith Properties on UV/VIS Spectra and Implications for Lunar Remote Sensing

    NASA Astrophysics Data System (ADS)

    Coman, Ecaterina Oana

    Lunar regolith chemistry, mineralogy, various maturation factors, and grain size dominate the reflectance of the lunar surface at ultraviolet (UV) to visible (VIS) wavelengths. These regolith properties leave unique fingerprints on reflectance spectra in the form of varied spectral shapes, reflectance intensity values, and absorption bands. With the addition of returned lunar soils from the Apollo and Luna missions as ground truth, these spectral fingerprints can be used to derive maps of global lunar chemistry or mineralogy to analyze the range of basalt types on the Moon, their spatial distribution, and source regions for clues to lunar formation history and evolution. The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) is the first lunar imager to detect bands at UV wavelengths (321 and 360 nm) in addition to visible bands (415, 566, 604, 643, and 689 nm). This dissertation uses a combination of laboratory and remote sensing studies to examine the relation between TiO2 concentration and WAC UV/VIS spectral ratios and to test the effects of variations in lunar chemistry, mineralogy, and soil maturity on ultraviolet and visible wavelength reflectance. Chapter 1 presents an introduction to the dissertation that includes some background in lunar mineralogy and remote sensing. Chapter 2 covers coordinated analyses of returned lunar soils using UV-VIS spectroscopy, X-ray diffraction, and micro X-ray fluorescence. Chapter 3 contains comparisons of local and global remote sensing observations of the Moon using LROC WAC and Clementine UVVIS TiO2 detection algorithms and Lunar Prospector (LP) Gamma Ray Spectrometer (GRS)-derived FeO and TiO2 concentrations. While the data shows effects from maturity and FeO on the UV/VIS detection algorithm, a UV/VIS relationship remains a simple yet accurate method for TiO2 detection on the Moon.

  10. Passive detection of vehicle loading

    NASA Astrophysics Data System (ADS)

    McKay, Troy R.; Salvaggio, Carl; Faulring, Jason W.; Salvaggio, Philip S.; McKeown, Donald M.; Garrett, Alfred J.; Coleman, David H.; Koffman, Larry D.

    2012-01-01

    The Digital Imaging and Remote Sensing Laboratory (DIRS) at the Rochester Institute of Technology, along with the Savannah River National Laboratory, is investigating passive methods to quantify vehicle loading. The research described in this paper investigates multiple vehicle indicators, including brake temperature, tire temperature, engine temperature, acceleration and deceleration rates, engine acoustics, suspension response, tire deformation, and vibrational response. Our investigation into these variables includes building and implementing a sensing system for data collection as well as multiple full-scale vehicle tests. The sensing system includes infrared video cameras, triaxial accelerometers, microphones, video cameras, and thermocouples. The full-scale testing includes both a medium-size dump truck and a tractor-trailer truck on closed courses with loads spanning the full range of each vehicle's capacity. Statistical analysis of the collected data is used to determine the effectiveness of each of the indicators for characterizing the weight of a vehicle. The final sensing system will monitor multiple load indicators and combine the results to achieve a more accurate measurement than any of the indicators could provide alone.

  11. Remotely sensed geology from lander-based to orbital perspectives: Results of FIDO rover May 2000 field tests

    USGS Publications Warehouse

    Jolliff, B.; Knoll, A.; Morris, R.V.; Moersch, J.; McSween, H.; Gilmore, M.; Arvidson, R.; Greeley, R.; Herkenhoff, K.; Squyres, S.

    2002-01-01

    Blind field tests of the Field Integration Design and Operations (FIDO) prototype Mars rover were carried out 7-16 May 2000. A Core Operations Team (COT), sequestered at the Jet Propulsion Laboratory without knowledge of test site location, prepared command sequences and interpreted data acquired by the rover. Instrument sensors included a stereo panoramic camera, navigational and hazard-avoidance cameras, a color microscopic imager, an infrared point spectrometer, and a rock coring drill. The COT designed command sequences, which were relayed by satellite uplink to the rover, and evaluated instrument data. Using aerial photos and Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, and information from the rover sensors, the COT inferred the geology of the landing site during the 18 sol mission, including lithologic diversity, stratigraphic relationships, environments of deposition, and weathering characteristics. Prominent lithologic units were interpreted to be dolomite-bearing rocks, kaolinite-bearing altered felsic volcanic materials, and basalt. The color panoramic camera revealed sedimentary layering and rock textures, and geologic relationships seen in rock exposures. The infrared point spectrometer permitted identification of prominent carbonate and kaolinite spectral features and permitted correlations to outcrops that could not be reached by the rover. The color microscopic imager revealed fine-scale rock textures, soil components, and results of coring experiments. Test results show that close-up interrogation of rocks is essential to investigations of geologic environments and that observations must include scales ranging from individual boulders and outcrops (microscopic, macroscopic) to orbital remote sensing, with sufficient intermediate steps (descent images) to connect in situ and remote observations.

  12. Comparison of aerial imagery from manned and unmanned aircraft platforms for monitoring cotton growth

    USDA-ARS?s Scientific Manuscript database

    Unmanned aircraft systems (UAS) have emerged as a low-cost and versatile remote sensing platform in recent years, but little work has been done on comparing imagery from manned and unmanned platforms for crop assessment. The objective of this study was to compare imagery taken from multiple cameras ...

  13. Use of a UAV-mounted video camera to assess feeding behavior of Raramuri Criollo cows

    USDA-ARS?s Scientific Manuscript database

    Interest in use of unmanned aerial vehicles in science has increased in recent years. It is predicted that they will be a preferred remote sensing platform for applications that inform sustainable rangeland management in the future. The objective of this study was to determine whether UAV video moni...

  14. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) data, Large Format Camera color infrared photography, and black-and-white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  15. Application of remote sensing techniques to hydrography with emphasis on bathymetry. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Meireles, D. S.

    1980-01-01

    Remote sensing techniques are utilized for the determination of hydrographic characteristics, with emphasis on bathymetry. Two sensor systems were utilized: the Wild RC-10 metric camera and the Multispectral Scanner of the LANDSAT satellite (MSS-LANDSAT). From the metric camera photographs, data on the photographic density of points with known depth are obtained. A relationship between the variables density and depth is established through a linear regression. From this regression line, the depth of points with known photographic density is determined. The LANDSAT MSS images are interpreted automatically in the Iterative Multispectral Analysis System (I-100), yielding subareas of points with the same gray level. With some simplifications, it is assumed that the depth of a point is directly related to its gray level. Subareas with points of the same depth are then determined and isobathymetric curves are drawn. The coastline is obtained through the sensor systems already mentioned. Advantages and limitations of the techniques and of the sensor systems utilized are discussed, and the results are compared with ground truth.
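
    The density-to-depth calibration described here is essentially a straight-line fit; a minimal sketch under that reading is given below, with hypothetical function names: control points with known depth define the regression, which is then applied wherever only photographic density is measured.

    ```python
    import numpy as np

    def calibrate_density_depth(density_known, depth_known):
        """Fit depth = a * density + b from control points with known depth."""
        a, b = np.polyfit(np.asarray(density_known, float),
                          np.asarray(depth_known, float), deg=1)
        return a, b

    def predict_depth(density, a, b):
        """Estimate depth at points where only photographic density is measured."""
        return a * np.asarray(density, float) + b
    ```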

  16. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  17. Investigating the relationship between peat biogeochemistry and above-ground plant phenology with remote sensing along a gradient of permafrost thaw.

    NASA Astrophysics Data System (ADS)

    Garnello, A.; Dye, D. G.; Bogle, R.; Hough, M.; Raab, N.; Dominguez, S.; Rich, V. I.; Crill, P. M.; Saleska, S. R.

    2016-12-01

    Global climate models predict a 50%-85% decrease in permafrost area in northern regions by 2100 due to increased temperature and precipitation variability, potentially releasing large stores of carbon as greenhouse gases (GHG) through microbial activity. Linking belowground biogeochemical processes with observable aboveground plant dynamics would greatly increase the ability to track and model GHG emissions from permafrost thaw, but current research has yet to satisfactorily develop this link. We hypothesized that seasonal patterns in peatland biogeochemistry manifest themselves as observable plant phenology, due to the tight coupling resulting from plant-microbial interactions. We tested this by using an automated, tower-based camera to acquire daily composite (red, green, blue) and near-infrared (NIR) images of a thawing permafrost peatland site near Abisko, Sweden. The images encompassed a range of exposures, which were merged into high-dynamic-range images, a novel application in remote sensing of plant phenology. The 2016 growing-season camera images are accompanied by mid-to-late-season CH4 and CO2 fluxes measured from soil collars, and by early-, mid-, and late-season peat core samples characterizing the composition of microbial communities and key metabolic genes, and the organic matter and trace gas composition of peat porewater. Additionally, nearby automated gas flux chambers measured sub-hourly fluxes of CO2 and CH4 from the peat, which will also be incorporated into the analysis of relationships between seasonal camera-derived vegetation indices and gas fluxes from habitats with different vegetation types. While remote sensing is a proven method for observing plant phenology, this technology has yet to be combined with soil biogeochemical and microbial community data in regions of permafrost thaw. Establishing a high-resolution phenology monitoring system linked to soil biogeochemical processes in subarctic peatlands will advance the understanding of how observable patterns in plant phenology can be used to monitor permafrost thaw and ecosystem carbon cycling.

  18. Estimating time available for sensor fusion exception handling

    NASA Astrophysics Data System (ADS)

    Murphy, Robin R.; Rogers, Erika

    1995-09-01

    In previous work, we have developed a generate, test, and debug methodology for detecting, classifying, and responding to sensing failures in autonomous and semi-autonomous mobile robots. An important issue has arisen from these efforts: how much time is available to classify the cause of the failure and determine an alternative sensing strategy before the robot mission must be terminated? In this paper, we consider the impact of time for teleoperation applications where a remote robot attempts to autonomously maintain sensing in the presence of failures yet has the option to contact the local for further assistance. Time limits are determined by using evidential reasoning with a novel generalization of Dempster-Shafer theory. Generalized Dempster-Shafer theory is used to estimate the time remaining until the robot behavior must be suspended because of uncertainty; this becomes the time limit on autonomous exception handling at the remote. If the remote cannot complete exception handling in this time or needs assistance, responsibility is passed to the local, while the remote assumes a `safe' state. An intelligent assistant then facilitates human intervention, either directing the remote without human assistance or coordinating data collection and presentation to the operator within time limits imposed by the mission. The impact of time on exception handling activities is demonstrated using video camera sensor data.
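
    The record relies on a generalized Dempster-Shafer formulation that is not reproduced here; as background, the sketch below shows only the classical Dempster rule of combination over a small frame of discernment, with made-up sensor-health evidence.

      from itertools import product

      def combine(m1, m2):
          """Combine two mass functions (dicts: frozenset -> mass) with Dempster's rule."""
          combined, conflict = {}, 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb            # mass falling on the empty set
          if conflict >= 1.0:
              raise ValueError("total conflict; sources cannot be combined")
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      # Hypothetical sensor-health evidence from two independent cues
      FRAME = frozenset({"camera_ok", "camera_failed"})
      m_vision = {frozenset({"camera_failed"}): 0.6, FRAME: 0.4}
      m_motion = {frozenset({"camera_failed"}): 0.3,
                  frozenset({"camera_ok"}): 0.2, FRAME: 0.5}
      print(combine(m_vision, m_motion))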

  19. Airborne multicamera system for geo-spatial applications

    NASA Astrophysics Data System (ADS)

    Bachnak, Rafic; Kulkarni, Rahul R.; Lyle, Stacey; Steidley, Carl W.

    2003-08-01

    Airborne remote sensing has many applications that include vegetation detection, oceanography, marine biology, geographical information systems, and environmental coastal science analysis. Remotely sensed images, for example, can be used to study the aftermath of episodic events such as the hurricanes and floods that occur year-round in the Coastal Bend area of Corpus Christi. This paper describes an Airborne Multi-Spectral Imaging System that uses digital cameras to provide high resolution at very high rates. The software is based on Delphi 5.0 and IC Imaging Control's ActiveX controls. Both time and the GPS coordinates are recorded. Three successful test flights have been conducted so far. The paper presents flight test results and discusses the issues being addressed to fully develop the system.

  20. The analysis of changes in oxbow lakes characteristics using remote sensing data. A case study from Biebrza River in Poland.

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Chormanski, Jaroslaw

    2014-05-01

    The Biebrza River Valley is located in the north-eastern part of Poland. The Biebrza is a river of intermediate size with an almost natural character, and it has numerous oxbow lakes. The valley consists of three basins (Upper, Middle and Lower), which are characterized by different geomorphological structures. The Biebrza River Valley is an area of significant ecological importance, especially because it is one of the largest wetlands in Europe, consisting of almost undisturbed floodplain marshes and fens. The river is also characterised by low contamination levels and limited human influence. Because of these characteristics, the Biebrza River can be treated as a reference area for other floodplain and fen ecosystems in Europe. Since oxbow lakes are the least known part of river valleys, there is a need for more research on them. The objective of this study is the characterisation of oxbow lake water quality, and indirectly of oxbow lake state, using remote sensing methods. To achieve this objective, two remote sensing datasets were analysed: IKONOS imagery and data from the hyperspectral camera AISA. The utility of both data sources was compared and the temporal variability of the oxbow lakes was assessed. The first part of the remote sensing analysis used satellite images from the IKONOS satellite acquired on 20.07.2008 (images were taken from Biebrza National Park resources). All analyses were performed in ArcGIS 10.0 and ENVI 5.0. The second part of the image analysis was conducted with data acquired by the airborne hyperspectral camera AISA Eagle in August 2013. The oxbow lakes were described in terms of habitat state, transparency, degree of overgrowth, connectivity with the river, maximum area and maximum length. The general method of describing the oxbow lakes is the visually assessed habitat state, which is related to natural succession. Three main habitat states of oxbow lakes were designated: privileged (described as 'good'), eutrophic and disappearing. The results confirm that most of the oxbow lakes are disappearing or approaching disappearance. They also show the potential of remote sensing data for monitoring this type of water body. Because the first dataset was collected in 2008 and the second in 2013, changes in the oxbow lakes over these five years could be detected.

  1. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture.

    PubMed

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-07-10

    Sensing is an important element to quantify productivity, product quality and to make decisions. Applications, such as mapping, surveillance, exploration and precision agriculture, require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology.

  2. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture

    PubMed Central

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-01-01

    Sensing is an important element to quantify productivity, product quality and to make decisions. Applications, such as mapping, surveillance, exploration and precision agriculture, require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology. PMID:26184205
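
    The position-estimation method with time-delay compensation is specific to the paper and is not reproduced here; the sketch below only illustrates the general pattern of propagating a delayed camera-based fix forward with buffered IMU velocity samples, using hypothetical timing and numbers.

      from collections import deque

      class DelayCompensator:
          def __init__(self):
              self.buffer = deque()      # (timestamp, velocity) samples since the last fix

          def add_imu_sample(self, t, velocity):
              self.buffer.append((t, velocity))

          def compensate(self, fix_position, fix_time, now):
              """Integrate buffered velocities from fix_time up to 'now' on top of the fix."""
              pos, t_prev = fix_position, fix_time
              for t, v in self.buffer:
                  if t <= fix_time:
                      continue
                  if t > now:
                      break
                  pos += v * (t - t_prev)
                  t_prev = t
              # drop samples that can no longer matter for future fixes
              while self.buffer and self.buffer[0][0] <= fix_time:
                  self.buffer.popleft()
              return pos

      comp = DelayCompensator()
      for k in range(5):                  # 100 Hz IMU, constant 0.2 m/s forward speed
          comp.add_imu_sample(0.01 * k, 0.2)
      print(comp.compensate(fix_position=1.00, fix_time=0.00, now=0.04))   # ~1.008 m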

  3. Ultraviolet Imaging with Low Cost Smartphone Sensors: Development and Application of a Raspberry Pi-Based UV Camera

    PubMed Central

    Wilkes, Thomas C.; McGonigle, Andrew J. S.; Pering, Tom D.; Taggart, Angus J.; White, Benjamin S.; Bryant, Robert G.; Willmott, Jon R.

    2016-01-01

    Here, we report, for what we believe to be the first time, on the modification of a low cost sensor, designed for the smartphone camera market, to develop an ultraviolet (UV) camera system. This was achieved via adaptation of Raspberry Pi cameras, which are based on back-illuminated complementary metal-oxide semiconductor (CMOS) sensors, and we demonstrated the utility of these devices for applications at wavelengths as low as 310 nm, by remotely sensing power station smokestack emissions in this spectral region. Given the very low cost of these units, ≈ USD 25, they are suitable for widespread proliferation in a variety of UV imaging applications, e.g., in atmospheric science, volcanology, forensics and surface smoothness measurements. PMID:27782054

  4. Understanding the Function of Circular Polarisation Vision in Mantis Shrimps: Building a C-Pol Camera

    DTIC Science & Technology

    2008-10-24

    instrumentation from hand-held to remote sensing (RS) and used to address problems such as coral bleaching or algal blooms. Very recent work has... [Fig. 1 caption: A mantis shrimp looking out from the front entrance of its burrow. This and other species live on coral reefs and in other shallow...]

  5. Evaluation of Remotely Sensed Data for the Application of Geospatial Techniques to Assess Hurricane Impacts on Coastal Bird Habitat

    DTIC Science & Technology

    2009-08-01

    habitat analysis because of the high horizontal error between the mosaicked image tiles. The imagery was collected with a non-metric camera and likewise... possible with true color imagery (digital orthophotos) or multispectral imagery, but usually comes at a much higher cost. Due to its availability and

  6. Back in Time

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under a Jet Propulsion Laboratory SBIR (Small Business Innovative Research), Cambridge Research and Instrumentation Inc. developed a new class of filters for the construction of small, low-cost multispectral imagers. The VariSpec liquid crystal filter enables users to obtain multispectral, ultra-high-resolution images using a monochrome CCD (charge coupled device) camera. Application areas include biomedical imaging, remote sensing, and machine vision.

  7. UrtheCast Second-Generation Earth Observation Sensors

    NASA Astrophysics Data System (ADS)

    Beckett, K.

    2015-04-01

    UrtheCast's Second-Generation state-of-the-art Earth Observation (EO) remote sensing platform will be hosted on the NASA segment of the International Space Station (ISS). This platform comprises a high-resolution dual-mode (pushbroom and video) optical camera and a dual-band (X and L) Synthetic Aperture RADAR (SAR) instrument. These new sensors will complement the first-generation medium-resolution pushbroom and high-definition video cameras that were mounted on the Russian segment of the ISS in early 2014. The new cameras are expected to be launched to the ISS in late 2017 via the Space Exploration Technologies Corporation Dragon spacecraft. The Canadarm will then be used to install the remote sensing platform onto a CBM (Common Berthing Mechanism) hatch on Node 3, allowing the sensor electronics to be accessible from the inside of the station, thus limiting their exposure to the space environment and allowing for future capability upgrades. The UrtheCast second-generation system will be able to take full advantage of the strengths that each of the individual sensors offers, such that the data exploitation capabilities of the combined sensors are significantly greater than from either sensor alone. This represents a truly novel platform that will lead to significant advances in many other Earth Observation applications such as environmental monitoring, energy and natural resources management, and humanitarian response, with data availability anticipated to begin after commissioning is completed in early 2018.

  8. Monitoring the spatial and temporal evolution of slope instability with Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Manconi, Andrea; Glueer, Franziska; Loew, Simon

    2017-04-01

    The identification and monitoring of ground deformation is important for an appropriate analysis and interpretation of unstable slopes. Displacements are usually monitored with in-situ techniques (e.g., extensometers, inclinometers, geodetic leveling, tachymeters and D-GPS), and/or active remote sensing methods (e.g., LiDAR and radar interferometry). In particular situations, however, the choice of the appropriate monitoring system is constrained by site-specific conditions. Slope areas can be very remote and/or affected by rapid surface changes, and are thus hardly accessible, and often unsafe, for field installations. In many cases the use of remote sensing approaches might also be hindered because of unsuitable acquisition geometries, poor spatial resolution and revisit times, and/or high costs. The increasing availability of digital imagery acquired from terrestrial photo and video cameras nowadays provides an additional source of data. The latter can be exploited to visually identify changes of the scene occurring over time, but also to quantify the evolution of surface displacements. Image processing analyses, such as Digital Image Correlation (known also as pixel-offset or feature-tracking), have been shown to provide a suitable alternative to detect and monitor surface deformation at high spatial and temporal resolutions. However, a number of intrinsic limitations have to be considered when dealing with optical imagery acquisition and processing, including the effects of light conditions, shadowing, and/or meteorological variables. Here we propose an algorithm to automatically select and process images acquired from time-lapse cameras. We aim at maximizing the results obtainable from large datasets of digital images acquired with different light and meteorological conditions, and at retrieving accurate information on the evolution of surface deformation. We show a successful example of application of our approach in the Swiss Alps, more specifically in the Great Aletsch area, where slope instability was recently reactivated due to the progressive glacier retreat. At this location, time-lapse cameras have been installed during the last two years, ranging from low-cost and low-resolution webcams to more expensive high-resolution reflex cameras. Our results confirm that time-lapse cameras provide quantitative and accurate measurements of surface deformation evolution over space and time, especially in situations when other monitoring instruments fail.
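
    A minimal sketch of the pixel-offset (feature-tracking) idea behind Digital Image Correlation, not the authors' algorithm: slide a reference patch over a search window in a later image and keep the shift with the highest normalized cross-correlation. The synthetic test at the end simply recovers a known shift.

      import numpy as np

      def ncc(a, b):
          a = a - a.mean()
          b = b - b.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return float((a * b).sum() / denom) if denom > 0 else 0.0

      def pixel_offset(ref, cur, top, left, size=32, search=8):
          """Best (dy, dx) of the ref patch at (top, left) within +/-search pixels in cur."""
          template = ref[top:top + size, left:left + size]
          best, best_score = (0, 0), -2.0
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  y, x = top + dy, left + dx
                  if y < 0 or x < 0 or y + size > cur.shape[0] or x + size > cur.shape[1]:
                      continue
                  score = ncc(template, cur[y:y + size, x:x + size])
                  if score > best_score:
                      best, best_score = (dy, dx), score
          return best, best_score

      # Synthetic test: shift an image by (3, -2) pixels and recover the offset
      rng = np.random.default_rng(0)
      ref = rng.random((128, 128))
      cur = np.roll(ref, shift=(3, -2), axis=(0, 1))
      print(pixel_offset(ref, cur, top=48, left=48))   # ((3, -2), ~1.0)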

  9. Volcanology 2020: How will thermal remote sensing of volcanic surface activity evolve over the next decade?

    NASA Astrophysics Data System (ADS)

    Ramsey, Michael S.; Harris, Andrew J. L.

    2013-01-01

    Volcanological remote sensing spans numerous techniques, wavelength regions, data collection strategies, targets, and applications. Attempting to foresee and predict the growth vectors in this broad and rapidly developing field is therefore exceedingly difficult. However, we attempted to make such predictions at both the American Geophysical Union (AGU) meeting session entitled Volcanology 2010: How will the science and practice of volcanology change in the coming decade? held in December 2000 and the follow-up session 10 years later, Looking backward and forward: Volcanology in 2010 and 2020. In this summary paper, we assess how well we did with our predictions for specific facets of volcano remote sensing in 2000, review the advances made over the most recent decade, and attempt a new look ahead to the next decade. In completing this review, we only consider the subset of the field focused on thermal infrared remote sensing of surface activity using ground-based and space-based technology and the subsequent research results. This review keeps to the original scope of both AGU presentations, and therefore does not address the entire field of volcanological remote sensing, which uses technologies in other wavelength regions (e.g., ultraviolet, radar, etc.) or the study of volcanic processes other than those associated with surface (mostly effusive) activity. Therefore we do not consider remote sensing of ash/gas plumes, for example. In 2000, we had looked forward to a "golden age" in volcanological remote sensing, with a variety of new orbital missions both planned and recently launched. In addition, exciting field-based sensors such as hand-held thermal cameras were also becoming available and being quickly adopted by volcanologists for both monitoring and research applications. All of our predictions in 2000 came true, but at a pace far quicker than we predicted. Relative to the 2000-2010 timeframe, the coming decade will see far fewer new orbital instruments with direct applications to volcanology. However, ground-based technologies and applications will continue to proliferate, and unforeseen technology promises many exciting possibilities that will advance volcano thermal monitoring and science far beyond what we can currently envision.

  10. Intercomparison of Remotely Sensed Vegetation Indices, Ground Spectroscopy, and Foliar Chemistry Data from NEON

    NASA Astrophysics Data System (ADS)

    Hulslander, D.; Warren, J. N.; Weintraub, S. R.

    2017-12-01

    Hyperspectral imaging systems can be used to produce spectral reflectance curves giving rich information about composition, relative abundances of materials, mixes and combinations. Indices based on just a few spectral bands have been used for over 40 years to study vegetation health, mineral abundance, and more. These indices are much simpler to visualize and use than a full hyperspectral data set which may contain over 400 bands. Yet historically, it has been difficult to directly relate remotely sensed spectral indices to quantitative biophysical properties significant to forest ecology such as canopy nitrogen, lignin, and chlorophyll. This linkage is a critical piece in extending high-value ecological information, usually available only from labor-intensive canopy foliar chemistry sampling, to the geographic and temporal coverage available via remote sensing. Previous studies have shown some promising results linking ground-based data and remotely sensed indices, but are consistently limited in time, geographic extent, and land cover type. Moreover, previous studies are often focused on tuning linkage algorithms for the purpose of achieving good results for only one study site or one type of vegetation, precluding development of more generalized algorithms. The National Ecological Observatory Network (NEON) is a unique system of 47 terrestrial sites covering all of the major eco-climatic domains of the US, including AK, HI, and Puerto Rico. These sites are regularly monitored and sampled using uniform instrumentation and protocols, including both foliar chemistry sampling and remote sensing flights for high resolution hyperspectral, LiDAR, and digital camera data acquisition. In this study we compare the results of foliar chemistry analysis to the remote sensing vegetation indices and investigate possible sources of variance and difference through the use of the larger hyperspectral dataset as well as ground-based spectrometer measurements of samples subsequently analyzed for foliar chemistry.
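
    As a toy illustration of linking a two-band index to foliar chemistry (hypothetical numbers, not NEON data or results), the sketch below derives an NDVI-like index from plot-level spectra and fits an ordinary least-squares line against foliar nitrogen.

      import numpy as np

      def band_index(reflectance, wavelengths, w1=860.0, w2=650.0):
          """Normalized difference of the bands nearest w1 and w2 (nm), e.g. NDVI."""
          i1 = int(np.argmin(np.abs(wavelengths - w1)))
          i2 = int(np.argmin(np.abs(wavelengths - w2)))
          r1, r2 = reflectance[..., i1], reflectance[..., i2]
          return (r1 - r2) / (r1 + r2)

      # Hypothetical plot-level spectra (5 plots x 4 bands) and foliar %N values
      wavelengths = np.array([550.0, 650.0, 760.0, 860.0])
      spectra = np.array([[0.08, 0.05, 0.30, 0.42],
                          [0.07, 0.04, 0.35, 0.48],
                          [0.09, 0.06, 0.28, 0.38],
                          [0.06, 0.03, 0.40, 0.52],
                          [0.08, 0.05, 0.33, 0.44]])
      foliar_n = np.array([1.9, 2.2, 1.7, 2.5, 2.0])    # percent nitrogen (made up)

      ndvi = band_index(spectra, wavelengths)
      slope, intercept = np.polyfit(ndvi, foliar_n, 1)   # simple linear linkage model
      r = np.corrcoef(ndvi, foliar_n)[0, 1]
      print(f"%N ~ {slope:.2f} * NDVI + {intercept:.2f}  (r = {r:.2f})")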

  11. String Theory - Using Kites for Introducing Remote Sensing and In-Situ Measurement Concepts

    NASA Astrophysics Data System (ADS)

    Bland, G.; Bydlowski, D.; Henry, A.

    2016-12-01

    Kites are often overlooked as a practical and accessible tool for gaining an aerial perspective. This perspective can be used as a proxy for the vantage points of space and aircraft, particularly when introducing the concepts of remote sensing and in-situ measurements that form the foundation of much of NASA's Earth science research. Kites, combined with miniature cameras and instrumentation, can easily and affordably be used in formal and informal learning environments to demonstrate techniques and develop skills related to gathering information from above. Additionally, collaborative teamwork can play an important role, particularly in the form of synthesizing flight operations. Hands-on technology exploration can be a component as well, as there are numerous possibilities for creating sensor systems, line-handling techniques, and understanding kite flight itself.

  12. Generation of high-dynamic range image from digital photo

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Potemin, Igor S.; Zhdanov, Dmitry D.; Wang, Xu-yang; Cheng, Han

    2016-10-01

    A number of modern applications, such as medical imaging, remote sensing satellite imaging and virtual prototyping, use high dynamic range images (HDRI). Generally, obtaining an HDRI from an ordinary digital image requires the camera to be calibrated. The article proposes a camera calibration method that uses the clear sky as a standard light source, taking sky luminance from the CIE sky model for the corresponding geographical coordinates and time. The article considers basic algorithms for obtaining real luminance values from an ordinary digital image and the corresponding software implementation of these algorithms. Moreover, examples of HDRI reconstructed from ordinary images illustrate the article.
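
    A simplified sketch of the merging step that typically follows such a calibration, assuming an already-linearized camera response: average DN divided by exposure time across a bracket, weighting well-exposed pixels most. The weighting scheme and numbers are illustrative, not the article's algorithm.

      import numpy as np

      def merge_hdr(images, exposure_times, dn_max=255.0):
          """images: list of float DN arrays; returns a relative radiance map."""
          acc = np.zeros_like(images[0], dtype=float)
          wsum = np.zeros_like(acc)
          for img, t in zip(images, exposure_times):
              # triangular weight: trust mid-range pixels, distrust near 0 or saturation
              w = 1.0 - np.abs(2.0 * img / dn_max - 1.0)
              acc += w * (img / t)
              wsum += w
          return acc / np.maximum(wsum, 1e-6)

      # Hypothetical three-shot bracket of a 2x2 scene
      times = [1 / 500, 1 / 125, 1 / 30]
      imgs = [np.array([[10.0, 40.0], [80.0, 200.0]]),
              np.array([[40.0, 160.0], [255.0, 255.0]]),
              np.array([[160.0, 255.0], [255.0, 255.0]])]
      print(merge_hdr(imgs, times))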

  13. Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

    constructed at BIO, carried the new Machine Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated... WetStar CDOM fluorometer, a Sequoia Scientific flow control switch, and a SeaBird 37 CTD. The flow-control switch allows the ac-9 to collect 0.2-um

  14. REMOTE SENSING OF BIOMASS, LEAF-AREA-INDEX AND CHLOROPHYLL A AND B CONTENT IN THE ACE BASIN AND NATIONAL ESTUARINE RESEARCH RESERVE USING SUB-METER DIGITAL CAMERA IMAGERY. (R828677C003)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  15. Compact camera technologies for real-time false-color imaging in the SWIR band

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Jennings, Todd; Snikkers, Marco

    2013-11-01

    Previously, real-time false-colored multispectral imaging was not available in a true-snapshot, single compact imager. Recent technology improvements now allow this technique to be used in practical applications. This paper covers those advancements as well as a case study of its use in UAVs, where the technology is enabling new remote sensing methodologies.

  16. Optical alignment of high resolution Fourier transform spectrometers

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Ocallaghan, F. G.; Cassie, A. G.

    1980-01-01

    Remote sensing, high resolution FTS instruments often contain three primary optical subsystems: Fore-Optics, Interferometer Optics, and Post, or Detector Optics. We discuss the alignment of a double-pass FTS containing a cat's-eye retro-reflector. Also, the alignment of fore-optics containing confocal paraboloids with a reflecting field stop which relays a field image onto a camera is discussed.

  17. Defining habitat covariates in camera-trap based occupancy studies

    PubMed Central

    Niedballa, Jürgen; Sollmann, Rahel; Mohamed, Azlan bin; Bender, Johannes; Wilting, Andreas

    2015-01-01

    In species-habitat association studies, both the type and spatial scale of habitat covariates need to match the ecology of the focal species. We assessed the potential of high-resolution satellite imagery for generating habitat covariates using camera-trapping data from Sabah, Malaysian Borneo, within an occupancy framework. We tested the predictive power of covariates generated from satellite imagery at different resolutions and extents (focal patch sizes, 10–500 m around sample points) on estimates of occupancy patterns of six small to medium sized mammal species/species groups. High-resolution land cover information had considerably more model support for small, patchily distributed habitat features, whereas it had no advantage for large, homogeneous habitat features. A comparison of different focal patch sizes including remote sensing data and an in-situ measure showed that patches with a 50-m radius had most support for the target species. Thus, high-resolution satellite imagery proved to be particularly useful in heterogeneous landscapes, and can be used as a surrogate for certain in-situ measures, reducing field effort in logistically challenging environments. Additionally, remote sensed data provide more flexibility in defining appropriate spatial scales, which we show to impact estimates of wildlife-habitat associations. PMID:26596779
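
    A minimal sketch, not the study's GIS workflow: a "focal patch" covariate can be computed as the mean of a classified land-cover raster within circular buffers of different radii around a camera-trap location. The raster, radii and pixel coordinates below are hypothetical.

      import numpy as np

      def focal_mean(raster, row, col, radius_px):
          """Mean raster value within a circular window of radius_px pixels."""
          rows, cols = np.ogrid[:raster.shape[0], :raster.shape[1]]
          mask = (rows - row) ** 2 + (cols - col) ** 2 <= radius_px ** 2
          return float(raster[mask].mean())

      # Hypothetical 1 m resolution binary map (1 = closed forest, 0 = open)
      rng = np.random.default_rng(1)
      landcover = (rng.random((400, 400)) > 0.4).astype(float)
      for radius_m in (10, 50, 150):          # focal patch sizes in the spirit of the abstract
          print(radius_m, "m:", round(focal_mean(landcover, 200, 200, radius_m), 3))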

  18. Remote Sensing of Arctic Environmental Conditions and Critical Infrastructure using Infra-Red (IR) Cameras and Unmanned Air Vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Hatfield, M. C.; Webley, P.; Saiet, E., II

    2014-12-01

    Numerous scientific and logistical applications exist in Alaska and other arctic regions requiring analysis of expansive, remote areas in the near infrared (NIR) and thermal infrared (TIR) bands. These include characterization of wildland fire plumes and volcanic ejecta, detailed mapping of lava flows, and inspection of lengthy segments of critical infrastructure, such as the Alaska pipeline and railroad system. Obtaining timely, repeatable, calibrated measurements of these extensive features and infrastructure networks requires localized, taskable assets such as UAVs. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) provides practical solutions to these problem sets by pairing various IR sensors with a combination of fixed-wing and multi-rotor air vehicles. Fixed-wing assets, such as the Insitu ScanEagle, offer long reach and extended duration capabilities to quickly access remote locations and provide enduring surveillance of the target of interest. Rotary-wing assets, such as the Aeryon Scout or the ACUASI-built Ptarmigan hexcopter, provide a precision capability for detailed horizontal mapping or vertical stratification of atmospheric phenomena. When combined with other ground capabilities, we will show how these assets can assist in decision support and hazard assessment as well as giving those in emergency management a new ability to increase knowledge of the event at hand while reducing the risk to all involved. Here, in this presentation, we illustrate how UAVs can provide the ideal tool to map and analyze hazardous events and critical infrastructure under extreme environmental conditions.

  19. Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera

    NASA Astrophysics Data System (ADS)

    Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.

    2017-10-01

    Multispectral imaging is a widely used remote sensing technique, whose applications range from agriculture to environmental monitoring, from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera, called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near infrared parts of the electromagnetic spectrum. Two versions are available, characterised by different sets of band-pass filters, inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for the geometric calibration and radiometric correction are presented in the paper.

  20. Laser-Camera Vision Sensing for Spacecraft Mobile Robot Navigation

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Khalil, Ahmad S.; Dorais, Gregory A.; Gawdiak, Yuri

    2002-01-01

    The advent of spacecraft mobile robots (free-flying sensor platforms and communications devices intended to accompany astronauts or operate remotely on space missions both inside and outside of a spacecraft) has demanded the development of a simple and effective navigation scheme. One such system under exploration involves the use of a laser-camera arrangement to predict relative positioning of the mobile robot. By projecting laser beams from the robot, a 3D reference frame can be introduced. Thus, as the robot shifts in position, the position reference frame produced by the laser images is correspondingly altered. Using normalization and camera registration techniques presented in this paper, the relative translation and rotation of the robot in 3D are determined from these reference frame transformations.

  1. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    NASA Astrophysics Data System (ADS)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and the verification of the sensor system are important tasks. To support these tasks, the simulation of the sensor and its output is valuable. This enables the developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor is looking by using a ray tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, such as a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.
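
    A much-simplified sketch of the kind of final stage described above (an electronic sensor model turning at-sensor radiance into digital numbers); the constants, the etendue approximation and the monochromatic photon conversion are placeholders, not values or methods from SENSOR++.

      import numpy as np

      H = 6.626e-34          # Planck constant [J s]
      C = 2.998e8            # speed of light [m/s]

      def radiance_to_dn(radiance, wavelength, pixel_area, solid_angle, t_int,
                         quantum_eff=0.6, full_well=30000.0, read_noise_e=10.0,
                         bits=12, rng=np.random.default_rng(0)):
          """radiance: band-integrated at-sensor radiance [W m^-2 sr^-1],
          treated as monochromatic at 'wavelength' for the photon conversion."""
          energy = radiance * pixel_area * solid_angle * t_int      # J collected per pixel
          photons = energy * wavelength / (H * C)                   # photon count
          electrons = rng.poisson(photons * quantum_eff)            # shot noise
          electrons = electrons + rng.normal(0.0, read_noise_e)     # read noise
          dn = np.clip(electrons / full_well, 0.0, 1.0) * (2 ** bits - 1)
          return int(round(dn))

      # Hypothetical pixel geometry and integration time
      print(radiance_to_dn(radiance=25.0, wavelength=650e-9,
                           pixel_area=(6.5e-6) ** 2, solid_angle=1e-4, t_int=2e-3))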

  2. The pan-sharpening of satellite and UAV imagery for agricultural applications

    NASA Astrophysics Data System (ADS)

    Jenerowicz, Agnieszka; Woroszkiewicz, Malgorzata

    2016-10-01

    Remote sensing techniques are widely used in many different areas of interest, e.g. urban studies, environmental studies and agriculture, because they provide rapid and accurate information over large areas with suitable temporal, spatial and spectral resolutions. Agricultural management is nowadays one of the most common applications of remote sensing methods. Monitoring agricultural sites and creating information on the spatial distribution and characteristics of crops are important tasks that provide data for precision agriculture, crop management and registries of agricultural lands. For monitoring cultivated areas, many different types of remote sensing data can be used; the most popular are multispectral satellite images. Such data allow land use and land cover maps to be generated, based on various image processing and remote sensing methods. This paper presents the fusion of satellite and unmanned aerial vehicle (UAV) imagery for agricultural applications, especially for distinguishing crop types. The authors present selected data fusion methods for satellite images and data obtained from low altitudes. Moreover, the authors describe pan-sharpening approaches and apply selected pan-sharpening methods to multiresolution image fusion of satellite and UAV imagery. For this purpose, satellite images from the Landsat-8 OLI sensor and data collected during various UAV flights (with a mounted RGB camera) were used. In this article, the authors not only show the potential of fusing satellite and UAV images, but also present the application of pan-sharpening to crop identification and management.
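
    As an illustration of one classical pan-sharpening scheme of the kind discussed (the Brovey transform, not necessarily the method chosen by the authors), the sketch below rescales upsampled multispectral bands by the ratio of the high-resolution intensity to a multispectral intensity proxy; the arrays are hypothetical and assumed to be co-registered and already resampled to the high-resolution grid.

      import numpy as np

      def brovey_pansharpen(ms_bands, pan, eps=1e-6):
          """ms_bands: (n_bands, H, W) upsampled multispectral; pan: (H, W) high-res."""
          intensity = ms_bands.mean(axis=0)                  # simple intensity proxy
          ratio = pan / (intensity + eps)
          return ms_bands * ratio                            # rescale every band

      # Hypothetical 4-band patch upsampled to a 4x4 high-resolution grid
      rng = np.random.default_rng(2)
      ms = rng.random((4, 4, 4)) * 0.3
      pan = rng.random((4, 4)) * 0.5
      sharp = brovey_pansharpen(ms, pan)
      print(sharp.shape)              # (4, 4, 4): spectral bands carrying pan-scale detail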

  3. Mountain pine beetle detection and monitoring: evaluation of airborne imagery

    NASA Astrophysics Data System (ADS)

    Roberts, A.; Bone, C.; Dragicevic, S.; Ettya, A.; Northrup, J.; Reich, R.

    2007-10-01

    The processing of digital airborne imagery for detection, monitoring and modeling of mountain pine beetle (MPB) infestations is evaluated. The most efficient and reliable remote sensing strategy for identification and mapping of infestation stages ("current" to "red" to "grey" attack) of MPB in lodgepole pine forests is determined, with emphasis on the most practical and cost-effective procedures. This research was planned to specifically enhance knowledge by determining the remote sensing imaging systems and analytical procedures that optimize resource management for this critical forest health problem. Within the context of this study, airborne remote sensing of forest environments for forest health determinations (MPB) is most suitably undertaken using multispectral digitally converted imagery (aerial photography) at scales of 1:8000 for early detection of current MPB attack and 1:16000 for mapping and sequential monitoring of red and grey attack. Digital conversion should be undertaken at 10 to 16 microns for B&W multispectral imagery and 16 to 24 microns for colour and colour infrared imagery. From an "operational" perspective, the use of twin mapping-cameras with colour and B&W or colour infrared film will provide the best approximation of multispectral digital imagery with near comparable performance in a competitive private sector context (open bidding).

  4. Image deblurring by motion estimation for remote sensing

    NASA Astrophysics Data System (ADS)

    Chen, Yueting; Wu, Jiagu; Xu, Zhihai; Li, Qi; Feng, Huajun

    2010-08-01

    The image resolution of remote sensing imaging systems is often limited by image degradation resulting from unwanted motion disturbances of the platform during image exposure. Since the form of the platform vibration can be arbitrary, the lack of a priori knowledge about the motion function (the PSF) suggests blind restoration approaches. A deblurring method which combines motion estimation and image deconvolution for both area-array and TDI remote sensing is proposed in this paper. The image motion estimation is accomplished by an auxiliary high-speed detector and a sub-pixel correlation algorithm. The PSF is then reconstructed from the estimated image motion vectors. Eventually, the clear image can be recovered by the Richardson-Lucy (RL) iterative deconvolution algorithm from the blurred image of the prime camera with the constructed PSF. The image deconvolution for the area-array detector is direct, while for the TDI CCD detector an integral distortion compensation step and a row-by-row deconvolution scheme are applied. Theoretical analyses and experimental results show that the performance of the proposed concept is convincing. Blurred and distorted images can be properly recovered not only for visual observation, but also with a significant increase in objective evaluation metrics.
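
    A minimal Richardson-Lucy sketch with a known, synthetic motion PSF; in the paper the PSF is reconstructed from estimated motion vectors, which is the part not reproduced here. The test image, blur length and iteration count are illustrative.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(blurred, psf, iterations=30, eps=1e-12):
          psf = psf / psf.sum()
          psf_flipped = psf[::-1, ::-1]
          estimate = np.full_like(blurred, blurred.mean())
          for _ in range(iterations):
              reblurred = fftconvolve(estimate, psf, mode="same")
              ratio = blurred / (reblurred + eps)
              estimate *= fftconvolve(ratio, psf_flipped, mode="same")
          return estimate

      # Synthetic test: blur a smooth scene with a horizontal motion PSF, then restore it
      y, x = np.mgrid[0:64, 0:64]
      sharp = np.exp(-((x - 32) ** 2 + (y - 20) ** 2) / 60.0) \
              + 0.5 * np.exp(-((x - 15) ** 2 + (y - 45) ** 2) / 30.0)
      psf = np.full((1, 7), 1.0 / 7.0)                 # 7-pixel linear motion blur
      blurred = fftconvolve(sharp, psf, mode="same")
      restored = richardson_lucy(blurred, psf)
      print("blurred error :", float(np.abs(blurred - sharp).mean()))
      print("restored error:", float(np.abs(restored - sharp).mean()))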

  5. Get the Picture?

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Positive Systems has worked in conjunction with Stennis Space Center to design the ADAR System 5500. This is a four-band airborne digital imaging system used to capture multispectral imagery similar to that available from satellite platforms such as Landsat, SPOT and the new generation of high resolution satellites. Positive Systems has provided remote sensing services for the development of digital aerial camera systems and software for commercial aerial imaging applications.

  6. Metric remote sensing experiments in preparation for Spacelab flights. [alpine geomorphology and ice and/or snow cover

    NASA Technical Reports Server (NTRS)

    Galibert, G.

    1978-01-01

    Aerial and ground photographs of Wallis mountains and of Dolomiti di Cortina d'Ampezzo in Italy were made using spectrozonal emulsions and optical multichannel filters. A metric camera was used in the perspective of the first Spacelab flight aboard the space shuttle. Elementary forms of alpine geomorphology and ice or snow phenomena are detectable on these metric scenes.

  7. Lunar Reconnaissance Orbiter Camera Observations Relating to Science and Landing Site Selection in South Pole-Aitken Basin for a Robotic Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Jolliff, B. L.; Clegg-Watkins, R. N.; Petro, N. E.; Lawrence, S. L.

    2016-01-01

    The Moon's South Pole-Aitken basin (SPA) is a high priority target for Solar System exploration, and sample return from SPA is a specific objective in NASA's New Frontiers program. Samples returned from SPA will improve our understanding of early lunar and Solar System events, mainly by placing firm timing constraints on SPA formation and the post-SPA late-heavy bombardment (LHB). Lunar Reconnaissance Orbiter Camera (LROC) images and topographic data, especially Narrow Angle Camera (NAC) scale (1-3 mpp) morphology and digital terrain model (DTM) data are critical for selecting landing sites and assessing landing hazards. Rock components in regolith at a given landing site should include (1) original SPA impact-melt rocks and breccia (to determine the age of the impact event and what materials were incorporated into the melt); (2) impact-melt rocks and breccia from large craters and basins (other than SPA) that represent the post-SPA LHB interval; (3) volcanic basalts derived from the sub-SPA mantle; and (4) older, "cryptomare" (ancient buried volcanics excavated by impact craters, to determine the volcanic history of SPA basin). All of these rock types are sought for sample return. The ancient SPA-derived impact-melt rocks and later-formed melt rocks are needed to determine chronology, and thus address questions of early Solar System dynamics, lunar history, and effects of giant impacts. Surface compositions from remote sensing are consistent with mixtures of SPA impactite and volcanic materials, and near infrared spectral data distinguish areas with variable volcanic contents vs. excavated SPA substrate. Estimating proportions of these rock types in the regolith requires knowledge of the surface deposits, evaluated via morphology, slopes, and terrain ruggedness. These data allow determination of mare-cryptomare-nonmare deposit interfaces in combination with compositional and mineralogical remote sensing to establish the types and relative proportions of materials expected at a given site. Remote sensing compositions, e.g., FeO, also constrain the relative abundances of components. Landing-site assessments use crater and boulder distributions, and slope and terrain ruggedness.

  8. Depth Estimation of Submerged Aquatic Vegetation in Clear Water Streams Using Low-Altitude Optical Remote Sensing

    PubMed Central

    Visser, Fleur; Buis, Kerst; Verschoren, Veerle; Meire, Patrick

    2015-01-01

    UAVs and other low-altitude remote sensing platforms are proving very useful tools for remote sensing of river systems. Currently consumer grade cameras are still the most commonly used sensors for this purpose. In particular, progress is being made to obtain river bathymetry from the optical image data collected with such cameras, using the strong attenuation of light in water. No studies have yet applied this method to map submergence depth of aquatic vegetation, which has rather different reflectance characteristics from river bed substrate. This study therefore looked at the possibilities to use the optical image data to map submerged aquatic vegetation (SAV) depth in shallow clear water streams. We first applied the Optimal Band Ratio Analysis method (OBRA) of Legleiter et al. (2009) to a dataset of spectral signatures from three macrophyte species in a clear water stream. The results showed that for each species the ratio of certain wavelengths were strongly associated with depth. A combined assessment of all species resulted in equally strong associations, indicating that the effect of spectral variation in vegetation is subsidiary to spectral variation due to depth changes. Strongest associations (R2-values ranging from 0.67 to 0.90 for different species) were found for combinations including one band in the near infrared (NIR) region between 825 and 925 nm and one band in the visible light region. Currently data of both high spatial and spectral resolution is not commonly available to apply the OBRA results directly to image data for SAV depth mapping. Instead a novel, low-cost data acquisition method was used to obtain six-band high spatial resolution image composites using a NIR sensitive DSLR camera. A field dataset of SAV submergence depths was used to develop regression models for the mapping of submergence depth from image pixel values. Band (combinations) providing the best performing models (R2-values up to 0.77) corresponded with the OBRA findings. A 10% error was achieved under sub-optimal data collection conditions, which indicates that the method could be suitable for many SAV mapping applications. PMID:26437410

  9. Depth Estimation of Submerged Aquatic Vegetation in Clear Water Streams Using Low-Altitude Optical Remote Sensing.

    PubMed

    Visser, Fleur; Buis, Kerst; Verschoren, Veerle; Meire, Patrick

    2015-09-30

    UAVs and other low-altitude remote sensing platforms are proving very useful tools for remote sensing of river systems. Currently consumer grade cameras are still the most commonly used sensors for this purpose. In particular, progress is being made to obtain river bathymetry from the optical image data collected with such cameras, using the strong attenuation of light in water. No studies have yet applied this method to map submergence depth of aquatic vegetation, which has rather different reflectance characteristics from river bed substrate. This study therefore looked at the possibilities to use the optical image data to map submerged aquatic vegetation (SAV) depth in shallow clear water streams. We first applied the Optimal Band Ratio Analysis method (OBRA) of Legleiter et al. (2009) to a dataset of spectral signatures from three macrophyte species in a clear water stream. The results showed that for each species the ratio of certain wavelengths were strongly associated with depth. A combined assessment of all species resulted in equally strong associations, indicating that the effect of spectral variation in vegetation is subsidiary to spectral variation due to depth changes. Strongest associations (R²-values ranging from 0.67 to 0.90 for different species) were found for combinations including one band in the near infrared (NIR) region between 825 and 925 nm and one band in the visible light region. Currently data of both high spatial and spectral resolution is not commonly available to apply the OBRA results directly to image data for SAV depth mapping. Instead a novel, low-cost data acquisition method was used to obtain six-band high spatial resolution image composites using a NIR sensitive DSLR camera. A field dataset of SAV submergence depths was used to develop regression models for the mapping of submergence depth from image pixel values. Band (combinations) providing the best performing models (R²-values up to 0.77) corresponded with the OBRA findings. A 10% error was achieved under sub-optimal data collection conditions, which indicates that the method could be suitable for many SAV mapping applications.
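
    A sketch of the OBRA idea only (hypothetical spectra, not the paper's data): for every band pair, regress the log band ratio against measured submergence depth and keep the pair with the highest R².

      import numpy as np

      def obra(reflectance, depths):
          """reflectance: (n_samples, n_bands); returns (best_pair, best_r2, (slope, intercept))."""
          n_bands = reflectance.shape[1]
          best = (None, -1.0, None)
          for i in range(n_bands):
              for j in range(n_bands):
                  if i == j:
                      continue
                  x = np.log(reflectance[:, i] / reflectance[:, j])
                  slope, intercept = np.polyfit(x, depths, 1)
                  pred = slope * x + intercept
                  r2 = 1.0 - np.sum((depths - pred) ** 2) / np.sum((depths - depths.mean()) ** 2)
                  if r2 > best[1]:
                      best = ((i, j), r2, (slope, intercept))
          return best

      # Hypothetical 5-band spectra over vegetation at known submergence depths (m)
      depths = np.array([0.10, 0.25, 0.40, 0.60, 0.80, 1.00])
      rng = np.random.default_rng(4)
      nir = 0.35 * np.exp(-2.0 * depths) + rng.normal(0, 0.002, depths.size)  # NIR attenuates fast
      vis = 0.12 * np.exp(-0.4 * depths) + rng.normal(0, 0.002, depths.size)  # visible attenuates slowly
      bands = np.column_stack([vis, vis * 0.9, vis * 1.1, nir, nir * 0.95])
      print(obra(bands, depths))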

  10. Design of a multispectral, wedge filter, remote-sensing instrument incorporating a multiport, thinned, CCD area array

    NASA Astrophysics Data System (ADS)

    Demro, James C.; Hartshorne, Richard; Woody, Loren M.; Levine, Peter A.; Tower, John R.

    1995-06-01

    The next generation Wedge Imaging Spectrometer (WIS) instruments currently in integration at Hughes SBRD incorporate advanced features to increase operation flexibility for remotely sensed hyperspectral imagery collection and use. These features include: a) multiple linear wedge filters to tailor the spectral bands to the scene phenomenology; b) simple, replaceable fore-optics to allow different spatial resolutions and coverages; c) data acquisition system (DAS) that collects the full data stream simultaneously from both WIS instruments (VNIR and SWIR/MWIR), stores the data in a RAID storage, and provides for down-loading of the data to MO disks; the WIS DAS also allows selection of the spectral band sets to be stored; d) high-performance VNIR camera subsystem based upon a 512 X 512 CCD area array and associated electronics.

  11. Low cost infrared and near infrared sensors for UAVs

    NASA Astrophysics Data System (ADS)

    Aden, S. T.; Bialas, J. P.; Champion, Z.; Levin, E.; McCarty, J. L.

    2014-11-01

    Thermal remote sensing has a wide range of applications, though the extent of its use is inhibited by cost. Robotic and computer components are now widely available to consumers on a scale that makes thermal data a readily accessible resource. In this project, thermal imagery collected via a lightweight remote sensing Unmanned Aerial Vehicle (UAV) was used to create a surface temperature map for the purpose of providing wildland firefighting crews with a cost-effective and time-saving resource. The UAV system proved to be flexible, allowing for customized sensor packages to be designed that could include visible or infrared cameras, GPS, temperature sensors, and rangefinders, in addition to many data management options. Altogether, such a UAV system could be used to rapidly collect thermal and aerial data, with a geographic accuracy of less than one meter.

  12. Satellite land remote sensing advancements for the eighties; Proceedings of the Eighth Pecora Symposium, Sioux Falls, SD, October 4-7, 1983

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Among the topics discussed are NASA's land remote sensing plans for the 1980s, the evolution of Landsat 4 and the performance of its sensors, the Landsat 4 thematic mapper image processing system radiometric and geometric characteristics, data quality, image data radiometric analysis and spectral/stratigraphic analysis, and thematic mapper agricultural, forest resource and geological applications. Also covered are geologic applications of side-looking airborne radar, digital image processing, the large format camera, the RADARSAT program, the SPOT 1 system's program status, distribution plans, and simulation program, Space Shuttle multispectral linear array studies of the optical and biological properties of terrestrial land cover, orbital surveys of solar-stimulated luminescence, the Space Shuttle imaging radar research facility, and Space Shuttle-based polar ice sounding altimetry.

  13. A Simple Approach to Collecting Useful Wildlife Data Using Remote Camera-Traps in Undergraduate Biology Courses

    ERIC Educational Resources Information Center

    Christensen, David R.

    2016-01-01

    Remote camera-traps are commonly used to estimate the abundance, diversity, behavior and habitat use of wildlife in an inexpensive and nonintrusive manner. Because of the increasing use of remote-cameras in wildlife studies, students interested in wildlife biology should be exposed to the use of remote-cameras early in their academic careers.…

  14. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring.

    PubMed

    Allison, Robert S; Johnston, Joshua M; Craig, Gregory; Jennings, Sion

    2016-08-18

    For decades detection and monitoring of forest and other wildland fires has relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems including a discussion of the objective evaluation of these systems in a realistic context.

  15. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring

    PubMed Central

    Allison, Robert S.; Johnston, Joshua M.; Craig, Gregory; Jennings, Sion

    2016-01-01

    For decades detection and monitoring of forest and other wildland fires has relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems including a discussion of the objective evaluation of these systems in a realistic context. PMID:27548174

  16. Spacecraft hazard avoidance utilizing structured light

    NASA Technical Reports Server (NTRS)

    Liebe, Carl Christian; Padgett, Curtis; Chapsky, Jacob; Wilson, Daniel; Brown, Kenneth; Jerebets, Sergei; Goldberg, Hannah; Schroeder, Jeffrey

    2006-01-01

    At JPL, a <5 kg free-flying micro-inspector spacecraft is being designed for host-vehicle inspection. The spacecraft includes a hazard avoidance sensor to navigate relative to the vehicle being inspected. Structured light was selected for hazard avoidance because of its low mass and cost. Structured light is a method of remotely sensing the 3-dimensional structure of nearby objects utilizing a laser, a grating, and a single regular APS camera. The laser beam is split into 400 different beams by a grating to form a regularly spaced grid of laser beams that are projected into the field of view of an APS camera. The laser source and the APS camera are separated, forming the base of a triangle. The distances to all beam intersections with the host are calculated by triangulation.
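
    Geometry-only sketch, not the flight software: for the simplest case of a single beam parallel to the camera's optical axis and offset by the baseline, the spot's pixel offset from the principal point gives the range directly by triangulation; the grating's many beams at known angles add bookkeeping but no new geometry. The numbers are illustrative.

      def beam_range(baseline_m, focal_px, pixel_offset):
          """Range (m) to the laser spot; pixel_offset is measured from the principal point."""
          if pixel_offset <= 0:
              raise ValueError("spot must appear on the laser side of the principal point")
          return focal_px * baseline_m / pixel_offset

      # Hypothetical values: 0.15 m baseline, 900 px focal length, spot 45 px off-centre
      print(beam_range(0.15, 900, 45))   # -> 3.0 m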

  17. Optical Polarization in the Nearshore

    NASA Astrophysics Data System (ADS)

    Holman, R.

    2008-12-01

    A recent addition to the suite of optical remote sensing methods that have been used to study nearshore processes is the use of imaging polarimetric cameras. Both the degree of polarization and the azimuth of polarized light contain information about the imaged surfaces from which light has been reflected or scattered. In 2007, a polarimetric Argus camera was installed atop the tower at Duck, NC. This talk will examine the various polarization signatures that can be exploited, including the potential for measuring the sea surface slope spectrum of nearshore surf zone waves, the slope of the foreshore beach, water content of foreshore sediments and bubble signatures of dissipating waves.

  18. Preliminary investigation of Large Format Camera photography utility in soil mapping and related agricultural applications

    NASA Technical Reports Server (NTRS)

    Pelletier, R. E.; Hudnall, W. H.

    1987-01-01

    The use of Space Shuttle Large Format Camera (LFC) color, IR/color, and B&W images in large-scale soil mapping is discussed and illustrated with sample photographs from STS-41-G (October 1984). Consideration is given to the characteristics of the film types used; the photographic scales available; geometric and stereoscopic factors; and image interpretation and classification for soil-type mapping (detecting both sharp and gradual boundaries), soil parent material, topographic and hydrologic assessment, natural-resources inventory, crop-type identification, and stress analysis. It is suggested that LFC photography can play an important role, filling the gap between aerial and satellite remote sensing.

  19. International Space Station Data Collection for Disaster Response

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Evans, Cynthia A.

    2014-01-01

    Natural disasters - including such events as tropical storms, earthquakes, floods, volcanic eruptions, and wildfires - affect hundreds of millions of people worldwide, and also cause billions of dollars (USD) in damage to the global economy. Remotely sensed data acquired by orbital sensor systems has emerged as a vital tool to identify the extent of damage resulting from a natural disaster, as well as providing near-real time mapping support to response efforts on the ground and humanitarian aid efforts. The International Space Station (ISS) is a unique terrestrial remote sensing platform for acquiring disaster response imagery. Unlike automated remote-sensing platforms, it has a human crew; is equipped with both internal and externally-mounted remote sensing instruments; and has an inclined, low-Earth orbit that provides variable views and lighting (day and night) over 95 percent of the inhabited surface of the Earth. As such, it provides a useful complement to free-flyer based, sun-synchronous sensor systems in higher altitude polar orbits. While several nations have well-developed terrestrial remote sensing programs and assets for data collection, many developing nations do not have ready access to such resources. The International Charter, Space and Major Disasters (also known as the "International Disaster Charter", or IDC; http://www.disasterscharter.org/home) addresses this disparity. It is an agreement between agencies of several countries to provide - on a best-effort basis - remotely sensed data of natural disasters to requesting countries in support of disaster response. The lead US agency for interaction with the IDC is the United States Geological Survey (USGS); when an IDC request or "activation" is received, the USGS notifies the science teams for NASA instruments with targeting information for data collection. In the case of the ISS, the Earth Sciences and Remote Sensing (ESRS) Unit, part of the Astromaterials Research and Exploration Science Directorate and supporting the ISS Program Science Office at NASA's Johnson Space Center, receives notification from the USGS and coordinates targeting and data collection with the NASA ISS sensor teams. If data is collected, it is passed back to the USGS for posting on their Hazards Data Distribution System and made available for download. The ISS International Partners (CSA, ESA, JAXA, Roscosmos/Energia) have their own procedures for independently supporting IDC activations using their assets on ISS, and there is currently no joint coordination with NASA ISS sensor teams. Following completion of ISS assembly, NASA remote sensing assets began collecting IDC response data in May 2012. The initial NASA ISS sensor systems available to respond to IDC activations included the ISS Agricultural Camera (ISSAC), an internal multispectral visible-near infrared wavelength system mounted in the Window Observational Research Facility, or WORF; the Crew Earth Observations (CEO) Facility, where the crew collects imagery through Station windows using off-the-shelf handheld digital visible-wavelength cameras; and the Hyperspectral Imager for the Coastal Oceans (HICO), a visible to near-infrared system mounted externally on the Japan Experiment Module Exposed Facility. The ISSAC completed its primary mission and was removed from the WORF in January 2013. It was replaced by the very high resolution ISS SERVIR Environmental Research and Visualization System (ISERV) Pathfinder, a visible-wavelength digital camera, telescope, and pointing system.
Since the start of IDC response by NASA sensors on the ISS in May 2012 and as of this report, there have been eighty IDC activations; NASA sensor systems have collected data for twenty-three of these events. Of the twenty-three successful data collections, five involved two or more ISS sensor systems responding to the same event. Data has also been collected by the International Partners in response to natural disasters, most notably by JAXA and Roscosmos/Energia through the Uragan program. Data collected in response to IDC activations is delivered by the ISS sensor teams to the ESRS for quality review and transfer to the USGS, where it is ingested into the Hazards Data Distribution System, or HDDS (https://hdds.usgs.gov/hdds2/; figure 1). This system allows the local agencies that issued the IDC activation request to review and download data. The data is then used to develop secondary products useful for humanitarian response, such as flood maps. As of this report, approximately 1000 images collected by NASA ISS sensor systems have been downloaded from the HDDS, indicating that the ISS has assumed a valuable role in disaster response efforts. The ISS is also a unique platform in that it will have multiple users over its lifetime and that no single remote sensing system has a permanent internal or external berth. This scheduled turnover provides for the development of new remote sensing capabilities relevant to disaster response - as well as both research and applied science - and represents a significant contribution to the continuance and enhancement of the NASA mission to investigate changes on our home planet.

  20. Preface: The Chang'e-3 lander and rover mission to the Moon

    NASA Astrophysics Data System (ADS)

    Ip, Wing-Huen; Yan, Jun; Li, Chun-Lai; Ouyang, Zi-Yuan

    2014-12-01

    The Chang'e-3 (CE-3) lander and rover mission to the Moon was an intermediate step in China's lunar exploration program, which will be followed by a sample return mission. The lander was equipped with a number of remote-sensing instruments including a pair of cameras (Landing Camera and Terrain Camera) for recording the landing process and surveying terrain, an extreme ultraviolet camera for monitoring activities in the Earth's plasmasphere, and a first-ever Moon-based ultraviolet telescope for astronomical observations. The Yutu rover successfully carried out close-up observations with the Panoramic Camera, mineralogical investigations with the VIS-NIR Imaging Spectrometer, study of elemental abundances with the Active Particle-induced X-ray Spectrometer, and pioneering measurements of the lunar subsurface with Lunar Penetrating Radar. This special issue provides a collection of key information on the instrumental designs, calibration methods and data processing procedures used by these experiments with a perspective of facilitating further analyses of scientific data from CE-3 in preparation for future missions.

  1. Development of an Infrared Remote Sensing System for Continuous Monitoring of Stromboli Volcano

    NASA Astrophysics Data System (ADS)

    Harig, R.; Burton, M.; Rausch, P.; Jordan, M.; Gorgas, J.; Gerhard, J.

    2009-04-01

    In order to monitor gases emitted by Stromboli volcano in the Eolian archipelago, Italy, a remote sensing system based on Fourier-transform infrared spectroscopy has been developed and installed on the summit of Stromboli volcano. Hot rocks and lava are used as sources of infrared radiation. The system is based on an interferometer with a single detector element in combination with an azimuth-elevation scanning mirror system. The mirror system is used to align the field of view of the instrument. In addition, the system is equipped with an infrared camera. Two basic modes of operation have been implemented: The user may use the infrared image to align the system to a vent that is to be examined. In addition, the scanning system may be used for (hyperspectral) imaging of the scene. In this mode, the scanning mirror is set to move sequentially to all positions within a region of interest, which is defined by the operator using the image generated from the infrared camera. The spectral range used for the measurements is 1600 - 4200 cm-1, allowing the quantification of many gases such as CO, CO2, SO2, and HCl. The spectral resolution is 0.5 cm-1. In order to protect the optical, mechanical and electrical parts of the system from the volcanic gases, all components are contained in a gas-tight aluminium housing. The system is controlled via TCP/IP (data transfer by WLAN), allowing the user to operate it from a remote PC. The infrared image of the scene and measured spectra are transferred to and displayed by a remote PC at INGV or TUHH in real time. However, the system is capable of autonomous operation on the volcano once a measurement has been started. Measurements are stored by an internal embedded PC.

  2. Visual Sensing for Urban Flood Monitoring

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    With increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analysis of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras gain the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring relies heavily on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way to determine water fluctuation and to measure its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore, it can serve as an active flood warning system. PMID:26287201
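
    The paper itself does not include source code; as a rough, hypothetical illustration of the kind of image-based water-level sensing described above, the sketch below compares each camera frame against a dry reference frame within a fixed gauge region. All function names, thresholds, and the pixel-to-metre scale are assumptions for illustration only.

    ```python
    # Minimal sketch (not the authors' algorithm): estimate a water line in a
    # fixed camera view by comparing each frame against a dry reference frame
    # inside a vertical region of interest (e.g., a staff gauge). All names,
    # thresholds, and the pixel-to-metre scale are illustrative assumptions.
    import numpy as np

    def water_level_px(frame, reference, roi, diff_thresh=25):
        """Return the row index of the detected water surface inside `roi`.

        frame, reference : 2-D grayscale images (numpy arrays, same shape)
        roi              : (row_min, row_max, col_min, col_max) of the gauge area
        """
        r0, r1, c0, c1 = roi
        diff = np.abs(frame[r0:r1, c0:c1].astype(float) -
                      reference[r0:r1, c0:c1].astype(float))
        wet = (diff > diff_thresh).mean(axis=1)      # fraction of "wet" pixels per row
        wet_rows = np.where(wet > 0.5)[0]            # rows dominated by water
        return r0 + wet_rows.min() if wet_rows.size else None

    def level_metres(row_px, row_zero_px, metres_per_px):
        """Convert the detected row to an elevation relative to a surveyed zero line."""
        return (row_zero_px - row_px) * metres_per_px
    ```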

  3. Handheld hyperspectral imager for standoff detection of chemical and biological aerosols

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Jensen, James O.; McAnally, Gerard

    2004-02-01

    Pacific Advanced Technology has developed a small handheld imaging spectrometer, Sherlock, for gas leak and aerosol detection and imaging. The system is based on a patented technique that uses diffractive optics and image processing algorithms to detect spectral information about objects in the scene of the camera (IMSS, Image Multi-spectral Sensing). This camera has been tested at Dugway Proving Ground and the Dstl Porton Down facility, looking at chemical and biological agent simulants. The camera has been used to investigate surfaces contaminated with chemical agent simulants. In addition to chemical and biological detection, the camera has been used for environmental monitoring of greenhouse gases and is currently undergoing extensive laboratory and field testing by the Gas Technology Institute, British Petroleum, and Shell Oil for gas leak detection and repair applications. The camera contains an embedded PowerPC and a real-time image processor for performing image processing algorithms to assist in the detection and identification of gas-phase species in real time. In this paper we will present an overview of the technology and show how it has performed for different applications, such as gas leak detection, surface contamination, remote sensing, and surveillance. In addition, a sampling of the results from the field testing at Dugway in July of 2002 and at Dstl Porton Down in September of 2002 will be given.

  4. Technical note: A simple approach for efficient collection of field reference data for calibrating remote sensing mapping of northern wetlands

    NASA Astrophysics Data System (ADS)

    Gålfalk, Magnus; Karlson, Martin; Crill, Patrick; Bousquet, Philippe; Bastviken, David

    2018-03-01

    The calibration and validation of remote sensing land cover products are highly dependent on accurate field reference data, which are costly and practically challenging to collect. We describe an optical method for collecting field reference data that is a fast, cost-efficient, and robust alternative to field surveys and UAV imaging. A lightweight, waterproof, remote-controlled RGB camera (GoPro HERO4 Silver, GoPro Inc.) was used to take wide-angle images from 3.1 to 4.5 m in altitude using an extendable monopod, as well as representative near-ground (< 1 m) images, to identify spectral and structural features that correspond to various land covers under the prevailing lighting conditions. A semi-automatic classification was made based on six surface types (graminoids, water, shrubs, dry moss, wet moss, and rock). The method enables collection of detailed field reference data, which is critical in many remote sensing applications, such as satellite-based wetland mapping. The method uses common, inexpensive equipment, does not require special skills or training, and is facilitated by a step-by-step manual that is included in the Supplement. Over time, a global ground cover database can be built that can be used as reference data for studies of non-forested wetlands from satellites such as Sentinel 1 and 2 (10 m pixel size).
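
    The Supplement's manual is the authoritative description of the workflow; the snippet below is only a minimal nearest-centroid sketch of how a semi-automatic classification into the six listed surface types could look, with hand-picked training pixels supplied by the operator. Everything beyond the six class names is an assumption.

    ```python
    # Minimal nearest-centroid sketch of a semi-automatic classification into the
    # six surface types named above. The user supplies a few hand-picked training
    # pixels per class; each image pixel is then assigned to the nearest class
    # centroid in RGB space. This is an illustrative stand-in, not the authors'
    # documented workflow.
    import numpy as np

    CLASSES = ["graminoids", "water", "shrubs", "dry_moss", "wet_moss", "rock"]

    def classify(image, training):
        """image: (H, W, 3) float array; training: dict class -> (N, 3) RGB samples."""
        centroids = np.stack([training[c].mean(axis=0) for c in CLASSES])   # (6, 3)
        pixels = image.reshape(-1, 3)                                       # (H*W, 3)
        d2 = ((pixels[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return d2.argmin(axis=1).reshape(image.shape[:2])                   # class index map

    def class_fractions(label_map):
        """Per-class area fractions, e.g. for use as wetland reference data."""
        counts = np.bincount(label_map.ravel(), minlength=len(CLASSES))
        return dict(zip(CLASSES, counts / counts.sum()))
    ```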

  5. Remotely-Sensed Geology from Lander-Based to Orbital Perspectives: Results for FIDO Rover Field Tests

    NASA Technical Reports Server (NTRS)

    Jolliff, B.; Moersch, J.; Knoll, A.; Morris, R.; Arvidson, R.; Gilmore, M.; Greeley, R.; Herkenhoff, K.; McSween, H.; Squyres, S.

    2000-01-01

    Tests of the FIDO (Field Integration Design and Operations) rover and Athena-like operational scenarios were conducted May 7-16, 2000. A group located at the Jet Propulsion Lab, Pasadena, CA, formed the Core Operations Team (COT) that designed experiments and command sequences while another team tracked, maintained, and secured the rover in the field. The COT had no knowledge of the specific field location, thus the tests were done "blind." In addition to FIDO rover instrumentation, the COT had access to LANDSAT 7, TIMS, and AVIRIS regional coverage and color descent images. Using data from the FIDO instruments, primarily a color microscopic imager (CMI), infrared point spectrometer (IPS; 1.5-2.4 microns), and a three-color stereo panoramic camera (Pancam), the COT correlated lithologic features (mineralogy, rock types) from the simulated landing site to a regional scale. The May test results provide an example of how to relate site geology from landed rover investigations to the regional geology using remote sensing. The capability to relate mineralogic signatures using the point IR spectrometer to remotely sensed, multispectral or hyperspectral data proved to be key to integration of the in-situ and remote data. This exercise demonstrated the potential synergy between lander-based and orbital data, and highlighted the need to investigate a landing site in detail and at multiple scales.

  6. BRESEX: On board supervision, basic architecture and preliminary aspects for payload and space shuttle interface

    NASA Technical Reports Server (NTRS)

    Bergamini, E. W.; Depaula, A. R., Jr.; Martins, R. C. D. O.

    1984-01-01

    Data relative to the on-board supervision subsystem are presented which were considered in a conference between INPE and NASA personnel, with the purpose of initiating a joint effort leading to the implementation of the Brazilian remote sensing experiment (BRESEX). The BRESEX should consist, basically, of a multispectral camera for Earth observation, to be tested in a future space shuttle flight.

  7. VSF Measurements and Inversion for RaDyO

    DTIC Science & Technology

    2012-09-30

    near-surface waters, including the surf zone. APPROACH MASCOT (Multi-Angle SCattering Optical Tool) has a 30 mW 658 nm laser diode source...in Santa Barbara Channel are provided in Fig. 1. Despite the widespread use of polarized laser sources across a diversity of Navy applications, this...operations that rely on divers, cameras, laser imaging systems, and active and passive remote sensing systems. These include mine countermeasures, harbor

  8. An inexpensive active optical remote sensing instrument for assessing aerosol distributions.

    PubMed

    Barnes, John E; Sharma, Nimmi C P

    2012-02-01

    Air quality studies on a broad variety of topics, from health impacts to source/sink analyses, require information on the distributions of atmospheric aerosols over both altitude and time. An inexpensive, simple-to-implement, ground-based optical remote sensing technique has been developed to assess aerosol distributions. The technique, called CLidar (Charge Coupled Device Camera Light Detection and Ranging), provides aerosol altitude profiles over time. In the CLidar technique a relatively low-power laser transmits light vertically into the atmosphere. The transmitted laser light scatters off air molecules, clouds, and aerosols. The entire beam from ground to zenith is imaged using a CCD camera and wide-angle (100 degree) optics located a few hundred meters from the laser. The CLidar technique is optimized for low-altitude (boundary layer and lower troposphere) measurements, where most aerosols are found and where many other profiling techniques face difficulties. Currently the technique is limited to nighttime measurements. Using the CLidar technique, aerosols may be mapped over both altitude and time. The instrumentation required is portable and can easily be moved to locations of interest (e.g., downwind from factories or power plants, near highways). This paper describes the CLidar technique, its implementation, and data analysis, and offers specifics for users wishing to apply the technique for aerosol profiles.
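
    The abstract describes the viewing geometry only qualitatively; the following sketch illustrates the basic side-viewing triangulation that such a configuration implies, where a pixel's elevation angle maps to an altitude along the vertical beam. The 300 m baseline in the example is an assumed value, not one reported by the authors.

    ```python
    # Illustrative CLidar viewing geometry: a vertical laser beam imaged by a
    # camera located a few hundred metres away. A pixel viewing the beam at
    # elevation angle `elev` (radians above the horizon) corresponds to a
    # scattering altitude z = D * tan(elev), where D is the camera-to-laser
    # baseline. For a vertical beam, the scattering angle is 90 deg + elev.
    import math

    def clidar_altitude(elev_rad, baseline_m):
        """Altitude (m) of the beam segment seen at a given pixel elevation angle."""
        return baseline_m * math.tan(elev_rad)

    def scattering_angle(elev_rad):
        """Scattering angle (rad) between the upward beam and the line of sight to the camera."""
        return math.pi / 2.0 + elev_rad

    # Example with an assumed 300 m baseline: a pixel looking 45 degrees above
    # the horizon samples the beam at roughly 300 m altitude.
    print(clidar_altitude(math.radians(45.0), 300.0))   # ~300.0
    ```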

  9. Optimal design of an earth observation optical system with dual spectral and high resolution

    NASA Astrophysics Data System (ADS)

    Yan, Pei-pei; Jiang, Kai; Liu, Kai; Duan, Jing; Shan, Qiusha

    2017-02-01

    With the increasing demand for high-resolution remote sensing images from military and civilian users, countries around the world are optimistic about the prospects of higher resolution remote sensing imagery. Moreover, designing a visible/infrared integrated optical system has important value in Earth observation. Because a visible-band system cannot identify camouflage or perform reconnaissance at night, the visible camera should be combined with an infrared camera. An Earth observation optical system with dual spectral bands and high resolution is designed here. The paper mainly addresses the integrated design of the visible and infrared optical system, which makes the system lighter and smaller and achieves one satellite with two uses. The working waveband of the system covers the visible and the middle infrared (3-5 um). Clear imaging in both wavebands is achieved with a dispersive RC system. The focal length of the visible system is 3056 mm with an F/# of 10.91, and the focal length of the middle infrared system is 1120 mm with an F/# of 4. In order to suppress middle-infrared thermal radiation and stray light, a secondary imaging stage is used and the narcissus phenomenon is analyzed. A notable characteristic of the system is its simple structure, and the special requirements on Modulation Transfer Function (MTF), spot size, energy concentration, distortion, etc. are all satisfied.
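
    As a quick consistency check on the quoted optical parameters, the entrance-pupil diameter of each channel follows from D = f/N; both channels come out near 280 mm, which is consistent with (though not explicitly stated as) a shared-aperture integrated design.

    ```latex
    % Entrance-pupil diameters from the focal lengths and F-numbers quoted above
    \[
    D_{\mathrm{vis}} = \frac{f_{\mathrm{vis}}}{N_{\mathrm{vis}}}
                     = \frac{3056\ \mathrm{mm}}{10.91} \approx 280\ \mathrm{mm},
    \qquad
    D_{\mathrm{MWIR}} = \frac{f_{\mathrm{MWIR}}}{N_{\mathrm{MWIR}}}
                      = \frac{1120\ \mathrm{mm}}{4} = 280\ \mathrm{mm}.
    \]
    ```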

  10. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.

    PubMed

    Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai

    2018-03-01

    With a growing population and shrinking arable land, breeding has been considered an effective way to solve the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately and has great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrates a LiDAR sensor, a high-resolution camera, a thermal camera, and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle, and other parameters for plant biology and genomics analysis. In this paper, we described the design, functions, and testing results of the Crop 3D platform, and briefly discussed the potential applications and future development of the platform in phenotyping. We concluded that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.

  11. The effect of flight altitude to data quality of fixed-wing UAV imagery: case study in Murcia, Spain

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Keesstra, Saskia; Cammeraat, Erik

    2014-05-01

    Unmanned Aerial Systems (UAS) are becoming popular tools in the geosciences due to improving technology and processing techniques. They can potentially fill the gap between spaceborne or manned aircraft remote sensing and terrestrial remote sensing, in terms of both spatial and temporal resolution. In this study we tested a fixed-wing UAS for the application of digital landscape analysis. The focus was to analyze the effect of flight altitude on the accuracy and detail of the produced digital elevation models, derived terrain properties, and orthophotos. The aircraft was equipped with a Panasonic GX1 16 MP pocket camera with a 20 mm lens to capture normal JPEG RGB images. Images were processed using Agisoft Photoscan Pro, which includes structure-from-motion and multiview stereopsis algorithms. The test area consisted of small abandoned agricultural fields in semi-arid Murcia in southeastern Spain. The area was severely damaged after a destructive rainfall event, with damaged check dams, rills, deep gully incisions, and piping. Results suggest that careful decisions on flight altitude are essential to find a balance between area coverage, ground sampling distance, UAS ground speed, camera processing speed, and the accurate registration of specific soil erosion features of interest.
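
    To make the altitude trade-off concrete, the ground sampling distance of a nadir-pointing frame camera can be estimated from the pixel pitch, focal length, and flight altitude. The 20 mm focal length comes from the abstract; the roughly 3.7 um pixel pitch below is an assumed, typical value for a 16 MP Micro Four Thirds sensor and is used for illustration only.

    ```python
    # Ground sampling distance (GSD) for a nadir-pointing frame camera:
    #   GSD = pixel_pitch * altitude / focal_length
    # The 20 mm focal length is from the abstract; the ~3.7 um pixel pitch is an
    # assumed value for a 16 MP Micro Four Thirds sensor, for illustration only.
    def gsd_m(altitude_m, focal_length_mm=20.0, pixel_pitch_um=3.7):
        return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3)

    for h in (50, 100, 200):   # candidate flight altitudes in metres
        print(f"{h:4d} m  ->  GSD ~ {gsd_m(h) * 100:.1f} cm")
    # ~0.9 cm at 50 m, ~1.9 cm at 100 m, ~3.7 cm at 200 m
    ```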

  12. Optimizing Radiometric Fidelity to Enhance Aerial Image Change Detection Utilizing Digital Single Lens Reflex (DSLR) Cameras

    NASA Astrophysics Data System (ADS)

    Kerr, Andrew D.

    Determining optimal imaging settings and best practices for the capture of aerial imagery using consumer-grade digital single lens reflex (DSLR) cameras should enable remote sensing scientists to generate consistent, high quality, and low cost image data sets. Radiometric optimization, image fidelity, and image capture consistency and repeatability were evaluated in the context of detailed image-based change detection. The impetus for this research is, in part, a dearth of relevant contemporary literature on the utilization of consumer-grade DSLR cameras for remote sensing and the best practices associated with their use. The main radiometric control settings on a DSLR camera, EV (Exposure Value), WB (White Balance), light metering, ISO, and aperture (f-stop), are variables that were altered and controlled over the course of several image capture missions. These variables were compared for their effects on dynamic range, intra-frame brightness variation, visual acuity, temporal consistency, and the detectability of simulated cracks placed in the images. This testing was conducted from a terrestrial, rather than an airborne, collection platform, due to the large number of images per collection and the desire to minimize inter-image misregistration. The results point to a range of slightly underexposed exposure values as preferable for change detection and noise minimization. The makeup of the scene, the sensor, and the aerial platform influence the selection of aperture and shutter speed, which, along with other variables, allow estimation of the apparent image motion (AIM) blur in the resulting images. The importance of image edges in the intended application will in part dictate the lowest usable f-stop and allow the user to select a more optimal shutter speed and ISO. The single most important camera capture variable is exposure bias (EV), with a full dynamic range, a wide distribution of DN values, and high visual contrast and acuity occurring around -0.7 to -0.3 EV exposure bias. The ideal value for sensor gain was found to be ISO 100, with ISO 200 less desirable. This study offers researchers a better understanding of the effects of camera capture settings on RSI pairs and their influence on image-based change detection.
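
    The abstract notes that platform speed, shutter speed, and ground sampling distance allow the apparent image motion (AIM) blur to be estimated; a minimal version of that estimate is sketched below. All numeric values in the example are hypothetical.

    ```python
    # Apparent image motion (AIM) blur, expressed in pixels:
    #   blur_px = ground_speed * exposure_time / GSD
    # A common rule of thumb is to keep this well below one pixel; the example
    # numbers are hypothetical and only illustrate the calculation.
    def aim_blur_px(ground_speed_mps, exposure_s, gsd_m):
        return ground_speed_mps * exposure_s / gsd_m

    # e.g. 30 m/s platform, 1/1000 s shutter, 2 cm GSD:
    print(aim_blur_px(30.0, 1 / 1000, 0.02))   # 1.5 px -> likely too much blur
    print(aim_blur_px(30.0, 1 / 4000, 0.02))   # 0.375 px -> acceptable
    ```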

  13. Application of Sensor Fusion to Improve Uav Image Classification

    NASA Astrophysics Data System (ADS)

    Jabari, S.; Fathollahi, F.; Zhang, Y.

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increased accuracy of image classification. Here, we tested two sensor fusion configurations by using a panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to those acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on board in UAV missions and performing image fusion can help achieve higher quality images and, accordingly, higher accuracy classification results.
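
    The abstract does not state which fusion algorithm was used; as an illustrative stand-in, the sketch below applies a simple Brovey-style intensity substitution to fuse a co-registered panchromatic band with lower-resolution colour or multispectral bands.

    ```python
    # Brovey-style pan-sharpening as an illustrative stand-in for the sensor
    # fusion step: multispectral bands (already resampled to the pan grid and
    # co-registered) are rescaled so their intensity matches the pan band.
    import numpy as np

    def brovey_fusion(ms, pan, eps=1e-6):
        """ms: (bands, H, W) float array resampled to the pan grid; pan: (H, W)."""
        intensity = ms.mean(axis=0)                  # simple intensity estimate
        ratio = pan / (intensity + eps)              # per-pixel sharpening ratio
        return ms * ratio[None, :, :]                # sharpened multispectral stack
    ```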

  14. Completely optical orientation determination for an unstabilized aerial three-line camera

    NASA Astrophysics Data System (ADS)

    Wohlfeil, Jürgen

    2010-10-01

    Aerial line cameras allow the fast acquisition of high-resolution images at low cost. Unfortunately, measuring the camera's orientation at the necessary rate and precision requires considerable effort unless extensive camera stabilization is used; however, stabilization also entails high cost, weight, and power consumption. This contribution shows that it is possible to derive the absolute exterior orientation of an unstabilized line camera entirely from its images and global position measurements. The presented approach is based on previous work on the determination of the relative orientation of subsequent lines using optical information from the remote sensing system. The relative orientation is used to pre-correct the line images, in which homologous points can then be reliably determined using the SURF operator. Together with the position measurements, these points are used to determine the absolute orientation from the relative orientations via bundle adjustment of a block of overlapping line images. The approach was tested on a flight with DLR's RGB three-line camera MFC. To evaluate the precision of the resulting orientation, measurements from a high-end navigation system and ground control points are used.
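
    As a rough sketch of the homologous-point step described above, the snippet below matches features between two overlapping, pre-corrected image strips with OpenCV. ORB is used here as a freely available stand-in for the SURF operator (which normally requires the opencv-contrib build), and the file names are placeholders.

    ```python
    # Homologous-point matching between two overlapping (pre-corrected) image
    # strips. ORB is used as a freely available stand-in for the SURF operator
    # mentioned above; the image file names are placeholders.
    import cv2

    img_a = cv2.imread("strip_a.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("strip_b.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=5000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Brute-force Hamming matching with a cross-check to reject weak matches;
    # the surviving point pairs would feed the bundle adjustment of the block.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    points = [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in matches[:500]]
    print(f"{len(points)} candidate homologous points")
    ```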

  15. San Juan National Forest Land Management Planning Support System (LMPSS) requirements definition

    NASA Technical Reports Server (NTRS)

    Werth, L. F. (Principal Investigator)

    1981-01-01

    The role of remote sensing data as it relates to a three-component land management planning system (geographic information, data base management, and planning model) can be understood only when user requirements are known. Personnel at the San Juan National Forest in southwestern Colorado were interviewed to determine data needs for managing and monitoring timber, rangelands, wildlife, fisheries, soils, water, geology and recreation facilities. While all the information required for land management planning cannot be obtained using remote sensing techniques, valuable information can be provided for the geographic information system. A wide range of sensors such as small and large format cameras, synthetic aperture radar, and LANDSAT data should be utilized. Because of the detail and accuracy required, high altitude color infrared photography should serve as the baseline data base and be supplemented and updated with data from the other sensors.

  16. Combined Infrared Stereo and Laser Ranging Cloud Measurements from Shuttle Mission STS-85

    NASA Technical Reports Server (NTRS)

    Lancaster, R. S.; Spinhirne, J. D.; Manizade, K. F.

    2004-01-01

    Multiangle remote sensing provides a wealth of information for earth and climate monitoring, such as the ability to measure the height of cloud tops through stereoscopic imaging. As technology advances so do the options for developing spacecraft instrumentation versatile enough to meet the demands associated with multiangle measurements. One such instrument is the infrared spectral imaging radiometer, which flew as part of mission STS-85 of the space shuttle in 1997 and was the first earth-observing radiometer to incorporate an uncooled microbolometer array detector as its image sensor. Specifically, a method for computing cloud-top height with a precision of +/- 620 m from the multispectral stereo measurements acquired during this flight has been developed, and the results are compared with coincident direct laser ranging measurements from the shuttle laser altimeter. Mission STS-85 was the first space flight to combine laser ranging and thermal IR camera systems for cloud remote sensing.
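
    For readers unfamiliar with stereoscopic height retrieval, the basic along-track relation is summarized below for fore- and aft-looking views on opposite sides of nadir; the view angles are generic symbols, not parameters of the experiment described above.

    ```latex
    % A cloud top at height h above the reference surface appears displaced by a
    % parallax p between two views at nadir angles theta_1 and theta_2:
    \[
    p = h\,(\tan\theta_1 + \tan\theta_2)
    \quad\Longrightarrow\quad
    h = \frac{p}{\tan\theta_1 + \tan\theta_2},
    \]
    % so the achievable height precision scales directly with how well p is measured.
    ```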

  17. Remote sensing and spectral analysis of plumes from ocean dumping in the New York Bight Apex

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1980-01-01

    The application of the remote sensing techniques of aerial photography and multispectral scanning in the qualitative and quantitative analysis of plumes from ocean dumping of waste materials is investigated in the New York Bight Apex. Plumes resulting from the dumping of acid waste and sewage sludge were observed by Ocean Color Scanner at an altitude of 19.7 km and by Modular Multispectral Scanner and mapping camera at an altitude of 3.0 km. Results of the qualitative analysis of multispectral and photographic data for the mapping, location, and identification of pollution features without concurrent sea truth measurements are presented which demonstrate the usefulness of in-scene calibration. Quantitative distributions of the suspended solids in sewage sludge released in spot and line dumps are also determined by a multiple regression analysis of multispectral and sea truth data.

  18. A novel technique to monitor thermal discharges using thermal infrared imaging.

    PubMed

    Muthulakshmi, A L; Natesan, Usha; Ferrer, Vincent A; Deepthi, K; Venugopalan, V P; Narasimhan, S V

    2013-09-01

    Coastal temperature is an important indicator of water quality, particularly in regions where delicate ecosystems sensitive to water temperature are present. Remote sensing methods are highly reliable for assessing thermal dispersion. The plume dispersion from the thermal outfall of the nuclear power plant at Kalpakkam, on the southeast coast of India, was investigated from March to December 2011 using thermal infrared images along with field measurements. The absolute temperature provided by the thermal infrared (TIR) images is used in the ArcGIS environment to generate a spatial pattern of the plume movement. Good correlation of the temperature measured by the TIR camera with the field data (r² = 0.89) makes it a reliable method for the thermal monitoring of the power plant effluents. The study shows that the remote sensing technique provides an effective means of monitoring the thermal distribution pattern in coastal waters.

  19. UAV remote sensing capability for precision agriculture, forestry and small natural reservation monitoring

    NASA Astrophysics Data System (ADS)

    Šedina, Jaroslav; Pavelka, Karel; Raeva, Paulina

    2017-04-01

    For monitoring ecologically valuable areas, precision agriculture, and forestry, thematic maps or small GIS are needed. Remotely Piloted Aircraft Systems (RPAS) data can be obtained on demand in a short time with cm resolution. Data collection is environmentally friendly and low-cost from an economic point of view. This contribution focuses on using an eBee drone for mapping and monitoring a national nature reserve that is not open to the public and is partly inaccessible because of its moorland nature. Based on new equipment (thermal imager, multispectral imager, and NIR, NIR red-edge, and VIS cameras), we have started new projects in precision agriculture and forestry.

  20. UAV-based remote sensing of the Heumoes landslide, Austria Vorarlberg

    NASA Astrophysics Data System (ADS)

    Niethammer, U.; Joswig, M.

    2009-04-01

    The Heumoes landslide is located in the eastern Vorarlberg Alps, Austria, 10 km southeast of Dornbirn. The landslide extends about 2000 m in the west-east direction and about 500 m at its widest in the north-south direction. It occurs between elevations of 940 m in the east and 1360 m in the west; slope angles of more than 60 % can be observed, as well as almost flat areas. Its total volume is estimated at 9,400,000 cubic meters and its average velocities amount to a few centimeters per year. Surface signatures or 'photolineations' of creeping landslides, e.g. fractures and rupture lines in sediments and street pavings, and vegetation contrasts caused by changes of the water table in shallow vegetation, can in principle be resolved by remote sensing. The necessary ground cell resolution of a few centimeters, however, generally cannot be achieved by routine aerial or satellite imagery. The fast technological progress of unmanned aerial vehicles (UAV) and the reduced payload afforded by miniaturized optical cameras now allow UAV remote sensing applications well below the high financial limits of military intelligence. Even with 'low-cost' equipment, the necessary centimeter-scale ground cell resolution can be achieved by adapting the flight altitude to some tens to one hundred meters. Operated by scientists experienced with remote-control flight models, UAV remote sensing can now be performed routinely, and campaign-wise after any significant event such as heavy rainfall or a partial mudflow. We have investigated a concept of UAV-borne remote sensing based on motorized gliders and four-propeller helicopters or 'quad-rotors'. Several missions were flown over the Heumoes landslide. Between 2006 and 2008, three series of UAV-borne photographs of the Heumoes landslide were taken and combined into orthomosaics of the slope area with a ground cell resolution of a few centimeters. We will present the concept of our low-cost quad-rotor UAV system and first results of the image-processing-based evaluation of the acquired images to characterize spatial and temporal details of landslide behaviour. We will also sketch first schemes of joint interpretation or 'data fusion' of UAV-based remote sensing with the results from geophysical mapping of underground distribution of soil moisture and fracture processes (Walter & Joswig, EGU 2009).

  1. EVA 4 activity on Flight Day 7 to service the Hubble Space Telescope

    NASA Image and Video Library

    1997-02-17

    S82-E-5652 (17 Feb. 1997) --- Astronaut Gregory J. Harbaugh (solid stripe on EMU) uses Remote Manipulator System (RMS) as a cherry-picker device to service Hubble Space Telescope (HST). In cooperation with astronaut Joseph R. Tanner, nearby, the mission specialist was in the process of replacing the HST's Magnetic Sensing System (MSS) protective caps with new, permanent covers. This view was taken with an Electronic Still Camera (ESC).

  2. Snowflake Visualization

    NASA Astrophysics Data System (ADS)

    Bliven, L. F.; Kucera, P. A.; Rodriguez, P.

    2010-12-01

    NASA Snowflake Video Imagers (SVIs) enable snowflake visualization at diverse field sites. The natural variability of frozen precipitation is a complicating factor for remote sensing retrievals in high latitude regions. Particle classification is important for understanding snow/ice physics, remote sensing polarimetry, bulk radiative properties, surface emissivity, and ultimately, precipitation rates and accumulations. Yet intermittent storms, low temperatures, high winds, remote locations and complex terrain can impede us from observing falling snow in situ. SVI hardware and software have some special features. The standard camera and optics yield 8-bit gray-scale images with resolution of 0.05 x 0.1 mm, at 60 frames per second. Gray-scale images are highly desirable because they display contrast that aids particle classification. Black and white (1-bit) systems display no contrast, so there is less information to recognize particle types, which is particularly burdensome for aggregates. Data are analyzed at one-minute intervals using NASA's Precipitation Link Software that produces (a) Particle Catalogs and (b) Particle Size Distributions (PSDs). SVIs can operate nearly continuously for long periods (e.g., an entire winter season), so natural variability can be documented. Let’s summarize results from field studies this past winter and review some recent SVI enhancements. During the winter of 2009-2010, SVIs were deployed at two sites. One SVI supported weather observations during the 2010 Winter Olympics and Paralympics. It was located close to the summit (Roundhouse) of Whistler Mountain, near the town of Whistler, British Columbia, Canada. In addition, two SVIs were located at the King City Weather Radar Station (WKR) near Toronto, Ontario, Canada. Access was prohibited to the SVI on Whistler Mountain during the Olympics due to security concerns. So to meet the schedule for daily data products, we operated the SVI by remote control. We also upgraded the Precipitation Link Software to allow operator selection of image sub-sampling interval during data processing. Thus quick-look data products were delivered on schedule, even for intense storms that generated large data files. Approximately 11 million snowflakes were recorded and we present highlights from the Particle Catalog and the PSDs obtained during the 2010 Winter Olympics and Paralympics. On the other hand, the SVIs at King Radar, Ontario had a standard resolution camera and a higher resolution camera (0.1 x 0.05 mm and 0.05 x 0.05 mm, respectively). The upgraded camera operated well. Using observations from the King Radar site, we will discuss camera durability and data products from the upgraded SVI. During the ’10-11 winter, a standard SVI is deployed in Finland as part of the Light Precipitation Validation Experiment. Two higher resolution SVIs are also deployed in Canada at a field site ~30km from WKR, which will provide data for validation of radar polarization signatures and satellite observations.

  3. A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wan, Chao; Yuan, Fuh-Gwo

    2017-04-01

    In this paper a method for damage detection in beam structures using a high-speed camera is presented. Traditional methods of damage detection in structures typically involve contact sensors (i.e., piezoelectric sensors or accelerometers) or non-contact sensors (i.e., laser vibrometers), which can be costly and time consuming when inspecting an entire structure. With the popularity of digital cameras and the development of computer vision technology, video cameras offer a viable measurement capability, including higher spatial resolution, remote sensing, and low cost. In this study, a damage detection method based on a high-speed camera is proposed. The system setup comprises a high-speed camera and a line laser which can capture the out-of-plane displacement of a cantilever beam. A cantilever beam with an artificial crack was excited and the vibration process was recorded by the camera. A methodology called motion magnification, which can amplify subtle motions in a video, is used for modal identification of the beam. A finite element model was used for validation of the proposed method. Suggestions for applications of this methodology and challenges in future work will be discussed.
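
    Published Eulerian motion magnification operates on a spatial (or complex steerable) pyramid; purely as an illustration of the underlying idea of amplifying subtle temporal variations in a chosen frequency band, a much-simplified, intensity-only sketch is given below. It assumes the video has already been loaded as a (frames, height, width) array and is not the authors' implementation.

    ```python
    # Greatly simplified, intensity-only illustration of motion magnification:
    # band-pass filter each pixel's time series around the expected vibration
    # frequency and add the amplified band back to the video. Real Eulerian
    # magnification works on a spatial (or complex steerable) pyramid; this
    # sketch only conveys the temporal-filtering idea.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def magnify(frames, fps, f_lo, f_hi, alpha=20.0):
        """frames: (T, H, W) float array; returns the motion-magnified stack."""
        b, a = butter(2, [f_lo / (fps / 2), f_hi / (fps / 2)], btype="band")
        band = filtfilt(b, a, frames, axis=0)    # temporal band-pass per pixel
        return frames + alpha * band              # amplified subtle variations
    ```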

  4. The International Space Station: A Unique Platform for Remote Sensing of Natural Disasters

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Evans, Cynthia A.

    2014-01-01

    Assembly of the International Space Station (ISS) was completed in 2012, and the station is now fully operational as a platform for remote sensing instruments tasked with collecting scientific data about the Earth system. Remote sensing systems are mounted inside the ISS, primarily in the U.S. Destiny Module's Window Observational Research Facility (WORF), or are located on the outside of the ISS on any of several attachment points. While NASA and other space agencies have had remote sensing systems orbiting Earth and collecting publicly available data since the early 1970s, these sensors are carried onboard free-flying, unmanned satellites. These satellites are traditionally placed into Sun-synchronous polar orbits that allow imaging of the entire surface of the Earth to be repeated with approximately the same Sun illumination (typically local solar noon) over specific areas, with set revisit times that allow uniform data to be taken over long time periods and enable straightforward analysis of change over time. In contrast, the ISS has an inclined, Sun-asynchronous orbit (the solar illumination for data collections over any location changes as the orbit precesses) that carries it over locations on the Earth between approximately 52 degrees north and 52 degrees south latitude (figure 1). The ISS is also unique among NASA orbital platforms in that it has a human crew. The presence of a crew provides options not available to robotic sensors and platforms, such as the ability to collect unscheduled data of an unfolding event using handheld digital cameras as part of the Crew Earth Observations (CEO) facility and on-the-fly assessment of environmental conditions, such as cloud cover, to determine whether conditions are favorable for data collection. The crew can also swap out internal sensor systems installed in the WORF as needed. The ISS orbit covers more than 90 percent of the inhabited surface of the Earth, allowing the ISS to pass over the same ground locations at different times of the day and night. This is important for two reasons: 1) certain surface processes (e.g., development of coastal fog banks) occur at times other than local solar noon, making it difficult to collect relevant data from traditional satellite platforms, and 2) it provides opportunities for the ISS to collect data for short-duration events, such as natural disasters, that polar-orbiting satellites may miss due to their orbital dynamics - in essence, the ISS can be "in the right place at the right time" to collect data. An immediate application of ISS remote sensing data collection is that the data can be used to provide information for humanitarian aid after a natural disaster. This activity contributes directly to the station's Benefits to Humanity mission. The International Charter, Space and Major Disasters (also known as the International Disaster Charter, or IDC) is an agreement between agencies of several countries to provide - on a best-effort basis - remotely sensed data related to natural disasters to requesting countries in support of disaster response. In the United States, the lead agency for interaction with the IDC is the United States Geological Survey (USGS); when an IDC request, or activation, is received, the USGS notifies the science teams for NASA instruments with targeting information for data collection. In the case of the ISS, Earth scientists in the JSC ARES Directorate, in association with the ISS Program Science Office, coordinate targeting and data collection with the USGS.
If data is collected, it is passed back to the USGS for posting on its Hazards Data Distribution System and made available for download. The ISS was added to the USGS's list of NASA remote sensing assets that could respond to IDC activations in May 2012. Initially, the NASA ISS sensor systems available to respond to IDC activations included the ISS Agricultural Camera (ISSAC), an internal multispectral visible-near infrared wavelength system mounted in the WORF; CEO, a project that collects imagery through the ISS windows using off-the-shelf handheld digital visible-wavelength cameras; and the Hyperspectral Imager for the Coastal Oceans (HICO), a visible to near-infrared system mounted externally on the Japanese Experiment Module - Exposed Facility. Since May 2012, there have been 37 IDC activations; ISS sensor systems have collected data for 10 of these events.

  5. MVO Automation Platform: Addressing Unmet Needs in Clinical Laboratories with Microcontrollers, 3D Printing, and Open-Source Hardware/Software.

    PubMed

    Iglehart, Brian

    2018-05-01

    Laboratory automation improves test reproducibility, which is vital to patient care in clinical laboratories. Many small and specialty laboratories are excluded from the benefits of automation due to low sample numbers, cost, space, and/or lack of automation expertise. The Minimum Viable Option (MVO) automation platform was developed to address these hurdles and fulfill an unmet need. Consumer 3D printing enabled rapid iterative prototyping to allow for a variety of instrumentation and assay setups and procedures. Three MVO versions have been produced. MVOv1.1 successfully performed part of a clinical assay, and results were comparable to those of commercial automation. Raspberry Pi 3 Model B (RPI3) single-board computers with Sense HAT (Hardware Attached on Top) and Raspberry Pi Camera Module V2 hardware were remotely accessed and evaluated for their suitability to qualify the latest MVOv1.2 platform. Sense HAT temperature, barometric pressure, and relative humidity sensors were stable in climate-controlled environments and are useful in identifying appropriate laboratory spaces for automation placement. The RPI3 with camera, plus a digital dial indicator, logged axis-travel experiments. The RPI3 with camera and Sense HAT as a light source showed promise when used for photometric dispensing tests. Individual well standard curves were necessary for well-to-well light and path length compensations.
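
    The environmental checks described above map directly onto the Sense HAT's Python API; a minimal logging sketch is shown below. The one-minute interval and the CSV file name are arbitrary choices for illustration.

    ```python
    # Minimal environmental logger for qualifying a laboratory space, using the
    # Sense HAT Python API on a Raspberry Pi 3. The one-minute interval and the
    # CSV file name are arbitrary choices for illustration.
    import csv
    import time
    from datetime import datetime
    from sense_hat import SenseHat

    sense = SenseHat()
    with open("lab_environment_log.csv", "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "temperature_C", "pressure_mbar", "humidity_pct"])
        while True:
            writer.writerow([datetime.now().isoformat(timespec="seconds"),
                             round(sense.get_temperature(), 2),
                             round(sense.get_pressure(), 2),
                             round(sense.get_humidity(), 2)])
            f.flush()
            time.sleep(60)
    ```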

  6. Remote camera observations of lava dome growth at Mount St. Helens, Washington, October 2004 to February 2006: Chapter 11 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Poland, Michael P.; Dzurisin, Daniel; LaHusen, Richard G.; Major, John J.; Lapcewich, Dennis; Endo, Elliot T.; Gooding, Daniel J.; Schilling, Steve P.; Janda, Christine G.; Sherrod, David R.; Scott, William E.; Stauffer, Peter H.

    2008-01-01

    Images from a Web-based camera (Webcam) located 8 km north of Mount St. Helens and a network of remote, telemetered digital cameras were used to observe eruptive activity at the volcano between October 2004 and February 2006. The cameras offered the advantages of low cost, low power, flexibility in deployment, and high spatial and temporal resolution. Images obtained from the cameras provided important insights into several aspects of dome extrusion, including rockfalls, lava extrusion rates, and explosive activity. Images from the remote, telemetered digital cameras were assembled into time-lapse animations of dome extrusion that supported monitoring, research, and outreach efforts. The wide-ranging utility of remote camera imagery should motivate additional work, especially to develop the three-dimensional quantitative capabilities of terrestrial camera networks.

  7. NH11B-1726: FrankenRaven: A New Platform for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Dahlgren, Robert; Fladeland, Matthew M.; Pinsker, Ethan A.; Jasionowicz, John P.; Jones, Lowell L.; Pscheid, Matthew J.

    2016-01-01

    Small, modular aircraft are an emerging technology with a goal to maximize flexibility and enable multi-mission support. This reports the progress of an unmanned aerial system (UAS) project conducted at the NASA Ames Research Center (ARC) in 2016. This interdisciplinary effort builds upon the success of the 2014 FrankenEye project to apply rapid prototyping techniques to UAS, to develop a variety of platforms to host remote sensing instruments. In 2016, ARC received AeroVironment RQ-11A and RQ-11B Raven UAS from the US Department of the Interior, Office of Aviation Services. These aircraft have electric propulsion, a wingspan of roughly 1.3m, and have demonstrated reliability in challenging environments. The Raven airframe is an ideal foundation to construct more complex aircraft, and student interns using 3D printing were able to graft multiple Raven wings and fuselages into FrankenRaven aircraft. Aeronautical analysis shows that the new configuration has enhanced flight time, payload capacity, and distance compared to the original Raven. The FrankenRaven avionics architecture replaces the mil-spec avionics with COTS technology based upon the 3DR Pixhawk PX4 autopilot with a safety multiplexer for failsafe handoff to 2.4 GHz RC control and 915 MHz telemetry. This project demonstrates how design reuse, rapid prototyping, and modular subcomponents can be leveraged into flexible airborne platforms that can host a variety of remote sensing payloads and even multiple payloads. Modularity advances a new paradigm: mass-customization of aircraft around given payload(s). Multi-fuselage designs are currently under development to host a wide variety of payloads including a zenith-pointing spectrometer, a magnetometer, a multi-spectral camera, and a RGB camera. After airworthiness certification, flight readiness review, and test flights are performed at Crows Landing airfield in central California, field data will be taken at Kilauea volcano in Hawaii and other locations.

  8. FrankenRaven: A New Platform for Remote Sensing

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Fladeland, M. M.; Pinsker, E. A.; Jasionowicz, J. P.; Jones, L. L.; Mosser, C. D.; Pscheid, M. J.; Weidow, N. L.; Kelly, P. J.; Kern, C.; Werner, C. A.; Johnson, M. S.

    2016-12-01

    Small, modular aircraft are an emerging technology with a goal to maximize flexibility and enable multi-mission support. This reports the progress of an unmanned aerial system (UAS) project conducted at the NASA Ames Research Center (ARC) in 2016. This interdisciplinary effort builds upon the success of the 2014 FrankenEye project to apply rapid prototyping techniques to UAS, to develop a variety of platforms to host remote sensing instruments. In 2016, ARC received AeroVironment RQ-11A and RQ-11B Raven UAS from the US Department of the Interior, Office of Aviation Services. These aircraft have electric propulsion, a wingspan of roughly 1.3m, and have demonstrated reliability in challenging environments. The Raven airframe is an ideal foundation to construct more complex aircraft, and student interns using 3D printing were able to graft multiple Raven wings and fuselages into "FrankenRaven" aircraft. Aeronautical analysis shows that the new configuration has enhanced flight time, payload capacity, and distance compared to the original Raven. The FrankenRaven avionics architecture replaces the mil-spec avionics with COTS technology based upon the 3DR Pixhawk PX4 autopilot with a safety multiplexer for failsafe handoff to 2.4 GHz RC control and 915 MHz telemetry. This project demonstrates how design reuse, rapid prototyping, and modular subcomponents can be leveraged into flexible airborne platforms that can host a variety of remote sensing payloads and even multiple payloads. Modularity advances a new paradigm: mass-customization of aircraft around given payload(s). Multi-fuselage designs are currently under development to host a wide variety of payloads including a zenith-pointing spectrometer, a magnetometer, a multi-spectral camera, and a RGB camera. After airworthiness certification, flight readiness review, and test flights are performed at Crows Landing airfield in central California, field data will be taken at Kilauea volcano in Hawaii and other locations.

  9. Using Remote Sensing to Determine Timing of High Altitude Grass Hay Growth Stages

    NASA Astrophysics Data System (ADS)

    Mefford, B.

    2015-12-01

    Remote sensing has become the standard for collecting data to determine potential irrigation consumptive use in Wyoming for the Green River Basin. The Green River Basin within Wyoming covers around 10.8 million acres, is located in southwestern Wyoming, and is a sub-basin of the Colorado River Basin. Grass hay is the main crop grown in the basin. The majority of the hay is grown at elevations around 7,000 feet above mean sea level. Daily potential irrigation consumptive use is calculated for the basin during the growing season (May 1st to September 30th). To determine potential irrigation consumptive use, crop coefficients, reference evapotranspiration (ET), and effective precipitation are required. Currently, crop coefficients are the hardest to determine, as most research on crop coefficients is based at lower elevations. Values of crop coefficients for grass hay still apply to high altitude grass hay, but the hay grows at a much slower rate than low elevation grass hay. To more accurately determine the timing of the growth stages of hay in this basin, time-lapse cameras were installed at two different irrigated hay fields in the basin for the 2015 growing season and took pictures automatically once a day at 1 P.M. Both of the fields also contained a permanent research-grade weather station. Imagery obtained from these cameras was used as an indicator of the timing of the major growth stages of the hay and the length of days between the stages. A crop coefficient value was applied every day in the growing season based on the results from the imagery. Daily potential ET was calculated using the crop coefficients and the data from the on-site weather stations. The final result was potential irrigation-induced crop consumptive use for each site. Using remote sensing provided necessary information that would otherwise be applied arbitrarily in determining irrigation-induced consumptive use in the Green River Basin.
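
    The daily calculation described above amounts to scaling reference ET by a growth-stage-dependent crop coefficient (ETc = Kc x ETr). The sketch below illustrates that bookkeeping; the stage dates and Kc values are purely hypothetical, whereas in the study the stage timing comes from the time-lapse imagery and the reference ET from the on-site weather stations.

    ```python
    # Daily potential crop evapotranspiration: ETc = Kc * ETr, where Kc is chosen
    # from the growth stage indicated by the time-lapse imagery and ETr comes from
    # the on-site weather station. The Kc values and stage dates below are purely
    # hypothetical and only illustrate the calculation.
    from datetime import date

    STAGE_KC = [                     # (stage start date, crop coefficient) - hypothetical
        (date(2015, 5, 1), 0.30),    # green-up
        (date(2015, 6, 15), 0.90),   # rapid growth
        (date(2015, 7, 20), 1.05),   # peak cover
        (date(2015, 9, 1), 0.80),    # post-cut / senescence
    ]

    def kc_for(day):
        """Crop coefficient for the most recent stage that has started by `day`."""
        applicable = [(start, kc) for start, kc in STAGE_KC if day >= start]
        return max(applicable)[1]    # latest start date wins

    def daily_etc(day, etr_mm):
        """Potential crop ET (mm) from reference ET measured at the weather station."""
        return kc_for(day) * etr_mm

    print(daily_etc(date(2015, 7, 25), 6.2))   # 1.05 * 6.2 = 6.51 mm
    ```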

  10. Laying the foundation to use Raspberry Pi 3 V2 camera module imagery for scientific and engineering purposes

    NASA Astrophysics Data System (ADS)

    Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James

    2017-01-01

    A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
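
    The characterization depends on raw (Bayer) frames captured with exposure and gain held fixed; the sketch below shows one way to acquire such a frame with the picamera library. The resolution matches the V2.1 module, but the ISO and shutter values are arbitrary examples.

    ```python
    # Minimal raw (Bayer) capture with fixed exposure and gain, as needed for
    # linearity and dark-frame characterization. Requires the `picamera` library
    # on a Raspberry Pi; the ISO and shutter values are arbitrary examples.
    import time
    from picamera import PiCamera

    with PiCamera() as camera:
        camera.resolution = (3280, 2464)   # full frame for the V2.1 (IMX219) module
        camera.iso = 100                   # fixed analog gain
        time.sleep(2)                      # let gains settle before locking them
        camera.shutter_speed = 10000       # 10 ms exposure, in microseconds
        camera.exposure_mode = "off"       # lock exposure
        camera.awb_mode = "off"            # disable auto white balance
        camera.awb_gains = (1.0, 1.0)
        # bayer=True appends the raw sensor data to the JPEG metadata, from which
        # the unprocessed Bayer frame can be extracted for analysis.
        camera.capture("raw_frame.jpg", format="jpeg", bayer=True)
    ```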

  11. Standoff aircraft IR characterization with ABB dual-band hyper spectral imager

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Lantagne, Stéphane; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc

    2012-09-01

    Remote sensing infrared characterization of rapidly evolving events generally involves the combination of a spectro-radiometer and infrared camera(s) as separate instruments. Time synchronization, spatial co-registration, consistent radiometric calibration, and managing several systems are important challenges to overcome; they complicate the target infrared characterization data processing and increase the sources of error affecting the final radiometric accuracy. MR-i is a dual-band hyperspectral imaging spectro-radiometer that combines two 256 x 256 pixel infrared cameras and an infrared spectro-radiometer into one single instrument. This field instrument generates spectral datacubes in the MWIR and LWIR. It is designed to acquire the spectral signatures of rapidly evolving events. The design is modular. The spectrometer has two output ports configured with two simultaneously operated cameras to either widen the spectral coverage or increase the dynamic range of the measured amplitudes. Various telescope options are available for the input port. Recent platform developments and field trial measurement performance will be presented for a system configuration dedicated to the characterization of airborne targets.

  12. MicMac GIS application: free open source

    NASA Astrophysics Data System (ADS)

    Duarte, L.; Moutinho, O.; Teodoro, A.

    2016-10-01

    The use of Remotely Piloted Aerial Systems (RPAS) for remote sensing applications is becoming more frequent, as on-board camera technologies and the platforms themselves are becoming serious contenders to satellite and airplane imagery. MicMac is a photogrammetric tool for image matching that can be used in different contexts. It is open source software and can be used from the command line or through a graphic interface (for each command). The main objective of this work was the integration of MicMac with QGIS, which is also open source software, in order to create a new open source tool applied to photogrammetry/remote sensing. The Python language was used to develop the application. This tool would be very useful in the manipulation and 3D modelling of a set of images. The aim was to create a toolbar in QGIS providing the basic functionalities through intuitive graphic interfaces. The toolbar is composed of three buttons: producing the point cloud, creating the Digital Elevation Model (DEM), and producing the orthophoto of the study area. The application was tested with 35 photos, a subset of images acquired by an RPAS in the Aguda beach area, Porto, Portugal. They were used to create a 3D terrain model and, from this model, to obtain an orthophoto and the corresponding DEM. The code is open and can be modified according to the user requirements. This integration, combined with GIS capabilities, would be very useful to the photogrammetry and remote sensing community.
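
    As an illustration of how such a toolbar can be wired together, the fragment below registers a single QGIS toolbar action that shells out to the MicMac command line (mm3d). The mm3d arguments are placeholders only; the actual plugin exposes three actions (point cloud, DEM, orthophoto) with their own parameters.

    ```python
    # Fragment of a minimal QGIS plugin class whose toolbar action shells out to
    # the MicMac command line (mm3d). The mm3d arguments shown are placeholders;
    # the real toolbar wires three such actions (point cloud, DEM, orthophoto).
    import subprocess
    from qgis.PyQt.QtWidgets import QAction

    class MicMacToolbar:
        def __init__(self, iface):
            self.iface = iface
            self.action = None

        def initGui(self):
            self.action = QAction("Compute tie points (MicMac)", self.iface.mainWindow())
            self.action.triggered.connect(self.run_tie_points)
            self.iface.addToolBarIcon(self.action)

        def unload(self):
            self.iface.removeToolBarIcon(self.action)

        def run_tie_points(self):
            # Placeholder MicMac call; image pattern and parameters are illustrative.
            subprocess.run(["mm3d", "Tapioca", "MulScale", ".*.JPG", "500", "1200"],
                           check=True)
    ```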

  13. The effects of spatially displaced visual feedback on remote manipulator performance

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Stuart, Mark A.

    1989-01-01

    The effects of spatially displaced visual feedback on the operation of a camera-viewed remote manipulation task are analyzed. A remote manipulation task is performed by operators exposed to the following viewing conditions: direct view of the work site; normal camera view; reversed camera view; inverted/reversed camera view; and inverted camera view. The task completion performance times are statistically analyzed with a repeated measures analysis of variance, and a Newman-Keuls pairwise comparison test is administered to the data. The reversed camera view is ranked third out of four camera viewing conditions, while the normal viewing condition is found significantly slower than the direct viewing condition. It is shown that generalizations to remote manipulation applications based upon the results of direct manipulation studies are quite useful, but they should be made cautiously.

  14. Landsat 3 return beam vidicon response artifacts

    USGS Publications Warehouse

    Clark, B.

    1981-01-01

    The return beam vidicon (RBV) sensing systems employed aboard Landsats 1, 2, and 3 have all been similar in that they have utilized vidicon tube cameras. These are not mirror-sweep scanning devices such as the multispectral scanner (MSS) sensors that have also been carried aboard the Landsat satellites. The vidicons operate more like common television cameras, using an electron gun to read images from a photoconductive faceplate. In the case of Landsats 1 and 2, the RBV system consisted of three such vidicons which collected remote sensing data in three distinct spectral bands. Landsat 3, however, utilizes just two vidicon cameras, both of which sense data in a single broad band. The Landsat 3 RBV system additionally has a unique configuration. As arranged, the two cameras can be shuttered alternately, twice each, in the same time it takes for one MSS scene to be acquired. This shuttering sequence results in four RBV "subscenes" for every MSS scene acquired, similar to the four quadrants of a square (see Figure 1). Each subscene represents a ground area of approximately 98 by 98 km. The subscenes are designated A, B, C, and D, for the northwest, northeast, southwest, and southeast quarters of the full scene, respectively. RBV data products are normally ordered, reproduced, and sold on a subscene basis and are in general referred to in this way. Each exposure from the RBV camera system presents an image which is 98 km on a side. When these analog video data are subsequently converted to digital form, the picture element, or pixel, that results is 19 m on a side with an effective resolution element of 30 m. This pixel size is substantially smaller than that obtainable in MSS images (the MSS has an effective resolution element of 73.4 m), and, when RBV images are compared to equivalent MSS images, better resolution in the RBV data is clearly evident. It is for this reason that the RBV system can be a valuable tool for remote sensing of earth resources. Until recently, RBV imagery was processed directly from wideband video tape data onto 70-mm film. This changed in September 1980 when digital production of RBV data at the NASA Goddard Space Flight Center (GSFC) began. The wideband video tape data are now subjected to analog-to-digital preprocessing and corrected both radiometrically and geometrically to produce high-density digital tapes (HDT's). The HDT data are subsequently transmitted via satellite (Domsat) to the EROS Data Center (EDC) where they are used to generate 241-mm photographic images at a scale of 1:500,000. Computer-compatible tapes of the data are also generated as digital products. Of the RBV data acquired since September 1, 1980, approximately 2,800 subscenes per month have been processed at EDC.

  15. Development and field testing of a Light Aircraft Oil Surveillance System (LAOSS)

    NASA Technical Reports Server (NTRS)

    Burns, W.; Herz, M. J.

    1976-01-01

    An experimental device consisting of a conventional TV camera with a low-light-level photo image tube and a motor-driven polarized filter arrangement was constructed to provide a remote means of discriminating the presence of oil on water surfaces. This polarized-light filtering system permitted a series of successive, rapid changes between the vertical and horizontal components of reflected polarized skylight and caused the oil-based substances to be more easily observed and identified as a flashing image against a relatively static water surface background. This instrument was flight tested, and the results, with targets of opportunity and more systematic test site data, indicate the potential usefulness of this airborne remote sensing instrument.

  16. Methods for LWIR Radiometric Calibration and Characterization

    NASA Technical Reports Server (NTRS)

    Ryan, Robert; Pagnutti, Mary; Zanoni, Vicki; Harrington, Gary; Howell, Dane; Stewart, Randy

    2002-01-01

    The utility of a thermal remote sensing system increases with its ability to retrieve surface temperature or radiance accurately. The radiometer measures the water-surface radiant temperature. Combining these measurements with atmospheric pressure, temperature, and water vapor profiles, a top-of-the-atmosphere radiance estimate can be calculated with a radiative transfer code and compared to the sensor's output. A novel approach, using an uncooled infrared camera mounted on a boom, has been developed to quantify buoy effects.
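
    As a minimal illustration of the first step in such a comparison, the sketch below converts a measured water-surface temperature into band-integrated radiance with the Planck function, before any atmospheric propagation is applied with a radiative transfer code. The 10-12 micrometer band, the emissivity, and the temperature are assumed example values, not measurements from this work.

        import numpy as np

        H = 6.626e-34   # Planck constant, J s
        C = 2.998e8     # speed of light, m/s
        KB = 1.381e-23  # Boltzmann constant, J/K

        def planck_radiance(wavelength_m, temp_k):
            # Blackbody spectral radiance in W m^-2 sr^-1 m^-1 at temp_k.
            a = 2.0 * H * C**2 / wavelength_m**5
            b = np.expm1(H * C / (wavelength_m * KB * temp_k))
            return a / b

        # Band-integrated surface-leaving radiance for an assumed 10-12 um LWIR band
        wavelengths = np.linspace(10e-6, 12e-6, 200)
        surface_temp = 290.0    # measured water-surface temperature, K (example)
        emissivity = 0.99       # assumed water emissivity
        band_radiance = emissivity * np.trapz(planck_radiance(wavelengths, surface_temp), wavelengths)
        print(f"Surface band radiance: {band_radiance:.2f} W m^-2 sr^-1")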

  17. Remote sensing of cloud droplet size distributions in DC3 with the UMBC-LACO Rainbow Polarimetric Imager (RPI)

    NASA Astrophysics Data System (ADS)

    Buczkowski, S.; Martins, J.; Fernandez-Borda, R.; Cieslak, D.; Hall, J.

    2013-12-01

    The UMBC Rainbow Polarimetric Imager is a small form factor VIS imaging polarimeter suitable for use on a number of platforms. An optical system based on a Philips prism with three Bayer-filter color detectors, each detecting a separate polarization state, allows simultaneous detection of polarization and spectral information. A Mueller matrix-like calibration scheme corrects for polarization artifacts in the optical train and allows retrieval of the polarization state of incoming light to better than 0.5%. Coupled with wide field of view optics (~90°), RPI can capture images of cloudbows over a wide range of aircraft headings and solar zenith angles for retrieval of cloud droplet size distribution (DSD) parameters. In May-June 2012, RPI was flown in a nadir port on the NASA DC-8 during the DC3 field campaign. We show examples of cloudbow DSD parameter retrievals from the campaign to demonstrate the efficacy of such a system for terrestrial atmospheric remote sensing. The accompanying RPI image from the 06/15/2012 DC3 flight illustrates this: the left panel is the raw image from the RPI 90° camera, the middle panel is the Stokes 'q' parameter retrieved from the full three-camera dataset, and the right panel is a horizontal cut in 'q' through the glory. Both the middle and right panels clearly show cloudbow features which can be fit to infer cloud DSD parameters.
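
    For orientation, the sketch below shows one common way to form the normalized Stokes parameter 'q' from three co-registered analyzer images; the 0°/45°/90° analyzer geometry is an assumption for illustration and is not the RPI's Mueller-matrix calibration scheme.

        import numpy as np

        def normalized_stokes(i0, i45, i90):
            # i0, i45, i90: co-registered intensity images behind linear analyzers
            # oriented at 0, 45 and 90 degrees (assumed geometry).
            total = i0 + i90                      # Stokes I
            q = (i0 - i90) / total                # normalized Q/I
            u = (2.0 * i45 - i0 - i90) / total    # normalized U/I
            dolp = np.hypot(q, u)                 # degree of linear polarization
            return q, u, dolp

        # Toy example with synthetic 4x4 images
        rng = np.random.default_rng(0)
        i0, i45, i90 = (rng.uniform(0.4, 1.0, (4, 4)) for _ in range(3))
        q, u, dolp = normalized_stokes(i0, i45, i90)
        print(q.mean(), u.mean(), dolp.mean())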

  18. Remote sensing of multiple vital signs using a CMOS camera-equipped infrared thermography system and its clinical application in rapidly screening patients with suspected infectious diseases.

    PubMed

    Sun, Guanghao; Nakayama, Yosuke; Dagdanpurev, Sumiyakhand; Abe, Shigeto; Nishimura, Hidekazu; Kirimoto, Tetsuo; Matsui, Takemi

    2017-02-01

    Infrared thermography (IRT) is used to screen febrile passengers at international airports, but it suffers from low sensitivity. This study explored the application of a combined visible and thermal image processing approach that uses a CMOS camera equipped with IRT to remotely sense multiple vital signs and screen patients with suspected infectious diseases. An IRT system that produced visible and thermal images was used for image acquisition. The subjects' respiration rates were measured by monitoring temperature changes around the nasal areas on thermal images; facial skin temperatures were measured simultaneously. Facial blood circulation causes tiny color changes in visible facial images that enable determination of the heart rate. A logistic regression discriminant function predicted the likelihood of infection within 10 s, based on the measured vital signs. Sixteen patients with an influenza-like illness and 22 control subjects participated in a clinical test at a clinic in Fukushima, Japan. The vital-sign-based IRT screening system had a sensitivity of 87.5% and a negative predictive value of 91.7%; these values are higher than those of conventional fever-based screening approaches. Multiple vital-sign-based screening efficiently detected patients with suspected infectious diseases. It offers a promising alternative to conventional fever-based screening.
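
    The sketch below illustrates the general form of such a logistic-regression screening score computed from facial temperature, heart rate, and respiration rate. The coefficients and threshold are invented placeholders for illustration; they are not the study's fitted discriminant function.

        import numpy as np

        # Hypothetical coefficients for illustration only; they are NOT the fitted
        # values of the study's discriminant function.
        INTERCEPT = -95.0
        COEF = np.array([2.0, 0.08, 0.5])   # facial temp (deg C), heart rate (bpm), respiration rate (breaths/min)

        def infection_probability(facial_temp_c, heart_rate_bpm, respiration_rate_bpm):
            # Logistic regression score: p = 1 / (1 + exp(-(b0 + b . x))).
            x = np.array([facial_temp_c, heart_rate_bpm, respiration_rate_bpm])
            z = INTERCEPT + COEF @ x
            return 1.0 / (1.0 + np.exp(-z))

        # Example screening decisions at an assumed 0.5 threshold
        for label, vitals in [("afebrile", (36.8, 75.0, 16.0)), ("febrile", (38.4, 110.0, 28.0))]:
            p = infection_probability(*vitals)
            print(f"{label}: p = {p:.2f} -> {'refer' if p >= 0.5 else 'pass'}")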

  19. Development of a Near Ground Remote Sensing System

    PubMed Central

    Zhang, Yanchao; Xiao, Yuzhao; Zhuang, Zaichun; Zhou, Liping; Liu, Fei; He, Yong

    2016-01-01

    Unmanned Aerial Vehicles (UAVs) have shown great potential in agriculture and are increasingly being developed for agricultural use. Many experiments are still needed to improve their performance and explore new uses, but UAV experiments are limited by conditions such as weather, location, and the time it takes to prepare a flight. To promote UAV remote sensing, a near ground remote sensing platform was developed. This platform consists of three major parts: (1) mechanical structures such as a horizontal rail, a vertical cylinder, and a three-axis gimbal; (2) power supply and control parts; (3) onboard application components. The platform covers five degrees of freedom (DOFs): horizontal, vertical, pitch, roll, and yaw. An STM32 ARM microcontroller was used as the controller of the whole platform and another STM32 MCU was used to stabilize the gimbal. The gimbal stabilizer communicates with the main controller via a CAN bus. A multispectral camera was mounted on the gimbal. Software written in C++ provides the graphical user interface, through which operating parameters are set and the working status is displayed. To test how well the system works, a laser distance meter was used to measure the slide rail's repeat accuracy, and a 3-axis vibration analyzer was used to test the system stability. Test results show that the horizontal repeat accuracy was less than 2 mm, the vertical repeat accuracy was less than 1 mm, and vibration was less than 2 g and remained at an acceptable level. This system has high accuracy and stability and can therefore be used for various near ground remote sensing studies. PMID:27164111

  20. Implications of atmospheric conditions for analysis of surface temperature variability derived from landscape-scale thermography.

    PubMed

    Hammerle, Albin; Meier, Fred; Heinl, Michael; Egger, Angelika; Leitinger, Georg

    2017-04-01

    Thermal infrared (TIR) cameras perfectly bridge the gap between (i) on-site measurements of land surface temperature (LST), providing high temporal resolution at the cost of low spatial coverage, and (ii) remotely sensed data from satellites, providing high spatial coverage at relatively low spatio-temporal resolution. While LST data from satellite (LST_sat) and airborne platforms are routinely corrected for atmospheric effects, such corrections are rarely applied to LST from ground-based TIR imagery (using TIR cameras; LST_cam). We show the consequences of neglecting atmospheric effects on LST_cam of different vegetated surfaces at the landscape scale. We compare LST measured from different platforms, focusing on the comparison of LST data from on-site radiometry (LST_osr) and LST_cam using a commercially available TIR camera in the region of Bozen/Bolzano (Italy). Given a digital elevation model and measured vertical air temperature profiles, we developed a multiple linear regression model to correct LST_cam data for atmospheric influences. We could show the distinct effect of atmospheric conditions and related radiative processes along the measurement path on LST_cam, proving the necessity of correcting LST_cam data at the landscape scale despite the relatively short measurement distances compared to remotely sensed data. Corrected LST_cam data revealed the dampening effect of the atmosphere, especially at high temperature differences between the atmosphere and the vegetated surface. Not correcting for these effects leads to erroneous LST estimates, in particular to an underestimation of the heterogeneity in LST, both in time and space. In the most pronounced case, we found a temperature range extension of almost 10 K.
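
    A minimal sketch of an atmospheric correction of this general kind is shown below: the offset between camera-derived and reference surface temperatures is regressed on per-pixel predictors and then applied as a correction. The choice of predictors (path length and mean air temperature) and all numbers are assumptions for illustration, not the regression model developed in this study.

        import numpy as np

        def fit_atmospheric_correction(lst_cam, path_length_m, air_temp_c, lst_ref):
            # Multiple linear regression of the camera-reference temperature offset
            # on assumed predictors: measurement path length and mean air temperature.
            X = np.column_stack([np.ones_like(lst_cam), path_length_m, air_temp_c])
            offset = lst_ref - lst_cam
            coeffs, *_ = np.linalg.lstsq(X, offset, rcond=None)
            return coeffs

        def apply_correction(lst_cam, path_length_m, air_temp_c, coeffs):
            X = np.column_stack([np.ones_like(lst_cam), path_length_m, air_temp_c])
            return lst_cam + X @ coeffs

        # Synthetic demonstration with made-up numbers
        rng = np.random.default_rng(1)
        path = rng.uniform(500, 3000, 50)          # path length, m
        tair = rng.uniform(5, 25, 50)              # air temperature, deg C
        lst_true = rng.uniform(10, 45, 50)         # reference surface temperature, deg C
        lst_cam = lst_true - 0.002 * path + 0.1 * (tair - 15)   # simulated atmospheric dampening
        coeffs = fit_atmospheric_correction(lst_cam, path, tair, lst_true)
        print(np.abs(apply_correction(lst_cam, path, tair, coeffs) - lst_true).max())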

  1. RESOURCESAT-2: a mission for Earth resources management

    NASA Astrophysics Data System (ADS)

    Venkata Rao, M.; Gupta, J. P.; Rattan, Ram; Thyagarajan, K.

    2006-12-01

    The Indian Space Research Organisation (ISRO) established an operational remote sensing satellite system by launching its first satellite, IRS-1A, in 1988, followed by a series of IRS spacecraft. The IRS-1C/1D satellites, with their unique combination of payloads, have taken a lead position in the global remote sensing scenario. Realising the growing user demand for a "multi"-level approach in terms of spatial, spectral, temporal and radiometric resolutions, ISRO identified Resourcesat as a continuity as well as an improved remote sensing satellite. Resourcesat-1 (IRS-P6) was launched in October 2003 using the PSLV launch vehicle and is in operational service. Resourcesat-2 is its follow-on mission, scheduled for launch in 2008. Each Resourcesat satellite carries three electro-optical cameras as its payload: LISS-3, LISS-4 and AWiFS. All three are multi-spectral push-broom scanners with linear array CCDs as detectors. LISS-3 and AWiFS operate in four identical spectral bands in the VIS-NIR-SWIR range, while LISS-4 is a high resolution camera with three spectral bands in the VIS-NIR range. In order to meet the stringent requirements of band-to-band registration and platform stability, several improvements have been incorporated in the mainframe bus configuration, such as wide-field star trackers, precision gyroscopes and an on-board GPS receiver. The Resourcesat data find application in several areas such as agricultural crop discrimination and monitoring, crop acreage/yield estimation, precision farming, water resources, forest mapping, rural infrastructure development and disaster management, to name a few. A brief description of the payload cameras, spacecraft bus elements, operational modes and a few applications is presented.

  2. Robot Towed Shortwave Infrared Camera for Specific Surface Area Retrieval of Surface Snow

    NASA Astrophysics Data System (ADS)

    Elliott, J.; Lines, A.; Ray, L.; Albert, M. R.

    2017-12-01

    Optical grain size and specific surface area (SSA) are key parameters for measuring the atmospheric interactions of snow, as well as for tracking metamorphism and for ground truthing of remote sensing data. We describe a device using a shortwave infrared camera with changeable optical bandpass filters (centered at 1300 nm and 1550 nm) that can be used to quickly measure the average SSA over an area of 0.25 m². The device and method are compared with calculations made from measurements taken with a field spectral radiometer. The instrument is designed to be towed by a small autonomous ground vehicle, and therefore rides above the snow surface on ultra high molecular weight polyethylene (UHMW) skis.

  3. Remote-controlled pan, tilt, zoom cameras at Kilauea and Mauna Loa Volcanoes, Hawai'i

    USGS Publications Warehouse

    Hoblitt, Richard P.; Orr, Tim R.; Castella, Frederic; Cervelli, Peter F.

    2008-01-01

    Lists of important volcano-monitoring disciplines usually include seismology, geodesy, and gas geochemistry. Visual monitoring - the essence of volcanology - is usually not mentioned. Yet observations of the outward appearance of a volcano provide data that are as important as those provided by the other disciplines. The eye was almost certainly the first volcano-monitoring tool used by early man. Early volcanology was mostly descriptive and was based on careful visual observations of volcanoes. There is still no substitute for the eye of an experienced volcanologist. Today, scientific instruments replace or augment our senses as monitoring tools because instruments are faster and more sensitive, work tirelessly day and night, keep better records, operate in hazardous environments, do not generate lawsuits when damaged or destroyed, and in most cases are cheaper. Furthermore, instruments are capable of detecting phenomena that are outside the reach of our senses. The human eye is now augmented by the camera. Sequences of timed images provide a record of visual phenomena that occur on and above the surface of volcanoes. Photographic monitoring is a fundamental monitoring tool; image sequences can often provide the basis for interpreting other data streams. Monitoring data are most useful when they are generated and available for analysis in real time or near real time. This report describes the current (as of 2006) system for real-time photograph acquisition and transmission from remote sites on Kilauea and Mauna Loa volcanoes to the U.S. Geological Survey Hawaiian Volcano Observatory (HVO). It also describes how the photographs are archived and analyzed. In addition to providing system documentation for HVO, we hope that the report will prove useful as a practical guide to the construction of a high-bandwidth network for the telemetry of real-time data from remote locations.

  4. Aswan High Dam in 6-meter Resolution from the International Space Station

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Astronaut photography of the Earth from the International Space Station has achieved resolutions close to those available from commercial remote sensing satellites, with many photographs having spatial resolutions of less than six meters. Astronauts take the photographs by hand and physically compensate for the motion of the spacecraft relative to the Earth while the images are being acquired. The achievement was highlighted in an article entitled 'Space Station Allows Remote Sensing of Earth to within Six Meters' published in Eos, Transactions of the American Geophysical Union. Lines painted on airport runways at the Aswan Airport served to independently validate the spatial resolution of the camera sensor. For details, see Robinson, J. A., and Evans, C. A., 2002, Space Station Allows Remote Sensing of Earth to within Six Meters, Eos, Transactions, American Geophysical Union 83(17):185, 188. Other detailed photographs posted to the Earth Observatory include the Pyramids at Giza, Bermuda, and downtown Houston. The accompanying image represents a detailed portion of digitized NASA photograph STS102-303-17 and was provided by the Earth Sciences and Image Analysis Laboratory at Johnson Space Center. Additional images taken by astronauts and cosmonauts can be viewed at the NASA-JSC Gateway to Astronaut Photography of Earth.

  5. The application of the unmanned aerial vehicle remote sensing technology in the FAST project construction

    NASA Astrophysics Data System (ADS)

    Zhu, Boqin

    2015-08-01

    The purpose of using unmanned aerial vehicle (UAV) remote sensing in the Five-hundred-meter Aperture Spherical Telescope (FAST) project is to dynamically record the construction process with high resolution imagery, monitor the environmental impact, and provide services for local environmental protection and the resettlement of residents from the reserve. This paper introduces the UAV remote sensing system used and the course design and implementation for the FAST site. Through analysis of the time-series data, we found that: (1) since 2012, the project has been widely carried out; (2) by 2013, the internal works had begun to take shape; (3) the excavation scope remained stable in 2014, the initial scale of the FAST construction emerged, and in the meantime vegetation recovery proceeded well on the bare soil areas; (4) in 2015, no environmental problems caused by the construction and no engineering geological disasters were found in the work area through interpretation of the UAV images. This paper also suggests that the UAV technology needs some improvements to fulfill surveying and mapping specifications, including new data acquisition and processing measures suited to highly diverse elevation, use of a telephoto camera, hierarchical photography at different flying heights, and terrain-adaptive adjustment using joint aerial triangulation.

  6. Using Airborne Remote Sensing to Increase Situational Awareness in Civil Protection and Humanitarian Relief - the Importance of User Involvement

    NASA Astrophysics Data System (ADS)

    Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.

    2016-06-01

    Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR), monitoring tools are being developed in which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4k camera systems, which allow for RT acquisition and pre-processing of high resolution aerial imagery, are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both situational assessment and awareness, focussing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments have been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. by aligning it closer to users' requirements.

  7. Combined Infrared Stereo and Laser Ranging Cloud Measurements from Shuttle Mission STS-85

    NASA Technical Reports Server (NTRS)

    Lancaster, Redgie S.; Spinhirne, James D.; Starr, David O'C. (Technical Monitor)

    2001-01-01

    Multi-angle remote sensing provides a wealth of information for earth and climate monitoring, and as technology advances, so do the options for developing instrumentation versatile enough to meet the demands associated with these types of measurements. In the current work, the multi-angle measurement capability of the Infrared Spectral Imaging Radiometer is demonstrated. This instrument flew as part of mission STS-85 of the space shuttle Columbia in 1997 and was the first earth-observing radiometer to incorporate an uncooled microbolometer array detector as its image sensor. Specifically, a method for computing cloud-top height from the multi-spectral stereo measurements acquired during this flight has been developed, and the results demonstrate that a vertical precision of 10.6 km was achieved. Further, the accuracy of these measurements is confirmed by comparison with coincident direct laser ranging measurements from the Shuttle Laser Altimeter. Mission STS-85 was the first space flight to combine laser ranging and thermal IR camera systems for cloud remote sensing.

  8. JPRS Report Science & Technology Japan 16th International Congress of the International Society for Photogrammetry and Remote Sensing Volume 1

    DTIC Science & Technology

    1989-01-24

    coherent noise. To overcome these disadvantages, a new holographic inverse filtering system has been developed by the authors. The inverse filter is...beam is blocked. The deblurred aerial image is formed in the image plane IP (the back focal plane of L2). The frequency of the grating used in this... impulse response of the optical system. For certain types of blurs, which include linear motion of the camera under the assumption that the picture

  9. Propagation Limitations in Remote Sensing.

    DTIC Science & Technology

    Contents: Multi-sensors and systems in remote sensing; Radar sensing systems over land; Remote sensing techniques in oceanography; Influence of...propagation media and background; Infrared techniques in remote sensing; Photography in remote sensing; Analytical studies in remote sensing.

  10. University of Virginia suborbital infrared sensing experiment

    NASA Astrophysics Data System (ADS)

    Holland, Stephen; Nunnally, Clayton; Armstrong, Sarah; Laufer, Gabriel

    2002-03-01

    An Orion sounding rocket launched from Wallops Flight Facility carried a University of Virginia payload to an altitude of 47 km and returned infrared measurements of the Earth's upper atmosphere and video images of the ocean. The payload launch was the result of a three-year undergraduate design project by a multi-disciplinary student group from the University of Virginia and James Madison University. As part of a new multi-year design course, undergraduate students designed, built, tested, and participated in the launch of a suborbital platform from which atmospheric remote sensors and other scientific experiments could operate. The first launch included a simplified atmospheric measurement system intended to demonstrate full system operation and remote sensing capabilities during suborbital flight. A thermoelectrically cooled HgCdTe infrared detector, with peak sensitivity at 10 micrometers, measured upwelling radiation, and a small camera and VCR system, aligned with the infrared sensor, provided a ground reference. Additionally, a simple orientation sensor, consisting of three photodiodes equipped with red, green, and blue dichroic filters, was tested. Temperature measurements of the upper atmosphere were successfully obtained during the flight. Video images were successfully recorded on board the payload and proved a valuable tool in the data analysis process. The photodiode system, intended as a replacement for the camera and VCR system, functioned well despite low signal amplification. This fully integrated and flight-tested payload will serve as a platform for future atmospheric sensing experiments. It is currently being modified for a second suborbital flight that will incorporate a gas filter correlation radiometry (GFCR) instrument to measure the distribution of stratospheric methane and imaging capabilities to record the chlorophyll distribution in Metompkin Bay as an indicator of pollution runoff.

  11. Scaling forest phenology from trees to the landscape using an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Klosterman, S.; Melaas, E. K.; Martinez, A.; Richardson, A. D.

    2013-12-01

    Vegetation phenology monitoring has yielded a decades-long archive documenting the impacts of global change on the biosphere. However, the coarse spatial resolution of remote sensing obscures the organism-level processes driving phenology, while point measurements on the ground limit the extent of observation. Unmanned aerial vehicles (UAVs) enable low-altitude remote sensing at higher spatial and temporal resolution than available from spaceborne platforms, and have the potential to elucidate the links between organism-scale processes and landscape-scale analyses of terrestrial phenology. This project demonstrates the use of a low-cost multirotor UAV, equipped with a consumer-grade digital camera, for observation of deciduous forest phenology and comparison to ground- and tower-based data as well as remote sensing. The UAV was flown approximately every five days during the spring green-up period in 2013 to obtain aerial photography over an area encompassing a 250 m resolution MODIS (Moderate Resolution Imaging Spectroradiometer) pixel at Harvard Forest in central Massachusetts, USA. The imagery was georeferenced and tree crowns were identified using a detailed species map of the study area. Image processing routines were used to extract canopy 'greenness' time series, which were used to calculate phenology transition dates corresponding to early, middle, and late stages of spring green-up for the dominant canopy trees. Aggregated species-level phenology estimates from the UAV data, including the mean and variance of phenology transition dates within species in the study area, were compared to model predictions based on visual assessment of a smaller sample of individual trees, indicating the extent to which limited ground observations represent the larger landscape. At an intermediate scale, the UAV data were compared to data from repeat digital photography, integrating over larger portions of canopy within and near the study area, as a validation step and to see how well tower-based approaches characterize the surrounding landscape. Finally, UAV data were compared to MODIS data to determine how tree crowns within a remote sensing pixel combine to create the aggregate landscape phenology measured by remote sensing, using an area-weighted average of the phenology of all dominant crowns.
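
    A minimal sketch of the kind of canopy greenness index and transition-date extraction described above is given below. The green chromatic coordinate and the 50%-of-amplitude threshold are common choices in phenology work and are used here as assumptions; they are not necessarily the exact routines of this study.

        import numpy as np

        def green_chromatic_coordinate(rgb_stack):
            # rgb_stack: array of shape (n_dates, height, width, 3) with R, G, B bands.
            r, g, b = rgb_stack[..., 0], rgb_stack[..., 1], rgb_stack[..., 2]
            gcc = g / (r + g + b)
            # Mean greenness of the crown/ROI for each acquisition date.
            return gcc.reshape(gcc.shape[0], -1).mean(axis=1)

        def transition_day(days, gcc, fraction=0.5):
            # Day of year at which greenness first crosses a fraction of its seasonal
            # amplitude (linear interpolation between flight dates); assumes the
            # first acquisition is below the threshold.
            target = gcc.min() + fraction * (gcc.max() - gcc.min())
            above = np.argmax(gcc >= target)
            d0, d1, g0, g1 = days[above - 1], days[above], gcc[above - 1], gcc[above]
            return d0 + (target - g0) * (d1 - d0) / (g1 - g0)

        # Toy greenness series for six flight dates
        days = np.array([110, 115, 120, 125, 130, 135])
        gcc = np.array([0.33, 0.34, 0.36, 0.40, 0.43, 0.44])
        print(round(transition_day(days, gcc), 1))   # mid green-up around day 123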

  12. Immersive telepresence system using high-resolution omnidirectional movies and a locomotion interface

    NASA Astrophysics Data System (ADS)

    Ikeda, Sei; Sato, Tomokazu; Kanbara, Masayuki; Yokoya, Naokazu

    2004-05-01

    Technology that enables users to experience a remote site virtually is called telepresence. A telepresence system using real-environment images is expected to be used in fields such as entertainment, medicine, and education. This paper describes a novel telepresence system which enables users to walk through a photorealistic virtualized environment by actual walking. To realize such a system, a wide-angle high-resolution movie is projected on an immersive multi-screen display to present users with the virtualized environment, and a treadmill is controlled according to the user's detected locomotion. In this study, we use an omnidirectional multi-camera system to acquire images of a real outdoor scene. The proposed system provides users with a rich sense of walking in a remote site.

  13. Remote Leak Detection: Indirect Thermal Technique

    NASA Technical Reports Server (NTRS)

    Clements, Sandra

    2002-01-01

    Remote sensing technologies are being considered for efficient, low cost gas leak detection. Eleven specific techniques have been identified for further study and evaluation of several of these is underway. The Indirect Thermal Technique is one of the techniques that is being explored. For this technique, an infrared camera is used to detect the temperature change of a pipe or fitting at the site of a gas leak. This temperature change is caused by the change in temperature of the gas expanding from the leak site. During the 10-week NFFP program, the theory behind the technique was further developed, experiments were performed to determine the conditions for which the technique might be viable, and a proof-of-concept system was developed and tested in the laboratory.

  14. Scaling-up camera traps: monitoring the planet's biodiversity with networks of remote sensors

    USGS Publications Warehouse

    Steenweg, Robin; Hebblewhite, Mark; Kays, Roland; Ahumada, Jorge A.; Fisher, Jason T.; Burton, Cole; Townsend, Susan E.; Carbone, Chris; Rowcliffe, J. Marcus; Whittington, Jesse; Brodie, Jedediah; Royle, Andy; Switalski, Adam; Clevenger, Anthony P.; Heim, Nicole; Rich, Lindsey N.

    2017-01-01

    Countries committed to implementing the Convention on Biological Diversity's 2011–2020 strategic plan need effective tools to monitor global trends in biodiversity. Remote cameras are a rapidly growing technology that has great potential to transform global monitoring for terrestrial biodiversity and can be an important contributor to the call for measuring Essential Biodiversity Variables. Recent advances in camera technology and methods enable researchers to estimate changes in abundance and distribution for entire communities of animals and to identify global drivers of biodiversity trends. We suggest that interconnected networks of remote cameras will soon monitor biodiversity at a global scale, help answer pressing ecological questions, and guide conservation policy. This global network will require greater collaboration among remote-camera studies and citizen scientists, including standardized metadata, shared protocols, and security measures to protect records about sensitive species. With modest investment in infrastructure, and continued innovation, synthesis, and collaboration, we envision a global network of remote cameras that not only provides real-time biodiversity data but also serves to connect people with nature.

  15. First results from the TOPSAT camera

    NASA Astrophysics Data System (ADS)

    Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve

    2017-11-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.

  16. Earth view: A business guide to orbital remote sensing

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1990-01-01

    The following subject areas are covered: Earth view - a guide to orbital remote sensing; current orbital remote sensing systems (LANDSAT, SPOT image, MOS-1, Soviet remote sensing systems); remote sensing satellite; and remote sensing organizations.

  17. Medium-sized aperture camera for Earth observation

    NASA Astrophysics Data System (ADS)

    Kim, Eugene D.; Choi, Young-Wan; Kang, Myung-Seok; Kim, Ee-Eul; Yang, Ho-Soon; Rasheed, Ad. Aziz Ad.; Arshad, Ahmad Sabirin

    2017-11-01

    Satrec Initiative and ATSB have been developing a medium-sized aperture camera (MAC) for an earth observation payload on a small satellite. Developed as a push-broom type high-resolution camera, the camera has one panchromatic and four multispectral channels. The panchromatic channel has 2.5 m, and the multispectral channels have 5 m, ground sampling distances at a nominal altitude of 685 km. The 300 mm-aperture Cassegrain telescope contains two aspheric mirrors and two spherical correction lenses. With a philosophy of building a simple and cost-effective camera, the mirrors incorporate no light-weighting, and the linear CCDs are mounted on a single PCB with no beam splitters. MAC is the main payload of RazakSAT, to be launched in 2005. RazakSAT is a 180 kg satellite including MAC, designed to provide high-resolution imagery of 20 km swath width on a near equatorial orbit (NEqO). The mission objective is to demonstrate the capability of a high-resolution remote sensing satellite system on a near equatorial orbit. This paper gives an overview of the MAC and RazakSAT programmes and presents the current development status of MAC, focusing on key optical aspects of the Qualification Model.

  18. Use of wildlife webcams - Literature review and annotated bibliography

    USGS Publications Warehouse

    Ratz, Joan M.; Conk, Shannon J.

    2010-01-01

    The U.S. Fish and Wildlife Service National Conservation Training Center requested a literature review product that would serve as a resource to natural resource professionals interested in using webcams to connect people with nature. The literature review focused on the effects on the public of viewing wildlife through webcams and on information regarding installation and use of webcams. We searched the peer-reviewed, published literature for three topics: wildlife cameras, virtual tourism, and technological nature. Very few publications directly addressed the effect of viewing wildlife webcams. The review of information on installation and use of cameras yielded information about many aspects of the use of remote photography, but not much specifically regarding webcams. Aspects of wildlife camera use covered in the literature review include: camera options, image retrieval, system maintenance and monitoring, time to assemble, power source, light source, camera mount, frequency of image recording, consequences for animals, and equipment security. Webcam technology is relatively new, and more publications regarding the use of the technology are needed. Future research should specifically study the effect that viewing wildlife through webcams has on viewers' conservation attitudes, behaviors, and sense of connectedness to nature.

  19. Systems approach to the design of the CCD sensors and camera electronics for the AIA and HMI instruments on solar dynamics observatory

    NASA Astrophysics Data System (ADS)

    Waltham, N.; Beardsley, S.; Clapp, M.; Lang, J.; Jerram, P.; Pool, P.; Auker, G.; Morris, D.; Duncan, D.

    2017-11-01

    Solar Dynamics Observatory (SDO) is imaging the Sun in many wavelengths nearly simultaneously and with a resolution ten times higher than the average high-definition television. In this paper we describe our innovative systems approach to the design of the CCD cameras for two of SDO's remote sensing instruments, the Atmospheric Imaging Assembly (AIA) and the Helioseismic and Magnetic Imager (HMI). Both instruments share use of a custom-designed 16 million pixel science-grade CCD and common camera readout electronics. A prime requirement was for the CCD to operate with significantly lower drive voltages than before, motivated by our wish to simplify the design of the camera readout electronics. Here, the challenge lies in the design of circuitry to drive the CCD's highly capacitive electrodes and to digitize its analogue video output signal with low noise and to high precision. The challenge is greatly exacerbated when forced to work with only fully space-qualified, radiation-tolerant components. We describe our systems approach to the design of the AIA and HMI CCD and camera electronics, and the engineering solutions that enabled us to comply with both mission and instrument science requirements.

  20. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark level offset. This avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
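
    The sketch below outlines the order of corrections described above (background, low-signal nonlinearity, flatfield) in schematic form. The functional form of the signal-dependent additive correction and all numbers are placeholders, not the published NAC calibration coefficients.

        import numpy as np

        def calibrate_nac_like(raw_dn, background_columns, flatfield, offset_coeff=5.0):
            # raw_dn: 2-D image of raw DN; background_columns: per-column background
            # pattern (e.g. an eight-column structure); flatfield: normalized response image.
            dn = raw_dn.astype(float) - background_columns[np.newaxis, :]

            # Placeholder signal-dependent additive correction for low signals
            # (the detector is treated as linear above DN ~600).
            low = dn < 600
            dn[low] += offset_coeff * (1.0 - dn[low] / 600.0)

            # Flatfield (pixel-to-pixel response) correction.
            return dn / flatfield

        rng = np.random.default_rng(2)
        raw = rng.integers(500, 2000, size=(16, 8)).astype(float)
        bg = np.array([3.0, 1.0, 2.0, 0.5, 4.0, 1.5, 2.5, 0.0])   # hypothetical column pattern
        flat = np.full((16, 8), 0.98)
        print(calibrate_nac_like(raw, bg, flat).mean())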

  1. Technology study of quantum remote sensing imaging

    NASA Astrophysics Data System (ADS)

    Bi, Siwen; Lin, Xuling; Yang, Song; Wu, Zhiqiang

    2016-02-01

    According to the development of remote sensing science and technology and its application requirements, quantum remote sensing is proposed. The background of quantum remote sensing, its theory and information mechanism, imaging experiments, and the status of principle-prototype research at home and abroad are first briefly introduced. We then describe the compression operator of the quantum remote sensing radiation field and the basic principles of the single-mode compression operator, the preparation of the compressed quantum light field for remote sensing image compression experiments and optical imaging, and the quantum remote sensing imaging principle prototype. Quantum remote sensing spaceborne active imaging technology is then put forward, mainly including the composition and working principle of the spaceborne active imaging system, the device for preparing and injecting compressed light for active imaging, and the quantum noise amplification device. Finally, a summary of quantum remote sensing research over the past 15 years and future developments are presented.

  2. Dust Removal on Mars Using Laser-Induced Breakdown Spectroscopy

    NASA Technical Reports Server (NTRS)

    Graff, T. G.; Morris, R. V.; Clegg, S. M.; Wiens, R. C.; Anderson, R. B.

    2011-01-01

    Dust coatings on the surface of Mars complicate and, if sufficiently thick, mask the spectral characteristics and compositional determination of underlying material from in situ and remote sensing instrumentation. The Laser-Induced Breakdown Spectroscopy (LIBS) portion of the Chemistry & Camera (ChemCam) instrument, aboard the Mars Science Laboratory (MSL) rover, will be the first active remote sensing technique deployed on Mars able to remove dust. ChemCam utilizes a 5 ns pulsed 1067 nm high-powered laser focused to less than 400 μm diameter on targets at distances up to 7 m [1,2]. With multiple laser pulses, dust and weathering coatings can be remotely analyzed and potentially removed using this technique [2,3]. A typical LIBS measurement during MSL surface operations is planned to consist of 50 laser pulses at 14 mJ, with the first 5 to 10 pulses used to analyze as well as remove any surface coating. Additionally, ChemCam's Remote Micro-Imager (RMI) is capable of resolving 200 μm details at a distance of 2 m, or 1 mm at 10 m [1,4]. In this study, we report on initial laboratory experiments conducted to characterize the removal of dust coatings using similar LIBS parameters as ChemCam under Mars-like conditions. These experiments serve to better understand the removal of surface dust using LIBS and to facilitate the analysis of ChemCam LIBS spectral data and RMI images.
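
    A simple way to handle the shot sequence described above is sketched below: the first few spectra, which mostly sample the dust coating, are averaged separately and excluded from the substrate mean. The array shapes and shot counts are assumptions for illustration.

        import numpy as np

        def mean_substrate_spectrum(shot_spectra, n_dust_shots=5):
            # shot_spectra: array of shape (n_shots, n_channels), one LIBS spectrum per
            # laser pulse. The first pulses mostly sample (and clear) the dust coating,
            # so they are averaged separately and excluded from the substrate mean.
            dust = shot_spectra[:n_dust_shots].mean(axis=0)
            substrate = shot_spectra[n_dust_shots:].mean(axis=0)
            return dust, substrate

        rng = np.random.default_rng(3)
        spectra = rng.normal(100.0, 5.0, size=(50, 6144))   # 50 pulses, assumed channel count
        dust_mean, substrate_mean = mean_substrate_spectrum(spectra)
        print(dust_mean.shape, substrate_mean.shape)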

  3. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and an optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to the coordinates of a selected or nonselected camera.

  4. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and an optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to the coordinates of a selected or nonselected camera.

  5. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computing application and one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid: the dynamic Grid workflow, based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  6. Contactless physiological signals extraction based on skin color magnification

    NASA Astrophysics Data System (ADS)

    Suh, Kun Ha; Lee, Eui Chul

    2017-11-01

    Although the human visual system is not sufficiently sensitive to perceive blood circulation, blood flow caused by cardiac activity makes slight changes on human skin surfaces. With advances in imaging technology, it has become possible to capture these changes with digital cameras. However, it is difficult to obtain clear physiological signals from such changes because of their subtlety and because of noise factors such as motion artifacts and camera sensing disturbances. We propose a method for extracting physiological signals with improved quality from skin-color videos recorded with a remote RGB camera. The results showed that our skin color magnification method clearly reveals the hidden physiological components in the time-series signal. A Korea Food and Drug Administration-approved heart rate monitor was used to verify that the resulting signal was synchronized with the actual cardiac pulse, and comparisons of signal peaks showed correlation coefficients of almost 1.0. In particular, our method can be an effective preprocessing step before applying additional postfiltering techniques to improve accuracy in image-based physiological signal extraction.
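
    For context, a minimal remote-photoplethysmography baseline (not the authors' skin color magnification method) simply averages the green channel over a skin region and band-pass filters the series around plausible heart-rate frequencies, as sketched below; the region of interest, frame rate, and band limits are assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def pulse_signal(frames, roi, fps=30.0, band_hz=(0.7, 3.0)):
            # frames: (n_frames, height, width, 3) RGB video; roi: (y0, y1, x0, x1) skin patch.
            y0, y1, x0, x1 = roi
            green = frames[:, y0:y1, x0:x1, 1].mean(axis=(1, 2))   # mean green value per frame
            green = green - green.mean()
            # Band-pass around plausible heart-rate frequencies (42-180 bpm).
            b, a = butter(3, [band_hz[0] / (fps / 2), band_hz[1] / (fps / 2)], btype="band")
            return filtfilt(b, a, green)

        def heart_rate_bpm(signal, fps=30.0):
            # Dominant frequency of the filtered signal, converted to beats per minute.
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
            return 60.0 * freqs[np.argmax(spectrum[1:]) + 1]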

  7. Design and development of an airborne multispectral imaging system

    NASA Astrophysics Data System (ADS)

    Kulkarni, Rahul R.; Bachnak, Rafic; Lyle, Stacey; Steidley, Carl W.

    2002-08-01

    Advances in imaging technology and sensors have made airborne remote sensing systems viable for many applications that require reasonably good resolution at low cost. Digital cameras are making their mark on the market by providing high resolution at very high rates. This paper describes an aircraft-mounted imaging system (AMIS) that is being designed and developed at Texas A&M University-Corpus Christi (A&M-CC) with the support of a grant from NASA. The approach is to first develop and test a one-camera system that will be upgraded into a five-camera system offering multi-spectral capabilities. AMIS is low cost, rugged, and portable, and has its own battery power source. Its immediate use will be to acquire images of the coastal area of the Gulf of Mexico for a variety of studies covering a broad spectral range from the near ultraviolet to the near infrared. This paper describes AMIS and its characteristics, discusses the process for selecting the major components, and presents the progress.

  8. Using turbulence scintillation to assist object ranging from a single camera viewpoint.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Coffaro, Joseph; Paulson, Daniel A; Rzasa, John R; Andrews, Larry C; Phillips, Ronald L; Crabbs, Robert; Davis, Christopher C

    2018-03-20

    Image distortions caused by atmospheric turbulence are often treated as unwanted noise or errors in many image processing studies. Our study, however, shows that in certain scenarios the turbulence distortion can be very helpful in enhancing image processing results. This paper describes a novel approach that uses the scintillation traits recorded on a video clip to perform object ranging with reasonable accuracy from a single camera viewpoint. Conventionally, a single camera would be confused by the perspective viewing problem, where a large object far away looks the same as a small object close by. When the atmospheric turbulence phenomenon is considered, the edge or texture pixels of an object tend to scintillate and vary more with increased distance. This turbulence-induced signature can be quantitatively analyzed to achieve object ranging with reasonable accuracy. Although turbulence inevitably causes random blurring and deformation of imaging results, it also offers convenient solutions to some remote sensing and machine vision problems which would otherwise be difficult.
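
    A toy illustration of the general idea (not the paper's algorithm) is sketched below: edge pixels are located with a gradient threshold and their temporal intensity variation is summarized; in this framework, stronger scintillation at edges suggests a longer turbulent path. The threshold and statistic are arbitrary choices.

        import numpy as np

        def edge_scintillation_index(frames, grad_threshold=20.0):
            # frames: (n_frames, height, width) grayscale video of the target.
            mean_frame = frames.mean(axis=0)
            gy, gx = np.gradient(mean_frame)
            edges = np.hypot(gx, gy) > grad_threshold          # crude edge mask
            # Temporal standard deviation of intensity at edge pixels; larger values
            # indicate stronger scintillation along the imaging path.
            temporal_std = frames.std(axis=0)
            return temporal_std[edges].mean()

        rng = np.random.default_rng(4)
        video = rng.normal(120.0, 3.0, size=(60, 64, 64))
        video[:, :, 32:] += 60.0          # synthetic vertical edge in the scene
        print(round(edge_scintillation_index(video), 2))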

  9. Automated geo/ortho registered aerial imagery product generation using the mapping system interface card (MSIC)

    NASA Astrophysics Data System (ADS)

    Bratcher, Tim; Kroutil, Robert; Lanouette, André; Lewis, Paul E.; Miller, David; Shen, Sylvia; Thomas, Mark

    2013-05-01

    The development concept paper for the MSIC system was first introduced in August 2012 by these authors. This paper describes the final assembly, testing, and commercial availability of the Mapping System Interface Card (MSIC). The 2.3kg MSIC is a self-contained, compact variable configuration, low cost real-time precision metadata annotator with embedded INS/GPS designed specifically for use in small aircraft. The MSIC was specifically designed to convert commercial-off-the-shelf (COTS) digital cameras and imaging/non-imaging spectrometers with Camera Link standard data streams into mapping systems for airborne emergency response and scientific remote sensing applications. COTS digital cameras and imaging/non-imaging spectrometers covering the ultraviolet through long-wave infrared wavelengths are important tools now readily available and affordable for use by emergency responders and scientists. The MSIC will significantly enhance the capability of emergency responders and scientists by providing a direct transformation of these important COTS sensor tools into low-cost real-time aerial mapping systems.

  10. Anisotropy of thermal infrared remote sensing over urban areas : assessment from airborne data and modeling approach

    NASA Astrophysics Data System (ADS)

    Hénon, A.; Mestayer, P.; Lagouarde, J.-P.; Lee, J. H.

    2009-09-01

    Due to the morphological complexity of the urban canopy and to the variability in thermal properties of the building materials, the heterogeneity of the surface temperatures generates a strong directional anisotropy of the thermal infrared remote sensing signal. Thermal infrared (TIR) data obtained with an airborne FLIR camera over the Toulouse (France) city centre during the CAPITOUL experiment (Feb. 2004 - Feb. 2005) show brightness temperature anisotropies ranging from 3 °C by night to more than 10 °C on sunny days. These data have been analyzed with a view to developing a simple approach to correct TIR satellite remote sensing for the canopy-generated anisotropy, and to further evaluate the sensible heat fluxes. The methodology is based on the identification of 6 different classes of surfaces: roofs, walls and grounds, sunlit or shaded, respectively. The thermo-radiative model SOLENE is used to simulate, with a 1 m resolution computational grid, the surface temperatures of an 18000 m² urban district, in the same meteorological conditions as during the observation. A pixel-by-pixel comparison with both hand-held temperature measurements and airborne camera images allows assessment of the actual values of the radiative and thermal parameters of the scene elements. SOLENE is then used to simulate a generic street-canyon geometry, whose sizes average the morphological parameters of the actual streets in the district, for 18 different geographical orientations. The simulated temperatures are then integrated for different viewing positions, taking into account shadowing and masking, and directional temperatures are determined for the 6 surface classes. The class ratios in each viewing direction are derived from images of the district generated using the POVRAY software, and used to weight the temperatures of each class and to compute the resulting directional brightness temperature at the district scale for a given sun direction (time of day). Simulated and measured anisotropies are finally compared for several flights over Toulouse in summer and winter. An inverse method is further proposed to obtain the surface temperatures from the directional brightness temperatures, which may be extended to deduce the sensible heat fluxes separately from the buildings and from the ground.
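
    The class-weighted combination described above can be sketched as follows: the directional brightness temperature is the average of six class temperatures weighted by the fraction of the image each class occupies for a given viewing direction. The class temperatures and view fractions below are placeholder values, not CAPITOUL results.

        import numpy as np

        # Six surface classes (order assumed): sunlit/shaded roofs, walls, grounds.
        CLASSES = ["roof_sun", "roof_shade", "wall_sun", "wall_shade", "ground_sun", "ground_shade"]

        def directional_brightness_temperature(class_temps, view_fractions):
            # class_temps: temperature of each class (K); view_fractions: fraction of the
            # scene occupied by each class for a given viewing direction (in the study,
            # derived from rendered images of the district).
            w = np.asarray(view_fractions, dtype=float)
            w = w / w.sum()                        # normalize in case of rounding
            return float(np.dot(w, class_temps))

        temps = np.array([325.0, 303.0, 312.0, 301.0, 310.0, 299.0])   # placeholder values, K
        nadir = [0.45, 0.15, 0.00, 0.00, 0.25, 0.15]
        oblique_toward_sunlit_walls = [0.30, 0.10, 0.25, 0.05, 0.20, 0.10]
        print(directional_brightness_temperature(temps, nadir),
              directional_brightness_temperature(temps, oblique_toward_sunlit_walls))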

  11. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.

    PubMed

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-11-17

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that: 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera; and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
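
    A minimal sketch of the vignetting correction and vegetation-index computation discussed above is given below. It assumes a flat-field image acquired over a uniform target and uses the standard NDVI formula; it is not the exact correction chain applied in this study.

        import numpy as np

        def correct_vignetting(image, flat_field):
            # Divide by the flat-field response (acquired over a uniform target and
            # normalized to its maximum) to flatten the radial fall-off.
            return image / (flat_field / flat_field.max())

        def ndvi(nir, red):
            # Standard normalized difference vegetation index.
            return (nir - red) / (nir + red + 1e-9)

        # Synthetic example: radial vignetting applied to flat red/NIR scenes.
        h, w = 64, 64
        yy, xx = np.mgrid[0:h, 0:w]
        r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2
        flat = 1.0 - 0.3 * r2                        # brighter in the centre
        red_raw, nir_raw = 0.08 * flat, 0.40 * flat  # vegetation-like reflectances
        index = ndvi(correct_vignetting(nir_raw, flat), correct_vignetting(red_raw, flat))
        print(index.min(), index.max())              # uniform ~0.67 after correction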

  12. Review: advances in in situ and satellite phenological observations in Japan

    NASA Astrophysics Data System (ADS)

    Nagai, Shin; Nasahara, Kenlo Nishida; Inoue, Tomoharu; Saitoh, Taku M.; Suzuki, Rikie

    2016-04-01

    To accurately evaluate the responses of spatial and temporal variation of ecosystem functioning (evapotranspiration and photosynthesis) and services (regulating and cultural services) to the rapid changes caused by global warming, we depend on long-term, continuous, near-surface, and satellite remote sensing of phenology over wide areas. Here, we review such phenological studies in Japan and discuss our current knowledge, problems, and future developments. In contrast with North America and Europe, Japan has been able to evaluate plant phenology along vertical and horizontal gradients within a narrow area because of the country's high topographic relief. Phenological observation networks that support scientific studies and outreach activities have used near-surface tools such as digital cameras and spectral radiometers. Differences in phenology among ecosystems and tree species have been detected by analyzing the seasonal variation of red, green, and blue digital numbers (RGB values) extracted from phenological images, as well as spectral reflectance and vegetation indices. The relationships between seasonal variations in RGB-derived indices or spectral characteristics and the ecological and CO2 flux measurement data have been well validated. In contrast, insufficient satellite remote-sensing observations have been conducted because of the coarse spatial resolution of previous datasets, which could not detect the heterogeneous plant phenology that results from Japan's complex topography and vegetation. To improve Japanese phenological observations, multidisciplinary analysis and evaluation will be needed to link traditional phenological observations with "index trees," near-surface and satellite remote-sensing observations, "citizen science" (observations by citizens), and results published on the Internet.
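
    One common camera-based index of the kind referred to above is the green chromatic coordinate computed from RGB digital numbers; it is widely used in phenological camera networks, though not necessarily the exact index of every study cited. A minimal sketch with a hypothetical region of interest:

```python
import numpy as np

def green_chromatic_coordinate(rgb):
    """Green chromatic coordinate GCC = G / (R + G + B), a widely used
    greenness index derived from camera digital numbers."""
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return g / (r + g + b + 1e-9)

# Hypothetical region of interest cut from one daily phenological image.
roi = np.random.default_rng(1).integers(0, 256, (50, 50, 3))
print(green_chromatic_coordinate(roi).mean())  # one point of a seasonal series
```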

  13. Assessing UAVs in Monitoring Crop Evapotranspiration within a Heterogeneous Soil

    NASA Astrophysics Data System (ADS)

    Rouze, G.; Neely, H.; Morgan, C.; Kustas, W. P.; McKee, L.; Prueger, J. H.; Cope, D.; Yang, C.; Thomasson, A.; Jung, J.

    2017-12-01

    Airborne and satellite remote sensing methods have been developed to provide ET estimates across entire management fields. However, airborne-based ET is not particularly cost-effective and satellite-based ET provides insufficient spatial/temporal information. ET estimations through remote sensing are also problematic where soils are highly variable within a given management field. Unlike airborne/satellite-based ET, Unmanned Aerial Vehicle (UAV)-based ET has the potential to increase the spatial and temporal detail of these measurements, particularly within a heterogeneous soil landscape. However, it is unclear to what extent UAVs can model ET. The overall goal of this project was to assess the capability of UAVs in modeling ET across a heterogeneous landscape. Within a 20-ha irrigated cotton field in Central Texas, low-altitude UAV surveys were conducted throughout the growing season over two soil types. UAVs were equipped with thermal and multispectral cameras to obtain canopy temperature and NDVI, respectively. UAV data were supplemented simultaneously with ground-truth measurements such as Leaf Area Index (LAI) and plant height. Both remote sensing and ground-truth parameters were used to model ET using a Two-Source Energy Balance (TSEB) model. UAV-based estimations of ET and other energy balance components were validated against energy balance measurements obtained from nearby eddy covariance towers that were installed within each soil type. UAV-based ET fluxes were also compared with airborne and satellite (Landsat 8)-based ET fluxes collected near the time of the UAV survey.
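
    For orientation, the closure step that any such energy-balance model ends with can be written as LE = Rn - G - H; TSEB additionally partitions Rn and H between soil and canopy using the radiometric temperature and fractional cover. A minimal sketch of the closure step only, with hypothetical flux values:

```python
def latent_heat_flux_residual(rn, g, h):
    """Single-source energy-balance closure: LE = Rn - G - H (W m-2).
    TSEB goes further and splits Rn and H into soil and canopy components
    using radiometric temperature and fractional vegetation cover."""
    return rn - g - h

# Hypothetical half-hourly fluxes in W m-2.
print(latent_heat_flux_residual(rn=550.0, g=60.0, h=180.0))  # -> 310.0
```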

  14. Applications of Remote Sensing to Emergency Management.

    DTIC Science & Technology

    1980-02-15

    Contents: Foundations of Remote Sensing: Data Acquisition and Interpretation; Availability of Remote Sensing Technology for Disaster Response...Imaging Systems, Current and Near Future Satellite and Aircraft Remote Sensing Systems; Utilization of Remote Sensing in Disaster Response: Categories of...Disasters, Phases of Monitoring Activities; Recommendations for Utilization of Remote Sensing Technology in Disaster Response; Selected Reading List.

  15. Development, characterization, and modeling of a tunable filter camera

    NASA Astrophysics Data System (ADS)

    Sartor, Mark Alan

    1999-10-01

    This paper describes the development, characterization, and modeling of a Tunable Filter Camera (TFC). The TFC is a new multispectral instrument with electronically tuned spectral filtering and low-light-level sensitivity. It represents a hybrid between hyperspectral and multispectral imaging spectrometers that incorporates advantages from each, addressing issues such as complexity, cost, lack of sensitivity, and adaptability. These capabilities allow the TFC to be applied to low-altitude video surveillance for real-time spectral and spatial target detection and image exploitation. Described herein are the theory and principles of operation for the TFC, which includes a liquid crystal tunable filter, an intensified CCD, and a custom apochromatic lens. The results of proof-of-concept testing and characterization of two prototype cameras are included, along with a summary of the design analyses for the development of a multiple-channel system. A significant result of this effort was the creation of a system-level model, which was used to facilitate development and predict performance. It includes models for the liquid crystal tunable filter and intensified CCD. Such modeling was necessary in the design of the system and is useful for evaluation of the system in remote-sensing applications. Also presented are characterization data from component testing, including quantitative results for linearity, signal-to-noise ratio (SNR), and radiometric response. These data were used to help refine and validate the model. For a pre-defined source, the spatial and spectral response and the noise of the camera system can now be predicted. The innovation that sets this development apart is the fact that this instrument has been designed for integrated, multi-channel operation for the express purpose of real-time detection/identification in low-light-level conditions. Many of the requirements for the TFC were derived from this mission. In order to provide background for the design requirements of the TFC development, the mission and principles of operation behind the multi-channel system are reviewed. Given the combination of flexibility, simplicity, and sensitivity, the TFC and its multiple-channel extension can play a significant role in the next generation of remote-sensing instruments.
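
    The characterization quantities mentioned above (SNR and linearity) can be estimated from simple frame statistics. A minimal sketch with synthetic data, not the instrument's actual test procedure:

```python
import numpy as np

def snr(frames):
    """Temporal signal-to-noise ratio per pixel from repeated frames of a
    stable source: mean divided by standard deviation over the stack."""
    frames = np.asarray(frames, dtype=float)
    return frames.mean(axis=0) / (frames.std(axis=0) + 1e-9)

def linearity_r2(exposures, mean_signal):
    """R^2 of a straight-line fit of mean signal versus exposure time,
    used here as a simple linearity figure of merit."""
    fit = np.polyval(np.polyfit(exposures, mean_signal, 1), exposures)
    ss_res = np.sum((mean_signal - fit) ** 2)
    ss_tot = np.sum((mean_signal - mean_signal.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical data: 20 repeated frames and a 5-point exposure ramp.
rng = np.random.default_rng(2)
stack = 100 + rng.normal(0, 2, (20, 64, 64))
print(snr(stack).mean())
print(linearity_r2(np.array([1.0, 2, 4, 8, 16]),
                   np.array([10.1, 20.3, 39.8, 80.2, 159.5])))
```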

  16. Towards non-contact photo-acoustic endoscopy using speckle pattern analysis

    NASA Astrophysics Data System (ADS)

    Lengenfelder, Benjamin; Mehari, Fanuel; Tang, Yuqi; Klämpfl, Florian; Zalevsky, Zeev; Schmidt, Michael

    2017-03-01

    Photoacoustic tomography combines the advantages of optical and acoustic imaging, as it makes use of the high optical contrast of tissue and the high resolution of ultrasound. Furthermore, penetration depths in tissue on the order of several centimeters can be achieved by combining these modalities. Extensive research is being done on the miniaturization of photoacoustic devices, as photoacoustic imaging could be of significant benefit to the physician during endoscopic interventions. All existing miniature systems are based on contact transducers for signal detection that are placed at the distal end of an endoscopic device. This makes the manufacturing process difficult and makes impedance matching to the inspected surface a requirement. The requirement for contact also limits the view of the physician during the intervention. Consequently, a fiber-based non-contact optical sensing technique would be highly beneficial for the development of miniaturized photoacoustic endoscopic devices. This work demonstrates the feasibility of surface displacement detection using remote speckle sensing with a high-speed camera and an imaging fiber bundle of the kind used in commercially available video endoscopes. The feasibility of displacement sensing is demonstrated by analyzing phantom vibrations induced by loudspeaker membrane oscillations. Since the usability of remote speckle sensing for photoacoustic signal detection has already been demonstrated, the fiber bundle approach shows the potential for non-contact photoacoustic detection during endoscopy.
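
    The displacement sensing described above rests on tracking how the speckle pattern shifts between camera frames. A minimal sketch of one common approach, FFT-based cross-correlation between two frames (synthetic speckle, integer-pixel precision only; not the authors' processing chain):

```python
import numpy as np

def speckle_shift(ref, cur):
    """Estimate the in-plane shift of `cur` relative to `ref` from the peak
    of their FFT-based cross-correlation (integer-pixel precision)."""
    ref = ref - ref.mean()
    cur = cur - cur.mean()
    corr = np.fft.ifft2(np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shape = np.array(corr.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]  # wrap to signed shifts
    return peak

# Synthetic speckle frames: the second one shifted by (2, 3) pixels.
rng = np.random.default_rng(3)
frame0 = rng.random((128, 128))
frame1 = np.roll(frame0, (2, 3), axis=(0, 1))
print(speckle_shift(frame0, frame1))  # -> [2. 3.]
```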

  17. Application of narrow-band television to industrial and commercial communications

    NASA Technical Reports Server (NTRS)

    Embrey, B. C., Jr.; Southworth, G. R.

    1974-01-01

    The development of narrow-band systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed, along with narrow-band television's influence in stimulating development within the industry. The transfer of the technology into industrial and commercial communications is described. Major areas include: (1) medicine; (2) education; (3) remote sensing for traffic control; and (4) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.

  18. AutoCNet: A Python library for sparse multi-image correspondence identification for planetary data

    NASA Astrophysics Data System (ADS)

    Laura, Jason; Rodriguez, Kelvin; Paquette, Adam C.; Dunn, Evin

    2018-01-01

    In this work we describe the AutoCNet library, written in Python, to support the application of computer vision techniques for n-image correspondence identification in remotely sensed planetary images and subsequent bundle adjustment. The library is designed to support exploratory data analysis, algorithm and processing pipeline development, and application at scale in High Performance Computing (HPC) environments for processing large data sets and generating foundational data products. We also present a brief case study illustrating high level usage for the Apollo 15 Metric camera.
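
    For readers unfamiliar with the underlying idea, pairwise correspondence identification can be sketched with generic OpenCV feature matching. This is an illustration only and is not AutoCNet's API; it assumes the opencv-python package is available:

```python
import cv2
import numpy as np

def match_pair(img_a, img_b, max_matches=100):
    """Detect ORB keypoints in two grayscale images and return the best
    cross-checked descriptor matches, sorted by Hamming distance."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    return kp_a, kp_b, matches[:max_matches]

# Hypothetical image pair: the second is a shifted copy of the first.
rng = np.random.default_rng(4)
img1 = rng.integers(0, 255, (256, 256), dtype=np.uint8)
img2 = np.roll(img1, 10, axis=1)
kp1, kp2, good = match_pair(img1, img2)
print(len(good), "candidate correspondences")
```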

  19. EVA 4 activity on Flight Day 7 to service the Hubble Space Telescope

    NASA Image and Video Library

    1997-02-17

    S82-E-5606 (17 Feb. 1997) --- Astronaut Gregory J. Harbaugh at work on Hubble Space Telescope (HST), with the assistance of astronaut Joseph R. Tanner (out of frame) on Remote Manipulator System (RMS). After replacing the HST's Solar Array Drive Electronics (SADE), Harbaugh and Tanner replaced the Magnetic Sensing System (MSS) protective lids with new, permanent covers; and they installed pre-cut insulation pieces to correct tears in the HST's protective covering caused by temperature changes in space. This view was taken with an Electronic Still Camera (ESC).

  20. The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Zhou, Liqing

    2015-12-01

    With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional image segmentation techniques cannot meet the processing and storage requirements of massive remote sensing imagery. This article applies cloud computing and parallel computing to the remote sensing image segmentation process, building an inexpensive and efficient computer cluster that implements the MeanShift segmentation algorithm in parallel under the MapReduce model. The approach preserves segmentation quality while improving segmentation speed and better meeting real-time requirements, demonstrating the practical value of MapReduce-based parallel MeanShift segmentation of remote sensing images.
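
    As a single-node illustration of the segmentation kernel (not the authors' cluster implementation), the sketch below runs mean-shift clustering on one small image tile with scikit-learn; in the MapReduce setting, each mapper would process one tile roughly like this and a reducer would merge the tile results:

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def segment_tile(tile):
    """Cluster pixels in (x, y, value) feature space and return a label image."""
    h, w = tile.shape
    ys, xs = np.mgrid[0:h, 0:w]
    features = np.column_stack([xs.ravel(), ys.ravel(), tile.ravel() * 0.5])
    bandwidth = estimate_bandwidth(features, quantile=0.2, n_samples=500)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)
    return labels.reshape(h, w)

# Hypothetical small single-band tile with two flat regions plus noise.
rng = np.random.default_rng(5)
tile = np.hstack([np.full((32, 16), 40.0), np.full((32, 16), 200.0)])
tile += rng.normal(0, 3, tile.shape)
print(np.unique(segment_tile(tile)))
```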

  1. REMOTE SENSING TECHNOLOGIES APPLICATIONS RESEARCH

    EPA Science Inventory

    Remote sensing technologies applications research supports the ORD Landscape Sciences Program (LSP) in two separate areas: operational remote sensing, and remote sensing research and development. Operational remote sensing is provided to the LSP through the use of current and t...

  2. Remote sensing of smoke, clouds, and radiation using AVIRIS during SCAR experiments

    NASA Technical Reports Server (NTRS)

    Gao, Bo-Cai; Remer, Lorraine; Kaufman, Yoram J.

    1995-01-01

    During the past two years, researchers from several institutes joined together to take part in two SCAR experiments. SCAR-A (Sulfates, Clouds And Radiation - Atlantic) took place in the mid-Atlantic region of the United States in July 1993. Remote sensing data were acquired with the Airborne Visible Infrared Imaging Spectrometer (AVIRIS), the MODIS Airborne Simulator (MAS), and an RC-10 mapping camera from an ER-2 aircraft at 20 km altitude. In situ measurements of aerosol and cloud microphysical properties were made with a variety of instruments on the University of Washington's C-131A research aircraft. Ground-based measurements of aerosol optical depths and particle size distributions were made using a network of sunphotometers. The main purpose of the SCAR-A experiment was to study the optical, physical and chemical properties of sulfate aerosols and their interaction with clouds and radiation. Sulfate particles are believed to affect the energy balance of the earth by directly reflecting solar radiation back to space and by increasing the cloud albedo. SCAR-C (Smoke, Clouds And Radiation - California) took place on the west coast during September - October of 1994. Sets of aircraft and ground-based instruments, similar to those used during SCAR-A, were used during SCAR-C. Remote sensing of fires and smoke from the AVIRIS and MAS imagers on the ER-2 aircraft was combined with a complete in situ characterization of the aerosol and trace gases from the C-131A aircraft of the University of Washington and the Cessna aircraft from the U.S. Forest Service. The comprehensive database acquired during SCAR-A and SCAR-C will contribute to a better understanding of the role of clouds and aerosols in global change studies. The data will also be used to develop satellite remote sensing algorithms for MODIS on the Earth Observing System.

  3. Tephra dispersal and fallout reconstructed integrating field, ground-based and satellite-based data: Application to the 23rd November 2013 Etna paroxysm

    NASA Astrophysics Data System (ADS)

    Poret, M.; Corradini, S.; Merucci, L.; Costa, A.; Andronico, D.; Montopoli, M.; Vulpiani, G.; Scollo, S.; Freret-Lorgeril, V.

    2017-12-01

    On 23 November 2013, Etna erupted, producing one of the most intense lava fountains on record. The eruption generated a buoyant plume that rose higher than 10 km a.s.l., from which two volcanic clouds were observed from satellite at two different atmospheric levels. A previous study, based on remote sensing instruments, described one of the two clouds as mainly composed of ash; the second cloud consisted of ice/SO2 droplets and could not be measured in terms of ash mass. Both clouds spread out under north-easterly winds, transporting the tephra from Etna towards the Puglia region. The unusual meteorological conditions allowed tephra samples to be collected both in areas proximal to the Etna source and far away in the Calabria region. The eruption was observed by satellite (MSG-SEVIRI, MODIS) and ground-based (X-band weather radar, VIS/IR cameras and L-band Doppler radar) remote sensing systems. This study uses the FALL3D code to model the evolution of the plume and the tephra deposition, constraining the simulation results with remote sensing products for the volcanic cloud (cloud height, fine ash mass - Ma, and Aerosol Optical Depth at 0.55 μm - AOD). Among the input parameters, the Total Grain-Size Distribution (TGSD) is reconstructed by integrating field deposits with estimates from the X-band radar data. The optimal TGSD was selected through an inverse method that best fits both the field deposits and the airborne measurements. The simulations capture the main behavior of the two volcanic clouds at their respective altitudes. The best agreement between the simulated Ma and AOD and the SEVIRI retrievals indicates a PM20 fraction of 3.4 %. The total erupted mass is estimated at 1.6 × 10⁹ kg, consistent with the estimates from remote sensing data (3.0 × 10⁹ kg) and the ground deposit (1.3 × 10⁹ kg).

  4. Using Remotely Sensed Data for Climate Change Mitigation and Adaptation: A Collaborative Effort Between the Climate Change Adaptation Science Investigators Workgroup (CASI), NASA Johnson Space Center, and Jacobs Technology

    NASA Technical Reports Server (NTRS)

    Jagge, Amy

    2016-01-01

    With ever-changing landscapes and environmental conditions due to human-induced climate change, adaptability is imperative for the long-term success of facilities and Federal agency missions. To mitigate the effects of climate change, indicators such as above-ground biomass change must be identified to establish a comprehensive monitoring effort. Researching the varying effects of climate change on ecosystems can provide a scientific framework that will help produce informative strategic and tactical policies for environmental adaptation. As a proactive approach to climate change mitigation, NASA tasked the Climate Change Adaptation Science Investigators Workgroup (CASI) to provide climate change expertise and data to Center facility managers and planners in order to ensure sustainability based on predictive models and current research. Generation of historical datasets that will be used in an agency-wide effort to establish strategies for climate change mitigation and adaptation at NASA facilities is part of the CASI strategy. Using time series of historical remotely sensed data is a well-established means of measuring change over time. CASI investigators have acquired multispectral and hyperspectral optical and LiDAR remotely sensed datasets from NASA Earth observation satellites (including the International Space Station), airborne sensors, and astronaut photography taken with hand-held digital cameras to create a historical dataset for the Johnson Space Center, as well as the Houston and Galveston area. The raster imagery within each dataset has been georectified, and the multispectral and hyperspectral imagery has been atmospherically corrected. Using ArcGIS for Server, the CASI-Regional Remote Sensing data have been published as an image service and can be visualized through a basic web mapping application. Future work will include a customized web mapping application created using a JavaScript Application Programming Interface (API), and inclusion of the CASI data for the NASA Johnson Space Center in a NASA-wide GIS Institutional Portal.

  5. Remote Sensing of Vegetation Species Diversity: The Utility of Integrated Airborne Hyperspectral and Lidar Data

    NASA Astrophysics Data System (ADS)

    Krause, Keith Stuart

    The change, reduction, or extinction of species is a major issue currently facing the Earth. Efforts are underway to measure, monitor, and protect habitats that contain high species diversity. Remote sensing technology is of great value for monitoring species diversity by mapping ecosystems and using those land cover maps or other derived data as proxies for species number and distribution. The National Ecological Observatory Network (NEON) Airborne Observation Platform (AOP) consists of remote sensing instruments such as an imaging spectrometer, a full-waveform lidar, and a high-resolution color camera. The AOP collected data over the Ordway-Swisher Biological Station (OSBS) in May 2014. A majority of the OSBS site is covered by the Sandhill ecosystem, which contains a very high diversity of vegetation species and is a native habitat for several threatened fauna species. The research presented here investigates ways to analyze the AOP data to map ecosystems at the OSBS site. The research leverages the high spatial resolution data, studies the variability of the data within a ground plot scale, and integrates data from the different sensors. Mathematical features are derived from the data and brought into a decision tree classification algorithm (rpart) in order to create an ecosystem map for the site. The hyperspectral and lidar features serve as proxies for chemical, functional, and structural differences in the vegetation types of each ecosystem. K-fold cross-validation shows a training accuracy of 91%, a validation accuracy of 78%, and a 66% accuracy using independent ground validation. The results presented here represent an important contribution to utilizing integrated hyperspectral and lidar remote sensing data for ecosystem mapping, by relating the spatial variability of the data within a ground plot scale to the collection of vegetation types that make up a given ecosystem.
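
    As an analogous sketch of the classification step (the study used R's rpart; here scikit-learn's decision tree stands in, and the plot-level features and class labels are entirely hypothetical):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
n_plots = 200
# Hypothetical plot-level features: mean NDVI, NDVI std within the plot,
# mean canopy height, canopy height std (proxies for chemistry/structure).
X = np.column_stack([
    rng.normal(0.6, 0.1, n_plots),
    rng.normal(0.05, 0.02, n_plots),
    rng.normal(8.0, 3.0, n_plots),
    rng.normal(1.5, 0.5, n_plots),
])
y = rng.integers(0, 4, n_plots)            # four hypothetical ecosystem classes

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation accuracy
print(scores.mean())
```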

  6. Emergency Response Imagery Related to Hurricanes Harvey, Irma, and Maria

    NASA Astrophysics Data System (ADS)

    Worthem, A. V.; Madore, B.; Imahori, G.; Woolard, J.; Sellars, J.; Halbach, A.; Helmricks, D.; Quarrick, J.

    2017-12-01

    NOAA's National Geodetic Survey (NGS) Remote Sensing Division acquired and rapidly disseminated emergency response imagery related to the three recent hurricanes Harvey, Irma, and Maria. Aerial imagery was collected using a Trimble Digital Sensor System, a high-resolution digital camera, by means of NOAA's King Air 350ER and DeHavilland Twin Otter (DHC-6) aircraft. The emergency response images are used to assess the before and after effects of the hurricanes' damage. The imagery aids emergency responders, such as FEMA, the Coast Guard, and other state and local governments, in developing recovery strategies by prioritizing the areas most affected and distributing appropriate resources. Collected imagery is also used to provide damage assessments for use in long-term recovery and rebuilding efforts. Additionally, the imagery allows evacuated persons to see images of their homes and neighborhoods remotely. Each of the individual images is processed through ortho-rectification and merged into a uniform mosaic image. These remotely sensed datasets are publicly available and are often used by web-based map servers as well as federal, state, and local government agencies. This poster will show the imagery collected for these three hurricanes and the processes involved in getting the data quickly into the hands of those who need it most.

  7. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Astrophysics Data System (ADS)

    Bolas, Mark T.; Fisher, Scott S.

    1990-09-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  8. Sugarcane Crop Extraction Using Object-Oriented Method from ZY-3 High Resolution Satellite TLC Image

    NASA Astrophysics Data System (ADS)

    Luo, H.; Ling, Z. Y.; Shao, G. Z.; Huang, Y.; He, Y. Q.; Ning, W. Y.; Zhong, Z.

    2018-04-01

    Sugarcane is one of the most important crops in Guangxi, China. With the development of satellite remote sensing technology, more remotely sensed images can be used for monitoring the sugarcane crop. With its Three Line Camera (TLC) images, wide coverage, and stereoscopic mapping capability, the Chinese ZY-3 high-resolution stereoscopic mapping satellite can provide additional information for sugarcane crop monitoring, such as spectral, shape, and texture differences between the forward, nadir, and backward images. Digital surface models (DSMs) derived from ZY-3 TLC images also provide height information for the sugarcane crop. In this study, we attempt to extract sugarcane crop from ZY-3 images acquired during the harvest period. Ortho-rectified TLC images, a fused image, and a DSM are processed for the extraction, and an object-oriented method is then used for image segmentation, sample collection, and feature extraction. The results show that, with the help of ZY-3 TLC images, sugarcane crop information at harvest time can be automatically extracted, with an overall accuracy of about 85.3 %.

  9. Data management and digital delivery of analog data

    USGS Publications Warehouse

    Miller, W.A.; Longhenry, Ryan; Smith, T.

    2008-01-01

    The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in preserving data, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that access to these collections is available. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium resolution image files. Single frame metadata records are also created to improve access that otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.

  10. DART: Recent Advances in Remote Sensing Data Modeling With Atmosphere, Polarization, and Chlorophyll Fluorescence

    NASA Technical Reports Server (NTRS)

    Gastellu-Etchegorry, Jean-Phil; Lauret, Nicolas; Yin, Tiangang; Landier, Lucas; Kallel, Abdelaziz; Malenovsky, Zbynek; Bitar, Ahmad Al; Aval, Josselin; Benhmida, Sahar; Qi, Jianbo

    2017-01-01

    To better understand the life-essential cycles and processes of our planet and to further develop remote sensing (RS) technology, there is an increasing need for models that simulate the radiative budget (RB) and RS acquisitions of urban and natural landscapes using physical approaches and considering the three-dimensional (3-D) architecture of Earth surfaces. Discrete anisotropic radiative transfer (DART) is one of the most comprehensive physically based 3-D models of Earth-atmosphere radiative transfer, covering the spectral domain from ultraviolet to thermal infrared wavelengths. It simulates the optical 3-D RB and optical signals of proximal, aerial, and satellite imaging spectrometers and laser scanners, for any urban and/or natural landscapes and for any experimental and instrumental configurations. It is freely available for research and teaching activities. In this paper, we briefly introduce DART theory and present recent advances in simulated sensors (LiDAR and cameras with finite field of view) and modeling mechanisms (atmosphere, specular reflectance with polarization and chlorophyll fluorescence). A case study demonstrating a novel application of DART to investigate urban landscapes is also presented.

  11. Remote Sensing of Aerosol in the Terrestrial Atmosphere from Space: New Missions

    NASA Technical Reports Server (NTRS)

    Milinevsky, G.; Yatskiv, Ya.; Degtyaryov, O.; Syniavskyi, I.; Ivanov, Yu.; Bovchaliuk, A.; Mishchenko, M.; Danylevsky, V.; Sosonkin, M.; Bovchaliuk, V.

    2015-01-01

    The distribution and properties of atmospheric aerosols on a global scale are not well known in terms of determining their effects on climate. This is mostly due to the extreme variability of aerosol concentrations, properties, sources, and types. The aerosol climate impact is comparable to the effect of greenhouse gases, but its influence is more difficult to measure, especially with respect to aerosol microphysical properties and the evaluation of the anthropogenic aerosol effect. There are many satellite missions studying aerosol distribution in the terrestrial atmosphere, such as MISR/Terra, OMI/Aura, AVHRR, MODIS/Terra and Aqua, and CALIOP/CALIPSO. To improve the quality of data and climate models, and to reduce aerosol climate forcing uncertainties, several new missions are planned. A gap in orbital instruments for studying aerosol microphysics arose after the Glory mission failed during launch in 2011. In this review paper, we describe several planned aerosol space missions, including the Ukrainian project Aerosol-UA, which will obtain data using a multi-channel scanning polarimeter and a wide-angle polarimetric camera. The project is designed for remote sensing of aerosol microphysics and cloud properties on a global scale.

  12. Offshore remote sensing of the ocean by stereo vision systems

    NASA Astrophysics Data System (ADS)

    Gallego, Guillermo; Shih, Ping-Chang; Benetazzo, Alvise; Yezzi, Anthony; Fedele, Francesco

    2014-05-01

    In recent years, remote sensing imaging systems for the measurement of oceanic sea states have attracted renewed attention. Imaging technology is economical and non-invasive, and it enables a better understanding of the space-time dynamics of ocean waves over an area rather than at the selected point locations of previous monitoring methods (buoys, wave gauges, etc.). We present recent progress in space-time measurement of ocean waves using stereo vision systems on offshore platforms, focusing on sea states with wavelengths in the range of 0.01 m to 1 m. Both traditional disparity-based systems and modern elevation-based ones are presented in a variational optimization framework: the main idea is to pose the stereoscopic reconstruction problem of the ocean surface in a variational setting and design an energy functional whose minimizer is the desired temporal sequence of wave heights. The functional combines photometric observations as well as spatial and temporal smoothness priors. Disparity methods estimate the disparity between images as an intermediate step toward retrieving the depth of the waves with respect to the cameras, whereas elevation methods estimate the ocean surface displacements directly in 3-D space. Both techniques are used to measure ocean waves from real data collected at offshore platforms in the Black Sea (Crimean Peninsula, Ukraine) and the Northern Adriatic Sea (Venice coast, Italy). The statistical and spectral properties of the resulting observed waves are then analyzed. We show the advantages and disadvantages of the presented stereo vision systems and discuss future lines of research to improve their performance on critical issues such as the robustness of the camera calibration in spite of undesired variations of the camera parameters, and the processing time needed to retrieve ocean wave measurements from the stereo videos, which are very large datasets that must be processed efficiently to be of practical use. Multiresolution and short-time approaches would improve the efficiency and scalability of the techniques so that wave displacements can be obtained in feasible times.
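
    Schematically, the variational formulation referred to above minimizes an energy of the following form (illustrative notation, not the authors' exact functional), where Z(x, y, t) is the unknown wave elevation and pi_1, pi_2 project a surface point into the two camera images:

```latex
% Illustrative energy functional: photometric data term plus spatial and
% temporal smoothness priors; alpha and beta are regularization weights.
\begin{equation}
E[Z] =
\int_{\Omega \times T} \big( I_1(\pi_1(x, y, Z)) - I_2(\pi_2(x, y, Z)) \big)^2 \, dx\, dy\, dt
+ \alpha \int_{\Omega \times T} \lvert \nabla_{x,y} Z \rvert^2 \, dx\, dy\, dt
+ \beta \int_{\Omega \times T} (\partial_t Z)^2 \, dx\, dy\, dt
\end{equation}
```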

  13. Design and development of the 2m resolution camera for ROCSAT-2

    NASA Astrophysics Data System (ADS)

    Uguen, Gilbert; Luquet, Philippe; Chassat, François

    2017-11-01

    EADS-Astrium has recently completed the development of a 2 m resolution camera, the RSI (Remote Sensing Instrument), for the small satellite ROCSAT-2, which is the second component of the long-term space program of the Republic of China. The National Space Program Office of Taiwan selected EADS-Astrium as the prime contractor for the development of the spacecraft, including the bus and the main instrument, the RSI. The main challenges for the RSI development were (1) to introduce innovative technologies in order to meet the high performance requirements while achieving the design simplicity necessary for the mission (low mass, low power), and (2) to follow a development and verification approach compatible with the very tight development schedule. This paper describes the instrument design together with the development and verification logic that were implemented to successfully meet these objectives.

  14. Tunnel-Site Selection by Remote Sensing Techniques

    DTIC Science & Technology

    A study of the role of remote sensing for geologic reconnaissance for tunnel-site selection was commenced. For this study, remote sensing was defined...conventional remote sensing. Future research directions are suggested, and the extension of remote sensing to include airborne passive microwave

  15. System and method for evaluating wind flow fields using remote sensing devices

    DOEpatents

    Schroeder, John; Hirth, Brian; Guynes, Jerry

    2016-12-13

    The present invention provides a system and method for obtaining data to determine one or more characteristics of a wind field using a first remote sensing device and a second remote sensing device. Coordinated data is collected from the first and second remote sensing devices and analyzed to determine the one or more characteristics of the wind field. The first remote sensing device is positioned to have a portion of the wind field within a first scanning sector of the first remote sensing device. The second remote sensing device is positioned to have the portion of the wind field disposed within a second scanning sector of the second remote sensing device.

  16. Reproducibility of UAV-based earth topography reconstructions based on Structure-from-Motion algorithms

    NASA Astrophysics Data System (ADS)

    Clapuyt, Francois; Vanacker, Veerle; Van Oost, Kristof

    2016-05-01

    The combination of UAV-based aerial pictures and the Structure-from-Motion (SfM) algorithm provides an efficient, low-cost and rapid framework for remote sensing and monitoring of dynamic natural environments. This methodology is particularly suitable for repeated topographic surveys in remote or poorly accessible areas. However, temporal analysis of landform topography requires high accuracy of measurements and reproducibility of the methodology, as differencing of digital surface models leads to error propagation. In order to assess the repeatability of the SfM technique, we surveyed a study area characterized by gentle topography with a UAV platform equipped with a standard reflex camera, and varied the focal length of the camera and the location of georeferencing targets between flights. Comparison of the different SfM-derived topography datasets shows that the precision of the measurements is on the order of centimetres for identical replications, which highlights the excellent performance of the SfM workflow, all parameters being equal. The discrepancy is one order of magnitude larger for 3-D topographic reconstructions based on independent sets of ground control points, which results from the fact that uncertainty in the localisation of the ground control points propagates strongly into the final results.
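
    The reproducibility comparison described above amounts to differencing co-registered surface models from replicate reconstructions. A minimal sketch with synthetic rasters (all values hypothetical):

```python
import numpy as np

def dod_stats(dsm_a, dsm_b):
    """Mean difference and RMSE between two co-registered digital surface
    models, a simple way to quantify the reproducibility of repeated
    SfM reconstructions of the same terrain."""
    diff = dsm_a - dsm_b
    return diff.mean(), np.sqrt(np.mean(diff ** 2))

# Hypothetical 1 m resolution DSMs from two replicate flights (metres).
rng = np.random.default_rng(7)
base = rng.normal(100.0, 2.0, (200, 200))
replicate = base + rng.normal(0.0, 0.03, base.shape)   # ~3 cm noise
print(dod_stats(base, replicate))
```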

  17. Exploring Models and Data for Remote Sensing Image Caption Generation

    NASA Astrophysics Data System (ADS)

    Lu, Xiaoqiang; Wang, Binqiang; Zheng, Xiangtao; Li, Xuelong

    2018-04-01

    Inspired by recent developments in artificial satellites, remote sensing images have attracted extensive attention. Recently, noticeable progress has been made in scene classification and target detection. However, it is still not clear how to describe the content of a remote sensing image with accurate and concise sentences. In this paper, we investigate how to describe remote sensing images with accurate and flexible sentences. First, annotation instructions are presented to better describe remote sensing images, taking their special characteristics into account. Second, in order to exhaustively exploit the contents of remote sensing images, a large-scale aerial image data set is constructed for remote sensing image captioning. Finally, a comprehensive review is presented on the proposed data set to fully advance the task of remote sensing image captioning. Extensive experiments on the proposed data set demonstrate that the content of a remote sensing image can be completely described by generating language descriptions. The data set is available at https://github.com/201528014227051/RSICD_optimal

  18. Research on active imaging information transmission technology of satellite borne quantum remote sensing

    NASA Astrophysics Data System (ADS)

    Bi, Siwen; Zhen, Ming; Yang, Song; Lin, Xuling; Wu, Zhiqiang

    2017-08-01

    Motivated by the development and application needs of remote sensing science and technology, Prof. Siwen Bi proposed quantum remote sensing. This paper first gives a brief introduction to the background of quantum remote sensing and to domestic and international research on its theory, information mechanism, and imaging experiments, as well as the production of a principle prototype. It then emphasizes the quantization of the remote sensing radiation field and the state function and squeezing effect of the quantum remote sensing radiation field. It also describes the squeezing optical operator of the quantum light field in active imaging information transmission and imaging experiments, which achieved 2-3 times higher resolution than coherent-light detection imaging and completed the production of a quantum remote sensing imaging prototype. The application of quantum remote sensing technology can significantly improve both the signal-to-noise ratio of information transmission imaging and the spatial resolution of quantum remote sensing. On this basis, Prof. Bi proposed a technical solution for active imaging information transmission with satellite-borne quantum remote sensing, and launched research on its system composition and operating principle and on quantum noiseless amplifying devices, providing solutions and a technical basis for implementing active imaging information technology for satellite-borne quantum remote sensing.

  19. Depth Perception In Remote Stereoscopic Viewing Systems

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Von Sydow, Marika

    1989-01-01

    Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to analysis, static stereoscopic depth distortion decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Further predicts dynamic stereoscopic depth distortion reduced by rotating cameras around center of circle passing through point of convergence of viewing axes and first nodal points of two camera lenses.

  20. Introduction to the physics and techniques of remote sensing

    NASA Technical Reports Server (NTRS)

    Elachi, Charles

    1987-01-01

    This book presents a comprehensive overview of the basics behind remote-sensing physics, techniques, and technology. The physics of wave/matter interactions, techniques of remote sensing across the electromagnetic spectrum, and the concepts behind remote sensing techniques now established and future ones under development are discussed. Applications of remote sensing are described for a wide variety of earth and planetary atmosphere and surface sciences. Solid surface sensing across the electromagnetic spectrum, ocean surface sensing, basic principles of atmospheric sensing and radiative transfer, and atmospheric remote sensing in the microwave, millimeter, submillimeter, and infrared regions are examined.

  1. Biophysical control of intertidal benthic macroalgae revealed by high-frequency multispectral camera images

    NASA Astrophysics Data System (ADS)

    van der Wal, Daphne; van Dalen, Jeroen; Wielemaker-van den Dool, Annette; Dijkstra, Jasper T.; Ysebaert, Tom

    2014-07-01

    Intertidal benthic macroalgae are a biological quality indicator in estuaries and coasts. While remote sensing has been applied to quantify the spatial distribution of such macroalgae, it is generally not used for their monitoring. We examined the day-to-day and seasonal dynamics of macroalgal cover on a sandy intertidal flat using visible and near-infrared images from a time-lapse camera mounted on a tower. Benthic algae were identified using supervised, semi-supervised and unsupervised classification techniques, validated with monthly ground-truthing over one year. A supervised classification (based on maximum likelihood, using training areas identified in the field) performed best in discriminating between sediment, benthic diatom films and macroalgae, with highest spectral separability between macroalgae and diatoms in spring/summer. An automated unsupervised classification (based on the Normalised Differential Vegetation Index NDVI) allowed detection of daily changes in macroalgal coverage without the need for calibration. This method showed a bloom of macroalgae (filamentous green algae, Ulva sp.) in summer with > 60% cover, but with pronounced superimposed day-to-day variation in cover. Waves were a major factor in regulating macroalgal cover, but regrowth of the thalli after a summer storm was fast (2 weeks). Images and in situ data demonstrated that the protruding tubes of the polychaete Lanice conchilega facilitated both settlement (anchorage) and survival (resistance to waves) of the macroalgae. Thus, high-frequency, high resolution images revealed the mechanisms for regulating the dynamics in cover of the macroalgae and for their spatial structuring. Ramifications for the mode, timing, frequency and evaluation of monitoring macroalgae by field and remote sensing surveys are discussed.

  2. [Thematic Issue: Remote Sensing].

    ERIC Educational Resources Information Center

    Howkins, John, Ed.

    1978-01-01

    Four of the articles in this publication discuss the remote sensing of the Earth and its resources by satellites. Among the topics dealt with are the development and management of remote sensing systems, types of satellites used for remote sensing, the uses of remote sensing, and issues involved in using information obtained through remote…

  3. 75 FR 65304 - Advisory Committee on Commercial Remote Sensing (ACCRES); Request for Nominations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ... Commercial Remote Sensing (ACCRES); Request for Nominations AGENCY: National Oceanic and Atmospheric... Commercial Remote Sensing (ACCRES). SUMMARY: The Advisory Committee on Commercial Remote Sensing (ACCRES) was... Atmosphere, on matters relating to the U.S. commercial remote sensing industry and NOAA's activities to carry...

  4. Dog and Cat Interactions in a Remote Aboriginal Community.

    PubMed

    Kennedy, Brooke; Brown, Wendy Y; Vernes, Karl; Körtner, Gerhard; Butler, James R A

    2018-04-26

    This study examined dog and cat demographics, roaming behaviours, and interspecific interactions in a remote Aboriginal island community using multiple methods. Our results revealed temporal differences between the roaming behaviours of dogs, cats, and wildlife. Dogs showed crepuscular behaviour, being active around dawn (5:30 a.m. to 9:30 a.m.) and dusk (6:00 p.m. to 11:35 p.m.). The majority of cats were active between dawn (6:30 a.m.) and dusk (7:30 p.m.) and travelled shorter distances than dogs. However, some cats were also observed roaming between dusk and dawn, and were likely to be hunting, since flightless wildlife were also recorded on our remote-sensing cameras during this time. These baseline data provide evidence to suggest that new management programs are needed to reduce the number of roaming cats and therefore their potential impacts on native wildlife. Collaboration between Aboriginal owners and other stakeholders is necessary to design innovative and effective animal management and policy on the island.

  5. Literature relevant to remote sensing of water quality

    NASA Technical Reports Server (NTRS)

    Middleton, E. M.; Marcell, R. F.

    1983-01-01

    References relevant to remote sensing of water quality were compiled, organized, and cross-referenced. The following general categories were included: (1) optical properties and measurement of water characteristics; (2) interpretation of water characteristics by remote sensing, including color, transparency, suspended or dissolved inorganic matter, biological materials, and temperature; (3) application of remote sensing for water quality monitoring; (4) application of remote sensing according to water body type; and (5) manipulation, processing and interpretation of remote sensing digital water data.

  6. Learning Methods of Remote Sensing In the 2013 Curriculum of Secondary School

    NASA Astrophysics Data System (ADS)

    Lili Somantri, Nandi

    2016-11-01

    Remote sensing material was first included in the geography subjects of the 1994 curriculum. For geography teachers of the 1990s generation and earlier, who did not study remote sensing in college, teaching it is a difficult matter. Most teachers present only the theory and do not carry out practical work, citing the lack of computer laboratory facilities and infrastructure. This paper therefore examines methods of teaching remote sensing material in schools. The purposes of this paper are (1) to explain the position of remote sensing material in the study of geography, (2) to analyze the geography subjects of the 2013 curriculum related to remote sensing material, and (3) to describe methods of teaching remote sensing material in schools. The method used in this paper is a descriptive analytical study supported by the literature. The paper concludes that, within the study of geography, remote sensing is positioned as a method for obtaining spatial data about the Earth's surface. In the 2013 curriculum, remote sensing material is applied to the study of land use and transportation. Teaching remote sensing should include a practicum that starts with an introduction to remote sensing theory, followed by extraction of data from remote sensing imagery to produce maps both visually and digitally, field surveys, accuracy assessment of the interpretation, and map refinement.

  7. JPRS Report, Science & Technology, China, Remote Sensing Systems, Applications.

    DTIC Science & Technology

    1991-01-17

    Partial Contents: Short Introduction to Nation's Remote Sensing Units, Domestic Airborne Remote-Sensing System, Applications in Monitoring Natural...Disasters, Applications of Imagery From Experimental Satellites Launched in 1985, 1986, Current Status, Future Prospects for Domestic Remote-Sensing-Satellite...Ground Station, and Radar Remote-Sensing Technology Used to Monitor Yellow River Delta.

  8. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces. PMID:27873930

  9. Proceedings of the 2004 High Spatial Resolution Commercial Imagery Workshop

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Topics covered include: NASA Applied Sciences Program; USGS Land Remote Sensing: Overview; QuickBird System Status and Product Overview; ORBIMAGE Overview; IKONOS 2004 Calibration and Validation Status; OrbView-3 Spatial Characterization; On-Orbit Modulation Transfer Function (MTF) Measurement of QuickBird; Spatial Resolution Characterization for QuickBird Image Products 2003-2004 Season; Image Quality Evaluation of QuickBird Super Resolution and Revisit of IKONOS: Civil and Commercial Application Project (CCAP); On-Orbit System MTF Measurement; QuickBird Post Launch Geopositional Characterization Update; OrbView-3 Geometric Calibration and Geopositional Accuracy; Geopositional Statistical Methods; QuickBird and OrbView-3 Geopositional Accuracy Assessment; Initial On-Orbit Spatial Resolution Characterization of OrbView-3 Panchromatic Images; Laboratory Measurement of Bidirectional Reflectance of Radiometric Tarps; Stennis Space Center Verification and Validation Capabilities; Joint Agency Commercial Imagery Evaluation (JACIE) Team; Adjacency Effects in High Resolution Imagery; Effect of Pulse Width vs. GSD on MTF Estimation; Camera and Sensor Calibration at the USGS; QuickBird Geometric Verification; Comparison of MODTRAN to Heritage-based Results in Vicarious Calibration at University of Arizona; Using Remotely Sensed Imagery to Determine Impervious Surface in Sioux Falls, South Dakota; Estimating Sub-Pixel Proportions of Sagebrush with a Regression Tree; How Do YOU Use the National Land Cover Dataset?; The National Map Hazards Data Distribution System; Recording a Troubled World; What Does This-Have to Do with This?; When Can a Picture Save a Thousand Homes?; InSAR Studies of Alaska Volcanoes; Earth Observing-1 (EO-1) Data Products; Improving Access to the USGS Aerial Film Collections: High Resolution Scanners; Improving Access to the USGS Aerial Film Collections: Phoenix Digitizing System Product Distribution; System and Product Characterization: Issues Approach; Innovative Approaches to Analysis of Lidar Data for the National Map; Changes in Imperviousness near Military Installations; Geopositional Accuracy Evaluations of QuickBird and OrbView-3: Civil and Commercial Applications Project (CCAP); Geometric Accuracy Assessment: OrbView ORTHO Products; QuickBird Radiometric Calibration Update; OrbView-3 Radiometric Calibration; QuickBird Radiometric Characterization; NASA Radiometric Characterization; Establishing and Verifying the Traceability of Remote-Sensing Measurements to International Standards; QuickBird Applications; Airport Mapping and Perpetual Monitoring Using IKONOS; OrbView-3 Relative Accuracy Results and Impacts on Exploitation and Accuracy Improvement; Using Remotely Sensed Imagery to Determine Impervious Surface in Sioux Falls, South Dakota; Applying High-Resolution Satellite Imagery and Remotely Sensed Data to Local Government Applications: Sioux Falls, South Dakota; Automatic Co-Registration of QuickBird Data for Change Detection Applications; Developing Coastal Surface Roughness Maps Using ASTER and QuickBird Data Sources; Automated, Near-Real Time Cloud and Cloud Shadow Detection in High Resolution VNIR Imagery; Science Applications of High Resolution Imagery at the USGS EROS Data Center; Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research; Atmospheric Correction Prototype Algorithm for High Spatial Resolution Multispectral Earth Observing Imaging Systems; Determining Regional Arctic Tundra Carbon Exchange: A Bottom-Up 
Approach; Using IKONOS Imagery to Assess Impervious Surface Area, Riparian Buffers and Stream Health in the Mid-Atlantic Region; Commercial Remote Sensing Space Policy Civil Implementation Update; USGS Commercial Remote Sensing Data Contracts (CRSDC); and Commercial Remote Sensing Space Policy (CRSSP): Civil Near-Term Requirements Collection Update.

  10. [A review on polarization information in the remote sensing detection].

    PubMed

    Gong, Jie-Qiong; Zhan, Hai-Gang; Liu, Da-Zhao

    2010-04-01

    Polarization is one of the inherent characteristics of light. Because targets differ in surface structure, internal structure, and the angle of incident light, the Earth's surface and any target in the atmosphere acquire their own characteristic polarization signature through optical interaction. The polarimetric characteristics of the radiation from targets are used as the detection information in polarization remote sensing. Polarization remote sensing can obtain seven-dimensional information about targets in complicated backgrounds, resolve the outlines of targets and low-reflectance regions of objects, and address problems of atmospheric detection and camouflage identification that traditional remote sensing cannot solve, giving it good prospects for application. This paper reviews the development of polarization information in remote sensing detection from four aspects. First, the rationale of polarization remote sensing detection, which underlies the field, is introduced. Second, current research on the equipment used in polarization remote sensing detection is described in detail. Third, current work on the theoretical simulation of polarization remote sensing detection is reviewed. Finally, the authors present domestic and international applications research on polarization remote sensing detection in the fields of remote sensing, atmospheric sounding, sea surface and underwater detection, biology and medical diagnosis, astronomical observation, and military use, and summarize the current problems in polarization remote sensing detection. Future development trends of polarization remote sensing detection technology are pointed out to provide a reference for similar studies.

  11. Ground based remote sensing retrievals and observations of snowfall in the Telemark region of Norway

    NASA Astrophysics Data System (ADS)

    Pettersen, C.; L'Ecuyer, T. S.; Wood, N.; Cooper, S.; Wolff, M. A.; Petersen, W. A.; Bliven, L. F.; Tushaus, S. A.

    2017-12-01

    Snowfall can be broadly categorized into deep and shallow events, based on the vertical extent of the frozen precipitation in the column. The two categories are driven by different thermodynamic and physical mechanisms in the atmosphere and at the surface. Though satellites can observe and recognize these patterns in snowfall, such measurements are limited, particularly in cases of shallow and light precipitation and over complex terrain. By enhancing satellite measurements with ground-based instrumentation, whether through limited-term field campaigns or long-term strategic sites, we can further our understanding of and assumptions about different snowfall modes. We present data collected by a recently deployed ground suite of instruments based in Norway. The Meteorological Institute of Norway operates a snow measurement suite at Haukeliseter in the orographically complex Telemark region. This suite consists of several snow accumulation instruments as well as meteorological sensors (temperature, dew point, wind speed and direction). A joint project between the University of Wisconsin and the University of Utah augmented this suite with a 24 GHz Micro Rain Radar (MRR), a NASA Particle Imaging Package (PIP), and a Multi-Angle Snowflake Camera (MASC). Preliminary data from this campaign are presented along with coincident overpasses from the GPM satellite. We compare the ground-based and spaceborne remotely sensed estimates of snowfall with snow gauge observations from the Haukeliseter site. Finally, we discuss how particle size distribution and fall velocity observations from the PIP and MASC can be used to improve remotely sensed snowfall retrievals as a function of environmental conditions at Haukeliseter.

  12. FIDO prototype Mars rover field trials, Black Rock Summit, Nevada, as test of the ability of robotic mobility systems to conduct field science

    NASA Astrophysics Data System (ADS)

    Arvidson, R. E.; Squyres, S. W.; Baumgartner, E. T.; Schenker, P. S.; Niebur, C. S.; Larsen, K. W.; SeelosIV, F. P.; Snider, N. O.; Jolliff, B. L.

    2002-08-01

    The Field Integrated Design and Operations (FIDO) prototype Mars rover was deployed and operated remotely for 2 weeks in May 2000 in the Black Rock Summit area of Nevada. The blind science operation trials were designed to evaluate the extent to which FIDO-class rovers can be used to conduct traverse science and collect samples. FIDO-based instruments included stereo cameras for navigation and imaging, an infrared point spectrometer, a color microscopic imager for characterization of rocks and soils, and a rock drill for core acquisition. Body-mounted ``belly'' cameras aided drill deployment, and front and rear hazard cameras enabled terrain hazard avoidance. Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data, a high spatial resolution IKONOS orbital image, and a suite of descent images were used to provide regional- and local-scale terrain and rock type information, from which hypotheses were developed for testing during operations. The rover visited three sites, traversed 30 m, and acquired 1.3 gigabytes of data. The relatively small traverse distance resulted from a geologically rich site in which materials identified on a regional scale from remote-sensing data could be identified on a local scale using rover-based data. Results demonstrate the synergy of mapping terrain from orbit and during descent using imaging and spectroscopy, followed by a rover mission to test inferences and to make discoveries that can be accomplished only with surface mobility systems.

  13. Advancing High Spatial and Spectral Resolution Remote Sensing for Observing Plant Community Response to Environmental Variability and Change in the Alaskan Arctic

    NASA Astrophysics Data System (ADS)

    Vargas Zesati, Sergio A.

    The Arctic is being impacted by climate change more than any other region on Earth. Impacts to terrestrial ecosystems have the potential to manifest through feedbacks with other components of the Earth System. Of particular concern is the potential for the massive store of soil organic carbon to be released from arctic permafrost to the atmosphere, where it could exacerbate greenhouse warming and impact global climate and biogeochemical cycles. Even though substantial gains in our understanding of the changing Arctic have been made, especially over the past decade, linking research results from plot to regional scales remains a challenge due to the lack of adequate low- and mid-altitude sampling platforms, logistic constraints, and the lack of cross-scale validation of research methodologies. The prime motivation of this study is to advance observational capacities suitable for documenting multi-scale environmental change in arctic terrestrial landscapes through the development and testing of novel ground-based and low-altitude remote sensing methods. Specifically, this study addressed the following questions:
    • How well can low-cost kite aerial photography and advanced computer vision techniques model the microtopographic heterogeneity of changing tundra surfaces?
    • How does imagery from kite aerial photography and fixed time-lapse digital cameras (pheno-cams) compare in their capacity to monitor plot-level phenological dynamics of arctic vegetation communities?
    • Can the use of multi-scale digital imaging systems be scaled to improve measurements of ecosystem properties and processes at the landscape level?
    • How do results from ground-based and low-altitude digital remote sensing of the spatiotemporal variability in ecosystem processes compare with those from satellite remote sensing platforms?
    Key findings from this study suggest that cost-effective alternative digital imaging and remote sensing methods are suitable for monitoring and quantifying plot- to landscape-level ecosystem structure and phenological dynamics at multiple temporal scales. Overall, this study has furthered our knowledge of how tundra ecosystems in the Arctic change seasonally and how such change could impact remote sensing studies conducted from multiple platforms and across multiple spatial scales. Additionally, this study highlights the urgent need for research into the validation of satellite products in order to better understand the causes and consequences of the changing Arctic and its potential effects on global processes. This study focused on sites located in northern Alaska and was formed in collaboration with Florida International University (FIU) and Grand Valley State University (GVSU) as a contribution to the US Arctic Observing Network (AON). All efforts were supported through the National Science Foundation (NSF), the Cyber-ShARE Center of Excellence, and the International Tundra Experiment (ITEX).

  14. Autonomous exploration and mapping of unknown environments

    NASA Astrophysics Data System (ADS)

    Owens, Jason; Osteen, Phil; Fields, MaryAnne

    2012-06-01

    Autonomous exploration and mapping is a vital capability for future robotic systems expected to function in arbitrary complex environments. In this paper, we describe an end-to-end robotic solution for remotely mapping buildings. For a typical mapping mission, an unmanned system is directed to enter an unknown building at a distance, sense the internal structure, and, barring additional tasks, create a 2-D map of the building while in situ. This map provides a useful and intuitive representation of the environment for the remote operator. We have integrated a robust mapping and exploration system utilizing laser range scanners and RGB-D cameras, and we demonstrate an exploration and metacognition algorithm on a robotic platform. The algorithm allows the robot to safely navigate the building, explore the interior, report significant features to the operator, and generate a consistent map - all while maintaining localization.
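
    The abstract does not spell out the exploration strategy, but a common building block for this kind of system is frontier detection on a 2-D occupancy grid: the robot repeatedly drives toward free cells that border unexplored space. The sketch below is a minimal illustration of that idea under those assumptions, not the authors' algorithm.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) indices of free cells adjacent to at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            neighbors = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if np.any(neighbors == UNKNOWN):
                frontiers.append((r, c))
    return frontiers

# Tiny hypothetical map: mostly unknown, a free corridor, one wall cell.
grid = np.full((6, 6), UNKNOWN)
grid[2, 0:4] = FREE
grid[3, 2] = OCCUPIED
print(find_frontiers(grid))  # candidate goals for the next exploration move
```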

  15. Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science

    NASA Astrophysics Data System (ADS)

    Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.

    2017-09-01

    Cybernetics provides a new set of ideas and methods for the study of modern science, and it has been applied extensively in many areas. However, few researchers have introduced cybernetics into the field of remote sensing. Starting from the imaging process of a remote sensing system, this paper introduces cybernetics into remote sensing and establishes a space-time closed-loop control theory for its actual operation. The approach makes the flow of spatial information coherent and improves the overall efficiency of spatial information from acquisition and processing through transformation to application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process. Information from the output is fed back to the input to control the efficient operation of the entire system. This combination of cybernetics and remote sensing science is expected to raise remote sensing science to a higher level.
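
    The abstract stays at the conceptual level, so the sketch below is a purely hypothetical illustration of the feedback idea it describes: an output-side quality metric (here, mean image brightness) is fed back to adjust an input-side acquisition parameter (here, integration time). The metric, parameter, and gains are invented for illustration and are not taken from the paper.

```python
def closed_loop_exposure(capture, integration_ms=10.0, target_brightness=0.5,
                         gain=20.0, steps=10):
    """Simple proportional feedback from image brightness to integration time."""
    for _ in range(steps):
        frame_brightness = capture(integration_ms)                 # "output" of the imaging chain
        error = target_brightness - frame_brightness               # compare against the goal
        integration_ms = max(0.1, integration_ms + gain * error)   # feed back to the "input"
    return integration_ms

# Stand-in sensor model: brightness grows with integration time and saturates at 1.0.
simulated_sensor = lambda t_ms: min(1.0, 0.03 * t_ms)
print(closed_loop_exposure(simulated_sensor))  # settles near the target brightness
```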

  16. Development of the SEASIS instrument for SEDSAT

    NASA Technical Reports Server (NTRS)

    Maier, Mark W.

    1996-01-01

    Two SEASIS experiment objectives are key: take images that allow three-axis attitude determination and take multi-spectral images of the earth. During the tether mission it is also desirable to capture images of the recoiling tether from the endmass perspective (which has never been observed). SEASIS must store all its imagery taken during the tether mission until the earth downlink can be established. SEASIS determines attitude with a panoramic camera and performs earth observation with a telephoto lens camera. Camera video is digitized, compressed, and stored in solid state memory. These objectives are addressed through the following architectural choices: (1) A camera system using a Panoramic Annular Lens (PAL). This lens has a 360 deg. azimuthal field of view by a +45 degree vertical field measured from a plane normal to the lens boresight axis. It has been shown in Mr. Mark Steadham's UAH M.S. thesis that this camera can determine three-axis attitude anytime the earth and one other recognizable celestial object (for example, the sun) is in the field of view. This will be essentially all the time during tether deployment. (2) A second camera system using a telephoto lens and filter wheel. The camera is a black and white standard video camera. The filters are chosen to cover the visible spectral bands of remote sensing interest. (3) A processor and mass memory arrangement linked to the cameras. Video signals from the cameras are digitized, compressed in the processor, and stored in a large static RAM bank. The processor is a multi-chip module consisting of a T800 Transputer and three Zoran floating point Digital Signal Processors. This processor module was supplied under ARPA contract by the Space Computer Corporation to demonstrate its use in space.

  17. UVMAS: Venus ultraviolet-visual mapping spectrometer

    NASA Astrophysics Data System (ADS)

    Bellucci, G.; Zasova, L.; Altieri, F.; Nuccilli, F.; Ignatiev, N.; Moroz, V.; Khatuntsev, I.; Korablev, O.; Rodin, A.

    This paper summarizes the capabilities and technical solutions of an Ultraviolet Visual Mapping Spectrometer designed for remote sensing of Venus from a planetary orbiter. The UVMAS consists of a multichannel camera with a spectral range of 0.19-0.49 μm which acquires data in several spectral channels (up to 400) with a spectral resolution of 0.58 nm. The instantaneous field of view of the instrument is 0.244 × 0.244 mrad. These characteristics allow: a) study of the upper cloud dynamics and chemistry; b) constraints on the unknown absorber; c) observation of the night-side airglow.

  18. Exploration of Mars with the ChemCam LIBS Instrument and the Curiosity Rover

    NASA Technical Reports Server (NTRS)

    Newsom, Horton E.

    2016-01-01

    The Mars Science Laboratory (MSL) Curiosity rover landed on Mars in August 2012, and has been exploring the planet ever since. Dr. Horton E. Newsom will discuss the MSL's design and main goal, which is to characterize past environments that may have been conducive to the evolution and sustainability of life. He will also discuss Curiosity's science payload, and remote sensing, analytical capabilities, and direct discoveries of the Chemistry & Camera (ChemCam) instrument, which is the first Laser Induced Breakdown Spectrometer (LIBS) to operate on another planetary surface and determine the chemistry of the rocks and soils.

  19. Remote Sensing in Polarized Light

    NASA Technical Reports Server (NTRS)

    Whitehead, Victor S.; Coulson, Kinsell L.

    1988-01-01

    Preliminary analysis of polarized images of earth collected by hand-held cameras on Shuttle Missions 51A, 51G, 51I, and 61A indicates that information on the earth's surface and atmosphere exists in those data. To ensure that follow-on research in polarization is properly focused and that the experiments are properly designed to address specific questions, 26 scientists with past experience and interest in polarization observations met at the Lyndon B. Johnson Space Center on November 3 to 5, 1987. This conference report summarizes the discussions and provides the recommendations of the group for follow-on research.

  20. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
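
    The core scene-construction step described here, draping tiled orthorectified imagery over a geo-registered DEM, can be illustrated with a minimal sketch that pairs each DEM grid cell with the reflectance sampled from a co-registered orthoimage. The regular-grid alignment and single-band reflectance are simplifying assumptions for illustration, not PaySim's actual implementation.

```python
import numpy as np

def drape_reflectance(dem, ortho, origin_xy, cell_size):
    """Build an (N, 4) array of [x, y, z, reflectance] surface points.

    Assumes the DEM and orthoimage are already co-registered on the same grid,
    which is a simplification of the tiling/orthorectification pipeline.
    """
    rows, cols = dem.shape
    cgrid, rgrid = np.meshgrid(np.arange(cols), np.arange(rows))
    x = origin_xy[0] + cgrid * cell_size
    y = origin_xy[1] - rgrid * cell_size          # image rows run north -> south
    return np.column_stack([x.ravel(), y.ravel(), dem.ravel(), ortho.ravel()])

# Hypothetical 3x3 tile: gentle slope with a uniform reflectance of 0.2.
dem = np.array([[10.0, 10.5, 11.0], [10.2, 10.7, 11.2], [10.4, 10.9, 11.4]])
ortho = np.full_like(dem, 0.2)
surface = drape_reflectance(dem, ortho, origin_xy=(500000.0, 4100000.0), cell_size=2.0)
print(surface.shape)  # (9, 4) textured surface samples ready for ray tracing
```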

  1. Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt

    NASA Technical Reports Server (NTRS)

    Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.

    1977-01-01

    Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the zone's rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 infrared passive scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near-infrared bands. The scanner was operated in the thermal infrared band of 8 to 14 microns, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation, and drainage maps for the Suez Canal Zone on a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.

  2. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. The sensitivity of each variable involved in the calibration was computed with the Sobol method, which is based on the analysis of the variance breakdown, and the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
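
    As a concrete picture of variance-based sensitivity analysis of the kind the Sobol method provides, the sketch below estimates first-order Sobol indices for a toy calibration-error function with a standard pick-freeze Monte Carlo estimator. The three-parameter error model is an invented stand-in for the real camera-LiDAR calibration, not the authors' model.

```python
import numpy as np

def first_order_sobol(model, bounds, n=20000, seed=0):
    """Estimate first-order Sobol indices with a pick-freeze (Saltelli-style) estimator."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    lo, hi = np.array(bounds).T
    a = rng.uniform(lo, hi, size=(n, d))       # two independent sample matrices
    b = rng.uniform(lo, hi, size=(n, d))
    fa, fb = model(a), model(b)
    var = np.var(np.concatenate([fa, fb]))
    indices = []
    for i in range(d):
        ab_i = a.copy()
        ab_i[:, i] = b[:, i]                   # swap in column i from B
        indices.append(np.mean(fb * (model(ab_i) - fa)) / var)
    return np.array(indices)

# Toy reprojection-error model: translation offset dominates, angle and scale matter less.
def calib_error(x):
    dt, dtheta, dscale = x[:, 0], x[:, 1], x[:, 2]
    return 3.0 * dt + 1.0 * np.sin(dtheta) + 0.3 * dscale**2

print(first_order_sobol(calib_error, bounds=[(-1, 1), (-0.5, 0.5), (-1, 1)]))
```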

  3. Estimating the Infrared Radiation Wavelength Emitted by a Remote Control Device Using a Digital Camera

    ERIC Educational Resources Information Center

    Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol

    2011-01-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)
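
    The measurement principle can be reduced to a short worked example: for the same grating and geometry, the grating equation d·sin(θ) = m·λ means that first-order fringe offsets of the two sources scale with their wavelengths, so λ_IR ≈ λ_red · sin(θ_IR)/sin(θ_red). The pixel offsets, camera geometry, and red-laser wavelength below are assumed values for illustration, not the article's data.

```python
import math

def wavelength_from_fringes(lambda_ref_nm, x_ref_px, x_unknown_px, px_per_mm, screen_dist_mm):
    """Infer an unknown wavelength from first-order fringe offsets measured in the same image."""
    def sin_theta(x_px):
        x_mm = x_px / px_per_mm                        # fringe offset on the screen
        return x_mm / math.hypot(x_mm, screen_dist_mm)
    return lambda_ref_nm * sin_theta(x_unknown_px) / sin_theta(x_ref_px)

# Assumed numbers: 650 nm laser fringe at 410 px, remote-control fringe at 590 px.
print(round(wavelength_from_fringes(650.0, 410.0, 590.0, px_per_mm=8.0, screen_dist_mm=300.0)))
# Prints a value near 920 nm, consistent with typical infrared remote-control LEDs.
```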

  4. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  5. Near-earth orbital guidance and remote sensing

    NASA Technical Reports Server (NTRS)

    Powers, W. F.

    1972-01-01

    The curriculum of a short course in remote sensing and parameter optimization is presented. The subjects discussed are: (1) basics of remote sensing and the user community, (2) multivariant spectral analysis, (3) advanced mathematics and physics of remote sensing, (4) the atmospheric environment, (5) imaging sensing, and (6) nonimaging sensing. Mathematical models of optimization techniques are developed.

  6. Operational programs in forest management and priority in the utilization of remote sensing

    NASA Technical Reports Server (NTRS)

    Douglass, R. W.

    1978-01-01

    A speech is given on operational remote sensing programs in forest management and the importance of remote sensing in forestry is emphasized. Forest service priorities in using remote sensing are outlined.

  7. Remote sensing, land use, and demography - A look at people through their effects on the land

    NASA Technical Reports Server (NTRS)

    Paul, C. K.; Landini, A. J.

    1976-01-01

    Relevant causes of failure by the remote sensing community in the urban scene are analyzed. The reasons for the insignificant role of remote sensing in urban land use data collection are called the law of realism, the incompatibility of remote sensing and urban management system data formats is termed the law of nominal/ordinal systems compatibility, and the land use/population correlation dilemma is referred to as the law of missing persons. The study summarizes the three laws of urban land use information for which violations, avoidance, or ignorance have caused the decline of present remote sensing research. Particular attention is given to the rationale for urban land use information and for remote sensing. It is shown that remote sensing of urban land uses compatible with the three laws can be effectively developed by realizing the 10 percent contribution of remote sensing to urban land use planning data collection.

  8. Thematic Conference on Geologic Remote Sensing, 8th, Denver, CO, Apr. 29-May 2, 1991, Proceedings. Vols. 1 & 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The proceedings contain papers discussing the state-of-the-art exploration, engineering, and environmental applications of geologic remote sensing, along with the research and development activities aimed at increasing the future capabilities of this technology. The following topics are addressed: spectral geology, U.S. and international hydrocarbon exploration, radar and thermal infrared remote sensing, engineering geology and hydrogeology, mineral exploration, remote sensing for marine and environmental applications, image processing and analysis, geobotanical remote sensing, and data integration and geographic information systems. Particular attention is given to spectral alteration mapping with imaging spectrometers, mapping the coastal plain of the Congo with airborne digital radar, applications of remote sensing techniques to the assessment of dam safety, remote sensing of ferric iron minerals as guides for gold exploration, principal component analysis for alteration mapping, and the application of remote sensing techniques for gold prospecting in the north Fujian province.

  9. Methods of training the graduate level and professional geologist in remote sensing technology

    NASA Technical Reports Server (NTRS)

    Kolm, K. E.

    1981-01-01

    Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.

  10. Remote sensing of Earth terrain

    NASA Technical Reports Server (NTRS)

    Kong, J. A.

    1993-01-01

    A progress report on remote sensing of Earth terrain covering the period from Jan. to June 1993 is presented. Areas of research include: radiative transfer model for active and passive remote sensing of vegetation canopy; polarimetric thermal emission from rough ocean surfaces; polarimetric passive remote sensing of ocean wind vectors; polarimetric thermal emission from periodic water surfaces; layer model with random spheroidal scatterers for remote sensing of vegetation canopy; application of theoretical models to active and passive remote sensing of saline ice; radiative transfer theory for polarimetric remote sensing of pine forest; scattering of electromagnetic waves from a dense medium consisting of correlated Mie scatterers with size distributions and applications to dry snow; variance of phase fluctuations of waves propagating through a random medium; polarimetric signatures of a canopy of dielectric cylinders based on first and second order vector radiative transfer theory; branching model for vegetation; polarimetric passive remote sensing of periodic surfaces; composite volume and surface scattering model; and radar image classification.

  11. Remote sensing by satellite - Technical and operational implications for international cooperation

    NASA Technical Reports Server (NTRS)

    Doyle, S. E.

    1976-01-01

    International cooperation in the U.S. Space Program is discussed and related to the NASA program for remote sensing of the earth. Satellite remote sensing techniques are considered along with the selection of the best sensors and wavelength bands. The technology of remote sensing satellites is considered with emphasis on the Landsat system configuration. Future aspects of remote sensing satellites are considered.

  12. Remote sensing in operational range management programs in Western Canada

    NASA Technical Reports Server (NTRS)

    Thompson, M. D.

    1977-01-01

    A pilot program carried out in Western Canada to test remote sensing under semi-operational conditions and display its applicability to operational range management programs was described. Four agencies were involved in the program, two in Alberta and two in Manitoba. Each had different objectives and needs for remote sensing within its range management programs, and each was generally unfamiliar with remote sensing techniques and their applications. Personnel with experience and expertise in the remote sensing and range management fields worked with the agency personnel through every phase of the pilot program. Results indicate that these agencies have found remote sensing to be a cost effective tool and will begin to utilize remote sensing in their operational work during ensuing seasons.

  13. 3-dimensional telepresence system for a robotic environment

    DOEpatents

    Anderson, Matthew O.; McKay, Mark D.

    2000-01-01

    A telepresence system includes a camera pair remotely controlled by a control module affixed to an operator. The camera pair provides for three dimensional viewing and the control module, affixed to the operator, affords hands-free operation of the camera pair. In one embodiment, the control module is affixed to the head of the operator and an initial position is established. A triangulating device is provided to track the head movement of the operator relative to the initial position. A processor module receives input from the triangulating device to determine where the operator has moved relative to the initial position and moves the camera pair in response thereto. The movement of the camera pair is predetermined by a software map having a plurality of operation zones. Each zone therein corresponds to unique camera movement parameters such as speed of movement. Speed parameters include constant, increasing, or decreasing speed. Other parameters include panning, tilting, sliding, raising, or lowering of the cameras. Other user interface devices are provided to improve the three dimensional control capabilities of an operator in a local operating environment. Such other devices include a pair of visual display glasses, a microphone and a remote actuator. The pair of visual display glasses is provided to facilitate three dimensional viewing, hence depth perception. The microphone affords hands-free camera movement by utilizing voice commands. The actuator allows the operator to remotely control various robotic mechanisms in the remote operating environment.

  14. NEON Airborne Remote Sensing of Terrestrial Ecosystems

    NASA Astrophysics Data System (ADS)

    Kampe, T. U.; Leisso, N.; Krause, K.; Karpowicz, B. M.

    2012-12-01

    The National Ecological Observatory Network (NEON) is the continental-scale research platform that will collect information on ecosystems across the United States to advance our understanding of, and ability to forecast, environmental change at the continental scale. One of NEON's observing systems, the Airborne Observation Platform (AOP), will fly an instrument suite consisting of a high-fidelity visible-to-shortwave infrared imaging spectrometer, a full-waveform small-footprint LiDAR, and a high-resolution digital camera on a low-altitude aircraft platform. NEON AOP is focused on acquiring data on several terrestrial Essential Climate Variables including bioclimate, biodiversity, biogeochemistry, and land use products. These variables are collected throughout a network of 60 sites across the continental United States, Alaska, Hawaii and Puerto Rico via ground-based and airborne measurements. Airborne remote sensing plays a critical role by providing measurements at the scale of individual shrubs and larger plants over hundreds of square kilometers. The NEON AOP bridges the spatial scales from individual organisms and stands to the scale of satellite-based remote sensing. NEON is building 3 airborne systems to facilitate routine coverage of NEON sites and provide the capacity to respond to investigator requests for specific projects. The first NEON imaging spectrometer, a next-generation VSWIR instrument, was recently delivered to NEON by JPL. This instrument has been integrated with a small-footprint waveform LiDAR on the first NEON airborne platform (AOP-1). A series of AOP-1 test flights were conducted during the first year of NEON's construction phase. The goal of these flights was to test instrument functionality and performance, exercise remote sensing collection protocols, and provide provisional data for algorithm and data product validation. These test flights focused on the following questions: What is the optimal remote sensing data collection protocol to meet NEON science requirements? How do aircraft altitude, spatial sampling, spatial resolution, and LiDAR instrument configuration affect data retrievals? What are appropriate algorithms to derive ECVs from AOP data? What methodology should be followed to validate AOP remote sensing products and how should ground truth data be collected? Early test flights focused on radiometric and geometric calibration as well as processing from raw data to Level-1 products. Subsequent flights focused on collecting vegetation chemistry and structure measurements. The test flights conducted during 2012 proved extremely valuable for verifying instrument functionality and performance, exercising remote sensing collection protocols, and providing data for algorithm and science product validation. Results from these early flights are presented, including the radiometric and geometric calibration of the AOP instruments. These 2012 flight campaigns are just the first of a series of test flights that will take place over the next several years as part of the NEON observatory construction. Lessons learned from these early campaigns will inform both airborne and ground data collection methodologies for future campaigns as well as guide the AOP sampling strategy before NEON enters full science operations.

  15. Focus adjustment method for CBERS 3 and 4 satellites Mux camera to be performed in air condition and its experimental verification for best performance in orbital vacuum condition

    NASA Astrophysics Data System (ADS)

    Scaduto, Lucimara C. N.; Malavolta, Alexandre T.; Modugno, Rodrigo G.; Vales, Luiz F.; Carvalho, Erica G.; Evangelista, Sérgio; Stefani, Mario A.; de Castro Neto, Jarbas C.

    2017-11-01

    The first Brazilian remote sensing multispectral camera (MUX) is currently under development at Opto Eletronica S.A. It consists of a four-spectral-band sensor covering the 450 nm to 890 nm wavelength range. This camera will provide images with a 20 m ground resolution at nadir. The MUX camera is part of the payload of the upcoming Sino-Brazilian satellites CBERS 3&4 (China-Brazil Earth Resource Satellite). The preliminary alignment between the optical system and the CCD sensor, which is located at the focal plane assembly, was obtained in air, in a clean room environment. A collimator was used for the performance evaluation of the camera. The preliminary performance evaluation of the optical channel was carried out by compensating the collimator focus position for changes in the test environment, since an air-to-vacuum transition leads to defocus in this camera. Therefore, it is necessary to confirm that the alignment of the camera ensures that its best performance is reached under orbital vacuum conditions. For this reason, and as a further step in the development process, the MUX camera Qualification Model was tested and evaluated inside a thermo-vacuum chamber and submitted to an as-in-orbit vacuum environment. In this study, the influence of temperature fields was neglected. This paper reports on the performance evaluation and discusses the results for this camera when operating under the test conditions mentioned above. The overall optical tests and results show that the "in air" adjustment method was suitable, as a critical activity, to guarantee that the equipment meets its design requirements.

  16. PROCEEDINGS OF THE FOURTH SYMPOSIUM ON REMOTE SENSING OF ENVIRONMENT; 12, 13, 14 APRIL 1966.

    DTIC Science & Technology

    The symposium was conducted as part of a continuing program investigating the field of remote sensing, its potential in scientific research and... information on all aspects of remote sensing, with special emphasis on such topics as needs for remotely sensed data, data management, and the special... remote sensing programs, data acquisition, data analysis and application, and equipment design, were presented. (Author)

  17. Remote sensing and image interpretation

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Kiefer, R. W. (Principal Investigator)

    1979-01-01

    A textbook prepared primarily for use in introductory courses in remote sensing is presented. Topics covered include concepts and foundations of remote sensing; elements of photographic systems; introduction to airphoto interpretation; airphoto interpretation for terrain evaluation; photogrammetry; radiometric characteristics of aerial photographs; aerial thermography; multispectral scanning and spectral pattern recognition; microwave sensing; and remote sensing from space.

  18. The Ship Tethered Aerostat Remote Sensing System (STARRS): Observations of Small-Scale Surface Lateral Transport During the LAgrangian Submesoscale ExpeRiment (LASER)

    NASA Astrophysics Data System (ADS)

    Carlson, D. F.; Novelli, G.; Guigand, C.; Özgökmen, T.; Fox-Kemper, B.; Molemaker, M. J.

    2016-02-01

    The Consortium for Advanced Research on the Transport of Hydrocarbon in the Environment (CARTHE) will carry out the LAgrangian Submesoscale ExpeRiment (LASER) to study the role of small-scale processes in the transport and dispersion of oil and passive tracers. The Ship-Tethered Aerostat Remote Sensing System (STARRS) was developed to produce observational estimates of small-scale surface dispersion in the open ocean. STARRS is built around a high-lift-capacity (30 kg) helium-filled aerostat and is equipped with a high resolution digital camera. An integrated GNSS receiver and inertial navigation system permit direct geo-rectification of the imagery. Thousands of drift cards deployed in the field of view of STARRS and tracked over time provide the first observational estimates of small-scale (1-500 m) surface dispersion in the open ocean. The STARRS imagery will be combined with GPS-tracked surface drifter trajectories, shipboard observations, and aerial surveys of sea surface temperature in the DeSoto Canyon. In addition to obvious applications to oil spill modelling, the STARRS observations will provide essential benchmarks for high resolution numerical models.

  19. Estimating Sulfur Dioxide in Volcanic Plumes Using an Ultraviolet Camera. First Results from Lascar, Ollagüe and Irruputuncu Volcanoes

    NASA Astrophysics Data System (ADS)

    Geoffroy, C. A.; Amigo, A.

    2014-12-01

    Volcanic gas fluxes give important information on both the amount of degassing and the magma reservoirs. In most magmas, water vapor (H2O) and carbon dioxide (CO2) are the major components of volcanic gas. However, sulfur dioxide (SO2) is one of the main targets of remote sensing because of its low background concentration in the environment and its easy detection by ultraviolet spectroscopy. Accordingly, plume imaging using passive ultraviolet cameras is a relatively simple and expeditious method for studying volcanic degassing and can be used from distances of up to about 10 km from the source of emissions. We estimated SO2 concentrations and fluxes in volcanic plumes with the ultraviolet camera Envicam-2, developed by Nicarnica Aviation and acquired by the Geological Survey of Chile (SERNAGEOMIN). The camera has filters that allow the passage of ultraviolet radiation at the wavelengths of interest. To determine whether there is absorption of radiation associated with the presence of SO2, the Beer-Lambert law was used to quantify concentrations with appropriate calibration cells. SO2 emissions to the atmosphere were estimated using wind speed as an approximation of the plume transport speed. In this study we report the implementation of a new methodology for using Envicam-2 and the subsequent retrieval of SO2 concentrations and fluxes at passively degassing volcanoes. Measurements were made at Lascar, Ollagüe and Irruputuncu volcanoes, located in northern Chile. These volcanoes were chosen because of the optimal atmospheric conditions for ultraviolet imaging. Results indicate concentrations within the expected ranges for the three volcanoes, generally between 400 and 1700 ppm•m. In the case of Láscar volcano, the emission rates of SO2 range from 250 to 500 tonnes/day for a single image of the plume. In particular, wind speed was determined from scaling the images and is consistent with data from regional numerical models, as well as records of the meteorological stations installed at the ALMA astronomical center, located about 40 km north of the volcano. This study reveals new insights and challenges related to remote sensing of volcanic gases in Chile. In particular, the evolution of SO2 emissions at active volcanoes can be a powerful monitoring tool that can be complemented with other geophysical techniques.
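
    To make the Beer-Lambert step concrete, the sketch below converts on-band/off-band UV radiances to apparent absorbance, fits a linear calibration against cells of known SO2 column density, and sums a plume cross-section times wind speed to get a flux. The two-filter scheme, cell values, and numbers are generic illustrations of this class of method, not the settings or data of the Envicam-2 campaign.

```python
import numpy as np

def apparent_absorbance(i_on, i_off, bg_on, bg_off):
    """Apparent absorbance from plume (i_*) and clear-sky background (bg_*) radiances."""
    return -np.log10((i_on / bg_on) / (i_off / bg_off))

def calibrate(cell_columns_ppmm, cell_absorbances):
    """Least-squares slope/intercept mapping absorbance -> SO2 column (ppm*m)."""
    slope, intercept = np.polyfit(cell_absorbances, cell_columns_ppmm, 1)
    return slope, intercept

# Hypothetical calibration cells and a one-pixel-wide plume transect.
slope, intercept = calibrate(np.array([0.0, 500.0, 1000.0]), np.array([0.0, 0.11, 0.22]))
i_on = np.array([0.95, 0.78, 0.66, 0.74, 0.93])      # on-band radiances along the transect
i_off = np.array([1.00, 0.99, 0.98, 0.99, 1.00])     # off-band radiances along the transect
absorbance_profile = apparent_absorbance(i_on, i_off, bg_on=1.0, bg_off=1.0)
column_ppmm = slope * absorbance_profile + intercept

# ppm*m -> kg/m^2 (about 2.66e-6 kg m^-2 per ppm*m near standard conditions, an approximation),
# then integrate across the transect and multiply by plume speed for a flux estimate.
pixel_width_m, plume_speed_ms = 5.0, 8.0
flux_kg_s = np.sum(column_ppmm * 2.66e-6) * pixel_width_m * plume_speed_ms
print(f"{flux_kg_s * 86400 / 1000:.1f} tonnes/day")
```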

  20. Geotechnical applications of remote sensing and remote data transmission; Proceedings of the Symposium, Cocoa Beach, FL, Jan. 31-Feb. 1, 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, A.I.; Pettersson, C.B.

    1988-01-01

    Papers and discussions concerning the geotechnical applications of remote sensing and remote data transmission, sources of remotely sensed data, and glossaries of remote sensing and remote data transmission terms, acronyms, and abbreviations are presented. Aspects of remote sensing use covered include the significance of lineaments and their effects on ground-water systems, waste-site use and geotechnical characterization, the estimation of reservoir submerging losses using CIR aerial photographs, and satellite-based investigation of the significance of surficial deposits for surface mining operations. Other topics presented include the location of potential ground subsidence and collapse features in soluble carbonate rock, optical Fourier analysis of surface features of interest in geotechnical engineering, geotechnical applications of U.S. Government remote sensing programs, updating the data base for a Geographic Information System, the joint NASA/Geosat Test Case Project, the selection of remote data telemetry methods for geotechnical applications, the standardization of remote sensing data collection and transmission, and a comparison of airborne Goodyear electronic mapping system/SAR with satelliteborne Seasat/SAR radar imagery.

  1. Education in Environmental Remote Sensing: Potentials and Problems.

    ERIC Educational Resources Information Center

    Kiefer, Ralph W.; Lillesand, Thomas M.

    1983-01-01

    Discusses remote sensing principles and applications and the status and needs of remote sensing education in the United States. A summary of the fundamental policy issues that will determine remote sensing's future role in environmental and resource managements is included. (Author/BC)

  2. THE EPA REMOTE SENSING ARCHIVE

    EPA Science Inventory

    What would you do if you were faced with organizing 30 years of remote sensing projects that had been haphazardly stored at two separate locations for years and then combined? The EPA Remote Sensing Archive, currently located in Las Vegas, Nevada, contains the remote sensing data and...

  3. Jellyfish Patch Detecting Using Low Latitude Remote Sensing System

    NASA Astrophysics Data System (ADS)

    Lee, J. S.; Jo, Y. H.

    2015-12-01

    Jellyfish can reproduce asexually or sexually depending on the environment, and they have better environmental adaptability and reproductive capacity than many other sea creatures. If the marine environment deteriorates, jellyfish can gain an advantage in the competition for survival. Marine environmental changes caused by rapid climate change, dyke construction, and land reclamation will increase the amount of jellyfish and, as a result, can lead to various social and economic problems. In this study, jellyfish were observed in a coastal area using a low-altitude Helikite remote sensing system for the first time. A Helikite is a type of helium balloon combined with a kite that can acquire data with optical sensors at the desired spatial resolutions by adjusting the altitude. In addition, it has the advantage that it can monitor objects for a long time at one place as long as the electric power and helium last. In this study, we observed jellyfish patches using a digital camera in the Chesapeake Bay and estimated the populations and sizes of jellyfish patches through image processing. The results suggest that we can make long-term, real-time observations not only of jellyfish, but also of other harmful marine creatures.
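
    The image-processing step is not described in detail, so the sketch below shows one plausible approach to estimating patch counts and sizes: threshold a grayscale frame and measure connected contours with OpenCV. The threshold value and the area-to-size conversion are placeholders, not the study's actual processing chain.

```python
import cv2
import numpy as np

def measure_patches(frame_bgr, threshold=180, min_area_px=50, m_per_px=0.05):
    """Count bright patches in a frame and report their areas in square metres."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    areas_px = [cv2.contourArea(c) for c in contours if cv2.contourArea(c) >= min_area_px]
    return len(areas_px), [a * m_per_px**2 for a in areas_px]

# Synthetic frame with two bright blobs standing in for jellyfish patches.
frame = np.zeros((200, 200, 3), dtype=np.uint8)
cv2.circle(frame, (60, 60), 12, (255, 255, 255), -1)
cv2.circle(frame, (140, 120), 20, (255, 255, 255), -1)
count, areas_m2 = measure_patches(frame)
print(count, [round(a, 2) for a in areas_m2])
```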

  4. Remotely sensed data available from the US Geological Survey EROS Data Center

    USGS Publications Warehouse

    Dwyer, John L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy , R.E.; Salomonson, V.V.

    2006-01-01

    The Center for Earth Resources Observation Systems (EROS) is a field center of the geography discipline within the US Geological Survey (USGS) of the Department of the Interior. The EROS Data Center (EDC) was established in the early 1970s as the nation’s principal archive of remotely sensed data. Initially the EDC was responsible for the archive, reproduction, and distribution of black-and-white and color-infrared aerial photography acquired under numerous mapping programs conducted by various Federal agencies including the USGS, Department of Agriculture, Environmental Protection Agency, and NASA. The EDC was also designated the central archive for data acquired by the first satellite sensor designed for broad-scale earth observations in support of civilian agency needs for earth resource information. A four-band multispectral scanner (MSS) and a return-beam vidicon (RBV) camera were initially flown on the Earth Resources Technology Satellite-1, subsequently designated Landsat-1. The synoptic coverage, moderate spatial resolution, and multi-spectral view provided by these data stimulated scientists with an unprecedented perspective from which to study the Earth’s surface and to understand the relationships between human activity and natural systems.

  5. Crew Earth Observations: Twelve Years of Documenting Earth from the International Space Station

    NASA Technical Reports Server (NTRS)

    Evans, Cynthia A.; Stefanov, William L.; Willis, Kimberley; Runco, Susan; Wilkinson, M. Justin; Dawson, Melissa; Trenchard, Michael

    2012-01-01

    The Crew Earth Observations (CEO) payload was one of the initial experiments aboard the International Space Station, and has been continuously collecting data about the Earth since Expedition 1. The design of the experiment is simple: using state-of-the-art camera equipment, astronauts collect imagery of the Earth's surface over defined regions of scientific interest and also document dynamic events such as storm systems, floods, wildfires and volcanic eruptions. To date, CEO has provided roughly 600,000 images of Earth, capturing views of features and processes on land, the oceans, and the atmosphere. CEO data are less rigorously constrained than other remote sensing data, but the volume of data and the unique attributes of the imagery provide a rich and understandable view of the Earth that is difficult to achieve from classic remote sensing platforms. In addition, the length of record of the imagery dataset, especially when combined with astronaut photography from other NASA and Russian missions starting in the early 1960s, provides a valuable record of changes on the surface of the Earth over 50 years. This time period coincides with the rapid growth of human settlements and human infrastructure.

  6. Multi-Sensor Radiometric Study to Detect Pathologies in Historical Buildings

    NASA Astrophysics Data System (ADS)

    Del Pozo, S.; Herrero-Pascual, J.; Felipe-García, B.; Hernández-López, D.; Rodríguez-Gonzálvez, P.; González-Aguilera, D.

    2015-02-01

    This paper presents a comparative study of different remote sensing technologies for recognizing pathologies in the façades of historical buildings. Building materials deteriorate over the years due to different extrinsic and intrinsic agents, so assessing these pathologies in a non-invasive way is crucial to help preserve the buildings. Most of these buildings are extremely valuable and some of them have been declared monuments of cultural interest. Through close-range remote sensing techniques it is possible to study material pathologies in a rigorous way and within a short field campaign. For the investigation, two different acquisition systems were applied: active and passive methods. The terrestrial laser scanner FARO Focus 3D was used as the active sensor, working at a wavelength of 905 nm. For the passive sensors, a Nikon D-5000 and a 6-band Mini-MCA multispectral camera (530-801 nm) were applied, covering the visible and near-infrared spectral range. This analysis allows assessing the suitability of each sensor, or sensor combination, for pathology detection, addressing the limitations according to the spatial and spectral resolution. Moreover, pathology detection by unsupervised classification methods is addressed in order to evaluate the automation capability of this process.
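
    As an illustration of the unsupervised-classification step, the sketch below clusters the pixels of a multi-band image stack with k-means so that recurring spectral signatures (for example intact stone, biological crusts, moisture stains) fall into separate classes. The band count, class count, and use of scikit-learn are assumptions for illustration rather than the paper's processing chain.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_bands(band_stack, n_classes=4, seed=0):
    """Unsupervised classification of an (H, W, bands) reflectance stack."""
    h, w, b = band_stack.shape
    pixels = band_stack.reshape(-1, b)                     # one spectrum per row
    labels = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(h, w)                            # class map for the facade

# Hypothetical 6-band facade patch: random reflectances with a darker "stained" corner.
rng = np.random.default_rng(1)
stack = rng.uniform(0.4, 0.6, (64, 64, 6))
stack[40:, 40:, :] *= 0.5                                   # simulated pathology region
class_map = classify_bands(stack)
print(np.unique(class_map, return_counts=True))
```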

  7. Thermophysical properties of the MER and Beagle II landing site regions on Mars

    NASA Astrophysics Data System (ADS)

    Jakosky, Bruce M.; Hynek, Brian M.; Pelkey, Shannon M.; Mellon, Michael T.; Martínez-Alonso, Sara; Putzig, Nathaniel E.; Murphy, Nate; Christensen, Philip R.

    2006-08-01

    We analyzed remote-sensing observations of the Isidis Basin, Gusev Crater, and Meridiani Planum landing sites for Beagle II, MER-A Spirit, and MER-B Opportunity spacecraft, respectively. We emphasized the thermophysical properties using daytime and nighttime radiance measurements from the Mars Global Surveyor (MGS) Thermal Emission Spectrometer and Mars Odyssey Thermal Emission Imaging System (THEMIS) and thermal inertias derived from nighttime data sets. THEMIS visible images, MGS Mars Orbiter Camera (MOC) narrow-angle images, and MGS Mars Orbiter Laser Altimeter (MOLA) data are incorporated as well. Additionally, the remote-sensing data were compared with ground-truth at the MER sites. The Isidis Basin surface layer has been shaped by aeolian processes and erosion by slope winds coming off of the southern highlands and funneling through notches between massifs. In the Gusev region, surface materials of contrasting thermophysical properties have been interpreted as rocks or bedrock, duricrust, and dust deposits; these are consistent with a complex geological history dominated by volcanic and aeolian processes. At Meridiani Planum the many layers having different thermophysical and erosional properties suggest periodic deposition of differing sedimentological facies possibly related to clast size, grain orientation and packing, or mineralogy.

  8. Remote sensing of deep hermatypic coral reefs in Puerto Rico and the U.S. Virgin Islands using the Seabed autonomous underwater vehicle

    NASA Astrophysics Data System (ADS)

    Armstrong, Roy A.; Singh, Hanumant

    2006-09-01

    Optical imaging of coral reefs and other benthic communities present below one attenuation depth, the limit of effective airborne and satellite remote sensing, requires the use of in situ platforms such as autonomous underwater vehicles (AUVs). The Seabed AUV, which was designed for high-resolution underwater optical and acoustic imaging, was used to characterize several deep insular shelf reefs of Puerto Rico and the US Virgin Islands using digital imagery. The digital photo transects obtained by the Seabed AUV provided quantitative data on living coral, sponge, gorgonian, and macroalgal cover as well as coral species richness and diversity. Rugosity, an index of structural complexity, was derived from the pencil-beam acoustic data. The AUV benthic assessments could provide the required information for selecting unique areas of high coral cover, biodiversity and structural complexity for habitat protection and ecosystem-based management. Data from Seabed sensors and related imaging technologies are being used to conduct multi-beam sonar surveys, 3-D image reconstruction from a single camera, photo mosaicking, image based navigation, and multi-sensor fusion of acoustic and optical data.

  9. Thermal infrared imaging of the temporal variability in stomatal conductance for fruit trees

    NASA Astrophysics Data System (ADS)

    Struthers, Raymond; Ivanova, Anna; Tits, Laurent; Swennen, Rony; Coppin, Pol

    2015-07-01

    Repeated measurements using thermal infrared remote sensing were used to characterize the change in canopy temperature over time and the factors that influenced this change on 'Conference' pear trees (Pyrus communis L.). Three different types of sensors were used: a leaf porometer to measure leaf stomatal conductance, a thermal infrared camera to measure the canopy temperature, and a meteorological sensor to measure weather variables. Stomatal conductance of water-stressed pear was significantly lower than in the control group 9 days after stress began. This decrease in stomatal conductance reduced transpiration, reducing the evaporative cooling and thereby increasing canopy temperature. Using thermal infrared imaging at wavelengths between 7.5 and 13 μm, the first significant difference was measured 18 days after stress began. A second-order derivative described the average rate of change of the difference between the stress treatment and the control group. The average rate of change was 0.06 mmol m-2 s-1 per day for stomatal conductance and -0.04 °C per day for canopy temperature. Thermal infrared remote sensing and the data analysis presented in this study demonstrate that the differences in canopy temperature between the water stress and control treatments due to stomatal regulation can be validated.
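
    The rate-of-change comparison reported here can be reproduced in outline by differencing the stressed and control time series and fitting a linear slope per day; the synthetic series below are placeholders, not the 'Conference' pear data.

```python
import numpy as np

def daily_rate_of_change(days, stressed, control):
    """Slope (units per day) of the stressed-minus-control difference over time."""
    difference = np.asarray(stressed) - np.asarray(control)
    slope, _ = np.polyfit(days, difference, 1)
    return slope

days = np.arange(0, 30, 3)
control_temp = np.full_like(days, 24.0, dtype=float)           # canopy temperature, deg C
stressed_temp = 24.0 + 0.04 * days                             # warms as stomata close
print(round(daily_rate_of_change(days, stressed_temp, control_temp), 3))  # ~0.04 deg C/day
```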

  10. Mini AERCam: A Free-Flying Robot for Space Inspection

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven

    2001-01-01

    The NASA Johnson Space Center Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a free-flying camera system for remote viewing and inspection of human spacecraft. The AERCam project team is currently developing a miniaturized version of AERCam known as Mini AERCam, a spherical nanosatellite 7.5 inches in diameter. Mini AERCam development builds on the success of AERCam Sprint, a 1997 Space Shuttle flight experiment, by integrating new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving these productivity-enhancing capabilities in a smaller package depends on aggressive component miniaturization. Technology innovations being incorporated include micro electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, rechargeable xenon gas propulsion, rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for laboratory demonstration on an airbearing table. A pilot-in-the-loop and hardware-in-the-loop simulation to simulate on-orbit navigation and dynamics will complement the airbearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides on-orbit views of the Space Shuttle and International Space Station unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by space-walking crewmembers.

  11. Coupling effect analysis between landslides, river channel changes and sediment budgets - extreme climate events in Laishe River, southern Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Kuo-Jen; Huang, Mei-Jen; Tseng, Chih-Ming

    2016-04-01

    Taiwan, due to its high seismicity and high annual rainfall, experiences numerous landslides every year, and their severe impacts affect the whole island. For catastrophic landslides, key information including the extent of the landslide, volume estimation, and the subsequent evolution is important when analyzing the triggering mechanism and performing hazard assessment and mitigation. Thus, morphological analysis gives a general overview of a landslide and is considered one of the most fundamental sources of information. Typhoon Morakot brought extreme, long-duration rainfall to Taiwan in August 2009 and caused severe disasters. In this study we integrate several technologies, especially Unmanned Aerial Vehicle (UAV) and multi-spectral camera surveys, to decipher the consequences, the potential hazard, and the social impact. In recent years, remote sensing technology has improved rapidly, providing a wide range of images and essential, precise information. This study integrates several methods, including: 1) remote sensing images gathered by Unmanned Aerial Vehicle (UAV) and aerial photos taken in different periods; 2) in-situ field geologic investigation; and 3) differential GPS and RTK GPS geomatic measurements. These methods allow constructing DTMs before and after the landslide, as well as for subsequent periods, from aerial photos and UAV-derived images. The data sets permit analysis of the morphological changes. In the past, studies of sediment budgets usually relied on field investigation, but because of inconvenient transportation, topographic barriers, or remote locations, such surveys sometimes could hardly be completed. The purpose of this study is to investigate the phenomenon of river migration and to evaluate the amount of migration along the Laishe River by analyzing 3D DEMs before and after Typhoon Morakot. The DEMs are built from aerial images taken by a digital mapping camera (DMC) and by the Airborne Digital Scanner 40 (ADS40) before and after the typhoon event. Recently, this research has integrated Unmanned Aerial Vehicle (UAV) and oblique photogrammetric technologies for image acquisition with 5-10 cm GSD photos. This approach permits construction of a true 3D model to decipher ground information more realistically. DSMs and DEMs at 10-20 cm resolution, together with field GPS data, were compiled to decipher the morphologic changes. All of this information, especially the true 3D model, provides detailed ground information that may be used to evaluate the landslide triggering mechanism and river channel evolution. The goal of this study is to integrate the UAS system to decipher the sliding process and morphologic changes of large landslide areas, sediment transport and budgets, and to investigate the phenomenon of river migration. The results of this study provide not only geomatics and GIS datasets of the hazards, but also essential geomorphologic information for other studies and for hazard mitigation and planning.
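
    The sediment-budget part of such an analysis boils down to DEM differencing: subtract the pre-event DEM from the post-event DEM, then sum the positive and negative cells times the cell area to get deposition and erosion volumes. The sketch below illustrates that arithmetic on a toy grid; the 10 m cell size and values are placeholders, not the Laishe River data.

```python
import numpy as np

def sediment_budget(dem_before, dem_after, cell_size_m):
    """Erosion and deposition volumes (m^3) from two co-registered DEMs."""
    dod = dem_after - dem_before                      # DEM of Difference
    cell_area = cell_size_m ** 2
    deposition = np.nansum(np.where(dod > 0, dod, 0.0)) * cell_area
    erosion = -np.nansum(np.where(dod < 0, dod, 0.0)) * cell_area
    return erosion, deposition, deposition - erosion  # net change

# Toy 3x3 reach: channel incision on the left, bank deposition on the right.
before = np.array([[102.0, 101.0, 100.0]] * 3)
after = np.array([[101.2, 101.0, 100.6]] * 3)
erosion, deposition, net = sediment_budget(before, after, cell_size_m=10.0)
print(erosion, deposition, net)   # 240.0 m^3 eroded, 180.0 m^3 deposited, -60.0 m^3 net
```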

  12. Research on remote sensing image pixel attribute data acquisition method in AutoCAD

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoyang; Sun, Guangtong; Liu, Jun; Liu, Hui

    2013-07-01

    Remote sensing images are widely used in AutoCAD, but AutoCAD lacks remote sensing image processing functions. In this paper, ObjectARX was used as the secondary development tool and combined with the Image Engine SDK to implement remote sensing image pixel attribute data acquisition in AutoCAD, which provides critical technical support for remote sensing image processing algorithms in the AutoCAD environment.

  13. Bibliography of Remote Sensing Techniques Used in Wetland Research.

    DTIC Science & Technology

    1993-01-01

    remote sensing technology for detecting changes in wetland environments. This report documents a bibliographic search conducted as part of that work unit on applications of remote sensing techniques in wetland research. Results were used to guide research efforts on the use of remote sensing technology for wetland change detection and assessment. The citations are presented in three appendixes, organized by wetland type, sensor type, and author.

  14. MACSAT - A Near Equatorial Earth Observation Mission

    NASA Astrophysics Data System (ADS)

    Kim, B. J.; Park, S.; Kim, E.-E.; Park, W.; Chang, H.; Seon, J.

    The MACSAT mission was initiated by Malaysia to launch a high-resolution remote sensing satellite into Near Equatorial Orbit (NEO). Due to its geographical location, Malaysia can derive large benefits from NEO satellite operation. From the baseline circular orbit at 685 km altitude with 7 degrees of inclination, the neighboring regions around Malaysian territory can be monitored frequently, and the equatorial belt around the globe can also be observed regularly with unique revisit characteristics. The primary objective of the MACSAT program is to develop and validate technologies for a near-equatorial-orbit remote sensing satellite system. MACSAT is optimally designed to accommodate an electro-optical Earth observation payload, the Medium-sized Aperture Camera (MAC). Joint Malaysian and Korean engineering teams have been formed for effective implementation of the satellite system, and an integrated team approach has been adopted for the joint development of MACSAT. MAC is a pushbroom camera with 2.5 m Ground Sampling Distance (GSD) in the panchromatic band and 5 m GSD in four multi-spectral bands. The satellite platform is a mini-class satellite; including the MAC payload, the satellite weighs under 200 kg. The spacecraft bus is optimally designed to support payload operations during 3 years of mission life. The payload has a 20 km swath width with ±30° of tilting capability, and a 32 Gbit solid-state recorder is implemented as the mass image storage. The ground element is an integrated ground station for mission control and payload operation. It is equipped with an S-band up/down link for commanding and telemetry reception as well as a 30 Mbps class X-band down link for image reception and processing. The MACSAT system is capable of generating 1:25,000-scale image maps, and it is also anticipated to have cross-track stereo imaging capability for Digital Elevation Model (DEM) generation.
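
    As a quick consistency check of the stated imaging geometry, the sketch below applies the standard pushbroom relations GSD = H·p/f and swath = GSD·n_pixels to the quoted altitude, GSD and swath. The focal length and pixel pitch are not given in the record, so only derived ratios are computed, and the 10 µm pitch mentioned in the comment is purely an assumption.

```python
# Back-of-the-envelope check of the stated MAC pushbroom geometry.
H = 685e3          # orbit altitude (m)
gsd_pan = 2.5      # panchromatic GSD (m)
swath = 20e3       # swath width (m)

n_pixels_across = swath / gsd_pan    # implied detector pixels across track
f_over_p = H / gsd_pan               # required focal-length / pixel-pitch ratio
print(f"~{n_pixels_across:.0f} pan pixels across track, f/p = {f_over_p:.3g}")
# e.g. an assumed 10 um pixel pitch would imply roughly a 2.74 m effective focal length
```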

  15. Spanning Scale and Platform to Track Spring and Autumn Phenology

    NASA Astrophysics Data System (ADS)

    Schwartz, Mark D.

    2016-04-01

    Important opportunities to further understanding of ecosystem processes can be realized through improved integration and utilization of multiple phenological measures. Combining satellite-derived remote sensing data, which facilitate needed spatial integration and large-area coverage, with detailed conventional (visual) ground observations, which provide necessary information on species timing differences, is an important path for advancement in this area. A relatively new resource to address this scaling issue is near-surface remote sensing data collected from fixed-position cameras. This paper presents ongoing findings from a multi-year comparison of the spring and autumn seasonal transitions in Downer Woods, a small urban woodlot on the University of Wisconsin-Milwaukee campus (43.08°N, 87.88°W) dominated by white ash (Fraxinus americana) and basswood (Tilia americana) trees. The study area is under observation from a visible/near-infrared camera installed in March 2013 that is part of the Phenocam network (http://phenocam.sr.unh.edu), and also has detailed ground-based species-specific visual phenological observations collected in both spring and autumn, as well as air/soil temperatures and light sensor data measured under the canopy. The results show that at this location, the Phenocam visible/near-infrared band data series can be successfully compared to aggregated species visual phenological observations. Further, both of these seasonal signals can in turn be simulated by process models based on seasonal temperature changes. Thus, the concurrent collection of these data suggests a coherent process whereby more robust ground-based species-aggregated "pixel" data can be produced which will be scalable to large areas, and potentially applicable to more complex environments and ecosystems. Such an approach could potentially improve phenology-based spatial estimates of carbon and energy flux.
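
    The greenness index most commonly derived from such fixed-camera imagery is the green chromatic coordinate, GCC = G / (R + G + B), averaged over a canopy region of interest. A minimal sketch is given below; the file name and ROI bounds are placeholders, not details from the study.

```python
# Green chromatic coordinate (GCC) from a single RGB camera frame (sketch).
import numpy as np
import imageio.v3 as iio

img = iio.imread("downer_woods_frame.jpg").astype("float64")
roi = img[200:800, 300:1200, :]                  # assumed canopy ROI (rows, cols)
r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
gcc = g / np.clip(r + g + b, 1e-6, None)         # avoid division by zero
print(f"mean GCC over ROI: {gcc.mean():.3f}")
```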

  16. Hyperspatial Thermal Imaging of Surface Hydrothermal Features at Pilgrim Hot Springs, Alaska using a small Unmanned Aerial System (sUAS)

    NASA Astrophysics Data System (ADS)

    Haselwimmer, C. E.; Wilson, R.; Upton, C.; Prakash, A.; Holdmann, G.; Walker, G.

    2013-12-01

    Thermal remote sensing provides a valuable tool for mapping and monitoring surface hydrothermal features associated with geothermal activity. The increasing availability of low-cost, small Unmanned Aerial Systems (sUAS) with integrated thermal imaging sensors offers a means to undertake very high spatial resolution (hyperspatial), quantitative thermal remote sensing of surface geothermal features in support of exploration and long-term monitoring efforts. Results are presented from the deployment of a quadcopter sUAS equipped with a thermal camera over Pilgrim Hot Springs, Alaska for detailed mapping and heat flux estimation for hot springs, seeps, and thermal pools. Hyperspatial thermal infrared imagery (4 cm pixels) was acquired over Pilgrim Hot Springs in July 2013 using a FLIR TAU 640 camera operating from an Aeryon Scout sUAS flying at an altitude of 40 m. The registered and mosaicked thermal imagery is calibrated to surface temperature values using in-situ measurements of uniform blackbody tarps and the temperatures of geothermal and other surface pools acquired with a series of water temperature loggers. Interpretation of the pre-processed thermal imagery enables the delineation of hot springs, the extents of thermal pools, and the flow and mixing of individual geothermal outflow plumes with an unprecedented level of detail. Using the surface temperatures of thermal waters derived from the FLIR data and in-situ meteorological measurements, the hot spring heat flux and outflow rate are calculated using a heat budget model for a subset of the thermal drainage. The heat flux and outflow rate estimates derived from the FLIR data are compared against in-situ measurements of the hot spring outflow rate recorded at the time of the thermal survey.
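
    A minimal sketch of the kind of empirical temperature calibration described (relating FLIR-reported temperatures at reference targets such as blackbody tarps and logged pools to in-situ values with a linear fit) follows; all numbers are illustrative, not values from the survey.

```python
# Empirical linear calibration of FLIR temperatures to in-situ reference temperatures.
import numpy as np

flir_at_targets = np.array([12.1, 18.4, 31.0, 46.2, 58.9])   # deg C, from imagery (toy)
insitu_targets = np.array([13.0, 19.5, 32.4, 48.0, 61.2])    # deg C, loggers/tarps (toy)

gain, offset = np.polyfit(flir_at_targets, insitu_targets, 1)

def calibrate(mosaic_temps_c):
    """Apply the empirical gain/offset to a FLIR temperature mosaic (deg C)."""
    return gain * np.asarray(mosaic_temps_c) + offset

print(f"T_surface ~= {gain:.3f} * T_FLIR + {offset:.2f}")
```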

  17. The French proposal for a high spatial resolution Hyperspectral mission

    NASA Astrophysics Data System (ADS)

    Carrère, Véronique; Briottet, Xavier; Jacquemoud, Stéphane; Marion, Rodolphe; Bourguignon, Anne; Chami, Malik; Chanussot, Jocelyn; Chevrel, Stéphane; Deliot, Philippe; Dumont, Marie; Foucher, Pierre-Yves; Gomez, Cécile; Roman-Minghelli, Audrey; Sheeren, David; Weber, Christiane; Lefèvre, Marie-José; Mandea, Mioara

    2014-05-01

    More than 25 years of airborne imaging spectroscopy and spaceborne sensors such as Hyperion or HICO have clearly demonstrated the ability of this remote sensing technique to produce value-added information regarding surface composition and physical properties for a large variety of applications. Scheduled missions such as EnMAP and PRISMA prove the increased interest of the scientific community in this type of remote sensing data. In France, a group of Science and Defence users of imaging spectrometry data (Groupe de Synthèse Hyperspectral, GSH) established an up-to-date review of possible applications, defined the instrument specifications required for accurate, quantitative retrieval of diagnostic parameters, and identified fields of application where imaging spectrometry makes a major contribution. Based on these conclusions, CNES (the French Space Agency) initiated a phase 0 study for a hyperspectral mission concept, named at that time HYPXIM (HYPerspectral-X IMagery), whose main fields of application are vegetation biodiversity, coastal and inland waters, geosciences, urban environment, atmospheric sciences, cryosphere and Defence. Results pointed out applications where high spatial resolution is necessary and would not be covered by the other foreseen hyperspectral missions. Phase A started at the beginning of 2013 based on the following HYPXIM characteristics: a hyperspectral camera covering the 0.4-2.5 µm spectral range with an 8 m ground sampling distance (GSD) and a PAN camera with a 1.85 m GSD, onboard a mini-satellite platform. This phase A is currently on hold due to budget constraints. Nevertheless, the Science team is currently focusing on the preparation for the next CNES prospective meeting (March 2014), an important step for the future of the mission. This paper provides an update on the status of this mission and on new results obtained by the Science team.

  18. Kite Aerial Photography as a Tool for Remote Sensing

    ERIC Educational Resources Information Center

    Sallee, Jeff; Meier, Lesley R.

    2010-01-01

    As humans, we perform remote sensing nearly all the time. This is because we acquire most of our information about our surroundings through the senses of sight and hearing. Whether viewed by the unenhanced eye or a military satellite, remote sensing is observing objects from a distance. With our current technology, remote sensing has become a part…

  19. Remote sensing for detecting and mapping whitefly (Bemisia tabaci) infestations

    USDA-ARS?s Scientific Manuscript database

    Remote sensing technology has long been used for detecting insect infestations on agricultural crops. With recent advances in remote sensing sensors and other spatial information technologies such as Global Position Systems (GPS) and Geographic Information Systems (GIS), remote sensing is finding mo...

  20. Reflections on Earth--Remote-Sensing Research from Your Classroom.

    ERIC Educational Resources Information Center

    Campbell, Bruce A.

    2001-01-01

    Points out the uses of remote sensing in different areas, and introduces the program "Reflections on Earth" which provides access to basic and instructional information on remote sensing to students and teachers. Introduces students to concepts related to remote sensing and measuring distances. (YDS)

  1. Remote-Sensing Practice and Potential

    DTIC Science & Technology

    1974-05-01

    Six essential processes that must be accomplished if use of a remote-sensing system is to result in useful information are defined as problem...to be useful in remote-sensing projects are described. An overview of the current state-of-the-art of remote sensing is presented.

  2. History and future of remote sensing technology and education

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1980-01-01

    A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training are described.

  3. Ten ways remote sensing can contribute to conservation

    USGS Publications Warehouse

    Rose, Robert A.; Byler, Dirck; Eastman, J. Ron; Fleishman, Erica; Geller, Gary; Goetz, Scott; Guild, Liane; Hamilton, Healy; Hansen, Matt; Headley, Rachel; Hewson, Jennifer; Horning, Ned; Kaplin, Beth A.; Laporte, Nadine; Leidner, Allison K.; Leimgruber, Peter; Morisette, Jeffrey T.; Musinsky, John; Pintea, Lilian; Prados, Ana; Radeloff, Volker C.; Rowen, Mary; Saatchi, Sassan; Schill, Steve; Tabor, Karyn; Turner, Woody; Vodacek, Anthony; Vogelmann, James; Wegmann, Martin; Wilkie, David; Wilson, Cara

    2014-01-01

    In an effort to increase conservation effectiveness through the use of Earth observation technologies, a group of remote sensing scientists affiliated with government and academic institutions and conservation organizations identified 10 questions in conservation for which the potential to be answered would be greatly increased by use of remotely sensed data and analyses of those data. Our goals were to increase conservation practitioners’ use of remote sensing to support their work, increase collaboration between the conservation science and remote sensing communities, identify and develop new and innovative uses of remote sensing for advancing conservation science, provide guidance to space agencies on how future satellite missions can support conservation science, and generate support from the public and private sector in the use of remote sensing data to address the 10 conservation questions. We identified a broad initial list of questions on the basis of an email chain-referral survey. We then used a workshop-based iterative and collaborative approach to whittle the list down to these final questions (which represent 10 major themes in conservation): How can global Earth observation data be used to model species distributions and abundances? How can remote sensing improve the understanding of animal movements? How can remotely sensed ecosystem variables be used to understand, monitor, and predict ecosystem response and resilience to multiple stressors? How can remote sensing be used to monitor the effects of climate on ecosystems? How can near real-time ecosystem monitoring catalyze threat reduction, governance and regulation compliance, and resource management decisions? How can remote sensing inform configuration of protected area networks at spatial extents relevant to populations of target species and ecosystem services? How can remote sensing-derived products be used to value and monitor changes in ecosystem services? How can remote sensing be used to monitor and evaluate the effectiveness of conservation efforts? How does the expansion and intensification of agriculture and aquaculture alter ecosystems and the services they provide? How can remote sensing be used to determine the degree to which ecosystems are being disturbed or degraded and the effects of these changes on species and ecosystem functions?

  4. Ten ways remote sensing can contribute to conservation.

    PubMed

    Rose, Robert A; Byler, Dirck; Eastman, J Ron; Fleishman, Erica; Geller, Gary; Goetz, Scott; Guild, Liane; Hamilton, Healy; Hansen, Matt; Headley, Rachel; Hewson, Jennifer; Horning, Ned; Kaplin, Beth A; Laporte, Nadine; Leidner, Allison; Leimgruber, Peter; Morisette, Jeffrey; Musinsky, John; Pintea, Lilian; Prados, Ana; Radeloff, Volker C; Rowen, Mary; Saatchi, Sassan; Schill, Steve; Tabor, Karyn; Turner, Woody; Vodacek, Anthony; Vogelmann, James; Wegmann, Martin; Wilkie, David; Wilson, Cara

    2015-04-01

    In an effort to increase conservation effectiveness through the use of Earth observation technologies, a group of remote sensing scientists affiliated with government and academic institutions and conservation organizations identified 10 questions in conservation for which the potential to be answered would be greatly increased by use of remotely sensed data and analyses of those data. Our goals were to increase conservation practitioners' use of remote sensing to support their work, increase collaboration between the conservation science and remote sensing communities, identify and develop new and innovative uses of remote sensing for advancing conservation science, provide guidance to space agencies on how future satellite missions can support conservation science, and generate support from the public and private sector in the use of remote sensing data to address the 10 conservation questions. We identified a broad initial list of questions on the basis of an email chain-referral survey. We then used a workshop-based iterative and collaborative approach to whittle the list down to these final questions (which represent 10 major themes in conservation): How can global Earth observation data be used to model species distributions and abundances? How can remote sensing improve the understanding of animal movements? How can remotely sensed ecosystem variables be used to understand, monitor, and predict ecosystem response and resilience to multiple stressors? How can remote sensing be used to monitor the effects of climate on ecosystems? How can near real-time ecosystem monitoring catalyze threat reduction, governance and regulation compliance, and resource management decisions? How can remote sensing inform configuration of protected area networks at spatial extents relevant to populations of target species and ecosystem services? How can remote sensing-derived products be used to value and monitor changes in ecosystem services? How can remote sensing be used to monitor and evaluate the effectiveness of conservation efforts? How does the expansion and intensification of agriculture and aquaculture alter ecosystems and the services they provide? How can remote sensing be used to determine the degree to which ecosystems are being disturbed or degraded and the effects of these changes on species and ecosystem functions? © 2014 Society for Conservation Biology.

  5. Comparative Performance of Ground vs. Aerially Assessed RGB and Multispectral Indices for Early-Growth Evaluation of Maize Performance under Phosphorus Fertilization

    PubMed Central

    Gracia-Romero, Adrian; Kefauver, Shawn C.; Vergara-Díaz, Omar; Zaman-Allah, Mainassara A.; Prasanna, Boddupalli M.; Cairns, Jill E.; Araus, José L.

    2017-01-01

    Low soil fertility is one of the factors most limiting agricultural production, with phosphorus deficiency being among the main factors, particularly in developing countries. To deal with such environmental constraints, remote sensing measurements can be used to rapidly assess crop performance and to phenotype a large number of plots in a rapid and cost-effective way. We evaluated the performance of a set of remote sensing indices derived from Red-Green-Blue (RGB) images and multispectral (visible and infrared) data as phenotypic traits and crop monitoring tools for early assessment of maize performance under phosphorus fertilization. Thus, a set of 26 maize hybrids grown under field conditions in Zimbabwe was assayed under contrasting phosphorus fertilization conditions. Remote sensing measurements were conducted on seedlings at two different levels: at the ground and from an aerial platform. Within a particular phosphorus level, some of the RGB indices strongly correlated with grain yield. In general, RGB indices assessed at both ground and aerial levels correlated in a comparable way with grain yield, except for the indices a* and u*, which correlated better when assessed at the aerial level than at ground level, and Greener Area (GGA), which showed the opposite pattern. The Normalized Difference Vegetation Index (NDVI) evaluated at ground level with an active sensor also correlated better with grain yield than the NDVI derived from the multispectral camera mounted on the aerial platform. Other multispectral indices like the Soil Adjusted Vegetation Index (SAVI) performed very similarly to NDVI assessed at the aerial level but, overall, they correlated more weakly with grain yield than the best RGB indices. This study clearly illustrates the advantage of RGB-derived indices over the more costly and time-consuming multispectral indices. Moreover, the indices best correlated with grain yield were in general those best correlated with leaf phosphorus content; however, these correlations were clearly weaker than those against grain yield and held only under low phosphorus conditions. This work reinforces the effectiveness of canopy remote sensing for plant phenotyping and crop management of maize under different phosphorus nutrient conditions and suggests that the RGB indices are the best option. PMID:29230230
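
    For illustration, the sketch below computes two of the index families compared in the study: the CIELAB a* channel derived from an RGB image (more negative a* indicating a greener canopy) and NDVI from red/NIR bands of a multispectral stack. The file names and band ordering are assumptions, not details from the paper.

```python
# RGB-derived a* and multispectral NDVI for a single plot (illustrative sketch).
import numpy as np
import imageio.v3 as iio
from skimage.color import rgb2lab

rgb = iio.imread("plot_rgb.png")[..., :3] / 255.0
a_star = rgb2lab(rgb)[..., 1]                   # CIELAB a* channel per pixel
print(f"mean a*: {a_star.mean():.2f}")

ms = np.load("plot_multispectral.npy").astype("float64")  # assumed shape (bands, H, W)
red, nir = ms[2], ms[3]                          # assumed band positions
ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
print(f"mean NDVI: {ndvi.mean():.3f}")
```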

  6. A remotely piloted aircraft system in major incident management: concept and pilot, feasibility study.

    PubMed

    Abrahamsen, Håkon B

    2015-06-10

    Major incidents are complex, dynamic and bewildering task environments characterised by simultaneous, rapidly changing events, uncertainty and ill-structured problems. Efficient management, communication, decision-making and allocation of scarce medical resources at the chaotic scene of a major incident is challenging and often relies on sparse information and data. Communication and information sharing is primarily voice-to-voice through phone or radio on specified radio frequencies. Visual cues are abundant and difficult to communicate between teams and team members that are not co-located. The aim was to assess the concept and feasibility of using a remotely piloted aircraft (RPA) system to support remote sensing in simulated major incident exercises. We carried out an experimental, pilot feasibility study. A custom-made, remotely controlled, multirotor unmanned aerial vehicle with vertical take-off and landing was equipped with digital colour- and thermal imaging cameras, a laser beam, a mechanical gripper arm and an avalanche transceiver. We collected data in five simulated exercises: 1) mass casualty traffic accident, 2) mountain rescue, 3) avalanche with buried victims, 4) fisherman through thin ice and 5) search for casualties in the dark. The unmanned aerial vehicle was remotely controlled, with high precision, in close proximity to air space obstacles at very low levels without compromising work on the ground. Payload capacity and tolerance to wind and turbulence were limited. Aerial video, shot from different altitudes, and remote aerial avalanche beacon search were streamed wirelessly in real time to a monitor at a ground base. Electromagnetic interference disturbed signal reception in the ground monitor. A small remotely piloted aircraft can be used as an effective tool carrier, although limited by its payload capacity, wind speed and flight endurance. Remote sensing using already existing remotely piloted aircraft technology in pre-hospital environments is feasible and can be used to support situation assessment and information exchange at a major incident scene. Regulations are needed to ensure the safe use of unmanned aerial vehicles in major incidents. Ethical issues are abundant.

  7. Role of remote sensing in documenting living resources

    NASA Technical Reports Server (NTRS)

    Wagner, P. E.; Anderson, R. R.; Brun, B.; Eisenberg, M.; Genys, J. B.; Lear, D. W., Jr.; Miller, M. H.

    1978-01-01

    Specific cases of known or potentially useful applications of remote sensing in assessing biological resources are discussed. It is concluded that the more usable remote sensing techniques relate to the measurement of population fluctuations in aquatic systems. Sensing of the flora and the fauna of the Bay is considered with emphasis on direct sensing of aquatic plant populations and of water quality. Recommendations for remote sensing projects are given.

  8. Commercial future: making remote sensing a media event

    NASA Astrophysics Data System (ADS)

    Lurie, Ian

    1999-12-01

    The rapid growth of commercial remote sensing has made high quality digital sensing data widely available -- now, remote sensing must become and remain a strong, commercially viable industry. However, this new industry cannot survive without an educated consumer base. To access markets, remote sensing providers must make their product more accessible, both literally and figuratively: Potential customers must be able to find the data they require, when they require it, and they must understand the utility of the information available to them. The Internet and the World Wide Web offer the perfect medium to educate potential customers and to sell remote sensing data to those customers. A well-designed web presence can provide both an information center and a market place for companies offering their data for sale. A very high potential web-based market for remote sensing lies in media. News agencies, web sites, and a host of other visual media services can use remote sensing data to provide current, relevant information regarding news around the world. This paper will provide a model for promotion and sale of remote sensing data via the Internet.

  9. Pervasive sensing

    NASA Astrophysics Data System (ADS)

    Nagel, David J.

    2000-11-01

    The coordinated exploitation of modern communication, micro-sensor and computer technologies makes it possible to give global reach to our senses. Web-cameras for vision, web-microphones for hearing and web-'noses' for smelling, plus the abilities to sense many factors we cannot ordinarily perceive, are either available or will be soon. Applications include (1) determination of weather and environmental conditions on dense grids or over large areas, (2) monitoring of energy usage in buildings, (3) sensing the condition of hardware in electrical power distribution and information systems, (4) improving process control and other manufacturing, (5) development of intelligent terrestrial, marine, aeronautical and space transportation systems, (6) managing the continuum of routine security monitoring, diverse crises and military actions, and (7) medicine, notably the monitoring of the physiology and living conditions of individuals. Some of the emerging capabilities, such as the ability to measure remotely the conditions inside of people in real time, raise interesting social concerns centered on privacy issues. Methods for sensor data fusion and designs for human-computer interfaces are both crucial for the full realization of the potential of pervasive sensing. Computer-generated virtual reality, augmented with real-time sensor data, should be an effective means for presenting information from distributed sensors.

  10. 77 FR 39220 - Advisory Committee on Commercial Remote Sensing (ACCRES); Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-02

    ... Commercial Remote Sensing (ACCRES); Charter Renewal AGENCY: National Oceanic and Atmospheric Administration... Committee on Commercial Remote Sensing (ACCRES) was renewed on March 14, 2012. SUPPLEMENTARY INFORMATION: In... Commercial Remote Sensing (ACCRES) is in the public interest in connection with the performance of duties...

  11. 76 FR 66042 - Advisory Committee on Commercial Remote Sensing (ACCRES); Request for Nominations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ... Commercial Remote Sensing (ACCRES); Request for Nominations ACTION: Notice requesting nominations for the Advisory Committee on Commercial Remote Sensing (ACCRES). SUMMARY: The Advisory Committee on Commercial Remote Sensing (ACCRES) was established to advise the Secretary of Commerce, through the Under Secretary...

  12. An introduction to quantitative remote sensing. [data processing

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J.

    1974-01-01

    The quantitative approach to remote sensing is discussed along with the analysis of remote sensing data. Emphasis is placed on the application of pattern recognition in numerically oriented remote sensing systems. A common background and orientation for users of the LARS computer software system is provided.

  13. VERDEX: A virtual environment demonstrator for remote driving applications

    NASA Technical Reports Server (NTRS)

    Stone, Robert J.

    1991-01-01

    One of the key areas of the National Advanced Robotics Centre's enabling technologies research program is that of the human system interface, phase 1 of which started in July 1989 and is currently addressing the potential of virtual environments to permit intuitive and natural interactions between a human operator and a remote robotic vehicle. The aim of the first 12 months of this program (to September, 1990) is to develop a virtual human-interface demonstrator for use later as a test bed for human factors experimentation. This presentation will describe the current state of development of the test bed, and will outline some human factors issues and problems for more general discussion. In brief, the virtual telepresence system for remote driving has been designed to take the following form. The human operator will be provided with a helmet-mounted stereo display assembly, facilities for speech recognition and synthesis (using the Marconi Macrospeak system), and a VPL DataGlove Model 2 unit. The vehicle to be used for the purposes of remote driving is a Cybermotion Navmaster K2A system, which will be equipped with a stereo camera and microphone pair, mounted on a motorized high-speed pan-and-tilt head incorporating a closed-loop laser ranging sensor for camera convergence control (currently under contractual development). It will be possible to relay information to and from the vehicle and sensory system via an umbilical or RF link. The aim is to develop an interactive audio-visual display system capable of presenting combined stereo TV pictures and virtual graphics windows, the latter featuring control representations appropriate for vehicle driving and interaction using a graphical 'hand,' slaved to the flex and tracking sensors of the DataGlove and an additional helmet-mounted Polhemus IsoTrack sensor. Developments planned for the virtual environment test bed include transfer of operator control between remote driving and remote manipulation, dexterous end effector integration, virtual force and tactile sensing (also the focus of a current ARRL contract, initially employing a 14-pneumatic bladder glove attachment), and sensor-driven world modeling for total virtual environment generation and operator-assistance in remote scene interrogation.

  14. Remote Sensing of Ecology, Biodiversity and Conservation: A Review from the Perspective of Remote Sensing Specialists

    PubMed Central

    Wang, Kai; Franklin, Steven E.; Guo, Xulin; Cattet, Marc

    2010-01-01

    Remote sensing, the science of obtaining information via noncontact recording, has swept the fields of ecology, biodiversity and conservation (EBC). Several quality review papers have contributed to this field. However, these papers often discuss the issues from the standpoint of an ecologist or a biodiversity specialist. This review focuses on the spaceborne remote sensing of EBC from the perspective of remote sensing specialists, i.e., it is organized in the context of state-of-the-art remote sensing technology, including instruments and techniques. Herein, the instruments to be discussed consist of high spatial resolution, hyperspectral, thermal infrared, small-satellite constellation, and LIDAR sensors; and the techniques refer to image classification, vegetation index (VI), inversion algorithm, data fusion, and the integration of remote sensing (RS) and geographic information system (GIS). PMID:22163432

  15. Remote sensing of ecology, biodiversity and conservation: a review from the perspective of remote sensing specialists.

    PubMed

    Wang, Kai; Franklin, Steven E; Guo, Xulin; Cattet, Marc

    2010-01-01

    Remote sensing, the science of obtaining information via noncontact recording, has swept the fields of ecology, biodiversity and conservation (EBC). Several quality review papers have contributed to this field. However, these papers often discuss the issues from the standpoint of an ecologist or a biodiversity specialist. This review focuses on the spaceborne remote sensing of EBC from the perspective of remote sensing specialists, i.e., it is organized in the context of state-of-the-art remote sensing technology, including instruments and techniques. Herein, the instruments to be discussed consist of high spatial resolution, hyperspectral, thermal infrared, small-satellite constellation, and LIDAR sensors; and the techniques refer to image classification, vegetation index (VI), inversion algorithm, data fusion, and the integration of remote sensing (RS) and geographic information system (GIS).

  16. Remote Sensing and Reflectance Profiling in Entomology.

    PubMed

    Nansen, Christian; Elliott, Norman

    2016-01-01

    Remote sensing describes the characterization of the status of objects and/or the classification of their identity based on a combination of spectral features extracted from reflectance or transmission profiles of radiometric energy. Remote sensing can be benchtop based, and therefore acquired at a high spatial resolution, or airborne at lower spatial resolution to cover large areas. Despite important challenges, airborne remote sensing technologies will undoubtedly be of major importance in optimized management of agricultural systems in the twenty-first century. Benchtop remote sensing applications are becoming important in insect systematics and in phenomics studies of insect behavior and physiology. This review highlights how remote sensing influences entomological research by enabling scientists to nondestructively monitor how individual insects respond to treatments and ambient conditions. Furthermore, novel remote sensing technologies are creating intriguing interdisciplinary bridges between entomology and disciplines such as informatics and electrical engineering.

  17. Remote Sensing in Geography in the New Millennium: Prospects, Challenges, and Opportunities

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Jensen, John R.; Morain, Stanley A.; Walsh, Stephen J.; Ridd, Merrill K.

    1999-01-01

    Remote sensing science contributes greatly to our understanding of the Earth's ecosystems and cultural landscapes. Almost all the natural and social sciences, including geography, rely heavily on remote sensing to provide quantitative and indispensable spatial information. Many geographers have made significant contributions to remote sensing science since the 1970s, including the specification of advanced remote sensing systems, improvements in analog and digital image analysis, biophysical modeling, and terrain analysis. In fact, the Remote Sensing Specialty Group (RSSG) is one of the largest specialty groups within the AAG, with over 500 members. Remote sensing, in concert with geographic information systems, offers much value to geography as both an incisive spatial-analytical tool and as a scholarly pursuit that adds to the body of geographic knowledge as a whole. The "power" of remote sensing as a research endeavor in geography lies in its capabilities for obtaining synoptic, near-real-time data at many spatial and temporal scales, and in many regions of the electromagnetic spectrum - from microwave, to RADAR, to visible, and reflective and thermal infrared. In turn, these data present a vast compendium of information for assessing Earth attributes and characteristics that are at the very core of geography. Here we revisit how remote sensing has become a fundamental and important tool for geographical research, and how, with the advent of new and improved sensing systems to be launched in the near future, remote sensing will further advance geographical analysis in the approaching New Millennium.

  18. General-Purpose Serial Interface For Remote Control

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M.; Gupton, Lawrence E.

    1990-01-01

    Computer controls remote television camera. General-purpose controller developed to serve as interface between host computer and pan/tilt/zoom/focus functions on series of automated video cameras. Interface port based on 8251 programmable communications-interface circuit configured for tristated outputs, and connects controller system to any host computer with RS-232 input/output (I/O) port. Accepts byte-coded data from host, compares them with prestored codes in read-only memory (ROM), and closes or opens appropriate switches. Six output ports control opening and closing of as many as 48 switches. Operator controls remote television camera by speaking commands, in system including general-purpose controller.
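
    As an illustration of the byte-coded command scheme described, the sketch below shows a host-side routine that frames and sends pan/tilt/zoom commands over an RS-232 serial link using pyserial. The command byte values, framing, port name and baud rate are hypothetical and not taken from the NASA design.

```python
# Host-side byte-coded camera command sender over RS-232 (illustrative sketch).
import serial  # pyserial

CMD = {"pan_left": 0x10, "pan_right": 0x11,
       "tilt_up": 0x20, "tilt_down": 0x21,
       "zoom_in": 0x30, "zoom_out": 0x31}        # hypothetical command codes

def send_command(port: serial.Serial, name: str, camera_id: int) -> None:
    """Send a single byte-coded command addressed to one camera."""
    frame = bytes([0xAA, camera_id & 0xFF, CMD[name]])   # hypothetical framing
    port.write(frame)

if __name__ == "__main__":
    with serial.Serial("/dev/ttyS0", 9600, timeout=1) as link:   # assumed port/baud
        send_command(link, "pan_left", camera_id=3)
```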

  19. Agricultural Production Monitoring in the Sahel Using Remote Sensing: Present Possibilities and Research Needs

    DTIC Science & Technology

    1993-01-01

    during the agricultural season. Satellite remote sensing can contribute significantly to such a system by collecting information on crops and on...well as techniques to derive biophysical variables from remotely-sensed data. Finally, the integration of these remote-sensing techniques with crop

  20. Difficulties of biomass estimation over natural grassland

    NASA Astrophysics Data System (ADS)

    Kertész, Péter; Gecse, Bernadett; Pintér, Krisztina; Fóti, Szilvia; Nagy, Zoltán

    2017-04-01

    Estimation of biomass amount in grasslands using remote sensing is a challenge due to the high diversity and different phenologies of the constituting plant species. The aim of this study was to estimate the biomass amount (dry weight per area) during the vegetation period of a diverse semi-natural grassland with remote sensing. A multispectral camera (Tetracam Mini-MCA 6) was used with 3 cm ground resolution. The pre-processing method includes noise reduction, correction for the vignetting effect and calculation of reflectance using an Incident Light Sensor (ILS). Calibration was made with an ASD spectrophotometer as reference. To estimate biomass, the Partial Least Squares Regression (PLSR) statistical method was used with 5 bands and NDVI as input variables. Above-ground biomass was cut in 15 quadrats (50×50 cm) as reference. The best prediction was attained in spring (r2=0.94, RMSE: 26.37 g m-2). The average biomass amount was 167 g m-2. The variability of the biomass is mainly determined by the relief, which causes the high- and low-biomass patches to be stable. The reliability of biomass estimation was negatively affected by the appearance of flowers and by senescent plant parts during the summer. To determine the effect of flower presence on the biomass estimation, 20 dominant species with visually dominant flowers in the area were selected and the cover of flowers (%) was estimated in permanent plots during measurement campaigns. If the cover of flowers was low (<25%), the biomass estimation was successful (r2>0.9), while at higher flower cover (>30%), the estimation failed (r2<0.2). This effect restricts the usage of the remote sensing method to the spring to early summer period in diverse grasslands.
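
    A minimal sketch of the PLSR modelling step (five band reflectances plus NDVI as predictors, quadrat dry biomass as response) is given below using scikit-learn; the arrays are synthetic stand-ins for the 15 reference quadrats, not the study's data.

```python
# PLSR biomass model with 5 bands + NDVI as predictors (illustrative sketch).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
other = rng.uniform(0.02, 0.25, size=(15, 3))   # three visible-band reflectances (toy)
red = rng.uniform(0.03, 0.10, 15)
nir = rng.uniform(0.30, 0.60, 15)
ndvi = (nir - red) / (nir + red)
X = np.column_stack([other, red, nir, ndvi])    # 5 bands + NDVI per quadrat
y = 900 * ndvi - 300 + rng.normal(0, 20, 15)    # synthetic dry biomass (g m-2)

pls = PLSRegression(n_components=3).fit(X, y)
pred = pls.predict(X).ravel()
print(f"r2 = {r2_score(y, pred):.2f}, "
      f"RMSE = {mean_squared_error(y, pred) ** 0.5:.1f} g m-2")
```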

  1. Spectral Imaging from Uavs Under Varying Illumination Conditions

    NASA Astrophysics Data System (ADS)

    Hakala, T.; Honkavaara, E.; Saari, H.; Mäkynen, J.; Kaivosoja, J.; Pesonen, L.; Pölönen, I.

    2013-08-01

    Rapidly developing unmanned aerial vehicles (UAV) have provided the remote sensing community with a new, rapidly deployable tool for small-area monitoring. The progress of small-payload UAVs has introduced greater demand for lightweight aerial payloads. For applications requiring aerial images, a simple consumer camera provides acceptable data. For applications requiring more detailed spectral information about the surface, a new Fabry-Perot interferometer based spectral imaging technology has been developed. This new technology produces tens of successive images of the scene at different wavelength bands in a very short time. These images can be assembled into spectral data cubes with stereoscopic overlaps. In the field the weather conditions vary, and the UAV operator often has to decide between a flight in suboptimal conditions and no flight. Our objective was to investigate methods for quantitative radiometric processing of images taken under varying illumination conditions, thus expanding the range of weather conditions during which successful imaging flights can be made. A new method was developed that is based on in-situ measurement of irradiance either on the UAV platform or on the ground. We tested the methods in a precision agriculture application using realistic data collected in difficult illumination conditions. Internal homogeneity of the original image data (average coefficient of variation in overlapping images) was 0.14-0.18. In the corrected data, the homogeneity was 0.10-0.12 with a correction based on broadband irradiance measured on the UAV, 0.07-0.09 with a correction based on spectral irradiance measurement on the ground, and 0.05-0.08 with a radiometric block adjustment based on the image data. Our results were very promising, indicating that quantitative UAV-based remote sensing could be operational in diverse conditions, which is a prerequisite for many environmental remote sensing applications.
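
    The broadband-irradiance correction idea can be sketched as a simple scaling of each frame by the ratio of a reference irradiance to the irradiance logged at exposure time; the toy example below illustrates that scaling under assumed values, not the authors' full radiometric model or block adjustment.

```python
# Broadband irradiance normalisation of UAV frames (illustrative sketch).
import numpy as np

def normalise_frame(dn: np.ndarray, e_measured: float, e_reference: float) -> np.ndarray:
    """Scale raw digital numbers to a common reference illumination level."""
    return dn.astype("float64") * (e_reference / e_measured)

frames = [np.full((4, 4), 900.0), np.full((4, 4), 600.0)]   # toy DN frames
irradiance = [950.0, 610.0]                                  # W m-2 at exposure time (toy)
e_ref = np.mean(irradiance)

corrected = [normalise_frame(f, e, e_ref) for f, e in zip(frames, irradiance)]
means = [c.mean() for c in corrected]
cv = np.std(means) / np.mean(means)
print(f"coefficient of variation after correction: {cv:.3f}")
```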

  2. Detailed Characterization of Nearshore Processes During NCEX

    NASA Astrophysics Data System (ADS)

    Holland, K.; Kaihatu, J. M.; Plant, N.

    2004-12-01

    Recent technology advances have allowed the coupling of remote sensing methods with advanced wave and circulation models to yield detailed characterizations of nearshore processes. This methodology was demonstrated as part of the Nearshore Canyon EXperiment (NCEX) in La Jolla, CA during Fall 2003. An array of high-resolution, color digital cameras was installed to monitor an alongshore distance of nearly 2 km out to depths of 25 m. This digital imagery was analyzed over the three-month period through an automated process to produce hourly estimates of wave period, wave direction, breaker height, shoreline position, sandbar location, and bathymetry at numerous locations during daylight hours. Interesting wave propagation patterns in the vicinity of the canyons were observed. In addition, directional wave spectra and swash / surf flow velocities were estimated using more computationally intensive methods. These measurements were used to provide forcing and boundary conditions for the Delft3D wave and circulation model, giving additional estimates of nearshore processes such as dissipation and rip currents. An optimal approach for coupling these remotely sensed observations to the numerical model was selected to yield accurate, but also timely, characterizations. This involved assimilation of directional spectral estimates near the offshore boundary to mimic forcing conditions achieved under traditional approaches involving nested domains. Measurements of breaker heights and flow speeds were also used to adaptively tune model parameters to provide enhanced accuracy. Comparisons of model predictions and video observations show significant correlation. As compared to nesting within larger-scale and coarser-resolution models, the advantages of providing boundary-condition data using remote sensing are much improved resolution and fidelity. For example, rip current development was both modeled and observed. These results indicate that this approach to data-model coupling is tenable and may be useful in the near-real-time characterizations required by many applied scenarios.

  3. Application of Near-Surface Remote Sensing and computer algorithms in evaluating impacts of agroecosystem management on Zea mays (corn) phenological development in the Platte River - High Plains Aquifer Long Term Agroecosystem Research Network field sites.

    NASA Astrophysics Data System (ADS)

    Okalebo, J. A.; Das Choudhury, S.; Awada, T.; Suyker, A.; LeBauer, D.; Newcomb, M.; Ward, R.

    2017-12-01

    The Long-term Agroecosystem Research (LTAR) network is a USDA-ARS effort that focuses on conducting research that addresses current and emerging issues in agriculture related to sustainability and profitability of agroecosystems in the face of climate change and population growth. There are 18 sites across the USA covering key agricultural production regions. In Nebraska, a partnership between the University of Nebraska - Lincoln and ARD/USDA resulted in the establishment of the Platte River - High Plains Aquifer LTAR site in 2014. The site conducts research to sustain multiple ecosystem services, focusing specifically on Nebraska's main agronomic production agroecosystems, which comprise abundant corn, soybeans, managed grasslands and beef production. As part of the national LTAR network, PR-HPA participates in and contributes near-surface remotely sensed imagery of corn, soybean and grassland canopy phenology to the PhenoCam Network through high-resolution digital cameras. This poster highlights the application, advantages and usefulness of near-surface remotely sensed imagery in agroecosystem studies and management. It demonstrates how both infrared and Red-Green-Blue imagery may be applied to monitor phenological events as well as crop abiotic stresses. Computer-based algorithms and analytic techniques proved very instrumental in revealing crop phenological changes such as green-up and tasseling in corn. This poster also reports the suitability and applicability of corn-derived computer-based algorithms for evaluating the phenological development of sorghum, since both crops have similarities in their phenology, with sorghum panicles being similar to corn tassels. This latter assessment was carried out using a sorghum dataset obtained from the Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform project, Maricopa Agricultural Center, Arizona.
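
    One common way such algorithms extract a green-up date is to smooth a daily greenness (GCC) series and take the first crossing of a fraction of its seasonal amplitude; the sketch below illustrates that idea with a synthetic series and an assumed 50% amplitude threshold, not the LTAR processing chain itself.

```python
# Green-up date from a daily greenness series via amplitude-threshold crossing (sketch).
import numpy as np

doy = np.arange(90, 271)                                    # day of year
gcc = 0.33 + 0.10 / (1.0 + np.exp(-(doy - 160) / 8.0))      # synthetic seasonal curve
gcc += np.random.default_rng(1).normal(0, 0.004, doy.size)  # observation noise

smooth = np.convolve(gcc, np.ones(7) / 7, mode="same")      # 7-day moving average
threshold = smooth.min() + 0.5 * (smooth.max() - smooth.min())
greenup_doy = int(doy[np.argmax(smooth >= threshold)])      # first crossing
print(f"estimated green-up around DOY {greenup_doy}")
```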

  4. Innovative methodologies and technologies for thermal energy release measurement.

    NASA Astrophysics Data System (ADS)

    Marotta, Enrica; Peluso, Rosario; Avino, Rosario; Belviso, Pasquale; Caliro, Stefano; Carandente, Antonio; Chiodini, Giovanni; Mangiacapra, Annarita; Petrillo, Zaccaria; Sansivero, Fabio; Vilardo, Giuseppe; Marfe, Barbara

    2016-04-01

    Volcanoes exchange heat, gases and other fluids between the interior of the Earth and its atmosphere, influencing processes both at the surface and above it. This work is devoted to improving knowledge of the parameters that control the anomalies in heat flux and chemical species emissions associated with the diffuse degassing processes of volcanic and hydrothermal zones. We are studying and developing innovative medium-range remote sensing technologies to measure the variations through time of heat flux and chemical emissions, in order to sharpen the definition of the activity state of a volcano and to allow a better assessment of the related hazard and risk mitigation. The current methodologies used to measure heat flux (i.e. CO2 flux or temperature gradient) are of limited efficiency or effectiveness, and are unable to detect short- to medium-term (days to months) variation trends in the heat flux. Remote sensing of these parameters will allow measurements faster than the established methods; it will therefore be both more effective and more efficient in an emergency, and it can also be used for quick routine monitoring. We are currently developing a method based on drone-borne IR cameras to measure the ground surface temperature which, in a purely conductive regime, is directly correlated to the shallow temperature gradient. The use of flying drones will allow a quick mapping of areas with thermal anomalies and a measurement of their temperature at distances on the order of hundreds of meters. Further development of remote sensing will be pursued through the use, on flying drones, of multispectral and/or hyperspectral sensors and UV scanners, in order to detect the amount of chemical species released into the atmosphere.
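
    The conductive-regime link between mapped surface temperature and heat flux invoked here is Fourier's law, q = -k dT/dz; a toy calculation under assumed soil conductivity, depth and temperature values is sketched below.

```python
# Conductive heat flux from a shallow temperature gradient (illustrative sketch).
k_soil = 1.0          # soil thermal conductivity (W m-1 K-1), assumed
t_surface = 42.0      # deg C at the surface, e.g. from a drone-borne IR map (toy)
t_depth = 30.0        # deg C at 0.5 m depth (toy)
dz = 0.5              # m

q = k_soil * (t_surface - t_depth) / dz   # upward conductive heat flux (W m-2)
print(f"conductive heat flux ~ {q:.1f} W m-2")
```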

  5. Method of determining forest production from remotely sensed forest parameters

    DOEpatents

    Corey, J.C.; Mackey, H.E. Jr.

    1987-08-31

    A method of determining forest production entirely from remotely sensed data in which remotely sensed multispectral scanner (MSS) data on forest composition is combined with remotely sensed radar imaging data on forest stand biophysical parameters to provide a measure of forest production. A high correlation has been found to exist between the remotely sensed radar imaging data and on-site measurements of biophysical parameters such as stand height, diameter at breast height, total tree height, mean area per tree, and timber stand volume.

  6. Removal of Surface-Reflected Light for the Measurement of Remote-Sensing Reflectance from an Above-Surface Platform

    DTIC Science & Technology

    2010-12-01

    remote-sensing reflectance) can be highly inaccurate if a spectrally constant value is applied (although errors can be reduced by carefully filtering measured raw data). To remove surface-reflected light in field measurements of remote sensing reflectance, a spectral optimization approach was applied, with results compared with those from remote sensing models and from direct measurements. The agreement from different determinations suggests that reasonable results for remote sensing reflectance of clear

  7. Removal of Surface-Reflected Light for the Measurement of Remote-Sensing Reflectance from an Above-Surface Platform

    DTIC Science & Technology

    2010-12-06

    remote-sensing reflectance) can be highly inaccurate if a spectrally constant value is applied (although errors can be reduced by carefully filtering measured raw data). To remove surface-reflected light in field measurements of remote sensing reflectance, a spectral optimization approach was applied, with results compared with those from remote sensing models and from direct measurements. The agreement from different determinations suggests that reasonable results for remote sensing reflectance of clear

  8. A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.

    2009-01-01

    The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black and white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount multiple vehicles, and act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black and white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black and white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.

  9. New Airborne Sensors and Platforms for Solving Specific Tasks in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Kemper, G.

    2012-07-01

    A huge number of small and medium-sized sensors have entered the market. Today's medium-format sensors reach 80 MPix and allow projects of medium size to be flown, comparable with the first large-format digital cameras about 6 years ago. New high-quality lenses and new developments in integration have prepared the market for photogrammetric work. Companies such as Phase One or Hasselblad and producers or integrators such as Trimble, Optec, and others have adapted these cameras for professional image production. In combination with small camera stabilizers they can also be used in small aircraft, making the equipment compact and easily transportable, e.g. for rapid-assessment purposes. The combination of different camera sensors enables multi- or hyperspectral installations, useful for example in agricultural or environmental projects. Arrays of oblique-viewing cameras are on the market as well; in many cases these are small and medium-format sensors combined as rotating or shifting devices, or simply as a fixed setup. Besides proper camera installation and integration, the software that controls the hardware and guides the pilot has to solve many more tasks than a normal FMS did in the past. Small and relatively cheap laser scanners (e.g. Riegl) are on the market, and their proper combination with MS cameras and integrated planning and navigation is a challenge that has been solved by different software packages. Turnkey solutions are available, e.g. for monitoring power-line corridors, where taking images is just part of the job. Integration of thermal camera systems with laser scanning and video capture must be combined with specific information on the objects, stored in a database and linked when approaching the navigation point.

  10. Field Data Collection: an Essential Element in Remote Sensing Applications

    NASA Technical Reports Server (NTRS)

    Pettinger, L. R.

    1971-01-01

    Field data collected in support of remote sensing projects are generally used for the following purposes: (1) calibration of remote sensing systems, (2) evaluation of experimental applications of remote sensing imagery on small test sites, and (3) designing and evaluating operational regional resource studies and inventories which are conducted using the remote sensing imagery obtained. Field data may be used to help develop a technique for a particular application, or to aid in the application of that technique to a resource evaluation or inventory problem for a large area. Scientists at the Forestry Remote Sensing Laboratory have utilized field data for both purposes. How meaningful field data has been collected in each case is discussed.

  11. Remote sensing and eLearning 2.0 for school education

    NASA Astrophysics Data System (ADS)

    Voss, Kerstin; Goetzke, Roland; Hodam, Henryk

    2010-10-01

    The "Remote Sensing in Schools" project aims at improving the integration of "Satellite remote sensing" into school teaching. Therefore, it is the project's overall objective to teach students in primary and secondary schools the basics and fields of application of remote sensing. Existing results show that many teachers are interested in remote sensing and at same time motivated to integrate it into their teaching. Despite the good intention, in the end, the implementation often fails due to the complexity and poor set-up of the information provided. Therefore, a comprehensive and well-structured learning platform on the topic of remote sensing is developed. The platform shall allow a structured introduction to the topic.

  12. Atmospheric aerosol and gas sensing using Scheimpflug lidar

    NASA Astrophysics Data System (ADS)

    Mei, Liang; Brydegaard, Mikkel

    2015-04-01

    This work presents a new lidar technique for atmospheric remote sensing based on the Scheimpflug principle, which describes the relationship between non-parallel image and object planes [1]. When a laser beam is transmitted into the atmosphere, the implication is that the backscattering echo of the entire illuminated probe volume can be in focus simultaneously without diminishing the aperture. The range-resolved backscattering echo can be retrieved by using a tilted line-scan or two-dimensional CCD/CMOS camera. Rather than employing nanosecond-pulsed lasers, cascade detectors, and MHz signal sampling, all of high cost and complexity, we have developed a robust and inexpensive atmospheric lidar system based on compact laser diodes and array detectors. We present initial applications of the Scheimpflug lidar for atmospheric aerosol monitoring in bright sunlight, with a 3 W, 808 nm CW laser diode. Kilohertz sampling rates are also achieved, with applications for wind speed measurement and entomology [2]. Further, a proof-of-principle demonstration of differential absorption lidar (DIAL) based on the Scheimpflug lidar technique is presented [3]. By utilizing a 30 mW narrow-band CW laser diode emitting at around 760 nm, the detailed shape of an oxygen absorption line can be resolved remotely with an integration time of 6 s and a measurement cycle of 1 minute during night time. The promising results demonstrated in this work show the potential of the Scheimpflug lidar technique for remote atmospheric aerosol and gas sensing, and renew hope for robust and realistic instrumentation for atmospheric lidar sensing. [1] F. Blais, "Review of 20 years of range sensor development," Journal of Electronic Imaging, vol. 13, pp. 231-243, Jan 2004. [2] M. Brydegaard, A. Gebru, and S. Svanberg, "Super resolution laser radar with blinking atmospheric particles - application to interacting flying insects," Progress In Electromagnetics Research, vol. 147, pp. 141-151, 2014. [3] L. Mei and M. Brydegaard, "Continuous-wave differential absorption lidar," submitted to Laser and Photonics Reviews, 2014.
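
    A sketch of the standard two-wavelength DIAL retrieval that such range-resolved on/off returns can feed is given below: the number density follows from the logarithm of the on/off signal ratio in adjacent range bins divided by twice the differential absorption cross-section and the bin width. The returns, range grid and cross-section used here are synthetic, not values from the instrument.

```python
# Two-wavelength DIAL number-density retrieval from synthetic range-resolved returns.
import numpy as np

r = np.linspace(100.0, 2000.0, 200)                 # range bins (m)
dr = r[1] - r[0]
d_sigma = 1.0e-27                                   # differential cross-section (m^2), illustrative
n_true = 5.0e24                                     # molecules m-3, synthetic constant profile

p_off = 1.0 / r**2                                  # off-line return: geometric 1/r^2 only (toy)
p_on = p_off * np.exp(-2.0 * d_sigma * n_true * r)  # on-line return with extra absorption

ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
n_retrieved = np.log(ratio) / (2.0 * d_sigma * dr)  # should recover ~n_true per bin
print(f"retrieved density ~ {n_retrieved.mean():.3e} molecules m-3")
```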

  13. Remote sensing programs and courses in engineering and water resources

    NASA Technical Reports Server (NTRS)

    Kiefer, R. W.

    1981-01-01

    The content of typical basic and advanced remote sensing and image interpretation courses is described, and typical remote sensing graduate programs of study in civil engineering and in interdisciplinary environmental remote sensing and water resources management are outlined. Ideally, graduate programs with an emphasis on remote sensing and image interpretation should be built around a core of five courses: (1) a basic course in the fundamentals of remote sensing upon which more specialized advanced remote sensing courses can build; (2) a course dealing with visual image interpretation; (3) a course dealing with quantitative (computer-based) image interpretation; (4) a basic photogrammetry course; and (5) a basic surveying course. These five courses comprise up to one-half of the course work required for the M.S. degree. The nature of other course work and thesis requirements varies greatly, depending on the department in which the degree is awarded.

  14. Remote sensing research in geographic education: An alternative view

    NASA Technical Reports Server (NTRS)

    Wilson, H.; Cary, T. K.; Goward, S. N.

    1981-01-01

    It is noted that within many geography departments remote sensing is viewed as a mere technique that a student should learn in order to carry out true geographic research. This view inhibits both students and faculty from investigating remotely sensed data as a new source of geographic knowledge that may alter our understanding of the Earth. The tendency is for geographers to accept these new data and analysis techniques from engineers and mathematicians without questioning the accompanying premises. This black-box approach hinders geographic applications of the new remotely sensed data and limits the geographer's contribution to further development of remote sensing observation systems. It is suggested that geographers contribute to the development of remote sensing through pursuit of basic research. This research can be encouraged, particularly among students, by demonstrating the links between geographic theory and remotely sensed observations and by encouraging a healthy skepticism concerning the current understanding of these data.

  15. Research on assessment and improvement method of remote sensing image reconstruction

    NASA Astrophysics Data System (ADS)

    Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping

    2018-01-01

    Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in a remote sensing imaging system can compress images while sampling, which improves efficiency. In this paper, a method based on two-dimensional principal component analysis (2DPCA) is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; it retains the useful information in the image while suppressing noise. The factors that influence remote sensing image quality are then analyzed, and evaluation parameters for quantitative assessment are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results agree with human visual perception and that the proposed method has good application value in the field of remote sensing image processing.
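
    As a hedged illustration of the reconstruction step, the sketch below implements a generic 2DPCA fit-and-reconstruct with NumPy; the tile size, the number of components, and any coupling to the compressive-sampling front end are assumptions, not details taken from the paper.

    ```python
    import numpy as np

    def fit_2dpca(images, n_components=10):
        """Fit 2DPCA to a stack of 2-D images (shape: M x rows x cols)."""
        images = np.asarray(images, dtype=float)
        mean_img = images.mean(axis=0)
        centered = images - mean_img
        # Image scatter matrix G (cols x cols), averaged over the stack
        G = np.einsum("mij,mik->jk", centered, centered) / len(images)
        eigvals, eigvecs = np.linalg.eigh(G)            # eigenvalues in ascending order
        X = eigvecs[:, ::-1][:, :n_components]          # top eigenvectors as projection axes
        return mean_img, X

    def reconstruct_2dpca(image, mean_img, X):
        """Project one image onto the 2DPCA axes and reconstruct it (noise suppressed)."""
        Y = (np.asarray(image, dtype=float) - mean_img) @ X   # feature matrix (rows x d)
        return Y @ X.T + mean_img

    # Example with random tiles standing in for decompressed remote sensing imagery
    tiles = np.random.rand(50, 64, 64)
    mean_img, X = fit_2dpca(tiles, n_components=10)
    print(np.abs(reconstruct_2dpca(tiles[0], mean_img, X) - tiles[0]).mean())
    ```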

  16. Unmanned aerial vehicle: A unique platform for low-altitude remote sensing for crop management

    USDA-ARS?s Scientific Manuscript database

    Unmanned aerial vehicles (UAV) provide a unique platform for remote sensing to monitor crop fields that complements remote sensing from satellite, aircraft and ground-based platforms. The UAV-based remote sensing is versatile at ultra-low altitude to be able to provide an ultra-high-resolution imag...

  17. Application of the remote-sensing communication model to a time-sensitive wildfire remote-sensing system

    Treesearch

    Christopher D. Lippitt; Douglas A. Stow; Philip J. Riggan

    2016-01-01

    Remote sensing for hazard response requires a priori identification of sensor, transmission, processing, and distribution methods to permit the extraction of relevant information in timescales sufficient to allow managers to make a given time-sensitive decision. This study applies and demonstrates the utility of the Remote Sensing Communication...

  18. 75 FR 32360 - Proposed Information Collection; Comment Request; Licensing of Private Remote-Sensing Space Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-08

    ... Collection; Comment Request; Licensing of Private Remote-Sensing Space Systems AGENCY: National Oceanic and.... Abstract NOAA has established requirements for the licensing of private operators of remote-sensing space... Land Remote- Sensing Policy Act of 1992 and with the national security and international obligations of...

  19. 78 FR 44536 - Proposed Information Collection; Comment Request; Licensing of Private Remote-Sensing Space Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... Collection; Comment Request; Licensing of Private Remote-Sensing Space Systems AGENCY: National Oceanic and... for the licensing of private operators of remote-sensing space systems. The information in applications and subsequent reports is needed to ensure compliance with the Land Remote- Sensing Policy Act of...

  20. Advancement of China’s Visible Light Remote Sensing Technology In Aerospace,

    DTIC Science & Technology

    1996-03-19

    Aerospace visible light film systems were among the earliest space remote sensing systems to be developed in China. They have been applied very well...makes China the third nation in the world to master space remote sensing technology, it also puts recoverable remote sensing satellites among the first

  1. Polarimetric passive remote sensing of periodic surfaces

    NASA Technical Reports Server (NTRS)

    Veysoglu, Murat E.; Yueh, H. A.; Shin, R. T.; Kong, J. A.

    1991-01-01

    The concept of polarimetry in active remote sensing is extended to passive remote sensing. The potential use of the third and fourth Stokes parameters U and V, which play an important role in polarimetric active remote sensing, is demonstrated for passive remote sensing. It is shown that, by use of the reciprocity principle, the polarimetric parameters of passive remote sensing can be obtained through the solution of the associated direct scattering problem. These ideas are applied to study polarimetric passive remote sensing of periodic surfaces. The solution of the direct scattering problem is obtained by an integral equation formulation which involves evaluation of periodic Green's functions and their normal derivatives on the surface. Rapid evaluation of the slowly convergent series associated with these functions is critical for the feasibility of the method, and new, rapidly convergent formulas are derived for their calculation. The study has shown that the brightness temperature of the Stokes parameter U can be significant in passive remote sensing; values as high as 50 K are observed for certain configurations.
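
    For readers unfamiliar with the quantities involved, the sketch below computes the four (modified) Stokes parameters from complex v- and h-polarized field samples; in passive remote sensing these are further scaled to brightness temperatures, a step omitted here, and the ensemble-average formulation is a textbook definition rather than anything specific to this paper.

    ```python
    import numpy as np

    def stokes_parameters(E_v, E_h):
        """Modified Stokes parameters from complex field samples (ensemble averages).

        E_v, E_h: 1-D complex arrays of simultaneous vertically / horizontally
        polarized field samples.  Returns (I_v, I_h, U, V); U and V are the third
        and fourth Stokes parameters discussed in the abstract.  Scaling to
        brightness temperature (wavelength, bandwidth, impedance factors) is omitted.
        """
        E_v, E_h = np.asarray(E_v), np.asarray(E_h)
        I_v = np.mean(np.abs(E_v) ** 2)                   # <|E_v|^2>
        I_h = np.mean(np.abs(E_h) ** 2)                   # <|E_h|^2>
        U = 2.0 * np.mean(np.real(E_v * np.conj(E_h)))    # 2 Re<E_v E_h*>
        V = 2.0 * np.mean(np.imag(E_v * np.conj(E_h)))    # 2 Im<E_v E_h*>
        return I_v, I_h, U, V

    # Example: partially correlated fields produce a nonzero U
    rng = np.random.default_rng(0)
    e_v = rng.normal(size=1000) + 1j * rng.normal(size=1000)
    print(stokes_parameters(e_v, 0.5 * e_v + rng.normal(size=1000)))
    ```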

  2. Improved head-controlled TV system produces high-quality remote image

    NASA Technical Reports Server (NTRS)

    Goertz, R.; Lindberg, J.; Mingesz, D.; Potts, C.

    1967-01-01

    The manipulator operator uses an improved-resolution TV camera/monitor positioning system to view the remote handling and processing of reactive, flammable, explosive, or contaminated materials. The pan and tilt motions of the camera and monitor are slaved to follow the corresponding motions of the operator's head.

  3. From planets to crops and back: Remote sensing makes sense

    NASA Astrophysics Data System (ADS)

    Mustard, John F.

    2017-04-01

    Remotely sensed data and the instruments that acquire them are core parts of Earth and planetary observation systems. They are used to quantify the Earth's interconnected systems, and remote sensing is the only way to get a daily, or more frequent, snapshot of the status of the Earth; it really is the Earth's stethoscope. In a similar manner, remote sensing is the rock hammer of the planetary scientist and the only way comprehensive data sets can be acquired. At the risk of offending many, remotely sensed data acquired across the electromagnetic spectrum are the tricorder with which we explore known and unknown planets. Arriving where we are today in the use of remotely sensed data in the solar system has been a continually evolving synergy between Earth observation, planetary exploration, and fundamental laboratory work.

  4. Remote sensing of on-road vehicle emissions: Mechanism, applications and a case study from Hong Kong

    NASA Astrophysics Data System (ADS)

    Huang, Yuhan; Organ, Bruce; Zhou, John L.; Surawski, Nic C.; Hong, Guang; Chan, Edward F. C.; Yam, Yat Shing

    2018-06-01

    Vehicle emissions are a major contributor to air pollution in cities and have serious health impacts on their inhabitants. On-road remote sensing is an effective and economical tool to monitor and control vehicle emissions. In this review, the mechanism, accuracy, advantages and limitations of remote sensing are first introduced. The applications and major findings of remote sensing are then critically reviewed. It was revealed that the emission distribution of on-road vehicles is highly skewed, so that the dirtiest 10% of vehicles account for over half of total fleet emissions. Such findings highlight the importance and effectiveness of using remote sensing for in situ identification of high-emitting vehicles for further inspection and maintenance programs. However, the accuracy and the number of vehicles affected by screening programs depend greatly on the screening criteria. Remote sensing studies showed that the emissions of gasoline and diesel vehicles have been significantly reduced in recent years, with the exception of NOx emissions from diesel vehicles, in spite of greatly tightened automotive emission regulations. Thirdly, the experience and issues of using remote sensing for identifying high-emitting vehicles in Hong Kong, where remote sensing is a legislative instrument for enforcement purposes, are reported, followed by the first identification and discussion of the issue of frequent false detection of diesel high-emitters by remote sensing. Finally, the challenges and future research directions of on-road remote sensing are elaborated.

  5. Foliar Temperature Gradients as Drivers of Budburst in Douglas-fir: New Applications of Thermal Infrared Imagery

    NASA Astrophysics Data System (ADS)

    Miller, R.; Lintz, H. E.; Thomas, C. K.; Salino-Hugg, M. J.; Niemeier, J. J.; Kruger, A.

    2014-12-01

    Budburst, the initiation of annual growth in plants, is sensitive to climate and is used to monitor physiological responses to climate change. Accurately forecasting the budburst response to these changes demands an understanding of the drivers of budburst. Current research and predictive models focus on population or landscape-level drivers, yet fundamental questions regarding drivers of budburst diversity within an individual tree remain unanswered. We hypothesize that foliar temperature, an important physiological property, may be a dominant driver of differences in the timing of budburst within a single tree. Studying these differences facilitates development of high-throughput phenotyping technology used to improve predictive budburst models. We present spatial and temporal variation in foliar temperature as a function of physical drivers, culminating in a single-tree budburst model based on foliar temperature. We use a novel remote sensing approach, combined with on-site meteorological measurements, to demonstrate important intra-canopy differences between air and foliar temperature. We mounted a thermal infrared camera within an old-growth canopy at the H.J. Andrews LTER forest and imaged an 8 m by 10.6 m section of a Douglas-fir crown. Sampling one image per minute, we collected approximately 30,000 thermal infrared images over a one-month period to approximate foliar temperature before, during and after budburst. Using time-lapse photography in the visible spectrum, we documented budburst at fifteen-minute intervals with eight cameras stratified across the thermal infrared camera's field of view. Within the imaged tree's crown, we installed a pyranometer, a 2D sonic anemometer and a fan-aspirated thermohygrometer and collected 3,000 measurements of net shortwave radiation, wind speed, air temperature and relative humidity. We documented a difference of several days in the timing of budburst across both vertical and horizontal gradients, and we also observed clear spatial and temporal foliar temperature gradients. In addition to exploring physical drivers of budburst, this remote sensing approach provides insight into intra-canopy structural complexity and opportunities to advance our understanding of vegetation-atmosphere interactions.

  6. Remote sensing of natural resources: Quarterly literature review

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A quarterly review of technical literature concerning remote sensing techniques is presented. The format contains indexed and abstracted materials with emphasis on data gathering techniques performed or obtained remotely from space, aircraft, or ground-based stations. Remote sensor applications including the remote sensing of natural resources are presented.

  7. EPA Enviropod. A summary of the use of the Enviropod under a Memorandum of Understanding among EPA Region 8, the State of Utah, and the University of Utah Research Institute

    NASA Technical Reports Server (NTRS)

    Ridd, M. K.

    1984-01-01

    Twenty-three missions were flown using the EPA's panoramic camera to obtain color and color-infrared photographs of landslide and flood damage in Utah. From the state's point of view, there were many successes; the biggest single obstacle to smooth and continued performance was the unavailability of aircraft. The Memorandum of Understanding between the State of Utah, the Environmental Protection Agency, and the Center for Remote Sensing and Cartography is included, along with forms for planning Enviropod missions, for requesting flights, and for obtaining feedback from participating agencies.

  8. Event-Based Sensing and Control for Remote Robot Guidance: An Experimental Case

    PubMed Central

    Santos, Carlos; Martínez-Rey, Miguel; Santiso, Enrique

    2017-01-01

    This paper describes the theoretical and practical foundations for remote control of a mobile robot for nonlinear trajectory tracking using an external localisation sensor. It constitutes a classical networked control system, whereby event-based techniques for both control and state estimation contribute to efficient use of communications and reduce sensor activity. Measurement requests are dictated by an event-based state estimator by setting an upper bound to the estimation error covariance matrix. The rest of the time, state prediction is carried out with the Unscented transformation. This prediction method makes it possible to select the appropriate instants at which to perform actuations on the robot so that guidance performance does not degrade below a certain threshold. Ultimately, we obtained a combined event-based control and estimation solution that drastically reduces communication accesses. The magnitude of this reduction is set according to the tracking error margin of a P3-DX robot following a nonlinear trajectory, remotely controlled with a mini PC and whose pose is detected by a camera sensor. PMID:28878144
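
    A hedged sketch of the event-triggering idea is given below: the estimator propagates the state covariance between measurements and only requests a camera measurement when an uncertainty bound is exceeded. The linear constant-velocity model, noise levels, and trace-based trigger are illustrative assumptions; the paper itself bounds the estimation error covariance and uses an unscented prediction for the nonlinear robot model.

    ```python
    import numpy as np

    dt, q, r = 0.1, 0.05, 0.02              # time step and noise intensities (assumed)
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model (illustrative)
    H = np.array([[1.0, 0.0]])              # the camera observes position only
    Q, R = q * np.eye(2), np.array([[r]])
    P_MAX = 0.5                             # event trigger: request measurement when tr(P) > P_MAX

    x, P, requests = np.zeros(2), np.eye(2), 0
    for k in range(200):
        # Prediction step (stand-in for the unscented prediction used in the paper)
        x = F @ x
        P = F @ P @ F.T + Q
        if np.trace(P) > P_MAX:             # event-based measurement request
            requests += 1
            z = H @ x + np.sqrt(r) * np.random.randn(1)   # simulated camera measurement
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ (z - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
    print(f"camera measurements requested: {requests} of 200 steps")
    ```

    Raising P_MAX trades tracking accuracy for fewer sensor accesses, which is the communication-versus-performance trade-off the paper quantifies for the P3-DX robot.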

  9. Instrumentation for Aim Point Determination in the Close-in Battle

    DTIC Science & Technology

    2007-12-01

    Rugged camcorder with remote "lipstick" camera (http://www.samsung.com/Products/Camcorder/DigitalMemory/files/scx210wl.pdf) ... target. One way of making a measurement is to mount a small "lipstick" camera to the rifle with a mount similar to the laser-tag transmitter mount ... technology.com/contractors/surveillance/viotac-inc/viotac-inc1.html). Figure 4. Rugged camcorder with remote "lipstick" camera (http://www.samsung.com

  10. Time Series Remote Sensing in Monitoring the Spatio-Temporal Dynamics of Plant Invasions: A Study of Invasive Saltcedar (Tamarix Spp.)

    NASA Astrophysics Data System (ADS)

    Diao, Chunyuan

    In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand the complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming more and more important to monitor the earth system dynamics and interactions. To date, most of the time series remote sensing studies have been conducted with the images acquired at coarse spatial scale, due to their relatively high temporal resolution. The construction of time series at fine spatial scale, however, is limited to few or discrete images acquired within or across years. The objective of this research is to advance the time series remote sensing at fine spatial scale, particularly to shift from discrete time series remote sensing to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring the land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., phenological time series remote sensing model, temporal partial unmixing method, multiyear spectral angle clustering model, and time series remote sensing-based spatially explicit species distribution model) were developed to achieve the objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions through characterizing the seasonal phenological dynamics of plant species throughout the year. The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. Through incorporating spatial autocorrelation, the species distribution model developed in the study could identify the suitable habitats of saltcedar at a fine spatial scale and locate appropriate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by the time series remote sensing were regarded as the most important. These methods developed in the study provide new perspectives on how the continuous time series can be leveraged under various conditions to investigate the plant invasion dynamics.
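
    As one concrete illustration of the similarity measure behind the multiyear spectral angle clustering, a minimal spectral angle mapper (SAM) sketch follows; the band values and the matching threshold are invented for the example and are not taken from the dissertation.

    ```python
    import numpy as np

    def spectral_angle(a, b):
        """Spectral angle (radians) between two reflectance spectra."""
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos_theta, -1.0, 1.0))

    # Example: compare a pixel spectrum with a hypothetical saltcedar reference spectrum
    reference = np.array([0.05, 0.08, 0.06, 0.40, 0.35, 0.30])
    pixel     = np.array([0.06, 0.09, 0.07, 0.38, 0.33, 0.28])
    angle = spectral_angle(pixel, reference)
    print(f"spectral angle = {angle:.3f} rad -> {'match' if angle < 0.10 else 'no match'}")
    ```

    Because the angle is insensitive to overall brightness, it is a common way to compare spectra acquired in different years or under different illumination.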

  11. Forest mensuration with remote sensing: A retrospective and a vision for the future

    Treesearch

    Randolph H. Wynne

    2004-01-01

    Remote sensing, while occasionally oversold, has clear potential to reduce the overall cost of traditional forest inventories. Perhaps most important, some of the information needed for more intensive, rather than extensive, forest management is available from remote sensing. These new information needs may justify increased use and the increased cost of remote sensing...

  12. 15 CFR 960.12 - Data policy for remote sensing space systems.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Data policy for remote sensing space... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE LICENSING OF PRIVATE REMOTE SENSING SYSTEMS Licenses § 960.12 Data policy for remote sensing space systems. (a) In accordance with the Act, if the U.S. Government...

  13. Remote Sensing: Analyzing Satellite Images to Create Higher Order Thinking Skills.

    ERIC Educational Resources Information Center

    Marks, Steven K.; And Others

    1996-01-01

    Presents a unit that uses remote-sensing images from satellites and other spacecraft to provide new perspectives of the earth and generate greater global awareness. Relates the levels of Bloom's hierarchy to different aspects of the remote sensing unit to confirm that the concepts and principles of remote sensing and related images belong in…

  14. 15 CFR 960.12 - Data policy for remote sensing space systems.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Data policy for remote sensing space... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE LICENSING OF PRIVATE REMOTE SENSING SYSTEMS Licenses § 960.12 Data policy for remote sensing space systems. (a) In accordance with the Act, if the U.S. Government...

  15. 15 CFR 960.12 - Data policy for remote sensing space systems.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Data policy for remote sensing space... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE LICENSING OF PRIVATE REMOTE SENSING SYSTEMS Licenses § 960.12 Data policy for remote sensing space systems. (a) In accordance with the Act, if the U.S. Government...

  16. 15 CFR 960.12 - Data policy for remote sensing space systems.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Data policy for remote sensing space... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE LICENSING OF PRIVATE REMOTE SENSING SYSTEMS Licenses § 960.12 Data policy for remote sensing space systems. (a) In accordance with the Act, if the U.S. Government...

  17. 15 CFR 960.12 - Data policy for remote sensing space systems.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Data policy for remote sensing space... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE LICENSING OF PRIVATE REMOTE SENSING SYSTEMS Licenses § 960.12 Data policy for remote sensing space systems. (a) In accordance with the Act, if the U.S. Government...

  18. Annotated bibliography of remote sensing methods for monitoring desertification

    USGS Publications Warehouse

    Walker, A.S.; Robinove, Charles J.

    1981-01-01

    Remote sensing techniques are valuable for locating, assessing, and monitoring desertification. Remotely sensed data provide a permanent record of the condition of the land in a format that allows changes in land features and condition to be measured. The annotated bibliography of 118 items discusses remote sensing methods that may be applied to desertification studies.

  19. Applied Remote Sensing Program (ARSP)

    NASA Technical Reports Server (NTRS)

    Johnson, J. D.; Foster, K. E.; Mouat, D. A.; Miller, D. A.; Conn, J. S.

    1976-01-01

    The activities and accomplishments of the Applied Remote Sensing Program during FY 1975-1976 are reported. The principal objective of the Applied Remote Sensing Program continues to be the design of projects having specific decision-making impacts as a principal goal. These projects are carried out in cooperation and collaboration with local, state and federal agencies whose responsibilities lie in planning, zoning, and environmental monitoring and/or assessment, applying remote sensing techniques. The end result of the projects is the use of remote sensing techniques in problem solving by the involved agencies.

  20. Communicating remote sensing concepts in an interdisciplinary environment

    NASA Technical Reports Server (NTRS)

    Chung, R.

    1981-01-01

    Although remote sensing is currently multidisciplinary in its applications, many of its terms come from the engineering sciences, particularly from the field of pattern recognition. Scholars from fields such as the social sciences, botany, and biology may experience initial difficulty with remote sensing terminology, even though parallel concepts exist in their own fields. Some parallel concepts and terminologies from non-engineering fields, which might enhance the understanding of remote sensing concepts in an interdisciplinary situation, are identified. The feedback that this analogue strategy might have on remote sensing itself is also explored.

  1. People, Places and Pixels: Remote Sensing in the Service of Society

    NASA Technical Reports Server (NTRS)

    Lulla, Kamlesh

    2003-01-01

    What is the role of Earth remote sensing and other geospatial technologies in our society? Recent global events have brought into focus the role of geospatial science and technology such as remote sensing, GIS, GPS in assisting the professionals who are responsible for operations such as rescue and recovery of sites after a disaster or a terrorist act. This paper reviews the use of recent remote sensing products from satellites such as IKONOS in these efforts. Aerial and satellite imagery used in land mine detection has been evaluated and the results of this evaluation will be discussed. Synopsis of current and future ISS Earth Remote Sensing capabilities will be provided. The role of future missions in humanitarian use of remote sensing will be explored.

  2. Distributed Sensing and Processing for Multi-Camera Networks

    NASA Astrophysics Data System (ADS)

    Sankaranarayanan, Aswin C.; Chellappa, Rama; Baraniuk, Richard G.

    Sensor networks with large numbers of cameras are becoming increasingly prevalent in a wide range of applications, including video conferencing, motion capture, surveillance, and clinical diagnostics. In this chapter, we identify some of the fundamental challenges in designing such systems: robust statistical inference, computational efficiency, and opportunistic and parsimonious sensing. We show that the geometric constraints induced by the imaging process are extremely useful for identifying and designing optimal estimators for object detection and tracking tasks. We also derive pipelined and parallelized implementations of popular tools used for statistical inference in non-linear systems, of which multi-camera systems are examples. Finally, we highlight the use of the emerging theory of compressive sensing in reducing the amount of data sensed and communicated by a camera network.

  3. The application of remote sensing techniques to the study of ophiolites

    NASA Astrophysics Data System (ADS)

    Khan, Shuhab D.; Mahmood, Khalid

    2008-08-01

    Satellite remote sensing methods are a powerful tool for detailed geologic analysis, especially in inaccessible regions of the Earth's surface. Short-wave infrared (SWIR) bands are shown to provide spectral information bearing on the lithologic, structural, and geochemical character of rock bodies such as ophiolites, allowing for a more comprehensive assessment of the lithologies present, their stratigraphic relationships, and their geochemical character. Most remote sensing data are widely available for little or no cost, along with user-friendly software for non-specialists. In this paper we review common remote sensing systems and methods that allow for the discrimination of solid rock (lithologic) components of ophiolite complexes and their structural relationships. Ophiolites are enigmatic rock bodies associated with most, if not all, plate collision sutures. Ophiolites are ideal for remote sensing given their widely recognized diversity of lithologic types and structural relationships. Accordingly, as a basis for demonstrating the utility of remote sensing techniques, we briefly review typical ophiolites in the Tethyan tectonic belt. As a case study, we apply integrated remote sensing to a well-studied example, the Muslim Bagh ophiolite, located in Balochistan, western Pakistan. On this basis, we attempt to demonstrate how remote sensing data can validate and reconcile existing information obtained from field studies. The lithologic and geochemical diversity of Muslim Bagh is representative of Tethyan ophiolites. Despite its remote location it has been extensively mapped and characterized by structural and geochemical studies, and it is virtually free of vegetative cover. Integrating the remote sensing data with 'ground truth' information thus offers the potential of an improved template for interpreting remote sensing data sets of other ophiolites for which little or no field information is available.

  4. Who Goes There? Linking Remote Cameras and Schoolyard Science to Empower Action

    ERIC Educational Resources Information Center

    Tanner, Dawn; Ernst, Julie

    2013-01-01

    Taking Action Opportunities (TAO) is a curriculum that combines guided reflection, a focus on the local environment, and innovative use of wildlife technology to empower student action toward improving the environment. TAO is experientially based and uses remote cameras as a tool for schoolyard exploration. Through TAO, students engage in research…

  5. Remote Sensing and Remote Control Activities in Europe and America: Part 2--Remote Sensing Ground Stations in Europe,

    DTIC Science & Technology

    1996-04-08

    Development tasks and products of remote sensing ground stations in Europe are represented by the In-Sec Corporation and the Schlumberger Industries Corporation. The article presents the main products of these two corporations.

  6. [Estimation of desert vegetation coverage based on multi-source remote sensing data].

    PubMed

    Wan, Hong-Mei; Li, Xia; Dong, Dao-Rui

    2012-12-01

    Taking the lower reaches of the Tarim River in Xinjiang, Northwest China, as the study area, and based on ground investigation and multi-source remote sensing data of different resolutions, estimation models for desert vegetation coverage were built, and the precisions of the different estimation methods and models were compared. The results showed that with increasing spatial resolution of the remote sensing data, the precision of the estimation models increased. The estimation precision of the models based on high, middle-high, and middle-low resolution remote sensing data was 89.5%, 87.0%, and 84.56%, respectively, and the precisions of the remote sensing models were higher than that of the vegetation index method. This study revealed how the estimation precision of desert vegetation coverage changes with the spatial resolution of the remote sensing data, and realized the quantitative conversion of parameters and scales among high, middle, and low spatial resolution remote sensing data of desert vegetation coverage, which provides direct evidence for establishing and implementing a comprehensive remote sensing monitoring scheme for ecological restoration in the study area.
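
    The abstract does not specify the form of the estimation models; as a hedged illustration of one common NDVI-based approach, the sketch below applies the dimidiate pixel model, in which fractional vegetation cover is interpolated linearly between bare-soil and full-vegetation NDVI endmembers. The endmember values are assumptions and would be calibrated against the ground investigation in practice.

    ```python
    import numpy as np

    def fractional_cover(ndvi, ndvi_soil=0.05, ndvi_veg=0.80):
        """Dimidiate pixel model: linear mixing between soil and vegetation NDVI.

        ndvi_soil and ndvi_veg are scene-dependent endmembers (assumed values here).
        """
        fvc = (np.asarray(ndvi, dtype=float) - ndvi_soil) / (ndvi_veg - ndvi_soil)
        return np.clip(fvc, 0.0, 1.0)   # constrain to the physical range [0, 1]

    # Example: sparse desert vegetation pixels with increasing NDVI
    print(fractional_cover([0.02, 0.15, 0.30, 0.55]))
    ```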

  7. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To examine the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology, and we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.

  8. SUSI 62 A Robust and Safe Parachute Uav with Long Flight Time and Good Payload

    NASA Astrophysics Data System (ADS)

    Thamm, H. P.

    2011-09-01

    In many research areas in the geo-sciences (erosion, land use, land cover change, etc.) or applications (e.g. forest management, mining, land management etc.) there is a demand for remote sensing images of a very high spatial and temporal resolution. Due to the high costs of classic aerial photo campaigns, the use of a UAV is a promising option for obtaining the desired remote sensed information at the time it is needed. However, the UAV must be easy to operate, safe, robust and should have a high payload and long flight time. For that purpose, the parachute UAV SUSI 62 was developed. It consists of a steel frame with a powerful 62 cm3 2- stroke engine and a parachute wing. The frame can be easily disassembled for transportation or to replace parts. On the frame there is a gimbal mounted sensor carrier where different sensors, standard SLR cameras and/or multi-spectral and thermal sensors can be mounted. Due to the design of the parachute, the SUSI 62 is very easy to control. Two different parachute sizes are available for different wind speed conditions. The SUSI 62 has a payload of up to 8 kg providing options to use different sensors at the same time or to extend flight duration. The SUSI 62 needs a runway of between 10 m and 50 m, depending on the wind conditions. The maximum flight speed is approximately 50 km/h. It can be operated in a wind speed of up to 6 m/s. The design of the system utilising a parachute UAV makes it comparatively safe as a failure of the electronics or the remote control only results in the UAV coming to the ground at a slow speed. The video signal from the camera, the GPS coordinates and other flight parameters are transmitted to the ground station in real time. An autopilot is available, which guarantees that the area of investigation is covered at the desired resolution and overlap. The robustly designed SUSI 62 has been used successfully in Europe, Africa and Australia for scientific projects and also for agricultural, forestry and industrial applications.

  9. An Approach of Registration between Remote Sensing Image and Electronic Chart Based on Coastal Line

    NASA Astrophysics Data System (ADS)

    Li, Ying; Yu, Shuiming; Li, Chuanlong

    Remote sensing plays an important role in marine oil spill emergency response. In order to implement a timely and effective countermeasure, it is important to provide the exact position of oil spills, so it is necessary to match the remote sensing image and the electronic chart properly. Discrepancies ordinarily exist between an oil spill image and an electronic chart, even when geometric correction has been applied to the remote sensing image, and it is difficult to find steady control points at sea for exact rectification. Since oil spills generally occur near the coast, an improved relaxation algorithm was developed for finding control points along the coastline. A conversion function is created with the least-squares method, and the remote sensing image can be registered with the vector map based on this function. A SAR image was used as the remote sensing data and a shapefile map as the electronic chart data. The results show that this approach can guarantee the precision of the registration, which is essential for oil spill monitoring.
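
    The form of the conversion function is not spelled out in the abstract; the sketch below shows a generic least-squares 2-D affine transform estimated from matched coastline control points, which is one common choice for this kind of registration. The control point coordinates are invented for the example.

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Least-squares 2-D affine transform mapping src points onto dst points."""
        src, dst = np.asarray(src, dtype=float), np.asarray(dst, dtype=float)
        A = np.hstack([src, np.ones((len(src), 1))])        # rows of [x, y, 1]
        params, *_ = np.linalg.lstsq(A, dst, rcond=None)    # 3 x 2 parameter matrix
        return params

    def apply_affine(params, pts):
        pts = np.asarray(pts, dtype=float)
        return np.hstack([pts, np.ones((len(pts), 1))]) @ params

    # Hypothetical coastline control points: image (col, row) -> chart (easting, northing)
    img_pts   = [[120, 340], [410, 95], [780, 620], [255, 700]]
    chart_pts = [[5021.3, 812.7], [5310.9, 567.1], [5688.2, 1090.4], [5160.0, 1170.2]]
    T = fit_affine(img_pts, chart_pts)
    print(apply_affine(T, [[500, 500]]))   # register an arbitrary image point
    ```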

  10. The U.S. Geological Survey land remote sensing program

    USGS Publications Warehouse

    Saunders, T.; Feuquay, J.; Kelmelis, J.A.

    2003-01-01

    The U.S. Geological Survey has been a provider of remotely sensed information for decades. As the availability and use of satellite data has grown, USGS has placed increasing emphasis on expanding the knowledge about the science of remote sensing and on making remotely sensed data more accessible. USGS encourages widespread availability and distribution of these data and through its programs, encourages and enables a variety of research activities and the development of useful applications of the data. The science of remote sensing has great potential for assisting in the monitoring and assessment of the impacts of natural disasters, management and analysis of environmental, biological, energy, and mineral investigations, and supporting informed public policy decisions. By establishing the Land Remote Sensing Program (LRS) as a major unit of the USGS Geography Program, USGS has taken the next step to further increase support for the accessibility, understanding, and use of remotely sensed data. This article describes the LRS Program, its mission and objectives, and how the program has been structured to accomplish its goals.

  11. Microwave Remote Sensing Modeling of Ocean Surface Salinity and Winds Using an Empirical Sea Surface Spectrum

    NASA Technical Reports Server (NTRS)

    Yueh, Simon H.

    2004-01-01

    Active and passive microwave remote sensing techniques have been investigated for the remote sensing of ocean surface wind and salinity. We revised an ocean surface spectrum using the CMOD-5 geophysical model function (GMF) for the European Remote Sensing (ERS) C-band scatterometer and the Ku-band GMF for the NASA SeaWinds scatterometer. The predictions of microwave brightness temperatures from this model agree well with satellite, aircraft and tower-based microwave radiometer data. This suggests that the impact of surface roughness on microwave brightness temperatures and radar scattering coefficients of sea surfaces can be consistently characterized by a roughness spectrum, providing physical basis for using combined active and passive remote sensing techniques for ocean surface wind and salinity remote sensing.

  12. Online catalog access and distribution of remotely sensed information

    NASA Astrophysics Data System (ADS)

    Lutton, Stephen M.

    1997-09-01

    Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.

  13. Integration of near-surface remote sensing and eddy covariance measurements: new insights on managed ecosystem structure and functioning

    NASA Astrophysics Data System (ADS)

    Hatala, J.; Sonnentag, O.; Detto, M.; Runkle, B.; Vargas, R.; Kelly, M.; Baldocchi, D. D.

    2009-12-01

    Ground-based, visible light imagery has been used for different purposes in agricultural and ecological research. A series of recent studies explored the utilization of networked digital cameras to continuously monitor vegetation by taking oblique canopy images at fixed view angles and time intervals. In our contribution we combine high temporal resolution digital camera imagery, eddy-covariance, and meteorological measurements with weekly field-based hyperspectral and LAI measurements to gain new insights on temporal changes in canopy structure and functioning of two managed ecosystems in California’s Sacramento-San Joaquin River Delta: a pasture infested by the invasive perennial pepperweed (Lepidium latifolium) and a rice plantation (Oryza sativa). Specific questions we address are: a) how does year-round grazing affect pepperweed canopy development, b) is it possible to identify phenological key events of managed ecosystems (pepperweed: flowering; rice: heading) from the limited spectral information of digital camera imagery, c) is a simple greenness index derived from digital camera imagery sufficient to track leaf area index and canopy development of managed ecosystems, and d) what are the scales of temporal correlation between digital camera signals and carbon and water fluxes of managed ecosystems? Preliminary results for the pasture-pepperweed ecosystem show that year-round grazing inhibits the accumulation of dead stalks causing earlier green-up and that digital camera imagery is well suited to capture the onset of flowering and the associated decrease in photosynthetic CO2 uptake. Results from our analyses are of great relevance from both a global environmental change and land management perspective.
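
    A hedged sketch of the kind of simple greenness index referred to above (the green chromatic coordinate, GCC, widely used with networked camera imagery) is given below; the region-of-interest masking is an assumption about how canopy pixels would be selected.

    ```python
    import numpy as np

    def green_chromatic_coordinate(rgb_image, roi=None):
        """Mean green chromatic coordinate, GCC = G / (R + G + B), over a region of interest.

        rgb_image: array of shape (rows, cols, 3) holding R, G, B digital numbers.
        roi: optional boolean mask selecting the canopy pixels (an assumption here).
        """
        img = np.asarray(rgb_image, dtype=float)
        if roi is not None:
            img = img[roi]                        # keep only canopy pixels
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        total = r + g + b
        gcc = np.where(total > 0, g / np.where(total > 0, total, 1.0), np.nan)
        return np.nanmean(gcc)

    # Example: a synthetic frame standing in for one oblique canopy photograph
    frame = np.random.randint(0, 256, size=(480, 640, 3))
    print(green_chromatic_coordinate(frame))
    ```

    Tracking this single number per image through the season is what allows the camera record to be compared directly with LAI measurements and with the CO2 and water fluxes.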

  14. Remote Sensing and the Environment.

    ERIC Educational Resources Information Center

    Osmers, Karl

    1991-01-01

    Suggests using remote sensing technology to help students make sense of the natural world. Explains that satellite information allows observation of environmental changes over time. Identifies possible student projects based on remotely sensed data. Recommends obtaining the assistance of experts and seeking funding through effective project…

  15. Use of remote sensing in agriculture

    NASA Technical Reports Server (NTRS)

    Pettry, D. E.; Powell, N. L.; Newhouse, M. E.

    1974-01-01

    Remote sensing studies in the Virginia and Chesapeake Bay areas to investigate soil and plant conditions are reported and the results given. Remote sensing techniques and interactions are also discussed. Specific studies on the effects of soil moisture and organic matter on the energy reflection of extensively occurring Sassafras soils are discussed. Greenhouse and field studies investigating the effects of the chlorophyll content of Irish potatoes on infrared reflection are presented. Selected ground truth and environmental monitoring data are shown in summary form. Practical demonstrations of remote sensing technology in agriculture are depicted and future use areas are delineated.

  16. NASA Glenn OHIOVIEW FY01/02 Project

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The results of the research performed by the university principal investigators are herein compiled. OhioView's general goals were: 1) to increase remote sensing education for Ohio's undergraduate and graduate students, and to enhance mathematics and science curricula for K-12 students using the capabilities of remote sensing; 2) to conduct advanced research to develop novel remote sensing applications, i.e., to turn data into information for more applications; and 3) to maximize the use of remote sensing technology by the general public through outreach and the development of tools for more user-friendly access to remote sensing data.

  17. The availability of conventional forms of remotely sensed data

    USGS Publications Warehouse

    Sturdevant, James A.; Holm, Thomas M.

    1982-01-01

    For decades Federal and State agencies have been collecting aerial photographs of various film types and scales over parts of the United States. More recently, worldwide Earth resources data acquired by orbiting satellites have inundated the remote sensing community. Determining the types of remotely sensed data that are publicly available can be confusing to the land-resource manager, planner, and scientist. This paper is a summary of the more commonly used types of remotely sensed data (aircraft and satellite) and their public availability. Special emphasis is placed on the National High-Altitude Photography (NHAP) program and future remote-sensing satellites.

  18. Abstract on the Effective validation of both new and existing methods for the observation and forecasting of volcanic emissions

    NASA Astrophysics Data System (ADS)

    Sathnur, Ashwini

    2017-04-01

    Validation of the existing remote sensing products: (1) Both ground-based and space-based instruments are available for remote sensing of volcanic eruptions. (2) Sunlight falling on the volcanic area is reflected, together with the image of that area, to the satellite, which captures the reflected spectrum of the image and from it assesses whether an eruption is occurring. (3) From the captured spectrum the system detects sulphur dioxide and volcanic ash, and the temperature of the volcanic region is also measured; if these inputs indicate a possible eruption, the data are captured manually by the system for further use and hazard mitigation. (4) The instrument is particularly important in capturing the volcanogenic signal, and this capture should be carried out at the time of day when the reflected spectrum is best available; capture at night is not advisable, as the reflected sunlight is minimal and would lead to erroneous interpretation. (5) Adequate area coverage by the spectrometer is mandatory in order to capture the right area and derive the occurrence of an eruption precisely; the coarser the spatial resolution, the larger the geographic area captured but the less precise the data, so that details may be missed. (6) Ideal qualities for the remote sensing instrument are: minimal false positives, cost-free data availability, minimal bandwidth problems, and a rapid communication system.
    Validation and requirements for new remote sensing products: the qualities of the existing products are retained, and the following additional requirements apply in order to build an advanced remote sensing instrument. (1) Improved spatial resolution and coverage, so that the plumes from an incipient eruption are captured; this would require better video and camera facilities on the instrument. (2) Capture of traces of carbon, carbonic acid and water vapour in addition to the sulphur dioxide and volcanic ash captured by the existing products. (3) An additional module providing a forecasting capability, able to predict volcanic eruptions several months in advance so that mechanisms for mitigating volcanic hazards can be put in place early. (4) Additional features that automatically transfer forecasted eruptions to the disaster relief operations team, without any request being raised by that team, so that the information is received at the right time and errors during hazard management are avoided.

  19. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.

    2008-12-01

    Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected one-hour-interval data continuously for more than one year at some, but not all, sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono- and stereo-photogrammetry, together with digital image processing techniques, can provide the theoretical and practical fundamentals for data processing. The time-lapse images from west Greenland reveal various problems: rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images need to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in the possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono and stereo photogrammetric techniques with the aid of automatic correlation matching to efficiently handle the enormous data volumes.
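
    As a hedged illustration of the correlation-matching step used to track surface features between image pairs, a minimal normalized cross-correlation template search is sketched below; the window sizes and the brute-force search strategy are assumptions, and operational processing would add sub-pixel refinement, lens distortion correction and outlier filtering.

    ```python
    import numpy as np

    def ncc_displacement(img0, img1, center, template_half=15, search_half=40):
        """Pixel displacement (dy, dx) of a feature at `center` between two grayscale
        images, found by normalized cross-correlation of a small template over a
        square search window."""
        cy, cx = center
        t = img0[cy - template_half:cy + template_half + 1,
                 cx - template_half:cx + template_half + 1].astype(float)
        t = (t - t.mean()) / (t.std() + 1e-9)
        best, best_score = (0, 0), -np.inf
        for dy in range(-search_half, search_half + 1):
            for dx in range(-search_half, search_half + 1):
                w = img1[cy + dy - template_half:cy + dy + template_half + 1,
                         cx + dx - template_half:cx + dx + template_half + 1].astype(float)
                if w.shape != t.shape:
                    continue                     # search window ran off the image edge
                w = (w - w.mean()) / (w.std() + 1e-9)
                score = (t * w).mean()           # normalized cross-correlation score
                if score > best_score:
                    best_score, best = score, (dy, dx)
        return best, best_score
    ```

    Applied to selected features in successive hourly frames, the returned offsets convert to surface velocities once the camera geometry and per-pixel ground footprint are known.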

  20. NASA's Applied Remote Sensing Training (ARSET) Webinar Series

    Atmospheric Science Data Center

    2016-07-12

    NASA's Applied Remote Sensing Training (ARSET) Webinar Series Tuesday, July 12, 2016 ... you of a free training opportunity: Introduction to Remote Sensing for Air Quality Applications Webinar Series Beginning in ...

  1. Tropospheric Passive Remote Sensing

    NASA Technical Reports Server (NTRS)

    Keafer, L. S., Jr. (Editor)

    1982-01-01

    The long term role of airborne/spaceborne passive remote sensing systems for tropospheric air quality research and the identification of technology advances required to improve the performance of passive remote sensing systems were discussed.

  2. Remote Sensing as a Demonstration of Applied Physics.

    ERIC Educational Resources Information Center

    Colwell, Robert N.

    1980-01-01

    Provides information about the field of remote sensing, including discussions of geo-synchronous and sun-synchronous remote-sensing platforms, the actual physical processes and equipment involved in sensing, the analysis of images by humans and machines, and inexpensive, small scale methods, including aerial photography. (CS)

  3. Opportunities and problems in introducing or expanding the teaching of remote sensing in universities

    NASA Technical Reports Server (NTRS)

    Maxwell, E. L.

    1980-01-01

    The need for degree programs in remote sensing is considered. Any education program which claims to train remote sensing specialists must include expertise in the physical principles upon which remote sensing is based. These principles dictate the limits of engineering and design, computer analysis, photogrammetry, and photointerpretation. Faculty members must be hired to provide emphasis in those five areas.

  4. Remote sensing of vegetation fires and its contribution to a fire management information system

    Treesearch

    Stephane P. Flasse; Simon N. Trigg; Pietro N. Ceccato; Anita H. Perryman; Andrew T. Hudak; Mark W. Thompson; Bruce H. Brockett; Moussa Drame; Tim Ntabeni; Philip E. Frost; Tobias Landmann; Johan L. le Roux

    2004-01-01

    In the last decade, research has proven that remote sensing can provide very useful support to fire managers. This chapter provides an overview of the types of information remote sensing can provide to the fire community. First, it considers fire management information needs in the context of a fire management information system. An introduction to remote sensing then...

  5. Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing

    DTIC Science & Technology

    2016-07-15

    AFRL-AFOSR-JP-TR-2016-0068 Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing Hean-Teik...SUBTITLE Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing 5a.  CONTRACT NUMBER 5b.  GRANT NUMBER... electromagnetics to the application in microwave remote sensing as well as extension of modelling capability with computational flexibility to study

  7. Basic Remote Sensing Investigations for Beach Reconnaissance.

    DTIC Science & Technology

    Progress is reported on three tasks designed to develop remote sensing beach reconnaissance techniques applicable to the benthic, beach intertidal...and beach upland zones. Task 1 is designed to develop remote sensing indicators of important beach composition and physical parameters which will...ultimately prove useful in models to predict beach conditions. Task 2 is designed to develop remote sensing techniques for survey of bottom features in

  8. Bridging the Scales from Field to Region with Practical Tools to Couple Time- and Space-Synchronized Data from Flux Towers and Networks with Proximal and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Avenson, T.; Burkart, A.; Gamon, J. A.; Guan, K.; Julitta, T.; Pastorello, G.; Sakowska, K.

    2017-12-01

    Many hundreds of flux towers are presently operational as standalone projects and as parts of regional networks. However, the vast majority of these towers do not allow straightforward coupling with remote sensing (drone, aircraft, satellite, etc.) data, and even fewer have optical sensors for validation of remote sensing products and upscaling from field to regional levels. In 2016-2017, new tools to collect, process, and share time-synchronized flux data from multiple towers were developed and deployed globally. Originally designed to automate site and data management and to streamline flux data analysis, these tools allow relatively easy matching of tower data with remote sensing data: (1) a GPS-driven PTP time protocol synchronizes instrumentation within a station, different stations with each other, and all of these with remote sensing acquisitions, precisely aligning remote sensing and flux data in time; (2) footprint size and coordinates computed and stored with the flux data help align tower flux footprints with drone, aircraft or satellite coverage, precisely aligning optical and flux data in space; and (3) a full snapshot of the remote sensing pixel can then be constructed, including leaf-level, ground optical sensor, and flux tower measurements from the same footprint area, closely coupled with the remote sensing measurements to help interpret remote sensing data, validate models, and improve upscaling. Additionally, current flux towers can be augmented with advanced ground optical sensors and can use standard routines to deliver continuous products (e.g. SIF, PRI, NDVI, etc.) based on automated field spectrometers (e.g. FloX and RoX) and other optical systems. Several dozen new towers already operational globally can be readily used for the proposed workflow, and over 500 active traditional flux towers can be updated to synchronize their data with remote sensing measurements. This presentation will show how the new tools are used by major networks and describe how this approach can be utilized to match remote sensing and tower data, aid ground truthing, improve scientific interactions, and promote joint grant writing and other forms of collaboration between the flux and remote sensing communities.

  9. Monitoring Crop Phenology and Growth Stages from Space: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Gao, F.; Anderson, M. C.; Mladenova, I. E.; Kustas, W. P.; Alfieri, J. G.

    2014-12-01

    Crop growth stages in concert with weather and soil moisture conditions can have a significant impact on crop yields. In the U.S., crop growth stages and conditions are reported by farmers at the county level. These reports are somewhat subjective and fluctuate between different reporters, locations, and times. Remote sensing data provide an alternative approach to monitoring crop growth over large areas in a more consistent and quantitative way. In recent years, remote sensing data have been used to detect vegetation phenology at 1-km spatial resolution globally. However, agricultural applications at field scale require finer spatial resolution remote sensing data. Landsat (30-m) data have been successfully used for agricultural applications, and many medium resolution sensors are available today or will be in the near future, including Landsat, SPOT, RapidEye, ASTER, and the upcoming Sentinel-2. Approaches have been developed in the past several years to integrate remote sensing data from different sensors that may have different sensor characteristics and different spatial and temporal resolutions. This now provides opportunities to map crop growth stages and conditions using dense time-series remote sensing at field scales. However, remotely sensed phenology (or phenological metrics) is normally derived from mathematical functions fitted to the time-series data. The phenological metrics are determined either by identifying inflection (curvature) points or by applying pre-defined thresholds in the remote sensing phenology algorithms. Furthermore, physiological crop growth stages may not be directly correlated with the remotely sensed phenology. The relationship between remotely sensed phenology and crop growth stages is likely to vary with crop type and variety, growing stage, condition, and even location. In this presentation, we will examine the relationship between remotely sensed phenology and crop growth stages using in-situ measurements from Fluxnet sites and crop progress reports from USDA NASS. We will present remote sensing approaches and focus on: 1) integrating multiple sources of remote sensing data; and 2) extracting crop phenology at field scales. An example in the U.S. Corn Belt area will be presented and analyzed. Future directions for mapping crop growth stages will be discussed.
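
    To make the threshold idea concrete, the following sketch derives start- and end-of-season dates from a synthetic NDVI time series by smoothing it and finding where it crosses a fixed fraction of the seasonal amplitude; the 20% threshold, the smoothing window, and the synthetic data are illustrative assumptions, not the algorithm used in the study.

```python
# Threshold-based phenology metrics from an NDVI time series (illustrative only).
import numpy as np

doy = np.arange(1, 366, 8)                              # nominal 8-day composites
ndvi = 0.2 + 0.5 * np.exp(-((doy - 200) / 60.0) ** 2)   # synthetic seasonal curve
ndvi += np.random.normal(0, 0.02, ndvi.size)            # observation noise

smooth = np.convolve(ndvi, np.ones(3) / 3, mode="same") # simple moving average
amplitude = smooth.max() - smooth.min()
threshold = smooth.min() + 0.2 * amplitude              # 20% of amplitude (assumed)

above = np.where(smooth >= threshold)[0]
start_of_season, end_of_season = doy[above[0]], doy[above[-1]]
print(f"SOS ~ day {start_of_season}, EOS ~ day {end_of_season}")
```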

  10. Remote Sensing: A Film Review.

    ERIC Educational Resources Information Center

    Carter, David J.

    1986-01-01

    Reviews the content of 19 films on remote sensing published between 1973 and 1980. Concludes that they are overly simplistic, notably outdated, and generally too optimistic about the potential of remote sensing from space for resource exploration and environmental problem-solving. Provides names and addresses of more current remote sensing…

  11. Reconnaissance of marine resources

    NASA Technical Reports Server (NTRS)

    Szekielda, K.-H.; Suszkowski, D. J.; Tabor, P. S.

    1975-01-01

    A test area along the NW Coast of Africa was used during the Skylab mission to study the distribution of temperature and plankton. The S190B Earth Terrain Camera, with a spectral film response of 0.4-0.7 micrometers, allowed qualitative estimates of the distribution patterns of suspended material. Differentiation between inorganic particles and phytoplankton could be made by comparing the green band and the red band of the S190A Camera System. The pictorial display of data obtained from the S191 scanning radiometer in the 10-11 micrometer atmospheric window allowed a detailed interpretation of the temperature distribution in the area where cold upwelled water reaches the euphotic zone. The comparison between the infrared data and the imagery taken simultaneously indicated the origin of the cold water as well as its pathway within the Canary Current. A fish survey carried out almost simultaneously in the area by echosounding showed a high correlation between the position of good fishing grounds and the distribution of plankton as detected by the remote sensing detectors on Skylab.

  12. Astronomical Polarimetry with the RIT Polarization Imaging Camera

    NASA Astrophysics Data System (ADS)

    Vorobiev, Dmitry V.; Ninkov, Zoran; Brock, Neal

    2018-06-01

    In the last decade, imaging polarimeters based on micropolarizer arrays have been developed for use in terrestrial remote sensing and metrology applications. Micropolarizer-based sensors are dramatically smaller and more mechanically robust than other polarimeters with similar spectral response and snapshot capability. To determine the suitability of these new polarimeters for astronomical applications, we developed the RIT Polarization Imaging Camera to investigate the performance of these devices, with special attention to the low signal-to-noise regime. We characterized the device performance in the lab by determining the relative throughput, efficiency, and orientation of every pixel as a function of wavelength. Using the resulting pixel response model, we developed demodulation procedures for the aperture photometry and imaging polarimetry observing modes. We found that, using the current calibration, RITPIC is capable of detecting polarization signals as small as ∼0.3%. The relative ease of data collection, calibration, and analysis provided by these sensors suggests that they may become an important tool for a number of astronomical targets.
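
    The sketch below shows the basic demodulation arithmetic for a micropolarizer-array frame, assuming an idealized 2x2 superpixel of 0/45/90/135-degree analyzers; the per-pixel throughput, efficiency, and orientation calibration described above is omitted, so this is not the RITPIC pipeline itself.

```python
# Idealized Stokes demodulation for a micropolarizer-array image (assumed 2x2 layout).
import numpy as np

def demodulate(raw):
    """raw: 2D array with even dimensions; returns S0, DoLP, AoLP per superpixel."""
    i0   = raw[0::2, 0::2].astype(float)   # 0-degree analyzer pixels
    i45  = raw[0::2, 1::2].astype(float)   # 45-degree
    i90  = raw[1::2, 0::2].astype(float)   # 90-degree
    i135 = raw[1::2, 1::2].astype(float)   # 135-degree

    s0 = 0.5 * (i0 + i45 + i90 + i135)     # total intensity
    s1 = i0 - i90                          # linear Stokes parameters
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)   # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)        # angle of linear polarization (radians)
    return s0, dolp, aolp

s0, dolp, aolp = demodulate(np.random.poisson(1000, (512, 512)))
print(dolp.mean())
```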

  13. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for multispectral imaging from small aircraft. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system), and an embedded computer. The embedded computer offers excellent universality and expansibility and has advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer sets the camera parameters, controls the filter wheel and the stabilized platform, acquires image and POS data, and stores the images and data. Peripheral devices can be connected through the ports of the embedded computer, which makes system operation and management of the stored image data easy. The system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiments show that this system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.

  14. Utilizing the Southwest Ultraviolet Imaging System (SwUIS) on the International Space Station

    NASA Astrophysics Data System (ADS)

    Schindhelm, Eric; Stern, S. Alan; Ennico-Smith, Kimberly

    2013-09-01

    We present the Southwest Ultraviolet Imaging System (SwUIS), a compact, low-cost instrument designed for remote sensing observations from a manned platform in space. It has two chief configurations: a high spatial resolution mode with a 7-inch Maksutov-Cassegrain telescope, and a large field-of-view camera mode using a lens assembly. It can operate with either an intensified CCD or an electron multiplying CCD camera. Interchangeable filters and lenses enable broadband and narrowband imaging at UV/visible/near-infrared wavelengths, over a range of spatial resolutions. SwUIS has flown previously on Space Shuttle flights STS-85 and STS-93, where it recorded multiple UV images of planets, comets, and vulcanoids. We describe the instrument and its capabilities in detail. SwUIS's broad wavelength coverage and versatile range of hardware configurations make it an attractive option for use as a facility instrument for Earth science and astronomical imaging investigations aboard the International Space Station.

  15. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can make measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate the image, coordinates, and camera position. However, it is very expensive, and users cannot use the results immediately because the position information is not embedded in the image. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we calculate positioning with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these with the open source GIS software Quantum GIS. In this way a complete data collection and processing system can be constructed.
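
    A minimal sketch of the geo-referencing step, under the assumption that the camera position is obtained by interpolating a time-stamped GPS track at each image capture time; the track values and the plain linear interpolation are placeholders, whereas the actual system integrates the signals on an Arduino board and processes them with OpenCV and Quantum GIS.

```python
# Interpolate camera position from a time-stamped GPS track (hypothetical values).
import numpy as np

# GPS track: epoch seconds, latitude, longitude
track_t   = np.array([0.0, 1.0, 2.0, 3.0])
track_lat = np.array([24.0000, 24.0005, 24.0010, 24.0015])
track_lon = np.array([121.0000, 121.0004, 121.0008, 121.0012])

def camera_position(image_time):
    """Linearly interpolate latitude/longitude at the image capture timestamp."""
    lat = np.interp(image_time, track_t, track_lat)
    lon = np.interp(image_time, track_t, track_lon)
    return lat, lon

print(camera_position(1.4))
```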

  16. Educational activities of remote sensing archaeology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hadjimitsis, Diofantos G.; Agapiou, Athos; Lysandrou, Vasilki; Themistocleous, Kyriacos; Cuca, Branka; Nisantzi, Argyro; Lasaponara, Rosa; Masini, Nicola; Krauss, Thomas; Cerra, Daniele; Gessner, Ursula; Schreier, Gunter

    2016-10-01

    Remote sensing science is increasingly being used to support archaeological and cultural heritage research in various ways. Satellite sensors, either passive or active, are currently used on a systematic basis to detect buried archaeological remains and to monitor tangible heritage. In addition, airborne and low altitude systems are used for documentation purposes. Ground surveys using remote sensing tools such as spectroradiometers and ground penetrating radar can detect variations in vegetation and soil, respectively, which are linked to the presence of underground archaeological features. Education and training in remote sensing archaeology for young people is therefore of high importance. Specific remote sensing tools relevant for archaeological research can be developed, including web tools, small libraries, interactive learning games, etc. These tools can then be combined and aligned with archaeology and cultural heritage by presenting historical and pre-historical records, excavated sites, or even artifacts under a "remote sensing" approach. Using such a non-formal educational approach, students can become involved, ask, read, and seek to learn more about remote sensing and, of course, about history. The paper aims to present a modern didactical concept and some examples of the practical implementation of remote sensing archaeology in secondary schools in Cyprus. The idea was built upon an ongoing project (ATHENA) focused on the use of remote sensing for archaeological research in Cyprus. Through the H2020 ATHENA project, the Remote Sensing Science and Geo-Environment Research Laboratory at the Cyprus University of Technology (CUT), with the support of the National Research Council of Italy (CNR) and the German Aerospace Centre (DLR), aims to enhance its performance in all these new technologies.

  17. Remote Sensing and the Earth.

    ERIC Educational Resources Information Center

    Brosius, Craig A.; And Others

    This document is designed to help senior high school students study remote sensing technology and techniques in relation to the environmental sciences. It discusses the acquisition, analysis, and use of ecological remote data. Material is divided into three sections and an appendix. Section One is an overview of the basics of remote sensing.…

  18. Microwave remote sensing of snowpack properties

    NASA Technical Reports Server (NTRS)

    Rango, A. (Editor)

    1980-01-01

    Topics concerning remote sensing capabilities for providing reliable snow cover data and measurement of snow water equivalents are discussed. Specific remote sensing techniques discussed include those in the microwave region of the electromagnetic spectrum.

  19. Commercial Remote Sensing Data Contracts

    USGS Publications Warehouse

    ,

    2005-01-01

    The U. S. Geological Survey's (USGS) Commercial Remote Sensing Data Contracts (CRSDCs) provide government agencies with access to a broad range of commercially available remotely sensed airborne and satellite data. These contracts were established to support The National Map partners, other Federal Civilian agency programs, and Department of Defense programs that require data for the United States and its territories. Experience shows that centralized procurement of remotely sensed data leads to considerable cost savings to the Federal government through volume discounts, reduction of redundant contract administrative costs, and avoidance of duplicate purchases. These contracts directly support the President's Commercial Remote Sensing Space Policy, signed in 2003, by providing a centralized mechanism for civil agencies to acquire commercial remote sensing products to support their mission needs in an efficient and coordinated way. CRSDC administration is provided by the USGS Mid-Continent Mapping Center in Rolla, Missouri.

  20. Object-oriented recognition of high-resolution remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Yongyan; Li, Haitao; Chen, Hong; Xu, Yuannan

    2016-01-01

    With the development of remote sensing imaging technology and the improvement of multi-source image resolution in satellite visible light, multispectral, and hyperspectral imagery, high resolution remote sensing images have been widely used in various fields, such as the military, surveying and mapping, geophysical prospecting, and environmental monitoring. In remote sensing imagery, the segmentation of ground targets, feature extraction, and automatic recognition are active and difficult research topics in modern information technology. This paper presents an object-oriented remote sensing image scene classification method. The method consists of the generation of typical vehicle object classes, nonparametric density estimation, mean shift segmentation, a multi-scale corner detection algorithm, and template-based local shape matching. A remote sensing vehicle image classification software system was designed and implemented to meet these requirements.

  1. New horizons in remote sensing for forest range resource management

    USGS Publications Warehouse

    Lauer, D.T.

    1985-01-01

    Forest and range resource scientists were among the first to recognize the potential of aircraft and satellite remote sensing for management of timber, forage, water, and wildlife resources. Today, data from a variety of sensor systems are being put to practical use for inventorying, monitoring, and assessing forest and range resources. In the future, improved sensor systems providing new kinds of data will be available; likewise, new types of data handling and processing systems can be anticipated. Among the new or anticipated aircraft and satellite systems and/or data are National High-Altitude Photography II, U.S. Geological Survey-acquired Side-Looking Airborne Radar, the Landsat thematic mapper, the National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (AVHRR), the French Systeme Probatoire d'Observation de la Terre (SPOT) satellite, the European Space Agency Earth Resources Satellite, the National Aeronautics and Space Administration Large Format Camera and Shuttle Imaging Radar (SIR-A, -B, and -C), and a variety of other systems in existence or planned by the Soviets, Japanese, Canadians, Chinese, Brazilians, Indonesians, and others. Application examples are presented that illustrate uses of 1-kilometer-resolution AVHRR data, 80-meter Landsat multispectral scanner data, 30-meter Landsat thematic mapper data, and 10-meter SPOT-simulator data. These examples address fire fuel monitoring, land cover mapping, rangeland assessment, and soils landscape mapping.

  2. A Review of Oil Spill Remote Sensing

    PubMed Central

    Brown, Carl E.

    2017-01-01

    The technical aspects of oil spill remote sensing are examined and the practical uses and drawbacks of each technology are given with a focus on unfolding technology. The use of visible techniques is ubiquitous, but limited to certain observational conditions and simple applications. Infrared cameras offer some potential as oil spill sensors but have several limitations. Both techniques, although limited in capability, are widely used because of their increasing economy. The laser fluorosensor uniquely detects oil on substrates that include shoreline, water, soil, plants, ice, and snow. New commercial units have come out in the last few years. Radar detects calm areas on water and thus oil on water, because oil will reduce capillary waves on a water surface given moderate winds. Radar provides a unique option for wide area surveillance, all day or night and rainy/cloudy weather. Satellite-carried radars with their frequent overpass and high spatial resolution make these day–night and all-weather sensors essential for delineating both large spills and monitoring ship and platform oil discharges. Most strategic oil spill mapping is now being carried out using radar. Slick thickness measurements have been sought for many years. The operative technique at this time is the passive microwave. New techniques for calibration and verification have made these instruments more reliable. PMID:29301212

  3. Infrared remote sensing of hazardous vapours: surveillance of public areas during the FIFA Football World Cup 2006

    NASA Astrophysics Data System (ADS)

    Harig, Roland; Matz, Gerhard; Rusch, Peter; Gerhard, Hans-Hennig; Gerhard, Jörn-Hinnrich; Schlabs, Volker

    2007-04-01

    The German ministry of the interior, represented by the civil defence agency BBK, established analytical task forces for the analysis of released chemicals in the case of fires, chemical accidents, terrorist attacks, or war. One of the first assignments of the task forces was the provision of analytical services during the football world cup 2006. One part of the equipment of these emergency response forces is a remote sensing system that allows identification and visualisation of hazardous clouds from long distances, the scanning infrared gas imaging system SIGIS 2. The system is based on an interferometer with a single detector element in combination with a telescope and a synchronised scanning mirror. The system allows 360° surveillance. The system is equipped with a video camera and the results of the analyses of the spectra are displayed by an overlay of a false colour image on the video image. This allows a simple evaluation of the position and the size of a cloud. The system was deployed for surveillance of stadiums and public viewing areas, where large crowds watched the games. Although no intentional or accidental releases of hazardous gases occurred in the stadiums and in the public viewing areas, the systems identified and located various foreign gases in the air.

  4. UFCN: a fully convolutional neural network for road extraction in RGB imagery acquired by remote sensing from an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Kestur, Ramesh; Farooq, Shariq; Abdal, Rameen; Mehraj, Emad; Narasipura, Omkar; Mudigere, Meenavathi

    2018-01-01

    Road extraction in imagery acquired by low altitude remote sensing (LARS) carried out using an unmanned aerial vehicle (UAV) is presented. LARS is carried out using a fixed wing UAV with a high spatial resolution visible spectrum (RGB) camera as the payload. Deep learning techniques, particularly the fully convolutional network (FCN), are adopted to extract roads by dense semantic segmentation. The proposed model, UFCN (U-shaped FCN), is an FCN architecture comprising a stack of convolutions followed by a corresponding stack of mirrored deconvolutions, with skip connections in between to preserve local information. The limited dataset (76 images and their ground truths) is subjected to real-time data augmentation during the training phase to increase its effective size. Classification performance is evaluated using precision, recall, accuracy, F1 score, and Brier score parameters. The performance is compared with a support vector machine (SVM) classifier, a one-dimensional convolutional neural network (1D-CNN) model, and a standard two-dimensional CNN (2D-CNN). The UFCN model outperforms the SVM, 1D-CNN, and 2D-CNN models across all the performance parameters. Further, the prediction time of the proposed UFCN model is comparable with that of the SVM, 1D-CNN, and 2D-CNN models.
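
    The toy model below illustrates the U-shaped encoder-decoder idea with a single skip connection, written in PyTorch as an assumption (the paper does not state the framework); the channel counts and depth are placeholders and do not reproduce the published UFCN.

```python
# Minimal encoder-decoder with a skip connection, in the spirit of a U-shaped FCN.
import torch
import torch.nn as nn

class TinyUFCN(nn.Module):
    def __init__(self, in_ch=3, n_classes=2):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)   # mirrored deconvolution
        self.dec1 = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, n_classes, 1)             # dense per-pixel scores

    def forward(self, x):
        e1 = self.enc1(x)                  # full-resolution features
        e2 = self.enc2(self.pool(e1))      # downsampled features
        d1 = self.up(e2)                   # upsample back to full resolution
        d1 = self.dec1(torch.cat([d1, e1], dim=1))   # skip connection preserves detail
        return self.head(d1)               # per-pixel class logits (road / background)

logits = TinyUFCN()(torch.randn(1, 3, 128, 128))
print(logits.shape)   # torch.Size([1, 2, 128, 128])
```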

  5. Java-Library for the Access, Storage and Editing of Calibration Metadata of Optical Sensors

    NASA Astrophysics Data System (ADS)

    Firlej, M.; Kresse, W.

    2016-06-01

    The standardization of the calibration of optical sensors in photogrammetry and remote sensing has been discussed for more than a decade. Projects of the German DGPF and the European EuroSDR led to the abstract International Technical Specification ISO/TS 19159-1:2014 "Calibration and validation of remote sensing imagery sensors and data - Part 1: Optical sensors". This article presents the first software interface for a read- and write-access to all metadata elements standardized in the ISO/TS 19159-1. This interface is based on an xml-schema that was automatically derived by ShapeChange from the UML-model of the Specification. The software interface serves two cases. First, the more than 300 standardized metadata elements are stored individually according to the xml-schema. Secondly, the camera manufacturers are using many administrative data that are not a part of the ISO/TS 19159-1. The new software interface provides a mechanism for input, storage, editing, and output of both types of data. Finally, an output channel towards a usual calibration protocol is provided. The interface is written in Java. The article also addresses observations made when analysing the ISO/TS 19159-1 and compiles a list of proposals for maturing the document, i.e. for an updated version of the Specification.

  6. Fast Occlusion and Shadow Detection for High Resolution Remote Sensing Image Combined with LIDAR Point Cloud

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, X.

    2012-08-01

    The orthophoto is an important component of a GIS database and has been applied in many fields. However, occlusion and shadow cause the loss of feature information, which greatly affects image quality. One of the critical steps in true orthophoto generation is therefore the detection of occlusion and shadow. Nowadays LiDAR can obtain the digital surface model (DSM) directly, and combined with this technology, image occlusion and shadow can be detected automatically. In this paper, the Z-Buffer is applied for occlusion detection. Shadow detection can be regarded as the same problem as occlusion detection, using the sun direction in place of the camera. However, the Z-Buffer algorithm is computationally expensive, and the volume of scanned data and remote sensing images is very large, so an efficient algorithm is another challenge. A modern graphics processing unit (GPU) is much more powerful than a central processing unit (CPU); we use this technology to speed up the Z-Buffer algorithm and obtain a 7-fold increase in speed compared with the CPU. The experimental results demonstrate that the Z-Buffer algorithm performs well in occlusion and shadow detection when combined with a high density point cloud, and that the GPU can speed up the computation significantly.
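
    A simplified CPU sketch of the Z-Buffer occlusion test is given below: DSM cells projected to the same image pixel compete on depth, and cells farther than the stored minimum are flagged as occluded. The projection inputs are placeholders, and the GPU acceleration discussed in the paper is not shown.

```python
# Simplified Z-Buffer occlusion test on pre-projected DSM cells (CPU/NumPy sketch).
import numpy as np

def zbuffer_occlusion(px, py, depth, width, height):
    """px, py: integer image coordinates of DSM cells; depth: distance to camera."""
    zbuf = np.full((height, width), np.inf)
    # First pass: keep the minimum depth seen at each image pixel.
    np.minimum.at(zbuf, (py, px), depth)
    # Second pass: a cell is occluded if something closer projects to the same pixel.
    occluded = depth > zbuf[py, px] + 1e-6
    return occluded

# Toy example: two cells project to the same pixel; the farther one is occluded.
px = np.array([10, 10]); py = np.array([20, 20]); depth = np.array([5.0, 9.0])
print(zbuffer_occlusion(px, py, depth, width=64, height=64))   # [False  True]
```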

  7. A Review of Oil Spill Remote Sensing.

    PubMed

    Fingas, Merv; Brown, Carl E

    2017-12-30

    The technical aspects of oil spill remote sensing are examined and the practical uses and drawbacks of each technology are given with a focus on unfolding technology. The use of visible techniques is ubiquitous, but limited to certain observational conditions and simple applications. Infrared cameras offer some potential as oil spill sensors but have several limitations. Both techniques, although limited in capability, are widely used because of their increasing economy. The laser fluorosensor uniquely detects oil on substrates that include shoreline, water, soil, plants, ice, and snow. New commercial units have come out in the last few years. Radar detects calm areas on water and thus oil on water, because oil will reduce capillary waves on a water surface given moderate winds. Radar provides a unique option for wide area surveillance, all day or night and rainy/cloudy weather. Satellite-carried radars with their frequent overpass and high spatial resolution make these day-night and all-weather sensors essential for delineating both large spills and monitoring ship and platform oil discharges. Most strategic oil spill mapping is now being carried out using radar. Slick thickness measurements have been sought for many years. The operative technique at this time is the passive microwave. New techniques for calibration and verification have made these instruments more reliable.

  8. Analyzing soil erosion using a multi-temporal UAV data set after one year of active agriculture in Navarra, Spain

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Keesstra, Saskia; Masselink, Rens

    2014-05-01

    Unmanned Aerial Systems (UAS) are becoming popular tools in the geosciences due to improving technology and processing/analysis techniques. They can potentially fill the gap between spaceborne or manned-aircraft remote sensing and terrestrial remote sensing, both in terms of spatial and temporal resolution. In this study we analyze a multi-temporal data set that was acquired with a fixed-wing UAS in an agricultural catchment (2 sq. km) in Navarra, Spain. The goal of this study is to register soil erosion activity after one year of agricultural activity. The aircraft was equipped with a Panasonic GX1 16 MP pocket camera with a 20 mm lens to capture standard JPEG RGB images. The data set consisted of two sets of imagery acquired at the end of February 2013 and 2014, after harvesting. The raw images were processed using Agisoft Photoscan Pro, which includes structure-from-motion (SfM) and multi-view stereopsis (MVS) algorithms, producing digital surface models and orthophotos for both data sets. A discussion is presented that focuses on the suitability of multi-temporal UAS data and SfM/MVS processing for quantifying soil loss, mapping the distribution of eroded materials, and analyzing the re-occurrence of rill patterns after plowing.

  9. Experimental Sea Slicks in the Marsen (Maritime Remote Sensing) Exercise.

    DTIC Science & Technology

    1980-10-30

    Experimental slicks with various surface properties were generated in the North Sea as part of the MARSEN (Maritime Remote Sensing ) exercise. The one...with remote sensing instrumentation. Because of the numerous effects of surface films on air-sea interfacial processes, these experiments were designed...information was obtained on the influence of sea surface films on the interpretation of signals received by remote sensing systems. Criteria for the

  10. SYMPOSIUM ON REMOTE SENSING IN THE POLAR REGIONS

    DTIC Science & Technology

    The Arctic Institute of North America long has been interested in encouraging full and specific attention to applications of remote sensing to polar...research problems. The major purpose of the symposium was to acquaint scientists and technicians concerned with remote sensing with some of the...special problems of the polar areas and, in turn, to acquaint polar scientists with the potential of the use of remote sensing. The Symposium therefore was

  11. Methods of Determining Playa Surface Conditions Using Remote Sensing

    DTIC Science & Technology

    1987-10-08

    Methods of Determining Playa Surface Conditions Using Remote Sensing. J. Ponder Henley, U.S. Army Engineer Topographic Laboratories, Fort Belvoir, Virginia 22060-5546. ...geochemistry, hydrology and remote sensing, but all of these are important to the understanding of these unique geomorphic features. There is a large body

  12. Needs Assessment for the Use of NASA Remote Sensing Data in the Development and Implementation of Estuarine and Coastal Water Quality Standards

    NASA Technical Reports Server (NTRS)

    Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake

    2010-01-01

    The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.

  13. Commercial use of remote sensing in agriculture: a case study

    NASA Astrophysics Data System (ADS)

    Gnauck, Gary E.

    1999-12-01

    Over 25 years of research have clearly shown that an analysis of remote sensing imagery can provide information on agricultural crops. Most of this research has been funded by and directed toward the needs of government agencies. Commercial use of agricultural remote sensing has been limited to very small-scale operations supplying remote sensing services to a few selected customers. Datron/Transco Inc. undertook an internally funded remote sensing program directed toward the California cash crop industry (strawberries, lettuce, tomatoes, other fresh vegetables, and cotton). The objectives of this program were twofold: (1) to assess the need and readiness of agricultural land managers to adopt remote sensing as a management tool, and (2) to determine what technical barriers exist to large-scale implementation of this technology on a commercial basis. The program was divided into three phases: Planning, Engineering Test and Evaluation, and Commercial Operations. Findings: Remote sensing technology can deliver high resolution multispectral imagery with rapid turnaround that can provide information on crop stress, insects, disease, and various soil parameters. The limiting factors to the use of remote sensing in agriculture are a lack of familiarization by the land managers, difficulty in translating 'information' into increased revenue or reduced cost for the land manager, and the large economies of scale needed to make the venture commercially viable.

  14. 15 CFR 960.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... remote sensing satellite industry. (Available from NOAA, National Environmental Satellite Data and... LICENSING OF PRIVATE REMOTE SENSING SYSTEMS General § 960.1 Purpose. (a) The regulations in this part set... sensing space system under Title II of the Land Remote Sensing Policy Act of 1992 (15 U.S.C. 5601 et seq...

  15. 15 CFR 960.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... remote sensing satellite industry. (Available from NOAA, National Environmental Satellite Data and... LICENSING OF PRIVATE REMOTE SENSING SYSTEMS General § 960.1 Purpose. (a) The regulations in this part set... sensing space system under Title II of the Land Remote Sensing Policy Act of 1992 (15 U.S.C. 5601 et seq...

  16. 15 CFR 960.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... remote sensing satellite industry. (Available from NOAA, National Environmental Satellite Data and... LICENSING OF PRIVATE REMOTE SENSING SYSTEMS General § 960.1 Purpose. (a) The regulations in this part set... sensing space system under Title II of the Land Remote Sensing Policy Act of 1992 (15 U.S.C. 5601 et seq...

  17. 15 CFR 960.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... remote sensing satellite industry. (Available from NOAA, National Environmental Satellite Data and... LICENSING OF PRIVATE REMOTE SENSING SYSTEMS General § 960.1 Purpose. (a) The regulations in this part set... sensing space system under Title II of the Land Remote Sensing Policy Act of 1992 (15 U.S.C. 5601 et seq...

  18. 15 CFR 960.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... LICENSING OF PRIVATE REMOTE SENSING SYSTEMS General § 960.1 Purpose. (a) The regulations in this part set... sensing space system under Title II of the Land Remote Sensing Policy Act of 1992 (15 U.S.C. 5601 et seq... remote sensing satellite industry. (Available from NOAA, National Environmental Satellite Data and...

  19. Advanced Remote Sensing Research

    USGS Publications Warehouse

    Slonecker, Terrence; Jones, John W.; Price, Susan D.; Hogan, Dianna

    2008-01-01

    'Remote sensing' is a generic term for monitoring techniques that collect information without being in physical contact with the object of study. Overhead imagery from aircraft and satellite sensors provides the most common form of remotely sensed data and records the interaction of electromagnetic energy (usually visible light) with matter, such as the Earth's surface. Remotely sensed data are fundamental to geographic science. The Eastern Geographic Science Center (EGSC) of the U.S. Geological Survey (USGS) is currently conducting and promoting the research and development of three different aspects of remote sensing science: spectral analysis, automated orthorectification of historical imagery, and long wave infrared (LWIR) polarimetric imagery (PI).

  20. Remote sensing in the coastal and marine environment. Proceedings of the US North Atlantic Regional Workshop

    NASA Technical Reports Server (NTRS)

    Zaitzeff, J. B. (Editor); Cornillon, P. (Editor); Aubrey, D. A. (Editor)

    1980-01-01

    Presentations were grouped in the following categories: (1) a technical orientation of Earth resources remote sensing including data sources and processing; (2) a review of the present status of remote sensing technology applicable to the coastal and marine environment; (3) a description of data and information needs of selected coastal and marine activities; and (4) an outline of plans for marine monitoring systems for the east coast and a concept for an east coast remote sensing facility. Also discussed were user needs and remote sensing potentials in the areas of coastal processes and management, commercial and recreational fisheries, and marine physical processes.

  1. Remote sensing of Earth terrain

    NASA Technical Reports Server (NTRS)

    Kong, J. A.

    1992-01-01

    Research findings are summarized for projects dealing with the following: application of theoretical models to active and passive remote sensing of saline ice; radiative transfer theory for polarimetric remote sensing of pine forest; scattering of electromagnetic waves from a dense medium consisting of correlated Mie scatterers with size distribution and applications to dry snow; variance of phase fluctuations of waves propagating through a random medium; theoretical modeling for passive microwave remote sensing of earth terrain; polarimetric signatures of a canopy of dielectric cylinders based on first and second order vector radiative transfer theory; branching model for vegetation; polarimetric passive remote sensing of periodic surfaces; composite volume and surface scattering model; and radar image classification.

  2. Application of remote sensing to state and regional problems. [for Mississippi

    NASA Technical Reports Server (NTRS)

    Miller, W. F.; Bouchillon, C. W.; Harris, J. C.; Carter, B.; Whisler, F. D.; Robinette, R.

    1974-01-01

    The primary purpose of the remote sensing applications program is for various members of the university community to participate in activities that improve the effective communication between the scientific community engaged in remote sensing research and development and the potential users of modern remote sensing technology. Activities of this program are assisting the State of Mississippi in recognizing and solving its environmental, resource and socio-economic problems through inventory, analysis, and monitoring by appropriate remote sensing systems. Objectives, accomplishments, and current status of the following individual projects are reported: (1) bark beetle project; (2) state park location planning; and (3) waste source location and stream channel geometry monitoring.

  3. Application of remote sensing to water resources problems

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1972-01-01

    The following conclusions were reached concerning the applications of remote sensing to water resources problems: (1) Remote sensing methods provide the most practical method of obtaining data for many water resources problems; (2) the multi-disciplinary approach is essential to the effective application of remote sensing to water resource problems; (3) there is a correlation between the amount of suspended solids in an effluent discharged into a water body and reflected energy; (4) remote sensing provides for more effective and accurate monitoring, discovery and characterization of the mixing zone of effluent discharged into a receiving water body; and (5) it is possible to differentiate between blue and blue-green algae.

  4. SUPERFUND REMOTE SENSING SUPPORT

    EPA Science Inventory

    This task provides remote sensing technical support to the Superfund program. Support includes the collection, processing, and analysis of remote sensing data to characterize hazardous waste disposal sites and their history. Image analysis reports, aerial photographs, and assoc...

  5. Remote Sensing and the Earth

    NASA Technical Reports Server (NTRS)

    Brosius, C. A.; Gervin, J. C.; Ragusa, J. M.

    1977-01-01

    A textbook on remote sensing, developed as part of the Skylab earth resources programs, is presented. The fundamentals of remote sensing and its application to agriculture, land use, geology, water and marine resources, and environmental monitoring are summarized.

  6. Operational Use of Remote Sensing within USDA

    NASA Technical Reports Server (NTRS)

    Bethel, Glenn R.

    2007-01-01

    A viewgraph presentation of remote sensing imagery within the USDA is shown. USDA Aerial Photography, Digital Sensors, Hurricane imagery, Remote Sensing Sources, Satellites used by Foreign Agricultural Service, Landsat Acquisitions, and Aerial Acquisitions are also shown.

  7. Investigation related to multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Erickson, J. D.

    1974-01-01

    A summary of technical progress made during a five year research program directed toward the development of operational information systems based on multispectral sensing and the use of these systems in earth-resource survey applications is presented. Efforts were undertaken during this program to: (1) improve the basic understanding of the many facets of multispectral remote sensing, (2) develop methods for improving the accuracy of information generated by remote sensing systems, (3) improve the efficiency of data processing and information extraction techniques to enhance the cost-effectiveness of remote sensing systems, (4) investigate additional problems having potential remote sensing solutions, and (5) apply the existing and developing technology for specific users and document and transfer that technology to the remote sensing community.

  8. The AOTF-Based NO2 Camera

    NASA Astrophysics Data System (ADS)

    Dekemper, E.; Fussen, D.; Vanhellemont, F.; Vanhamel, J.; Pieroux, D.; Berkenbosch, S.

    2017-12-01

    In an urban environment, nitrogen dioxide is emitted by a multitude of static and moving point sources (cars, industry, power plants, heating systems,…). Air quality models generally rely on a limited number of monitoring stations which neither capture the whole pattern nor allow for full validation. So far, there has been a lack of instruments capable of measuring NO2 fields with the necessary spatio-temporal resolution above major point sources (power plants) or more extended ones (cities). We have developed a new type of passive remote sensing instrument aiming at the measurement of 2-D distributions of NO2 slant column densities (SCDs) with high spatial (meters) and temporal (minutes) resolution. The measurement principle has some similarities with the popular filter-based SO2 camera (used in volcanic and industrial sulfur emissions monitoring) as it relies on spectral images taken at wavelengths where the molecule's absorption cross section differs. In contrast to the SO2 camera, however, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. A first prototype was successfully tested with the plume of a coal-fired power plant in Romania, revealing the dynamics of the formation of NO2 in the early plume. A lighter version of the NO2 camera is now being tested on other targets, such as oil refineries and urban air masses.
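
    For orientation only, the sketch below shows the two-wavelength differential-absorption arithmetic that the filter-based SO2 camera relies on and that the NO2 camera generalizes; the intensities and cross section are made-up numbers, and the AOTF instrument actually resolves the full spectral structure rather than two bands.

```python
# Two-band differential absorption: apparent optical depth divided by the differential
# cross section gives a slant column density. All values below are illustrative.
import numpy as np

i_on,  i0_on  = 0.92, 1.00   # normalized plume / background intensity, absorbing band
i_off, i0_off = 0.99, 1.00   # weakly absorbing reference band
delta_sigma = 2.0e-19        # differential cross section [cm^2/molecule], assumed

tau = -np.log(i_on / i0_on) + np.log(i_off / i0_off)   # differential optical depth
scd = tau / delta_sigma                                # slant column density
print(f"SCD ~ {scd:.2e} molecules/cm^2")
```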

  9. Scientific Objectives of Small Carry-on Impactor (SCI) and Deployable Camera 3 Digital (DCAM3-D): Observation of an Ejecta Curtain and a Crater Formed on the Surface of Ryugu by an Artificial High-Velocity Impact

    NASA Astrophysics Data System (ADS)

    Arakawa, M.; Wada, K.; Saiki, T.; Kadono, T.; Takagi, Y.; Shirai, K.; Okamoto, C.; Yano, H.; Hayakawa, M.; Nakazawa, S.; Hirata, N.; Kobayashi, M.; Michel, P.; Jutzi, M.; Imamura, H.; Ogawa, K.; Sakatani, N.; Iijima, Y.; Honda, R.; Ishibashi, K.; Hayakawa, H.; Sawada, H.

    2017-07-01

    The Small Carry-on Impactor (SCI) equipped on Hayabusa2 was developed to produce an artificial impact crater on the primitive Near-Earth Asteroid (NEA) 162173 Ryugu (Ryugu) in order to explore the asteroid subsurface material unaffected by space weathering and thermal alteration by solar radiation. An exposed fresh surface by the impactor and/or the ejecta deposit excavated from the crater will be observed by remote sensing instruments, and a subsurface fresh sample of the asteroid will be collected there. The SCI impact experiment will be observed by a Deployable CAMera 3-D (DCAM3-D) at a distance of ˜1 km from the impact point, and the time evolution of the ejecta curtain will be observed by this camera to confirm the impact point on the asteroid surface. As a result of the observation of the ejecta curtain by DCAM3-D and the crater morphology by onboard cameras, the subsurface structure and the physical properties of the constituting materials will be derived from crater scaling laws. Moreover, the SCI experiment on Ryugu gives us a precious opportunity to clarify effects of microgravity on the cratering process and to validate numerical simulations and models of the cratering process.

  10. Multi-sensor fusion over the World Trade Center disaster site

    NASA Astrophysics Data System (ADS)

    Rodarmel, Craig; Scott, Lawrence; Simerlink, Deborah A.; Walker, Jeffrey

    2002-09-01

    The immense size and scope of the rescue and clean-up of the World Trade Center site created a need for data that would provide a total overview of the disaster area. To fulfill this need, the New York State Office for Technology (NYSOFT) contracted with EarthData International to collect airborne remote sensing data over Ground Zero with an airborne light detection and ranging (LIDAR) sensor, a high-resolution digital camera, and a thermal camera. The LIDAR data provided a three-dimensional elevation model of the ground surface that was used for volumetric calculations and also in the orthorectification of the digital images. The digital camera provided high-resolution imagery over the site to aid the rescuers in placement of equipment and other assets. In addition, the digital imagery was used to georeference the thermal imagery and also provided the visual background for the thermal data. The thermal camera aided in the location and tracking of underground fires. The combination of data from these three sensors provided the emergency crews with a timely, accurate overview containing a wealth of information about the rapidly changing disaster site. Because of the dynamic nature of the site, the data was acquired on a daily basis, processed, and turned over to NYSOFT within twelve hours of the collection. During processing, the three datasets were combined and georeferenced to allow them to be inserted into the client's geographic information systems.
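
    As an aside on the volumetric calculations mentioned above, a gridded elevation model makes the arithmetic simple: the volume change between two surveys is the per-cell elevation difference summed over the grid and multiplied by the cell area. The grids and the 1 m cell size in the sketch below are assumptions for demonstration.

```python
# Volume change between two gridded DSMs (synthetic data, assumed 1 m cells).
import numpy as np

cell_area = 1.0 * 1.0                                    # square metres per DSM cell
dsm_day1 = np.random.uniform(0, 30, (500, 500))          # elevation grid, metres
dsm_day2 = dsm_day1 - np.random.uniform(0, 0.5, dsm_day1.shape)  # material removed

volume_change = np.nansum(dsm_day2 - dsm_day1) * cell_area
print(f"Net volume change: {volume_change:.0f} cubic metres")
```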

  11. An object-based storage model for distributed remote sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Zhanwu; Li, Zhongmin; Zheng, Sheng

    2006-10-01

    It is very difficult to design an integrated storage solution for distributed remote sensing images that offers high performance network storage services and secure cross-platform data sharing using current network storage models such as direct attached storage, network attached storage, and storage area networks. Object-based storage, a new generation of network storage technology that has emerged recently, separates the data path, the control path, and the management path, which solves the metadata bottleneck of traditional storage models, and it has the characteristics of parallel data access, cross-platform data sharing, intelligent storage devices, and secure data access. We apply object-based storage to the storage management of remote sensing images and construct an object-based storage model for distributed remote sensing images. In this storage model, remote sensing images are organized as remote sensing objects stored in object-based storage devices. Based on the storage model, we present the architecture of a distributed remote sensing image application system built on object-based storage and give some test results comparing the write performance of the traditional network storage model and the object-based storage model.

  12. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features.

    PubMed

    Li, Linyi; Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. And a fuzzy classification method using visual attention features (FC-VAF) was developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the others according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images.

  13. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features

    PubMed Central

    Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. And a fuzzy classification method using visual attention features (FC-VAF) was developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the others according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images. PMID:28761440

  14. Remote measurement of surface-water velocity using infrared videography and PIV: a proof-of-concept for Alaskan rivers

    USGS Publications Warehouse

    Kinzel, Paul J.; Legleiter, Carl; Nelson, Jonathan M.; Conaway, Jeffrey S.

    2017-01-01

    Thermal cameras with high sensitivity to medium and long wavelengths can resolve features at the surface of flowing water arising from turbulent mixing. Images acquired by these cameras can be processed with particle image velocimetry (PIV) to compute surface velocities based on the displacement of thermal features as they advect with the flow. We conducted a series of field measurements to test this methodology for remote sensing of surface velocities in rivers. We positioned an infrared video camera at multiple stations across bridges that spanned five rivers in Alaska. Simultaneous non-contact measurements of surface velocity were collected with a radar gun. In situ velocity profiles were collected with Acoustic Doppler Current Profilers (ADCP). Infrared image time series were collected at a frequency of 10 Hz for a one-minute duration at a number of stations spaced across each bridge. Commercial PIV software used a cross-correlation algorithm to calculate pixel displacements between successive frames, which were then scaled to produce surface velocities. A blanking distance below the ADCP prevents a direct measurement of the surface velocity. However, we estimated surface velocity from the ADCP measurements using a program that normalizes each ADCP transect and combines those normalized transects to compute a mean measurement profile. The program can fit a power law to the profile and in so doing provides a velocity index, the ratio between the depth-averaged and surface velocity. For the rivers in this study, the velocity index ranged from 0.82 to 0.92. Average radar and extrapolated ADCP surface velocities were in good agreement with average infrared PIV calculations.
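
    The core PIV step can be sketched for a single interrogation window as below: the displacement of a thermal patch between two frames is taken from the peak of their cross-correlation, scaled by pixel size and frame interval to a surface velocity, and multiplied by a velocity index to approximate the depth-averaged velocity. The window size, pixel scale, and 0.85 index are assumptions rather than values from the study.

```python
# Single-window PIV displacement via cross-correlation (illustrative sketch).
import numpy as np
from scipy.signal import correlate2d

def window_velocity(frame_a, frame_b, pixel_size_m, dt_s, velocity_index=0.85):
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = correlate2d(b, a, mode="full")              # cross-correlation surface
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dy = peak[0] - (a.shape[0] - 1)                    # displacement in pixels
    dx = peak[1] - (a.shape[1] - 1)
    surface_speed = np.hypot(dx, dy) * pixel_size_m / dt_s
    return surface_speed, surface_speed * velocity_index  # surface, depth-averaged

# Synthetic example: shift a random patch by 3 pixels between frames 0.1 s apart.
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=3, axis=1)
print(window_velocity(frame_a, frame_b, pixel_size_m=0.05, dt_s=0.1))
```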

  15. A Plane Target Detection Algorithm in Remote Sensing Images based on Deep Learning Network Technology

    NASA Astrophysics Data System (ADS)

    Shuxin, Li; Zhilong, Zhang; Biao, Li

    2018-01-01

    Planes are an important target category in remote sensing, and it is of great value to detect plane targets automatically. As remote imaging technology develops continuously, the resolution of remote sensing images has become very high, so more detailed information is available for detecting remote sensing targets automatically. Deep learning is the most advanced technology in image target detection and recognition and has provided great performance improvements for target detection and recognition in everyday scenes. We applied this technology to remote sensing target detection and proposed an algorithm based on an end-to-end deep network, which can learn from remote sensing images to detect the targets in new images automatically and robustly. Our experiments show that the algorithm can capture the feature information of the plane target and performs better in target detection than the older methods.

  16. DARLA: Data Assimilation and Remote Sensing for Littoral Applications

    NASA Astrophysics Data System (ADS)

    Jessup, A.; Holman, R. A.; Chickadel, C.; Elgar, S.; Farquharson, G.; Haller, M. C.; Kurapov, A. L.; Özkan-Haller, H. T.; Raubenheimer, B.; Thomson, J. M.

    2012-12-01

    DARLA is a 5-year collaborative project that couples state-of-the-art remote sensing and in situ measurements with advanced data assimilation (DA) modeling to (a) evaluate and improve remote sensing retrieval algorithms for environmental parameters, (b) determine the extent to which remote sensing data can be used in place of in situ data in models, and (c) infer bathymetry for littoral environments by combining remotely-sensed parameters and data assimilation models. The project uses microwave, electro-optical, and infrared techniques to characterize the littoral ocean, with a focus on the wave and current parameters required for DA modeling. In conjunction with the RIVET (River and Inlets) Project, extensive in situ measurements provide ground truth for both the remote sensing retrieval algorithms and the DA modeling. Our goal is to use remote sensing to constrain data assimilation models of wave and circulation dynamics in a tidal inlet and the surrounding beaches. We seek to improve environmental parameter estimation via remote sensing fusion, determine the success of using remote sensing data to drive DA models, and produce a dynamically consistent representation of the wave, circulation, and bathymetry fields in complex environments. The objectives are to test the following three hypotheses: 1. Environmental parameter estimation using remote sensing techniques can be significantly improved by fusion of multiple sensor products. 2. Data assimilation models can be adequately constrained (i.e., forced or guided) with environmental parameters derived from remote sensing measurements. 3. Bathymetry on open beaches, at river mouths, and at tidal inlets can be inferred from a combination of remotely-sensed parameters and data assimilation models. Our approach is to conduct a series of field experiments combining remote sensing and in situ measurements to investigate signature physics and to gather data for developing and testing DA models. A preliminary experiment conducted at the Field Research Facility at Duck, NC, in September 2010 focused on the assimilation of tower-based electro-optical, infrared, and radar measurements in predictions of longshore currents. Here we provide an overview of our contribution to the RIVET I experiment at New River Inlet, NC, in May 2012. During the course of the 3-week measurement period, continuous tower-based remote sensing measurements were made using electro-optical, infrared, and radar techniques covering the nearshore zone and the inlet mouth. A total of 50 hours of airborne measurements were made using high-resolution infrared imagers and a customized along-track interferometric synthetic aperture radar (ATI SAR). The airborne IR imagery provides kilometer-scale mapping of frontal features that evolve as the inlet flow interacts with the oceanic wave and current fields. The ATI SAR provides maps of the two-dimensional surface currents. Near-surface measurements of turbulent velocities and surface waves using SWIFT drifters, designed to measure near-surface properties relevant to remote sensing, complemented the extensive in situ measurements by RIVET investigators.

  17. Reflections on current and future applications of multiangle imaging to aerosol and cloud remote sensing

    NASA Astrophysics Data System (ADS)

    Diner, David

    2010-05-01

    The Multi-angle Imaging SpectroRadiometer (MISR) instrument has been collecting global Earth data from NASA's Terra satellite since February 2000. With its 9 along-track view angles, 4 spectral bands, intrinsic spatial resolution of 275 m, and stable radiometric and geometric calibration, MISR combines attributes that no previous spaceborne instrument has offered, nor is a similar capability currently available on any other satellite platform. Multiangle imaging offers several tools for remote sensing of aerosol and cloud properties, including bidirectional reflectance and scattering measurements, stereoscopic pattern matching, time lapse sequencing, and potentially, optical tomography. Current data products from MISR employ several of these techniques. Observations of the intensity of scattered light as a function of view angle and wavelength provide accurate measures of aerosol optical depths (AOD) over land, including bright desert and urban source regions. Partitioning of AOD according to retrieved particle classification and incorporation of height information improves the relationship between AOD and surface PM2.5 (fine particulate matter, a regulated air pollutant), constituting an important step toward a satellite-based particulate pollution monitoring system. Stereoscopic cloud-top heights provide a unique metric for detecting interannual variability of clouds and exceptionally high quality and sensitivity for detection and height retrieval for low-level clouds. Using the several-minute time interval between camera views, MISR has enabled a pole-to-pole, height-resolved atmospheric wind measurement system. Stereo imagery also makes possible global measurement of the injection heights and advection speeds of smoke plumes, volcanic plumes, and dust clouds, for which a large database is now available. To build upon what has been learned during the first decade of MISR observations, we are evaluating algorithm updates that not only refine retrieval accuracies but also include enhancements (e.g., finer spatial resolution) that would have been computationally prohibitive just ten years ago. In addition, we are developing technological building blocks for future sensors that enable broader spectral coverage, wider swath, and incorporation of high-accuracy polarimetric imaging. Prototype cameras incorporating photoelastic modulators have been constructed. To fully capitalize on the rich information content of current and next-generation multiangle imagers, several algorithmic paradigms currently employed need to be re-examined, e.g., the use of aerosol look-up tables, neglect of 3-D effects, and binary partitioning of the atmosphere into "cloudy" or "clear" designations. Examples of progress in algorithm and technology developments geared toward advanced application of multiangle imaging to remote sensing of aerosols and clouds will be presented.
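
    The stereoscopic height and wind products mentioned above rest on simple along-track geometry: with no advection, a feature at height h is displaced by h*tan(theta) in a view at angle theta, so the parallax between two view angles yields the height, and the residual displacement over the several-minute interval between camera views yields an advection speed. The sketch below illustrates only this simplified geometry (the operational MISR retrieval solves for height and wind jointly); the example numbers are ours.

      # Illustrative geometry only; a simplification of the joint
      # height/wind retrieval used operationally.
      import math

      def height_from_parallax(parallax_m, theta1_deg, theta2_deg):
          """Cloud-top height from along-track parallax between two view angles."""
          return parallax_m / (math.tan(math.radians(theta1_deg)) -
                               math.tan(math.radians(theta2_deg)))

      def advection_speed(displacement_m, dt_s):
          """Apparent feature speed from displacement between time-lapsed views."""
          return displacement_m / dt_s

      # Example: 2200 m of parallax between 45.6 and 26.1 degree forward views
      # corresponds to a cloud top near 4 km; a 900 m residual shift over a
      # 3-minute interval corresponds to about 5 m/s.
      print(round(height_from_parallax(2200, 45.6, 26.1)),
            round(advection_speed(900, 180), 1))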

  18. A remote sensing and GIS-enabled asset management system (RS-GAMS).

    DOT National Transportation Integrated Search

    2013-04-01

    Under U.S. Department of Transportation (DOT) Commercial Remote Sensing and Spatial Information (CRS&SI) Technology Initiative 2 of the Transportation Infrastructure Construction and Condition Assessment, an intelligent Remote Sensing and GIS-b...

  19. Remote Sensing.

    ERIC Educational Resources Information Center

    Williams, Richard S., Jr.; Southworth, C. Scott

    1983-01-01

    The Landsat Program became the major event of 1982 in geological remote sensing with the successful launch of Landsat 4. Other 1982 remote sensing accomplishments, research, publications, (including a set of Landsat worldwide reference system index maps), and conferences are highlighted. (JN)

  20. Remote sensing utility in a disaster struck urban environment

    NASA Technical Reports Server (NTRS)

    Rush, M.; Holguin, A.; Vernon, S.

    1974-01-01

    A project to determine the ways in which remote sensing can contribute to solutions of urban public health problems in time of natural disaster is discussed. The objectives of the project are to determine and describe remote sensing standard operating procedures for public health assistance during disaster relief operations which will aid the agencies and organizations involved in disaster intervention. Proposed tests to determine the validity of the remote sensing system are reported.

  1. Removal of Surface-Reflected Light for the Measurement of Remote-Sensing Reflectance from an Above-Surface Platform

    DTIC Science & Technology

    2010-12-06

    raw data). To remove surface-reflected light in field measurements of remote sensing reflectance, a spectral optimization approach was applied, with...results compared with those from remote-sensing models and from direct measurements. The agreement from different determinations suggests that...reasonable results for remote sensing reflectance of clear blue water to turbid brown water are obtainable from above-surface measurements, even under conditions of high waves.
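
    For context, above-surface retrievals of remote-sensing reflectance typically take the form Rrs = (Lt - rho*Lsky)/Es, where Lt is the measured total radiance, Lsky is sky radiance, Es is downwelling irradiance, and rho is an effective surface reflectance factor. The sketch below is a minimal illustration in which rho is chosen to drive near-infrared Rrs toward zero; the brute-force search and the NIR null criterion are our assumptions, not necessarily the spectral optimization used in the report, and the NIR assumption breaks down for turbid water.

      # Minimal sketch of above-water Rrs retrieval with surface-reflected
      # light removed: Rrs = (Lt - rho * Lsky) / Es.
      import numpy as np

      def retrieve_rrs(Lt, Lsky, Es, wavelengths, rho_grid=None):
          """Pick the rho that minimizes mean |Rrs| in the 800-900 nm band."""
          Lt, Lsky, Es, wavelengths = map(np.asarray, (Lt, Lsky, Es, wavelengths))
          if rho_grid is None:
              rho_grid = np.linspace(0.02, 0.05, 301)   # typical rho range
          nir = (wavelengths >= 800) & (wavelengths <= 900)
          best_rho, best_cost, best_rrs = None, np.inf, None
          for rho in rho_grid:
              rrs = (Lt - rho * Lsky) / Es              # sr^-1
              cost = np.mean(np.abs(rrs[nir]))
              if cost < best_cost:
                  best_rho, best_cost, best_rrs = rho, cost, rrs
          return best_rrs, best_rho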

  2. Bibliography of Remote Sensing Techniques Used in Wetland Research

    DTIC Science & Technology

    1993-01-01

    ...is investigating the application of remote sensing technology for detecting changes in wetland environments. This report documents a bibliographic...search conducted as part of that work unit on applications of remote sensing techniques in wetland research. Results were used to guide research...efforts on the use of remote sensing technology for wetland change detection and assessment. The citations are presented in three appendixes, organized by wetland type, sensor type, and author.

  3. Use of Openly Available Satellite Images for Remote Sensing Education

    NASA Astrophysics Data System (ADS)

    Wang, C.-K.

    2011-09-01

    With the advent of Google Earth, Google Maps, and Microsoft Bing Maps, high-resolution satellite imagery is becoming more easily accessible than ever. College students often already have considerable experience with high-resolution satellite imagery through these software and web services prior to any formal remote sensing education. Remote sensing education should therefore be adjusted to the fact that the audience are already consumers of remote sensing products (through the use of the above-mentioned services). This paper reports the use of openly available satellite imagery as a term project in an introductory-level remote sensing course in the Department of Geomatics of National Cheng Kung University. Experience from the fall terms of 2009 and 2010 shows that this term project effectively aroused the students' enthusiasm toward remote sensing.

  4. Strategies for using remotely sensed data in hydrologic models

    NASA Technical Reports Server (NTRS)

    Peck, E. L.; Keefer, T. N.; Johnson, E. R. (Principal Investigator)

    1981-01-01

    Present and planned remote sensing capabilities were evaluated. The usefulness of six remote sensing capabilities (soil moisture, land cover, impervious area, areal extent of snow cover, areal extent of frozen ground, and water equivalent of the snow cover) with seven hydrologic models (API, CREAMS, NWSRFS, STORM, STANFORD, SSARR, and NWSRFS Snowmelt) was reviewed. The results indicate that remote sensing information has only limited value for use with the hydrologic models in their present form; with minor modifications to the models, its usefulness would be enhanced. Specific recommendations are made for incorporating snow-covered area measurements in the NWSRFS Snowmelt model. Recommendations are also made for incorporating soil moisture measurements in NWSRFS. Suggestions are made for incorporating snow-covered area, soil moisture, and other capabilities in STORM and SSARR. General characteristics of a hydrologic model needed to make maximum use of remotely sensed data are discussed. Suggested goals for improvements in remote sensing for use in models are also established.

  5. A preliminary study of the statistical analyses and sampling strategies associated with the integration of remote sensing capabilities into the current agricultural crop forecasting system

    NASA Technical Reports Server (NTRS)

    Sand, F.; Christie, R.

    1975-01-01

    Extending the crop survey application of remote sensing from small experimental regions to state and national levels requires that a sample of agricultural fields be chosen for remote sensing of crop acreage, and that a statistical estimate be formulated with measurable characteristics. The critical requirements for the success of the application are reviewed in this report. The problem of sampling in the presence of cloud cover is discussed. Integration of remotely sensed information about crops into current agricultural crop forecasting systems is treated on the basis of the USDA multiple frame survey concepts, with an assumed addition of a new frame derived from remote sensing. Evolution of a crop forecasting system which utilizes LANDSAT and future remote sensing systems is projected for the 1975-1990 time frame.

  6. Archimedean Witness: The Application of Remote Sensing as an Aid to Human Rights Prosecutions

    NASA Astrophysics Data System (ADS)

    Walker, James Robin

    The 21st century has seen a significant increase in the use of remote sensing technology in the international human rights arena for the purposes of documenting crimes against humanity. The nexus between remote sensing, human rights activism, and international criminal prosecutions sits at a significant crossroads within geographic thought, calling attention to the epistemological and geopolitical implications that stem from the "view from nowhere" afforded by satellite imagery. Therefore, this thesis is divided into three sections. The first looks at the geographical questions raised by the expansion of remote sensing use in the context of international activism. The second explores the complications inherent in the presentation of remote sensing data as evidence of war crimes. Building upon the first two, the third section is a case study in alternate forms of analysis, aimed at expanding the utility of remote sensing data in international criminal prosecutions.

  7. [Small unmanned aerial vehicles for low-altitude remote sensing and its application progress in ecology].

    PubMed

    Sun, Zhong Yu; Chen, Yan Qiao; Yang, Long; Tang, Guang Liang; Yuan, Shao Xiong; Lin, Zhi Wen

    2017-02-01

    Low-altitude unmanned aerial vehicle (UAV) remote sensing systems overcome the deficiencies of spaceborne and airborne remote sensing systems in resolution, revisit period, cloud cover, and cost, providing a novel method for ecological research at the mesoscale. This study introduced the composition of a UAV remote sensing system and reviewed its applications in species, population, community, and ecosystem ecology research. Challenges and opportunities of UAV ecology were identified to direct future research. Promising research areas for UAV ecology include the establishment of species morphology and spectral characteristic databases, automatic species identification, the relationships between spectral indices and plant physiological processes, three-dimensional monitoring of ecosystems, and the integration of remote sensing data from multiple sources and scales. With the development of UAV platforms, data transformation, and sensors, UAV remote sensing technology will have wide application in ecological research.

  8. Airport Remote Tower Sensor Systems

    NASA Technical Reports Server (NTRS)

    Papasin, Richard; Gawdiak, Yuri; Maluf, David A.; Leidich, Christopher; Tran, Peter B.

    2001-01-01

    Remote Tower Sensor Systems (RTSS) are proof-of-concept prototypes being developed by NASA/Ames Research Center (NASA/ARC) in collaboration with the FAA (Federal Aviation Administration) and NOAA (National Oceanic and Atmospheric Administration). RTSS began with the deployment of an Airport Approach Zone Camera System that includes real-time weather observations at San Francisco International Airport. The goal of this research is to develop, deploy, and demonstrate remotely operated cameras and sensors at several major airport hubs and un-towered airports. RTSS can provide real-time weather observations of the airport approach zone. RTSS will integrate and test airport sensor packages that will allow remote access to real-time airport conditions and aircraft status.

  9. Estimating evaporation with thermal UAV data and two-source energy balance models

    NASA Astrophysics Data System (ADS)

    Hoffmann, H.; Nieto, H.; Jensen, R.; Guzinski, R.; Zarco-Tejada, P.; Friborg, T.

    2016-02-01

    Estimating evaporation is important when managing water resources and cultivating crops. Evaporation can be estimated using land surface heat flux models and remotely sensed land surface temperatures (LST), which have recently become obtainable in very high resolution using lightweight thermal cameras and Unmanned Aerial Vehicles (UAVs). In this study, a thermal camera was mounted on a UAV and applied to heat flux and hydrology studies by concatenating thermal images into mosaics of LST and using these as input to the two-source energy balance (TSEB) modelling scheme. Thermal images were obtained with a fixed-wing UAV overflying a barley field in western Denmark during the growing season of 2014, and a spatial resolution of 0.20 m is achieved in the final LST mosaics. Two models are used: the original TSEB model (TSEB-PT) and a dual-temperature-difference (DTD) model. In contrast to the TSEB-PT model, the DTD model accounts for the bias that is likely present in remotely sensed LST. TSEB-PT and DTD have already been well tested, but only during sunny weather conditions and with satellite images serving as thermal input. The aim of this study is to assess whether a lightweight thermal camera mounted on a UAV is able to provide data of sufficient quality to serve as model input and thus yield accurate surface energy fluxes at high spatial and temporal resolution, with special focus on latent heat flux (evaporation). Furthermore, this study evaluates the performance of the TSEB scheme during cloudy and overcast weather conditions, which is feasible because of the low UAV flying altitude, in contrast to satellite thermal data that are only available under clear-sky conditions. TSEB-PT and DTD fluxes are compared and validated against eddy covariance measurements; both sets of simulations are in good agreement with the measurements, with DTD obtaining the best results. The DTD model provides results comparable to studies estimating evaporation with similar experimental setups, but with LST retrieved from satellites instead of a UAV. Further, systematic irrigation patterns on the barley field provide confidence in the veracity of the spatially distributed evaporation revealed by model output maps. Lastly, this study outlines and discusses the thermal UAV image processing that results in mosaics suited for model input. This study shows that the UAV platform and the lightweight thermal camera provide high spatial and temporal resolution data valid for model input and for other potential applications requiring high-resolution and consistent LST.
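
    At the core of TSEB-style schemes, available energy is partitioned into sensible and latent heat, with the latent heat flux (evaporation) usually obtained as a residual, LE = Rn - G - H. The sketch below shows only this bookkeeping with a bulk sensible-heat term; the fixed soil-heat-flux fraction, the single-source H, and the example numbers are illustrative simplifications, not the TSEB-PT or DTD formulations used in the study.

      # Simplified illustration of the energy-balance residual: latent heat
      # flux is what remains of net radiation after soil and sensible heat.
      RHO_AIR = 1.2    # air density, kg/m^3
      CP_AIR = 1005.0  # specific heat of air at constant pressure, J/(kg K)

      def sensible_heat(T_surface_K, T_air_K, r_ah):
          """H = rho*cp*(Ts - Ta)/r_ah, with r_ah the aerodynamic resistance (s/m)."""
          return RHO_AIR * CP_AIR * (T_surface_K - T_air_K) / r_ah

      def latent_heat_residual(Rn, G, H):
          """LE = Rn - G - H, all fluxes in W/m^2."""
          return Rn - G - H

      # Example: Rn = 500 W/m^2, G taken as 10% of Rn, surface 2 K warmer than air.
      Rn = 500.0
      H = sensible_heat(302.0, 300.0, r_ah=40.0)     # about 60 W/m^2
      LE = latent_heat_residual(Rn, 0.1 * Rn, H)     # about 390 W/m^2
      print(round(H), round(LE))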

  10. NASA Tech Briefs, April 2012

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Topics include: Computational Ghost Imaging for Remote Sensing; Digital Architecture for a Trace Gas Sensor Platform; Dispersed Fringe Sensing Analysis - DFSA; Indium Tin Oxide Resistor-Based Nitric Oxide Microsensors; Gas Composition Sensing Using Carbon Nanotube Arrays; Sensor for Boundary Shear Stress in Fluid Flow; Model-Based Method for Sensor Validation; Qualification of Engineering Camera for Long-Duration Deep Space Missions; Remotely Powered Reconfigurable Receiver for Extreme Environment Sensing Platforms; Bump Bonding Using Metal-Coated Carbon Nanotubes; In Situ Mosaic Brightness Correction; Simplex GPS and InSAR Inversion Software; Virtual Machine Language 2.1; Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction; Pandora Operation and Analysis Software; Fabrication of a Cryogenic Bias Filter for Ultrasensitive Focal Plane; Processing of Nanosensors Using a Sacrificial Template Approach; High-Temperature Shape Memory Polymers; Modular Flooring System; Non-Toxic, Low-Freezing, Drop-In Replacement Heat Transfer Fluids; Materials That Enhance Efficiency and Radiation Resistance of Solar Cells; Low-Cost, Rugged High-Vacuum System; Static Gas-Charging Plug; Floating Oil-Spill Containment Device; Stemless Ball Valve; Improving Balance Function Using Low Levels of Electrical Stimulation of the Balance Organs; Oxygen-Methane Thruster; Lunar Navigation Determination System - LaNDS; Launch Method for Kites in Low-Wind or No-Wind Conditions; Supercritical CO2 Cleaning System for Planetary Protection and Contamination Control Applications; Design and Performance of a Wideband Radio Telescope; Finite Element Models for Electron Beam Freeform Fabrication Process; Autonomous Information Unit for Fine-Grain Data Access Control and Information Protection in a Net-Centric System; Vehicle Detection for RCTA/ANS (Autonomous Navigation System); Image Mapping and Visual Attention on the Sensory Ego-Sphere; HyDE Framework for Stochastic and Hybrid Model-Based Diagnosis; and IMAGESEER - IMAGEs for Education and Research.

  11. International Models and Methods of Remote Sensing Education and Training.

    ERIC Educational Resources Information Center

    Anderson, Paul S.

    A classification of remote sensing courses throughout the world, the world-wide need for sensing instruction, and alternative instructional methods for meeting those needs are discussed. Remote sensing involves aerial photointerpretation or the use of satellite and other non-photographic imagery; its focus is to interpret what is in the photograph…

  12. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite remotely sensed and ground-sensed data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.
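
    The correlation referred to above is commonly quantified with a simple regression of ground-monitor PM2.5 against satellite aerosol optical depth (AOD). The sketch below shows that calculation on synthetic numbers; the values, slope, and correlation are illustrative only and are not results from this project.

      # Illustrative only: relating satellite AOD to ground PM2.5 with a
      # least-squares fit. The numbers are synthetic stand-ins for
      # co-located satellite retrievals and ground monitor data.
      import numpy as np

      aod  = np.array([0.10, 0.15, 0.22, 0.30, 0.41, 0.55, 0.63])   # unitless
      pm25 = np.array([ 8.0, 11.0, 15.0, 22.0, 27.0, 38.0, 41.0])   # ug/m^3

      slope, intercept = np.polyfit(aod, pm25, 1)
      r = np.corrcoef(aod, pm25)[0, 1]
      print(f"PM2.5 ~ {slope:.1f}*AOD + {intercept:.1f}, r = {r:.2f}")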

  13. Theme section for 36th International Symposium for Remote Sensing of the Environment in Berlin

    NASA Astrophysics Data System (ADS)

    Trinder, John; Waske, Björn

    2016-09-01

    The International Symposium for Remote Sensing of the Environment (ISRSE) is the longest series of international conferences held on the topic of Remote Sensing, commencing in Ann Arbor, Michigan USA in 1962. While the name of the conference has changed over the years, it is regularly held approximately every 2 years and continues to be one of the leading international conferences on remote sensing. The latest of these conferences, the 36th ISRSE, was held in Berlin, Germany from 11 to 15 May 2015. All complete papers from the conference are available in the ISPRS International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences at http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-7-W3/index.html.

  14. THE REMOTE SENSING DATA GATEWAY

    EPA Science Inventory

    The EPA Remote Sensing Data Gateway (RSDG) is a pilot project in the National Exposure Research Laboratory (NERL) to develop a comprehensive data search, acquisition, delivery and archive mechanism for internal, national and international sources of remote sensing data for the co...

  15. A remote sensing and GIS-enabled asset management system (RS-GAMS) : phase 2.

    DOT National Transportation Integrated Search

    2014-04-01

    Under the U.S. Department of Transportation (DOT) Commercial Remote Sensing and Spatial Information (CRS&SI) Technology Initiative 2 of the Transportation Infrastructure Construction and Condition Assessment, an intelligent Remote Sensing and GIS...

  16. Remote sensing applications program

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The activities of the Mississippi Remote Sensing Center are described. In addition to technology transfer and information dissemination, remote sensing topics such as timber identification, water quality, flood prevention, land use, erosion control, animal habitats, and environmental impact studies are also discussed.

  17. Speech versus manual control of camera functions during a telerobotic task

    NASA Technical Reports Server (NTRS)

    Bierschwale, John M.; Sampaio, Carlos E.; Stuart, Mark A.; Smith, Randy L.

    1989-01-01

    Voice input for control of camera functions was investigated in this study. Objectives were to (1) assess the feasibility of a voice-commanded camera control system, and (2) identify factors that differ between voice and manual control of camera functions. Subjects participated in a remote manipulation task that required extensive camera-aided viewing. Each subject was exposed to two conditions, voice and manual input, with a counterbalanced administration order. Voice input was found to be significantly slower than manual input for this task. However, in terms of remote manipulator performance errors and subject preference, there was no difference between modalities. Voice control of continuous camera functions is not recommended. It is believed that the use of voice input for discrete functions, such as multiplexing or camera switching, could aid performance. Hybrid mixes of voice and manual input may provide the best use of both modalities. This report contributes to a better understanding of the issues that affect the design of an efficient human/telerobot interface.

  18. Remote Sensing Terminology in a Global and Knowledge-Based World

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana

    The paper is devoted to terminology issues related to all aspects of remote sensing research and applications. Terminology is the basis for a better understanding among people. It is crucial to keep up with the latest developments and novelties of terminology in advanced technology fields such as aerospace science and industry. This is especially true in remote sensing and geoinformatics, which develop rapidly and have ever-extending applications in various domains of science and human activity. Remote sensing terminology issues are directly relevant to contemporary worldwide policies on information accessibility, dissemination, and utilization of research results in support of solutions to global environmental challenges and sustainable development goals. Remote sensing and spatial information technologies are an integral part of international strategies for cooperation in scientific, research, and application areas, with a particular accent on environmental monitoring, ecological problems, natural resources management, climate modeling, weather forecasting, disaster mitigation, and the many other uses to which remote sensing data can be put. Remote sensing researchers, professionals, students, and decision makers of different countries and nationalities should be able to fully understand, interpret, and translate into their native language any term, definition, or acronym found in papers, books, proceedings, specifications, documentation, etc. The importance of the correct use, precise definition, and unification of remote sensing terms concerns not only people working in this field but also experts in a variety of disciplines who handle remote sensing data and information products. In this paper, we draw attention to the specifics, peculiarities, and current needs of compiling specialized dictionaries in the area of remote sensing, focusing on Earth observation and the integration of remote sensing with other geoinformation technologies such as photogrammetry, geodesy, and GIS. Our belief is that the elaboration of bilingual and multilingual dictionaries and glossaries in this expanding, technically advanced, and promising field of human expertise is of great practical importance. The work on an English-Bulgarian Dictionary of Remote Sensing Terms is described, including considerations on its scope, structure, information content, and selection of terms. The vision builds upon previous national and international experience and makes use of ongoing activities on the subject. Any interest in cooperation and in initiating such collaborative projects is welcome and highly appreciated.

  19. Adaptive strategies of remote systems operators exposed to perturbed camera-viewing conditions

    NASA Technical Reports Server (NTRS)

    Stuart, Mark A.; Manahan, Meera K.; Bierschwale, John M.; Sampaio, Carlos E.; Legendre, A. J.

    1991-01-01

    This report describes a preliminary investigation of the use of perturbed visual feedback during the performance of simulated space-based remote manipulation tasks. The primary objective of this NASA evaluation was to determine to what extent operators exhibit adaptive strategies which allow them to perform these specific types of remote manipulation tasks more efficiently while exposed to perturbed visual feedback. A secondary objective of this evaluation was to establish a set of preliminary guidelines for enhancing remote manipulation performance and reducing the adverse effects of perturbed visual feedback. These objectives were accomplished by studying the remote manipulator performance of test subjects exposed to various perturbed camera-viewing conditions while performing a simulated space-based remote manipulation task. Statistical analysis of performance and subjective data revealed that remote manipulation performance was adversely affected by the use of perturbed visual feedback, and that performance tended to improve with successive trials in most perturbed viewing conditions.

  20. Indicators of international remote sensing activities

    NASA Technical Reports Server (NTRS)

    Spann, G. W.

    1977-01-01

    The extent of worldwide remote sensing activities, including the use of satellite and high/medium altitude aircraft data was studied. Data were obtained from numerous individuals and organizations with international remote sensing responsibilities. Indicators were selected to evaluate the nature and scope of remote sensing activities in each country. These indicators ranged from attendance at remote sensing workshops and training courses to the establishment of earth resources satellite ground stations and plans for the launch of earth resources satellites. Results indicate that this technology constitutes a rapidly increasing component of environmental, land use, and natural resources investigations in many countries, and most of these countries rely on the LANDSAT satellites for a major portion of their data.
