Sun-view angle effects on reflectance factors of corn canopies
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Daughtry, C. S. T.; Biehl, L. L.; Bauer, M. E.
1985-01-01
The effects of sun and view angles on reflectance factors of corn (Zea mays L.) canopies ranging from the six-leaf stage to harvest maturity were studied at the Purdue University Agronomy Farm with a multiband radiometer. The two methods of acquiring spectral data, the truck system and the tower system, are described. The analysis of the spectral data is presented in three parts: solar angle effects on reflectance factors viewed at nadir; view angle effects on reflectance factors at a fixed sun angle; and the combined effects of sun and view angles on reflectance factors. The analysis revealed that for nadir-viewed reflectance factors there is a strong solar angle dependence in all spectral bands for canopies with low leaf area index. Reflectance factors observed at a fixed sun angle from different view azimuth angles showed that the position of the sensor relative to the sun is important in determining angular reflectance characteristics. When both sun and view angles vary, reflectance factors are maximized when the sensor view direction is towards the sun.
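Reflectance factors of the kind reported above are conventionally computed as the ratio of the canopy radiometer response to that of a calibrated reference panel viewed under the same illumination. A minimal sketch, with hypothetical readings and a hypothetical panel calibration factor:

```python
def reflectance_factor(target_dn, panel_dn, panel_cal=1.0):
    """Bidirectional reflectance factor: ratio of the target response to
    that of a level reference panel under the same illumination, scaled
    by the panel's known calibration factor."""
    return (target_dn / panel_dn) * panel_cal

# Hypothetical radiometer readings (digital numbers) in one band:
rf = reflectance_factor(target_dn=183.0, panel_dn=420.0, panel_cal=0.98)
print(round(rf, 3))  # 0.427
```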
Array Of Sensors Measures Broadband Radiation
NASA Technical Reports Server (NTRS)
Hoffman, James W.; Grush, Ronald G.
1994-01-01
Multiple broadband radiation sensors aimed at various portions of total field of view. All sensors mounted in supporting frame, serving as common heat sink and temperature reference. Each sensor includes heater winding and differential-temperature-sensing bridge circuit. Power in heater winding adjusted repeatedly in effort to balance bridge circuit. Intended to be used aboard satellite in orbit around Earth to measure total radiation emitted, at various viewing angles, by mosaic of "footprint" areas (each defined by its viewing angle) on surface of Earth. Modified versions of array useful for angle-resolved measurements of broadband radiation in laboratory and field settings on Earth.
BOREAS RSS-2 Level-1B ASAS Image Data: At-Sensor Radiance in BSQ Format
NASA Technical Reports Server (NTRS)
Russell, C.; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Dabney, P. W.; Kovalick, W.; Graham, D.; Bur, Michael; Irons, James R.; Tierney, M.
2000-01-01
The BOREAS RSS-2 team used the ASAS instrument, mounted on the NASA C-130 aircraft, to create at-sensor radiance images of various sites as a function of spectral wavelength, view geometry (combinations of view zenith angle, view azimuth angle, solar zenith angle, and solar azimuth angle), and altitude. The level-1b ASAS images of the BOREAS study areas were collected from April to September 1994 and March to July 1996.
Wide-angle vision for road views
NASA Astrophysics Data System (ADS)
Huang, F.; Fehrs, K.-K.; Hartmann, G.; Klette, R.
2013-03-01
The field-of-view of a wide-angle image is greater than (say) 90 degrees, and so contains more information than available in a standard image. A wide field-of-view is more advantageous than standard input for understanding the geometry of 3D scenes, and for estimating the poses of panoramic sensors within such scenes. Thus, wide-angle imaging sensors and methodologies are commonly used in various road-safety, street surveillance, street virtual touring, or street 3D modelling applications. The paper reviews related wide-angle vision technologies by focusing on mathematical issues rather than on hardware.
Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José
2015-06-04
In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor for estimating plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with the 45°, 90° and -45° measurements in one-year-old poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters.
The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required.
NASA Technical Reports Server (NTRS)
Valdez, P. F.; Donohoe, G. W.
1997-01-01
Statistical classification of remotely sensed images attempts to discriminate between surface cover types on the basis of the spectral response recorded by a sensor. It is well known that surfaces reflect incident radiation as a function of wavelength, producing a spectral signature specific to the material under investigation. Multispectral and hyperspectral sensors sample the spectral response over tens and even hundreds of wavelength bands to capture the variation of spectral response with wavelength. Classification algorithms then exploit these differences in spectral response to distinguish between materials of interest. Sensors of this type, however, collect detailed spectral information from one direction (usually nadir) and consequently do not consider the directional nature of reflectance potentially detectable at different sensor view angles. Improvements in sensor technology have resulted in remote sensing platforms capable of detecting reflected energy across wavelengths (spectral signatures) and from multiple view angles (angular signatures) in the fore and aft directions. Sensors of this type include the moderate resolution imaging spectroradiometer (MODIS), the multiangle imaging spectroradiometer (MISR), and the airborne solid-state array spectroradiometer (ASAS). A goal of this paper, then, is to explore the utility of Bidirectional Reflectance Distribution Function (BRDF) models in the selection of optimal view angles for the classification of remotely sensed images by employing a strategy of searching for the maximum difference between surface BRDFs. After a brief discussion of directional reflectance in Section 2, attention is directed to the Beard-Maxwell BRDF model and its use in predicting the bidirectional reflectance of a surface. The selection of optimal viewing angles is addressed in Section 3, followed by conclusions and future work in Section 4.
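The search strategy described above, picking the view angle that maximizes the difference between two surface BRDFs, can be sketched with simple stand-in BRDF curves (the two toy functions below are hypothetical illustrations, not the Beard-Maxwell model):

```python
import math

def brdf_a(theta_v):
    """Toy stand-in BRDF for surface A (hypothetical, not Beard-Maxwell)."""
    return 0.3 + 0.1 * math.cos(math.radians(theta_v))

def brdf_b(theta_v):
    """Toy stand-in BRDF for surface B (hypothetical)."""
    return 0.25 + 0.2 * math.cos(math.radians(theta_v)) ** 2

def best_view_angle(f, g, angles):
    """Return the view zenith angle (degrees) maximizing |f - g|,
    i.e. the angle at which the two surfaces are most separable."""
    return max(angles, key=lambda t: abs(f(t) - g(t)))

theta_opt = best_view_angle(brdf_a, brdf_b, range(0, 71, 5))
print(theta_opt)  # 70
```

For these two toy curves the separability grows off-nadir, so the largest candidate angle wins; with real BRDF models the optimum can fall anywhere in the scan range, which is the point of the search.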
Jiao, Leizi; Dong, Daming; Zhao, Xiande; Han, Pengcheng
2016-12-01
In this study, we proposed an animal surface temperature measurement method based on a Kinect sensor and an infrared thermal imager to facilitate the screening of animals with febrile diseases. Due to the random motion and small surface temperature variation of animals, the influence of the angle of view on temperature measurement is significant. The method proposed in the present study can compensate for the temperature measurement error caused by the angle of view. Firstly, we analyzed the relationship between measured temperature and angle of view and established a mathematical model for compensating the influence of the angle of view, with a correlation coefficient above 0.99. Secondly, a fusion method for depth and infrared thermal images was established for synchronous image capture with the Kinect sensor and the infrared thermal imager, and the angle of view of each pixel was calculated. According to the experimental results, without compensation, the temperature image measured at angles of view of 74° to 76° differed by more than 2°C from that measured at an angle of view of 0°. After compensation, however, the temperature difference range was only 0.03-1.2°C. This method is applicable for real-time compensation of errors caused by the angle of view during temperature measurement with an infrared thermal imager.
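The compensation described above maps a measured temperature and the per-pixel angle of view to a corrected temperature. A minimal sketch, in which the quadratic form and the coefficient `k` are illustrative placeholders rather than the paper's fitted model:

```python
def compensate(t_measured_c, theta_deg, k=0.0005):
    """Correct an IR-measured surface temperature (degrees C) for the
    viewing angle. The quadratic-in-angle correction and coefficient k
    are hypothetical stand-ins for the paper's fitted model."""
    return t_measured_c + k * theta_deg ** 2

# A pixel viewed at 75 deg off-normal reads low; the model adds back
# an angle-dependent deficit. Nadir pixels are unchanged.
t0 = compensate(36.2, 0.0)    # 36.2
t75 = compensate(34.1, 75.0)  # 36.9125
```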
Effects of changing canopy directional reflectance on feature selection
NASA Technical Reports Server (NTRS)
Smith, J. A.; Oliver, R. E.; Kilpela, O. E.
1973-01-01
The use of a Monte Carlo model for generating sample directional reflectance data for two simplified target canopies at two different solar positions is reported. Successive iterations through the model permit the calculation of a mean vector and covariance matrix for canopy reflectance for varied sensor view angles. These data may then be used to calculate the divergence between the target distributions for various wavelength combinations and for these view angles. Results of a feature selection analysis indicate that different sets of wavelengths are optimum for target discrimination depending on sensor view angle and that the targets may be more easily discriminated for some scan angles than others. The time-varying behavior of these results is also pointed out.
View angle effect in LANDSAT imagery
NASA Technical Reports Server (NTRS)
Kaneko, T.; Engvall, J. L.
1977-01-01
The view angle effect in LANDSAT 2 imagery was investigated. The LANDSAT multispectral scanner scans over a range of view angles of -5.78 to 5.78 degrees. The view angle effect, which is caused by differing view angles, could be studied by comparing data collected at different view angles over a fixed location at a fixed time. Since such LANDSAT data are not available, consecutive-day acquisitions were used as a substitute: they were collected over the same geographical location, acquired 24 hours apart, with a view angle change of 7 to 8 degrees at latitudes of 35 to 45 degrees. It is shown that there is approximately a 5% reduction in the average sensor response on the second-day acquisitions compared with the first-day acquisitions, and that the view angle effect differs from field to field and from crop to crop. On false-color infrared pictures the view angle effect causes changes primarily in brightness and to a lesser degree in color (hue and saturation). An implication is that caution must be taken when images with different view angles are combined for classification, and a signature extension technique needs to take the view angle effect into account.
Optimization of Self-Directed Target Coverage in Wireless Multimedia Sensor Network
Yang, Yang; Wang, Yufei; Pi, Dechang; Wang, Ruchuan
2014-01-01
Video and image sensors in wireless multimedia sensor networks (WMSNs) have a directed view and a limited sensing angle, so methods that solve the target coverage problem for traditional sensor networks using a circular sensing model are not suitable for WMSNs. Based on the proposed FoV (field of view) sensing model and FoV disk model, how well a multimedia sensor is expected to cover a target is defined by the deflection angle between the target and the sensor's current orientation and by the distance between the target and the sensor. Target coverage optimization algorithms based on this expected coverage value are then presented for the single-sensor single-target, multisensor single-target, and single-sensor multitarget problems, respectively. For the multisensor multitarget problem, which is NP-complete, candidate orientations to which each sensor can rotate to cover the targets falling in its FoV disk are selected, and a genetic algorithm is applied, yielding an approximated minimum subset of sensors that covers all the targets in the network. Simulation results show the algorithm's performance and the effect of the number of targets on the resulting subset. PMID:25136667
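The expected coverage value defined above combines the deflection angle between the target and the sensor's orientation with the target's distance. A minimal sketch of one plausible form in 2D (the linear decay law below is an assumption for illustration, not the paper's exact definition):

```python
import math

def expected_coverage(sensor_xy, orient_deg, half_fov_deg, sensing_range,
                      target_xy):
    """Expected coverage value of a directional (FoV) sensor for a target:
    positive inside the field of view, decaying linearly with deflection
    angle and distance, zero outside. The decay law is an illustrative
    assumption."""
    dx, dy = target_xy[0] - sensor_xy[0], target_xy[1] - sensor_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > sensing_range:
        return 0.0
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angular difference, folded into [0, 180]:
    deflect = abs((bearing - orient_deg + 180) % 360 - 180)
    if deflect > half_fov_deg:
        return 0.0
    return (1 - deflect / half_fov_deg) * (1 - dist / sensing_range)

# Target dead ahead at half the sensing range: coverage 0.5.
cov_in = expected_coverage((0, 0), 0.0, 30.0, 10.0, (5, 0))
# Target 90 deg off the orientation: outside the FoV, coverage 0.
cov_out = expected_coverage((0, 0), 0.0, 30.0, 10.0, (0, 5))
```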
Scheduling Randomly-Deployed Heterogeneous Video Sensor Nodes for Reduced Intrusion Detection Time
NASA Astrophysics Data System (ADS)
Pham, Congduc
This paper proposes to use video sensor nodes to provide an efficient intrusion detection system. We use a scheduling mechanism that takes into account the criticality of the surveillance application, and present a performance study of various cover set construction strategies that account for cameras with heterogeneous angles of view, including those with very small angles of view. We show by simulation how a dynamic criticality management scheme can provide fast event detection for mission-critical surveillance applications by increasing the network lifetime and providing a low stealth time of intrusions.
Krotkov, N A; Vasilkov, A P
2000-03-20
Use of a vertical polarizer has been suggested to reduce the effects of surface reflection in the above-water measurements of marine reflectance. We suggest using a similar technique for airborne or spaceborne sensors when atmospheric scattering adds its own polarization signature to the upwelling radiance. Our own theoretical sensitivity study supports the recommendation of Fougnie et al. [Appl. Opt. 38, 3844 (1999)] (40-50 degrees vertical angle and azimuth angle near 135 degrees, polarizer parallel to the viewing plane) for above-water measurements. However, the optimal viewing directions (and the optimal orientation of the polarizer) change with altitude above the sea surface, solar angle, and atmospheric vertical optical structure. A polarization efficiency function is introduced, which shows the maximal possible polarization discrimination of the background radiation for an arbitrary altitude above the sea surface, viewing direction, and solar angle. Our comment is meant to encourage broader application of airborne and spaceborne polarization sensors in remote sensing of water and sea surface properties.
View angle effects on relationships between leaf area index in wheat and vegetation indices
NASA Astrophysics Data System (ADS)
Chen, H.; Li, W.; Huang, W.; Niu, Z.
2016-12-01
The effects of plant type and view angle on the canopy-reflected spectrum cannot be ignored when estimating leaf area index (LAI) using remote sensing vegetation indices. While vegetation indices derived from nadir-viewing remote sensors are insufficient for LAI estimation because they misinterpret structural characteristics, vegetation indices derived from multi-angular remote sensors have the potential to improve detection of LAI. However, view angle effects on the relationships between these indices and LAI for low standing crops (i.e., wheat) have not been fully evaluated, which limits their application to consistent and accurate monitoring of vegetation. View angle effects for two types of winter wheat (wheat 411, erectophile; and wheat 9507, planophile) on the relationship between LAI and spectral reflectance are assessed and compared in this study. An evaluation is conducted with in-situ measurements of LAI and of bidirectional reflectance in the principal plane from -60° (back-scattering direction) to 60° (forward-scattering direction) over the growth cycle of winter wheat. A variety of published vegetation indices (VIs) are calculated from the BRDF data. Additionally, all band combinations are used to calculate Normalized Difference Spectral Indices (NDSI) and Simple Subtraction Indices (SSI). The performance of these indices, along with raw reflectance and reflectance derivatives, for LAI estimation is examined based on a linearity comparison. The results will be helpful in further developing multi-angle remote sensing models for accurate LAI evaluation.
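The exhaustive band-pair indexing mentioned above (NDSI over all band combinations) reduces to a small loop over unordered band pairs; the three wavelengths and reflectance values below are hypothetical:

```python
from itertools import combinations

def ndsi(r_i, r_j):
    """Normalized Difference Spectral Index for one band pair."""
    return (r_i - r_j) / (r_i + r_j)

def all_ndsi(reflectance):
    """NDSI for every unordered pair of bands in a reflectance spectrum,
    keyed by (band_i, band_j) wavelength labels in ascending order."""
    return {(bi, bj): ndsi(reflectance[bi], reflectance[bj])
            for bi, bj in combinations(sorted(reflectance), 2)}

# Hypothetical canopy reflectances at three wavelengths (nm):
spectrum = {550: 0.10, 670: 0.05, 800: 0.45}
indices = all_ndsi(spectrum)
# Three bands give three pairs; the red/NIR pair (670, 800) yields -0.8.
```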
Holographic elements and curved slit used to enlarge field of view in rocket detection system
NASA Astrophysics Data System (ADS)
Breton, Mélanie; Fortin, Jean; Lessard, Roger A.; Châteauneuf, Marc
2006-09-01
Rocket detection over a wide field of view is an important issue in the protection of light armored vehicles. Traditionally, the detection occurs in the UV band, but recent studies have shown the existence of significant emission peaks in the visible and near infrared at rocket launch time. The use of the visible region is interesting as a way to reduce the weight and cost of systems. Current methods to detect those specific peaks involve the use of interferometric filters. However, they fail to combine a wide angle with wavelength selectivity. A linear array of volume holographic elements combined with a curved exit slit is proposed for the development of a wide field of view sensor for the detection of solid propellant motor launch flash. The sensor is envisaged to trigger an active protection system. On the basis of geometric theory, a system has been designed. It consists of a collector, a linear array of holographic elements, a curved slit and a detector. The collector is an off-axis parabolic mirror. Holographic elements are recorded by subdividing a hologram film into regions, each individually exposed with a different incidence angle. All regions have a common diffraction angle. The incidence angle determines the instantaneous field of view of each element. The volume hologram performs the function of separating and focusing the diffracted beam on an image plane to achieve wavelength filtering. The conical diffraction property is used to enlarge the field of view in elevation. A curved slit was designed to correspond to the oblique incidence of the holographic linear array. It is situated at the image plane and filters the diffracted spectrum toward the sensor. The field of view of the design was calculated to be 34 degrees. This was validated by a prototype tested during a field trial. Results are presented and analyzed. The system succeeded in detecting the rocket launch flash at the desired fields of view.
Results from Solar Reflective Band End-to-End Testing for VIIRS F1 Sensor Using T-SIRCUS
NASA Technical Reports Server (NTRS)
McIntire, Jeff; Moyer, David; McCarthy, James K.; DeLuccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce
2011-01-01
Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor's on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage MODIS sensor in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of possible angles at which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD BRF by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.
A wide-angle camera module for disposable endoscopy
NASA Astrophysics Data System (ADS)
Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee
2016-08-01
A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype disposable endoscope is implemented to perform pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.
Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array
NASA Astrophysics Data System (ADS)
Houben, Sebastian
2015-03-01
The variety of vehicle-mounted sensors needed to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. spatially varying resolution) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for that purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the observed caveats, and present first results on a prototype topview setup.
Inter-Comparison of MODIS and VIIRS Vegetation Indices Using One-Year Global Data
NASA Astrophysics Data System (ADS)
Miura, T.; Muratsuchi, J.; Obata, K.; Kato, A.; Vargas, M.; Huete, A. R.
2016-12-01
The Visible Infrared Imaging Radiometer Suite (VIIRS) sensor series of the Joint Polar Satellite System program is slated to continue the highly calibrated data stream initiated with the Earth Observing System Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. A number of geophysical products are being or will be produced from VIIRS data, including the "Top-of-the-Atmosphere (TOA)" Normalized Difference Vegetation Index (NDVI), "Top-of-Canopy (TOC)" Enhanced Vegetation Index (EVI), and TOC NDVI. In this study, we cross-compared vegetation indices (VIs) from the first VIIRS sensor aboard the Suomi National Polar-orbiting Partnership satellite with their Aqua MODIS counterparts using one year of global data. This study was aimed at developing a thorough understanding of the radiometric compatibility between the two VI datasets across the globe, seasons, a range of viewing angles, and land cover types. VIIRS and MODIS VI data of January-December 2015 were obtained at monthly intervals when their orbital tracks coincided. These data were projected and spatially aggregated onto a 0.0036-degree grid while screening for cloud and aerosol contamination using their respective quality flags. VIIRS-MODIS observation pairs with near-identical sun-target-view angles were extracted from each of these monthly image pairs for cross-comparison. The four VIs of TOA NDVI, TOC NDVI, TOC EVI, and TOC EVI2 (a two-band version of the EVI) were analyzed. Between MODIS and VIIRS, TOA NDVI, TOC NDVI, and TOC EVI2 had very small overall mean differences (MD) of 0.014, 0.013, and 0.013 VI units, respectively, whereas TOC EVI had a slightly larger overall MD of 0.023 EVI units attributed to the disparate blue bands of the two sensors. These systematic differences were consistent across the one-year period. With respect to sun-target-viewing geometry, MDs were also consistent across the view zenith angle range, but always lower for forward- than backward-viewing geometry. 
MDs showed large land cover dependencies for TOA NDVI and TOC NDVI, varying 10-fold from 0.002 for forests to 0.02 for sparsely-vegetated areas. They were consistent across land cover types for TOC EVI and TOC EVI2. Future studies should address the impact of sun-target-view geometry on cross-sensor VI comparisons.
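Mean differences (MDs) of the kind summarized above are per-stratum averages of VIIRS-minus-MODIS values over matched observation pairs. A minimal sketch; the land cover labels and NDVI values below are made up for illustration:

```python
from collections import defaultdict

def mean_differences(pairs):
    """Per-land-cover mean difference (VIIRS minus MODIS) over matched
    VI observation triples of the form (land_cover, viirs_vi, modis_vi)."""
    sums = defaultdict(lambda: [0.0, 0])
    for cover, viirs_vi, modis_vi in pairs:
        sums[cover][0] += viirs_vi - modis_vi
        sums[cover][1] += 1
    return {cover: total / n for cover, (total, n) in sums.items()}

# Hypothetical matched NDVI pairs:
obs = [("forest", 0.82, 0.81), ("forest", 0.80, 0.80),
       ("sparse", 0.15, 0.13), ("sparse", 0.14, 0.12)]
md = mean_differences(obs)
# Forest MD is 0.005; sparse-vegetation MD is 0.02 in these made-up data,
# mirroring the order-of-magnitude spread reported above.
```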
Novel compact panomorph lens based vision system for monitoring around a vehicle
NASA Astrophysics Data System (ADS)
Thibault, Simon
2008-04-01
Automotive applications are one of the largest vision-sensor market segments and one of the fastest growing. The trend to use increasingly more sensors in cars is driven both by legislation and by consumer demand for higher safety and better driving experiences. Awareness of what directly surrounds a vehicle affects safe driving and manoeuvring. Consequently, panoramic 360° field-of-view imaging can contribute more to the perception of the world around the driver than any other sensor. However, to obtain complete vision around the car, several sensor systems are necessary. To solve this issue, a customized imaging system based on a panomorph lens can provide the maximum information for the driver with a reduced number of sensors. A panomorph lens is a hemispheric wide-angle anamorphic lens with enhanced resolution in predefined zones of interest. Because panomorph lenses are optimized to a custom angle-to-pixel relationship, vision systems provide ideal image coverage that reduces and optimizes the processing. We present various scenarios which may benefit from the use of a custom panoramic sensor. We also discuss the technical requirements of such a vision system. Finally we demonstrate how the panomorph-based visual sensor is probably one of the most promising ways to fuse many sensors into one. For example, a single panoramic sensor on the front of a vehicle could provide all the information necessary for assistance in crash avoidance, lane tracking, early warning, park aids, road sign detection, and various video monitoring views.
Modular multiapertures for light sensors
NASA Technical Reports Server (NTRS)
Rizzo, A. A.
1977-01-01
Process involves electroplating multiaperture masks as unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems when precise, wide-angle field of view is needed.
An airborne sensor for the avoidance of clear air turbulence
NASA Technical Reports Server (NTRS)
Gary, B. L.
1981-01-01
This paper describes an airborne microwave radiometer that may be able to provide altitude guidance away from layers containing clear air turbulence (CAT). The sensor may also be able to predict upper limits for the severity of upcoming CAT. The 55-GHz radiometer is passive, not radar, and it measures the temperature of oxygen molecules in the viewing direction (averaged along a several-kilometer path). A small computer steers the viewing direction through elevation angle scans and converts the observed quantities to an 'altitude temperature profile'. The principle for CAT avoidance is that CAT is found statistically more often within inversion layers and at the tropopause, both of which are easily located from sensor-generated altitude temperature profiles.
2009-08-20
Nomenclature: As = QCM sensor area; E = ion energy; E* = characteristic energy describing the differential sputter yield profile shape; Eth = ... Differential and total sputter yields are reported for several grades of BN at ion energies down to 60 eV, obtained with a QCM deposition sensor. A personal computer with LabVIEW is used for data logging; detailed discussion of the QCM sensor is provided in subsection IIF.
Pixel super resolution using wavelength scanning
2016-04-08
The power of the light source is adjusted to ~20 μW. The image sensor chip is a color CMOS sensor chip with a pixel size of 1.12 μm manufactured for cellphone cameras (~1-μm pixel pitch in Figure 3a). Lens-free raw holograms are captured by the 1.12-μm CMOS image sensor over a field of view of ≈20.5 mm², with the illumination angle changed in multiple directions for synthetic aperture.
Lopez, Thomas; Massenot, Sébastien; Estribeau, Magali; Magnan, Pierre; Pardo, Fabrice; Pelouard, Jean-Luc
2016-04-18
This paper deals with the integration of metallic and dielectric nanostructured planar lenses into a pixel of a silicon-based CMOS image sensor, for a monochromatic application at 1.064 μm. The first is a plasmonic lens, based on the phase delay through nanoslits, which has been found to be hardly compatible with current CMOS technology and exhibits notable metallic absorption. The second is a dielectric phase-Fresnel lens integrated at the top of a pixel; it exhibits an optical efficiency (OE) improved by a few percent and an angle of view of 50°. The third is a metallic diffractive lens integrated inside a pixel, which shows a better OE and an angle of view of 24°. The last two lenses are compatible with a spectral band close to 1.064 μm.
Giga-pixel lensfree holographic microscopy and tomography using color image sensors.
Isikman, Serhan O; Greenbaum, Alon; Luo, Wei; Coskun, Ahmet F; Ozcan, Aydogan
2012-01-01
We report Giga-pixel lensfree holographic microscopy and tomography using color sensor-arrays such as CMOS imagers that exhibit Bayer color filter patterns. Without physically removing these color filters coated on the sensor chip, we synthesize pixel super-resolved lensfree holograms, which are then reconstructed to achieve ~350 nm lateral resolution, corresponding to a numerical aperture of ~0.8, across a field-of-view of ~20.5 mm². This constitutes a digital image with ~0.7 Billion effective pixels in both amplitude and phase channels (i.e., ~1.4 Giga-pixels total). Furthermore, by changing the illumination angle (e.g., ±50°) and scanning a partially-coherent light source across two orthogonal axes, super-resolved images of the same specimen from different viewing angles are created, which are then digitally combined to synthesize tomographic images of the object. Using this dual-axis lensfree tomographic imager running on a color sensor-chip, we achieve a 3D spatial resolution of ~0.35 µm × 0.35 µm × ~2 µm, in x, y and z, respectively, creating an effective voxel size of ~0.03 µm³ across a sample volume of ~5 mm³, which is equivalent to >150 Billion voxels. We demonstrate the proof-of-concept of this lensfree optical tomographic microscopy platform on a color CMOS image sensor by creating tomograms of micro-particles as well as a wild-type C. elegans nematode.
Characterization Approaches to Place Invariant Sites on SI-Traceable Scales
NASA Technical Reports Server (NTRS)
Thome, Kurtis
2012-01-01
The effort to understand the Earth's climate system requires a complete integration of remote sensing imager data across time and multiple countries. Such an integration necessarily requires ensuring inter-consistency between multiple sensors to create the data sets needed to understand the climate system. Past efforts at inter-consistency have forced agreement between two sensors using sources that are viewed by both sensors at nearly the same time, and thus tend to be near polar regions over snow and ice. The current work describes a method that would provide an absolute radiometric calibration of a sensor rather than an inter-consistency of one sensor relative to another. The approach also relies on defensible error budgets that eventually provide a cross comparison of sensors without systematic errors. The basis of the technique is a model-based, SI-traceable prediction of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The effort effectively works to characterize the sites as sources with known top-of-atmosphere radiance, allowing accurate intercomparison of sensor data without the need for coincident views. Data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), and Moderate Resolution Imaging Spectroradiometer (MODIS) are used to demonstrate the difficulties of cross calibration as applied to current sensors. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The radiance comparisons lead to significant differences created by the specific solar model used for each sensor. The paper also proposes methods to mitigate the largest error sources in future systems. 
The results from these historical intercomparisons provide the basis for a set of recommendations to ensure future SI-traceable cross calibration using future missions such as CLARREO and TRUTHS. The paper describes a proposed approach that relies on model-based, SI-traceable predictions of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The basis of the method is highly accurate measurements of at-sensor radiance of sufficient quality to understand the spectral and BRDF characteristics of the site and sufficient historical data to develop an understanding of temporal effects from changing surface and atmospheric conditions.
NASA Astrophysics Data System (ADS)
Chen, J. M.; He, L.; Chou, S.; Ju, W.; Zhang, Y.; Joiner, J.; Liu, J.; Mo, G.
2017-12-01
Sun-induced chlorophyll fluorescence (SIF) measured from plant canopies originates mostly from sunlit leaves. Observations of SIF by satellite sensors, such as GOME-2 and GOSAT, are often made over large view zenith angle ranges, causing large changes in the viewed sunlit leaf fraction across the scanning swath. Although observations made by OCO-2 are near nadir, the observed sunlit leaf fraction could still vary greatly due to changes in the solar zenith angle with latitude and time of overpass. To demonstrate the importance of considering the satellite-target-view geometry in using SIF for assessing vegetation productivity, we conducted multi-angle measurements of SIF using a hyperspectral sensor mounted on an automated rotating system over a rice field near Nanjing, China. A method is developed to separate SIF measurements at each angle into sunlit and shaded leaf components, and an angularly normalized canopy-level SIF is obtained as the weighted sum of sunlit and shaded SIF. This normalized SIF is shown to be a much better proxy of GPP of the rice field measured by an eddy covariance system than the unnormalized SIF observations. The same normalization scheme is also applied to the far-red GOME-2 SIF observations on sunny days, and we found that the normalized SIF is better correlated with model-simulated GPP than the original SIF observations. The coefficient of determination (R2) is improved by 0.07±0.04 on global average using the normalization scheme. The most significant improvement in R2 by 0.09±0.04 is found in deciduous broadleaf forests, where the observed sunlit leaf fraction is highly sensitive to solar zenith angle.
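The angular normalization described above, a weighted sum of sunlit and shaded leaf components, can be illustrated with a minimal sketch. The function name and scalar inputs are hypothetical; the paper's actual decomposition of the observed SIF into the two components is more involved:

```python
def angularly_normalized_sif(sif_sunlit, sif_shaded, f_sunlit):
    """Canopy-level SIF as the weighted sum of sunlit and shaded
    leaf components, using the sunlit leaf fraction as the weight."""
    f_shaded = 1.0 - f_sunlit
    return f_sunlit * sif_sunlit + f_shaded * sif_shaded
```

Because the sunlit leaf fraction varies with view and solar zenith angle, evaluating this sum at a fixed reference geometry removes much of the angular dependence from the observations.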
Shuttle imaging radar views the Earth from Challenger: The SIR-B experiment
NASA Technical Reports Server (NTRS)
Ford, J. P.; Cimino, J. B.; Holt, B.; Ruzek, M. R.
1986-01-01
In October 1984, SIR-B obtained digital image data of about 6.5 million km2 of the Earth's surface. The coverage is mostly of selected experimental test sites located between latitudes 60 deg north and 60 deg south. Programmed adjustments made to the look angle of the steerable radar antenna and to the flight attitude of the shuttle during the mission permitted collection of multiple-incidence-angle coverage or extended mapping coverage as required for the experiments. The SIR-B images included here are representative of the coverage obtained for scientific studies in geology, cartography, hydrology, vegetation cover, and oceanography. The relations between radar backscatter and incidence angle for discriminating various types of surfaces, and the use of multiple-incidence-angle SIR-B images for stereo measurement and viewing, are illustrated with examples. Interpretation of the images is facilitated by corresponding images or photographs obtained by different sensors or by sketch maps or diagrams.
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Gregg, Watson W.
1992-01-01
Due to range safety considerations, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) ocean color instrument may be required to be launched into a near-noon descending node, as opposed to the ascending node used by the predecessor sensor, the Coastal Zone Color Scanner (CZCS). The relative importance of ascending versus descending near-noon orbits was assessed here to determine if a descending node will meet the scientific requirements of SeaWiFS. Analyses focused on ground coverage, local times of coverage, solar and viewing geometries (zenith and azimuth angles), and sun glint. Differences were found in the areas covered by individual orbits, but were not important when taken over a 16-day repeat time. Local time of coverage was also different: for ascending node orbits the Northern Hemisphere was observed in the morning and the Southern Hemisphere in the afternoon, while for descending node orbits the Northern Hemisphere was observed in the afternoon and the Southern Hemisphere in the morning. There were substantial differences in solar azimuth and spacecraft azimuth angles both at equinox and at the Northern Hemisphere summer solstice. Negligible differences in solar and spacecraft zenith angles, relative azimuth angles, and sun glint were obtained at the equinox. However, large differences were found in solar zenith angles, relative azimuths, and sun glint for the solstice. These differences appeared to compensate across the scan: an increase in sun glint in descending node over that in ascending node on the western part of the scan was compensated by a decrease on the eastern part of the scan. Thus, no advantage or disadvantage could be conferred upon either ascending or descending node for noon orbits. Analyses were also performed for ascending and descending node orbits that deviated from a noon equator crossing time.
For ascending node, afternoon orbits produced the lowest mean solar zenith angles in the Northern Hemisphere, and morning orbits produced the lowest angles for the Southern Hemisphere. For descending node, morning orbits produced the lowest mean solar zenith angles for the Northern Hemisphere; afternoon orbits produced the lowest angles for the Southern Hemisphere.
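The solar zenith angles behind these orbit comparisons follow from standard spherical astronomy. A hedged sketch of the textbook relation (illustrative only, not the SeaWiFS operational code):

```python
import math

def solar_zenith_deg(lat_deg, declination_deg, hour_angle_deg):
    """Solar zenith angle from latitude, solar declination, and local
    hour angle (all in degrees), via the standard spherical relation
    cos(SZA) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(h)."""
    lat, dec, h = (math.radians(x) for x in (lat_deg, declination_deg, hour_angle_deg))
    cos_sza = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(h))
    # Clamp against floating-point overshoot before taking acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sza))))
```

Shifting the equator crossing time shifts the local hour angle at which each latitude is observed, which is why morning versus afternoon orbits yield different mean solar zenith angles in each hemisphere.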
NASA Technical Reports Server (NTRS)
Fenner, R. G.; Reid, S. C.; Solie, C. H.
1980-01-01
An evaluation is given of how active and passive microwave sensors can best be used in oil spill detection and assessment. Radar backscatter curves taken over oil spills are presented, and their effect on synthetic aperture radar (SAR) imagery is discussed. Plots of microwave radiometric brightness variations over oil spills are presented and discussed. Recommendations as to how to select the best combination of frequency, viewing angle, and sensor type for evaluating various aspects of oil spills are also discussed.
Multi-view line-scan inspection system using planar mirrors
NASA Astrophysics Data System (ADS)
Holländer, Branislav; Štolc, Svorad; Huber-Mörk, Reinhold
2013-04-01
We demonstrate the design, setup, and results for a line-scan stereo image acquisition system using a single area-scan sensor, a single lens, and two planar mirrors attached to the acquisition device. The acquired object moves relative to the acquisition device and is observed under three different angles at the same time. Depending on the specific configuration, it is possible to observe the object under a straight view (i.e., looking along the optical axis) and two skewed views. The relative motion between an object and the acquisition device automatically fulfills the epipolar constraint in stereo vision. The choice of lines to be extracted from the CMOS sensor depends on various factors such as the number, position, and size of the mirrors, the optical and sensor configuration, or other application-specific parameters such as the desired depth resolution. The acquisition setup presented in this paper is suitable for the inspection of printed matter, small parts, or security features such as optically variable devices and holograms. The image processing pipeline applied to the extracted sensor lines is explained in detail. The effective depth resolution achieved by the presented system, assembled from only off-the-shelf components, is approximately equal to the spatial resolution and can be smoothly controlled by changing the positions and angles of the mirrors. Actual performance of the device is demonstrated on a 3D-printed ground-truth object as well as two real-world examples: (i) the EUR-100 banknote, a high-quality printed matter, and (ii) the hologram on the EUR-50 banknote, an optically variable device.
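Depth recovery in such a mirror-based line-scan stereo system ultimately rests on ordinary triangulation. A minimal sketch of the standard disparity-to-depth relation (illustrative only, not the authors' pipeline; here the effective baseline would be set by the mirror positions and angles):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Standard stereo relation Z = f * B / d: depth is inversely
    proportional to the pixel disparity between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```

Since depth resolution scales with the baseline, moving the mirrors to widen the skewed views directly improves the depth resolution, as the abstract notes.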
Digital sun sensor multi-spot operation.
Rufino, Giancarlo; Grassi, Michele
2012-11-28
The operation and test of a multi-spot digital sun sensor for precise sun-line determination are described. The image forming system consists of an opaque mask with multiple pinhole apertures producing multiple, simultaneous, spot-like images of the sun on the focal plane. The sun-line precision can be improved by averaging multiple simultaneous measurements. Nevertheless, sensor operation over a wide field of view requires acquiring and processing images in which the number of sun spots and the related intensity level vary widely. To this end, a reliable and robust image acquisition procedure based on a variable shutter time has been adopted, as well as a calibration function that also exploits knowledge of the sun-spot array size. The main focus of the present paper is the experimental validation of the wide-field-of-view operation of the sensor using a sensor prototype and a laboratory test facility. Results demonstrate that it is possible to maintain high measurement precision even at large off-boresight angles.
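The precision gain from averaging multiple simultaneous sun spots follows the usual 1/sqrt(N) statistics for independent errors. A hedged sketch with hypothetical centroid inputs (the sensor's actual calibration function is more elaborate):

```python
def averaged_sun_spot(spot_centroids):
    """Average the focal-plane centroids of N simultaneous sun spots.
    For independent per-spot centroiding errors, the standard error of
    the mean shrinks as 1/sqrt(N)."""
    n = len(spot_centroids)
    x = sum(c[0] for c in spot_centroids) / n
    y = sum(c[1] for c in spot_centroids) / n
    return x, y
```

The averaged centroid, together with the mask geometry, then yields the sun-line direction.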
Low cost Earth attitude sensor
NASA Astrophysics Data System (ADS)
Liberati, Fabrizio; Perrotta, Giorgio; Verzegnassi, Fulvia
2017-11-01
A patent-pending, low-cost, moderate-performance Earth attitude sensor for LEO satellites is described in this paper. The paper deals with the system concepts, the technology adopted, and the simulation results. The sensor comprises three or four narrow-field-of-view mini telescopes pointed towards the Earth edge to detect and measure the variation of the off-nadir angle of the Earth-to-black-sky transition using thermopile detectors suitably placed in the foci of the mini telescopes. The system's innovation consists in the opto-mechanical configuration adopted, which is sturdy and has no moving parts and is thus inherently reliable. In addition, with a view to reducing production costs, the sensor dispenses with hi-rel components and is instead based mainly on suitably chosen COTS parts. It is also flexible and can be adapted to perform attitude measurement onboard spacecraft flying in orbits other than LEO with a minimum of modifications to the basic design. At present the sensor is under development by IMT and OptoService.
NASA Technical Reports Server (NTRS)
Odenthal, J. P.
1980-01-01
An opto-electronic receiver incorporating a multi-element linear photodiode array as a component of a laser-triangulation rangefinder was developed as an obstacle avoidance sensor for a Martian roving vehicle. The detector can resolve the angle of laser return in 1.5 deg increments within a field of view of 30 deg and a range of five meters. A second receiver with 1024 elements over 60 deg and a 3 meter range is also documented. Design criteria, circuit operation, schematics, experimental results, and calibration procedures are discussed.
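Laser triangulation of this kind converts the measured return angle into range through simple geometry. A hedged sketch (the baseline and angle names are illustrative, not the documented receiver design):

```python
import math

def triangulation_range(baseline_m, return_angle_deg):
    """Range from the parallax between the laser axis and the detected
    return direction: R = B / tan(theta), where B is the separation
    between emitter and detector."""
    return baseline_m / math.tan(math.radians(return_angle_deg))
```

The 1.5 deg angular quantization of the photodiode array thus translates into a range resolution that degrades with distance, since small angles map to large range changes.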
Multi-Angle Snowflake Camera Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuefer, Martin; Bailey, J.
2016-07-01
The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera's field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' field of view within a 10º angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
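The fall-speed measurement from successive triggers is a straightforward distance-over-time computation. A minimal sketch (the emitter-plane separation value in the test is illustrative, not taken from the handbook):

```python
def fall_speed(trigger_separation_m, t_upper_s, t_lower_s):
    """Fall speed from the times at which a hydrometeor crosses the
    upper and lower IR trigger planes, separated by a known distance."""
    dt = t_lower_s - t_upper_s
    if dt <= 0:
        raise ValueError("lower trigger must fire after the upper trigger")
    return trigger_separation_m / dt
```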
Ultrasonic imaging of material flaws exploiting multipath information
NASA Astrophysics Data System (ADS)
Shen, Xizhong; Zhang, Yimin D.; Demirli, Ramazan; Amin, Moeness G.
2011-05-01
In this paper, we consider ultrasonic imaging for the visualization of flaws in a material. Ultrasonic imaging is a powerful nondestructive testing (NDT) tool which assesses material conditions via the detection, localization, and classification of flaws inside a structure. Multipath exploitations provide extended virtual array apertures and, in turn, enhance imaging capability beyond the limitation of traditional multisensor approaches. We utilize reflections of ultrasonic signals which occur when encountering different media and interior discontinuities. The waveforms observed at the physical as well as virtual sensors yield additional measurements corresponding to different aspect angles. Exploitation of multipath information addresses unique issues observed in ultrasonic imaging. (1) Utilization of physical and virtual sensors significantly extends the array aperture for image enhancement. (2) Multipath signals extend the angle of view of the narrow beamwidth of the ultrasound transducers, allowing improved visibility and array design flexibility. (3) Ultrasonic signals experience difficulty in penetrating a flaw, thus the aspect angle of the observation is limited unless access to other sides is available. The significant extension of the aperture makes it possible to yield flaw observation from multiple aspect angles. We show that data fusion of physical and virtual sensor data significantly improves the detection and localization performance. The effectiveness of the proposed multipath exploitation approach is demonstrated through experimental studies.
Design and research of sun sensor based on technology of optical fiber
NASA Astrophysics Data System (ADS)
Li, Ye; Zhou, Wang; Li, Dan
2010-08-01
A kind of sun sensor based on optical fiber is designed. The project consists of three parts: an optical head, a photoelectric sensor, and a signal processing unit. The innovation of this design lies in the improvement of the traditional sun sensor: multiple fibers, used as light guides, are symmetrically distributed on the surface of a spacecraft. To determine the attitude of a spacecraft, the sun sensor measures the direction of the sun. Because the fiber length can be adjusted as needed, the photoelectric sensor can be placed deep inside the spacecraft to protect it against damage from high-energy particles in outer space. The processing unit calculates the difference in sun energy delivered by each pair of opposite optical fibers so as to obtain the angle and orientation between the spacecraft and the sun. This sun sensor can accommodate multiple fields of view, both small and large. It improves the accuracy of the small field of view and increases the precision of locating a spacecraft. This paper briefly introduces the design of the processing unit. This sun sensor is applicable to detecting the attitude of a spacecraft. In addition, it can also be used in the solar tracking systems of PV technology.
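The difference computation for opposite fiber pairs can be sketched as follows, assuming an idealized cosine angular response for each fiber (the actual sensor's calibration would differ):

```python
import math

def sun_angle_from_pair(signal_a, signal_b, tilt_deg):
    """Estimate the off-axis sun angle from one pair of opposite
    fibers, each tilted by +/- tilt_deg from the reference axis and
    assumed to respond as cos(angle to the sun).

    For I_a = cos(theta - t) and I_b = cos(theta + t):
        (I_a - I_b) / (I_a + I_b) = tan(theta) * tan(t)
    """
    t = math.radians(tilt_deg)
    ratio = (signal_a - signal_b) / (signal_a + signal_b)
    return math.degrees(math.atan(ratio / math.tan(t)))
```

Because the estimate uses a difference-over-sum ratio, it is insensitive to an overall scale change in the illumination, which is one reason paired opposite fibers are attractive.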
NASA Astrophysics Data System (ADS)
Torkildsen, H. E.; Hovland, H.; Opsahl, T.; Haavardsholm, T. V.; Nicolas, S.; Skauli, T.
2014-06-01
In some applications of multi- or hyperspectral imaging, it is important to have a compact sensor. The most compact spectral imaging sensors are based on spectral filtering in the focal plane. For hyperspectral imaging, it has been proposed to use a "linearly variable" bandpass filter in the focal plane, combined with scanning of the field of view. As the image of a given object in the scene moves across the field of view, it is observed through parts of the filter with varying center wavelength, and a complete spectrum can be assembled. However, if the radiance received from the object varies with viewing angle, or with time, then the reconstructed spectrum will be distorted. We describe a camera design where this hyperspectral functionality is traded for multispectral imaging with better spectral integrity. Spectral distortion is minimized by using a patterned filter with 6 bands arranged close together, so that a scene object is seen by each spectral band in rapid succession and with minimal change in viewing angle. The set of 6 bands is repeated 4 times so that the spectral data can be checked for internal consistency. Still, the total extent of the filter in the scan direction is small. Therefore the remainder of the image sensor can be used for conventional imaging with potential for using motion tracking and 3D reconstruction to support the spectral imaging function. We show detailed characterization of the point spread function of the camera, demonstrating the importance of such characterization as a basis for image reconstruction. A simplified image reconstruction based on feature-based image coregistration is shown to yield reasonable results. Elimination of spectral artifacts due to scene motion is demonstrated.
Guidance Of A Mobile Robot Using An Omnidirectional Vision Navigation System
NASA Astrophysics Data System (ADS)
Oh, Sung J.; Hall, Ernest L.
1987-01-01
Navigation and visual guidance are key topics in the design of a mobile robot. Omnidirectional vision using a very wide angle or fisheye lens provides a hemispherical view at a single instant that permits target location without mechanical scanning. The inherent image distortion with this view and the numerical errors accumulated from vision components can be corrected to provide accurate position determination for navigation and path control. The purpose of this paper is to present the experimental results and analyses of the imaging characteristics of the omnivision system including the design of robot-oriented experiments and the calibration of raw results. Errors less than one picture element on each axis were observed by testing the accuracy and repeatability of the experimental setup and the alignment between the robot and the sensor. Similar results were obtained for four different locations using corrected results of the linearity test between zenith angle and image location. Angular error of less than one degree and radial error of less than one Y picture element were observed at moderate relative speed. The significance of this work is that the experimental information and the test of coordinated operation of the equipment provide a greater understanding of the dynamic omnivision system characteristics, as well as insight into the evaluation and improvement of the prototype sensor for a mobile robot. Also, the calibration of the sensor is important, since the results provide a cornerstone for future developments. This sensor system is currently being developed for a robot lawn mower.
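The linearity test between zenith angle and image location mentioned above corresponds to the equidistant ("f-theta") fisheye projection. A hedged sketch of that model (the prototype's lens may follow a different projection; this is the idealized linear case the calibration is checked against):

```python
import math

def fisheye_radius_px(focal_px, zenith_angle_deg):
    """Equidistant fisheye model: the radial image distance from the
    optical center grows linearly with zenith angle, r = f * theta."""
    return focal_px * math.radians(zenith_angle_deg)
```

Deviations of measured image radii from this line are what the calibration corrects, along with the accumulated numerical errors from the vision components.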
NASA Astrophysics Data System (ADS)
Ball, C. P.; Marks, A. A.; Green, P.; Mac Arthur, A.; Fox, N.; King, M. D.
2013-12-01
Surface albedo is the hemispherical and wavelength integrated reflectance over the visible, near infrared and shortwave infrared regions of the solar spectrum. The albedo of Arctic snow can be in excess of 0.8 and it is a critical component in the global radiation budget because it determines the proportion of solar radiation absorbed and reflected over a large part of the Earth's surface. We present here our first results of the angularly resolved surface reflectance of Arctic snow at high solar zenith angles (~80°) suitable for the validation of satellite remote sensing products. The hemispherical directional reflectance factor (HDRF) of Arctic snow covered tundra was measured using the GonioRAdiometric Spectrometer System (GRASS) during a three-week field campaign in Ny-Ålesund, Svalbard, in March/April 2013. The measurements provide one of few existing HDRF datasets at high solar zenith angles for wind-blown Arctic snow covered tundra (conditions typical of the Arctic region), and the first ground-based measure of HDRF at Ny-Ålesund. The HDRF was recorded under clear sky conditions with 10° intervals in view zenith, and 30° intervals in view azimuth, for several typical sites over a wavelength range of 400-1500 nm at 1 nm resolution. Satellite sensors such as MODIS, AVHRR and VIIRS offer a method to monitor the surface albedo with high spatial and temporal resolution. However, snow reflectance is anisotropic and is dependent on view and illumination angle and the wavelength of the incident light. Spaceborne sensors subtend a discrete angle to the target surface and measure radiance over a limited number of narrow spectral bands. Therefore, the derivation of the surface albedo requires accurate knowledge of the surface's bidirectional reflectance as a function of wavelength.
The ultimate accuracy to which satellite sensors are able to measure snow surface properties such as albedo is dependent on the accuracy of the BRDF model, which can only be assessed if hyperspectral ground-based data are available to validate the current modelling approaches. The results presented here extend the work of previous studies by recording the HDRF of Arctic snow covered tundra at high solar zenith angles over several sites, demonstrating the strong forward-scattering nature of snow reflectance at high solar zenith angles while also showing a clear wavelength dependence in the shape of the HDRF and an increasing anisotropy with wavelength.
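Deriving an albedo-like quantity from directional reflectance requires a cosine-weighted integral over the viewing hemisphere. A hedged numerical sketch, assuming HDRF samples on the 10° zenith / 30° azimuth grid described above (this is a generic midpoint-rule integration, not the GRASS processing chain):

```python
import math

def hemispherical_reflectance(hdrf):
    """Cosine-weighted integral of reflectance over the viewing
    hemisphere, given hdrf[(view_zenith_deg, view_azimuth_deg)] on a
    regular grid with 10 deg zenith and 30 deg azimuth spacing.
    Returns 1.0 for a perfectly Lambertian (hdrf == 1) surface."""
    dtheta = math.radians(10.0)
    dphi = math.radians(30.0)
    total = 0.0
    for (vz_deg, _va_deg), r in hdrf.items():
        vz = math.radians(vz_deg)
        # Solid-angle weight: cos(theta) sin(theta) dtheta dphi,
        # normalized by pi so a Lambertian surface integrates to 1.
        total += r * math.cos(vz) * math.sin(vz) * dtheta * dphi
    return total / math.pi
```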
Laser Range and Bearing Finder for Autonomous Missions
NASA Technical Reports Server (NTRS)
Granade, Stephen R.
2004-01-01
NASA has recently re-confirmed its interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor
Liu, Bailing; Zhang, Fumin; Qu, Xinghua
2015-01-01
An improvement method for the pose accuracy of a robot manipulator by using a multiple-sensor combination measuring system (MCMS) is presented. It is composed of a visual sensor, an angle sensor and a series robot. The visual sensor is utilized to measure the position of the manipulator in real time, and the angle sensor is rigidly attached to the manipulator to obtain its orientation. Due to the higher accuracy of the multi-sensor, two efficient data fusion approaches, the Kalman filter (KF) and multi-sensor optimal information fusion algorithm (MOIFA), are used to fuse the position and orientation of the manipulator. The simulation and experimental results show that the pose accuracy of the robot manipulator is improved dramatically by 38%∼78% with the multi-sensor data fusion. Comparing with reported pose accuracy improvement methods, the primary advantage of this method is that it does not require the complex solution of the kinematics parameter equations, increase of the motion constraints and the complicated procedures of the traditional vision-based methods. It makes the robot processing more autonomous and accurate. To improve the reliability and accuracy of the pose measurements of MCMS, the visual sensor repeatability is experimentally studied. An optimal range of 1 × 0.8 × 1 ∼ 2 × 0.8 × 1 m in the field of view (FOV) is indicated by the experimental results. PMID:25850067
Laser interferometric high-precision angle monitor for JASMINE
NASA Astrophysics Data System (ADS)
Niwa, Yoshito; Arai, Koji; Sakagami, Masaaki; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki; Yano, Taihei
2006-06-01
The JASMINE instrument uses a beam combiner to observe two different fields of view separated by 99.5 degrees simultaneously. This angle is the so-called basic angle. The basic angle of JASMINE should be stabilized, and fluctuations of the basic angle should be monitored with an accuracy of 10 microarcsec root-mean-square over the satellite revolution period of 5 hours. For this purpose, a high-precision interferometric laser metrology system is employed. One of the available techniques for measuring the fluctuations of the basic angle is a method known as wave-front sensing using a Fabry-Perot type laser interferometer. This technique detects fluctuations of the basic angle as displacement of the optical axis in the Fabry-Perot cavity. One of the advantages of the technique is that the sensor can be made sensitive only to the relative fluctuations of the basic angle, which JASMINE needs to measure, and insensitive to the common ones; in order to enhance the optical-axis displacement caused by relative motion, the Fabry-Perot cavity is formed by two mirrors with a long radius of curvature. To verify the principle of this idea, an experiment was performed using a 0.1 m long Fabry-Perot cavity with a mirror curvature of 20 m. The mirrors of the cavity were artificially actuated in either a relative or a common way, and the resulting outputs from the sensor were compared.
NASA Astrophysics Data System (ADS)
Psomiadis, Emmanouil; Dercas, Nicholas; Dalezios, Nicolas R.; Spyropoulos, Nikolaos V.
2017-10-01
Farmers throughout the world are constantly searching for ways to maximize their returns. Remote sensing applications are designed to provide farmers with timely crop monitoring and production information. Such information can be used to identify crop vigor problems. Vegetation indices (VIs) derived from satellite data have been widely used to assess variations in the physiological state and biophysical properties of vegetation. However, due to the various sensor characteristics, there are differences among VIs derived from multiple sensors for the same target. Therefore, multi-sensor VI capability and effectiveness are critical but complicated issues in the application of multi-sensor vegetation observations. Various factors, such as the atmospheric conditions during acquisition and sensor and geometric characteristics such as viewing angle, field of view, and sun elevation, influence the direct comparability of vegetation indicators among different sensors. In the present study, two experimental areas were used, located near the villages of Nea Lefki and Melia of Larissa Prefecture in the Thessaly Plain, containing a wheat and a cotton crop, respectively. Two satellite systems with different spatial resolution, WorldView-2 (W2) and Sentinel-2 (S2) with 2 and 10 meter pixel size, were used. The Normalized Difference Vegetation Index (NDVI) and Leaf Area Index (LAI) were calculated, and a statistical comparison of the VIs was made to designate their correlation and dependency. Finally, several other innovative indices were calculated and compared to evaluate their effectiveness in the detection of problematic plant growth areas.
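The NDVI used in the comparison is the standard normalized ratio of near-infrared and red reflectance. A minimal sketch:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values approaching 1 indicate dense, vigorous vegetation; values
    near 0 indicate bare soil or sparse/stressed canopies."""
    return (nir - red) / (nir + red)
```

Because each sensor's red and NIR bands have different spectral response functions and viewing geometry, the same formula applied to W2 and S2 data over the same field generally yields slightly different NDVI values, which is the inter-sensor discrepancy the study quantifies.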
MODIS and SeaWiFS on-orbit lunar calibration
Sun, Jielun; Eplee, R.E.; Xiong, X.; Stone, T.; Meister, G.; McClain, C.R.
2008-01-01
The Moon plays an important role in the radiometric stability monitoring of the NASA Earth Observing System's (EOS) remote sensors. MODIS and SeaWiFS are two of the key instruments for NASA's EOS missions. The MODIS Protoflight Model (PFM) on board the Terra spacecraft and the MODIS Flight Model 1 (FM1) on board the Aqua spacecraft were launched on December 18, 1999 and May 4, 2002, respectively. They view the Moon through the Space View (SV) port approximately once a month to monitor the long-term radiometric stability of their Reflective Solar Bands (RSB). SeaWiFS was launched on board the OrbView-2 spacecraft on August 1, 1997. The SeaWiFS lunar calibrations are obtained once a month at a nominal phase angle of 7°. The lunar irradiance observed by these instruments depends on the viewing geometry. The USGS photometric model of the Moon (the ROLO model) has been developed to provide the geometric corrections for the lunar observations. For MODIS, the lunar view responses with corrections for the viewing geometry are used to track the gain change for its reflective solar bands (RSB). They trend the system response degradation at the Angle Of Incidence (AOI) of the sensor's SV port. With both the lunar observations and the on-board Solar Diffuser (SD) calibration, it is shown that the MODIS system response degradation is wavelength, mirror side, and AOI dependent. Time-dependent Response Versus Scan angle (RVS) Look-Up Tables (LUT) are applied in MODIS RSB calibration, and lunar observations play a key role in RVS derivation. The corrections provided by the RVS in the Terra and Aqua MODIS data from the 412 nm band are as large as 16% and 13%, respectively. For SeaWiFS lunar calibrations, the spacecraft is pitched across the Moon so that the instrument views the Moon near nadir through the same optical path as it views the Earth. The SeaWiFS system gain changes for its eight bands are calibrated using the geometrically-corrected lunar observations.
The radiometric corrections to the SeaWiFS data, after more than ten years on orbit, are 19% at 865 nm, 8% at 765 nm, and 1-3% in the other bands. In this report, the lunar calibration algorithms are reviewed and the RSB gain changes observed in the lunar observations are shown for all three sensors. The lunar observations for the three instruments are compared using the USGS photometric model. The USGS lunar model facilitates the cross calibration of instruments with different spectral bandpasses whose measurements of the Moon differ in time and observing geometry.
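The band gain trending described above amounts to comparing each geometrically corrected lunar measurement against the model prediction for the same geometry. A hedged sketch of this ratio-based trending (illustrative only, not the operational MODIS/SeaWiFS code):

```python
def relative_gain(observed_irradiance, model_irradiance, launch_ratio=1.0):
    """Sensor response relative to a reference epoch: the observed
    lunar irradiance is divided by the geometry-matched model
    prediction (e.g., from ROLO), then normalized by the same ratio
    evaluated at the reference (launch) epoch. Values below 1 indicate
    response degradation."""
    return (observed_irradiance / model_irradiance) / launch_ratio
```

Because the model removes the geometry dependence, a time series of these ratios isolates the instrument's own degradation.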
WindCam and MSPI: two cloud and aerosol instrument concepts derived from Terra/MISR heritage
NASA Astrophysics Data System (ADS)
Diner, David J.; Mischna, Michael; Chipman, Russell A.; Davis, Ab; Cairns, Brian; Davies, Roger; Kahn, Ralph A.; Muller, Jan-Peter; Torres, Omar
2008-08-01
The Multi-angle Imaging SpectroRadiometer (MISR) has been acquiring global cloud and aerosol data from polar orbit since February 2000. MISR acquires moderately high-resolution imagery at nine view angles from nadir to 70.5°, in four visible/near-infrared spectral bands. Stereoscopic parallax, time lapse among the nine views, and the variation of radiance with angle and wavelength enable retrieval of geometric cloud and aerosol plume heights, height-resolved cloud-tracked winds, and aerosol optical depth and particle property information. Two instrument concepts based upon MISR heritage are in development. The Cloud Motion Vector Camera, or WindCam, is a simplified version comprised of a lightweight, compact, wide-angle camera to acquire multiangle stereo imagery at a single visible wavelength. A constellation of three WindCam instruments in polar Earth orbit would obtain height-resolved cloud-motion winds with daily global coverage, making it a low-cost complement to a spaceborne lidar wind measurement system. The Multiangle SpectroPolarimetric Imager (MSPI) is aimed at aerosol and cloud microphysical properties, and is a candidate for the National Research Council Decadal Survey's Aerosol-Cloud-Ecosystem (ACE) mission. MSPI combines the capabilities of MISR with those of other aerosol sensors, extending the spectral coverage to the ultraviolet and shortwave infrared and incorporating high-accuracy polarimetric imaging. Based on requirements for the nonimaging Aerosol Polarimeter Sensor on NASA's Glory mission, a degree of linear polarization uncertainty of 0.5% is specified within a subset of the MSPI bands. We are developing a polarization imaging approach using photoelastic modulators (PEMs) to accomplish this objective.
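The 0.5% degree-of-linear-polarization uncertainty cited above refers to the standard Stokes-parameter quantity, which a polarimetric imager like MSPI estimates per pixel. A minimal sketch of the definition (not the PEM-based retrieval itself):

```python
import math

def degree_of_linear_polarization(i, q, u):
    """Degree of linear polarization from the Stokes parameters:
    DoLP = sqrt(Q^2 + U^2) / I, ranging from 0 (unpolarized)
    to 1 (fully linearly polarized)."""
    return math.sqrt(q * q + u * u) / i
```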
NASA Technical Reports Server (NTRS)
Kimes, D. S.
1979-01-01
The effects of vegetation canopy structure on thermal infrared sensor response must be understood before vegetation surface temperatures of canopies with low percent ground cover can be accurately inferred. The response of a sensor is a function of vegetation geometric structure, the vertical surface temperature distribution of the canopy components, and sensor view angle. Large deviations between the nadir sensor effective radiant temperature (ERT) and vegetation ERT for a soybean canopy were observed throughout the growing season. The nadir sensor ERT of a soybean canopy with 35 percent ground cover deviated from the vegetation ERT by as much as 11 C during the mid-day. These deviations were quantitatively explained as a function of canopy structure and soil temperature. Remote sensing techniques which determine the vegetation canopy temperature(s) from the sensor response need to be studied.
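The deviation between sensor ERT and vegetation ERT described above can be illustrated with a minimal sketch, assuming (as a simplification of the analysis) that the nadir-viewed radiance is an area-weighted mix of blackbody emission from soil and vegetation. The function name and the temperature values are hypothetical, not the paper's data.

```python
# Hedged sketch: composite effective radiant temperature (ERT) of a partially
# vegetated scene, assuming the sensor-viewed radiance is an area-weighted mix
# of blackbody (T^4) emission from soil and vegetation components.

def effective_radiant_temperature(t_veg_k, t_soil_k, veg_fraction):
    """Composite ERT in kelvin from component temperatures (kelvin) and the
    fraction of the sensor footprint covered by vegetation."""
    radiance = veg_fraction * t_veg_k**4 + (1.0 - veg_fraction) * t_soil_k**4
    return radiance ** 0.25

# Example: 35% ground cover, cool canopy over hot midday soil (invented values).
t_canopy = 300.0   # K
t_soil = 320.0     # K
ert = effective_radiant_temperature(t_canopy, t_soil, 0.35)
deviation = ert - t_canopy   # nadir sensor ERT minus vegetation ERT
```

With low percent cover, the hot soil background pulls the composite ERT well above the canopy temperature, the same direction of deviation the abstract reports.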
NASA Technical Reports Server (NTRS)
Natanson, G. A.
1997-01-01
New algorithms are described covering the simulation, processing, and calibration of penetration angles of the Barnes static Earth sensor assembly (SESA) as implemented in the Goddard Space Flight Center Flight Dynamics Division ground support system for the Tropical Rainfall Measuring Mission (TRMM) Observatory. The new treatment involves a detailed analysis of the measurements by individual quadrants. It is shown that, to a good approximation, individual quadrant misalignments can be treated simply as penetration angle biases. Simple formulas suitable for real-time applications are introduced for computing quadrant-dependent effects. The simulator generates penetration angles by solving a quadratic equation with coefficients uniquely determined by the spacecraft's position and the quadrant's orientation in GeoCentric Inertial (GCI) coordinates. Measurement processing for attitude determination is based on linearized equations obtained by expanding the coefficients of the aforementioned quadratic equation as a Taylor series in both the Earth oblateness coefficient (alpha approx. 1/150) and the angle between the pointing axis and the geodetic nadir vector. A simple formula relating a measured value of the penetration angle to the deviation of the Earth-pointed axis from the geodetic nadir vector is derived. It is shown that even near the very edge of the quadrant's Field Of View (FOV), attitude errors resulting from quadratic effects are a few hundredths of a degree, which is small compared to the attitude determination accuracy requirement (0.18 degree, 3 sigma) of TRMM. Calibration of SESA measurements is complicated by a first-order filtering used in the TRMM onboard algorithm to compute penetration angles from raw voltages. A simple calibration scheme is introduced where these complications are avoided by treating penetration angles as the primary raw measurements, which are adjusted using biases and scale factors. 
In addition to three misalignment parameters, the calibration state vector contains only two average penetration angle biases (one for each pair of opposite quadrants) since, because of the very narrow sensor FOV (+/- 2.6 degrees), differences between biases of the penetration angles measured by opposite quadrants cannot be distinguished from roll and pitch sensor misalignments. After calibration, the estimated misalignments and average penetration angle biases are converted to the four penetration angle biases and to the yaw misalignment angle. The resultant biases and the estimated scale factors are finally used to update the coefficients necessary for onboard computations of penetration angles from measured voltages.
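The bias-and-scale adjustment and the opposite-quadrant pairing described above can be sketched as follows; the function names and the simple half-difference relation are illustrative assumptions, not the flight algorithm.

```python
# Illustrative sketch (names hypothetical): penetration angles are treated as
# the primary raw measurements and adjusted with a per-quadrant bias and scale
# factor, as in the calibration scheme described in the abstract.

def calibrate_penetration_angle(measured_deg, bias_deg=0.0, scale=1.0):
    """Calibrated penetration angle in degrees."""
    return scale * measured_deg - bias_deg

def axis_deviation_deg(cal_angle_a, cal_angle_b):
    """Roll (or pitch) deviation from geodetic nadir, approximated here as half
    the difference of an opposite-quadrant pair of calibrated angles."""
    return 0.5 * (cal_angle_a - cal_angle_b)

# Example: equal penetration angles on opposite quadrants imply zero deviation.
dev = axis_deviation_deg(calibrate_penetration_angle(1.2),
                         calibrate_penetration_angle(1.2))
```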
Microwave Brightness Temperatures of Tilted Convective Systems
NASA Technical Reports Server (NTRS)
Hong, Ye; Haferman, Jeffrey L.; Olson, William S.; Kummerow, Christian D.
1998-01-01
Aircraft and ground-based radar data from the Tropical Ocean and Global Atmosphere Coupled-Ocean Atmosphere Response Experiment (TOGA COARE) show that convective systems are not always vertical. Instead, many are tilted from vertical. Satellite passive microwave radiometers observe the atmosphere at a viewing angle. For example, the Special Sensor Microwave/Imager (SSM/I) on Defense Meteorological Satellite Program (DMSP) satellites and the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) on the TRMM satellite have an incident angle of about 50 deg. Thus, the brightness temperature measured from one direction of tilt may differ from that viewed from the opposite direction because of the different optical depths. This paper investigates the passive microwave brightness temperatures of tilted convective systems. To account for the effect of tilt, a 3-D backward Monte Carlo radiative transfer model has been applied to a simple tilted cloud model and to a dynamically evolving cloud model to derive brightness temperatures. The radiative transfer results indicate that brightness temperature varies with viewing angle because of the different optical depths. The tilt increases the displacement between the high 19 GHz brightness temperature (Tb(sub 19)), due to liquid emission from the lower levels of the cloud, and the low 85 GHz brightness temperature (Tb(sub 85)), due to ice scattering from the upper levels of the cloud. As the resolution degrades, the difference in brightness temperature due to the change of viewing angle decreases dramatically. The dislocation between Tb(sub 19) and Tb(sub 85), however, remains prominent.
Impact of Surface Roughness on AMSR-E Sea Ice Products
NASA Technical Reports Server (NTRS)
Stroeve, Julienne C.; Markus, Thorsten; Maslanik, James A.; Cavalieri, Donald J.; Gasiewski, Albin J.; Heinrichs, John F.; Holmgren, Jon; Perovich, Donald K.; Sturm, Matthew
2006-01-01
This paper examines the sensitivity of Advanced Microwave Scanning Radiometer (AMSR-E) brightness temperatures (Tbs) to surface roughness by using a radiative transfer model to simulate AMSR-E Tbs as a function of the incidence angle at which the surface is viewed. The simulated Tbs are then used to examine the influence that surface roughness has on two operational sea ice algorithms, namely: 1) the National Aeronautics and Space Administration Team (NT) algorithm and 2) the enhanced NT algorithm, as well as the impact of roughness on the AMSR-E snow depth algorithm. Surface snow and ice data collected during the AMSR-Ice03 field campaign held in March 2003 near Barrow, AK, were used to force the radiative transfer model, and the resultant modeled Tbs are compared with airborne passive microwave observations from the Polarimetric Scanning Radiometer. Results indicate that passive microwave Tbs are very sensitive even to small variations in incidence angle, which can cause either an over- or underestimation of the true amount of sea ice in the pixel area viewed. For example, this paper showed that if the sea ice areas modeled here were assumed to be completely smooth, sea ice concentrations were underestimated by nearly 14% using the NT sea ice algorithm and by 7% using the enhanced NT algorithm. A comparison of polarization ratios (PRs) at 10.7, 18.7, and 37 GHz indicates that each channel responds to different degrees of surface roughness and suggests that the PR at 10.7 GHz can be useful for identifying locations of heavily ridged or rubbled ice. Using the PR at 10.7 GHz to derive an "effective" viewing angle, which is used as a proxy for surface roughness, resulted in more accurate retrievals of sea ice concentration for both algorithms.
The AMSR-E snow depth algorithm was found to be extremely sensitive to instrument calibration and sensor viewing angle, and it is concluded that more work is needed to investigate the sensitivity of the gradient ratio at 37 and 18.7 GHz to these factors to improve snow depth retrievals from spaceborne passive microwave sensors.
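The polarization ratio and the 37/18.7 GHz gradient ratio referred to above are simple normalized differences of brightness temperatures; a minimal sketch, with illustrative values rather than AMSR-E data:

```python
# Minimal sketch of the polarization ratio (PR) and spectral gradient ratio
# (GR) used by passive-microwave sea ice algorithms. Inputs are brightness
# temperatures in kelvin; the example values are invented.

def polarization_ratio(tb_v, tb_h):
    """PR at one frequency: vertical vs. horizontal polarization."""
    return (tb_v - tb_h) / (tb_v + tb_h)

def gradient_ratio(tb_v_hi, tb_v_lo):
    """GR between a higher and a lower frequency, both V-polarized."""
    return (tb_v_hi - tb_v_lo) / (tb_v_hi + tb_v_lo)

pr_10 = polarization_ratio(250.0, 230.0)   # hypothetical 10.7 GHz pair
gr_37_19 = gradient_ratio(200.0, 220.0)    # hypothetical 37 vs 18.7 GHz pair
```

Rougher surfaces tend to depolarize the emission, so a lowered PR at 10.7 GHz is what makes it a candidate roughness proxy in the study above.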
Autonomous navigation accuracy using simulated horizon sensor and sun sensor observations
NASA Technical Reports Server (NTRS)
Pease, G. E.; Hendrickson, H. T.
1980-01-01
A relatively simple autonomous system which would use horizon crossing indicators, a sun sensor, a quartz oscillator, and a microprogrammed computer is discussed. The sensor combination is required only to effectively measure the angle between the centers of the Earth and the Sun. Simulations for a particular orbit indicate that 2 km r.m.s. orbit determination uncertainties may be expected from a system with 0.06 deg measurement uncertainty. A key finding is that knowledge of the satellite orbit plane orientation can be maintained to this level because of the annual motion of the Sun and the predictable effects of Earth oblateness. The basic system described can be updated periodically by transits of the Moon through the IR horizon crossing indicator fields of view.
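The single observable of the system described above is the angle between the centers of the Earth and the Sun; a minimal sketch of that measurement model, assuming unit-direction vectors expressed in a common frame (function and variable names are hypothetical):

```python
# Hedged sketch: the Earth-Sun separation angle as seen by the spacecraft,
# computed from direction vectors to the two bodies in the same frame.
import math

def angle_between_deg(u, v):
    """Angle in degrees between two 3-vectors (need not be unit length)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp to guard against round-off pushing the cosine out of [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

# Example with invented geometry: Sun along +z, Earth at 45 deg from it.
sep = angle_between_deg((0.0, 0.0, 1.0), (1.0, 0.0, 1.0))
```

A 0.06 deg uncertainty on this one angle, as in the simulation above, is what propagates into the roughly 2 km r.m.s. orbit uncertainty.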
A novel double fine guide sensor design on space telescope
NASA Astrophysics Data System (ADS)
Zhang, Xu-xu; Yin, Da-yi
2018-02-01
To obtain high-precision attitude for a space telescope, a double marginal FOV (field of view) FGS (Fine Guide Sensor) is proposed. It is composed of two large-area APS CMOS sensors that share the same lens in the main line of sight. More star vectors can be obtained from the two FGSs and used for high-precision attitude determination. To improve star identification speed, a vector cross-product formulation of inter-star angles for the small marginal FOV, which differs from the traditional approach, is elaborated, and parallel processing is applied to the pyramid algorithm. The star vectors from the two sensors are then fused into an attitude estimate with the traditional QUEST algorithm. The simulation results show that the system can obtain high-accuracy three-axis attitudes and that the scheme is feasible.
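The inter-star angle underlying the identification step can be computed from the cross and dot products of the star vectors; a minimal sketch using the atan2 form, which stays numerically robust for the small angles of a narrow FOV (the paper's exact formulation is not reproduced here):

```python
# Hedged sketch: inter-star angular separation from two star unit vectors,
# using |u x v| and u . v. atan2 avoids the precision loss of acos(dot)
# when the separation angle is small.
import math

def inter_star_angle_rad(u, v):
    cx = u[1] * v[2] - u[2] * v[1]   # cross product components
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    cross = math.sqrt(cx * cx + cy * cy + cz * cz)
    dot = u[0] * v[0] + u[1] * v[1] + u[2] * v[2]
    return math.atan2(cross, dot)
```

In a pyramid-style scheme, such pairwise angles are matched against a catalog of precomputed separations to identify the observed stars.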
Li, Feihu; Tang, Bingtao; Wu, Suli; Zhang, Shufen
2017-01-01
The synthesis and assembly of monodispersed colloidal spheres are currently the subject of extensive investigation to fabricate artificial structural color materials. However, artificial structural colors from general colloidal crystals still suffer from low color visibility and strong viewing angle dependence, which seriously hinder their practical application in paints, colorimetric sensors, and color displays. Herein, monodispersed polysulfide (PSF) spheres with an intrinsically high refractive index (as high as 1.858) and light-absorbing characteristics are designed and synthesized through a facile polycondensation and crosslinking process between sodium disulfide and 1,2,3-trichloropropane. Owing to their high monodispersity, sufficient surface charge, and good dispersion stability, the PSF spheres can be assembled into large-scale and high-quality 3D photonic crystals. More importantly, high structural color visibility and a broad viewing angle are easily achieved because the unique features of PSF remarkably enhance the relative reflectivity and eliminate the disturbance of scattering and background light. The results of this study provide a simple and efficient strategy to create structural colors with high color visibility, which is very important for their practical application. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ten Years of MISR Observations from Terra: Looking Back, Ahead, and in Between
NASA Technical Reports Server (NTRS)
Diner, David J.; Ackerman, Thomas P.; Braverman, Amy J.; Bruegge, Carol J.; Chopping, Mark J.; Clothiaux, Eugene E.; Davies, Roger; Di Girolamo, Larry; Kahn, Ralph A.; Knyazikhin, Yuri;
2010-01-01
The Multi-angle Imaging SpectroRadiometer (MISR) instrument has been collecting global Earth data from NASA's Terra satellite since February 2000. With its nine along-track view angles, four visible/near-infrared spectral bands, intrinsic spatial resolution of 275 m, and stable radiometric and geometric calibration, no previous spaceborne instrument has combined MISR's attributes. The more than 10-year (and counting) MISR data record provides unprecedented opportunities for characterizing long-term trends in aerosol, cloud, and surface properties, and includes 3-D textural information conventionally thought to be accessible only to active sensors.
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei
2016-01-01
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
Scanning mirror for infrared sensors
NASA Technical Reports Server (NTRS)
Anderson, R. H.; Bernstein, S. B.
1972-01-01
A high resolution, long life angle-encoded scanning mirror, built for application in an infrared attitude sensor, is described. The mirror uses a Moire' fringe type optical encoder and unique torsion bar suspension together with a magnetic drive to meet stringent operational and environmental requirements at a minimum weight and with minimum power consumption. Details of the specifications, design, and construction are presented with an analysis of the mirror suspension that allows accurate prediction of performance. The emphasis is on mechanical design considerations, and brief discussions are included on the encoder and magnetic drive to provide a complete view of the mirror system and its capabilities.
Ren, Huazhong; Yan, Guangjian; Liu, Rongyuan; Li, Zhao-Liang; Qin, Qiming; Nerry, Françoise; Liu, Qiang
2015-03-27
Multi-angular observation of land surface thermal radiation is considered to be a promising method of performing the angular normalization of land surface temperature (LST) retrieved from remote sensing data. This paper focuses on an investigation of the minimum requirements of viewing angles to perform such normalizations on LST. The commonly used kernel-driven bi-directional reflectance distribution function (BRDF) is first extended to the thermal infrared (TIR) domain as the TIR-BRDF model, and its uncertainty is shown to be less than 0.3 K when used to fit the hemispheric directional thermal radiation. A local optimum three-angle combination is found and verified using the TIR-BRDF model based on two patterns: the single-point pattern and the linear-array pattern. The TIR-BRDF model is applied to an airborne multi-angular dataset to retrieve LST at nadir (Te-nadir) from different viewing directions, and the results show that this model can obtain reliable Te-nadir from 3 to 4 directional observations with large angle intervals, thus corresponding to large temperature angular variations. The Te-nadir is generally larger than the temperature in slant directions, with a difference of approximately 0.5~2.0 K for vegetated pixels and up to several kelvins for non-vegetated pixels. The findings of this paper will facilitate the future development of multi-angular thermal infrared sensors.
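A kernel-driven directional-temperature fit of the kind described can be sketched as a small least-squares problem, T(theta) = f_iso + f_vol*K_vol(theta) + f_geo*K_geo(theta), where the nadir temperature equals f_iso when both kernels vanish at nadir. The kernel shapes below are simplified stand-ins chosen for illustration, not the TIR-BRDF model's actual kernels.

```python
# Hedged sketch of a kernel-driven directional fit recovering the nadir
# temperature from >= 3 off-nadir observations. Kernels are simplified
# placeholders that vanish at nadir (an assumption for this illustration).
import math

def k_vol(theta_deg):
    return 1.0 - math.cos(math.radians(theta_deg))

def k_geo(theta_deg):
    return math.tan(math.radians(theta_deg)) ** 2

def fit_nadir_temperature(obs):
    """obs: list of (view_zenith_deg, temperature_K). Returns T at nadir."""
    rows = [(1.0, k_vol(t), k_geo(t), temp) for t, temp in obs]
    # Normal equations for the 3-parameter linear least-squares fit.
    a = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * r[3] for r in rows) for i in range(3)]
    # Forward elimination (matrix is symmetric positive definite, no pivoting).
    for i in range(3):
        p = a[i][i]
        for j in range(i + 1, 3):
            f = a[j][i] / p
            a[j] = [x - f * y for x, y in zip(a[j], a[i])]
            b[j] -= f * b[i]
    f_geo = b[2] / a[2][2]
    f_vol = (b[1] - a[1][2] * f_geo) / a[1][1]
    f_iso = (b[0] - a[0][1] * f_vol - a[0][2] * f_geo) / a[0][0]
    return f_iso  # both kernels are zero at nadir, so T(0) = f_iso

# Example: synthesize directional temperatures from known coefficients
# (300 K isotropic term), then recover the nadir temperature.
angles = [10.0, 25.0, 40.0, 55.0]
obs = [(a, 300.0 + 2.0 * k_vol(a) - 1.0 * k_geo(a)) for a in angles]
t_nadir = fit_nadir_temperature(obs)
```

The fit becomes better conditioned when the view angles span a large interval, consistent with the paper's finding that widely separated directions yield reliable Te-nadir.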
Biomimetic virus-based colourimetric sensors.
Oh, Jin-Woo; Chung, Woo-Jae; Heo, Kwang; Jin, Hyo-Eon; Lee, Byung Yang; Wang, Eddie; Zueger, Chris; Wong, Winnie; Meyer, Joel; Kim, Chuntae; Lee, So-Young; Kim, Won-Geun; Zemla, Marcin; Auer, Manfred; Hexemer, Alexander; Lee, Seung-Wuk
2014-01-01
Many materials in nature change colours in response to stimuli, making them attractive for use as sensor platforms. However, both natural materials and their synthetic analogues lack selectivity towards specific chemicals, and introducing such selectivity remains a challenge. Here we report the self-assembly of genetically engineered viruses (M13 phage) into target-specific, colourimetric biosensors. The sensors are composed of phage-bundle nanostructures and exhibit viewing-angle-independent colour, similar to collagen structures in turkey skin. On exposure to various volatile organic chemicals, the structures rapidly swell and undergo distinct colour changes. Furthermore, sensors composed of phage displaying trinitrotoluene (TNT)-binding peptide motifs identified from a phage display selectively distinguish TNT down to 300 p.p.b. over similarly structured chemicals. Our tunable, colourimetric sensors can be useful for the detection of a variety of harmful toxicants and pathogens to protect human health and national security.
Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System
Lu, Yu; Wang, Keyi; Fan, Gongshu
2016-01-01
A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. The linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images so that panoramas reflect the scene luminance more faithfully. This overcomes the limitation of stitching methods that produce realistic panoramas only through smoothing. The dynamic range limitation of a single image sensor with a wide-angle lens can be resolved by using multiple cameras that together cover a large field of view. The dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
Physical Interpretation of the Correlation Between Multi-Angle Spectral Data and Canopy Height
NASA Technical Reports Server (NTRS)
Schull, M. A.; Ganguly, S.; Samanta, A.; Huang, D.; Shabanov, N. V.; Jenkins, J. P.; Chiu, J. C.; Marshak, A.; Blair, J. B.; Myneni, R. B.;
2007-01-01
Recent empirical studies have shown that multi-angle spectral data can be useful for predicting canopy height, but the physical reason for this correlation was not understood. We follow the concept of canopy spectral invariants, specifically escape probability, to gain insight into the observed correlation. Airborne Multi-Angle Imaging Spectrometer (AirMISR) and airborne Laser Vegetation Imaging Sensor (LVIS) data acquired during a NASA Terrestrial Ecology Program aircraft campaign underlie our analysis. Two multivariate linear regression models were developed to estimate LVIS height measures from 28 AirMISR multi-angle spectral reflectances and from the spectrally invariant escape probability at 7 AirMISR view angles. Both models achieved nearly the same accuracy, suggesting that canopy spectral invariant theory can explain the observed correlation. We hypothesize that the escape probability is sensitive to the aspect ratio (crown diameter to crown height). The multi-angle spectral data alone therefore may not provide enough information to retrieve canopy height globally.
NASA Astrophysics Data System (ADS)
Markiet, Vincent; Perheentupa, Viljami; Mõttus, Matti; Hernández-Clemente, Rocío
2016-04-01
Imaging spectroscopy is a remote sensing technology which records continuous spectral data at a very high (better than 10 nm) resolution. Such spectral images can be used to monitor, for example, the photosynthetic activity of vegetation. Photosynthetic activity depends on varying light conditions and varies within the canopy. To measure this variation we need very high spatial resolution data, with resolution better than the dominating canopy element size (e.g., tree crown in a forest canopy). This is useful, e.g., for detecting photosynthetic downregulation and thus plant stress. Canopy illumination conditions are often quantified using the shadow fraction: the fraction of visible foliage which is not sunlit. Shadow fraction is known to depend on view angle (e.g., hot spot images have very low shadow fraction). Hence, multiple observation angles potentially increase the range of shadow fraction in high spatial resolution imaging spectroscopy data. To investigate the potential of multi-angle imaging spectroscopy for studying canopy processes which vary with shadow fraction, we obtained a unique multiangular airborne imaging spectroscopy dataset for the Hyytiälä forest research station located in Finland (61° 50'N, 24° 17'E) in July 2015. The main tree species are Norway spruce (Picea abies L. karst), Scots pine (Pinus sylvestris L.) and birch (Betula pubescens Ehrh., Betula pendula Roth). We used an airborne hyperspectral sensor, AISA Eagle II (Specim - Spectral Imaging Ltd., Finland), mounted on a tilting platform. The tilting platform allowed us to measure at nadir and approximately 35 degrees off-nadir. The hyperspectral sensor has a 37.5 degree field of view (FOV), 0.6 m pixel size, and 128 spectral bands with an average spectral bandwidth of 4.6 nm, and is sensitive in the 400-1000 nm spectral region.
The airborne data was radiometrically, atmospherically and geometrically processed using the PARGE and ATCOR software (ReSe Applications Schläpfer, Switzerland). However, even after meticulous geolocation, the canopy elements (needles) seen from the three view angles were different: at each overpass, different parts of the same crowns were observed. To overcome this, we used a 200 m x 200 m test site covered with pure pine stands. We assumed that the sunlit, shaded and understory spectral signatures are independent of viewing direction up to a constant BRDF factor. Thus, we compared the spectral signatures for sunlit and shaded canopy and understory obtained for each view direction. We visually selected six hundred of the brightest and darkest canopy pixels. Next, we performed a minimum noise fraction (MNF) transformation, created a pixel purity index (PPI) and used Envi's n-D scatterplot to determine pure spectral signatures for the two classes. The pure endmembers for different view angles were compared to determine the BRDF factor and to analyze its spectral invariance. We demonstrate the compatibility of multi-angle data with high spatial resolution data. In principle, both carry similar information on structured (non-flat) targets such as a vegetation canopy. Nevertheless, multiple view angles helped us to extend the range of shadow fraction in the images. Also, correct separation of shaded crown and shaded understory pixels remains a challenge.
Flight calibration tests of a nose-boom-mounted fixed hemispherical flow-direction sensor
NASA Technical Reports Server (NTRS)
Armistead, K. H.; Webb, L. D.
1973-01-01
Flight calibrations of a fixed hemispherical flow angle-of-attack and angle-of-sideslip sensor were made from Mach numbers of 0.5 to 1.8. Maneuvers were performed by an F-104 airplane at selected altitudes to compare the measurement of flow angle of attack from the fixed hemispherical sensor with that from a standard angle-of-attack vane. The hemispherical flow-direction sensor measured differential pressure at two angle-of-attack ports and two angle-of-sideslip ports in diametrically opposed positions. Stagnation pressure was measured at a center port. The results of these tests showed that the calibration curves for the hemispherical flow-direction sensor were linear for angles of attack up to 13 deg. The overall uncertainty in determining angle of attack from these curves was plus or minus 0.35 deg or less. A Mach number position error calibration curve was also obtained for the hemispherical flow-direction sensor. The hemispherical flow-direction sensor exhibited a much larger position error than a standard uncompensated pitot-static probe.
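The linear calibration curves described above map a normalized differential pressure to a flow angle; a minimal sketch with invented placeholder coefficients (the report's actual calibration constants are not reproduced here):

```python
# Hypothetical sketch: angle of attack recovered from the differential
# pressure across the two alpha ports of a hemispherical flow-direction
# sensor, normalized by the center-port (stagnation) pressure. The slope
# and offset are invented placeholders, valid only as an illustration of
# the linear calibration the report describes for angles up to ~13 deg.

def angle_of_attack_deg(dp_alpha, p_center, slope_deg=12.5, offset_deg=0.0):
    """Linear calibration: angle (deg) from normalized differential pressure."""
    return slope_deg * (dp_alpha / p_center) + offset_deg

aoa = angle_of_attack_deg(0.2, 1.0)  # illustrative normalized pressures
```

In practice the slope and offset would be determined per Mach number from maneuvers against a reference angle-of-attack vane, as in the flight tests above.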
NASA Astrophysics Data System (ADS)
Rengarajan, Rajagopalan
Moderate resolution remote sensing data offer the potential to monitor long- and short-term trends in the condition of the Earth's resources at finer spatial scales and over longer time periods. While improved calibration (radiometric and geometric), free access (Landsat, Sentinel, CBERS), and higher-level products in reflectance units have made it easier for the science community to derive biophysical parameters from these remotely sensed data, a number of issues still affect the analysis of multi-temporal datasets. These are primarily due to sources that are inherent in the process of imaging from single or multiple sensors. Some of these undesired or uncompensated sources of variation include variation in the view angles, illumination angles, atmospheric effects, and sensor effects such as Relative Spectral Response (RSR) variation between different sensors. The complex interaction of these sources of variation would make their study extremely difficult if not impossible with real data, and therefore a simulated analysis approach is used in this study. A synthetic forest canopy is produced using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and its measured BRDFs are modeled using the RossLi canopy BRDF model. The simulated BRDF matches the real data to within 2% of the reflectance in the red and NIR spectral bands studied. The BRDF modeling process is extended to model and characterize the defoliation of a forest, which is used in factor sensitivity studies to estimate the effect of each factor for varying environment and sensor conditions. Finally, a factorial experiment is designed to understand the significance of the sources of variation, and regression-based analyses are performed to understand the relative importance of the factors.
The design of experiment and the sensitivity analysis conclude that the atmospheric attenuation and variations due to the illumination angles are the dominant sources impacting the at-sensor radiance.
Multispectral determination of soil moisture. [Guymon, Oklahoma
NASA Technical Reports Server (NTRS)
Estes, J. E.; Simonett, D. S. (Principal Investigator); Hajic, E. J.; Blanchard, B. J.
1980-01-01
The edited Guymon soil moisture data collected on August 2, 5, 14, 17, 1978 were grouped into four field cover types for statistical analysis. These are the bare, milo with rows parallel to field of view, milo with rows perpendicular to field of view and alfalfa cover groups. There are 37, 22, 24 and 14 observations respectively in each group for each sensor channel and each soil moisture layer. A subset of these data called the 'five cover set' (VEG5) limited the scatterometer data to the 15 deg look angle and was used to determine discriminant functions and combined group regressions.
Thermal IR exitance model of a plant canopy
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Smith, J. A.; Link, L. E.
1981-01-01
A thermal IR exitance model of a plant canopy based on a mathematical abstraction of three horizontal layers of vegetation was developed. Canopy geometry within each layer is quantitatively described by the foliage and branch orientation distributions and number density. Given this geometric information for each layer and the driving meteorological variables, a system of energy budget equations was determined and solved for average layer temperatures. These estimated layer temperatures, together with the angular distributions of radiating elements, were used to calculate the emitted thermal IR radiation as a function of view angle above the canopy. The model was applied to a lodgepole pine (Pinus contorta) canopy over a diurnal cycle. Simulated and measured radiometric average temperatures of the midcanopy layer agreed to within 2 C. Simulation results suggested that canopy geometry can significantly influence the effective radiant temperature recorded at varying sensor view angles.
Study of a Solar Sensor for use in Space Vehicle Orientation Control Systems
NASA Technical Reports Server (NTRS)
Spencer, Paul R.
1961-01-01
The solar sensor described herein may be used for a variety of space operations requiring solar orientation. The use of silicon solar cells as the sensing elements provides the sensor with sufficient capability to withstand the hazards of a space environment. A method of arranging the cells in a sensor consists simply of mounting them at a large angle to the base. The use of an opaque shield placed between the cells and perpendicular to the base enhances the small-angle sensitivity while adding slightly to the bulk of the sensor. The difference in illumination of these cells as the result of an oblique incidence of the light rays from the reference source causes an electrical error signal which, when used in a battery-bridge circuit, requires a minimum of electrical processing for use in a space-vehicle orientation control system. An error which could occur after prolonged operation of the sensor is that resulting from asymmetrical aging of opposite cells. This could be periodically corrected with a balance potentiometer. A more routine error in the sensor is that produced by reflected earth radiation. This error may be eliminated over a large portion of the operation time by restricting the field of view and, consequently, the capture capability. A more sophisticated method of eliminating this error is to use separate sensors, for capture and fine pointing, along with a switching device. An experimental model has been constructed and tested to yield an output sensitivity of 1.2 millivolts per second of arc with a load resistance of 1,000 ohms and a reference light source of approximately 1,200 foot-candles delivered at the sensor.
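The differential-cell principle described above can be sketched numerically. The cosine response and the ±60° mounting angle below are illustrative assumptions, not values from the report:

```python
import math

def cell_current(incidence_rad):
    """Short-circuit current of a silicon cell, modeled as proportional
    to the cosine of the incidence angle (zero when lit from behind)."""
    return max(math.cos(incidence_rad), 0.0)

def error_signal(pointing_error_rad, mount_rad=math.radians(60)):
    """Difference of the currents of two opposed cells mounted at
    +/-mount_rad to the base.  On-axis sunlight illuminates both cells
    equally and the bridge output is zero; a pointing error produces a
    signed error signal suitable for an orientation control loop."""
    i_left = cell_current(mount_rad - pointing_error_rad)
    i_right = cell_current(mount_rad + pointing_error_rad)
    return i_left - i_right
```

Near the null the difference is approximately linear in the pointing error, which is what makes the battery-bridge readout so simple electrically.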
NASA Astrophysics Data System (ADS)
Jiang, Shanchao; Wang, Jing; Sui, Qingmei
2018-03-01
In order to achieve rotation angle measurement, a novel miniaturized fiber Bragg grating (FBG) rotation angle sensor with high measurement precision and temperature self-compensation is proposed and studied in this paper. The FBG rotation angle sensor mainly contains two core sensing elements (FBG1 and FBG2), a triangular cantilever beam, and a rotation angle transfer element. In theory, the proposed sensor can achieve temperature self-compensation through complementation of the two core sensing elements (FBG1 and FBG2), and it has a boundless angle measurement range with a 2π rad period due to the function of the rotation angle transfer element. After introducing the joint working processes, the theoretical calculation model of the FBG rotation angle sensor is established, and a calibration experiment on a prototype is carried out to obtain its measurement performance. Analysis of the experimental data shows that the measurement precision of the FBG rotation angle sensor prototype is 0.2° with excellent linearity, and the temperature sensitivities of FBG1 and FBG2 are 10 pm/°C and 10.1 pm/°C, respectively. All these experimental results confirm that the FBG rotation angle sensor can achieve large-range angle measurement with high precision and temperature self-compensation.
Variation of directional reflectance factors with structural changes of a developing alfalfa canopy
NASA Technical Reports Server (NTRS)
Kirchner, J. A.; Kimes, D. S.; Mcmurtrey, J. E., III
1982-01-01
Directional reflectance factors of an alfalfa canopy were determined and related to canopy structure, agronomic variables, and irradiance conditions at four periods during a cutting cycle. Nadir and off-nadir reflectance factors decreased with increasing biomass in Thematic Mapper band 3 (0.63-0.69 micrometers) and increased with increasing biomass in band 4 (0.76-0.90 micrometers). The sensor view angle had less impact on perceived reflectance as the alfalfa progressed from an erectophile canopy of stems after harvest to a near planophile canopy of leaves at maturity. Studies of directional reflectance are needed for testing and upgrading vegetation canopy models and to aid in the complex interpretation problems presented by aircraft scanners and pointable satellites where illumination and viewing geometries may vary widely. Distinct changes in the patterns of radiance observed by a sensor as structural and biomass changes occur are keys to monitoring the growth and condition of crops.
Miniature Wide-Angle Lens for Small-Pixel Electronic Camera
NASA Technical Reports Server (NTRS)
Mouroulis, Pantazis; Blazejewski, Edward
2009-01-01
A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example presented has a 60° field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.
High-angle-of-attack pneumatic lag and upwash corrections for a hemispherical flow direction sensor
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Heeg, Jennifer; Larson, Terry J.; Ehernberger, L. J.; Hagen, Floyd W.; Deleo, Richard V.
1987-01-01
As part of the NASA F-14 high-angle-of-attack flight test program, a nose-mounted hemispherical flow direction sensor was calibrated against a fuselage-mounted movable-vane flow angle sensor. Significant discrepancies were found to exist in the angle-of-attack measurements. A two-fold approach taken to resolve these discrepancies during subsonic flight is described. First, the sensing integrity of the isolated hemispherical sensor is established by wind tunnel data extending to an angle of attack of 60 deg. Second, two probable causes for the discrepancies, pneumatic lag and upwash, are examined. Methods of identifying and compensating for lag and upwash are presented. The wind tunnel data verify that the isolated hemispherical sensor is sufficiently accurate for static conditions at angles of attack up to 60 deg and angles of sideslip up to 30 deg. Analysis of flight data for two high-angle-of-attack maneuvers establishes that pneumatic lag and upwash are highly correlated with the discrepancies between the hemispherical and vane-type sensor measurements.
Software for Testing Electroactive Structural Components
NASA Technical Reports Server (NTRS)
Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar
2003-01-01
A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
On-Orbit Cross-Calibration of AM Satellite Remote Sensing Instruments using the Moon
NASA Technical Reports Server (NTRS)
Butler, James J.; Kieffer, Hugh H.; Barnes, Robert A.; Stone, Thomas C.
2003-01-01
On April 14, 2003, three Earth remote sensing spacecraft were maneuvered, enabling six satellite instruments operating in the visible through shortwave infrared wavelength region to view the Moon for purposes of on-orbit cross-calibration. These instruments included the Moderate Resolution Imaging Spectroradiometer (MODIS), the Multi-angle Imaging SpectroRadiometer (MISR), the Advanced Spaceborne Thermal Emission and Reflection (ASTER) radiometer on the Earth Observing System (EOS) Terra spacecraft, the Advanced Land Imager (ALI) and Hyperion instruments on the Earth Observing-1 (EO-1) spacecraft, and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) on the SeaStar spacecraft. Observations of the Moon were compared using a spectral photometric model for lunar irradiance developed by the Robotic Lunar Observatory (ROLO) project located at the United States Geological Survey in Flagstaff, Arizona. The ROLO model effectively accounts for variations in lunar irradiance corresponding to lunar phase and libration angles, allowing intercomparison of observations made by instruments on different spacecraft under different time and location conditions. The spacecraft maneuvers necessary to view the Moon are briefly described and results of using the lunar irradiance model in comparing the radiometric calibration scales of the six satellite instruments are presented here.
Li, Jiang; Bifano, Thomas G.; Mertz, Jerome
2016-01-01
We describe a wavefront sensor strategy for the implementation of adaptive optics (AO) in microscope applications involving thick, scattering media. The strategy is based on the exploitation of multiple scattering to provide oblique back illumination of the wavefront-sensor focal plane, enabling a simple and direct measurement of the flux-density tilt angles caused by aberrations at this plane. Advantages of the sensor are that it provides a large measurement field of view (FOV) while requiring no guide star, making it particularly adapted to a type of AO called conjugate AO, which provides a large correction FOV in cases when sample-induced aberrations arise from a single dominant plane (e.g., the sample surface). We apply conjugate AO here to widefield (i.e., nonscanning) fluorescence microscopy for the first time and demonstrate dynamic wavefront correction in a closed-loop implementation. PMID:27653793
Ka-Band Autonomous Formation Flying Sensor
NASA Technical Reports Server (NTRS)
Tien, Jeffrey; Purcell, George, Jr.; Srinivasan, Jeffrey; Ciminera, Michael; Srinivasan, Meera; Meehan, Thomas; Young, Lawrence; Aung, MiMi; Amaro, Luis; Chong, Yong;
2004-01-01
A Ka-band integrated range and bearing-angle formation sensor, called the Autonomous Formation Flying (AFF) Sensor, has been developed to enable deep-space formation flying of multiple spacecraft. The AFF Sensor concept is similar to that of the Global Positioning System (GPS), but the AFF Sensor would not use the GPS. The AFF Sensor would reside in radio transceivers and signal-processing subsystems aboard the formation-flying spacecraft. A version of the AFF Sensor has been developed for initial application to the two-spacecraft StarLight optical-interferometry mission, and several design investigations have been performed. From the prototype development, it has been concluded that the AFF Sensor can be expected to measure distances and directions with standard deviations of 2 cm and 1 arc minute, respectively, for spacecraft separations ranging up to about 1 km. It has also been concluded that it is necessary to optimize performance of the overall mission through design trade-offs among the performance of the AFF Sensor, the field of view of the AFF Sensor, the designs of the spacecraft and the scientific instruments that they will carry, the spacecraft maneuvers required for formation flying, and the design of a formation-control system.
Multi-angle lensless digital holography for depth resolved imaging on a chip.
Su, Ting-Wei; Isikman, Serhan O; Bishara, Waheb; Tseng, Derek; Erlinger, Anthony; Ozcan, Aydogan
2010-04-26
A multi-angle lensfree holographic imaging platform that can accurately characterize both the axial and lateral positions of cells located within multi-layered micro-channels is introduced. In this platform, lensfree digital holograms of the micro-objects on the chip are recorded at different illumination angles using partially coherent illumination. These digital holograms start to shift laterally on the sensor plane as the illumination angle of the source is tilted. Since the exact amount of this lateral shift of each object hologram can be calculated with an accuracy that beats the diffraction limit of light, the height of each cell from the substrate can be determined over a large field of view without the use of any lenses. We demonstrate the proof of concept of this multi-angle lensless imaging platform by using light-emitting diodes to characterize various sized microparticles located on a chip with sub-micron axial and lateral localization over approximately 60 mm² field of view. Furthermore, we successfully apply this lensless imaging approach to simultaneously characterize blood samples located at multi-layered micro-channels in terms of the counts, individual thicknesses and the volumes of the cells at each layer. Because this platform does not require any lenses, lasers or other bulky optical/mechanical components, it provides a compact and high-throughput alternative to conventional approaches for cytometry and diagnostics applications involving lab on a chip systems.
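In the simplest view, the height recovery described above rests on the lateral hologram shift growing as height times the tangent of the illumination angle. A minimal least-squares sketch under that assumption (refraction inside the chip is neglected here, unlike in the actual platform):

```python
import math

def height_from_shifts(angles_deg, shifts_um):
    """Least-squares height estimate from the lateral hologram shifts
    measured at several illumination angles, assuming the model
    shift = height * tan(angle).  Minimizing the squared residual
    gives height = sum(shift*tan) / sum(tan^2)."""
    num = den = 0.0
    for angle, shift in zip(angles_deg, shifts_um):
        t = math.tan(math.radians(angle))
        num += shift * t
        den += t * t
    return num / den
```

Averaging over several angles is what pushes the localization below the single-measurement uncertainty.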
Correction for reflected sky radiance in low-altitude coastal hyperspectral images.
Kim, Minsu; Park, Joong Yong; Kopilevich, Yuri; Tuell, Grady; Philpot, William
2013-11-10
Low-altitude coastal hyperspectral imagery is sensitive to reflections of sky radiance at the water surface. Even in the absence of sun glint, and for a calm water surface, the wide range of viewing angles may result in pronounced, low-frequency variations of the reflected sky radiance across the scan line depending on the solar position. The variation in reflected sky radiance can be obscured by strong high-spatial-frequency sun glint and at high altitude by path radiance. However, at low altitudes, the low-spatial-frequency sky radiance effect is frequently significant and is not removed effectively by the typical corrections for sun glint. The reflected sky radiance from the water surface observed by a low-altitude sensor can be modeled in the first approximation as the sum of multiple-scattered Rayleigh path radiance and the single-scattered direct-solar-beam radiance by the aerosol in the lower atmosphere. The path radiance from zenith to the half field of view (FOV) of a typical airborne spectroradiometer has relatively minimal variation, and its reflection into the detector array contributes a flat baseline. Therefore the along-track variation is mostly contributed by the forward single-scattered solar-beam radiance. The scattered solar-beam radiances arrive at the water surface with different incident angles. Thus the reflected radiance received at the detector array corresponds to a certain scattering angle, and its variation is most effectively parameterized using the downward scattering angle (DSA) of the solar beam. Computation of the DSA must account for the roll, pitch, and heading of the platform and the viewing geometry of the sensor along with the solar ephemeris. Once the DSA image is calculated, the near-infrared (NIR) radiance from selected water scan lines is compared, and a relationship between DSA and NIR radiance is derived. We then apply the relationship to the entire DSA image to create an NIR reference image.
Using the NIR reference image and an atmospheric spectral reflectance look-up table, the low spatial frequency variation of the water surface-reflected atmospheric contribution is removed.
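A minimal sketch of the reference-image construction, assuming a linear DSA-to-NIR relationship (the paper derives the relationship empirically from selected water scan lines, and the linear form here is an illustrative assumption):

```python
def fit_dsa_to_nir(dsa, nir):
    """Ordinary least-squares line nir ~ a + b*dsa, fitted from
    training pixels taken on selected water scan lines."""
    n = len(dsa)
    mean_x = sum(dsa) / n
    mean_y = sum(nir) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(dsa, nir)) \
        / sum((x - mean_x) ** 2 for x in dsa)
    a = mean_y - b * mean_x
    return a, b

def nir_reference(dsa_image, a, b):
    """Predicted reflected-sky NIR radiance for every pixel, obtained
    by applying the fitted relationship to the whole DSA image."""
    return [[a + b * d for d in row] for row in dsa_image]
```

The reference image is then scaled spectrally (via the atmospheric reflectance look-up table) and subtracted to remove the low-frequency surface-reflected component.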
A novel method of measuring spatial rotation angle using MEMS tilt sensors
NASA Astrophysics Data System (ADS)
Cao, Jian'an; Zhu, Xin; Wu, Hao; Zhang, Leping
2017-10-01
This paper presents a novel method of measuring spatial rotation angle with a dual-axis micro-electro-mechanical systems tilt sensor. When the sensor is randomly mounted on the surface of the rotating object, there are three unpredictable and unknown mounting position parameters: α, the sensor’s swing angle on the measuring plane; β, the angle between the rotation axis and the horizontal plane; and γ, the angle between the measuring plane and the rotation axis. Thus, the sensor’s spatial rotation model is established to describe the relationship between the measuring axis, rotation axis, and horizontal plane, and the corresponding analytical equations are derived. Furthermore, to eliminate the deviation caused by the uncertain direction of the rotation axis, an extra perpendicularly mounted, single-axis tilt sensor is combined with the dual-axis tilt sensor, forming a three-axis tilt sensor. Then, by measuring the sensors’ three tilts and solving the model’s equations, the object’s spatial rotation angle is obtained. Finally, experimental results show that the developed tilt sensor is capable of measuring spatial rotation angle in the range of ±180° with an accuracy of 0.2° if the angle between the rotation axis and the horizontal plane is less than 75°.
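Only the ideal special case of the model above is easy to sketch: with α = β = γ = 0 and a horizontal rotation axis, the dual-axis tilt reading and the extra perpendicular single-axis reading recover the sine and cosine of the rotation angle, so atan2 unwraps the full ±180° range. This sketch covers that special case only, not the paper's general mounting model:

```python
import math

def rotation_angle(tilt_x_deg, tilt_z_deg):
    """Rotation angle about a horizontal axis from an ideally mounted
    three-axis tilt sensor.  Each tilt reading is the arcsine of one
    gravity component in the sensor frame; the two components along
    the measuring plane behave as sin and cos of the rotation angle,
    and atan2 resolves the angle over +/-180 degrees."""
    g_x = math.sin(math.radians(tilt_x_deg))
    g_z = math.sin(math.radians(tilt_z_deg))
    return math.degrees(math.atan2(g_x, g_z))
```

This also shows why the dual-axis sensor alone is ambiguous: its single in-plane tilt folds at ±90°, and the perpendicular third axis is what removes the ambiguity.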
Results and lessons learned from MODIS polarization sensitivity characterization
NASA Astrophysics Data System (ADS)
Sun, J.; Xiong, X.; Wang, X.; Qiu, S.; Xiong, S.; Waluschka, E.
2006-08-01
In addition to radiometric, spatial, and spectral calibration requirements, MODIS design specifications include polarization sensitivity requirements of less than 2% for all Reflective Solar Bands (RSB) except for the band centered at 412 nm. To the best of our knowledge, MODIS was the first imaging radiometer that went through comprehensive system level (end-to-end) polarization characterization. MODIS polarization sensitivity was measured pre-launch at a number of sensor view angles using a laboratory Polarization Source Assembly (PSA) that consists of a rotatable source, a polarizer (Ahrens prism design), and a collimator. This paper describes MODIS polarization characterization approaches used by the MODIS Characterization Support Team (MCST) at NASA/GSFC and addresses issues and concerns in the measurements. Results (polarization factor and phase angle) using different analyzing methods are discussed. Also included in this paper is a polarization characterization comparison between Terra and Aqua MODIS. Our previous and recent analysis of MODIS RSB polarization sensitivity could provide useful information for future Earth-observing sensor design, development, and characterization.
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; ...
2016-11-28
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.
Evaluation of the electro-optic direction sensor
NASA Technical Reports Server (NTRS)
Johnson, A. R.; Salomon, P. M.
1973-01-01
Evaluation of a no-moving-parts single-axis star tracker called an electro-optic direction sensor (EODS) concept is described and the results are given in detail. The work involved experimental evaluation of a breadboard sensor yielding results which would permit design of a prototype sensor for a specific application. The laboratory work included evaluation of the noise equivalent input angle of the sensor, demonstration of a technique for producing an acquisition signal, constraints on the useful field-of-view, and a qualitative evaluation of the effects of stray light. In addition, the potential of the silicon avalanche-type photodiode for this application was investigated. No benefit in noise figure was found, but the easily adjustable gain of the avalanche device was useful. The use of mechanical tuning of the modulating element to reduce voltage requirements was also explored. The predicted performance of EODS in both photomultiplier and solid state detector configurations was compared to an existing state-of-the-art star tracker.
A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions
NASA Astrophysics Data System (ADS)
Hagerty, S.; Ellis, H., Jr.
2016-09-01
Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. 
The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization noise), and environmental effects (radiation hits with selectable angular distributions and 4-layer atmospheric turbulence model for ground based sensors). We have developed an accurate flash Light Detection and Ranging (LIDAR) model that supports reconstruction of 3-dimensional information on the RSO. PROXOR™ contains many important imaging effects such as intra-frame smear, realized by oversampling the image in time and capturing target motion and jitter during the integration time.
Intensity insensitive one-dimensional optical fiber tilt sensor
NASA Astrophysics Data System (ADS)
Vadakkapattu Canthadai, Badrinath; Sengupta, Dipankar; Pachava, Vengalrao; Kishore, P.
2014-06-01
The paper presents a proximity sensor based on plastic optical fiber used as a tilt sensor. Discrete and continuous responses of the sensor against changes in the tilt angle of the setup are studied. The sensor can detect tilt angles up to 5.7°, and the achieved sensitivity is 97 mV/°.
Simulated and Real Sheet-of-Light 3D Object Scanning Using a-Si:H Thin Film PSD Arrays.
Contreras, Javier; Tornero, Josep; Ferreira, Isabel; Martins, Rodrigo; Gomes, Luis; Fortunato, Elvira
2015-11-30
A MATLAB/SIMULINK software simulation model (structure and component blocks) has been constructed in order to view and analyze the potential of the PSD (Position Sensitive Detector) array concept technology before it is further expanded or developed. This simulation allows changing most of its parameters, such as the number of elements in the PSD array, the direction of vision, the viewing/scanning angle, the object rotation, translation, sample/scan/simulation time, etc. In addition, results show for the first time the possibility of scanning an object in 3D when using an a-Si:H thin film 128 PSD array sensor and hardware/software system. Moreover, this sensor technology is able to perform these scans and render 3D objects at high speeds and high resolutions when using a sheet-of-light laser within a triangulation platform. As shown by the simulation, a substantial enhancement in 3D object profile image quality and realism can be achieved by increasing the number of elements of the PSD array sensor as well as by achieving an optimal position response from the sensor since clearly the definition of the 3D object profile depends on the correct and accurate position response of each detector as well as on the size of the PSD array.
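The sheet-of-light principle behind the platform is ordinary laser triangulation: a light plane intersects the object, and the stripe's position on the detector encodes depth. The pinhole-camera geometry below is an illustrative assumption, not the paper's calibrated setup:

```python
import math

def depth_from_psd(x_detector_mm, focal_mm, baseline_mm, laser_angle_deg):
    """Depth of a laser-stripe point by triangulation.  The light sheet
    leaves a source offset baseline_mm from the camera axis, tilted by
    laser_angle_deg toward it, so the sheet is the plane
    x = baseline - z*tan(angle).  A pinhole camera with focal length
    focal_mm images the point at x_detector_mm = focal*x/z; solving the
    two equations for z gives the expression below."""
    return focal_mm * baseline_mm / (
        x_detector_mm + focal_mm * math.tan(math.radians(laser_angle_deg)))
```

With one such depth per PSD element, a scan line yields a 3D profile, which is why position-response accuracy of each element dominates reconstruction quality.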
NASA Astrophysics Data System (ADS)
Olivares, A.; Górriz, J. M.; Ramírez, J.; Olivares, G.
2011-02-01
Inertial sensors are widely used in human body motion monitoring systems since they permit us to determine the position of the subject's limbs. Limb angle measurement is carried out through the integration of the angular velocity measured by a rate sensor and the decomposition of the components of static gravity acceleration measured by an accelerometer. Different factors derived from the sensors' nature, such as the angle random walk and dynamic bias, lead to erroneous measurements. Dynamic bias effects can be reduced through the use of adaptive filtering based on sensor fusion concepts. Most existing published works use a Kalman filtering sensor fusion approach. Our aim is to perform a comparative study among different adaptive filters. Several least mean squares (LMS), recursive least squares (RLS) and Kalman filtering variations are tested for the purpose of finding the best method leading to a more accurate and robust limb angle measurement. A new angle wander compensation sensor fusion approach based on LMS and RLS filters has been developed.
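As a non-adaptive baseline for the fusion problem described above, a fixed-gain complementary filter blends the drift-prone integrated rate-sensor angle with the noisy but drift-free accelerometer angle. This is a sketch of the baseline only; the LMS, RLS, and Kalman variants compared in the study adapt the blending on-line:

```python
def fuse_angle(gyro_rates, accel_angles, dt, k=0.98):
    """Fixed-gain complementary filter.  Each step integrates the rate
    sensor over dt (good short-term, accumulates dynamic bias) and pulls
    the estimate toward the accelerometer-derived angle (noisy, but
    anchored to static gravity), with blending constant k."""
    angle = accel_angles[0]
    fused = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = k * (angle + rate * dt) + (1.0 - k) * acc
        fused.append(angle)
    return fused
```

The gain k trades gyro smoothness against accelerometer drift correction; the adaptive filters in the study effectively tune this trade-off as the motion dynamics change.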
Retrieval Algorithm for Broadband Albedo at the Top of the Atmosphere
NASA Astrophysics Data System (ADS)
Lee, Sang-Ho; Lee, Kyu-Tae; Kim, Bu-Yo; Zo, ll-Sung; Jung, Hyun-Seok; Rim, Se-Hun
2018-05-01
The objective of this study is to develop an algorithm that retrieves the broadband albedo at the top of the atmosphere (TOA albedo) for radiation budget and climate analysis of Earth's atmosphere using Geostationary Korea Multi-Purpose Satellite/Advanced Meteorological Imager (GK-2A/AMI) data. Because the GK-2A satellite will launch in 2018, we used data from the Japanese weather satellite Himawari-8 and its onboard sensor, the Advanced Himawari Imager (AHI), which has sensor properties and an observation area similar to those of GK-2A. TOA albedo was retrieved based on reflectance and regression coefficients of shortwave channels 1 to 6 of AHI. The regression coefficients were calculated using the results of the radiative transfer model (SBDART) and ridge regression. The SBDART simulations provided the correlation between TOA albedo and the reflectance of each channel for each atmospheric condition (solar zenith angle, viewing zenith angle, relative azimuth angle, surface type, and absence/presence of clouds). The TOA albedo from Himawari-8/AHI was compared to that from the National Aeronautics and Space Administration (NASA) satellite Terra with the onboard Clouds and the Earth's Radiant Energy System (CERES) sensor. The correlation coefficients between the two datasets from the week containing the first day of every month between 1 August 2015 and 1 July 2016 were high, ranging between 0.934 and 0.955, with the root mean square error in the 0.053-0.068 range.
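The ridge-regression step has the closed form (XᵀX + λI)w = Xᵀy. A two-channel sketch solved with Cramer's rule follows; the actual retrieval uses six AHI channels and condition-dependent coefficient tables, so the two-channel setup and the regularization value are illustrative assumptions:

```python
def ridge_2ch(r1, r2, albedo, lam=1e-3):
    """Closed-form ridge regression for two reflectance channels:
    solve (X^T X + lam*I) w = X^T y for the weight pair (w1, w2)
    using Cramer's rule on the 2x2 normal equations."""
    s11 = sum(a * a for a in r1) + lam
    s22 = sum(b * b for b in r2) + lam
    s12 = sum(a * b for a, b in zip(r1, r2))
    t1 = sum(a * y for a, y in zip(r1, albedo))
    t2 = sum(b * y for b, y in zip(r2, albedo))
    det = s11 * s22 - s12 * s12
    return ((t1 * s22 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)
```

The regularization term λ stabilizes the fit when channel reflectances are strongly correlated, which is exactly the situation with neighboring shortwave bands.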
Human-computer interface glove using flexible piezoelectric sensors
NASA Astrophysics Data System (ADS)
Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min
2017-05-01
In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
Hemispherical Field-of-View Above-Water Surface Imager for Submarines
NASA Technical Reports Server (NTRS)
Hemmati, Hamid; Kovalik, Joseph M.; Farr, William H.; Dannecker, John D.
2012-01-01
A document discusses solutions to the problem of submarines having to rise above water to detect airplanes in the general vicinity. Two solutions are provided, in which a sensor is located just under the water surface, or at a depth of a few to tens of meters below the water surface. The first option is a Fish Eye Lens (FEL) digital-camera combination, situated just under the water surface, that will have a near-full-hemisphere (360° azimuth and 90° elevation) field of view for detecting objects on the water surface. This sensor can provide a three-dimensional picture of the airspace both in the marine and in the land environment. The FEL is coupled to a camera and can continuously look at the entire sky above it. The camera can have an Active Pixel Sensor (APS) focal plane array that allows logic circuitry to be built directly in the sensor. The logic circuitry allows data processing to occur on the sensor head without the need for any other external electronics. In the second option, a single-photon sensitive (photon counting) detector array is used at depth, without the need for any optics in front of it, since at this location, optical signals are scattered and arrive at a wide (tens of degrees) range of angles. Beam scattering through clouds and seawater effectively negates optical imaging at depths below a few meters under cloudy or turbulent conditions. Under those conditions, maximum collection efficiency can be achieved by using a non-imaging photon-counting detector behind narrowband filters. In either case, signals from these sensors may be fused and correlated or decorrelated with other sensor data to get an accurate picture of the object(s) above the submarine. These devices can complement traditional submarine periscopes that have a limited field of view in the elevation direction. Also, these techniques circumvent the need for exposing the entire submarine or its periscopes to the outside environment.
Optical Polarization of Light from a Sorghum Canopy Measured Under Both a Clear and an Overcast Sky
NASA Technical Reports Server (NTRS)
Vanderbilt, Vern; Daughtry, Craig; Biehl, Larry; Dahlgren, Robert
2014-01-01
Introduction: We tested the hypothesis that the optical polarization of the light reflected by a sorghum canopy is due to a Fresnel-type redirection, by sorghum leaf surfaces, of light from an unpolarized light source, the sun or overcast sky, toward the measuring sensor. If it can be shown that the source of the polarization of the light scattered by the sorghum canopy is a first-surface, Fresnel-type reflection, then removing this surface-reflected light from measurements of canopy reflectance presumably would allow better insight into the biochemical processes such as photosynthesis and metabolism that occur in the interiors of sorghum canopy leaves. Methods: We constructed a tower 5.9 m tall in the center of a homogeneous sorghum field. We equipped two Barnes MMR radiometers with polarization analyzers on Landsat TM wavelength bands 1, 3, and 7. Positioning the radiometers atop the tower, we collected radiance data in 44 view directions on two days, one day with an overcast sky and the other, clear and sunlit. From the radiance data we calculated the linear polarization of the reflected light for each radiometer wavelength channel and view direction. Results and Discussion: Our experimental results support our hypothesis, showing that the amplitude of the linearly polarized portion of the light reflected by the sorghum canopy varied dramatically with view azimuth direction under a point source, the sun, but varied little with view azimuth direction under the hemispherical source, the overcast sky. Under the clear sky, the angle of polarization depended upon the angle of incidence of the sunlight on the leaf, while under the overcast sky the angle of polarization depended upon the zenith view angle. These results support a polarized radiation transport model of the canopy that is based upon a first-surface, Fresnel reflection from leaves in the sorghum canopy.
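The linear polarization derived from the radiance data can be illustrated with a short sketch. The abstract does not state which analyzer angles were used with the Barnes MMR, so the three-angle (0°, 60°, 120°) scheme below is an assumption for illustration, and the function names are hypothetical.

```python
import math

def stokes_from_polarizer(i0, i60, i120):
    """Recover the linear Stokes parameters I, Q, U from intensities
    measured through a linear polarizer oriented at 0, 60, and 120 degrees,
    using I(theta) = 0.5 * (I + Q*cos(2*theta) + U*sin(2*theta))."""
    I = (2.0 / 3.0) * (i0 + i60 + i120)
    Q = (2.0 / 3.0) * (2.0 * i0 - i60 - i120)
    U = (2.0 / math.sqrt(3.0)) * (i60 - i120)
    return I, Q, U

def linear_polarization(I, Q, U):
    """Degree of linear polarization and angle of polarization (degrees)."""
    dolp = math.hypot(Q, U) / I
    aop = 0.5 * math.degrees(math.atan2(U, Q))
    return dolp, aop
```

A larger degree of linear polarization under the clear sky, varying with view azimuth, is what the authors' Fresnel-reflection hypothesis predicts.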
Advanced Image Processing for NASA Applications
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline
2007-01-01
The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.
Organic plasmon-emitting diodes for detecting refractive index variation.
Chiu, Nan-Fu; Cheng, Chih-Jen; Huang, Teng-Yi
2013-06-28
A photo-excited organic layer on a metal thin film with a corrugated substrate was used to generate surface plasmon grating coupled emissions (SPGCEs). Directional emissions corresponded to the resonant condition of surface plasmon modes on the Au/air interface. In experimental comparisons of the effects of different pitch sizes on the plasmonic band-gap, the obtained SPGCEs were highly directional, with intensity increases as large as 10.38-fold. The FWHM emission spectrum was less than 70 nm. This method is easily applicable to detecting refractive index changes by using SP-coupled fluorophores in which wavelength emissions vary by viewing angle. The measurements and calculations in this study confirmed that the color wavelength of the SPGCE changed from 545.3 nm to 615.4 nm at certain viewing angles as the concentration of contacting glucose increased from 10 to 40 wt%, which corresponded to a refractive index increase from 1.3484 to 1.3968. The organic plasmon-emitting diode exhibits a wide linearity range and an experimentally determined resolution of 1.056 × 10^-3 RIU. The experimentally determined naked-eye detection limit is 0.6 wt%. At a certain viewing angle, a large spectral shift is clearly distinguishable by the naked eye unaided by optoelectronic devices. These experimental results confirm the potential applications of the organic plasmon-emitting diodes in a low-cost, integrated, and disposable refractive-index sensor.
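As a back-of-the-envelope check, the bulk sensitivity implied by the two calibration points above can be computed with a two-point linear fit; this sketch is ours, not the paper's analysis, and the function names are hypothetical.

```python
def bulk_sensitivity_nm_per_riu(lam1, lam2, n1, n2):
    """Sensitivity S = d(lambda)/dn, in nm per refractive index unit,
    from two (wavelength, index) calibration points."""
    return (lam2 - lam1) / (n2 - n1)

def index_resolution(spectral_resolution_nm, sensitivity_nm_per_riu):
    """Smallest resolvable index change for a given spectral resolution."""
    return spectral_resolution_nm / sensitivity_nm_per_riu
```

With the reported shift (545.3 nm to 615.4 nm over 1.3484 to 1.3968), the implied sensitivity is roughly 1.4 × 10^3 nm/RIU, consistent in scale with the quoted 1.056 × 10^-3 RIU resolution for a spectral resolution on the order of 1.5 nm.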
NASA Astrophysics Data System (ADS)
Jouybari, A.; Ardalan, A. A.; Rezvani, M.-H.
2017-09-01
The accurate measurement of platform orientation plays a critical role in a range of applications including marine, aerospace, robotics, navigation, human motion analysis, and machine interaction. We used the Mahony filter, the complementary filter, and the Xsens Kalman filter to estimate the Euler angles of a dynamic platform by integrating gyroscope, accelerometer, and magnetometer measurements. A field test was performed on Kish Island using an IMU (Xsens MTi-G-700) installed on board a buoy, providing about 25 minutes of raw gyroscope, accelerometer, and magnetometer data. These raw data were used to calculate the Euler angles with the Mahony and complementary filters, while the Euler angles reported by the Xsens IMU served as the reference for the estimates. We then compared the Euler angles calculated by the Mahony and complementary filters against the reference Euler angles recorded by the Xsens IMU. The standard deviations of the differences from the reference were about 0.5644, 0.3872, and 0.4990 degrees for the Mahony filter and 0.6349, 0.2621, and 2.3778 degrees for the complementary filter, for roll, pitch, and heading, respectively. The numerical results indicate that the Mahony filter is more precise for roll and heading determination, while the complementary filter is more precise only for pitch; heading determination by the complementary filter has a larger error than the Mahony filter.
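The complementary filter compared above can be sketched in a few lines. This is a generic roll/pitch version under the usual assumption that the accelerometer measures mainly gravity; it is not the authors' implementation, and the blending gain `alpha` is an illustrative value.

```python
import math

def accel_to_roll_pitch(ax, ay, az):
    """Roll and pitch (radians) implied by a quasi-static accelerometer
    reading, where gravity dominates the measured acceleration."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def complementary_filter(gyro, accel, roll, pitch, dt, alpha=0.98):
    """One filter update: integrate the gyro rates for a short-term
    estimate, then blend in the accelerometer-derived angles to cancel
    the slow gyro drift."""
    gx, gy, _ = gyro
    roll_gyro = roll + gx * dt
    pitch_gyro = pitch + gy * dt
    roll_acc, pitch_acc = accel_to_roll_pitch(*accel)
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll, pitch
```

Heading requires the magnetometer as well, which is where the complementary filter's larger heading error in the comparison above typically originates.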
2010-01-01
cephalopod and model both the shallow and deep-water world from the animals’ points of view. ... solar zenith angles including noon and sunset conditions. Five of these deployments were made with the optical sensors oriented horizontally. Ten
Improving Kinematic Accuracy of Soft Wearable Data Gloves by Optimizing Sensor Locations
Kim, Dong Hyun; Lee, Sang Wook; Park, Hyung-Soon
2016-01-01
Bending sensors enable compact, wearable designs when used for measuring hand configurations in data gloves. While existing data gloves can accurately measure angular displacement of the finger and distal thumb joints, accurate measurement of thumb carpometacarpal (CMC) joint movements remains challenging due to crosstalk between the multi-sensor outputs required to measure the degrees of freedom (DOF). To properly measure CMC-joint configurations, sensor locations that minimize sensor crosstalk must be identified. This paper presents a novel approach to identifying optimal sensor locations. Three-dimensional hand surface data from ten subjects was collected in multiple thumb postures with varied CMC-joint flexion and abduction angles. For each posture, scanned CMC-joint contours were used to estimate CMC-joint flexion and abduction angles by varying the positions and orientations of two bending sensors. Optimal sensor locations were estimated by the least squares method, which minimized the difference between the true CMC-joint angles and the joint angle estimates. Finally, the resultant optimal sensor locations were experimentally validated. Placing sensors at the optimal locations, CMC-joint angle measurement accuracies improved (flexion, 2.8° ± 1.9°; abduction, 1.9° ± 1.2°). The proposed method for improving the accuracy of the sensing system can be extended to other types of soft wearable measurement devices. PMID:27240364
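The least-squares step described above can be sketched generically. The paper's actual optimization searches over sensor positions and orientations on the scanned hand surface; the snippet below shows only the inner linear fit from two bending-sensor outputs to the flexion/abduction angle pair, with hypothetical names and synthetic data.

```python
import numpy as np

def fit_linear_angle_map(sensor_outputs, joint_angles):
    """Least-squares linear map (with a bias term) from two bending-sensor
    outputs to [flexion, abduction] joint angles.
    sensor_outputs: (N, 2) array; joint_angles: (N, 2) array."""
    X = np.hstack([sensor_outputs, np.ones((sensor_outputs.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(X, joint_angles, rcond=None)
    return coeffs  # (3, 2): rows are weights for sensor 1, sensor 2, bias

def predict_angles(coeffs, sensor_outputs):
    """Apply a fitted map to new sensor readings."""
    X = np.hstack([sensor_outputs, np.ones((sensor_outputs.shape[0], 1))])
    return X @ coeffs
```

Minimizing the residual of this fit over candidate sensor placements is one way to express the crosstalk-minimization criterion the authors describe.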
NASA Technical Reports Server (NTRS)
Moul, T. M.
1979-01-01
A preliminary wind tunnel investigation was undertaken to determine the flow correction for a vane angle of attack sensor over an angle of attack range from -10 deg to 110 deg. The sensor was mounted ahead of the wing on a 1/5 scale model of a general aviation airplane. It was shown that the flow correction was substantial, reaching about 15 deg at an angle of attack of 90 deg. The flow correction was found to increase as the sensor was moved closer to the wing or closer to the fuselage. The experimentally determined slope of the flow correction versus the measured angle of attack below the stall angle of attack agreed closely with the slope of flight data from a similar full scale airplane.
Reflectance-Based Sensor Validation Over Ice Surfaces
NASA Technical Reports Server (NTRS)
Jaross, Glen; Dodge, James C. (Technical Monitor)
2003-01-01
During this period work was performed in the following areas. These areas are defined in the Work Schedule presented in the original proposal: BRDF development, Data acquisition and processing, THR Table generation and Presentations and Publications. BRDF development involves creating and/or modifying a reflectance model of the Antarctic surface. This model must, for a temporal and spatial average, be representative of the East Antarctic plateau and be expressed in terms of the three standard surface angles: solar zenith angle (SolZA), view zenith angle (SatZA), and relative azimuth angle (RelAZ). We successfully acquired a limited amount of NOAA-9 AVHRR data for radiance validation. The data were obtained from the Laboratory for Terrestrial Physics at Goddard Space Flight Center. We developed our own reading and unpacking software, which we used to select Channel 1 data (visible). We then applied geographic subsetting criteria (same as used for TOMS), and wrote only the relevant data to packed binary files. We proceeded with analysis of these data, which is not yet complete.
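As an illustrative aside (not part of the report above), the three standard surface angles jointly fix the scattering geometry: the angle between the solar and viewing directions follows from spherical trigonometry. The sketch below uses hypothetical names.

```python
import math

def sun_view_angle_deg(solza, satza, relaz):
    """Angle (degrees) between the incident solar beam and the viewing
    direction, from solar zenith angle (SolZA), view zenith angle (SatZA),
    and relative azimuth angle (RelAZ), all in degrees."""
    ts, tv, ra = map(math.radians, (solza, satza, relaz))
    cosx = math.cos(ts) * math.cos(tv) + math.sin(ts) * math.sin(tv) * math.cos(ra)
    return math.degrees(math.acos(max(-1.0, min(1.0, cosx))))
```

A surface BRDF model parameterized by these three angles, as in the report, can equivalently be tabulated against this single scattering angle for azimuthally symmetric surfaces.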
Response Versus Scan-Angle Corrections for MODIS Reflective Solar Bands Using Deep Convective Clouds
NASA Technical Reports Server (NTRS)
Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun
2016-01-01
The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the performance of the SD over time, provides the absolute reference for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the zero input radiance, respectively. The MODIS instrument views the Earth's surface through a two-sided scan mirror, whose reflectance is a function of angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different assigned RVS positions. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two RVS positions. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for selected short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent in the left side Earth-view scans.
Response versus scan-angle corrections for MODIS reflective solar bands using deep convective clouds
NASA Astrophysics Data System (ADS)
Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun
2016-05-01
The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the degradation of the SD over time, provides the baseline for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the background, respectively. The MODIS instrument views the Earth's surface using a two-sided scan mirror, whose reflectance is a function of the angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different AOIs. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two AOIs. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from the pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for select short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent at the beginning of the Earth-view scan.
Experimentally determining the locations of two astigmatic images for an underwater light source
NASA Astrophysics Data System (ADS)
Yang, Pao-Keng; Liu, Jian-You; Ying, Shang-Ping
2015-05-01
Images formed by an underwater object from light rays refracted in the sagittal and tangential planes are located at different positions for an oblique viewing position. The overlapping of these two images from the observer's perspective will thus prevent the image-splitting astigmatism from being directly observable. In this work, we present a heuristic method to experimentally visualize the astigmatism. A point light source is used as an underwater object and the emerging wave front is recorded using a Shack-Hartmann wave-front sensor. The wave front is found to deform from a circular paraboloid to an elliptic paraboloid as the viewing position changes from normal to oblique. Using geometric optics, we derive an analytical expression for the image position as a function of the rotating angle of an arm used to carry the wave-front sensor in our experimental setup. The measured results are seen to be in good agreement with the theoretical predictions.
NASA Technical Reports Server (NTRS)
Wu, Aisheng; Xiong, Xiaoxiong; Chiang, Kwofu
2017-01-01
The visible infrared imaging radiometer suite (VIIRS) is a key sensor carried on the Suomi National Polar-orbiting Partnership (S-NPP) satellite, which was launched in October 2011. It has several on-board calibration components, including a solar diffuser and a solar diffuser stability monitor for the reflective solar bands, a V-groove blackbody for the thermal emissive bands (TEB), and a space view port for background subtraction. These on-board calibrators are located at fixed scan angles. The VIIRS response versus scan angle (RVS) was characterized prelaunch in lab ambient conditions and is currently used to characterize the on-orbit response for all scan angles relative to the calibrator scan angle. Since the RVS is vitally important to the quality of calibrated radiance products, several independent studies were performed to analyze the prelaunch RVS measurement data. A spacecraft-level pitch maneuver was scheduled during the first 3 months of the intensive Cal/Val period. The S-NPP pitch maneuver provided a rare opportunity for VIIRS to make observations of deep space over the entire range of Earth-view scan angles, which can be used to characterize the TEB RVS. This study provides our analysis of the pitch maneuver data and an assessment of the derived TEB RVS by comparison with prelaunch results. In addition, the stability of the RVS after the first 5 years of operation is examined using observed brightness temperatures (BT) over a clear ocean at various angles of incidence (AOI). To reduce the impact of variations in the BT measurements, the daily overpasses collected over the ocean are screened for cloud contamination, normalized to the results obtained at the blackbody AOI, and averaged each year.
Watanabe, Takashi
2013-01-01
The wearable sensor system developed by our group, which measures lower limb angles using a Kalman-filtering-based method, was previously suggested to be useful in evaluating gait function for rehabilitation support. However, the variation of its measurement errors needed to be reduced. In this paper, a variable-Kalman-gain method based on an angle error calculated from acceleration signals is proposed to improve measurement accuracy. The proposed method was tested against a fixed-gain Kalman filter and a variable-Kalman-gain method based on acceleration magnitude used in previous studies. First, in angle measurement during treadmill walking, the proposed method measured lower limb angles with the highest accuracy, improving foot inclination angle measurement significantly and shank and thigh inclination angle measurement slightly. The variable-gain method based on acceleration magnitude was not effective for our Kalman filter system. Then, in angle measurement of a rigid body model, the proposed method showed measurement accuracy similar to or higher than results from other studies that used markers of a camera-based motion measurement system fixed on a rigid plate together with a sensor or fixed on the sensor directly. The proposed method was found to be effective for angle measurement with inertial sensors. PMID:24282442
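The idea of varying the Kalman gain with an estimated angle error can be illustrated with a one-dimensional sketch. This is a generic scalar filter of our own, not the paper's formulation, and `inflate_r` is a hypothetical heuristic: when the acceleration-derived angle is judged unreliable, the measurement variance is inflated, which automatically shrinks the gain.

```python
def kalman_step(x, p, z, q, r):
    """One scalar Kalman predict/update step.
    x: state estimate, p: its variance, z: measurement,
    q: process noise variance, r: measurement noise variance."""
    p = p + q               # predict: variance grows by process noise
    k = p / (p + r)         # Kalman gain
    x = x + k * (z - x)     # update with the innovation
    p = (1.0 - k) * p
    return x, p, k

def inflate_r(r_base, angle_error, scale=10.0):
    """Hypothetical heuristic: grow the measurement variance when the
    acceleration-derived angle error is large."""
    return r_base * (1.0 + scale * abs(angle_error))
```

With a large inferred angle error the gain drops, so the filter leans on the gyro-propagated prediction instead of the noisy acceleration-derived measurement.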
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2012-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.
Impact of Sensor Degradation on the MODIS NDVI Time Series
NASA Technical Reports Server (NTRS)
Wang, Dongdong; Morton, Douglas; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert
2011-01-01
Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, we evaluated the impact of sensor degradation on trend detection using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004/yr decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in NDVI trends over vegetation.
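The NDVI values and per-year trends discussed in the two abstracts above come from the standard band ratio plus an ordinary least-squares slope. A minimal sketch, with hypothetical helper names, is:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def trend_per_year(years, values):
    """Ordinary least-squares slope of values against years
    (e.g. NDVI units per year, as in the reported 0.001-0.004/yr decline)."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den
```

A calibration drift in the blue band propagates into NDVI through the atmospheric correction of the red band, which is why the simulated decline appears even when the vegetation itself is unchanged.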
Wide-angle lens for miniature capsule endoscope
NASA Astrophysics Data System (ADS)
Ou-Yang, Mang; Chen, Yung-Lin; Lee, Hsin-Hung; LU, Shih-chieh; Wu, Hsien-Ming
2006-02-01
In recent years, using the capsule endoscope to inspect pathological changes of the digestive system and intestine has been a great breakthrough in medical engineering. However, some problems need to be overcome: the field of view is not wide enough, and the image quality is not good enough. These drawbacks make it difficult for medical professionals to examine digestive diseases clearly and unambiguously. To solve these problems, this paper presents a novel miniature lens with a wide field of view and good imaging quality. The lens employed in the capsule endoscope consists of one plastic aspherical element and one glass element, packaged in a 9.8 mm (W) * 9.8 mm (L) * 10.7 mm (H) volume. Taking the white LED light source and the 10 μm pixel size of the 256*256 CMOS sensor into consideration, the lens achieves a field of view of 86 degrees and an MTF of 37% at a spatial frequency of 50 lp/mm. The experimental data show that the finished prototype is consistent with the design.
Solar Wind Monitoring with SWIM-SARA Onboard Chandrayaan-1
NASA Astrophysics Data System (ADS)
Bhardwaj, A.; Barabash, S.; Sridharan, R.; Wieser, M.; Dhanya, M. B.; Futaana, Y.; Asamura, K.; Kazama, Y.; McCann, D.; Varier, S.; Vijayakumar, E.; Mohankumar, S. V.; Raghavendra, K. V.; Kurian, T.; Thampi, R. S.; Andersson, H.; Svensson, J.; Karlsson, S.; Fischer, J.; Holmstrom, M.; Wurz, P.; Lundin, R.
The SARA experiment aboard the Indian lunar mission Chandrayaan-1 consists of two instruments: the Chandrayaan-1 Energetic Neutral Analyzer (CENA) and the Solar Wind Monitor (SWIM). CENA will provide measurements of low-energy neutral atoms, in the 0.01-3.3 keV energy range, sputtered from the lunar surface by the impact of solar wind ions. SWIM will monitor the solar wind flux precipitating onto the lunar surface and in the vicinity of the Moon. SWIM is basically an ion-mass analyzer providing the energy-per-charge and number density of solar wind ions in the energy range 0.01-15 keV. It has sufficient mass resolution to resolve H+, He++, He+, O++, O+, and >20 amu species, with an energy resolution of 7% and an angular resolution of 4.5° × 22.5°. The viewing angle of the instrument is 9° × 180°. Mechanically, SWIM consists of a sensor and an electronics board that includes the high-voltage supply and sensor electronics. The sensor part consists of an electrostatic deflector to analyze the arrival angle of the ions, a cylindrical electrostatic analyzer for energy analysis, and a time-of-flight system for particle velocity determination. SWIM is slightly larger than a credit card and has a mass of 500 g.
Field research on the spectral properties of crops and soils, volume 1. [Purdue Agronomy Farm
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Biehl, L. L.; Robinson, B. F.
1980-01-01
The experiment design, data acquisition and preprocessing, data base management, analysis results, and development of instrumentation for the AgRISTARS Supporting Research Project, Field Research task are described. Results of several investigations on the spectral reflectance of corn and soybean canopies as influenced by cultural practices, development stage, and nitrogen nutrition are reported, as are results of analyses of the spectral properties of crop canopies as a function of canopy geometry, row orientation, sensor view angle, and solar illumination angle. The objectives, experiment designs, and data acquired in 1980 for field research experiments are described. The development and performance characteristics of a prototype multiband radiometer, data logger, and aerial tower for field research are discussed.
Haptic seat for fuel economy feedback
Bobbitt, III, John Thomas
2016-08-30
A process for providing driver fuel-economy feedback is disclosed in which vehicle sensors provide haptic feedback on fuel usage. Such sensors may include one or more of speed sensors, global positioning satellite units, vehicle pitch/roll angle sensors, suspension displacement sensors, longitudinal accelerometer sensors, throttle position sensors, steering angle sensors, brake pressure sensors, and lateral accelerometer sensors. Sensors used singly or collectively can provide enhanced feedback on various environmental and operating conditions, so that a more accurate assessment of fuel economy can be provided to the driver.
NASA Astrophysics Data System (ADS)
Wang, W.; Wang, Y.; Hashimoto, H.; Li, S.; Takenaka, H.; Higuchi, A.; Lyapustin, A.; Nemani, R. R.
2017-12-01
The latest generation of geostationary satellite sensors, including the GOES-16/ABI and the Himawari 8/AHI, provide exciting capability to monitor land surface at very high temporal resolutions (5-15 minute intervals) and with spatial and spectral characteristics that mimic the Earth Observing System flagship MODIS. However, geostationary data feature changing sun angles at constant view geometry, which is almost reciprocal to sun-synchronous observations. Such a challenge needs to be carefully addressed before one can exploit the full potential of the new sources of data. Here we take on this challenge with the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm, recently developed for accurate and globally robust applications like the MODIS Collection 6 re-processing. MAIAC first grids the top-of-atmosphere measurements to a fixed grid so that the spectral and physical signatures of each grid cell are stacked ("remembered") over time and used to dramatically improve cloud/shadow/snow detection, which is by far the dominant error source in remote sensing. It also exploits the changing sun-view geometry of the geostationary sensor to characterize surface BRDF with augmented angular resolution for accurate aerosol retrievals and atmospheric correction. The high temporal resolution of the geostationary data indeed makes the BRDF retrieval much simpler and more robust as compared with sun-synchronous sensors such as MODIS. As a prototype test for the geostationary-data processing pipeline on NASA Earth Exchange (GEONEX), we apply MAIAC to process 18 months of data from Himawari 8/AHI over Australia. We generate a suite of test results, including the input TOA reflectance and the output cloud mask, aerosol optical depth (AOD), and the atmospherically-corrected surface reflectance for a variety of geographic locations, terrain, and land cover types.
Comparison with MODIS data indicates a general agreement between the retrieved surface reflectance products. Furthermore, the geostationary results satisfactorily capture the movement of clouds and variations in atmospheric dust/aerosol concentrations, suggesting that high quality land surface and vegetation datasets from the advanced geostationary sensors can help complement and improve the corresponding EOS products.
1975-03-01
Veazey, "An Integrated Error Description of Active and Passive Balloon Tracking Systems," ECOM-5500, June 1973. 18. Doll, Barry, "The Potential Use... Effect of Viewing Angle on the Ground Resolution of Satellite-Borne Sensors," ECOM-5502, July 1973. 20. Miller, Walter B., and Donald R. Veazey... 60. Miller, Walter B., and Donald R. Veazey, "On Increasing Vertical Efficiency of a Passive Balloon Tracking Device by Optimal Choice of
A software package for evaluating the performance of a star sensor operation
NASA Astrophysics Data System (ADS)
Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant
2017-02-01
We have developed a low-cost, off-the-shelf-component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5 that extracts information from the star patterns it records in images. The star sensor implements a centroiding algorithm to find the centroids of stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe a software package to evaluate the performance of these algorithms as a single star sensor operating system. We simulate the ideal case, in which sky background and instrument errors are omitted, and a more realistic case, in which noise and camera parameters are added to the simulated images. We evaluate performance parameters of the algorithms such as attitude accuracy, calculation time, required memory, star catalog size, and sky coverage, and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, and distortion effects, and can therefore be applied to evaluate the performance of such algorithms in any star sensor. For hardware implementation on our StarSense, we are currently porting the code into functions written in C, keeping in view easy implementation on any star sensor electronics hardware.
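The centroiding step mentioned above is typically an intensity-weighted mean over a small pixel window around each detected star. The sketch below is a generic version (in Python rather than the package's MATLAB, with hypothetical names), not the authors' implementation.

```python
def star_centroid(window):
    """Intensity-weighted centroid of a small pixel window around a
    detected star. window is a list of rows of pixel intensities;
    returns (x, y) in pixel coordinates relative to the window origin."""
    total = sum(sum(row) for row in window)
    cy = sum(r * sum(row) for r, row in enumerate(window)) / total
    cx = sum(c * v for row in window for c, v in enumerate(row)) / total
    return cx, cy
```

Sub-pixel centroids like this feed the star pattern identification step, since the angular separations between identified centroids are what get matched against the star catalog.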
NASA Technical Reports Server (NTRS)
Smith, J. A.
1980-01-01
A study was performed to evaluate the geometrical implications of the Multispectral Resource Sampler, a pointable sensor. Several vegetative targets representative of natural and agricultural canopies were considered in two wavelength bands. All combinations of sun and view angles between 5 and 85 degrees zenith for a range of azimuths were simulated to examine the geometrical dependence arising from seasonal as well as latitudinal variation. The effects of three different atmospheres, corresponding to clear, medium, and heavy haze conditions, are included. An extensive model data base was generated to provide investigators with a means for possible further study of atmospheric correction procedures and sensor design questions.
Evaluation of electrolytic tilt sensors for measuring model angle of attack in wind tunnel tests
NASA Technical Reports Server (NTRS)
Wong, Douglas T.
1992-01-01
The results of a laboratory evaluation of electrolytic tilt sensors as potential candidates for measuring model attitude or angle of attack in wind tunnel tests are presented. The performance of eight electrolytic tilt sensors was compared with that of typical servo accelerometers used for angle-of-attack measurements. The areas evaluated included linearity, hysteresis, repeatability, temperature characteristics, roll-on-pitch interaction, sensitivity to lead-wire resistance, step response time, and rectification. Among the sensors evaluated, the Spectron model RG-37 electrolytic tilt sensors have the highest overall accuracy in terms of linearity, hysteresis, repeatability, temperature sensitivity, and roll sensitivity. A comparison of the sensors with the servo accelerometers revealed that the accuracy of the RG-37 sensors was on average about one order of magnitude worse. Even though each tilt sensor costs about one-third as much as a servo accelerometer, the sensors are considered unsuitable for angle-of-attack measurements. However, the potential exists for other applications, such as wind tunnel wall-attitude measurements, where the errors resulting from roll interaction, vibration, and response time are smaller and the sensor temperature can be controlled.
SeaWiFS long-term solar diffuser reflectance and sensor noise analyses.
Eplee, Robert E; Patt, Frederick S; Barnes, Robert A; McClain, Charles R
2007-02-10
The NASA Ocean Biology Processing Group's Calibration and Validation (Cal/Val) team has undertaken an analysis of the mission-long Sea-Viewing Wide Field-of-View Sensor (SeaWiFS) solar calibration time series to assess the long-term degradation of the solar diffuser reflectance over 9 years on orbit. The SeaWiFS diffuser is an aluminum plate coated with YB71 paint. The bidirectional reflectance distribution function of the diffuser was not fully characterized before launch, so the Cal/Val team has implemented a regression of the solar incidence angles and the drift in the node of the satellite's orbit against the diffuser time series to correct for solar incidence angle effects. An exponential function with a time constant of 200 days yields the best fit to the diffuser time series. The decrease in diffuser reflectance over the mission is wavelength dependent, ranging from 9% in the blue (412 nm) to 5% in the red and near infrared (670-865 nm). The Cal/Val team has developed a methodology for computing the signal-to-noise ratio (SNR) for SeaWiFS on orbit from the diffuser time series corrected for both the varying solar incidence angles and the diffuser reflectance degradation. A sensor noise model is used to compare on-orbit SNRs computed for radiances reflected from the diffuser with prelaunch SNRs measured at typical radiances specified for the instrument. To within the uncertainties in the measurements, the SNRs for SeaWiFS have not changed over the mission. 
The on-orbit performance of the SeaWiFS solar diffuser should offer insight into the long-term on-orbit performance of solar diffusers on other instruments, such as the Moderate-Resolution Imaging Spectrometer [currently flying on the Earth Observing System (EOS) Terra and Aqua satellites], the Visible and Infrared Radiometer Suite [scheduled to fly on the NASA National Polar-orbiting Operational Environmental Satellite System (NPOESS) and NPOESS Preparatory Project (NPP) satellites] and the Advanced Baseline Imager [scheduled to fly on the National Oceanic and Atmospheric Administration Geostationary Environmental Operational Satellite Series R (GOES-R) satellites].
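The exponential degradation fit described above can be illustrated on synthetic data. A minimal sketch, assuming the reported fixed 200-day time constant so the fit is linear in the remaining parameters; all numbers below are invented, not the actual SeaWiFS time series.

```python
import numpy as np

TAU = 200.0  # days: the time constant reported for the diffuser fit

def diffuser_model(t, r_inf, delta):
    """Diffuser reflectance vs. time: asymptote r_inf plus a decaying term."""
    return r_inf + delta * np.exp(-t / TAU)

# Invented mission-long series with a 9% total drop, as reported for 412 nm
t = np.linspace(0.0, 9 * 365.0, 200)
noisy = diffuser_model(t, 0.91, 0.09) + np.random.default_rng(0).normal(0, 1e-4, t.size)

# With tau fixed, the fit is linear in the basis {1, exp(-t/tau)}
A = np.column_stack([np.ones_like(t), np.exp(-t / TAU)])
r_inf_hat, delta_hat = np.linalg.lstsq(A, noisy, rcond=None)[0]
```

Fixing the time constant turns a nonlinear exponential fit into ordinary least squares, which is one common way such diffuser trends are modeled.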
NASA Astrophysics Data System (ADS)
Pratt, P.
2012-12-01
The ocean color bands on VIIRS span the visible spectrum and include two NIR bands. There are sixteen detectors per band and two half-angle mirror (HAM) sides, giving a total of thirty-two independent systems. For each scan, thirty-two hundred pixels are collected, and each has a fixed specific optical path and a dynamic position relative to the Earth geoid. For a given calibration target where scene variation is minimized, sensor characteristics can be observed. This gives insight into the performance and calibration of the instrument from a sensor-centric perspective. Calibration of the blue bands is especially challenging since there are few blue targets on land. An ocean region called the South Pacific Gyre (SPG) was chosen as the calibration target for this investigation for its known stability and large area. Thousands of pixels from every granule that views the SPG are collected daily through an automated system and tabulated along with the detector, HAM side, and scan position. These are then collated and organized into a sensor-centric set of tables. The data are then analyzed by slicing on each variable and plotting in a number of ways over time. Trends in the data show that the VIIRS sensor is largely behaving as expected according to heritage data, and also reveal weaknesses where additional characterization of the sensor is possible. This work by the Northrop Grumman NPP Cal/Val team supports the VIIRS on-orbit calibration and validation teams for the sensor and ocean color, as well as providing scientists interested in performing ground truth with results that show which detectors and scan angles are the most reliable over time. This novel approach offers a comprehensive sensor-centric on-orbit characterization of the VIIRS instrument on the NASA Suomi NPP mission.
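The sensor-centric tabulation described above (grouping pixel radiances by detector, HAM side, and scan position) can be sketched generically. The `sensor_centric_table` helper and its binning scheme are illustrative, not the actual Northrop Grumman pipeline.

```python
from collections import defaultdict
from statistics import mean

def sensor_centric_table(samples, scan_bin=100):
    """Group calibration-target radiances by (detector, HAM side, scan-position bin).

    samples: iterable of (detector, ham_side, scan_position, radiance).
    With 16 detectors and 2 HAM sides, each band yields 32 independent
    detector/mirror combinations to trend separately over time.
    """
    table = defaultdict(list)
    for det, ham, scan, rad in samples:
        table[(det, ham, scan // scan_bin)].append(rad)
    return {key: mean(vals) for key, vals in table.items()}

stats = sensor_centric_table([(0, 0, 50, 1.0), (0, 0, 60, 3.0), (1, 1, 250, 2.0)])
# stats == {(0, 0, 0): 2.0, (1, 1, 2): 2.0}
```

Slicing such a table by one key at a time (detector only, HAM side only, scan bin only) is what lets per-detector or per-mirror-side trends stand out against the stable scene.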
NASA Astrophysics Data System (ADS)
Wang, Ling; Hu, Xiuqing; Chen, Lin
2017-09-01
Calibration is a critical step to ensure data quality and to meet the requirements of quantitative remote sensing in a broad range of scientific applications. One of the least expensive and increasingly popular methods of on-orbit calibration is the use of pseudo-invariant calibration sites (PICS). A spatially homogeneous and temporally stable area of 34 km2 around the center of the Kunlun Mountain (KLM) glacier on the Tibetan Plateau (TP) was identified by our previous study; the spatial and temporal coefficients of variation (CV) of this region were better than 4% for the reflective solar bands. In this study, the BRDF impacts of the KLM glacier on MODIS-observed TOA reflectance in band 1 (659 nm) are examined. The BRDF impact with respect to view zenith angle is studied using observations at a fixed solar zenith angle, and the effect with respect to solar zenith angle is studied using observations collected at the same view angle. Two widely used BRDF models are then applied to our test data to simulate the variations of TOA reflectance due to changes in viewing geometry. The first is the Ross-Li model, which has been used to produce the MODIS global BRDF/albedo data product. The second is a snow surface BRDF model, which has been used to characterize the bidirectional reflectance of Antarctic snow. Finally, the accuracy and effectiveness of these two BRDF models are tested by comparing the model-simulated TOA reflectance with the observed one. The results show that variations of the reflectances at a fixed solar zenith angle are close to a Lambertian pattern, while those at a fixed sensor zenith angle are strongly anisotropic. A decrease in solar zenith angle from 50° to 20° causes an increase in reflectance of approximately 50%. The snow surface BRDF model performs much better than the Ross-Li model in reproducing the bidirectional reflectance of the KLM glacier: the RMSE of the snow surface BRDF model is 3.60%, only half the RMSE obtained with the Ross-Li model.
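For reference, the volumetric (RossThick) kernel at the heart of the Ross-Li model can be written compactly. This is a sketch of the kernel alone; the full model also requires a geometric kernel and coefficients fitted to multi-angle observations.

```python
import numpy as np

def ross_thick_kernel(sza, vza, raa):
    """RossThick volumetric scattering kernel of the Ross-Li BRDF model.

    Angles in radians: sza = solar zenith, vza = view zenith, raa = relative
    azimuth. The full model is R = f_iso + f_vol * K_vol + f_geo * K_geo,
    with the f coefficients fitted to multi-angle observations.
    """
    cos_xi = (np.cos(sza) * np.cos(vza)
              + np.sin(sza) * np.sin(vza) * np.cos(raa))
    xi = np.arccos(np.clip(cos_xi, -1.0, 1.0))  # scattering phase angle
    return (((np.pi / 2 - xi) * np.cos(xi) + np.sin(xi))
            / (np.cos(sza) + np.cos(vza)) - np.pi / 4)

# The kernel vanishes for an overhead sun viewed at nadir, and is reciprocal
# in the solar and view zenith angles.
```

The kernel's dependence on the phase angle between sun and sensor is exactly the kind of anisotropy the study tests against the observed TOA reflectances.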
Surface plasmon resonance sensor using vari-focal liquid lens under angular interrogation
NASA Astrophysics Data System (ADS)
Lee, Muyoung; Bang, Yousung; Lee, Jooho; Jang, Wonjae; Won, Yong Hyub
2017-02-01
In this paper, a surface plasmon resonance sensor for the detection of refractive index variation is presented. A novel waveguide-type surface plasmon resonance sensing configuration with a focal-length-variable liquid lens is introduced. The sensing method is based on the waveguide type with incident angle variation. The incident angle is varied by an electrowetting liquid lens whose focal length can be actively changed by applying a voltage. The optical system adapted to the electrowetting lens can continuously change the incident angle of light from 73 to 78 degrees in a compact size. The surface plasmon waves are excited at the metal-dielectric interface. The sensing surfaces are prepared by coating gold above a high-refractive-index glass substrate. The incident light, a 532 nm monochromatic source, passes through the noble-metal-coated substrate, and the intensity is detected as the incident angle varies. The analysis focuses on the angular characteristics of the surface plasmon sensor under angular interrogation, distinguishing the contributions of light at various incident angles. The resonance angle is determined by the refractive index of the sensing material with high sensitivity. The results suggest that the performance of a surface plasmon resonance sensor can be improved by varying the incident angle in real time. This study provides a different approach to angular-interrogation surface plasmon resonance sensing, and the sensor can be miniaturized for a portable device.
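The way a resonance angle encodes the sample refractive index can be illustrated with the standard surface-plasmon wavevector matching condition. This is a sketch with illustrative permittivity values, not the calibration of the sensor described above.

```python
import numpy as np

def spr_resonance_angle(n_prism, eps_metal, n_sample):
    """Approximate SPR resonance angle (degrees), Kretschmann-style coupling.

    Resonance occurs where the in-plane wavevector of the incident light
    matches the surface plasmon wavevector:
        n_prism * sin(theta) = sqrt(eps_m * n_d**2 / (eps_m + n_d**2))
    eps_metal is the real part of the metal permittivity (negative for gold
    in the visible). All numbers used below are illustrative only.
    """
    eps_d = n_sample ** 2
    sin_theta = np.sqrt(eps_metal * eps_d / (eps_metal + eps_d)) / n_prism
    return float(np.degrees(np.arcsin(sin_theta)))

# A small rise in the sample index pushes the resonance to a larger angle,
# which is what an angular-interrogation sensor reads out.
low = spr_resonance_angle(1.8, -5.0, 1.33)
high = spr_resonance_angle(1.8, -5.0, 1.35)
```

The monotonic shift of the resonance angle with sample index is what makes sweeping the incident angle (here, via the liquid lens) a direct refractive-index readout.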
Organic Plasmon-Emitting Diodes for Detecting Refractive Index Variation
Chiu, Nan-Fu; Cheng, Chih-Jen; Huang, Teng-Yi
2013-01-01
A photo-excited organic layer on a metal thin film with a corrugated substrate was used to generate surface plasmon grating coupled emissions (SPGCEs). Directional emissions corresponded to the resonant condition of surface plasmon modes on the Au/air interface. In experimental comparisons of the effects of different pitch sizes on the plasmonic band-gap, the obtained SPGCEs were highly directional, with intensity increases as large as 10.38-fold. The FWHM of the emission spectrum was less than 70 nm. This method is easily applicable to detecting refractive index changes by using SP-coupled fluorophores whose emission wavelengths vary with viewing angle. The measurements and calculations in this study confirmed that the color wavelength of the SPGCE changed from 545.3 nm to 615.4 nm at certain viewing angles as the concentration of contacting glucose increased from 10 to 40 wt%, corresponding to a refractive index increase from 1.3484 to 1.3968. The organic plasmon-emitting diode exhibits a wide linearity range and an experimental resolution of 1.056 × 10⁻³ RIU. The experimental naked-eye detection limit is 0.6 wt%. At a certain viewing angle, a large spectral shift is clearly distinguishable by the naked eye unaided by optoelectronic devices. These experimental results confirm the potential of organic plasmon-emitting diodes as low-cost, integrated, and disposable refractive-index sensors. PMID:23812346
NASA Astrophysics Data System (ADS)
Jiang, Shanchao; Wang, Jing; Sui, Qingmei
2015-11-01
A novel tilt sensor capable of distinguishing the circumferential inclination direction is demonstrated by incorporating two strain-sensitive fiber Bragg gratings (FBGs) on two orthogonal triangular cantilever beams, with a third FBG serving as a temperature compensation element. Based on spatial vectors and space geometry, a theoretical calculation model of the proposed FBG tilt sensor is established that yields both the azimuth and the tilt angle of the inclined direction. To obtain its measuring characteristics, a calibration experiment on a prototype of the proposed FBG tilt sensor was carried out. Analysis of the temperature sensitivity experiment data shows that the proposed FBG tilt sensor exhibits excellent temperature compensation characteristics. In the 2-D tilt angle experiment, the tilt measurement sensitivities of the two strain-sensitive FBGs are 140.85°/nm and 101.01°/nm over a wide range of 60°. Furthermore, the azimuth and tilt angle of the inclined direction can be obtained by the proposed FBG tilt sensor, as verified in the circumferential angle experiment. The experimental data show that the relative errors of azimuth are 0.55% (positive direction) and 1.14% (negative direction), and the relative errors of tilt angle are all less than 3%. The results confirm that the proposed tilt sensor based on FBGs can measure azimuth and tilt angle with a wide measuring range and high accuracy.
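Assuming the two orthogonal cantilevers sense orthogonal tilt components with the sensitivities quoted above, recovering azimuth and tilt angle from the two wavelength shifts reduces to a simple vector reconstruction. This is a simplified reading of the paper's spatial-vector model, not its exact formulation.

```python
import math

# Tilt-vs-wavelength-shift sensitivities quoted in the abstract (deg/nm)
K1, K2 = 140.85, 101.01

def tilt_from_fbg(dlam1, dlam2):
    """Recover (azimuth, tilt) in degrees from the two FBG wavelength shifts (nm)."""
    tx = K1 * dlam1  # tilt component along cantilever 1
    ty = K2 * dlam2  # tilt component along the orthogonal cantilever 2
    azimuth = math.degrees(math.atan2(ty, tx)) % 360.0
    tilt = math.hypot(tx, ty)
    return azimuth, tilt

az, tilt = tilt_from_fbg(0.1, 0.0)  # shift only on FBG 1: azimuth 0, tilt 14.085 deg
```

The third, strain-isolated FBG would be subtracted from both shifts first to remove the common temperature term, which is the compensation scheme the abstract describes.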
Exploring the mid-infrared region for urban remote sensing: seasonal and view angle effects
NASA Astrophysics Data System (ADS)
Krehbiel, C. P.; Kovalskyy, V.; Henebry, G. M.
2013-12-01
Spanning 3-5 microns, the mid-infrared (MIR) region is the mixing zone between reflected sunlight and emitted earthlight in roughly equal proportions. While the MIR has been utilized in atmospheric remote sensing, its potential in terrestrial remote sensing, particularly urban remote sensing, has yet to be realized. One major advantage of the MIR is its ability to penetrate most anthropogenic haze and smog. Green vegetation appears MIR-dark, urban building materials appear MIR-grey, and bare soil and dried vegetation appear MIR-bright. Thus, there is an intrinsic seasonality in MIR radiance dynamics due both to surface type differences and to seasonal change in insolation. These factors merit exploration into the potential applications of the MIR for monitoring urban change. We investigated MIR radiance dynamics in relation to (1) the spectral properties of land cover types, (2) time of year, and (3) sensor view zenith angle (VZA). We used Aqua MODIS daily swaths for band 23 (~4.05 μm) at 1 km spatial resolution from 2009-2010 and the NLCD Percent Impervious Surface Area (%ISA) 30 m product from 2001 and 2006. We found time of year, sensor VZA, and %ISA to be three principal factors influencing MIR radiance dynamics. We focused on analyzing the relationship between MIR radiance and %ISA over eight major cities in the Great Plains of the USA. This region is characterized by four distinct seasons, relatively flat terrain, and isolated urban centers situated within a vegetated landscape. We used west-east transects beginning in the agricultural areas outside of each city, passing through the urban core, and extending back out into the agricultural periphery to observe the spatial pattern of MIR radiance and how it changes seasonally. Sensor VZA influences radiance dynamics by affecting the proportion of surface elements detected, which is especially pertinent at the coarse spatial resolution (~1 km) of MODIS.
For example, smaller VZAs (<30°) capture more spatial detail than larger VZAs (>30°). Larger VZAs detect a larger proportion of crop canopies and less soil surface, and thus generally exhibit lower radiance and less variation than smaller VZAs. Future work should focus on how best to account for (1) land surface phenology, (2) the proportion of impervious surface, and (3) sensor viewing geometry to generate high signal-to-noise ratio composites and advance change detection and urban growth monitoring.
Comparison of S-NPP VIIRS land surface temperature with SEVIRI
NASA Astrophysics Data System (ADS)
Ermida, Sofia L.; Trigo, Isabel F.; Liu, Yuling; Yu, Yunyue
2017-04-01
Land surface temperature (LST) is one of the key parameters in the physics of land surface processes. LST can be measured globally from space by infrared radiometers, with a wide range of spatial and temporal resolutions depending on the sensor design and orbit. The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument is the primary sensor onboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite, launched in October 2011. VIIRS was designed to improve upon the capabilities of the operational AVHRR and provide observation continuity with MODIS. A split-window approach has been applied to the VIIRS moderate resolution channels M15 and M16, centered at 10.76 µm and 12.01 µm, respectively. VIIRS has a swath of 3000 km and a spatial resolution of 745 m (nadir) up to about 1600 m (limb view), leading to a relatively high revisit frequency. LST is retrieved for a wide range of viewing angles along the VIIRS path, allowing the study of the variability of LST with viewing geometry for various land cover types. Here we present a comparison of VIIRS LST data with data provided by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard EUMETSAT's Meteosat Second Generation (MSG). SEVIRI-based LST is available every 15 minutes, but at coarser spatial resolution (3 km at nadir) than VIIRS LST. The analysis is performed over 6 areas of the SEVIRI disk characterized by different surface conditions. VIIRS generally has slightly warmer night-time LST than SEVIRI, with differences smaller than 2 K. Larger differences are found during daytime, with VIIRS presenting overall lower LST values by up to 5 K. These differences are also analysed taking into account the surface type, view zenith angle (VZA), and topography. As seen in previous comparison studies, high VZA and elevation values are associated with higher discrepancies between the LST products.
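The split-window idea can be shown in a generic form: the LST estimate combines the two brightness temperatures so that the differential atmospheric absorption between the channels cancels to first order. The coefficients below are placeholders, not the operational VIIRS values.

```python
def split_window_lst(t11, t12, a=2.0, b=1.0):
    """Generic split-window land surface temperature estimate (kelvin).

    t11 and t12 are brightness temperatures in the ~11 um and ~12 um
    channels (the roles played by VIIRS M15 and M16). The brightness-
    temperature difference corrects for differential atmospheric
    absorption; a and b here are placeholder coefficients, whereas
    operational algorithms use values that depend on emissivity, water
    vapor, and view zenith angle.
    """
    return t11 + a * (t11 - t12) + b

lst = split_window_lst(300.0, 298.0)  # 300 + 2*(300-298) + 1 = 305.0 K
```

The view-angle dependence of the real coefficients is one reason LST products from a wide-swath polar orbiter and a geostationary imager can differ at large VZA, as the comparison above finds.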
Linear wide angle sun sensor for spinning satellites
NASA Astrophysics Data System (ADS)
Philip, M. P.; Kalakrishnan, B.; Jain, Y. K.
1983-08-01
A concept is developed which overcomes the defects of nonlinearity of response and limitation in range exhibited by the V-slit, N-slit, and crossed-slit sun sensors normally used for sun elevation angle measurements on spinning spacecraft. Two versions of sensors based on this concept, which give a linear output and have a range of nearly ±90° of elevation angle, are examined. Results are presented for the application of the twin-slit version of the sun sensor in the three Indian satellites Rohini, Apple, and Bhaskara II, where it was successfully used for spin rate control and spin axis orientation control corrections as well as for sun elevation angle and spin period measurements.
Partially-overlapped viewing zone based integral imaging system with super wide viewing angle.
Xiong, Zhao-Long; Wang, Qiong-Hua; Li, Shu-Li; Deng, Huan; Ji, Chao-Chao
2014-09-22
In this paper, we analyze the relationship between the viewer and the viewing zones of an integral imaging (II) system and present a partially-overlapped viewing zone (POVZ) based integral imaging system with a super wide viewing angle. In the proposed system, the viewing angle can be wider than that of the conventional tracking-based II system. In addition, the POVZ eliminates the flipping and time delay of the 3D scene as well. The proposed II system has a super wide viewing angle of 120° without the flipping effect, about twice as wide as the conventional one.
Simultaneous Soft Sensing of Tissue Contact Angle and Force for Millimeter-scale Medical Robots
Arabagi, Veaceslav; Gosline, Andrew; Wood, Robert J.; Dupont, Pierre E.
2013-01-01
A novel robotic sensor is proposed to measure both the contact angle and the force acting between the tip of a surgical robot and soft tissue. The sensor is manufactured using a planar lithography process that generates microchannels that are subsequently filled with a conductive liquid. The planar geometry is then molded onto a hemispherical plastic scaffolding in a geometric configuration enabling estimation of the contact angle (angle between robot tip tangent and tissue surface normal) by the rotation of the sensor around its roll axis. Contact force can also be estimated by monitoring the changes in resistance in each microchannel. Bench top experimental results indicate that, on average, the sensor can estimate the angle of contact to within ±2° and the contact force to within ±5.3 g. PMID:24241496
NASA Astrophysics Data System (ADS)
Claverie, M.; Vermote, E.; Franch, B.; Huc, M.; Hagolle, O.; Masek, J.
2013-12-01
Maintaining a consistent dataset of surface reflectance (SR) data derived from the large panel of in-orbit sensors is an important challenge for ensuring long-term analysis of earth observation data. Continuous validation of such SR products through comparison with a reference dataset is thus essential. Validating with in situ or airborne SR data is not easy, since those measurements rarely match the spectral, spatial, and directional characteristics of the satellite measurement. Inter-comparison between satellite sensor data appears to be a valuable tool for maintaining long-term consistency of the data. However, satellite data are acquired at various times of the day (i.e., under varying atmospheric content) and within a relatively large range of geometries (view and sun angles). Also, even when the band-to-band spectral characteristics of optical sensors are close, they rarely have identical spectral responses. As a result, direct comparisons that do not account for these differences are of limited value. In this study, we propose a new systematic method to assess land optical SR data from high- to medium-resolution sensors. We used MODIS SR products (MO/YD09CMG), which benefit from a long-term calibration/validation process, to assess SR from three sensors: Formosat-2 (280 scenes, 24x24 km, 5 sites), SPOT-4 (62 scenes, 120x60 km, 1 site), and Landsat-5/7 (104 scenes, 180x180 km, 50 sites). The main issue concerns the difference in acquisition geometry between MODIS and the compared sensor data. We used the VJB model (Vermote et al. 2009, TGRS) to correct MODIS SR for BRDF effects and to simulate SR at the corresponding geometry (view and sun angles) of each pixel of the compared sensor data. The comparison is done at the CMG spatial resolution (0.05°), which ensures a constant field of view and negligible geometrical errors.
Figure 1 displays a summary of the NIR results through APU graphs, where the metrics A, P, and U stand for Accuracy, Precision, and Uncertainty (metrics explained in Claverie et al., 2013, RSE) and allow comparison with standard specifications (S in magenta). The results show relatively good uncertainty, taking into account that atmospheric correction differs between MODIS and the compared sensor data (LEDAPS for Landsat-5/7 and MACC for Formosat-2 and SPOT-4). Biases (the metric A) are in many cases related to spectral differences, which are analyzed using PROSAIL radiative transfer modeling. Finally, some Landsat-8 OLI SR images (computed using a preliminary adaptation of LEDAPS for Landsat-8) are assessed using this method. Figure 1: APU graph of SR comparison between MODIS NIR (from Aqua) and Landsat-5/7, Formosat-2, and SPOT-4. A, P, and U metrics are given per bin (red, green, and blue lines, respectively) and for the whole range (upper left text values). The magenta line shows the MODIS SR specification.
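Under a common reading of the APU metrics (bias, spread about the bias, and total error of the residuals), they can be computed as follows; the exact definitions used by Claverie et al. may differ in detail.

```python
import numpy as np

def apu(reference, estimate):
    """Accuracy (mean bias), Precision (spread about the bias), Uncertainty (RMSE)."""
    r = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    accuracy = r.mean()
    precision = r.std()                   # standard deviation of residuals
    uncertainty = np.sqrt((r ** 2).mean())
    return accuracy, precision, uncertainty

# A constant +1 offset is pure bias: A = 1, P = 0, U = 1
a_m, p_m, u_m = apu([1.0, 2.0, 3.0], [2.0, 3.0, 4.0])
```

In APU graphs these three quantities are typically plotted per reflectance bin against a specification curve, which is the layout the figure caption above describes.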
NASA Technical Reports Server (NTRS)
Moul, T. M.
1983-01-01
The nature of corrections for flow direction measurements obtained with a wing-tip mounted sensor was investigated. Corrections for the angle of attack and sideslip, measured by sensors mounted in front of each wing tip of a general aviation airplane, were determined. These flow corrections were obtained from both wind-tunnel and flight tests over a large angle-of-attack range. Both the angle-of-attack and angle-of-sideslip flow corrections were found to be substantial. The corrections were a function of the angle of attack and angle of sideslip. The effects of wing configuration changes, small changes in Reynolds number, and spinning rotation on the angle-of-attack flow correction were found to be small. The angle-of-attack flow correction determined from the static wind-tunnel tests agreed reasonably well with the correction determined from flight tests.
Lü, Chun-guang; Wang, Wei-he; Yang, Wen-bo; Tian, Qing-iju; Lu, Shan; Chen, Yun
2015-11-01
A new hyperspectral sensor for measuring total ozone is being considered for a future geostationary platform, because local tropospheric ozone pollution and the diurnal variation of ozone are receiving more and more attention. Sensors carried on geostationary satellites frequently obtain images at large observation angles, which places higher demands on total ozone retrieval at these observation geometries. The TOMS V8 algorithm is mature and widely used for low-orbit ozone-detecting sensors, but it still lacks accuracy at large observation geometries; how to improve the accuracy of total ozone retrieval therefore remains an urgent problem. Using the MODTRAN moderate-resolution atmospheric transmission code, synthetic UV backscatter radiances in the spectral region from 305 to 360 nm were simulated for clear sky, multiple angles (12 solar zenith angles and view zenith angles), and 26 standard profiles, and the correlations and trends between atmospheric total ozone and backscattered UV radiation were analyzed from the resulting data. Based on these data, a new modified initial total ozone estimation model for the TOMS V8 algorithm is constructed to improve the initial total ozone estimation accuracy at large observation geometries. The analysis of total ozone and simulated UV backscatter radiance shows that the radiance at 317.5 nm (R₃₁₇.₅) decreases as total ozone rises. At small solar zenith angle (SZA) and fixed total ozone, R₃₁₇.₅ decreases with increasing view zenith angle (VZA), but increases with VZA at large SZA. A comparison of two fitting models shows that, except when both SZA and VZA are large (> 80°), the exponential and logarithmic fitting models both show high fitting precision (R² > 0.90), with the precision of both decreasing as SZA and VZA rise.
In most cases, the precision of the logarithmic fitting model is about 0.9% higher than that of the exponential fitting model. With increasing VZA or SZA, the fitting precision gradually decreases, and the decrease is larger at larger VZA or SZA. In addition, the precision of the fitting models exhibits a plateau in the small-SZA range. The modified initial total ozone estimation model (ln(I) vs. Ω) is established based on the logarithmic fitting model and compared with the traditional estimation model (I vs. ln(Ω)). The RMSEs of both ln(I) vs. Ω and I vs. ln(Ω) trend downward as total ozone rises. In the low total ozone region (175-275 DU), the RMSE is clearly higher than in the high region (425-525 DU); moreover, an RMSE peak and trough exist at 225 and 475 DU, respectively. With increasing VZA and SZA, the RMSEs of both initial estimation models rise overall, and the increase with growing SZA and VZA is more pronounced for ln(I) vs. Ω. The modified model produces better estimates than the traditional model over the whole total ozone range (RMSE 0.087%-0.537% lower than the traditional model), especially in the low total ozone region and at large observation geometries. The traditional estimation model relies on the precision of the exponential fitting model, and the modified estimation model relies on the precision of the logarithmic fitting model. The improvement in estimation accuracy from the modified initial total ozone estimation model expands the application range of the TOMS V8 algorithm. For a sensor carried on a geostationary platform, the modified estimation model can undoubtedly help improve inversion accuracy over a wide spatial and temporal range. This modified model could support and inform future updates of the TOMS algorithm.
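The modified initial-estimate form can be illustrated on synthetic data: fit ln(I) = a + b·Ω by ordinary least squares and invert it to get an initial ozone estimate from an observed radiance. The radiance-ozone relation below is invented for illustration, not taken from the MODTRAN simulations.

```python
import numpy as np

def fit_ln_i_vs_omega(omega, radiance):
    """Fit the modified model ln(I) = a + b * Omega by least squares."""
    b, a = np.polyfit(omega, np.log(radiance), 1)
    return a, b

def estimate_ozone(a, b, radiance):
    """Invert the fitted model for an initial total ozone estimate (DU)."""
    return (np.log(radiance) - a) / b

# Invented radiance-ozone relation, exactly log-linear for illustration
omega = np.linspace(175.0, 525.0, 50)
radiance = np.exp(2.0 - 0.004 * omega)
a_fit, b_fit = fit_ln_i_vs_omega(omega, radiance)
est = estimate_ozone(a_fit, b_fit, np.exp(2.0 - 0.004 * 300.0))  # ~300 DU
```

The traditional form (I vs. ln Ω) is fitted and inverted the same way with the roles of the logarithm swapped, which is what makes the RMSE comparison above straightforward.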
Tokuda, T; Yamada, H; Sasagawa, K; Ohta, J
2009-10-01
This paper proposes and demonstrates a polarization-analyzing CMOS sensor based on image sensor architecture. The sensor was designed targeting applications for chiral analysis in a microchemistry system. The sensor features a monolithically embedded polarizer. Embedded polarizers with different angles were implemented to realize a real-time absolute measurement of the incident polarization angle. Although the pixel-level performance was confirmed to be limited, estimation schemes based on the variation of the polarizer angle provided a promising performance for real-time polarization measurements. An estimation scheme using 180 pixels in 1° steps provided an estimation accuracy of 0.04°. Polarimetric measurements of chiral solutions were also successfully performed to demonstrate the applicability of the sensor to optical chiral analysis.
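A hedged sketch of the estimation idea described above: pixels behind polarizers at 1° steps sample Malus's law, I(θ) = I₀·cos²(θ − φ), and the incident polarization angle φ can be recovered from all 180 samples by projecting onto the second harmonic. The harmonic-fit approach and the noiseless signal are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def estimate_polarization_angle(angles_deg, intensities):
    """Recover phi (degrees, mod 180) from cos^2 samples via the 2nd harmonic."""
    th = np.radians(angles_deg)
    # cos^2(th - phi) = 0.5 + 0.5*cos(2th - 2phi): project onto cos(2th), sin(2th)
    c = np.sum(intensities * np.cos(2 * th))
    s = np.sum(intensities * np.sin(2 * th))
    phi = 0.5 * np.degrees(np.arctan2(s, c))
    return phi % 180.0

angles = np.arange(180.0)                  # one pixel per 1-degree polarizer step
true_phi = 37.25                           # assumed incident polarization angle
signal = np.cos(np.radians(angles - true_phi)) ** 2
est = estimate_polarization_angle(angles, signal)
```

Averaging over all 180 polarizer orientations is what lets the array beat the limited per-pixel extinction ratio.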
Preferred viewing distance and screen angle of electronic paper displays.
Shieh, Kong-King; Lee, Der-Song
2007-09-01
This study explored the viewing distance and screen angle for electronic paper (E-Paper) displays under various light sources, ambient illuminations, and character sizes. Data analysis showed that the mean viewing distance and screen angle were 495 mm and 123.7 degrees. The mean viewing distance for the Kolin cholesteric liquid crystal display was 500 mm, significantly longer than that for the Sony electronic ink display, 491 mm. The screen angle for Kolin was 127.4 degrees, significantly greater than that for Sony, 120.0 degrees. The various light sources revealed no significant effect on viewing distance; nevertheless, they showed a significant effect on screen angle. The screen angle for the sunlight lamp (D65) was similar to that for the fluorescent lamp (TL84), but greater than that for the tungsten lamp (F). Ambient illumination and E-Paper type had significant effects on viewing distance and screen angle: the higher the ambient illumination, the longer the viewing distance and the smaller the screen angle. Character size had a significant effect on viewing distance: the larger the character size, the longer the viewing distance. The results of this study indicated that the viewing distance for E-Paper was similar to that for a visual display terminal (VDT), at around 500 mm, but greater than that for normal paper, at about 360 mm. The mean screen angle was around 123.7 degrees, which in terms of viewing angle is 29.5 degrees below horizontal eye level. This result is similar to the generally suggested viewing angle of between 20 degrees and 50 degrees below the horizontal line of sight.
JPSS-1 VIIRS Pre-Launch Response Versus Scan Angle Testing and Performance
NASA Technical Reports Server (NTRS)
Moyer, David; McIntire, Jeff; Oudrari, Hassan; McCarthy, James; Xiong, Xiaoxiong; De Luccia, Frank
2016-01-01
The Visible Infrared Imaging Radiometer Suite (VIIRS) instruments on-board both the Suomi National Polar-orbiting Partnership (S-NPP) and the first Joint Polar Satellite System (JPSS-1) spacecraft, with launch dates of October 2011 and December 2016 respectively, are cross-track scanners with an angular swath of +/-56.06 deg. A four-mirror Rotating Telescope Assembly (RTA) is used for scanning, combined with a Half Angle Mirror (HAM) that directs light exiting from the RTA into the aft-optics. It has 14 Reflective Solar Bands (RSBs), seven Thermal Emissive Bands (TEBs) and a panchromatic Day Night Band (DNB). There are three internal calibration targets, the Solar Diffuser, the BlackBody and the Space View, that have fixed scan angles within the internal cavity of VIIRS. VIIRS has calibration requirements of 2% on RSB reflectance and as tight as 0.4% on TEB radiance, which requires the sensor's gain change across the scan, or Response Versus Scan angle (RVS), to be well quantified. A flow down of the top-level calibration requirements puts constraints of 0.2%-0.3% on the characterization of the RVS, but there are no specified limitations on the magnitude of response change across scan. The RVS change across scan angle can vary significantly between bands, with the RSBs having smaller changes of approximately 2% and some TEBs having approximately 10% variation. Within a band, the RVS has both detector and HAM side dependencies that vary across scan. Errors in the RVS characterization will contribute to image banding and striping artifacts if their magnitudes are above the noise level of the detectors. The RVS was characterized pre-launch for both S-NPP and JPSS-1 VIIRS, and a comparison of the RVS curves between these two sensors will be discussed.
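A hedged sketch of how a characterized RVS curve is applied in calibration: the raw response is divided by the response-versus-scan-angle factor so that a uniform scene yields a uniform retrieved signal. The quadratic RVS shape, its coefficients (giving roughly the ~2% RSB-like variation mentioned above), and the scene values are illustrative assumptions, not the VIIRS calibration code.

```python
import numpy as np

def apply_rvs(counts, scan_angle_deg, rvs_coeffs):
    """Normalize raw response by the RVS evaluated at each scan angle."""
    rvs = np.polyval(rvs_coeffs, scan_angle_deg)
    return counts / rvs

scan = np.linspace(-56.06, 56.06, 5)          # VIIRS scan range, degrees
rvs_coeffs = [2e-5, 1e-4, 1.0]                # assumed mild quadratic RVS
# What the sensor would report for a uniform scene of "1000" units:
raw = 1000.0 * np.polyval(rvs_coeffs, scan)
corrected = apply_rvs(raw, scan, rvs_coeffs)  # flat again after correction
```

Any error in the assumed RVS curve survives this division directly, which is why the characterization requirement (0.2%-0.3%) is so much tighter than the RVS variation itself.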
Shaft-Angle Sensor Based on Tunnel-Diode Oscillator
NASA Technical Reports Server (NTRS)
Chui, Talso
2008-01-01
A proposed brushless shaft-angle sensor for use in extreme cold would offer significant advantages over prior such sensors: (1) It would be capable of operating in extreme cold; and (2) Its electronic circuitry would be simpler than that of a permanent-magnet/ multiple-Hall-probe shaft-angle sensor that would otherwise ordinarily be used to obtain comparable angular resolution. The principle of operation of the proposed shaft-angle sensor requires that the shaft (or at least the portion of the shaft at the sensor location) be electrically insulating. The affected portion of the shaft would be coated with metal around half of its circumference. Two half-circular-cylinder electrodes having a radius slightly larger than that of the shaft would be mounted on the stator, concentric with the shaft, so that there would be a small radial gap between them and the outer surface of the shaft. Hence, there would be a capacitance between each stationary electrode and the metal coat on the shaft.
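A hedged sketch of the sensing principle described above: the capacitance between each stationary half-cylinder electrode and the half-metalized shaft is roughly proportional to their angular overlap, so the pair of capacitances encodes the shaft angle. The proportional-overlap model, the 1° binning, and the C_max value are illustrative assumptions.

```python
import numpy as np

def overlap_angle(shaft_angle_deg, electrode_start_deg):
    """Angular overlap (deg, 1-deg bins) between the 180-deg metal coat,
    starting at shaft_angle_deg, and a 180-deg electrode starting at
    electrode_start_deg."""
    bins = np.arange(360)
    metal = ((bins - shaft_angle_deg) % 360) < 180
    electrode = ((bins - electrode_start_deg) % 360) < 180
    return float(np.sum(metal & electrode))

def capacitances(shaft_angle_deg, c_max=10.0):
    """Capacitances (pF, assumed c_max) of the two opposing electrodes."""
    c1 = c_max * overlap_angle(shaft_angle_deg, 0.0) / 180.0
    c2 = c_max * overlap_angle(shaft_angle_deg, 180.0) / 180.0
    return c1, c2

# At 90 deg of shaft rotation the metal coat straddles both electrodes equally:
c1, c2 = capacitances(90.0)
```

Because the two overlaps are complementary, reading both capacitances gives an unambiguous angle over a half turn and a differential signal that rejects common-mode drift.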
NASA Astrophysics Data System (ADS)
Kusterer, M. B.; Mitchell, D. G.; Krimigis, S. M.; Vandegriff, J. D.
2014-12-01
Having been at Saturn for over a decade, the MIMI instrument on Cassini has created a rich dataset containing many details about Saturn's magnetosphere. In particular, the images of energetic neutral atoms (ENAs) taken by the Ion and Neutral Camera (INCA) offer a global perspective on Saturn's plasma environment. The MIMI team is now regularly making movies (in MP4 format) consisting of consecutive ENA images. The movies correct for spacecraft attitude changes by projecting the images (whose viewing angles can substantially vary from one image to the next) into a fixed inertial frame that makes it easy to view spatial features evolving in time. These movies are now being delivered to the PDS and are also available at the MIMI team web site. Several other higher order products are now also available, including 20-day energy-time spectrograms for the Charge-Energy-Mass Spectrometer (CHEMS) sensor, and daily energy-time spectrograms for the Low Energy Magnetospheric Measurements system (LEMMS) sensor. All spectrograms are available as plots or digital data in ASCII format. For all MIMI sensors, a Data User Guide is also available. This paper presents details and examples covering the specifics of MIMI higher order data products. URL: http://cassini-mimi.jhuapl.edu/
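The attitude correction described above amounts to rotating each image's look directions from the spacecraft frame into a fixed inertial frame. A hedged, minimal sketch of that operation follows; the rotation matrix and boresight vector are illustrative, not the MIMI team's pipeline.

```python
import numpy as np

def rot_z(angle_deg):
    """Rotation matrix about the z axis."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Assumed spacecraft-to-inertial attitude: a 90-degree yaw.
attitude = rot_z(90.0)
look_sc = np.array([1.0, 0.0, 0.0])     # pixel look direction, spacecraft frame
look_inertial = attitude @ look_sc      # same direction, fixed inertial frame
```

Applying the per-image attitude this way is what lets consecutive ENA frames be stacked into a movie even when the viewing angles vary substantially between images.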
ODERACS 2 White Spheres Optical Calibration Report
NASA Technical Reports Server (NTRS)
Culp, Robert D.; Gravseth, Ian; Gloor, Jason; Wantuch, Todd
1995-01-01
This report documents the status of the Orbital Debris Radar Calibration Spheres (ODERACS) 2 white spheres optical calibration study. The purpose of this study is to determine the spectral reflectivity and scattering characteristics in the visible wavelength region for the white spheres that were added to the project in the fall of 1994. Laboratory measurements were performed upon these objects and an analysis of the resulting data was conducted. These measurements are performed by illuminating the objects with a collimated beam of light and measuring the reflected light versus the phase angle. The phase angle is defined as the angle between the light source and the sensor, as viewed from the object. By measuring the reflected signal at the various phase angles, one is able to estimate the reflectance properties of the object. The methodology used in taking the measurements and reducing the data is presented. The results of this study will be used to support the calibration of ground-based optical instruments used in support of space debris research. Visible measurements will be made by the GEODSS, NASA and ILADOT telescopes.
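The phase-angle definition given above is a simple vector computation: the angle at the object between the direction to the light source and the direction to the sensor. A hedged sketch with illustrative geometry:

```python
import numpy as np

def phase_angle_deg(obj, source, sensor):
    """Angle (deg) at the object between its directions to source and sensor."""
    to_source = np.asarray(source, float) - np.asarray(obj, float)
    to_sensor = np.asarray(sensor, float) - np.asarray(obj, float)
    cosang = np.dot(to_source, to_sensor) / (
        np.linalg.norm(to_source) * np.linalg.norm(to_sensor))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Source along +x, sensor along +y, sphere at the origin -> 90-deg phase angle.
ang = phase_angle_deg([0, 0, 0], [10, 0, 0], [0, 5, 0])
```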
A Measuring System for Well Logging Attitude and a Method of Sensor Calibration
Ren, Yong; Wang, Yangdong; Wang, Mijian; Wu, Sheng; Wei, Biao
2014-01-01
This paper proposes an approach for measuring the azimuth angle and tilt angle of underground drilling tools with a MEMS three-axis accelerometer and a three-axis fluxgate sensor. A mathematical model of well logging attitude angle is deduced based on combining space coordinate transformations and algebraic equations. In addition, a system implementation plan of the inclinometer is given in this paper, which features low cost, small volume and integration. Aiming at the sensor and assembly errors, this paper analyses the sources of errors, and establishes two mathematical models of errors and calculates related parameters to achieve sensor calibration. The results show that this scheme can obtain a stable and high precision azimuth angle and tilt angle of drilling tools, with the deviation of the former less than ±1.4° and the deviation of the latter less than ±0.1°. PMID:24859028
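A hedged sketch of the attitude computation outlined above: tilt from the three-axis accelerometer (gravity direction) and azimuth from the fluxgate's horizontal field components. Axis conventions and the level, zero-tilt special case are assumptions; the paper's full model also includes coordinate transformations and calibration terms.

```python
import numpy as np

def tilt_deg(ax, ay, az):
    """Tilt of the tool axis (z) from vertical, from accelerometer axes."""
    return np.degrees(np.arctan2(np.hypot(ax, ay), az))

def azimuth_deg(mx, my):
    """Azimuth for the level (zero-tilt) case, from the horizontal field."""
    return np.degrees(np.arctan2(my, mx)) % 360.0

# Level tool: gravity on z only -> zero tilt; field along +x -> zero azimuth.
t = tilt_deg(0.0, 0.0, 9.81)
a = azimuth_deg(0.25, 0.0)
```

For a tilted tool the fluxgate reading must first be rotated into the horizontal plane using the accelerometer-derived tilt, which is where the paper's space coordinate transformations come in.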
Bidirectional measurements of surface reflectance for view angle corrections of oblique imagery
NASA Technical Reports Server (NTRS)
Jackson, R. D.; Teillet, P. M.; Slater, P. N.; Fedosejevs, G.; Jasinski, Michael F.
1990-01-01
An apparatus for acquiring bidirectional reflectance-factor data was constructed and used over four surface types. Data sets were obtained over a headed wheat canopy, bare soil having several different roughness conditions, playa (dry lake bed), and gypsum sand. Results are presented in terms of relative bidirectional reflectance factors (BRFs) as a function of view angle at a number of solar zenith angles, nadir BRFs as a function of solar zenith angles, and, for wheat, vegetation indices as related to view and solar zenith angles. The wheat canopy exhibited the largest BRF changes with view angle. BRFs for the red and the NIR bands measured over wheat did not have the same relationship with view angle. NIR/Red ratios calculated from nadir BRFs changed by nearly a factor of 2 when the solar zenith angle changed from 20 to 50 degs. BRF versus view angle relationships were similar for soils having smooth and intermediate rough surfaces but were considerably different for the roughest surface. Nadir BRF versus solar-zenith angle relationships were distinctly different for the three soil roughness levels. Of the various surfaces, BRFs for gypsum sand changed the least with view angle (10 percent at 30 degs).
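A hedged illustration of the vegetation indices referred to above, computed from nadir band reflectance factors. The BRF values are invented for illustration, chosen only to reproduce the reported roughly factor-of-2 NIR/Red ratio change between 20 and 50 degrees solar zenith.

```python
def ratio(nir, red):
    """Simple NIR/Red ratio vegetation index."""
    return nir / red

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

r20 = ratio(nir=0.50, red=0.050)   # assumed nadir BRFs near 20-deg solar zenith
r50 = ratio(nir=0.55, red=0.110)   # assumed nadir BRFs near 50-deg solar zenith
```

The example makes the practical point of the abstract concrete: an index built from nadir BRFs is not sun-angle invariant, so comparisons across acquisitions need a view/sun-angle correction.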
77 FR 47552 - Event Data Recorders
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-09
... uncertainties in multiple event crashes; Revised certain sensor ranges and accuracies to reflect current state... resolution specification of 5 degrees. In its petition the Alliance stated that steering wheel angle sensors... angle sensors. Both Nissan and GAM submitted comments in support of the Alliance and Honda petitions to...
Investigation of an optical sensor for small tilt angle detection of a precision linear stage
NASA Astrophysics Data System (ADS)
Saito, Yusuke; Arai, Yoshikazu; Gao, Wei
2010-05-01
This paper presents evaluation results of the characteristics of the angle sensor based on the laser autocollimation method for small tilt angle detection of a precision linear stage. The sensor consists of a laser diode (LD) as the light source, and a quadrant photodiode (QPD) as the position-sensing detector. A small plane mirror is mounted on the moving table of the stage as a target mirror for the sensor. This optical system has the advantages of high sensitivity, fast response speed and the ability for two-axis angle detection. On the other hand, the sensitivity of the sensor is determined by the size of the optical spot focused on the QPD, which is a function of the diameter of the laser beam projected onto the target mirror. Because the diameter is influenced by the divergence of the laser beam, this paper focuses on the relationship between the sensor sensitivity and the moving position of the target mirror (sensor working distance) over the moving stroke of the stage. The main error components that influence the sensor sensitivity are discussed and the optimal conditions of the optical system of the sensor are analyzed. Experimental results evaluating the effective working distance are also presented.
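A hedged sketch of the laser-autocollimation relation underlying such a sensor: a target-mirror tilt of θ deflects the reflected beam by 2θ, so the focused spot on the QPD shifts by Δx = 2·f·θ. The focal length and displacement values are illustrative, not the paper's hardware.

```python
import numpy as np

def tilt_from_spot_shift(dx_m, focal_length_m):
    """Small-angle mirror tilt (arcsec) from spot displacement on the detector."""
    theta_rad = dx_m / (2.0 * focal_length_m)  # reflection doubles the angle
    return np.degrees(theta_rad) * 3600.0

# A 1-micrometre spot shift with an assumed 50 mm focal length:
arcsec = tilt_from_spot_shift(1e-6, 0.050)     # ~2.06 arcsec of mirror tilt
```

The formula also shows why the focused spot size matters, as the abstract notes: the resolvable Δx on the QPD, and hence the angular sensitivity, degrades as the spot grows with beam divergence over the working distance.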
Setup and evaluation of a sensor tilting system for dimensional micro- and nanometrology
NASA Astrophysics Data System (ADS)
Schuler, Alexander; Weckenmann, Albert; Hausotte, Tino
2014-06-01
Sensors in micro- and nanometrology reach their limits when the measurement objects and surfaces feature high aspect ratios, high curvature and steep surface angles. Their measurable surface angle is limited, and exceeding it leads to measurement deviations and undetectable surface points. We demonstrate a principle for adapting the sensor's working angle during the measurement, keeping the sensor at its optimal working angle. After simulation of the principle, a hardware prototype was realized. It is based on a rotary kinematic chain with two rotary degrees of freedom, which extends the measurable surface angle to ±90° and is combined with a nanopositioning and nanomeasuring machine. By applying a calibration procedure with a quasi-tactile 3D sensor based on electrical near-field interaction, the systematic position deviation of the kinematic chain is reduced. The paper shows for the first time the completed setup and integration of the prototype, the performance results of the calibration, the measurements with the prototype and the tilting principle, and finishes with the interpretation and feedback of the practical results.
AOD trends during 2001-2010 from observations and model simulations
NASA Astrophysics Data System (ADS)
Pozzer, A.; de Meij, A.; Yoon, J.; Tost, H.; Georgoulias, A. K.; Astitha, M.
2015-05-01
The aerosol optical depth (AOD) trend between 2001 and 2010 is estimated globally and regionally from observations and results from simulations with the EMAC (ECHAM5/MESSy Atmospheric Chemistry) model. Although interannual variability is applied only to anthropogenic and biomass-burning emissions, the model is able to quantitatively reproduce the AOD trends as observed by the MODIS (Moderate Resolution Imaging Spectroradiometer) satellite sensor, while some discrepancies are found when compared to MISR (Multi-angle Imaging SpectroRadiometer) and SeaWiFS (Sea-viewing Wide Field-of-view Sensor) observations. Thanks to an additional simulation without any change in emissions, it is shown that decreasing AOD trends over the US and Europe are due to the decrease in the emissions, while over the Sahara Desert and the Middle East region, the meteorological changes play a major role. Over Southeast Asia, both meteorology and emissions changes are equally important in defining AOD trends. Additionally, decomposing the regional AOD trends into individual aerosol components reveals that the soluble components are the most dominant contributors to the total AOD, as their influence on the total AOD is enhanced by the aerosol water content.
Mendoza, Beatriz R.; Rodríguez, Silvestre; Pérez-Jiménez, Rafael; Ayala, Alejandro; González, Oswaldo
2016-01-01
In general, the use of angle-diversity receivers makes it possible to reduce the impact of ambient light noise, path loss and multipath distortion, in part by exploiting the fact that they often receive the desired signal from different directions. Angle-diversity detection can be performed using a composite receiver with multiple detector elements looking in different directions. These are called non-imaging angle-diversity receivers. In this paper, a comparison of three non-imaging angle-diversity receivers as input sensors of nodes for an indoor infrared (IR) wireless sensor network is presented. The receivers considered are the conventional angle-diversity receiver (CDR), the sectored angle-diversity receiver (SDR), and the self-orienting receiver (SOR), which have been proposed or studied by research groups in Spain. To this end, the effective signal-collection area of the three receivers is modelled and a Monte-Carlo-based ray-tracing algorithm is implemented which allows us to investigate the effect on the signal to noise ratio and main IR channel parameters, such as path loss and rms delay spread, of using the three receivers in conjunction with different combination techniques in IR links operating at low bit rates. Based on the results of the simulations, we show that the use of a conventional angle-diversity receiver in conjunction with the equal-gain combining technique provides the solution with the best signal to noise ratio, the lowest computational capacity and the lowest transmitted power requirements, which comprise the main limitations for sensor nodes in an indoor infrared wireless sensor network. PMID:27428966
A Novel Low-Cost, Large Curvature Bend Sensor Based on a Bowden-Cable
Jeong, Useok; Cho, Kyu-Jin
2016-01-01
Bend sensors have been developed based on conductive ink, optical fiber, and electronic textiles. Each type has advantages and disadvantages in terms of performance, ease of use, and cost. This study proposes a new and low-cost bend sensor that can measure a wide range of accumulated bend angles with large curvatures. This bend sensor utilizes a Bowden-cable, which consists of a coil sheath and an inner wire. Displacement changes of the Bowden-cable's inner wire, when the shape of the sheath changes, have been considered to be a position error in previous studies. However, this study takes advantage of this position error to detect the bend angle of the sheath. The bend angle of the sensor can be calculated from the displacement measurement of the sensing wire using a Hall-effect sensor or a potentiometer. Simulations and experiments have shown that the accumulated bend angle of the sensor is linearly related to the sensor signal, with an R-square value up to 0.9969 and a root mean square error of 2% of the full sensing range. The proposed sensor is not affected by a bend curvature of up to 80.0 m⁻¹, unlike previous bend sensors. The proposed sensor is expected to be useful for various applications, including motion capture devices, wearable robots, surgical devices, or generally any device that requires an affordable and low-cost bend sensor. PMID:27347959
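A hedged sketch of the geometric relation being exploited above: when the sheath accumulates a bend angle θ, an inner wire offset r from the sheath's neutral axis is displaced by Δl = r·θ, independent of where along the sheath the bending occurs. The offset value is an illustrative assumption, not the paper's hardware dimension.

```python
import numpy as np

def bend_angle_deg(displacement_m, wire_offset_m):
    """Accumulated bend angle (deg) from inner-wire displacement: dl = r*theta."""
    return np.degrees(displacement_m / wire_offset_m)

# 1 mm of wire displacement with an assumed 0.5 mm offset -> 2 rad of bend.
ang = bend_angle_deg(1.0e-3, 0.5e-3)   # ~114.6 degrees accumulated bend
```

The linearity of Δl in θ is what makes a simple calibration slope (via a Hall-effect sensor or potentiometer reading of Δl) sufficient, as the abstract's R² = 0.9969 result suggests.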
NASA Technical Reports Server (NTRS)
Smith, G. A.
1975-01-01
The attitude of a spacecraft is determined by specifying independent parameters which relate the spacecraft axes to an inertial coordinate system. Sensors which measure angles between spin axis and other vectors directed to objects or fields external to the spacecraft are discussed. For the spin-stabilized spacecraft considered, the spin axis is constant over at least an orbit, but separate solutions based on sensor angle measurements are different due to propagation of errors. Sensor-angle solution methods are described which minimize the propagated errors by making use of least squares techniques over many sensor angle measurements and by solving explicitly (in closed form) for the spin axis coordinates. These methods are compared with star observation solutions to determine if satisfactory accuracy is obtained by each method.
Mechanism For Adjustment Of Commutation Of Brushless Motor
NASA Technical Reports Server (NTRS)
Schaefer, Richard E.
1995-01-01
Mechanism enables adjustment of angular position of set of Hall-effect devices that sense instantaneous shaft angle of brushless dc motor. Outputs of sensors fed to commutation circuitry. Measurement of shaft angle essential for commutation; that is, application of voltage to stator windings must be synchronized with shaft angle. To obtain correct angle measurement for commutation, Hall-effect angle sensors positioned at proper reference angle. The present mechanism accelerates adjustment procedure and makes it possible to obtain more accurate indication of minimum-current position because it provides for adjustment while motor running.
Yamashita, Wakayo; Wang, Gang; Tanaka, Keiji
2010-01-01
One usually fails to recognize an unfamiliar object across changes in viewing angle when it has to be discriminated from similar distractor objects. Previous work has demonstrated that after long-term experience in discriminating among a set of objects seen from the same viewing angle, immediate recognition of the objects across 30-60 degrees changes in viewing angle becomes possible. The capability for view-invariant object recognition should develop during the within-viewing-angle discrimination, which includes two kinds of experience: seeing individual views and discriminating among the objects. The aim of the present study was to determine the relative contribution of each factor to the development of view-invariant object recognition capability. Monkeys were first extensively trained in a task that required view-invariant object recognition (Object task) with several sets of objects. The animals were then exposed to a new set of objects over 26 days in one of two preparatory tasks: one in which each object view was seen individually, and a second that required discrimination among the objects at each of four viewing angles. After the preparatory period, we measured the monkeys' ability to recognize the objects across changes in viewing angle, by introducing the object set to the Object task. Results indicated significant view-invariant recognition after the second but not first preparatory task. These results suggest that discrimination of objects from distractors at each of several viewing angles is required for the development of view-invariant recognition of the objects when the distractors are similar to the objects.
Koyama, Shinzo; Onozawa, Kazutoshi; Tanaka, Keisuke; Saito, Shigeru; Kourkouss, Sahim Mohamed; Kato, Yoshihisa
2016-08-08
We developed multiocular 1/3-inch 2.75-μm-pixel-size 2.1M-pixel image sensors by co-design of both an on-chip beam-splitter and a 100-nm-width, 800-nm-depth patterned inner meta-micro-lens for single-main-lens stereo camera systems. A camera with the multiocular image sensor can capture a horizontally one-dimensional light field, with the on-chip beam-splitter horizontally dividing rays according to incident angle and the inner meta-micro-lens collecting the divided rays into pixels with small optical loss. Cross-talk between adjacent light field images of a fabricated binocular image sensor and of a quad-ocular image sensor is as low as 6% and 7%, respectively. By selecting two images from the one-dimensional light field images, a selectable baseline for stereo vision is realized to view close objects with a single main lens. In addition, by adding multiple light field images with different ratios, the baseline distance can be tuned within the aperture of the main lens. We suggest this electrically selectable or tunable baseline stereo vision to reduce the 3D fatigue of viewers.
A generalized technique for using cones and dihedral angles in attitude determination, revision 1
NASA Technical Reports Server (NTRS)
Werking, R. D.
1973-01-01
Analytic development is presented for a general least squares attitude determination subroutine applicable to spinning satellites. The method is founded on a geometric approach which is completely divorced from considerations relating to particular types and configurations of onboard attitude sensors. Any mix of sensor measurements which can be first transformed (outside the program) to cone or dihedral angle data can be processed. A cone angle is an angle between the spin axis and a known direction line in space; a dihedral angle is an angle between two planes formed by the spin axis and each of two known direction lines. Many different kinds of sensor data can be transformed to these angles, which in turn constitute the actual program inputs, so that the subroutine can be applied without change to a variety of satellite missions. Either a constant or dynamic spin axis model can be handled. The program is also capable of solving for fixed biases in the input angles, in addition to the spin axis attitude solution.
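A hedged sketch of the cone-angle geometry described above: each cone angle η_i to a known direction v_i gives the linear condition v_i · s = cos(η_i) on the unit spin axis s, which can be solved by linear least squares over many measurements and renormalized. The directions and angles below are synthetic; the actual subroutine also handles dihedral angles, dynamic spin-axis models, and bias solving.

```python
import numpy as np

def solve_spin_axis(directions, cone_angles_deg):
    """Least-squares unit spin axis from cone angles to known directions."""
    V = np.asarray(directions, float)            # rows: known unit directions v_i
    c = np.cos(np.radians(cone_angles_deg))      # v_i . s = cos(eta_i)
    s, *_ = np.linalg.lstsq(V, c, rcond=None)
    return s / np.linalg.norm(s)

true_s = np.array([0.0, 0.0, 1.0])               # assumed true spin axis
dirs = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [np.sqrt(0.5), 0.0, np.sqrt(0.5)]])
etas = np.degrees(np.arccos(dirs @ true_s))      # noiseless cone-angle "measurements"
s_hat = solve_spin_axis(dirs, etas)
```

With noisy measurements the same overdetermined system averages down the error, which is the least-squares benefit the abstract refers to.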
Object detection system using SPAD proximity detectors
NASA Astrophysics Data System (ADS)
Stark, Laurence; Raynor, Jeffrey M.; Henderson, Robert K.
2011-10-01
This paper presents an object detection system based upon the use of multiple single photon avalanche diode (SPAD) proximity sensors operating upon the time-of-flight (ToF) principle, whereby the co-ordinates of a target object in a coordinate system relative to the assembly are calculated. The system is similar to a touch screen system in form and operation except that the lack of requirement of a physical sensing surface provides a novel advantage over most existing touch screen technologies. The sensors are controlled by FPGA-based firmware and each proximity sensor in the system measures the range from the sensor to the target object. A software algorithm is implemented to calculate the x-y coordinates of the target object based on the distance measurements from at least two separate sensors and the known relative positions of these sensors. Existing proximity sensors were capable of determining the distance to an object with centimetric accuracy and were modified to obtain a wide field of view in the x-y axes with low beam angle in z in order to provide a detection area as large as possible. Design and implementation of the firmware, electronic hardware, mechanics and optics are covered in the paper. Possible future work would include characterisation with alternative designs of proximity sensors, as this is the component which determines the highest achievable accuracy of the system.
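A hedged sketch of the coordinate calculation described above: two sensors at known positions each measure a range to the target, and the target's x-y position follows from intersecting the two range circles. The sensor spacing and the sign choice (target in front of the baseline, y > 0) are assumptions; the paper's algorithm may differ in detail.

```python
import numpy as np

def locate(d, r1, r2):
    """Target (x, y) from ranges r1, r2 to sensors at (0, 0) and (d, 0)."""
    x = (d**2 + r1**2 - r2**2) / (2.0 * d)     # circle-intersection x
    y = np.sqrt(max(r1**2 - x**2, 0.0))        # take the y > 0 solution
    return x, y

# Sensors 0.4 m apart; target actually at (0.1, 0.2):
r1 = np.hypot(0.1, 0.2)
r2 = np.hypot(0.1 - 0.4, 0.2)
x, y = locate(0.4, r1, r2)
```

With more than two sensors the same equations become overdetermined and can be solved by least squares, improving robustness to the centimetric ranging noise.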
Vection in patients with glaucoma.
Tarita-Nistor, Luminita; Hadavi, Shahriar; Steinbach, Martin J; Markowitz, Samuel N; González, Esther G
2014-05-01
Large moving scenes can induce a sensation of self-motion in stationary observers. This illusion is called "vection." Glaucoma progressively affects the functioning of peripheral vision, which plays an important role in inducing vection. It is still not known whether vection can be induced in these patients and, if it can, whether the interaction between visual and vestibular inputs is solved appropriately. The aim of this study was to investigate vection responses in patients with mild to moderate open-angle glaucoma. Fifteen patients with mild to moderate glaucoma and 15 age-matched controls were exposed to a random-dot pattern at a short viewing distance and in a dark room. The pattern was projected on a large screen and rotated clockwise with an angular speed of 45 degrees per second to induce a sensation of self-rotation. Vection latency, vection duration, and objective and subjective measures of tilt were obtained in three viewing conditions (binocular, and monocular with each eye). Each condition lasted 2 minutes. Patients with glaucoma had longer vection latencies (p = 0.005) than, but the same vection duration as, age-matched controls. Viewing condition did not affect vection responses for either group. The control group estimated the tilt angle as being significantly larger than the actual maximum tilt angle measured with the tilt sensor (p = 0.038). There was no relationship between vection measures and visual field sensitivity for the glaucoma group. These findings suggest that, despite an altered visual input that delays vection, the neural responses involved in canceling the illusion of self-motion remain intact in patients with mild peripheral visual field loss.
Yoo, Kwang Soo; Han, Soo Deok; Moon, Hi Gyu; Yoon, Seok-Jin; Kang, Chong-Yun
2015-01-01
As highly sensitive H2S gas sensors, Au- and Ag-catalyzed SnO2 thin films with morphology-controlled nanostructures were fabricated by using e-beam evaporation in combination with the glancing angle deposition (GAD) technique. After annealing at 500 °C for 40 h, the sensors showed a polycrystalline phase with a porous, tilted columnar nanostructure. The gas sensitivities (S = R_gas/R_air) of the Au- and Ag-catalyzed SnO2 sensors fabricated by the GAD process were 0.009 and 0.015, respectively, under 5 ppm H2S at 300 °C, and the 90% response time was approximately 5 s. These sensors showed excellent sensitivities compared with the SnO2 thin film sensors that were deposited normally (glancing angle = 0°, S = 0.48). PMID:26134105
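A hedged note on the sensitivity figure used above: S = R_gas/R_air is the ratio of the sensor's resistance under the target gas to its resistance in clean air; for an n-type oxide exposed to a reducing gas such as H2S the resistance drops, so a smaller S means a stronger response. The resistance values below are illustrative assumptions, not measured data.

```python
def sensitivity(r_gas_ohm, r_air_ohm):
    """Gas sensitivity S = R_gas / R_air (smaller = stronger response here)."""
    return r_gas_ohm / r_air_ohm

# An assumed drop from 1 Mohm in air to 15 kohm under 5 ppm H2S gives S = 0.015,
# the same order as the Ag-catalyzed GAD film reported above.
s = sensitivity(15e3, 1e6)
```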
77 FR 73282 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-10
... system for the angle of attack sensor, the total air temperature, and the pitot probes. We are issuing this AD to prevent ice from forming on air data system sensors and consequent loss of or misleading... angle of attack sensor, the total air temperature, and the pitot probes. Actions Since Issuance of NPRM...
NASA Astrophysics Data System (ADS)
Leal-Junior, Arnaldo G.; Vargas-Valencia, Laura; dos Santos, Wilian M.; Schneider, Felipe B. A.; Siqueira, Adriano A. G.; Pontes, Maria José; Frizera, Anselmo
2018-07-01
This paper presents a low cost and highly reliable system for angle measurement based on a sensor fusion between inertial and fiber optic sensors. The system consists of the sensor fusion through Kalman filter of two inertial measurement units (IMUs) and an intensity variation-based polymer optical fiber (POF) curvature sensor. In addition, the IMU was applied as a reference for a compensation technique of POF curvature sensor hysteresis. The proposed system was applied on the knee angle measurement of a lower limb exoskeleton in flexion/extension cycles and in gait analysis. Results show the accuracy of the system, where the Root Mean Square Error (RMSE) between the POF-IMU sensor system and the encoder was below 4° in the worst case and about 1° in the best case. Then, the POF-IMU sensor system was evaluated as a wearable sensor for knee joint angle assessment without the exoskeleton, where its suitability for this purpose was demonstrated. The results obtained in this paper pave the way for future applications of sensor fusion between electronic and fiber optic sensors in movement analysis.
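A hedged, minimal sketch of the fusion idea described above: a scalar Kalman filter that predicts the joint angle from an IMU-derived angular rate and corrects it with the POF curvature-sensor angle reading. The noise variances, time step, constant-rate trajectory, and measurement noise level are illustrative assumptions, not the paper's tuned filter.

```python
import numpy as np

def fuse(rates, pof_angles, dt=0.01, q=0.01, r=4.0):
    """1-state Kalman filter: predict with rate, correct with POF angle."""
    x, p = pof_angles[0], 1.0
    out = []
    for rate, z in zip(rates, pof_angles):
        x, p = x + rate * dt, p + q            # predict with IMU angular rate
        k = p / (p + r)                        # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p    # correct with POF angle reading
        out.append(x)
    return np.array(out)

t = np.arange(200) * 0.01
true = 30.0 * t                                 # assumed knee-angle ramp, deg
rng = np.random.default_rng(0)
noisy_pof = true + rng.normal(0.0, 2.0, 200)    # assumed 2-deg POF noise
est = fuse(np.full(200, 30.0), noisy_pof)       # perfect 30 deg/s rate assumed
rmse = np.sqrt(np.mean((est - true) ** 2))
```

Even this toy filter pulls the error well below the raw measurement noise, illustrating why the fused POF-IMU angle can approach encoder-level accuracy.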
Optical polarization: background and camouflage
NASA Astrophysics Data System (ADS)
Škerlind, Christina; Hallberg, Tomas; Eriksson, Johan; Kariis, Hans; Bergström, David
2017-10-01
Polarimetric imaging sensors in the electro-optical region, already available militarily and commercially in both the visual and infrared, show enhanced capabilities for advanced target detection and recognition. The capabilities arise from the ability to discriminate between man-made and natural background surfaces using the polarization information of light. In the development of materials for signature management in the visible and infrared wavelength regions, different criteria need to be met to fulfil the requirements for a good camouflage against modern sensors. In conventional camouflage design, the aim is to spectrally match or adapt the surface properties of an object to a background, thereby minimizing the contrast seen by a specific threat sensor. Examples will be shown from measurements of some relevant materials and of how they affect the polarimetric signature in different ways. Properties that dimension an optical camouflage from a polarimetric perspective, such as the degree of polarization, the viewing or incident angle, and the amount of diffuse reflection, mainly in the infrared region, will be discussed.
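A hedged sketch of the degree-of-(linear-)polarization figure discussed above, computed from the linear Stokes parameters as a polarimetric imager might estimate them from intensities behind polarizers at 0/45/90/135 degrees. The four intensity values are illustrative.

```python
import numpy as np

def stokes_dolp(i0, i45, i90, i135):
    """Degree of linear polarization from four polarizer-filtered intensities."""
    I = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    Q = i0 - i90                        # linear Stokes parameters
    U = i45 - i135
    return np.hypot(Q, U) / I

d_unpol = stokes_dolp(1.0, 1.0, 1.0, 1.0)   # unpolarized light -> DoLP = 0
d_pol = stokes_dolp(1.0, 0.5, 0.0, 0.5)     # fully polarized at 0 deg -> DoLP = 1
```

Natural backgrounds tend toward low DoLP while smooth man-made surfaces polarize strongly on reflection, which is the contrast mechanism the signature-management discussion above turns on.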
New light-shielding technique for shortening the baffle length of a star sensor
NASA Astrophysics Data System (ADS)
Kawano, Hiroyuki; Sato, Yukio; Mitani, Kenji; Kanai, Hiroshi; Hama, Kazumori
2002-10-01
We have developed a star sensor with a short, 140 mm baffle. The baffle provides a Sun rejection angle of 35 degrees with stray light attenuated below the intensity level of a visual magnitude of Mv = +5 for a wide-field-of-view lens of 13x13 degrees. A new light-shielding technique that takes advantage of total internal reflection enables us to reduce the baffle length to about three fourths that of a conventional two-stage baffle. We introduced two ideas to shorten the baffle. The first is the use of a nearly half-sphere convex lens as the first focusing lens: its bottom surface reflects scattered rays with incident angles above 50 degrees by total internal reflection. The second is painting the surface of the baffle with gloss, rather than frosted, black paint; the gloss black paint allows most of the specularly reflected rays to return to outer space without scattering. We confirmed the baffle performance described above by scattered-ray tracing simulation and by a light attenuation experiment in a darkroom on the ground.
NASA Astrophysics Data System (ADS)
Lecerf, R.; Baret, F.; Hanocq, J.; Marloie, O.; Rautiainen, M.; Mottus, M.; Heiskanen, J.; Stenberg, P.
2010-12-01
The LAI (Leaf Area Index) is a key variable for analyzing and modeling vegetation and its interactions with the atmosphere and soils. LAI maps derived from remote sensing images are often validated with non-destructive LAI measurements obtained from digital hemispherical photography, LAI-2000, or ceptometer instruments. These methods are expensive and time consuming, particularly when human intervention is needed; consequently it is difficult to acquire overlapping field data and remotely sensed LAI. There is a need for a cheap, autonomous, easy-to-use ground system that measures foliage development and senescence at least daily, in order to increase the number of validation sites where vegetation phenology is continuously monitored. A system called PASTIS-57 (PAI Autonomous System from Transmittance Instantaneous Sensors oriented at 57°), devoted to PAI (Plant Area Index) ground measurements, was developed to answer this need. PASTIS-57 consists of 6 sensors plugged into one logger that records data at a tunable sampling interval of one to a few minutes, with up to 3 months of autonomy (energy and data storage). The sensors are connected to the logger with 2x10 m, 2x6 m, and 2x2 m wires; the distance between sensors was chosen to obtain representative spatial sampling over a 20 m pixel corresponding to an Elementary Sampling Unit (ESU). The PASTIS-57 sensors are made of photodiodes that measure the incoming light in the blue wavelengths to maximize the contrast between vegetation and sky and to limit multiple scattering effects in the canopy. The diodes are oriented to the north to avoid direct sunlight and point to a zenith angle of 57° to minimize leaf angle distribution and plant clumping effects. The field of view of the diodes was set to ±20° to account for vegetation cover heterogeneity and to minimize environmental effects. The sensors were calibrated after recording data on a clear-view site for a week.
After calibration, the sensors were installed on several study sites, including a boreal forest in Finland and an agricultural area in southern France. On each study site, several ESUs were equipped with 2 to 4 systems. The sensors were installed along an east-west line and pointed north. A reference system was set up to monitor the unobstructed incident radiation field. The results show that the transmitted light recorded by the sensors depends on gap fraction and may be used to measure the PAI. The time series acquired with PASTIS-57 show strong correlation with plant phenology. PAI values were then derived from the measured gap fractions. Advantages and limitations of the system are finally discussed, with emphasis on potential operational use within networks of sites.
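The 57° viewing zenith is chosen because there the gap-fraction inversion is nearly independent of leaf angle distribution (the effective extinction coefficient is close to 0.5/cos θ for any foliage orientation). A sketch of this standard inversion, not necessarily the exact PASTIS-57 processing chain, is:

```python
import math

def pai_from_gap_fraction(gap_fraction, theta_deg=57.5):
    """Plant Area Index from the canopy gap fraction measured near a 57°
    view zenith angle, where P(theta) = exp(-0.5 * PAI / cos(theta))
    holds almost regardless of leaf angle distribution, so
    PAI = -2 * cos(theta) * ln(P)."""
    return -2.0 * math.cos(math.radians(theta_deg)) * math.log(gap_fraction)
```

In practice the gap fraction would be derived from the ratio of below-canopy to reference (unobstructed) transmittance readings before applying this inversion.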
Auto-converging stereo cameras for 3D robotic tele-operation
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Aycock, Todd; Chenault, David
2012-06-01
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface so the operator could adjust the camera convergence angle, an adjustment that permitted optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
NASA Astrophysics Data System (ADS)
Baek, Jong-In; Kim, Ki-Han; Kim, Jae Chang; Yoon, Tae-Hoon
2010-01-01
This paper proposes a method of omni-directional viewing-angle switching by controlling the beam diverging angle (BDA) in a liquid crystal (LC) panel. LCs aligned randomly by in-cell polymer structures diffuse the collimated backlight for the bright state of the wide-viewing-angle mode. For the narrow-viewing-angle mode, we align the LCs homogeneously by applying an in-plane field; scattering is then significantly reduced, so the small BDA is maintained as the light passes through the LC layer. The dark state can be obtained by aligning the LCs homeotropically with a vertical electric field. We demonstrated experimentally the omni-directional switching of the viewing angle without an additional panel or backlight system.
Calibration Plans for the Multi-angle Imaging SpectroRadiometer (MISR)
NASA Astrophysics Data System (ADS)
Bruegge, C. J.; Duval, V. G.; Chrien, N. L.; Diner, D. J.
1993-01-01
The EOS Multi-angle Imaging SpectroRadiometer (MISR) will study the ecology and climate of the Earth through acquisition of global multi-angle imagery. The MISR employs nine discrete cameras, each a push-broom imager. Of these, four point forward, four point aft and one views the nadir. Absolute radiometric calibration will be obtained pre-flight using high quantum efficiency (HQE) detectors and an integrating sphere source. After launch, instrument calibration will be provided using HQE detectors in conjunction with deployable diffuse calibration panels. The panels will be deployed at time intervals of one month and used to direct sunlight into the cameras, filling their fields-of-view and providing through-the-optics calibration. Additional techniques will be utilized to reduce systematic errors, and provide continuity as the methodology changes with time. For example, radiation-resistant photodiodes will also be used to monitor panel radiant exitance. These data will be acquired throughout the five-year mission, to maintain calibration in the latter years when it is expected that the HQE diodes will have degraded. During the mission, it is planned that the MISR will conduct semi-annual ground calibration campaigns, utilizing field measurements and higher resolution sensors (aboard aircraft or in-orbit platforms) to provide a check of the on-board hardware. These ground calibration campaigns are limited in number, but are believed to be the key to the long-term maintenance of MISR radiometric calibration.
Study of optical techniques for the Ames unitary wind tunnels. Part 3: Angle of attack
NASA Technical Reports Server (NTRS)
Lee, George
1992-01-01
A review was conducted of optical sensors capable of accurate angle-of-attack measurements in wind tunnels. These include sensors being used or developed at the NASA Ames and Langley Research Centers, Boeing Airplane Company, McDonnell Aircraft Company, Arnold Engineering Development Center, the National Aerospace Laboratory of the Netherlands, the National Research Council of Canada, and the Royal Aircraft Establishment of England. Some commercial sensors that may be applicable to accurate angle measurements were also reviewed. The optical sensor systems were found to be based on interferometers, polarized-light detectors, linear or area photodiode cameras, position-sensing photodetectors, and laser scanners. Several of the optical sensors can meet the requirements of the Ames Unitary Plan Wind Tunnel. Two of these, the Boeing interferometer and the Complere lateral-effect photodiode sensors, are being developed for the Ames Unitary Plan Wind Tunnel.
A star tracker insensitive to stray light generated by radiation sources close to the field of view
NASA Astrophysics Data System (ADS)
Romoli, A.; Gambicorti, L.; Simonetti, F.; Zuccaro Marchi, A.
2017-11-01
The aim of this work is to propose an innovative star tracker that is practically insensitive to radiation coming from the Sun or from other strong planetary sources just outside the field of view. These sources must be blocked in some way; the classical solution for rejecting unwanted radiation is to place a shadow (or baffle) before the star tracker objective. The shadow size depends on the field of view and on the minimum angle subtended by the source (i.e., the Sun) with respect to the optical axis of the star tracker: the smaller this angle, the larger the shadow. Requests for star trackers able to work with the Sun as close as possible to the field of view are increasing, owing to the need for maximum mission flexibility. The innovation of the proposed star tracker is the use of spatial filtering, with a concept complementary to that of the coronagraph for solar corona observation, allowing the size of the shadow to be drastically reduced. It can also work close to antennas and other parts of the platform which, when illuminated by the Sun, become secondary sources capable of blinding the star tracker. This accommodation offers three main advantages: no cumbersome shadows (baffles), maximum flexibility in terms of mission profile, and fewer platform location constraints. This star sensor concept, dated 2007, is now patent pending; Galileo Avionica (now Selex Galileo) is the owner of the patent.
A multi-camera system for real-time pose estimation
NASA Astrophysics Data System (ADS)
Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin
2007-04-01
This paper presents a multi-camera system that performs face detection and pose estimation in real time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color, and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model in which the head is modeled as a spherical object that rotates about the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
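The spherical-head geometry can be illustrated with a toy estimator: under orthographic projection, the horizontal separation of the eyes shrinks roughly as the cosine of the yaw angle. This is a deliberate simplification of the paper's combined triangle-angle equations, and it assumes the frontal inter-eye distance is known:

```python
import math

def estimate_yaw(eye_dist_observed, eye_dist_frontal):
    """Illustrative yaw estimate for a spherical head model rotating
    about the vertical axis: the projected inter-eye distance scales
    approximately as cos(yaw), so yaw = acos(d_observed / d_frontal)."""
    ratio = max(-1.0, min(1.0, eye_dist_observed / eye_dist_frontal))
    return math.degrees(math.acos(ratio))
```

The sign of the yaw (left vs. right turn) would need extra cues, e.g. the offset of the eyes-mouth triangle from the face-region center, which is one reason the full system combines several equations.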
Position and orientation determination system and method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harpring, Lawrence J.; Farfan, Eduardo B.; Gordon, John R.
A position determination system and method is provided that may be used for obtaining position and orientation information of a detector in a contaminated room. The system includes a detector, a sensor operably coupled to the detector, and a motor coupled to the sensor to move the sensor around the detector. A CPU controls the operation of the motor to move the sensor around the detector and determines distance and angle data from the sensor to an object. The method includes moving a sensor around the detector and measuring distance and angle data from the sensor to an object at incremental positions around the detector.
A view finder control system for an earth observation satellite
NASA Astrophysics Data System (ADS)
Steyn, H.
2004-11-01
A real-time TV view finder is used on board a low Earth orbiting (LEO) satellite to manually select targets for imaging from a ground station within the communication footprint of the satellite. The attitude control system steers the satellite using commands from the ground station, and a television camera on board then downlinks a television signal in real time to a monitor screen in the ground station. The operator in the feedback loop is able to manually steer the boresight of the satellite's main imager towards target areas of interest, e.g. to avoid clouds or to correct attitude pointing errors. Owing to a substantial delay (on the order of a second) in the view-finding feedback loop and the narrow field of view of the main imager, the operator has to be assisted by the onboard attitude control system to stabilise and track the target area visible on the monitor screen. This paper presents the extended Kalman filter used to estimate the satellite's attitude angles, using quaternions, and the bias vector component of the 3-axis inertial rate sensors (gyros). Absolute attitude sensors (i.e. sun, horizon and magnetic) supply the measurement vectors used to correct the filter states during the view finder manoeuvres. The target tracking and rate steering reaction wheel controllers that accurately point and stabilise the satellite are presented, and the reference generator for the satellite-to-target attitude and rate vectors used by the reaction wheel controllers is derived.
Exponential Modelling for Mutual-Cohering of Subband Radar Data
NASA Astrophysics Data System (ADS)
Siart, U.; Tejero, S.; Detlefsen, J.
2005-05-01
Increasing resolution and accuracy is an important issue in almost any type of radar sensor application. However, both resolution and accuracy are strongly related to the available signal bandwidth and energy. Nowadays, several sensors operating in different frequency bands are often available on a single sensor platform, and it is an attractive goal to exploit advanced signal modelling and optimization procedures by making proper use of information from the different frequency bands at the RF signal level. An important prerequisite for optimal use of signal energy is coherence between all contributing sensors, but coherent multi-sensor platforms are very expensive and thus not generally available. This paper presents an approach for accurately estimating object radar responses using subband measurements at different RF frequencies. An exponential model approach compensates for the lack of mutual coherence between independently operating sensors; mutual coherence is recovered from the a priori information that both sensors have common scattering centers in view. Minimizing the total squared deviation between the measured data and a full-range exponential signal model leads to more accurate pole angles and pole magnitudes than single-band optimization. The model parameters (range and magnitude of point scatterers) after this full-range optimization are also more accurate than those obtained from a commonly used super-resolution procedure (root-MUSIC) applied to the non-coherent subband data.
Absolute Calibration of Optical Satellite Sensors Using Libya 4 Pseudo Invariant Calibration Site
NASA Technical Reports Server (NTRS)
Mishra, Nischal; Helder, Dennis; Angal, Amit; Choi, Jason; Xiong, Xiaoxiong
2014-01-01
The objective of this paper is to report improvements in an empirical absolute calibration model developed at South Dakota State University using the Libya 4 (+28.55°, +23.39°) pseudo-invariant calibration site (PICS). The approach used Terra MODIS as the radiometer to develop an absolute calibration model for the spectral channels covered by this instrument from the visible to the shortwave infrared. Earth Observing One (EO-1) Hyperion, with a spectral resolution of 10 nm, was used to extend the model across the visible and near-infrared regions. A simple Bidirectional Reflectance Distribution Function (BRDF) model was generated using Terra Moderate Resolution Imaging Spectroradiometer (MODIS) observations over Libya 4, and the resulting model was validated with nadir data acquired from satellite sensors such as Aqua MODIS and the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+). The improvements to the absolute calibration model, which account for the BRDF of off-nadir measurements and for annual variations in the atmosphere, are summarized. BRDF models for off-nadir viewing angles were derived from EO-1 Hyperion measurements. In addition to L7 ETM+, measurements from other sensors, such as Aqua MODIS, UK-2 Disaster Monitoring Constellation (DMC), the ENVISAT Medium Resolution Imaging Spectrometer (MERIS), and the Operational Land Imager (OLI) onboard Landsat 8 (L8), launched in February 2013, were employed to validate the model. These satellite sensors differ in the width of their spectral bandpasses, overpass time, off-nadir viewing capability, spatial resolution, temporal revisit time, etc. The results demonstrate that the proposed empirical calibration model has an accuracy of the order of 3% with an uncertainty of about 2% for the sensors used in the study.
Gaze and viewing angle influence visual stabilization of upright posture
Ustinova, KI; Perkins, J
2011-01-01
Focusing gaze on a target helps stabilize upright posture. We investigated how this visual stabilization can be affected by observing a target presented under different gaze and viewing angles. In a series of 10-second trials, participants (N = 20, 29.3 ± 9 years of age) stood on a force plate and fixed their gaze on a figure presented on a screen at a distance of 1 m. The figure changed position (gaze angle: eye level (0°), 25° up or down), vertical body orientation (viewing angle: at eye level but rotated 25° as if leaning toward or away from the participant), or both (gaze and viewing angle: 25° up or down with the rotation equivalent of a natural visual perspective). Amplitude of participants’ sagittal displacement, surface area, and angular position of the center of gravity (COG) were compared. Results showed decreased COG velocity and amplitude for up and down gaze angles. Changes in viewing angles resulted in altered body alignment and increased amplitude of COG displacement. No significant changes in postural stability were observed when both gaze and viewing angles were altered. Results suggest that both the gaze angle and viewing perspective may be essential variables of the visuomotor system modulating postural responses. PMID:22398978
A Study of TRMM Static Earth Sensor Performance Using On-Orbit Sensor Data
NASA Technical Reports Server (NTRS)
Natanson, Gregory; Glickman, Jonathan
2000-01-01
This paper presents the results of a study of the Barnes static Earth sensor assembly (ESA) using on-orbit data collected from the Tropical Rainfall Measuring Mission (TRMM) spacecraft. It is shown that there exist strong correlations between the large penetration angle residuals and the voltages produced by the Offset Radiation Source (ORS). It is conjectured that at certain times in the TRMM orbit the ORS operates out of its calibrated range and consequently corrupts the penetration angle information observed and processed by the ESA. The observed yaw drift between Digital Sun Sensor (DSS) observations is shown to be consistent with predictions from a simple roll-yaw coupling computation. This would explain the large drifts seen on TRMM, where the propagation of the yaw angle between DSS updates does not take into account the possibility of a non-zero roll angle error. Finally, the accuracy of the onboard algorithm used when only three of the four quadrants supply valid penetration angles is assessed. In terms of the procedures used to perform this study, the analysis of ESA penetration angle residuals proved to be a very useful and insightful tool for assessing the health and functionality of the ESA.
Dual-mode switching of a liquid crystal panel for viewing angle control
NASA Astrophysics Data System (ADS)
Baek, Jong-In; Kwon, Yong-Hoan; Kim, Jae Chang; Yoon, Tae-Hoon
2007-03-01
The authors propose a method to control the viewing angle of a liquid crystal (LC) panel using dual-mode switching. To realize both wide viewing angle (WVA) characteristics and narrow viewing angle (NVA) characteristics with a single LC panel, the authors use two different dark states. The LC layer can be aligned homogeneously parallel to the transmission axis of the bottom polarizer for WVA dark state operation, while it can be aligned vertically for NVA dark state operation. The authors demonstrated that viewing angle control can be achieved with a single panel without any loss of contrast at the front.
Okamura, Jun-ya; Yamaguchi, Reona; Honda, Kazunari; Tanaka, Keiji
2014-01-01
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object become associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle of up to 60°. By recording the activities of neurons in the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task requiring only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects at each of several viewing angles develops partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn underlies the monkeys' emergent capability to discriminate the objects across changes in viewing angle. PMID:25378169
Estimating Slopes In Images Of Terrain By Use Of BRDF
NASA Technical Reports Server (NTRS)
Scholl, Marija S.
1995-01-01
Proposed method of estimating slopes of terrain features based on use of bidirectional reflectance distribution function (BRDF) in analyzing aerial photographs, satellite video images, or other images produced by remote sensors. Estimated slopes integrated along horizontal coordinates to obtain estimated heights, generating three-dimensional terrain maps. Method requires neither coregistration of terrain features in pairs of images acquired from slightly different perspectives, nor Sun or other source of illumination low in sky over terrain of interest; on the contrary, works best when Sun is high. Works at almost all combinations of illumination and viewing angles.
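The integration of estimated slopes into relative heights along a transect can be sketched with the trapezoidal rule; the function name and uniform sampling are illustrative assumptions, not part of the NTRS brief:

```python
import numpy as np

def heights_from_slopes(slopes, dx):
    """Integrate estimated terrain slopes (dz/dx) sampled at uniform
    horizontal spacing dx to recover relative heights z(x), using the
    trapezoidal rule and z(0) = 0 as the reference height."""
    slopes = np.asarray(slopes, dtype=float)
    steps = (slopes[:-1] + slopes[1:]) / 2.0 * dx   # per-interval rise
    return np.concatenate(([0.0], np.cumsum(steps)))
```

A two-dimensional height map would repeat this along many transects and then reconcile the profiles, but the one-dimensional case already shows why only relative (not absolute) heights come out of slope integration.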
A flexible wearable sensor for knee flexion assessment during gait.
Papi, Enrica; Bo, Yen Nee; McGregor, Alison H
2018-05-01
Gait analysis plays an important role in the diagnosis and management of patients with movement disorders, but it is usually performed within a laboratory. Recently interest has shifted towards the possibility of conducting gait assessments in everyday environments, thus facilitating long-term monitoring. This is possible by using wearable technologies rather than laboratory-based equipment. This study aims to validate a novel wearable sensor system's ability to measure peak knee sagittal angles during gait. The proposed system comprises a flexible conductive polymer unit interfaced with a wireless acquisition node attached over the knee on a pair of leggings. Sixteen healthy volunteers participated in two gait assessments on separate occasions. Data were simultaneously collected from the novel sensor and a gold-standard 10-camera motion capture system. The relationship between sensor signal and reference knee flexion angles was defined for each subject to allow the transformation of sensor voltage outputs to angular measures (degrees). The knee peak flexion angles from the sensor and the reference system were compared by means of root mean square error (RMSE), absolute error, Bland-Altman plots, and intra-class correlation coefficients (ICCs) to assess test-retest reliability. Comparisons of knee peak flexion angles calculated from the sensor and the gold standard yielded an absolute error of 0.35 (±2.9)° and an RMSE of 1.2 (±0.4)°. Good agreement was found between the two systems, with the majority of data lying within the limits of agreement. The sensor demonstrated high test-retest reliability (ICCs>0.8). These results show the ability of the sensor to monitor knee peak sagittal angles with small margins of error and in agreement with the gold-standard system. The sensor has potential to be used in clinical settings as a discreet, unobtrusive wearable device allowing for long-term gait analysis. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
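The per-subject voltage-to-angle transformation and the RMSE comparison can be sketched as below, assuming, purely for illustration, that a first-order least-squares fit is an adequate mapping (the paper defines the relationship per subject without specifying this form):

```python
import numpy as np

def calibrate_and_rmse(volts_cal, angles_cal, volts_test, angles_ref):
    """Fit a linear voltage-to-angle map on calibration data, apply it
    to test voltages, and return the RMSE against reference angles."""
    a, b = np.polyfit(volts_cal, angles_cal, 1)     # angle ~ a*volts + b
    predicted = a * np.asarray(volts_test, float) + b
    errors = predicted - np.asarray(angles_ref, float)
    return float(np.sqrt(np.mean(errors ** 2)))
```

In the study the reference angles would come from the motion capture system; here any paired voltage/angle arrays serve to show the structure of the validation.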
Analysis of error in TOMS total ozone as a function of orbit and attitude parameters
NASA Technical Reports Server (NTRS)
Gregg, W. W.; Ardanuy, P. E.; Braun, W. C.; Vallette, B. J.; Bhartia, P. K.; Ray, S. N.
1991-01-01
Computer simulations of orbital scenarios were performed to examine the effects of orbital altitude, equator crossing time, attitude uncertainty, and orbital eccentricity on ozone observations by future satellites. These effects were assessed by determining changes in solar and viewing geometry and loss of Earth daytime coverage. The importance of these changes for ozone retrieval was determined by simulating uncertainties in the TOMS ozone retrieval algorithm. The major findings are as follows: (1) Drift of equator crossing time from local noon would have the largest effect on the quality of ozone derived from TOMS. The most significant effect of this drift is the loss of Earth daytime coverage in the winter hemisphere; the loss in coverage increases from 1 degree latitude for ±1 hour from noon, to 6 degrees for ±3 hours, to 53 degrees for ±6 hours. An additional effect is the increase in ozone retrieval errors due to high solar zenith angles. (2) To maintain contiguous Earth coverage, the maximum scan angle of the sensor must be increased with decreasing orbital altitude. The maximum scan angle required for full coverage at the equator varies from 60 degrees at 600 km altitude to 45 degrees at 1200 km. This produces an increase in spacecraft zenith angle, theta, which decreases the ozone retrieval accuracy; the range in theta was approximately 72 degrees at 600 km to approximately 57 degrees at 1200 km. (3) The effect of elliptical orbits is to create gaps in coverage along the subsatellite track. An elliptical orbit with a 200 km perigee and 1200 km apogee produced a maximum Earth coverage gap of about 45 km at the perigee at nadir. (4) An attitude uncertainty of 0.1 degree in each axis (pitch, roll, yaw) produced changes in the maximum scan angle required to view the pole and in the maximum solar zenith angle.
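Finding (2) follows from spherical viewing geometry: for a sensor scan angle s and orbital altitude h, the spacecraft zenith angle θ at the viewed ground point satisfies sin θ = sin s · (Re + h)/Re. A short check against the quoted numbers (illustrative code, not the study's simulation software):

```python
import math

EARTH_RADIUS_KM = 6378.0

def zenith_from_scan(scan_deg, alt_km):
    """Spacecraft zenith angle at the viewed ground point, from the
    sensor scan angle and orbital altitude over a spherical Earth:
    sin(zenith) = sin(scan) * (Re + h) / Re."""
    s = math.sin(math.radians(scan_deg)) * (EARTH_RADIUS_KM + alt_km) / EARTH_RADIUS_KM
    return math.degrees(math.asin(s))
```

Plugging in the abstract's cases, a 60° scan at 600 km and a 45° scan at 1200 km give zenith angles near 72° and 57° respectively, matching the reported range in θ.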
Autonomous satellite navigation using starlight refraction angle measurements
NASA Astrophysics Data System (ADS)
Ning, Xiaolin; Wang, Longhua; Bai, Xinbei; Fang, Jiancheng
2013-05-01
An on-board autonomous navigation capability is required to reduce the operation costs and enhance the navigation performance of future satellites. Autonomous navigation by stellar refraction is a type of autonomous celestial navigation that uses high-accuracy star sensors instead of Earth sensors to provide information regarding Earth's horizon. In previous studies, the refraction apparent height has typically been used for such navigation. However, the apparent height cannot be measured directly by a star sensor and can only be calculated from the refraction angle and an atmospheric refraction model. Additional errors are therefore introduced by the uncertainty and nonlinearity of atmospheric refraction models, which reduce navigation accuracy and reliability. A new navigation method based on the direct measurement of the refraction angle is proposed to solve this problem. Techniques for determining the refraction angle are introduced, and a measurement model for the refraction angle is established. The method is tested and validated by simulations. When the starlight refraction height ranges from 20 to 50 km, a positioning accuracy better than 100 m can be achieved for a low-Earth-orbit (LEO) satellite using the refraction angle, while the positioning accuracy of the traditional method using the apparent height is worse than 500 m under the same conditions. Furthermore, an analysis of the factors that affect navigation accuracy is presented, including the measurement accuracy of the refraction angle, the number of visible refracted stars per orbit, and the installation azimuth of the star sensor. This method is highly recommended for small satellites in particular, as no additional hardware besides two star sensors is required.
Xian, Zhiwen; Hu, Xiaoping; Lian, Junxiang; Zhang, Lilian; Cao, Juliang; Wang, Yujie; Ma, Tao
2014-09-15
Navigation plays a vital role in our daily life. As traditional and commonly used navigation technologies, the Inertial Navigation System (INS) and Global Navigation Satellite System (GNSS) can provide accurate location information, but the former suffers from the accumulative error of inertial sensors and the latter cannot be used in satellite-denied environments. The remarkable navigation ability of animals shows that the polarization pattern of the sky can be used for navigation. A bio-inspired POLarization Navigation Sensor (POLNS) was constructed to detect the polarization of skylight. Contrary to previous approaches, we utilize all the outputs of the POLNS to compute the input polarization angle based on least squares, which provides optimal angle estimation. In addition, a new sensor calibration algorithm is presented in which the installation angle errors and sensor biases are taken into account. The derivation and implementation of our calibration algorithm are discussed in detail. To evaluate the performance of our algorithms, simulations and real-data tests were carried out to compare our algorithms with several existing algorithms. The comparison results indicate that our algorithms are superior to the others and are more feasible and effective in practice.
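A least-squares estimate of the polarization angle from multiple analyzer channels can be sketched with a linearized Malus-law model, I_i = a + c·cos(2α_i) + s·sin(2α_i), whose coefficients give the angle as φ = ½·atan2(s, c). The channel model and analyzer orientations here are assumptions for illustration; the actual POLNS design and calibration details are in the paper:

```python
import numpy as np

def polarization_angle(analyzer_angles_deg, intensities):
    """Least-squares fit of I_i = a + c*cos(2*alpha_i) + s*sin(2*alpha_i)
    over all channel outputs; since c = b*cos(2*phi) and s = b*sin(2*phi),
    the polarization angle is phi = 0.5*atan2(s, c), reported in [0, 180)."""
    alpha = np.radians(np.asarray(analyzer_angles_deg, float))
    A = np.column_stack([np.ones_like(alpha), np.cos(2*alpha), np.sin(2*alpha)])
    coef, *_ = np.linalg.lstsq(A, np.asarray(intensities, float), rcond=None)
    _, c, s = coef
    return float(np.degrees(0.5 * np.arctan2(s, c)) % 180.0)
```

Because every channel contributes to one overdetermined system, noise on any single photodiode is averaged down, which is the advantage of using all POLNS outputs rather than a pair of channels.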
Remote sensing of aerosol plumes: a semianalytical model
NASA Astrophysics Data System (ADS)
Alakian, Alexandre; Marion, Rodolphe; Briottet, Xavier
2008-04-01
A semianalytical model, named APOM (aerosol plume optical model) and predicting the radiative effects of aerosol plumes in the spectral range [0.4,2.5 μm], is presented in the case of nadir viewing. It is devoted to the analysis of plumes arising from single strong emission events (high optical depths) such as fires or industrial discharges. The scene is represented by a standard atmosphere (molecules and natural aerosols) on which a plume layer is added at the bottom. The estimated at-sensor reflectance depends on the atmosphere without plume, the solar zenith angle, the plume optical properties (optical depth, single-scattering albedo, and asymmetry parameter), the ground reflectance, and the wavelength. Its mathematical expression as well as its numerical coefficients are derived from MODTRAN4 radiative transfer simulations. The DISORT option is used with 16 fluxes to provide a sufficiently accurate calculation of multiple scattering effects that are important for dense smokes. Model accuracy is assessed by using a set of simulations performed in the case of biomass burning and industrial plumes. APOM proves to be accurate and robust for solar zenith angles between 0° and 60° whatever the sensor altitude, the standard atmosphere, for plume phase functions defined from urban and rural models, and for plume locations that extend from the ground to a height below 3 km. The modeling errors in the at-sensor reflectance are on average below 0.002. They can reach values of 0.01 but correspond to low relative errors then (below 3% on average). This model can be used for forward modeling (quick simulations of multi/hyperspectral images and help in sensor design) as well as for the retrieval of the plume optical properties from remotely sensed images.
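APOM's actual expression and numerical coefficients are fitted to MODTRAN4 runs and are not reproduced here. As a hedged illustration of how the quantities the abstract lists (optical depth, single-scattering albedo, asymmetry parameter, ground reflectance, solar geometry) interact, a textbook single-scattering approximation for a plume layer over a reflecting surface might look like the sketch below; the phase-function geometry is written for nadir viewing, and mu_v is kept only for the path length:

```python
import numpy as np

def hg_phase(g, cos_theta):
    # Henyey-Greenstein phase function, normalized so its spherical average is 1
    return (1 - g**2) / (1 + g**2 - 2 * g * cos_theta) ** 1.5

def at_sensor_reflectance(rho_ground, tau, omega0, g, mu_s, mu_v=1.0):
    """Single-scattering sketch (NOT the fitted APOM expression).

    rho_ground : background (no-plume) at-sensor reflectance
    tau        : plume optical depth
    omega0, g  : single-scattering albedo and asymmetry parameter
    mu_s, mu_v : cosines of solar and view zenith angles (mu_v = 1 is nadir)
    """
    m = 1.0 / mu_s + 1.0 / mu_v      # two-way air-mass factor
    T = np.exp(-tau * m)             # direct two-path transmittance
    cos_theta = -mu_s                # scattering angle for nadir viewing
    rho_plume = omega0 * hg_phase(g, cos_theta) / (4 * (mu_s + mu_v)) * (1 - T)
    return rho_plume + T * rho_ground
```

For dense smokes the multiple-scattering terms neglected here matter, which is exactly why APOM is fitted to 16-flux DISORT simulations instead.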
Tachometer Derived From Brushless Shaft-Angle Resolver
NASA Technical Reports Server (NTRS)
Howard, David E.; Smith, Dennis A.
1995-01-01
Tachometer circuit operates in conjunction with brushless shaft-angle resolver. By performing sequence of straightforward mathematical operations on resolver signals and utilizing simple trigonometric identity, generates voltage proportional to rate of rotation of shaft. One advantage is use of brushless shaft-angle resolver as main source of rate signal: no brushes to wear out, no brush noise, and brushless resolvers have proven robustness. No switching of signals to generate noise. Another advantage, shaft-angle resolver used as shaft-angle sensor, tachometer input obtained without adding another sensor. Present circuit reduces overall size, weight, and cost of tachometer.
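The trigonometric identity referred to is presumably sin²θ + cos²θ = 1: differentiating the resolver's sine and cosine channels and cross-combining them gives cos θ·d(sin θ)/dt − sin θ·d(cos θ)/dt = θ̇(sin²θ + cos²θ) = θ̇. A numerical sketch of the same sequence of operations (finite differences stand in for the analog differentiators):

```python
import numpy as np

def rate_from_resolver(sin_theta, cos_theta, dt):
    """Shaft rate (rad/s) from sampled resolver sin/cos channels.

    cos*d(sin)/dt - sin*d(cos)/dt = theta_dot*(sin^2 + cos^2) = theta_dot,
    so no angle unwrapping or signal switching is needed.
    """
    s = np.asarray(sin_theta, float)
    c = np.asarray(cos_theta, float)
    ds = np.gradient(s, dt)   # discrete derivative of the sine channel
    dc = np.gradient(c, dt)   # discrete derivative of the cosine channel
    return c * ds - s * dc    # instantaneous angular rate
```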
Polarimetric Imaging for the Detection of Disturbed Surfaces
2009-06-01
[Fragment of the report's list of figures: Figure 4, Rayleigh roughness criterion as a function of incident angle; Figure 5, definition of geometrical terms (after Egan & Hallock, 1966); Figure 6, Haleakala ash depolarization at 0° and 60° viewing angles (from Egan et al., 1968); Figure 7, basalt depolarization at 0° viewing angle (entry truncated).]
Testing Accuracy of Long-Range Ultrasonic Sensors for Olive Tree Canopy Measurements
Gamarra-Diezma, Juan Luis; Miranda-Fuentes, Antonio; Llorens, Jordi; Cuenca, Andrés; Blanco-Roldán, Gregorio L.; Rodríguez-Lizana, Antonio
2015-01-01
Ultrasonic sensors are often used to adjust spray volume by allowing the calculation of the crown volume of tree crops. The special conditions of the olive tree require the use of long-range sensors, which are less accurate and slower than the most commonly used sensors. The main objectives of the study were to determine the suitability of a commercial long-range ultrasonic sensor in terms of sound-cone determination, angle errors, crosstalk errors and field measurements. Different laboratory tests were performed: experimental determination of the sound-cone diameter at several distances for several target materials; determination of the influence of the angle of incidence of the sound wave and of the distance on measurement accuracy for several materials; and determination of the magnitude of errors due to interference between sensors for different sensor spacings and distances for two materials. Furthermore, sensor accuracy was tested under real field conditions. The results show that the studied sensor is appropriate for olive trees: the sound cone is narrower for an olive tree than for the other studied materials; the olive tree canopy does not have a large influence on sensor accuracy with respect to distance and angle; the interference errors are insignificant for high sensor spacings; and the sensor's field distance measurements were sufficiently accurate. PMID:25635414
Observing System Simulations for Small Satellite Formations Estimating Bidirectional Reflectance
NASA Technical Reports Server (NTRS)
Nag, Sreeja; Gatebe, Charles K.; de Weck, Olivier
2015-01-01
The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, and hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicking), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF, and airborne instruments are limited in spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: the number of satellites, the measurement spread in view zenith and relative azimuth with respect to the solar plane, the solar zenith angle, the BRDF model, and the wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to these variables allows simplification of the OSSE, enabling its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.
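The estimation step at the core of such an OSSE can be sketched with the linear kernel-driven BRDF models commonly used for spaceborne BRDF products: reflectance is modeled as a weighted sum of an isotropic term and volumetric and geometric-optical kernels, and the weights are recovered from the formation's angular samples by least squares. The kernel values themselves (e.g. RossThick/LiSparse) would be computed elsewhere; this sketch assumes they are given:

```python
import numpy as np

def fit_kernel_brdf(K_vol, K_geo, reflectances):
    """Least-squares fit of a linear kernel-driven BRDF model:
        R = f_iso + f_vol * K_vol + f_geo * K_geo

    K_vol, K_geo : kernel values at each sampled sun/view geometry
    Returns (f_iso, f_vol, f_geo) and the RMS residual of the fit.
    """
    K_vol = np.asarray(K_vol, float)
    y = np.asarray(reflectances, float)
    A = np.column_stack([np.ones_like(K_vol), K_vol, np.asarray(K_geo, float)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    rms = float(np.sqrt(np.mean((A @ coeffs - y) ** 2)))
    return coeffs, rms
```

Comparing formation architectures then amounts to comparing how well-conditioned the matrix A is for each architecture's angular sampling.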
Design and fabrication of an autonomous rendezvous and docking sensor using off-the-shelf hardware
NASA Technical Reports Server (NTRS)
Grimm, Gary E.; Bryan, Thomas C.; Howard, Richard T.; Book, Michael L.
1991-01-01
NASA Marshall Space Flight Center (MSFC) has developed and tested an engineering model of an automated rendezvous and docking sensor system composed of a video camera ringed with laser diodes at two wavelengths and a standard remote manipulator system target that has been modified with retro-reflective tape and 830 and 780 nm optical filters. TRW has provided additional engineering analysis, design, and manufacturing support, resulting in a robust, low-cost, automated rendezvous and docking sensor design. We have addressed the issue of space qualification using off-the-shelf hardware components. We have also addressed the performance problems of increased signal-to-noise ratio, increased range, increased frame rate, graceful degradation through component redundancy, and improved range calibration. Next year, we will build a breadboard of this sensor. The phenomenology of the background scene of a target vehicle as viewed against earth and space backgrounds under various lighting conditions will be simulated using the TRW Dynamic Scene Generator Facility (DSGF). Solar illumination angles of the target vehicle and candidate docking target ranging from eclipse to full sun will be explored. The sensor will be transportable for testing at the MSFC Flight Robotics Laboratory (EB24) using the Dynamic Overhead Telerobotic Simulator (DOTS).
Kim, Hwi; Hahn, Joonku; Choi, Hee-Jin
2011-04-10
We investigate the viewing angle enhancement of a lenticular three-dimensional (3D) display with a triplet lens array. The theoretical limitations of the viewing angle and view number of the lenticular 3D display with the triplet lens array are analyzed numerically. For this, the genetic-algorithm-based design method of the triplet lens is developed. We show that a lenticular 3D display with viewing angle of 120° and 144 views without interview cross talk can be realized with the use of an optimally designed triplet lens array. © 2011 Optical Society of America
NASA Astrophysics Data System (ADS)
Han, Youmei; Jiao, Minglian; Shijuan
2018-04-01
With the rapid development of oblique photogrammetry, many cities have built real 3D models with this technology. Although it offers a short production period, high efficiency, and good aerial viewing angles, the near-ground view angles of these real 3D models are often poor. With the ongoing development of smart cities, the requirements for the realism, practicality, and accuracy of real 3D models are becoming higher, and how to produce and improve real 3D models quickly has become an active research direction in geospatial information. To meet this requirement, we propose a new technological process that combines the characteristics of current oblique photogrammetry modeling with terrestrial photogrammetry, consisting of close-range sensor design, data acquisition, and data processing. The proposed method was tested using acquired oblique photography images, and the results confirm its effectiveness.
An Overview of SIMBIOS Program Activities and Accomplishments. Chapter 1
NASA Technical Reports Server (NTRS)
Fargion, Giulietta S.; McClain, Charles R.
2003-01-01
The SIMBIOS Program was conceived in 1994 as a result of a NASA management review of the agency's strategy for monitoring the bio-optical properties of the global ocean through space-based ocean color remote sensing. At that time, the NASA ocean color flight manifest included two data buy missions, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Earth Observing System (EOS) Color, and three sensors, two Moderate Resolution Imaging Spectroradiometers (MODIS) and the Multi-angle Imaging Spectro-Radiometer (MISR), scheduled for flight on the EOS-Terra and EOS-Aqua satellites. The review led to a decision that the international assemblage of ocean color satellite systems provided ample redundancy to assure continuous global coverage, with no need for the EOS Color mission. At the same time, it was noted that non-trivial technical difficulties attended the challenge (and opportunity) of combining ocean color data from this array of independent satellite systems to form consistent and accurate global bio-optical time series products. Thus, it was announced at the October 1994 EOS Interdisciplinary Working Group meeting that some of the resources budgeted for EOS Color should be redirected into an intercalibration and validation program (McClain et al., 2002).
Okamura, Jun-Ya; Yamaguchi, Reona; Honda, Kazunari; Wang, Gang; Tanaka, Keiji
2014-11-05
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops the partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn bases the monkeys' emergent capability to discriminate the objects across changes in viewing angle.
Directional infrared temperature and emissivity of vegetation: Measurements and models
NASA Technical Reports Server (NTRS)
Norman, J. M.; Castello, S.; Balick, L. K.
1994-01-01
Directional thermal radiance from vegetation depends on many factors, including the architecture of the plant canopy, thermal irradiance, emissivity of the foliage and soil, view angle, slope, and the kinetic temperature distribution within the vegetation-soil system. A one-dimensional model, which includes the influence of topography, indicates that the thermal emissivity of vegetation canopies may remain constant with view angle, or emissivity may increase or decrease as view angle from nadir increases. Typically, variations of emissivity with view angle are less than 0.01. As view angle increases away from nadir, directional infrared canopy temperature usually decreases but may remain nearly constant or even increase. Variations in directional temperature with view angle may be 5°C or more. Model predictions of directional emissivity are compared with field measurements in corn canopies and over a bare soil using a method that requires two infrared thermometers, one sensitive to the 8 to 14 micrometer wavelength band and a second to the 14 to 22 micrometer band. After correction for CO2 absorption by the atmosphere, a directional canopy emissivity can be obtained as a function of view angle in the 8 to 14 micrometer band to an accuracy of about 0.005. Modeled and measured canopy emissivities for corn varied slightly with view angle (0.990 at nadir and 0.982 at 75 deg view zenith angle) and did not appear to vary significantly with view angle for the bare soil. Canopy emissivity is generally nearer to unity than leaf emissivity. At high spectral resolution, canopy thermal emissivity may vary by 0.02 with wavelength even though leaf emissivity may vary by 0.07. The one-dimensional model provides reasonably accurate predictions of infrared temperature and can be used to study the dependence of infrared temperature on various plant, soil, and environmental factors.
Matsuya, Iwao; Katamura, Ryuta; Sato, Maya; Iba, Miroku; Kondo, Hideaki; Kanekawa, Kiyoshi; Takahashi, Motoichi; Hatada, Tomohiko; Nitta, Yoshihiro; Tanii, Takashi; Shoji, Shuichi; Nishitani, Akira; Ohdomari, Iwao
2010-01-01
We propose a novel sensor system for monitoring the structural health of a building. The system optically measures the relative-story displacement during earthquakes for detecting any deformations of building elements. The sensor unit is composed of three position sensitive detectors (PSDs) and lenses capable of measuring the relative-story displacement precisely, even if the PSD unit was inclined in response to the seismic vibration. For verification, laboratory tests were carried out using an Xθ-stage and a shaking table. The static experiment verified that the sensor could measure the local inclination angle as well as the lateral displacement. The dynamic experiment revealed that the accuracy of the sensor was 150 μm in the relative-displacement measurement and 100 μrad in the inclination angle measurement. These results indicate that the proposed sensor system has sufficient accuracy for the measurement of relative-story displacement in response to the seismic vibration.
Monocular Vision-Based Underwater Object Detection
Zhang, Zhen; Dai, Fengzhao; Bu, Yang; Wang, Huibin
2017-01-01
In this paper, we propose an underwater object detection method using monocular vision sensors. In addition to commonly used visual features such as color and intensity, we investigate the potential of underwater object detection using light transmission information. The global contrast of various features is used to initially identify the region of interest (ROI), which is then filtered by the image segmentation method, producing the final underwater object detection results. We test the performance of our method with diverse underwater datasets. Samples of the datasets are acquired by a monocular camera with different qualities (such as resolution and focal length) and setups (viewing distance, viewing angle, and optical environment). It is demonstrated that our ROI detection method is necessary and can largely remove the background noise and significantly increase the accuracy of our underwater object detection method. PMID:28771194
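The global-contrast step that seeds the ROI can be sketched as scoring each pixel by the distance of its feature vector (color, intensity, or a light-transmission estimate stacked as channels) from the image-wide mean; this is an illustrative reading of the method, not the authors' exact algorithm:

```python
import numpy as np

def global_contrast_saliency(image):
    """Per-pixel global-contrast saliency, normalized to [0, 1].

    image : H x W x C float array of per-pixel feature channels.
    Pixels far from the image-wide mean feature vector score high and
    become candidate region-of-interest (ROI) pixels.
    """
    img = np.asarray(image, float)
    mean = img.reshape(-1, img.shape[-1]).mean(axis=0)   # global mean feature
    sal = np.linalg.norm(img - mean, axis=-1)            # distance per pixel
    return sal / sal.max() if sal.max() > 0 else sal
```

Thresholding or segmenting this map would then play the role of the segmentation filter described in the abstract.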
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
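A minimal sketch of a calibration fit with an approximate prediction interval, in the spirit of the least-squares analysis described above (here reduced to a one-variable polynomial model; the t-critical value is a placeholder that would be taken from tables for the actual degrees of freedom and confidence level):

```python
import numpy as np

def calibrate_with_interval(x, y, degree=1, t_crit=2.0):
    """Least-squares sensor calibration with an approximate prediction interval.

    x, y : applied angles and sensor readings from replicated calibrations.
    Returns polynomial coefficients (highest power first) and a function
    halfwidth(x0) such that a new reading at x0 is expected within
    fit(x0) +/- halfwidth(x0).
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    A = np.vander(x, degree + 1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    n, p = len(x), degree + 1
    s2 = float(resid @ resid) / (n - p)     # residual variance estimate
    cov = s2 * np.linalg.inv(A.T @ A)       # coefficient covariance

    def halfwidth(x0):
        a0 = np.vander(np.atleast_1d(np.asarray(x0, float)), degree + 1)
        # new-observation variance = residual variance + fit variance
        var_pred = s2 + np.einsum('ij,jk,ik->i', a0, cov, a0)
        return t_crit * np.sqrt(np.maximum(var_pred, 0.0))

    return coef, halfwidth
```

Replicated calibrations over time, as the abstract recommends, would let s2 be split into precision and bias components rather than pooled as here.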
Thinking Outside of the Blue Marble: Novel Ocean Applications Using the VIIRS Sensor
NASA Technical Reports Server (NTRS)
Vandermeulen, Ryan A.; Arnone, Robert
2016-01-01
While planning for future space-borne sensors will increase the quality, quantity, and duration of ocean observations in the years to come, efforts to extend the limits of sensors currently in orbit can help shed light on future scientific gains as well as associated uncertainties. Here, we present several applications that are unique to the polar orbiting Visible Infrared Imaging Radiometer Suite (VIIRS), each of which challenge the threshold capabilities of the sensor and provide lessons for future missions. For instance, while moderate resolution polar orbiters typically have a one day revisit time, we are able to obtain multiple looks of the same area by focusing on the extreme zenith angles where orbital views overlap, and pair these observations with those from other sensors to create pseudo-geostationary data sets. Or, by exploiting high spatial resolution (imaging) channels and analyzing patterns of synoptic covariance across the visible spectrum, we can obtain higher spatial resolution bio-optical products. Alternatively, non-traditional products can illuminate important biological interactions in the ocean, such as the use of the Day-Night-Band to provide some quantification of phototactic behavior of marine life along light polluted beaches, as well as track the location of marine fishing vessel fleets along ocean fronts. In this talk, we explore ways to take full advantage of the capabilities of existing sensors in order to maximize insights for future missions.
Comparison of Angle of Attack Measurements for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Jones, Thomas, W.; Hoppe, John C.
2001-01-01
Two optical systems capable of measuring model attitude and deformation were compared with the inertial devices used to acquire angle-of-attack measurements during wind tunnel testing of the sting-mounted, full-span, 30% geometric scale flexible configuration of the Northrop Grumman Unmanned Combat Air Vehicle (UCAV) installed in the NASA Langley Transonic Dynamics Tunnel (TDT). The overall purpose of the test at TDT was to evaluate smart materials and structures adaptive wing technology. The optical techniques compared with the inertial devices for this test were: (1) an Optotrak (registered) system, consisting of two sensors, each containing a pair of orthogonally oriented linear arrays, to compute spatial positions of a set of active markers; and (2) a Video Model Deformation (VMD) system, providing a single view of passive targets using a constrained photogrammetric solution, whose primary function was to measure wing and control-surface deformations. The Optotrak system was installed at TDT for the first time for this test in order to assess the usefulness of the system for future static and dynamic deformation measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.
NASA Astrophysics Data System (ADS)
Jolly, Arthur D.; Matoza, Robin S.; Fee, David; Kennedy, Ben M.; Iezzi, Alexandra M.; Fitzgerald, Rebecca H.; Austin, Allison C.; Johnson, Richard
2017-10-01
We obtained an unprecedented view of the acoustic radiation from persistent strombolian volcanic explosions at Yasur volcano, Vanuatu, from the deployment of infrasound sensors attached to a tethered aerostat. While traditional ground-based infrasound arrays may sample only a small portion of the eruption pressure wavefield, we were able to densely sample angular ranges of 200° in azimuth and 50° in takeoff angle by placing the aerostat at 38 tethered loiter positions around the active vent. The airborne data joined contemporaneously collected ground-based infrasound and video recordings over the period 29 July to 1 August 2016. We observe a persistent variation in the acoustic radiation pattern with average eastward directed root-mean-square pressures more than 2 times larger than in other directions. The observed radiation pattern may be related to both path effects from the crater walls, and source directionality.
Shen, Dazhong; Kang, Qi; Li, Xiaoyu; Cai, Hongmei; Wang, Yuandong
2007-06-19
This paper presents experimental results on the influence of the immersion angle (θ, the angle between the surface of a quartz crystal resonator and the horizon) on the resonant frequency of a quartz crystal microbalance (QCM) sensor with one side of its sensing surface exposed to liquid. The experimental results show that the immersion angle is an additional factor that may influence the frequency of the QCM sensor. This influence is caused by variation of the reflection conditions of the longitudinal wave between the QCM sensor and the walls of the detection cell. The frequency shifts measured by varying θ depend on the QCM sensor used. When a QCM sensor with a weak longitudinal wave is used, its resonant frequency is nearly independent of θ; but if a QCM sensor with a strong longitudinal wave is employed, the immersion angle is a potential error source for measurements performed with the QCM sensor. When the reflection conditions of the longitudinal wave are reduced, the influence of θ on the resonant frequency of the QCM sensor is negligible. The slope of the plot of frequency shift (ΔF) versus (ρη)^(1/2), the square root of the product of solution density (ρ) and viscosity (η), may be influenced by θ in a single experiment for a QCM sensor with a strong longitudinal wave in low-viscosity liquids; this can, however, be effectively weakened by using the averaged values of replicated experiments. In solutions spanning a large (ρη)^(1/2) region (0-55 wt% sucrose solutions as an example, with ρ from 1.00 to 1.26 g cm^-3 and η from 0.01 to 0.22 g cm^-1 s^-1), the slope of the plot of ΔF versus (ρη)^(1/2) is independent of θ even for a QCM sensor with a strong longitudinal wave in a single experiment. The influence of θ on the resonant frequency of the QCM sensor should be taken into consideration in its applications in the liquid phase.
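The ΔF ∝ (ρη)^(1/2) dependence discussed above is the standard Kanazawa-Gordon behavior for a QCM with one face in a Newtonian liquid; the immersion-angle effect reported in the paper is an additional factor not captured by this idealized relation:

```python
import numpy as np

# Kanazawa-Gordon relation for a QCM with one face in a Newtonian liquid:
#   delta_f = -f0^(3/2) * sqrt(rho * eta / (pi * rho_q * mu_q))
RHO_Q = 2648.0    # density of quartz, kg/m^3
MU_Q = 2.947e10   # shear modulus of AT-cut quartz, Pa

def kanazawa_shift(f0_hz, rho, eta):
    """Frequency shift in Hz; rho in kg/m^3, eta in Pa*s."""
    return -f0_hz ** 1.5 * np.sqrt(rho * eta / (np.pi * RHO_Q * MU_Q))
```

For a 5 MHz crystal in water (ρ = 1000 kg/m³, η = 1 mPa·s) this gives a shift of roughly −0.7 kHz, which sets the scale against which the angle-dependent deviations in the paper should be read.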
NASA Technical Reports Server (NTRS)
Gilyard, G. B.; Belte, D.
1974-01-01
Magnitudes of lags in the pneumatic angle-of-attack and angle-of-sideslip sensor systems of the YF-12A airplane were determined for a variety of flight conditions by analyzing stability and control data. The three analysis techniques used are described. An apparent trend with Mach number for measurements from both of the differential-pressure sensors showed that the lag ranged from approximately 0.15 second at subsonic speed to 0.4 second at Mach 3. Because Mach number was closely related to altitude for the available flight data, the individual effects of Mach number and altitude on the lag could not be separated clearly. However, the results indicated the influence of factors other than simple pneumatic lag.
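A first-order lag is the simplest model of such pneumatic tubing dynamics; the sketch below (an assumed lag model for illustration, not one of the paper's three analysis techniques) shows how a lag constant τ delays the sensed angle behind the true one:

```python
import numpy as np

def first_order_lag(signal, dt, tau, y0=0.0):
    """First-order lag  tau*dy/dt + y = u,  discretized with a zero-order hold.

    signal : sampled true angle-of-attack (or sideslip) history
    tau    : lag time constant in seconds (e.g. 0.15 subsonic, 0.4 at Mach 3)
    """
    alpha = 1.0 - np.exp(-dt / tau)
    y = np.empty(len(signal))
    y[0] = y0
    for k in range(1, len(signal)):
        y[k] = y[k - 1] + alpha * (signal[k] - y[k - 1])
    return y
```

With τ = 0.4 s, a step change in the true angle reaches only about 63% of its final value at the sensor 0.4 s later, which is the kind of delay the flight-data analysis quantified.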
Variation of MODIS reflectance and vegetation indices with viewing geometry and soybean development.
Breunig, Fábio M; Galvão, Lênio S; Formaggio, Antônio R; Epiphanio, José C N
2012-06-01
Directional effects introduce variability in reflectance and vegetation index determination, especially when large field-of-view sensors are used (e.g., the Moderate Resolution Imaging Spectroradiometer - MODIS). In this study, we evaluated directional effects on MODIS reflectance and four vegetation indices (Normalized Difference Vegetation Index - NDVI; Enhanced Vegetation Index - EVI; Normalized Difference Water Index - NDWI(1640) and NDWI(2120)) over soybean development in two growing seasons (2004-2005 and 2005-2006). To keep the reproductive stage for a given cultivar constant while varying viewing geometry, pairs of images obtained on close dates and at opposite view angles were analyzed. Using non-parametric statistics with bootstrapping and normalizing the indices for angular differences among viewing directions, their sensitivities to directional effects were studied. Results showed that the variation in MODIS reflectance between consecutive phenological stages was generally smaller than that resulting from viewing geometry for closed canopies; the contrary was observed for incomplete canopies. The reflectance of the first seven MODIS bands was higher in the backscattering direction. Except for the EVI, the vegetation indices had larger values in the forward-scattering direction. Directional effects decreased with canopy closure. The NDVI was less affected by directional effects than the other indices, presenting the smallest differences between viewing directions for fixed phenological stages.
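The four indices compared in the study are computed from band reflectances with standard formulas; in this reading, the SWIR band choice (1640 nm vs 2120 nm) is what distinguishes NDWI(1640) from NDWI(2120):

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    # Enhanced Vegetation Index with the standard MODIS coefficients
    # G = 2.5, C1 = 6, C2 = 7.5, L = 1
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def ndwi(nir, swir):
    # Normalized Difference Water Index; pass the 1640 nm or 2120 nm
    # SWIR reflectance to get NDWI(1640) or NDWI(2120)
    return (nir - swir) / (nir + swir)
```

Because EVI's denominator mixes three bands with different angular behavior, it reacts to viewing geometry differently from the purely ratio-based NDVI and NDWI, consistent with the asymmetry reported above.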
Space-based infrared scanning sensor LOS determination and calibration using star observation
NASA Astrophysics Data System (ADS)
Chen, Jun; Xu, Zhan; An, Wei; Deng, Xin-Pu; Yang, Jun-Gang
2015-10-01
This paper provides a novel methodology for removing sensor bias from a space-based infrared (IR) system (SBIRS) through the use of stars detected in the background field of the sensor. A space-based IR system uses the line of sight (LOS) to a target for target location; LOS determination and calibration is therefore the key precondition for accurate location and tracking of targets, and the LOS calibration of a scanning sensor is one of the main difficulties. Subsequent changes of sensor bias are not taken into account in the conventional LOS determination and calibration process. Based on an analysis of the imaging process of the scanning sensor, a theoretical model for estimating the bias angles from star observations is proposed. A process model of the bias angles and an observation model of the stars are established, an extended Kalman filter (EKF) is used to estimate the bias angles, and the sensor LOS is then calibrated. Time-domain simulation results indicate that the proposed method determines and calibrates the sensor LOS with high precision and smooth performance, and that the proposed algorithm meets the timeliness and precision requirements of the target tracking process in a space-based IR tracking system.
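The estimate-then-calibrate loop described above can be sketched as a small extended Kalman filter in which the bias angles follow a random-walk process model and star residuals supply the observations. The matrices, noise levels, and the small-angle linear measurement model below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch: estimate two sensor bias angles from star-position residuals with
# an EKF. Process model: random walk (state transition = identity). For
# small biases the star residual is approximately linear in the bias, so the
# measurement Jacobian H is constant here.

def ekf_step(x, P, z, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter."""
    x_pred = x                              # random-walk prediction
    P_pred = P + Q
    y = z - h(x_pred)                       # innovation from a star residual
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(x.size) - K @ H) @ P_pred
    return x_new, P_new

H = np.eye(2)                               # small-angle measurement model
h = lambda x: H @ x
Q = 1e-8 * np.eye(2)                        # bias drift (illustrative)
R = 1e-4 * np.eye(2)                        # star-centroid noise (illustrative)

x, P = np.zeros(2), np.eye(2)
true_bias = np.array([0.002, -0.001])       # radians
rng = np.random.default_rng(0)
for _ in range(500):
    z = true_bias + 0.01 * rng.standard_normal(2)
    x, P = ekf_step(x, P, z, h, H, Q, R)
```

With a random-walk process model the predict step reduces to inflating the covariance, so the filter behaves like a recursive weighted average of the star residuals.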
ESA DUE GlobTemperature project: Infrared-based LST Product
NASA Astrophysics Data System (ADS)
Ermida, Sofia; Pires, Ana; Ghent, Darren; Trigo, Isabel; DaCamara, Carlos; Remedios, John
2016-04-01
One of the purposes of the GlobTemperature project is to provide a global Land Surface Temperature (LST) product based on Geostationary Earth Orbit (GEO) and Low Earth polar Orbit (LEO) satellite data. The objective is to combine existing LST products, obtained from different sensors/platforms, into a harmonized product for a reference view angle. In a first approach, only infrared-based retrievals are considered, and LEO LSTs are used as a common denominator among geostationary sensors. LST data are provided by a wide range of sensors to optimize spatial coverage, namely: (i) two LEO sensors - the Advanced Along Track Scanning Radiometer (AATSR) series of instruments on board ESA's Envisat, and the Moderate Resolution Imaging Spectroradiometer (MODIS) on board NASA's TERRA and AQUA; and (ii) three GEO sensors - the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board EUMETSAT's Meteosat Second Generation (MSG), the Japanese Meteorological Imager (JAMI) on board the Japan Meteorological Agency (JMA) Multifunction Transport SATellite (MTSAT-2), and NASA's Geostationary Operational Environmental Satellites (GOES). The merged LST product is generated in two steps: 1) calibration between each LEO and each GEO, which consists of removing systematic differences (associated with sensor type and LST algorithms, including calibration, atmospheric and surface emissivity corrections, amongst others) represented by linear regressions; 2) angular correction, which consists of bringing all LST data to a reference (nadir) view. Angular effects on LST are estimated by means of a kernel model of the surface thermal emission, which describes the angular dependence of LST as a function of viewing and illumination geometry. The model is adjusted to MODIS and SEVIRI/MSG LST estimates and validated against LST retrievals from those sensors obtained for other years (not used in the calibration).
It is shown that the model reduces the LST differences between the two sensors, indicating that it may be used to effectively estimate and correct the angular dependence of LST. A global set of kernel model parameters is finally obtained by adjusting the model to either a GEO-LEO pair or to the two LEOs (at the poles). A first version of the merged product will be released in 2016, available for download through the GlobTemperature portal. It includes only the calibration process (step 1), incorporating LST data from SEVIRI, GOES, MTSAT and MODIS, with information on directional effects added as an extra layer. A second version of the dataset, with a better incorporation of the angular correction, is currently in preparation.
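The first (regression) step of the merging scheme can be illustrated with synthetic data: matched GEO/LEO LST pairs, a linear regression capturing the systematic GEO-LEO difference, and the inverse mapping used to calibrate GEO onto the LEO reference. The slope, offset and noise level below are made-up values, not GlobTemperature statistics.

```python
import numpy as np

# Step-1 sketch: remove systematic GEO-LEO differences by linear regression
# fitted on collocated, near-simultaneous retrievals (synthetic data here).

rng = np.random.default_rng(1)
lst_leo = 280.0 + 20.0 * rng.random(1000)                    # reference LST [K]
lst_geo = 1.05 * lst_leo - 12.0 + 0.5 * rng.standard_normal(1000)

slope, offset = np.polyfit(lst_leo, lst_geo, 1)              # fit GEO vs LEO
lst_geo_cal = (lst_geo - offset) / slope                     # calibrate to LEO

bias_before = float(np.mean(lst_geo - lst_leo))
bias_after = float(np.mean(lst_geo_cal - lst_leo))
```

In-sample, the regression removes the mean GEO-LEO difference exactly; what remains is the scatter that the step-2 angular (kernel) correction is meant to address.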
Volume-holographic memory for laser threat discrimination
NASA Astrophysics Data System (ADS)
Delong, Mark L.; Duncan, Bradley D.; Parker, Jack H., Jr.
1996-10-01
Using conventional volume-holographic angle multiplexing in an Fe:LiNbO3 crystal, we have developed a compact laser threat discriminator, intended for aircraft integration, that optically detects laser spatial coherence and angle of arrival while simultaneously rejecting incoherent background sources, such as the Sun. The device is intended to counter a specific type of psychophysical laser attack against U.S. Air Force pilots, namely, third-world-country exploitation of inexpensive and powerful cw Ar-ion or doubled Nd:YAG lasers in the visible spectrum to blind or disorient U.S. pilots. The component does not address the general tactical laser weapon problem, which includes identifying precision-guided munitions, range finders, and lidar systems that use pulsed infrared lasers; these are fundamentally different threats requiring different detector solutions. The device incorporates a sequence of highly redundant, simple black-and-white warning patterns that are keyed to be reconstructed as the incident laser threat, playing the role of an uncooperative probe beam, changes angle with respect to the crystal. The device tracks both azimuth and elevation, using a nonconventional hologram viewing system. Recording and playback conditions are simplified because nonzero cross talk is a desirable feature of this discriminator, inasmuch as our application requires a nonzero probability of detection for arbitrary directions of arrival within the sensor's field of view. The device can exploit the trade-off between phase-matched gratings and probe-beam wavelength, accommodating wavelength-tunable threats while still maintaining high direction-of-arrival tracking accuracy.
Redundant unbalance compensation of an active magnetic bearing system
NASA Astrophysics Data System (ADS)
Hutterer, Markus; Kalteis, Gerald; Schrödl, Manfred
2017-09-01
To achieve good running behavior of a magnetically levitated rotor, a well-designed position controller and different compensation methods are required. Two very important structures in this context are the reduction of the gyroscopic effect and of the unbalance vibration. Both have in common that they need the angular velocity information for their calculation. For industrial applications this information is normally provided by an angle sensor fixed on the rotor; the angle information is also necessary for the field-oriented control of the electrical drive. The main drawback of external position sensors is the case of a breakdown or an error of the motor controller: the magnetic bearing can then become unstable, because no angular velocity information is provided. To overcome this problem, the present paper describes the development of a self-sensing unbalance rejection in combination with a self-sensing speed control of the motor controller. Self-sensing means in this context that no angle sensor is required for the unbalance or torque control. With such structures, two redundant speed and angle information sources are available and can be used for the magnetic bearing and the motor controller without the use of an angle sensor.
Xian, Zhiwen; Hu, Xiaoping; Lian, Junxiang; Zhang, Lilian; Cao, Juliang; Wang, Yujie; Ma, Tao
2014-01-01
Navigation plays a vital role in our daily life. As traditional and commonly used navigation technologies, the Inertial Navigation System (INS) and the Global Navigation Satellite System (GNSS) can provide accurate location information, but they suffer from the accumulated error of inertial sensors and cannot be used in satellite-denied environments, respectively. The remarkable navigation ability of animals shows that the pattern of the polarized sky can be used for navigation. A bio-inspired POLarization Navigation Sensor (POLNS) is constructed to detect the polarization of skylight. In contrast to previous approaches, we utilize all the outputs of the POLNS to compute the input polarization angle based on least squares, which provides optimal angle estimation. In addition, a new sensor calibration algorithm is presented, in which the installation angle errors and sensor biases are taken into consideration. The derivation and implementation of our calibration algorithm are discussed in detail. To evaluate the performance of our algorithms, simulations and real data tests were carried out to compare our algorithms with several existing algorithms. The comparison results indicate that our algorithms are superior to the others and are more feasible and effective in practice. PMID:25225872
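The idea of using all channel outputs at once can be illustrated with a Malus-type response model solved by linear least squares; the analyzer angles, degree-of-polarization value, and noise level below are illustrative, not the POLNS hardware parameters.

```python
import numpy as np

# Each channel i with analyzer angle theta_i follows a Malus-type response
#   y_i = a + u*cos(2*theta_i) + v*sin(2*theta_i),
# where u = d*cos(2*phi) and v = d*sin(2*phi) encode the input polarization
# angle phi. Solving for (a, u, v) by linear least squares over all channels
# gives phi = 0.5*atan2(v, u).

theta = np.deg2rad([0.0, 30.0, 60.0, 90.0, 120.0, 150.0])   # channel angles
phi_true = np.deg2rad(37.0)
rng = np.random.default_rng(2)
y = (1.0 + 0.8 * np.cos(2 * (theta - phi_true))
     + 0.01 * rng.standard_normal(theta.size))

A = np.column_stack([np.ones_like(theta),
                     np.cos(2 * theta),
                     np.sin(2 * theta)])
a, u, v = np.linalg.lstsq(A, y, rcond=None)[0]
phi_est = 0.5 * np.arctan2(v, u)
```

Because every channel contributes to the fit, a single noisy or miscalibrated channel degrades the estimate gracefully instead of dominating it, which is the advantage over pairwise channel ratios.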
Dynamic Range and Sensitivity Requirements of Satellite Ocean Color Sensors: Learning from the Past
NASA Technical Reports Server (NTRS)
Hu, Chuanmin; Feng, Lian; Lee, Zhongping; Davis, Curtiss O.; Mannino, Antonio; McClain, Charles R.; Franz, Bryan A.
2012-01-01
Sensor design and mission planning for satellite ocean color measurements require careful consideration of the signal dynamic range and sensitivity (specifically here the signal-to-noise ratio or SNR) so that small changes of ocean properties (e.g., surface chlorophyll-a concentrations or Chl) can be quantified while most measurements are not saturated. Past and current sensors used different signal levels, formats, and conventions to specify these critical parameters, making it difficult to make cross-sensor comparisons or to establish standards for future sensor design. The goal of this study is to quantify these parameters under uniform conditions for widely used past and current sensors in order to provide a reference for the design of future ocean color radiometers. Using measurements from the Moderate Resolution Imaging Spectroradiometer onboard the Aqua satellite (MODISA) under various solar zenith angles (SZAs), typical (L(sub typical)) and maximum (L(sub max)) at-sensor radiances from the visible to the shortwave IR were determined. The L(sub typical) values at an SZA of 45 deg were used as constraints to calculate SNRs of 10 multiband sensors at the same L(sub typical) radiance input and 2 hyperspectral sensors at a similar radiance input. The calculations were based on clear-water scenes with an objective method of selecting pixels with minimal cross-pixel variations to assure target homogeneity. Among the widely used ocean color sensors that have routine global coverage, MODISA ocean bands (1 km) showed 2-4 times higher SNRs than the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) (1 km) and SNRs comparable to the Medium Resolution Imaging Spectrometer (MERIS)-RR (reduced resolution, 1.2 km), leading to different levels of precision in the retrieved Chl data product. MERIS-FR (full resolution, 300 m) showed SNRs lower than MODISA and MERIS-RR, as a trade-off for the gain in spatial resolution.
SNRs of all MODISA ocean bands and SeaWiFS bands (except the SeaWiFS near-IR bands) exceeded those from prelaunch sensor specifications after adjusting the input radiance to L(sub typical). The tabulated L(sub typical), L(sub max), and SNRs of the various multiband and hyperspectral sensors under the same or similar radiance input provide references to compare sensor performance in product precision and to help design future missions such as the Geostationary Coastal and Air Pollution Events (GEO-CAPE) mission and the Pre-Aerosol-Clouds-Ecosystems (PACE) mission currently being planned by the U.S. National Aeronautics and Space Administration (NASA).
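The homogeneous-scene SNR estimate described above can be sketched as the mean at-sensor radiance over a uniform clear-water window divided by the pixel-to-pixel standard deviation. The rescaling to a common input radiance L(sub typical) shown here assumes signal-independent noise, which is a labeled simplification rather than the paper's exact procedure; all numbers are illustrative.

```python
import numpy as np

# SNR over a homogeneous clear-water window: mean radiance / pixel-to-pixel
# standard deviation. Rescaling to L_typical assumes constant (signal-
# independent) noise, an explicit simplification.

def scene_snr(window, l_typical):
    """window: 2-D array of at-sensor radiances from a homogeneous target."""
    mean = window.mean()
    std = window.std(ddof=1)
    snr_measured = mean / std
    snr_at_l_typical = snr_measured * l_typical / mean  # constant-noise assumption
    return snr_measured, snr_at_l_typical

rng = np.random.default_rng(3)
window = 50.0 + 0.1 * rng.standard_normal((20, 20))     # illustrative radiances
snr, snr_typ = scene_snr(window, l_typical=60.0)
```

Selecting windows with minimal cross-pixel variation, as the study does, matters because any real scene gradient inflates the standard deviation and biases the SNR low.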
A buoyancy-based fiber Bragg grating tilt sensor
NASA Astrophysics Data System (ADS)
Maheshwari, Muneesh; Yang, Yaowen; Chaturvedi, Tanmay
2017-04-01
In this paper, a novel design of a fiber Bragg grating (FBG) tilt sensor is proposed. The tilt sensor exhibits high angle sensitivity and resolution. It works on the principle of the buoyant force in a liquid, which gives it certain advantages over other tilt sensor designs; in particular, the temperature effect can be easily compensated by using an unbonded, or free, FBG. An analytical model is established which correlates the Bragg wavelength (λB) with the angle of inclination. The model is then validated by experiment, with the experimental and analytical results found to be in good agreement with each other.
2007-03-01
front of a large area blackbody as background. The viewing angle, defined as the angle between the surface normal and the camera line of sight, was varied by... and polarization angle were derived from the Stokes parameters. The dependence of these polarization characteristics on viewing angle was investigated
NASA Technical Reports Server (NTRS)
Albus, James S.
1961-01-01
The solar aspect sensor described herein performs the analog-to-digital conversion of data optically. To accomplish this, it uses a binary "Gray code" light mask to produce a digital indication, in vehicle-fixed coordinates, of the elevation and azimuth angles of incident light from the sun. This digital solar aspect sensor system, in Explorer X, provided measurements of both elevation and azimuth angles to +/- 2 degrees at a distance of over 140,000 statute miles.
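The Gray-code mask works because adjacent angle steps differ in exactly one bit, so a reading taken during a transition is wrong by at most one step. Decoding a mask reading back to a positional binary count is a cumulative XOR over the bits, sketched below (the 8-bit mask width is an arbitrary illustrative choice, not the Explorer X design).

```python
# Gray code encode/decode for an optical angle mask.

def binary_to_gray(b):
    """Pattern the light mask encodes for step b."""
    return b ^ (b >> 1)

def gray_to_binary(g):
    """Recover the step number from a Gray-coded reading."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

# All patterns of an 8-bit mask, in step order.
codes = [binary_to_gray(i) for i in range(256)]
```

Adjacent codes differing in exactly one bit is what bounds the readout error during a transition to a single step.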
Vargas-Meléndez, Leandro; Boada, Beatriz L; Boada, María Jesús L; Gauchía, Antonio; Díaz, Vicente
2016-08-31
This article presents a novel estimator based on sensor fusion, which combines a Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a "pseudo-roll angle" from variables that are easily measured by Inertial Measurement Unit (IMU) sensors. An IMU is a device that is commonly used for vehicle motion detection, and its cost has decreased in recent years. The pseudo-roll angle is introduced into the Kalman filter in order to filter noise and minimize the variance of the norm and maximum estimation errors. The NN has been trained for J-turn maneuvers, double lane change maneuvers and lane change maneuvers at different speeds and road friction coefficients. The proposed method takes the vehicle non-linearities into account, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator.
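One way to read the scheme above is that the NN's pseudo-roll angle acts as a noisy measurement while the IMU roll rate drives the prediction of a scalar Kalman filter. The tuning values and the synthetic maneuver below are illustrative assumptions, not the paper's estimator or its tuned parameters.

```python
import numpy as np

# Scalar Kalman filter: measured roll rate drives the prediction; the NN's
# pseudo-roll angle is fused as a noisy measurement.

def roll_kf(pseudo_roll, roll_rate, dt=0.01, q=1e-4, r=1e-2):
    phi, p = 0.0, 1.0
    est = []
    for z, w in zip(pseudo_roll, roll_rate):
        phi, p = phi + w * dt, p + q        # predict with the roll rate
        k = p / (p + r)                     # scalar Kalman gain
        phi, p = phi + k * (z - phi), (1.0 - k) * p
        est.append(phi)
    return np.array(est)

# Synthetic maneuver: roll ramps to 0.1 rad in 1 s, then holds.
t = np.arange(0.0, 5.0, 0.01)
true_roll = np.clip(0.1 * t, None, 0.1)
rate = np.gradient(true_roll, t)
rng = np.random.default_rng(4)
est = roll_kf(true_roll + 0.05 * rng.standard_normal(t.size), rate)
```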
Internal reflection sensors with high angular resolution
NASA Astrophysics Data System (ADS)
Shavirin, I.; Strelkov, O.; Vetskous, A.; Norton-Wayne, L.; Harwood, R.
1996-07-01
We discuss the use of total internal reflection for the production of sensors with high angular resolution. These sensors are intended for measuring the angle between a sensor's axis and the direction to a source of radiation or a reflecting object. Sensors of this type are used to control the position of machine parts in robotics and industry, to orient space vehicles and astronomical devices relative to the Sun, and as autocollimators for checking angles of deviation; a sensor of this kind was used in the Apollo space vehicle some 20 years ago. The use of photodetectors with linear and area CCD arrays has opened up new application possibilities for appropriately designed sensors. A generalized design methodology applicable to a wide range of tasks is presented, and some modifications that can improve the performance of the basic design are described.
Research on visible and near infrared spectral-polarimetric properties of soil polluted by crude oil
NASA Astrophysics Data System (ADS)
Shen, Hui-yan; Zhou, Pu-cheng; Pan, Bang-long
2017-10-01
Hydrocarbon-contaminated soil can impose detrimental effects on forest health and the quality of agricultural products. To manage such consequences, oil leaks should be detected quickly by monitoring systems. Remote sensing is one of the most suitable techniques for such systems, especially for areas that are uninhabited and difficult to access. The physical quantities most readily available in the optical remote sensing domain are the intensity and spectral information obtained by visible or infrared sensors. However, besides intensity and wavelength, polarization is another primary physical quantity associated with an optical field. When reflecting a light wave, the surface of soil polluted by crude oil exhibits polarimetric properties that are related to its nature. Thus, detection of the spectral-polarimetric properties of soil polluted by crude oil has become a new remote sensing monitoring method. In this paper, a multi-angle spectral-polarimetric instrument was used to obtain multi-angle visible and near-infrared spectral-polarimetric characteristic data of soil polluted by crude oil. The variation of the polarimetric properties with different factors, such as the viewing zenith angle, the incidence zenith angle of the light source, the relative azimuth angle, the waveband of the detector, and the grain size of the soil, was then discussed, so as to provide a scientific basis for research on polarization remote sensing of soil polluted by crude oil.
Performance Assessment and Geometric Calibration of RESOURCESAT-2
NASA Astrophysics Data System (ADS)
Radhadevi, P. V.; Solanki, S. S.; Akilan, A.; Jyothi, M. V.; Nagasubramanian, V.
2016-06-01
Resourcesat-2 (RS-2) has successfully completed five years of operations in its orbit. The satellite has multi-resolution and multi-spectral capabilities on a single platform. Continuous and autonomous co-registration, geo-location and radiometric calibration of image data from different sensors, with widely varying view angles and resolutions, was one of the challenges of RS-2 data processing. The on-orbit geometric performance of the RS-2 sensors was extensively assessed and calibrated during the initial-phase operations. Since then, as an ongoing activity, various geometric performance data have been generated periodically, using sites with dense ground control points (GCPs). These parameters are correlated to the direct geo-location accuracy of the RS-2 sensors and are monitored and validated to maintain the performance. This paper presents the geometric accuracy assessment, calibration and validation performed for about 500 RS-2 datasets. The objectives of this study are to ensure the best absolute and relative location accuracy of the different cameras, the location performance with payload steering, and the co-registration of multiple bands. This is done using a viewing geometry model, given ephemeris and attitude data, precise camera geometry and datum transformation. In the model, the forward and reverse transformations between the coordinate systems associated with the focal plane, payload, body, orbit and ground are rigorously and explicitly defined. System-level tests using comparisons to ground check points have validated the operational geo-location accuracy performance and the stability of the calibration parameters.
Using the Moon to Track MODIS Reflective Solar Bands Calibration Stability
NASA Technical Reports Server (NTRS)
Xiong, Xiaoxiong; Geng, Xu; Angal, Amit; Sun, Junqiang; Barnes, William
2011-01-01
MODIS has 20 reflective solar bands (RSB) in the visible (VIS), near-infrared (NIR), and shortwave infrared (SWIR) spectral regions. In addition to the instrument's on-board calibrators (OBC), lunar observations have been used by both Terra and Aqua MODIS to track their RSB on-orbit calibration stability. On a near-monthly basis, lunar observations are scheduled and implemented for each instrument at nearly the same lunar phase angles. A time series of normalized detector responses to the Moon is used to monitor on-orbit calibration stability; the normalization corrects for the differences in lunar viewing geometry and Sun-Moon-sensor distances among the lunar observations. Initially, lunar calibration stability monitoring was applied only to the MODIS bands (1-4 and 8-12) that do not saturate while viewing the Moon. As the mission continued, we extended the monitoring to other RSB (bands 13-16) that contain saturated pixels; for these bands, calibration stability is monitored by referencing their non-saturated pixels to the matched pixels of a non-saturating band. In this paper, we describe this relative approach and apply it to the regularly scheduled MODIS lunar observations. We present lunar trending results for both Terra and Aqua MODIS over their entire missions, and also discuss the advantages and limitations of this approach and its potential applications to other Earth-observing sensors. Keywords: Terra, Aqua, MODIS, sensor, Moon, calibration, stability
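The geometric part of the normalization, correcting each lunar observation to reference Sun-Moon and Moon-sensor distances with inverse-square scaling, can be sketched as follows. Correcting residual phase-angle and libration differences requires a lunar photometric model and is omitted here, and the reference distances are illustrative choices, not the MODIS processing constants.

```python
# Inverse-square scaling of a lunar response to reference distances, so that
# observations taken at different geometries become comparable.

AU_KM = 149_597_870.7       # reference Sun-Moon distance [km] (illustrative)
D_MS_KM = 384_400.0         # reference Moon-sensor distance [km] (illustrative)

def normalize_lunar_response(counts, d_sun_moon_km, d_moon_sensor_km):
    """Scale a measured response to the reference distances."""
    return (counts
            * (d_sun_moon_km / AU_KM) ** 2
            * (d_moon_sensor_km / D_MS_KM) ** 2)

# An unchanged sensor seen at two different geometries should normalize to
# the same distance-free signal F.
F = 1000.0
obs1 = F * (AU_KM / 1.49e8) ** 2 * (D_MS_KM / 3.80e5) ** 2
obs2 = F * (AU_KM / 1.51e8) ** 2 * (D_MS_KM / 4.00e5) ** 2
n1 = normalize_lunar_response(obs1, 1.49e8, 3.80e5)
n2 = normalize_lunar_response(obs2, 1.51e8, 4.00e5)
```

After this correction, any remaining trend in the normalized time series can be attributed to the sensor rather than to the observation geometry, which is what makes the Moon usable as a stable reference.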
Jamaludin, Juliza; Rahim, Ruzairi Abdul; Fazul Rahiman, Mohd Hafiz; Mohd Rohani, Jemmy
2018-04-01
Optical tomography (OPT) is a method of capturing a cross-sectional image based on data obtained by sensors distributed around the periphery of the analyzed system. The system is based on measuring the final light attenuation, or absorption of radiation, after crossing the measured objects. The number of sensor views affects the results of image reconstruction: a high number of sensor views per projection gives a high image quality. This research presents an application of a charge-coupled device (CCD) linear sensor and a laser diode in an OPT system. Experiments in detecting solid and transparent objects in crystal-clear water were conducted. Two numbers of sensor views, 160 and 320, were evaluated for reconstructing the images. The image reconstruction algorithm used was a filtered linear back projection algorithm. Analysis comparing the simulated and experimental image results shows that 320 image views give a smaller area error than 160 views, suggesting that a high number of views results in a high resolution of the image reconstruction.
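A minimal linear back projection (LBP) on a synthetic cross-section illustrates why more views sharpen the reconstruction: each view's attenuation profile is smeared back across the image along its ray direction and the contributions are summed. Only two orthogonal views are used here for brevity; this is a generic LBP sketch, not the authors' filtered variant or their CCD/laser-diode geometry.

```python
import numpy as np

# Parallel-beam LBP with two orthogonal views on a synthetic attenuation map.

n = 32
phantom = np.zeros((n, n))
phantom[10:14, 20:24] = 1.0                 # a small solid object

proj_h = phantom.sum(axis=0)                # view along columns (0 deg)
proj_v = phantom.sum(axis=1)                # view along rows (90 deg)

# Back-project: smear each profile uniformly along its ray direction.
lbp = proj_h[None, :] / n + proj_v[:, None] / n

peak = np.unravel_index(np.argmax(lbp), lbp.shape)
```

With only two views the object is localized but smeared into a cross of streaks; adding views (160, 320, ...) suppresses those streaks, which is the area-error improvement the abstract reports.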
Optimizing sensor cover energy for directional sensors
NASA Astrophysics Data System (ADS)
Astorino, Annabella; Gaudioso, Manlio; Miglionico, Giovanna
2016-10-01
The Directional Sensors Continuous Coverage Problem (DSCCP) aims at covering a given set of targets in a plane by means of a set of directional sensors. The locations of these sensors are known in advance, and the sensors are characterized by a discrete set of possible radii and aperture angles. The decisions to be made concern the orientation (which in our approach can vary continuously), radius and aperture angle of each sensor. The objective is to obtain a minimum-cost coverage of all targets, if one exists. We introduce a MINLP formulation of the problem and define a Lagrangian heuristic based on a dual ascent procedure operating on one multiplier at a time. Finally, we report the results of the implementation of the method on a set of test problems.
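The feasibility core of this coverage model can be sketched as a geometric test: a target is covered by a directional sensor when it lies within the sensing radius and within half the aperture of the sensor's orientation. The paper's MINLP/Lagrangian machinery decides orientations, radii and apertures on top of checks like this; the coordinates and parameter values below are illustrative.

```python
import math

# Coverage test for one directional sensor and one target in the plane.

def covers(sensor_xy, orientation, radius, aperture, target_xy):
    dx = target_xy[0] - sensor_xy[0]
    dy = target_xy[1] - sensor_xy[1]
    if math.hypot(dx, dy) > radius:
        return False
    # Smallest signed difference between bearing-to-target and orientation.
    diff = (math.atan2(dy, dx) - orientation + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= aperture / 2

# Sensor at the origin looking along +x, 60-degree aperture, radius 10.
hit = covers((0.0, 0.0), 0.0, 10.0, math.radians(60), (5.0, 1.0))
miss = covers((0.0, 0.0), 0.0, 10.0, math.radians(60), (1.0, 5.0))
```

The angle-wrapping step matters: taking the raw bearing difference without reducing it to (-pi, pi] would misclassify targets near the orientation's branch cut.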
Seismic migration for SAR focusing: Interferometrical applications
NASA Astrophysics Data System (ADS)
Prati, C.; Montiguarnieri, A.; Damonti, E.; Rocca, F.
SAR (Synthetic Aperture Radar) data focusing is analyzed from a theoretical point of view. Two applications of a SAR data processing algorithm are presented in which the phases of the returns are used to recover interesting parameters of the observed scenes. Migration techniques, similar to those used in seismic signal processing for oil prospecting, were implemented for the determination of a terrain altitude map from a satellite and for the evaluation of sensor attitude for an airplane. A satisfactory precision was achieved: the interferometric system was shown to detect variations of the airplane roll angle of a small fraction of a degree.
NASA Astrophysics Data System (ADS)
Kasahara, Satoshi; Yokota, Shoichiro; Mitani, Takefumi; Asamura, Kazushi; Hirahara, Masafumi; Shibano, Yasuko; Takashima, Takeshi
2018-05-01
The Medium-Energy Particle experiments - electron analyzer (MEP-e) on board the Exploration of energization and Radiation in Geospace (ERG) spacecraft measures the energy and direction of each incoming electron in the energy range of 7-87 keV. The sensor covers a 2π-radian disk-like field of view with 16 detectors, and full solid-angle coverage is achieved through the spacecraft's spin motion. The electron energy is independently measured by both an electrostatic analyzer and avalanche photodiodes, enabling significant background reduction. We describe the technical approach, data output, and examples of initial observations.
Observations and impressions from lunar orbit
NASA Technical Reports Server (NTRS)
Mattingly, T. K.; El-Baz, F.; Laidley, R. A.
1972-01-01
On Apollo 16, the command module pilot made observations of particular surface features and processes to complement photographic and other remotely sensed data. Emphasis was placed on geological problems that required the extreme dynamic range and color sensitivities of the human eye; repetitive observations of varying sun angles and viewing directions; and, in some cases, on-the-scene interpretations. Visual observations and impressions recorded during the mission verified the effectiveness of the hardware and techniques used. The orbiting observer functioned both as a sensor, in otherwise inaccessible areas such as earthshine and shadows, and as a designator of potentially significant data that were acquired on the photographic record.
Space based optical staring sensor LOS determination and calibration using GCPs observation
NASA Astrophysics Data System (ADS)
Chen, Jun; An, Wei; Deng, Xinpu; Yang, Jungang; Sha, Zhichao
2016-10-01
Line-of-sight (LOS) attitude determination and calibration is the key prerequisite for the tracking and location of targets in space-based infrared (IR) surveillance systems (SBIRS), and the LOS determination and calibration of a staring sensor is one of the difficulties. This paper provides a novel methodology for removing staring sensor bias through the use of Ground Control Points (GCPs) detected in the background field of the sensor. Based on the imaging model and characteristics of the staring sensor of the SBIRS geostationary Earth orbit (GEO) part, a real-time LOS attitude determination and calibration algorithm using landmark control points is proposed. The factors influencing the staring sensor's LOS attitude error (including thermal distortion errors, assembly errors, and so on) are treated as equivalent to bias angles of the LOS attitude. By establishing the observation equation of the GCPs and the state transition equation of the bias angles, and by using an extended Kalman filter (EKF), real-time estimation of the bias angles and high-precision sensor LOS attitude determination and calibration are achieved. The simulation results show that the precision and timeliness of the proposed algorithm meet the requirements of the target tracking and location process in a space-based infrared surveillance system.
Zhao, Hao; Feng, Hao
2013-01-01
An angular acceleration sensor can be used for the dynamic analysis of human and joint motions. In this paper, an angular acceleration sensor with a novel structure, based on the principle of electromagnetic induction, is designed. A constant magnetic field is constructed by the excitation windings of the sensor, and a cup-shaped rotor cuts the magnetic field. When the rotor undergoes rotational angular acceleration, the output windings of the sensor generate, through electromagnetic coupling, an electromotive force directly proportional to the angular acceleration. The mechanical structure and the magnetic working circuit of the sensor are described. The output properties and the mathematical model of the sensor, including the transfer function and state-space model, are established, and the asymptotic stability of the sensor during operation is verified by the Lyapunov theorem. An angular acceleration calibration device based on the torsional pendulum principle is also designed: the angular acceleration sensor, the torsion pendulum and a high-precision angle sensor are connected coaxially, and an initial external force is applied to the torsion pendulum to produce a periodically damped angular oscillation. The angular acceleration sensor and the angle sensor generate two corresponding electrical signals, and the sensitivity coefficient of the angular acceleration sensor is obtained after processing these two signals. The experimental results show that the sensitivity coefficient of the sensor is about 17.29 mV/(krad/s2). Finally, the errors existing in practical applications of the sensor are discussed and corresponding improvement measures are proposed, providing effective technical support for the practical promotion of this novel sensor. PMID:23941911
NASA Astrophysics Data System (ADS)
Berger, Michael; Mokhtar, Marwan; Zahler, Christian; Willert, Daniel; Neuhäuser, Anton; Schleicher, Eckhard
2017-06-01
At Industrial Solar's test facility in Freiburg (Germany), two-phase flow patterns have been measured using a wire mesh sensor from Helmholtz-Zentrum Dresden-Rossendorf (HZDR). The main purpose of the measurements was to compare the observed two-phase flow patterns with those expected from models. The two-phase flow pattern is important for the design of direct steam generating solar collectors: vibrations should be avoided in the peripheral piping, and local dry-outs or large circumferential temperature gradients should be prevented in the absorber tubes. Therefore, the choice of design operating conditions, such as mass flow and steam quality, is an important step in the engineering process of such a project. The results of a wire mesh sensor measurement are the flow pattern and the plug or slug frequency at the given operating conditions. Under an assumption of the collector power, which can be derived from previous measurements at the same collector and adapted for sun position and incidence angle modifier, the slip can also be evaluated from a wire mesh sensor measurement. Measurements were performed at different mass flows and pressure levels. Transient behavior was tested for flashing, changes of mass flow, and sudden changes of irradiation (cloud simulation). This paper describes the measurements and the method of evaluation. Results are shown as extruded profiles in top view and in side view, and measurement and model are compared. The tests were performed at low steam quality because of the limits of the test facility. Conclusions and implications for possible future measurements at larger collectors are also presented.
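The plug/slug frequency that such a measurement yields can be estimated from the cross-section-averaged void-fraction time series by counting upward crossings of a threshold. The sampling rate, threshold, and sinusoidal surrogate signal below are illustrative, not actual HZDR wire-mesh-sensor output.

```python
import numpy as np

# Slug frequency from a void-fraction time series via threshold crossings.

fs = 100.0                                    # samples per second
t = np.arange(0.0, 10.0, 1.0 / fs)
slug_freq_true = 1.5                          # slugs per second
void = 0.5 + 0.4 * np.sin(2.0 * np.pi * slug_freq_true * t)

above = void > 0.7                            # "slug present" indicator
crossings = np.count_nonzero(above[1:] & ~above[:-1])
slug_freq_est = crossings / t[-1]
```

Counting only upward transitions makes each slug passage register exactly once, provided the signal is sampled fast enough that a slug spans several samples above the threshold.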
Tannous, Halim; Istrate, Dan; Benlarbi-Delai, Aziz; Sarrazin, Julien; Gamet, Didier; Ho Ba Tho, Marie Christine; Dao, Tien Tuan
2016-11-15
Exergames have been proposed as a potential tool to improve the current practice of musculoskeletal rehabilitation. Inertial or optical motion capture sensors are commonly used to track the subject's movements. However, these motion capture tools suffer from a lack of accuracy in estimating joint angles, which can lead to wrong data interpretation. In this study, we propose a real-time quaternion-based fusion scheme, based on the extended Kalman filter, between inertial and visual motion capture sensors, to improve the estimation accuracy of joint angles. The fusion outcome was compared to angles measured using a goniometer. The fusion output shows a better estimation than the inertial measurement unit and Kinect outputs taken alone: we noted a smaller error (3.96°) than the one obtained using inertial sensors (5.04°). The proposed multi-sensor fusion system is therefore accurate enough to be applied, in future work, to our serious game for musculoskeletal rehabilitation.
New generation of wearable goniometers for motion capture systems
2014-01-01
Background Monitoring joint angles through wearable systems enables human posture and gesture to be reconstructed as a support for physical rehabilitation, both in clinics and at the patient's home. A new generation of wearable goniometers based on knitted piezoresistive fabric (KPF) technology is presented. Methods KPF single- and double-layer devices were designed and characterized under stretching and bending to work as strain sensors and goniometers. The theoretical working principle and the derived electromechanical model, previously proved for carbon elastomer sensors, were generalized to KPF. The devices were used to correlate angles and piezoresistive fabric behaviour, and to highlight the differences in performance between the single-layer and double-layer sensors. A fast calibration procedure is also proposed. Results The proposed device was tested in both static and dynamic conditions, in comparison with standard electrogoniometers and inertial measurement units respectively. KPF goniometer capabilities in angle detection were experimentally proved, and a discussion of the device measurement errors is provided. The paper concludes with an analysis of sensor accuracy and hysteresis reduction in particular configurations. Conclusions Double-layer KPF goniometers showed promising performance in terms of angle measurement in both quasi-static and dynamic working modes, for velocities typical of human movement. A further approach, consisting of a combination of multiple sensors to increase accuracy via a sensor fusion technique, has also been presented. PMID:24725669
Validity of the Microsoft Kinect for measurement of neck angle: comparison with electrogoniometry.
Allahyari, Teimour; Sahraneshin Samani, Ali; Khalkhali, Hamid-Reza
2017-12-01
Considering the importance of evaluating working postures, many techniques and tools have been developed to identify and eliminate awkward postures and prevent musculoskeletal disorders (MSDs). The introduction of the Microsoft Kinect sensor, a low-cost, easy-to-set-up and markerless motion capture system, offers promising possibilities for postural studies. Considering the Kinect's special ability in head-pose and facial-expression tracking and the complexity of cervical spine movements, this study aimed to assess the concurrent validity of the Microsoft Kinect against an electrogoniometer for neck angle measurements. A special software program was developed to calculate the neck angle based on Kinect skeleton tracking data. Neck angles were measured simultaneously by the electrogoniometer and the developed software program in 10 volunteers. The results were recorded in degrees and the time required for each method was also measured. The Kinect's ability to identify body joints was reliable and precise. There was moderate to excellent agreement between the Kinect-based method and the electrogoniometer (paired-sample t test, p ≥ 0.25; intraclass correlation for test-retest reliability, ≥0.75). Kinect-based measurement was much faster and required less equipment, but accurate measurement with the Microsoft Kinect was only possible when the participant was in its field of view.
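A neck angle computed from skeleton tracking data is, at its core, the angle at a joint formed by three tracked 3-D points. A minimal sketch of that geometry (the joint choices and coordinates are illustrative, not the study's actual software):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3-D points a-b-c,
    e.g. a shoulder-center, neck and head position from skeleton tracking."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(v1[i] * v2[i] for i in range(3))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```

With perpendicular segments the function returns 90°; collinear segments give 180°.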
An algorithm for deriving core magnetic field models from the Swarm data set
NASA Astrophysics Data System (ADS)
Rother, Martin; Lesur, Vincent; Schachtschneider, Reyko
2013-11-01
In view of an optimal exploitation of the Swarm data set, we have prepared and tested software dedicated to the determination of accurate core magnetic field models and of the Euler angles between the magnetic sensors and the satellite reference frame. The dedicated core field model estimation is derived directly from the GFZ Reference Internal Magnetic Model (GRIMM) inversion and modeling family. The data selection techniques and the model parameterizations are similar to those used for the derivation of the second (Lesur et al., 2010) and third versions of GRIMM, although the use of observatory data is not planned in the framework of the application to Swarm. The regularization technique applied during the inversion process smooths the magnetic field model in time. The algorithm to estimate the Euler angles is also derived from the CHAMP studies. The inversion scheme includes Euler angle determination with a quaternion representation for describing the rotations, and has been built to handle possible weak time variations of these angles. The modeling approach and software were initially validated on a simple, noise-free, synthetic data set and on CHAMP vector magnetic field measurements. We present results of test runs applied to the synthetic Swarm test data set.
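Representing the sensor-to-spacecraft rotation with a quaternion avoids the singularities of a direct Euler-angle parameterization. A minimal sketch of quaternion rotation of a vector (standard algebra only, not the GRIMM inversion code):

```python
def quat_mult(p, q):
    """Hamilton product of two quaternions (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def rotate(q, v):
    """Rotate 3-D vector v by unit quaternion q via q * (0, v) * conj(q)."""
    qv = (0.0, v[0], v[1], v[2])
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate of a unit quaternion
    return quat_mult(quat_mult(q, qv), qc)[1:]
```

A 90° rotation about the z-axis is q = (cos 45°, 0, 0, sin 45°); applying it to the x unit vector yields the y unit vector.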
NASA Technical Reports Server (NTRS)
Donovan, Sheila
1985-01-01
A full evaluation of the bidirectional reflectance properties of different vegetated surfaces was limited in past studies by instrumental inadequacies. With the development of the PARABOLA, it is now possible to sample reflectances from a large number of view angles in a short period of time, maintaining an almost constant solar zenith angle. PARABOLA data collected over five different canopies in Texas are analyzed. The objective of this investigation was to evaluate the intercanopy and intracanopy differences in bidirectional reflectance patterns. Particular attention was given to the separability of canopy types using different view angles for the red and the near infrared (NIR) spectral bands. Comparisons were repeated for different solar zenith angles. Statistical and other quantitative techniques were used to assess these differences. For the canopies investigated, the greatest reflectances were found in the backscatter direction for both bands. Canopy discrimination was found to vary with both view angle and the spectral reflectance band considered, the forward scatter view angles being most suited to observations in the NIR and backscatter view angles giving better results in the red band. Because of different leaf angle distribution characteristics, discrimination was found to be better at small solar zenith angles in both spectral bands.
NASA Astrophysics Data System (ADS)
García-Santos, Vicente; Niclòs, Raquel; Coll, César; Valor, Enric; Caselles, Vicente
2015-04-01
The MOD21 Land Surface Temperature and Emissivity (LST&E) product will be included in forthcoming MODIS Collection 6. Surface temperature and emissivities for thermal infrared (TIR) bands 29 (8.55 μm), 31 (11 μm) and 32 (12 μm) will be retrieved using the ASTER TES method adapted to MODIS at-sensor spectral radiances, previously corrected with the Water Vapor Scaling method (MODTES algorithm). LSE of most natural surfaces changes with soil moisture content, type of surface cover, surface roughness or sensor viewing geometry. The present study addresses the observation of anisotropy effects on LSE of bare soils using MODIS data and a processor simulator of the MOD21 product, since it is not available yet. Two highly homogeneous and quasi-invariant desert sites were selected to carry out the present study. The first one is the White Sands National Monument, located in Tularosa Valley (South-central New Mexico, USA), which is a dune system desert at 1216 m above sea level, with an area of 704 km2 and a maximum dune height of 10 m. The grain size is considered fine sand and the major mineralogy component is gypsum. The second site selected was the Great Sands National Park, located in the San Luis Valley (Colorado, USA). Great Sands is also a sand dune system desert, created from quartz and volcanic fragments derived from Santa Fe and Alamosa formations. The major mineral is quartz, with minor traces of potassium and feldspar. The grain size of the sand is medium to coarse according to the X-Ray Diffraction measurements. Great Sands covers an area of 104 km2 at 2560 m above sea level and the maximum dune height is 230 m. The obtained LSEs and their dependence on azimuth and zenith viewing angles were analyzed, based on series of MODIS scenes from 2010 to 2013. MODTES nadir and off-nadir LSEs showed a good agreement with laboratory emissivity measurements. 
Results show that band 29 LSE decreases with the zenithal angle up to 0.041 from its nadir value, while LSEs for bands 31 and 32 do not show significant changes with zenith angle.
Cooperative angle-only orbit initialization via fusion of admissible areas
NASA Astrophysics Data System (ADS)
Jia, Bin; Pham, Khanh; Blasch, Erik; Chen, Genshe; Shen, Dan; Wang, Zhonghai
2017-05-01
For the short-arc, angle-only orbit initialization problem, the admissible area is often used. However, the accuracy achievable with a single sensor is limited. For high-value space objects, more accurate results are desired. Fortunately, multiple sensors dedicated to space situational awareness are available. The work in this paper uses information from multiple sensors to cooperatively initialize the orbit based on the fusion of multiple admissible areas. Both centralized and decentralized fusion are discussed. Simulation results verify the expectation that the orbit initialization accuracy is improved by using information from multiple sensors.
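Centralized fusion of admissible areas can be pictured as intersecting each sensor's admissible region: a candidate orbit state survives only if every sensor deems it admissible. A toy sketch on a discretized range/range-rate grid (the grid shapes are invented for illustration, not the paper's formulation):

```python
import numpy as np

def fuse_admissible(areas):
    """Centralized fusion: logical intersection of boolean admissible-area grids."""
    fused = areas[0].copy()
    for a in areas[1:]:
        fused &= a
    return fused

# hypothetical admissible areas from two sensors on a coarse 4x4 grid
a1 = np.zeros((4, 4), dtype=bool); a1[1:3, :] = True   # sensor 1: a band of rows
a2 = np.zeros((4, 4), dtype=bool); a2[:, 1:3] = True   # sensor 2: a band of columns
fused = fuse_admissible([a1, a2])                      # only the overlap remains
```

The fused region is strictly smaller than either input, which is exactly why multi-sensor fusion tightens the initialization.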
NASA Technical Reports Server (NTRS)
Kerr, Yann H.; Njoku, Eni G.
1990-01-01
A radiative-transfer model for simulating microwave brightness temperatures over land surfaces is described. The model takes into account sensor viewing conditions (spacecraft altitude, viewing angle, frequency, and polarization) and atmospheric parameters over a soil surface characterized by its moisture, roughness, and temperature and covered with a layer of vegetation characterized by its temperature, water content, single scattering albedo, structure, and percent coverage. In order to reduce the influence of atmospheric and surface temperature effects, the brightness temperatures are expressed as polarization ratios that depend primarily on the soil moisture and roughness, canopy water content, and percentage of cover. The sensitivity of the polarization ratio to these parameters is investigated. Simulation of the temporal evolution of the microwave signal over semiarid areas in the African Sahel is presented and compared to actual satellite data from the SMMR instrument on Nimbus-7.
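The normalized polarization ratio can be written as a simple function of the vertically and horizontally polarized brightness temperatures; the common surface-temperature factor largely cancels in the ratio. One common normalized form (the paper's exact definition may differ slightly):

```python
def polarization_ratio(tb_v, tb_h):
    """Normalized polarization ratio of brightness temperatures.

    Because both polarizations scale with the effective surface temperature,
    the ratio depends mainly on soil moisture, roughness and vegetation cover.
    """
    return (tb_v - tb_h) / (tb_v + tb_h)
```

For example, 270 K (V) and 230 K (H) give a ratio of 0.08; a fully vegetated surface with nearly equal polarizations would give a ratio near zero.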
View of ASTRO-2 payload in cargo bay of STS-67 Endeavour
1995-03-17
STS067-713-072 (2-18 March 1995) --- This 70mm cargo bay scene, backdropped against a desert area of Namibia, typifies the view that daily greeted the Astro-2 crew members during their almost 17 days aboard the Space Shuttle Endeavour. Positioned on the Spacelab pallet amidst other hardware, the Astro-2 payload is in its operational mode. Visible here are the Instrument Pointing System (IPS), Hopkins Ultraviolet Telescope (HUT), Star Tracker (ST), Ultraviolet Imaging Telescope (UIT), Wisconsin Ultraviolet Photo-Polarimeter Experiment (WUPPE), and Integrated Radiator System (IRS). At this angle, the Optical Sensor Package (OPS) is not seen. The Igloo, which supports the package of experiments, is in center foreground. Two Get-Away Special (GAS) canisters are in lower left foreground. The Extended Duration Orbiter (EDO) pallet, located aft of the cargo bay, is obscured by the Astro-2 payload. The Endeavour was 190 nautical miles above Earth.
Comparison of SeaWiFS measurements of the Moon with the U.S. Geological Survey lunar model.
Barnes, Robert A; Eplee, Robert E; Patt, Frederick S; Kieffer, Hugh H; Stone, Thomas C; Meister, Gerhard; Butler, James J; McClain, Charles R
2004-11-01
The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) has made monthly observations of the Moon since 1997. Using 66 monthly measurements, the SeaWiFS calibration team has developed a correction for the instrument's on-orbit response changes. Concurrently, a lunar irradiance model has been developed by the U.S. Geological Survey (USGS) from extensive Earth-based observations of the Moon. The lunar irradiances measured by SeaWiFS are compared with the USGS model. The comparison shows essentially identical response histories for SeaWiFS, with differences from the model of less than 0.05% per thousand days in the long-term trends. From the SeaWiFS experience we have learned that it is important to view the entire lunar image at a constant phase angle from measurement to measurement and to understand, as best as possible, the size of each lunar image. However, a constant phase angle is not required for using the USGS model. With a long-term satellite lunar data set it is possible to determine instrument changes at a quality level approximating that from the USGS lunar model. However, early in a mission, when the dependence on factors such as phase and libration cannot be adequately determined from satellite measurements alone, the USGS model is critical to an understanding of trends in instruments that use the Moon for calibration. This is the case for SeaWiFS.
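Expressing a sensor's response history as a trend in percent per thousand days reduces to a linear fit over the normalized lunar measurements. A sketch with synthetic data (the values are invented, not SeaWiFS results):

```python
import numpy as np

def trend_percent_per_1000_days(days, response):
    """Linear trend of a normalized sensor response, in % per 1000 days."""
    slope, _intercept = np.polyfit(days, response, 1)  # response units per day
    return 100.0 * slope * 1000.0

# synthetic normalized response declining by 0.04 % per 1000 days
days = np.arange(0.0, 2000.0, 100.0)
resp = 1.0 - 4e-7 * days
trend = trend_percent_per_1000_days(days, resp)
```

In practice the lunar irradiances would first be normalized by the model prediction for each observation geometry before fitting.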
Que, Ruiyi; Zhu, Rong
2012-01-01
Air speed, angle of sideslip and angle of attack are fundamental aerodynamic parameters for controlling most aircraft. For small aircraft for which conventional detecting devices are too bulky and heavy to be utilized, a novel and practical methodology by which the aerodynamic parameters are inferred using a micro hot-film flow sensor array mounted on the surface of the wing is proposed. A back-propagation neural network is used to model the coupling relationship between readings of the sensor array and aerodynamic parameters. Two different sensor arrangements are tested in wind tunnel experiments and dependence of the system performance on the sensor arrangement is analyzed. PMID:23112638
Do BRDF effects dominate seasonal changes in tower-based remote sensing imagery?
NASA Astrophysics Data System (ADS)
Nagol, J. R.; Morton, D. C.; Rubio, J.; Cook, B. D.; Rishmawi, K.
2014-12-01
In situ remote sensing complements data from airborne and space-based sensors, in particular for intensive study sites where optical imagery can be paired with detailed ground and tower measurements. The characteristics of tower-mounted imaging systems are quite different from the nadir viewing geometry of other remote sensing platforms. In particular, tower-mounted systems are sensitive to artifacts of seasonal and diurnal sun angle variations. Most systems are oriented in a fixed north or south direction (depending on latitude), placing them in the principal plane at solar noon. The BRDF (Bidirectional Reflectance Distribution Function) effect is strongest for images acquired at that time. Phenological metrics derived from tower-based oblique-angle imaging systems are particularly prone to BRDF effects, as shadowing within and between tree crowns varies seasonally. For sites in the northern hemisphere, the fraction of sunlit and shaded vegetation declines following the June solstice to leaf senescence in September. Correcting tower-based remote sensing imagery for artifacts of BRDF is critical to isolate real changes in canopy phenology and reflectance. Here, we used airborne lidar data from NASA Goddard's Lidar, Hyperspectral, and Thermal Airborne Imager (G-LiHT) to develop a 3D forest scene for Harvard Forest in the Discrete Anisotropic Radiative Transfer (DART) model. Our objective was to model the contribution of changes in shadowing and illumination to observations of changes in greenness from the Phenocam image time series at the Harvard Forest site. Diurnal variability in canopy greenness from the Phenocam time series provides an independent evaluation of BRDF effects from changes in illumination and sun-sensor geometries.
The overall goal of this work is to develop a look-up table solution to correct major components of BRDF for tower-mounted imaging systems such as Phenocam, based on characteristics of the forest structure (forest height, canopy rugosity, fractional cover, and composition) and the viewing geometry of the sensor. Given the sensitivity of tower-based systems to BRDF effects, efforts to correct artifacts of BRDF in phenology time series are critical to isolate seasonal changes in vegetation reflectance.
Calibration requirements and methodology for remote sensors viewing the ocean in the visible
NASA Technical Reports Server (NTRS)
Gordon, Howard R.
1987-01-01
The calibration requirements for ocean-viewing sensors are outlined, and the present methods of effecting such calibration are described in detail. For future instruments it is suggested that provision be made for the sensor to view solar irradiance in diffuse reflection and that the moon be used as a source of diffuse light for monitoring the sensor stability.
Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB
NASA Astrophysics Data System (ADS)
Schweitzer, Caroline; Stein, Karin
2014-10-01
The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including the optional waveband; various orbit heights and viewing angles; system design characteristics, e.g. pixel size and filter transmission; and atmospheric effects, e.g. different cloud types, climate zones and seasons. In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (launch of a long-range ballistic missile (BM)).
Integrated polarization-dependent sensor for autonomous navigation
NASA Astrophysics Data System (ADS)
Liu, Ze; Zhang, Ran; Wang, Zhiwen; Guan, Le; Li, Bin; Chu, Jinkui
2015-01-01
Based on the navigation strategy of insects utilizing the polarized skylight, an integrated polarization-dependent sensor for autonomous navigation is presented. The navigation sensor features a compact structure, high precision, strong robustness, and a simple manufacturing technique. The sensor is constructed by integrating a complementary metal-oxide-semiconductor sensor with a multiorientation nanowire grid polarizer. By nanoimprint lithography, the multiorientation nanowire polarizer is fabricated in one step and the alignment error is eliminated. Statistical theory is incorporated into the interval-division algorithm to calculate the polarization angle of the incident light. Laboratory and outdoor tests of the navigation sensor were carried out, and the errors of the measured angle are ±0.02 deg and ±1.3 deg, respectively. The results show that the proposed sensor has potential for application in autonomous navigation.
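With intensities measured behind polarizers at four orientations, the angle of polarization follows from the linear Stokes parameters s1 and s2. A sketch of this standard relation (the 0°/45°/90°/135° layout is an assumption about the grid, not taken from the paper, whose interval-division algorithm is more elaborate):

```python
import math

def polarization_angle(i0, i45, i90, i135):
    """Angle of polarization (degrees) from intensities behind
    polarizers oriented at 0, 45, 90 and 135 degrees."""
    s1 = i0 - i90      # linear Stokes parameter s1
    s2 = i45 - i135    # linear Stokes parameter s2
    return 0.5 * math.degrees(math.atan2(s2, s1))
```

For fully linearly polarized light at angle φ, Malus's law gives I(θ) = cos²(φ − θ) for a polarizer at θ, and the function recovers φ exactly.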
Tan, Xinran; Zhu, Fan; Wang, Chao; Yu, Yang; Shi, Jian; Qi, Xue; Yuan, Feng; Tan, Jiubin
2017-11-19
This study presents a two-dimensional micro-/nanoradian angle generator (2D-MNAG) that achieves high angular displacement resolution and repeatability using a piezo-driven flexure hinge for two-dimensional deflections and three capacitive sensors for output angle monitoring and feedback control. The principal error of the capacitive sensor for precision microangle measurement is analyzed and compensated for in order to achieve a high angle output resolution of 10 nrad (0.002 arcsec) and a positioning repeatability of 120 nrad (0.024 arcsec) over a large angular range of ±4363 μrad (±900 arcsec) for the 2D-MNAG. The impact of each error component, together with the synthetic error of the 2D-MNAG after principal-error compensation, is determined using Monte Carlo simulation for further improvement of the 2D-MNAG.
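A differential reading from two displacement sensors a known distance apart yields the deflection angle of the tilting surface. A sketch of this basic relation (the spacing and readings here are illustrative, not the instrument's actual parameters):

```python
import math

def tilt_angle_urad(d1, d2, spacing):
    """Deflection angle (microradians) from the differential reading of
    two displacement sensors separated by `spacing` (same units as d1, d2)."""
    return 1e6 * math.atan((d1 - d2) / spacing)
```

With a 10 mm sensor spacing, a differential displacement of about 43.6 μm corresponds to roughly 4363 μrad, the full angular range quoted above.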
Feedback Robust Cubature Kalman Filter for Target Tracking Using an Angle Sensor.
Wu, Hao; Chen, Shuxin; Yang, Binfeng; Chen, Kun
2016-05-09
The direction of arrival (DOA) tracking problem based on an angle sensor is an important topic in many fields. In this paper, a nonlinear filter named the feedback M-estimation based robust cubature Kalman filter (FMR-CKF) is proposed to deal with measurement outliers from the angle sensor. The filter designs a new equivalent weight function with the Mahalanobis distance to combine the cubature Kalman filter (CKF) with the M-estimation method. Moreover, by embedding a feedback strategy consisting of a splitting and merging procedure, the proper sub-filter (the standard CKF or the robust CKF) can be chosen at each time step. Hence, the probability of misjudging outliers can be reduced. Numerical experiments show that the FMR-CKF performs better than the CKF and conventional robust filters in terms of accuracy and robustness, with good computational efficiency. Additionally, the filter can be extended to nonlinear applications using other types of sensors.
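The equivalent-weight idea can be sketched with a Huber-style function of the Mahalanobis distance: measurements inside a threshold keep full weight, while outliers are down-weighted in proportion to how far they exceed it (the threshold value and exact weight function here are illustrative, not the FMR-CKF's):

```python
def equivalent_weight(md, k=1.345):
    """Huber-style equivalent weight for a measurement with
    Mahalanobis distance `md`: 1 inside the threshold k, k/md beyond it."""
    return 1.0 if md <= k else k / md
```

In a robust filter update, the measurement noise covariance is effectively inflated by the inverse of this weight, so a gross outlier barely moves the state estimate.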
Luo, Wei; Chen, Sheng; Chen, Lei; Li, Hualong; Miao, Pengcheng; Gao, Huiyi; Hu, Zelin; Li, Miao
2017-05-29
We describe a theoretical model to analyze temperature effects on the Kretschmann surface plasmon resonance (SPR) sensor, and a new double-incident-angle technique to simultaneously measure changes in refractive index (RI) and temperature. The method uses the observation that the output signals obtained at two different incident angles each have a linear dependence on RI and temperature, and that the two responses are independent. A proof-of-concept experiment using solutions of different NaCl concentrations as analytes demonstrates the ability of the technique. The optical design is as simple and robust as conventional SPR detection, but provides a way to discriminate between RI-induced and temperature-induced SPR changes. This technique enables traditional SPR sensors to detect RI in different temperature environments, and may lead to better design and fabrication of SPR sensors against temperature variation.
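If each incident angle's output signal is linear in the refractive index change and the temperature change, recovering both quantities is a 2×2 linear inversion. A sketch with invented sensitivity coefficients (the real values would come from calibration):

```python
import numpy as np

# hypothetical sensitivity matrix: rows = incident angles,
# columns = d(signal)/d(n) [per RIU] and d(signal)/d(T) [per K]
S = np.array([[3000.0, 0.5],
              [1200.0, 2.0]])

def decouple(sig1, sig2):
    """Recover (delta_n, delta_T) from the two incident-angle signals,
    assuming the linear model [sig1, sig2] = S @ [delta_n, delta_T]."""
    dn, dT = np.linalg.solve(S, [sig1, sig2])
    return dn, dT
```

The inversion is well conditioned only if the two rows of S are not proportional, which is exactly the independence condition stated in the abstract.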
Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.
Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue
2017-06-06
Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors to guarantee full-view coverage for a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed, as long as the regular hexagons decided by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) for the deterministic implementation. To reduce redundancy in random deployment, we propose a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions.
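In two dimensions, full-view coverage of a single point reduces to checking that the bearings from the point toward the cameras leave no angular gap wider than twice the effective viewing angle: whatever direction the object faces, some camera then sees it close to frontally. A simplified sketch (ignoring sensing range and camera orientation constraints, which the full model includes):

```python
import math

def full_view_covered(point, cameras, theta_deg):
    """Check simplified 2-D full-view coverage of `point`: the bearings to
    the cameras must leave no angular gap larger than 2*theta_deg."""
    bearings = sorted(
        math.degrees(math.atan2(cy - point[1], cx - point[0])) % 360.0
        for cx, cy in cameras)
    gaps = [(bearings[(i + 1) % len(bearings)] - bearings[i]) % 360.0
            for i in range(len(bearings))]
    return max(gaps) <= 2.0 * theta_deg
```

Four cameras spaced 90° apart around a point give gaps of 90°, so the point is full-view covered for an effective angle of 45° or more but not for smaller angles.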
Tests Of Array Of Flush Pressure Sensors
NASA Technical Reports Server (NTRS)
Larson, Larry J.; Moes, Timothy R.; Siemers, Paul M., III
1992-01-01
Report describes tests of an array of pressure sensors connected to small orifices flush with the surface of a 1/7-scale model of the F-14 airplane in a wind tunnel. Part of an effort to determine whether pressure parameters consisting of various sums, differences, and ratios of the measured pressures can be used to compute accurately the free-stream values of stagnation pressure, static pressure, angle of attack, angle of sideslip, and Mach number. Such arrays of sensors and associated processing circuitry could be integrated into advanced aircraft as parts of flight-monitoring and -controlling systems.
Flexible mobile robot system for smart optical pipe inspection
NASA Astrophysics Data System (ADS)
Kampfer, Wolfram; Bartzke, Ralf; Ziehl, Wolfgang
1998-03-01
Damages to pipes can be inspected and graded with TV technology available on the market: remotely controlled vehicles carry a TV camera through the pipes. However, depending on the experience and capability of the operator, diagnosis failures cannot be avoided. The classification of damages requires knowledge of their exact geometrical dimensions, such as the width and depth of cracks, fractures and defective connections. Within the framework of a joint R&D project, a sensor-based pipe inspection system named RODIAS has been developed with two partners from industry and a research institute. It consists of a remotely controlled mobile robot that carries intelligent sensors for on-line sewerage inspection. The sensor head combines a 3D optical sensor with a laser distance sensor. The laser distance sensor is integrated into the optical system of the camera and can measure the distance between camera and object. The angle of view can be determined from the position of the pan and tilt unit. With coordinate transformations it is then possible to calculate the spatial coordinates for every point of the video image, so the geometry of an object can be described exactly. The company Optimess has developed TriScan32, a special software package for pipe condition classification. The user can start complex measurements of profiles, pipe displacements or crack widths simply by pressing a push-button. The measuring results are stored together with other data, such as verbal damage descriptions and digitized images, in a database.
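Converting the measured range and the pan/tilt angles into spatial coordinates is a spherical-to-Cartesian transformation; a minimal sketch (the axis conventions are assumed, not taken from the paper):

```python
import math

def to_xyz(distance, pan_deg, tilt_deg):
    """Spatial coordinates of the laser spot from the measured range and
    the pan/tilt angles of the camera head (camera at the origin;
    x forward at pan=tilt=0, y left, z up)."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    x = distance * math.cos(tilt) * math.cos(pan)
    y = distance * math.cos(tilt) * math.sin(pan)
    z = distance * math.sin(tilt)
    return x, y, z
```

With zero pan and tilt the spot lies straight ahead on the x-axis; a 90° pan swings it onto the y-axis at the same range.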
Optical fiber extrinsic Fabry-Perot interferometer sensors for ultrasound detection
NASA Astrophysics Data System (ADS)
Sun, Qingguo; Chen, Na; Ding, Yuetong; Chen, Zhenyi; Wang, Tingyun
2009-11-01
In this paper, a new method is proposed to fabricate an optical fiber extrinsic Fabry-Perot interferometer (EFPI) as an ultrasonic sensor. An acoustic emission detection system is constructed based on multiple EFPI sensors and a demodulation circuit. Ultrasound detection experiments were performed with both a traditional piezoelectric transducer (PZT) and a high-voltage discharge as sources; strong ultrasound signals were detected in both cases. The signal attenuation as a function of the distance and the angle between the acoustic emission source and the FP sensor was obtained. The results indicate that the receiving angle of the FP sensor is nearly 90° and that the maximum detection distance in air is more than 200 cm. Furthermore, four sensors are used to locate the position of the ultrasound source produced by the high-voltage discharge.
NASA Astrophysics Data System (ADS)
Xiong, X.; Stone, T. C.
2017-12-01
To meet objectives for assembling continuous Earth environmental data records from multiple satellite instruments, a key consideration is to assure consistent and stable sensor calibration across platforms and spanning mission lifetimes. Maintaining and verifying calibration stability in orbit is particularly challenging for reflected solar band (RSB) radiometer instruments, as options for stable references are limited. The Moon is used regularly as a calibration target, which has capabilities for long-term sensor performance monitoring and for use as a common reference for RSB sensor inter-calibration. Suomi NPP VIIRS has viewed the Moon nearly every month since launch, utilizing spacecraft roll maneuvers to acquire lunar observations within a small range of phase angles. The VIIRS Characterization Support Team (VCST) at NASA GSFC has processed the Moon images acquired by SNPP VIIRS into irradiance measurements for calibration purposes; however, the variations in the Moon's brightness still require normalizing the VIIRS lunar measurements using radiometric reference values generated by the USGS lunar calibration system, i.e. the ROLO model. Comparison of the lunar irradiance time series to the calibration f-factors derived from the VIIRS on-board solar diffuser system shows similar overall trends in sensor response, but also reveals residual geometric anomalies in the lunar model results. The excellent lunar radiometry achieved by SNPP VIIRS is actively being used to advance lunar model development at USGS. Both MODIS instruments also have viewed the Moon regularly since launch, providing a practical application of sensor inter-calibration using the Moon as a common reference. 
This paper discusses ongoing efforts aimed toward demonstrating and utilizing the full potential of lunar observations to support long-term calibration stability and consistency for SNPP VIIRS and MODIS, thus contributing to level-1B data quality assurance for continuity and monitoring global environmental changes.
Development and evaluation of a SUAS perching system
NASA Astrophysics Data System (ADS)
Reynolds, Ryan
Perching has been proposed as a possible landing technique for Small Unmanned Aircraft Systems (SUAS). The current research study develops an onboard open-loop perching system for a fixed-wing SUAS and examines the impact of initial flight speed and sensor placement on the perching dynamics. A catapult launcher and a modified COTS aircraft were used for the experiments, with an ultrasonic sensor on the aircraft used to detect the perching target. Thirty tests were conducted varying the initial launch speed and ultrasonic sensor placement to determine whether they affected the time at which the aircraft reaches its maximum pitch angle, the optimum perching point for the aircraft. High-speed video was analyzed to obtain flight data, along with data from an onboard inertial measurement unit. The data were analyzed using a Model 1 two-way ANOVA to determine whether launch speed and sensor placement affect the optimum perching point where the aircraft reaches its maximum pitch angle during the maneuver. The results show that launch speed does affect the time at which the maximum pitch angle occurs, but sensor placement does not. This means a closed-loop system will need to adjust its perching distance based on its initial velocity. Because sensor placement had no noticeable effect, the ultrasonic sensor can be placed on the nose or the wing of the aircraft as the design requires. There was also no noticeable interaction between the two variables. Aerodynamic parameters such as lift, drag, and moment coefficients were derived from the dynamic equations of motion for use in numerical simulations and dynamic perching models.
A Novel Permanent Magnetic Angular Acceleration Sensor
Zhao, Hao; Feng, Hao
2015-01-01
Angular acceleration is an important parameter for status monitoring and fault diagnosis of rotary machinery. Therefore, we developed a novel permanent magnetic angular acceleration sensor, which has no rotation-angle limitation and can directly measure the instantaneous angular acceleration of a rotating system. The sensor rotor only needs to be coaxially connected with the rotating system, which makes sensor installation convenient. Owing to the cup structure of the sensor rotor, it has a relatively small rotational inertia. Due to the unique mechanical structure of the sensor, the output signal can be extracted without a slip ring, which avoids signal weakening. In this paper, the operating principle of the sensor is described and simulated using the finite element method. The sensitivity of the sensor was calibrated with a torsional pendulum and an angle sensor, yielding an experimental result of about 0.88 mV/(rad·s−2). Finally, the angular acceleration of an actual rotating system was tested, using both a single-phase asynchronous motor and a step motor. The experimental results confirm the operating principle of the sensor and indicate that the sensor has good practicability. PMID:26151217
Design and simulation of betavoltaic angle sensor Based on ⁶³Ni-Si.
Ghasemi Nejad, Gholam Reza; Rahmani, Faezeh
2016-01-01
A theoretical design and simulation of a betavoltaic angle sensor (beta-AS) based on ⁶³Ni-Si using the MCNP code is presented in this article. It can measure the full angle range of 0-360° over the temperature range of 233-353 K. The beta-AS is composed of a semicircular ⁶³Ni beta source, which rotates along the circular (four-quadrant) surface of Si as a semiconductor (in a p-n structure), so that the change in the source angle relative to the Si surface can be measured from the changes in open-circuit voltage (V_oc) observed in each quadrant of Si. For better performance, the characteristics of the Si and ⁶³Ni have been optimized: donor and acceptor doping concentrations in Si of N_D = 8×10¹⁹ cm⁻³ and N_A = 4×10¹⁸ cm⁻³, and a source thickness and activity of 1.5 µm and 18 mCi, respectively. The relation between angle and V_oc is also investigated. The maximum difference between measured and real angle values (the worst case, i.e., 0.18° for an angle of 45°) occurs at 233 K. It has been shown that the sensitivity of the sensor decreases as the angle increases. The results also show that a change in activity does not affect the sensitivity.
NASA Astrophysics Data System (ADS)
Azzam, R. M. A.; Howlader, M. M. K.; Georgiou, T. Y.
1995-08-01
A transparent or absorbing substrate can be coated with a transparent thin film to produce a linear reflectance-versus-angle-of-incidence response over a certain range of angles. Linearization at and near normal incidence is a special case that leads to a maximally flat response for p -polarized, s -polarized, or unpolarized light. For midrange and high-range linearization with moderate and high slopes, respectively, the best results are obtained when the incident light is s polarized. Application to a Si substrate that is coated with a SiO2 film leads to novel passive and active reflection rotation sensors. Experimental results and an error analysis of this rotation sensor are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hlond, M.; Bzowski, M.; Moebius, E.
Post-launch boresight of the IBEX-Lo instrument on board the Interstellar Boundary Explorer (IBEX) is determined based on IBEX-Lo Star Sensor observations. Accurate information on the boresight of the neutral gas camera is essential for precise determination of interstellar gas flow parameters. Utilizing spin-phase information from the spacecraft attitude control system (ACS), positions of stars observed by the Star Sensor during two years of IBEX measurements were analyzed and compared with positions obtained from a star catalog. No statistically significant differences were observed beyond those expected from the pre-launch uncertainty in the Star Sensor mounting. Based on the star observations and their positions in the spacecraft reference system, the pointing of the IBEX satellite spin axis was determined and compared with the pointing obtained from the ACS. Again, no statistically significant deviations were observed. We conclude that no systematic correction for boresight geometry is needed in the analysis of IBEX-Lo observations to determine neutral interstellar gas flow properties. A stack-up of uncertainties in attitude knowledge shows that the instantaneous IBEX-Lo pointing is determined to within approximately 0.1° in both spin angle and elevation using either the Star Sensor or the ACS. Further, the Star Sensor can be used to independently determine the spacecraft spin axis. Thus, Star Sensor data can be used reliably to correct the spin phase when the Star Tracker (used by the ACS) is disabled by bright objects in its field of view. The Star Sensor can also determine the spin axis during most orbits and thus provides redundancy for the Star Tracker.
NASA Astrophysics Data System (ADS)
Van doninck, Jasper; Tuomisto, Hanna
2017-06-01
Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflection distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests, and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observations. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
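The simplest class of methods mentioned above, empirical view-angle normalization, amounts to fitting and removing the per-image linear reflectance gradient across the swath. The sketch below is a hypothetical illustration of that idea, not the implementation evaluated in the paper; the synthetic pixel values are invented:

```python
import numpy as np

def normalize_view_angle(reflectance, view_angle_deg):
    """Fit a linear reflectance trend vs. view angle within one image and
    remove it, leaving each pixel at its equivalent nadir (0 deg) value."""
    slope = np.polyfit(view_angle_deg, reflectance, 1)[0]
    return reflectance - slope * view_angle_deg

# Synthetic forest pixels with a purely directional gradient across the
# roughly +/-7.5 deg Landsat swath: normalization yields a flat response.
theta = np.array([-7.5, -2.5, 2.5, 7.5])
refl = np.array([0.30, 0.31, 0.32, 0.33])
print(normalize_view_angle(refl, theta))
```

Because the gradient is estimated from each image independently, this per-image approach corrects view-angle effects but, as the abstract notes, cannot account for solar-geometry differences between images.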
Moreno-Salinas, David; Pascoal, Antonio; Aranda, Joaquin
2013-08-12
In this paper, we address the problem of determining the optimal geometric configuration of an acoustic sensor network that will maximize the angle-related information available for underwater target positioning. In the set-up adopted, a set of autonomous vehicles carries a network of acoustic units that measure the elevation and azimuth angles between a target and each of the receivers on board the vehicles. It is assumed that the angle measurements are corrupted by white Gaussian noise, the variance of which is distance-dependent. Using tools from estimation theory, the problem is converted into that of minimizing, by proper choice of the sensor positions, the trace of the inverse of the Fisher Information Matrix (also called the Cramer-Rao Bound matrix) to determine the sensor configuration that yields the minimum possible covariance of any unbiased target estimator. It is shown that the optimal configuration of the sensors depends explicitly on the intensity of the measurement noise, the constraints imposed on the sensor configuration, the target depth and the probabilistic distribution that defines the prior uncertainty in the target position. Simulation examples illustrate the key results derived.
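The A-optimality criterion described above (minimizing the trace of the inverse Fisher Information Matrix over sensor positions) can be sketched numerically. The azimuth/elevation measurement model, the distance-dependent noise scaling, and the two candidate sensor layouts below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def fim_trace_crb(sensors, target, noise_var_per_m2=1e-6):
    """Trace of the inverse Fisher Information Matrix (the A-optimality
    cost) for a target localized from azimuth/elevation angles measured
    at each sensor, with noise variance growing as squared range."""
    def angles(t, s):
        d = t - s
        az = np.arctan2(d[1], d[0])
        el = np.arctan2(d[2], np.hypot(d[0], d[1]))
        return np.array([az, el])
    F = np.zeros((3, 3))
    eps = 1e-6
    for s in sensors:
        var = noise_var_per_m2 * np.sum((target - s) ** 2)
        J = np.zeros((2, 3))  # numerical Jacobian of the angle measurements
        for j in range(3):
            dx = np.zeros(3)
            dx[j] = eps
            J[:, j] = (angles(target + dx, s) - angles(target - dx, s)) / (2 * eps)
        F += J.T @ J / var
    return np.trace(np.linalg.inv(F))

# A submerged target viewed by three surface vehicles: a well-spread
# triangle gives a lower bound than a tightly clustered arrangement.
target = np.array([0.0, 0.0, -100.0])
spread = [np.array([200.0, 0.0, 0.0]), np.array([-100.0, 173.2, 0.0]),
          np.array([-100.0, -173.2, 0.0])]
clustered = [np.array([200.0, 0.0, 0.0]), np.array([210.0, 5.0, 0.0]),
             np.array([190.0, -5.0, 0.0])]
print(fim_trace_crb(spread, target) < fim_trace_crb(clustered, target))
```

An optimizer run over this cost, subject to the vehicles' positioning constraints, would recover the kind of geometry-dependent optimal configurations the paper derives analytically.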
Field Measured Spectral Albedo-Four Years of Data from the Western U.S. Prairie
NASA Astrophysics Data System (ADS)
Michalsky, Joseph J.; Hodges, Gary B.
2013-01-01
This paper presents an initial look at four years of spectral measurements used to calculate albedo for the Colorado prairie just east of the Rocky Mountain foothills. Some issues associated with calculating broadband albedo from thermopile sensors are discussed, demonstrating that uncorrected instrument issues have led to incorrect conclusions. The Normalized Difference Vegetation Index (NDVI) is defined for the spectral instruments in this study and used to demonstrate the dramatic changes that can be monitored with this very sensitive product. Examples of the albedo's wavelength and solar-zenith-angle dependence for different stages of vegetative growth and senescence are presented. The spectral albedo of fresh snow and its spectral and solar-zenith-angle dependence are discussed and contrasted with other studies of these dependencies. We conclude that fresh snow is consistent with a Lambertian reflector over the solar incidence angles measured; this is contrary to most snow albedo results. Even a slope of a degree or two in the viewed surface can explain the asymmetry in the morning and afternoon albedos for snow and vegetation. Plans for extending these spectral albedo measurements to longer wavelengths and to additional sites are described.
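The NDVI referred to above follows the standard red/near-infrared normalized difference. As a minimal sketch (the reflectance values below are illustrative, not data from the study):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Dense green vegetation: high NIR, low red reflectance -> NDVI near 0.8.
print(ndvi(0.45, 0.05))
# Senescent grass: the red/NIR contrast, and hence NDVI, is much weaker.
print(ndvi(0.30, 0.20))
```

The normalization by the band sum is what makes the index sensitive to vegetation state while being relatively insensitive to overall illumination level.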
Sensor On-orbit Calibration and Characterization Using Spacecraft Maneuvers
NASA Technical Reports Server (NTRS)
Xiong, X.; Butler, Jim; Barnes, W. L.; Guenther, B.
2007-01-01
Spacecraft flight operations often require activities that involve different kinds of maneuvers for orbital adjustments (pitch, yaw, and roll). Different maneuvers, when properly planned and scheduled, can also be applied to support and/or perform on-board sensor calibration and characterization. This paper uses MODIS (Moderate Resolution Imaging Spectroradiometer) as an example to illustrate applications of spacecraft maneuvers for the on-orbit calibration and characterization of Earth-observing sensors. MODIS is one of the key instruments of NASA's Earth Observing System (EOS), currently operated on board the EOS Terra and Aqua spacecraft launched in December 1999 and May 2002, respectively. Since launch, both the Terra and Aqua spacecraft have made a number of maneuvers, especially yaw and roll maneuvers, to support MODIS on-orbit calibration and characterization. For both Terra and Aqua MODIS, near-monthly spacecraft roll maneuvers are executed for lunar observations. These maneuvers are carefully scheduled so that the lunar phase angles are nearly identical for each sensor's lunar observations. The lunar observations are used to track the calibration stability of the MODIS reflective solar bands (RSB) and to inter-compare Terra and Aqua MODIS RSB calibration consistency. To date, two sets of yaw maneuvers (each consisting of two series of 8 consecutive yaws) by the Terra spacecraft and one set by the Aqua spacecraft have been performed to validate the MODIS solar diffuser (SD) bi-directional reflectance factor (BRF) and to derive the SD screen transmission. Terra spacecraft pitch maneuvers, the first made on March 26, 2003 and the second on April 14, 2003 (with the Moon in the spacecraft nadir view), have been applied to characterize the MODIS thermal emissive bands (TEB) response versus scan angle (RVS). This is particularly important since the pre-launch TEB RVS measurements made by the sensor vendor were not successful. 
The Terra MODIS TEB RVS obtained from pitch maneuvers has been used in the current L1B calibration algorithm. Lunar observations from pitch maneuvers also provided information to cross-calibrate MODIS with other sensors (MISR and ASTER) on the same platform. We will provide a summary of MODIS maneuver activities and their applications for MODIS calibration and characterization. The results and lessons learned from MODIS maneuver activities discussed in this paper will provide useful insights into future spacecraft and sensor operations.
Earth Observations taken by the Expedition 27 Crew
2011-05-02
ISS027-E-020395 (2 May 2011) --- Avachinsky Volcano, Kamchatka Peninsula, Russia is featured in this image photographed by an Expedition 27 crew member on the International Space Station. The Kamchatka Peninsula, located along the Pacific "Ring of Fire", includes more than 100 identified volcanoes. While most of these volcanoes are not actively erupting, many are considered to be dangerous due to their past eruptive history and proximity to population centers and air travel corridors. This detailed photograph highlights the summit crater and snow-covered upper slopes of the Avachinsky stratovolcano exposed above a surrounding cloud deck. The 2,741-meter-high Avachinsky volcano has an extensive historical and geological record of eruptions with the latest activity observed in 2008. The large city of Petropavlovsk, Kamchatka is located approximately 25 kilometers to the southwest and, according to scientists, is built over approximately 30,000-40,000 year old debris avalanche deposits that originated from Avachinsky, suggesting that the city may be at risk from a similar hazard in the future. To the southeast (right), the large breached crater of Kozelsky Volcano is also visible above the clouds. Kozelsky is a parasitic cone, formed by the eruption of material from vents along the flank of Avachinsky volcano. The topography of the volcanoes is accentuated by shadows produced by the relatively low sun angle, and by the oblique viewing angle. Oblique images are taken looking outwards at an angle from the International Space Station, rather than the "straight down" (or nadir) view typical of most orbital Earth-observing sensor systems.
Vargas-Meléndez, Leandro; Boada, Beatriz L.; Boada, María Jesús L.; Gauchía, Antonio; Díaz, Vicente
2016-01-01
This article presents a novel estimator based on sensor fusion, which combines a Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a “pseudo-roll angle” from variables that are easily measured by Inertial Measurement Unit (IMU) sensors. An IMU is a device commonly used for vehicle motion detection, and its cost has decreased in recent years. The pseudo-roll angle is fed to the Kalman filter in order to filter noise and minimize the norm and maximum of the estimation error. The NN has been trained on J-turn, double-lane-change and lane-change maneuvers at different speeds and road friction coefficients. The proposed method takes the vehicle non-linearities into account, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator. PMID:27589763
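A minimal sketch of the fusion idea: a scalar filter that predicts the roll angle by integrating the IMU roll rate and corrects with the network's pseudo-roll angle treated as a noisy measurement. The noise variances `q` and `r`, the time step, and the synthetic signals are invented for illustration; the paper does not specify its filter at this level of detail:

```python
import numpy as np

def kalman_roll(pseudo_roll, roll_rate, dt=0.01, q=1e-4, r=1e-2):
    """Scalar Kalman filter: predict roll by integrating the gyro rate,
    correct with the noisy 'pseudo-roll angle' measurement."""
    phi, p = pseudo_roll[0], 1.0
    out = []
    for z, w in zip(pseudo_roll, roll_rate):
        phi += w * dt          # predict with gyro roll rate
        p += q
        k = p / (p + r)        # Kalman gain
        phi += k * (z - phi)   # correct with pseudo-roll measurement
        p *= (1 - k)
        out.append(phi)
    return np.array(out)

# Synthetic maneuver: sinusoidal roll, noisy pseudo-roll from the "NN".
rng = np.random.default_rng(0)
true_roll = 0.1 * np.sin(np.linspace(0, 2 * np.pi, 500))
rate = np.gradient(true_roll, 0.01)
noisy = true_roll + 0.05 * rng.standard_normal(500)
est = kalman_roll(noisy, rate)
print(np.std(est - true_roll) < np.std(noisy - true_roll))
```

Because the gyro-driven prediction is smooth, the filter leans on it between corrections, attenuating the measurement noise of the pseudo-roll angle.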
Vehicle Fault Diagnose Based on Smart Sensor
NASA Astrophysics Data System (ADS)
Zhining, Li; Peng, Wang; Jianmin, Mei; Jianwei, Li; Fei, Teng
In a vehicle's traditional fault-diagnosis system, a computer with an A/D card is typically connected to many sensors. The disadvantages of this arrangement are that the sensors can hardly be shared with the control system and other systems, there are too many connecting wires, and electromagnetic compatibility (EMC) suffers. In this paper, a smart speed sensor, smart acoustic pressure sensor, smart oil pressure sensor, smart acceleration sensor and smart order-tracking sensor were designed to solve this problem. Via the CAN bus, these smart sensors, the fault-diagnosis computer and other computers can be connected into a network that monitors and controls the vehicle's diesel engine and other systems without any duplicate sensors. The hardware and software of the smart sensor system are introduced. The oil pressure, vibration and acoustic signals are resampled at constant angle increments to eliminate the influence of varying rotation speed. After resampling, the signal in every working cycle can be averaged in the angle domain and subjected to further analyses such as order spectra.
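The constant-angle-increment resampling described above can be sketched by integrating the measured shaft speed into a cumulative angle and interpolating the time-domain signal onto a uniform angle grid. This is a hypothetical illustration (assuming the speed signal is already available from the smart speed sensor), not the authors' implementation:

```python
import numpy as np

def resample_to_angle(signal, t, rpm, samples_per_rev=256):
    """Resample a time-domain signal at constant shaft-angle increments.
    rpm gives the instantaneous shaft speed at each time sample."""
    angle = np.cumsum(np.r_[0.0, np.diff(t)] * rpm / 60.0)  # revolutions
    n_rev = int(np.floor(angle[-1]))
    uniform_angle = np.arange(n_rev * samples_per_rev) / samples_per_rev
    return np.interp(uniform_angle, angle, signal)

# A vibration component locked to shaft order 2 stays strictly periodic
# in the angle domain even while the speed ramps up.
t = np.linspace(0, 1, 5000)
rpm = 600 + 600 * t                        # speed ramp 600 -> 1200 rpm
angle_true = np.cumsum(np.r_[0.0, np.diff(t)] * rpm / 60.0)
x = np.sin(2 * np.pi * 2 * angle_true)     # order-2 component
y = resample_to_angle(x, t, rpm)
```

An FFT of the angle-domain signal `y` then concentrates the order-2 component in a single order-spectrum bin despite the speed ramp, which is the point of eliminating the rotation-speed influence before cycle averaging.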
Estimation Filter for Alignment of the Spitzer Space Telescope
NASA Technical Reports Server (NTRS)
Bayard, David
2007-01-01
A document presents a summary of an onboard estimation algorithm now being used to calibrate the alignment of the Spitzer Space Telescope (formerly known as the Space Infrared Telescope Facility). The algorithm, denoted the S2P calibration filter, recursively generates estimates of the alignment angles between a telescope reference frame and a star-tracker reference frame. At several discrete times during the day, the filter accepts, as input, attitude estimates from the star tracker and observations taken by the Pointing Control Reference Sensor (a sensor in the field of view of the telescope). The output of the filter is a calibrated quaternion that represents the best current mean-square estimate of the alignment angles between the telescope and the star tracker. The S2P calibration filter incorporates a Kalman filter that tracks six states - two for each of three orthogonal coordinate axes. Although, in principle, one state per axis is sufficient, the use of two states per axis makes it possible to model both short- and long-term behaviors. Specifically, the filter properly models transient learning, characteristic times and bounds of thermomechanical drift, and long-term steady-state statistics, whether calibration measurements are taken frequently or infrequently. These properties ensure that the S2P filter performance is optimal over a broad range of flight conditions, and can be confidently run autonomously over several years of in-flight operation without human intervention.
NASA Technical Reports Server (NTRS)
Petty, Grant W.; Katsaros, Kristina B.
1994-01-01
Based on a geometric optics model and the assumption of an isotropic Gaussian surface slope distribution, the component of ocean surface microwave emissivity variation due to large-scale surface roughness is parameterized for the frequencies and approximate viewing angle of the Special Sensor Microwave/Imager. Independent geophysical variables in the parameterization are the effective (microwave frequency dependent) slope variance and the sea surface temperature. Using the same physical model, the change in the effective zenith angle of reflected sky radiation arising from large-scale roughness is also parameterized. Independent geophysical variables in this parameterization are the effective slope variance and the atmospheric optical depth at the frequency in question. Both of the above model-based parameterizations are intended for use in conjunction with empirical parameterizations relating effective slope variance and foam coverage to near-surface wind speed. These empirical parameterizations are the subject of a separate paper.
Overcoming turbulence-induced space-variant blur by using phase-diverse speckle.
Thelen, Brian J; Paxman, Richard G; Carrara, David A; Seldin, John H
2009-01-01
Space-variant blur occurs when imaging through volume turbulence over sufficiently large fields of view. Space-variant effects are particularly severe in horizontal-path imaging, slant-path (air-to-ground or ground-to-air) geometries, and ground-based imaging of low-elevation satellites or astronomical objects. In these geometries, the isoplanatic angle can be comparable to or even smaller than the diffraction-limited resolution angle. We report on a postdetection correction method that seeks to correct for the effects of space-variant aberrations, with the goal of reconstructing near-diffraction-limited imagery. Our approach has been to generalize the method of phase-diverse speckle (PDS) by using a physically motivated distributed-phase-screen model. Simulation results are presented that demonstrate the reconstruction of near-diffraction-limited imagery under both matched and mismatched model assumptions. In addition, we present evidence that PDS could be used as a beaconless wavefront sensor in a multiconjugate adaptive optics system when imaging extended scenes.
Experimental teaching and training system based on volume holographic storage
NASA Astrophysics Data System (ADS)
Jiang, Zhuqing; Wang, Zhe; Sun, Chan; Cui, Yutong; Wan, Yuhong; Zou, Rufei
2017-08-01
The experiment of volume holographic storage for teaching and for training the practical ability of senior students in Applied Physics is introduced. Through this experiment the students learn to use advanced optoelectronic devices and automatic control methods, and deepen their understanding of the theoretical knowledge of optical information processing and photonics studied in earlier courses. In the experiment, multiplexed holographic recording and readout is based on the Bragg selectivity of a volume holographic grating, in which the Bragg diffraction angle depends on the grating-recording angle. By using different interference angles between the reference and object beams, holograms can be recorded in a photorefractive crystal, and the object images can then be read out from these holograms via angular addressing using the original reference beam. In this system, experimental data acquisition and control of the optoelectronic devices, such as shutter on/off, image loading in the SLM and image acquisition with a CCD sensor, are automated using LabVIEW programming.
Simulations of Convection Zone Flows and Measurements from Multiple Viewing Angles
NASA Technical Reports Server (NTRS)
Duvall, Thomas L.; Hanasoge, Shravan
2011-01-01
A deep-focusing time-distance measurement technique has been applied to linear acoustic simulations of a solar interior perturbed by convective flows. The simulations are for the full sphere for r/R greater than 0.2. From these it is straightforward to simulate the observations from different viewing angles and to test how multiple viewing angles enhance detectability. Some initial results will be presented.
High-precision micro-displacement optical-fiber sensor based on surface plasmon resonance.
Zhu, Zongda; Liu, Lu; Liu, Zhihai; Zhang, Yu; Zhang, Yaxun
2017-05-15
We propose and demonstrate a novel optical-fiber micro-displacement sensor based on surface plasmon resonance (SPR) by fabricating a Kretschmann configuration on graded-index multimode fiber (GIMMF). We employ a single-mode fiber to change the radial position of the incident beam as the displacement. In the GIMMF, the angle between the light beam and the fiber axis, which is closely related to the resonance angle, is changed by the displacement; thus, the resonance wavelength of the fiber SPR shifts. This micro-displacement fiber sensor has a wide detection range of 0-25 μm, a high sensitivity of up to 10.32 nm/μm, and a nanometer resolution down to 2 nm, surpassing almost all other optical-fiber micro-displacement sensors. In addition, we show that increasing the fiber polishing angle or the medium refractive index can improve the sensitivity. This micro-displacement sensor has great significance for many industrial applications and provides a novel, rapid, and accurate optical method for micro-displacement measurement.
NASA Astrophysics Data System (ADS)
Gomes Leal-Junior, Arnaldo; Frizera-Neto, Anselmo; José Pontes, Maria; Rodrigues Botelho, Thomaz
2017-12-01
Polymer optical fiber (POF) curvature sensors present some advantages over conventional techniques for angle measurements, such as their light weight, compactness and immunity to electromagnetic fields. However, high hysteresis can occur in POF curvature sensors due to the viscoelastic response of the polymer. In order to overcome this limitation, this paper shows how the sensor hysteresis can be compensated by a calibration equation relating the measured output signal to the sensor's angular velocity. The proposed method is validated using an exoskeleton with an active knee joint for flexion and extension rehabilitation exercises. The results show a decrease in sensor hysteresis and a more than twofold decrease in the error between the POF sensor and the potentiometer employed for the angle measurement of the exoskeleton knee joint.
Esthetic smile preferences and the orientation of the maxillary occlusal plane.
Kattadiyil, Mathew T; Goodacre, Charles J; Naylor, W Patrick; Maveli, Thomas C
2012-12-01
The anteroposterior orientation of the maxillary occlusal plane has an important role in the creation, assessment, and perception of an esthetic smile. However, the effect of the angle at which this plane is visualized (the viewing angle) in a broad smile has not been quantified. The purpose of this study was to assess the esthetic preferences of dental professionals and nondentists by using 3 viewing angles of the anteroposterior orientation of the maxillary occlusal plane. After Institutional Review Board approval, standardized digital photographic images of the smiles of 100 participants were recorded by simultaneously triggering 3 cameras set at different viewing angles. The top camera was positioned 10 degrees above the occlusal plane (camera #1, Top view); the center camera was positioned at the level of the occlusal plane (camera #2, Center view); and the bottom camera was located 10 degrees below the occlusal plane (camera #3, Bottom view). Forty-two dental professionals and 31 nondentists (persons from the general population) independently evaluated digital images of each participant's smile captured from the Top view, Center view, and Bottom view. The 73 evaluators were asked individually through a questionnaire to rank the 3 photographic images of each patient as 'most pleasing,' 'somewhat pleasing,' or 'least pleasing,' with most pleasing being the most esthetic view and the preferred orientation of the occlusal plane. The resulting esthetic preferences were statistically analyzed by using the Friedman test. In addition, the participants were asked to rank their own images from the 3 viewing angles as 'most pleasing,' 'somewhat pleasing,' and 'least pleasing.' The 73 evaluators found statistically significant differences in the esthetic preferences between the Top and Bottom views and between the Center and Bottom views (P<.001). No significant differences were found between the Top and Center views. 
The Top position was marginally preferred over the Center, and both were significantly preferred over the Bottom position. When the participants evaluated their own smiles, a significantly greater number (P<.001) preferred the Top view over the Center or Bottom views. No significant differences were found in preferences based on the demographics of the evaluators when comparing age, education, gender, profession, and race. The esthetic preference for the maxillary occlusal plane was influenced by the viewing angle, with the higher (Top) and Center views preferred by both dental and nondental evaluators. The participants themselves preferred the higher view of their smile significantly more often than the center or lower angle views (P<.001).
Large-viewing-angle electroholography by space projection
NASA Astrophysics Data System (ADS)
Sato, Koki; Obana, Kazuki; Okumura, Toshimichi; Kanaoka, Takumi; Nishikawa, Satoko; Takano, Kunihiko
2004-06-01
A hologram provides a full-parallax 3D image, which appears more natural because focusing and convergence coincide. We aim at a practical electroholography system, since in conventional electroholography the image viewing angle is very small owing to the limited display pixel size. We are developing a new method that achieves a large viewing angle by space projection. A white laser illuminates a single DMD panel (a time-shared CGH of the three RGB colors), and a 3D space screen formed from very fine water particles reconstructs the 3D image with a large viewing angle through scattering by the water particles.
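The pixel-size limitation mentioned above follows from the grating equation: the maximum viewing-zone angle of a pixelated hologram is 2·arcsin(λ/2p) for wavelength λ and pixel pitch p. A quick check of this standard relation (the example numbers are typical SLM values, not figures from this paper):

```python
import math

def viewing_zone_angle_deg(wavelength_m, pixel_pitch_m):
    """Maximum viewing-zone angle of a pixelated hologram display,
    theta = 2*asin(lambda / (2*p)), from the grating equation."""
    return math.degrees(2 * math.asin(wavelength_m / (2 * pixel_pitch_m)))

# An 8-um-pitch SLM at 532 nm yields a viewing zone of only a few degrees,
# which is why conventional electroholography has such a narrow view angle.
print(round(viewing_zone_angle_deg(532e-9, 8e-6), 2))
```

Halving the pixel pitch roughly doubles the viewing zone, which is why approaches that effectively increase display resolution (or, as here, project into scattering space) are needed for wide-angle viewing.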
Takaki, Yasuhiro; Hayashi, Yuki
2008-07-01
The narrow viewing zone angle is one of the problems associated with electronic holography. We propose a technique that enables the ratio of horizontal and vertical resolutions of a spatial light modulator (SLM) to be altered. This technique increases the horizontal resolution of a SLM several times, so that the horizontal viewing zone angle is also increased several times. A SLM illuminated by a slanted point light source array is imaged by a 4f imaging system in which a horizontal slit is located on the Fourier plane. We show that the horizontal resolution was increased four times and that the horizontal viewing zone angle was increased approximately four times.
C-band backscattering from corn canopies
NASA Technical Reports Server (NTRS)
Daughtry, C. S. T.; Ranson, K. J.; Biehl, L. L.
1991-01-01
A frequency-modulated continuous-wave C-band (4.8 GHz) scatterometer was mounted on an aerial lift truck, and backscatter coefficients of corn (Zea mays L.) were acquired as functions of polarization, view angle, and row direction. As phytomass and green-leaf area index increased, the backscatter also increased. Near anthesis, when the canopies were fully developed, the major scattering elements were located in the upper 1 m of the 2.8-m-tall canopy and little backscatter was measured below that level for view angles of 30 deg or greater. C-band backscatter data could provide information to monitor tillage operations at small view zenith angles and vegetation at large view zenith angles.
The effect of viewing angle on the spectral behavior of a Gd plasma source near 6.7 nm
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Gorman, Colm; Li Bowen; Cummins, Thomas
2012-04-02
We have demonstrated the effect of viewing angle on the extreme ultraviolet (EUV) emission spectra of gadolinium (Gd) near 6.7 nm. The spectra are shown to have a strong dependence on viewing angle when produced with a laser pulse duration of 10 ns, which may be attributed to absorption by low ion stages of Gd and an angular variation in the ion distribution. Absorption effects are less pronounced at a 150-ps pulse duration due to reduced opacity resulting from plasma expansion. Thus for evaluating source intensity, it is necessary to allow for variation with both viewing angle and target orientation.
Silicon Based Schottky Barrier Infrared Sensors For Power System And Industrial Applications
NASA Astrophysics Data System (ADS)
Elabd, Hammam; Kosonocky, Walter F.
1984-03-01
Schottky barrier infrared charge coupled device sensors (IR-CCDs) have been developed. PtSi Schottky barrier detectors require cooling to liquid-nitrogen temperature and cover the wavelength range between 1 and 6 μm. The PtSi IR-CCDs can be used in industrial thermography with NEΔT below 0.1°C. Pd2Si Schottky barrier detectors require cooling to 145 K and cover the spectral range between 1 and 3.5 μm. Pd2Si IR-CCDs can be used in imaging high-temperature scenes with NEΔT around 100°C. Several high-density staring area and line imagers are available. Both interlaced and noninterlaced area imagers can be operated with variable and TV-compatible frame rates as well as various field-of-view angles. The advantages of silicon fabrication technology in terms of cost and high-density structures open the door to the design of special-purpose thermal camera systems for a number of power system and industrial applications.
NASA Astrophysics Data System (ADS)
Kassem, A.; Sawan, M.; Boukadoum, M.; Haidar, A.
2005-12-01
We are concerned with the design, implementation, and validation of a perception SoC based on an ultrasonic array of sensors. The proposed SoC is dedicated to ultrasonic echography applications. A rapid prototyping platform is used to implement and validate the new architecture of the digital signal processing (DSP) core. The proposed DSP core efficiently integrates all of the necessary ultrasonic B-mode processing modules. It includes digital beamforming, quadrature demodulation of RF signals, digital filtering, and envelope detection of the received signals. This system handles 128 scan lines and 6400 samples per scan line over its full angle-of-view span. The design uses a minimum-size lookup memory to store the initial scan information. Rapid prototyping using an ARM/FPGA combination is used to validate the operation of the described system. This system offers significant advantages of portability and a rapid time to market.
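The B-mode chain named in the abstract (quadrature demodulation of the RF signal, digital filtering, envelope detection) can be sketched for a single scan line. This is an illustrative floating-point reconstruction in Python/NumPy, not the SoC's actual fixed-point DSP core; the sampling rate, carrier frequency, and moving-average filter length are assumed values.

```python
import numpy as np

def envelope_detect(rf, fs, fc, num_taps=64):
    """Quadrature demodulation + envelope detection of one RF scan line.

    rf : sampled RF echo signal, fs : sampling rate (Hz),
    fc : transducer center frequency (Hz).
    Returns the baseband envelope (the B-mode amplitude).
    """
    t = np.arange(len(rf)) / fs
    # Mix down with quadrature carriers to obtain I/Q components.
    i = rf * np.cos(2 * np.pi * fc * t)
    q = -rf * np.sin(2 * np.pi * fc * t)
    # A simple moving-average low-pass filter removes the 2*fc term.
    lp = np.ones(num_taps) / num_taps
    i_bb = np.convolve(i, lp, mode="same")
    q_bb = np.convolve(q, lp, mode="same")
    # The envelope is the magnitude of the complex baseband signal.
    return 2 * np.sqrt(i_bb**2 + q_bb**2)
```

Feeding a synthetic echo with a known slowly varying envelope through this chain recovers that envelope, which is the quantity a B-mode display maps to gray level.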
Occupant UV Exposure Measurements for Upper-Room Ultraviolet Germicidal Irradiation
Milonova, Sonya; Rudnick, Stephen; McDevitt, James; Nardell, Edward
2016-01-01
The threshold limit value (TLV) guideline for ultraviolet (UV) radiation specifies that irradiance measurements to ensure occupant safety be taken over an angle of 80° at the sensor. The purpose of this study was to evaluate the effect of an 80° field of view (FOV) tube on lower room UV-C irradiation measurements. Measurements were made in an experimental chamber at a height of 1.73 m with and without an FOV tube. The FOV tube reduced the lower room irradiance readings by 18-34%, a statistically significant reduction compared to the bare sensor. An 80° FOV tube should be used for lower room irradiance measurements to comply with the TLV guideline. The resulting lower readings would allow more UV-C radiation in the upper room without compromising occupant safety. More UV-C radiation in the upper room could increase efficacy of UVGI systems for reducing transmission of airborne infectious diseases. In addition, recommendations are made to standardize lower room irradiance measurement techniques. PMID:27038734
The Transition-Edge-Sensor Array for the Micro-X Sounding Rocket
NASA Technical Reports Server (NTRS)
Eckart, M. E.; Adams, J. S.; Bailey, C. N.; Bandler, S. R.; Busch, Sarah Elizabeth; Chervenak, J. A.; Finkbeiner, F. M.; Kelley, R. L.; Kilbourne, C. A.; Porst, J. P.;
2012-01-01
The Micro-X sounding rocket program will fly a 128-element array of transition-edge-sensor microcalorimeters to enable high-resolution X-ray imaging spectroscopy of the Puppis-A supernova remnant. To match the angular resolution of the optics while maximizing the field-of-view and retaining a high energy resolution (< 4 eV at 1 keV), we have designed the pixels using 600 x 600 sq. micron Au/Bi absorbers, which overhang 140 x 140 sq. micron Mo/Au sensors. The data-rate capabilities of the rocket telemetry system require the pulse decay to be approximately 2 ms to allow a significant portion of the data to be telemetered during flight. Here we report experimental results from the flight array, including measurements of energy resolution, uniformity, and absorber thermalization. In addition, we present studies of test devices that have a variety of absorber contact geometries, as well as a variety of membrane-perforation schemes designed to slow the pulse decay time to match the telemetry requirements. Finally, we describe the reduction in pixel-to-pixel crosstalk afforded by an angle-evaporated Cu backside heatsinking layer, which provides Cu coverage on the four sidewalls of the silicon wells beneath each pixel.
Automated comprehensive Adolescent Idiopathic Scoliosis assessment using MVC-Net.
Wu, Hongbo; Bailey, Chris; Rasoulinejad, Parham; Li, Shuo
2018-05-18
Automated quantitative estimation of spinal curvature is an important task for the ongoing evaluation and treatment planning of Adolescent Idiopathic Scoliosis (AIS). It addresses the widely accepted disadvantage of manual Cobb angle measurement (time-consuming and unreliable), which is currently the gold standard for AIS assessment. Attempts have been made to improve the reliability of automated Cobb angle estimation. However, it is very challenging to achieve accurate and robust estimation of Cobb angles due to the need for correctly identifying all the required vertebrae in both Anterior-posterior (AP) and Lateral (LAT) view x-rays. The challenge is especially evident in LAT x-rays, where occlusion of vertebrae by the ribcage occurs. We therefore propose a novel Multi-View Correlation Network (MVC-Net) architecture that can provide a fully automated end-to-end framework for spinal curvature estimation in multi-view (both AP and LAT) x-rays. The proposed MVC-Net uses our newly designed multi-view convolution layers to incorporate joint features of multi-view x-rays, which allows the network to mitigate the occlusion problem by utilizing the structural dependencies of the two views. The MVC-Net consists of three closely-linked components: (1) a series of X-modules for joint representation of spinal structure, (2) a Spinal Landmark Estimator network for robust spinal landmark estimation, and (3) a Cobb Angle Estimator network for accurate Cobb angle estimation. By utilizing an iterative multi-task training algorithm to train the Spinal Landmark Estimator and Cobb Angle Estimator in tandem, the MVC-Net leverages the multi-task relationship between landmark and angle estimation to reliably detect all the required vertebrae for accurate Cobb angle estimation.
Experimental results on 526 x-ray images from 154 patients show an impressive 4.04° Circular Mean Absolute Error (CMAE) in AP Cobb angle and 4.07° CMAE in LAT Cobb angle estimation, which demonstrates the MVC-Net's capability of robust and accurate estimation of Cobb angles in multi-view x-rays. Our method therefore provides clinicians with a framework for efficient, accurate, and reliable estimation of spinal curvature for comprehensive AIS assessment. Copyright © 2018. Published by Elsevier B.V.
Reenalda, Jasper; Maartens, Erik; Homan, Lotte; Buurke, J H Jaap
2016-10-03
Recent developments in wearable and wireless sensor technology allow for a continuous three-dimensional analysis of running mechanics in the sport-specific setting. The present study is the first to demonstrate the possibility of analyzing three-dimensional (3D) running mechanics continuously, by means of inertial magnetic measurement units, to objectify changes in mechanics over the course of a marathon. Three well-trained male distance runners ran a marathon while equipped with inertial magnetic measurement units on the trunk, pelvis, upper legs, lower legs and feet to obtain a 3D view of running mechanics and to assess changes in running mechanics over the course of a marathon. Data were continuously recorded during the entire 42.2 km (26.2 miles) of the marathon. Data from the individual sensors were transmitted wirelessly to a receiver, mounted on the handlebar of an accompanying cyclist. Anatomical calibration was performed using both static and dynamic procedures, and sensor orientations were thus converted to body segment orientations by means of transformation matrices obtained from the segment calibration. Joint angle (hip, knee and ankle) trajectories as well as center of mass (COM) trajectory and acceleration were derived from the sensor data after segment calibration. Repeated-measures one-way ANOVAs with Tukey post-hoc tests were used to statistically analyze differences in the defined kinematic parameters (max hip angle, peak knee flexion at mid-stance and at mid-swing, ankle angle at initial contact, and COM vertical displacement and acceleration), averaged over 100 strides, between the first and the last stages (8 and 40 km) of the marathon. Significant changes in running mechanics were witnessed between the first and the last stage of the marathon. This study showed the possibility of performing a 3D kinematic analysis of running technique, in the sport-specific setting, by using inertial magnetic measurement units.
For the three runners analyzed, significant changes were observed in running mechanics over the course of a marathon. The present measurement technique therefore allows for more in-depth study of running mechanics outside the laboratory setting. Copyright © 2016 Elsevier Ltd. All rights reserved.
Analysis of Opportunities for Intercalibration Between Two Spacecraft. Chapter 1
NASA Technical Reports Server (NTRS)
Roithmayr, Carlos M.; Speth, Paul W.
2012-01-01
There is currently a strong interest in obtaining highly accurate measurements of solar radiation reflected by Earth. For example, the Traceable Radiometry Underpinning Terrestrial- and Helio-Studies (TRUTHS) satellite mission has been under consideration in Europe for several years, and planning is now under way for the Climate Absolute Radiance and Refractivity Observatory (CLARREO) spacecraft in the United States. Such spacecraft will provide measurements whose high accuracy is traceable to SI standards; these measurements will be useful as a reference for calibrating similar instruments on board other spacecraft. Hence, analysis of opportunities for intercalibration between two spacecraft plays an important role in the planning of future missions. In order for intercalibration to take place, the measurements obtained from two spacecraft must have similar viewing geometry and be taken within a few minutes of one another. Viewing geometry is characterized in terms of viewing zenith angle, solar zenith angle, and relative azimuth angle. Opportunities for intercalibration are greater in number and longer in duration if the sensor with high accuracy can be aimed at points on the surface of the Earth other than the nadir or sub-satellite point. Analysis of intercalibration over long periods is rendered tractable by making several simplifying assumptions regarding orbital motions of the two spacecraft about Earth, as well as Earth's orbit about the Sun. The shape of the Earth is also considered. A geometric construction called a tent is introduced to facilitate analysis. It is helpful to think of an intercalibration opportunity as the passage of one spacecraft through a tent that has a fixed shape and moves with the spacecraft whose measurements are to be calibrated. Selection of points on Earth's surface as targets for measurement is discussed, as is aiming the boresight of a steerable instrument.
Analysis results for a pair of spacecraft in typical low Earth orbits are provided.
Qin, Zong; Wang, Kai; Chen, Fei; Luo, Xiaobing; Liu, Sheng
2010-08-02
In this research, the conditions for uniform lighting generated by an array of large-view-angle LEDs were studied. The luminous intensity distribution of such an LED is not monotonically decreasing with view angle. An LED with a freeform lens was designed as an example for analysis. For a system based on the LEDs designed in house, with a thickness of 20 mm and a rectangular arrangement, the condition for uniform lighting was derived, and the analytical results demonstrated that the uniformity does not decrease monotonically with increasing LED-to-LED spacing. The illuminance uniformities were calculated with Monte Carlo ray-tracing simulations, and the uniformity was found, counterintuitively, to increase with increasing LED-to-LED spacing at certain spacings. Another type of large-view-angle LED and different arrangements were discussed in addition. Both analysis and simulation results showed that the method is applicable to the design of LED-array lighting systems based on large-view-angle LEDs.
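The kind of uniformity calculation described can be sketched with the inverse-square cosine law. The generalized Lambertian intensity I(theta) = I0 * cos(theta)**m used below is an assumed stand-in for the authors' freeform-lens distribution, and the array geometry values are illustrative only.

```python
import numpy as np

def illuminance(grid_x, grid_y, led_xy, h, m=1):
    """Illuminance on a target plane at distance h below an LED array.

    Each LED is modeled with a generalized Lambertian intensity
    I(theta) = I0 * cos(theta)**m (I0 taken as 1).  Point illuminance
    then follows the inverse-square cosine law:
        E = I(theta) * cos(theta) / r**2
    summed over all LEDs at positions led_xy.
    """
    E = np.zeros_like(grid_x, dtype=float)
    for (lx, ly) in led_xy:
        r2 = (grid_x - lx) ** 2 + (grid_y - ly) ** 2 + h ** 2
        cos_t = h / np.sqrt(r2)          # cosine of the view angle
        E += cos_t ** m * cos_t / r2     # intensity * cos(theta) / r^2
    return E

def uniformity(E):
    """Min-to-average illuminance ratio over the evaluated region."""
    return E.min() / E.mean()
```

Sweeping the LED-to-LED spacing and evaluating `uniformity` over the lit region reproduces the type of spacing-versus-uniformity curve the paper derives analytically.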
NASA Astrophysics Data System (ADS)
Sadat, Mojtaba T.; Viti, Francesco
2015-02-01
Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen from the exploitation of Aerial Vehicles (AVs) in delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
Optical inverse-square displacement sensor
Howe, Robert D.; Kychakoff, George
1989-01-01
This invention comprises an optical displacement sensor that uses the inverse-square attenuation of light reflected from a diffuse surface to calculate the distance from the sensor to the reflecting surface. Light emerging from an optical fiber or the like is directed onto the surface whose distance is to be measured. The intensity I of reflected light is angle dependent, but within a sufficiently small solid angle it falls off as the inverse square of the distance from the surface. At least a pair of optical detectors are mounted to detect the reflected light within the small solid angle, their ends being at different distances R and R+ΔR from the surface. The distance R can then be found in terms of the ratio of the intensity measurements I₁/I₂ and the separation length ΔR as R = ΔR/(√(I₁/I₂) − 1).
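Numerically, the inverse-square relation between the two detector readings inverts directly to the distance. A minimal sketch, treating the two intensities as ideal inverse-square readings:

```python
def distance_from_intensity_ratio(i_near, i_far, delta_r):
    """Distance R to a diffuse surface from two reflected-light
    intensity readings taken at distances R and R + delta_r.

    With I proportional to 1/R**2 inside a small solid angle,
        i_near / i_far = ((R + delta_r) / R)**2,
    so solving for R gives
        R = delta_r / (sqrt(i_near / i_far) - 1).
    """
    ratio = (i_near / i_far) ** 0.5
    return delta_r / (ratio - 1.0)
```

For example, with the surface 10 units away and detectors separated by 2 units, the intensity ratio is (12/10)**2 = 1.44 and the formula returns exactly 10.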
NASA Astrophysics Data System (ADS)
Barbosa, H. M.; Martins, J. V.; McBride, B.; Espinosa, R.; Fernandez Borda, R. A.; Remer, L.; Dubovik, O.
2017-12-01
The largest impediments to estimating climate change revolve around a lack of quantitative information on aerosol forcing and our poor understanding of aerosol-cloud processes and cloud feedbacks in the climate system. This is so because global aerosol and cloud data come from satellite sensors that, today, measure limited subsets of the full Stokes parameters. Most measure only spectral intensity at one geometry, or at a severely limited set of geometries, or measure polarization non-simultaneously using a filter wheel, with a low spatial resolution. To overcome this scientific gap, the Laboratory for Aerosols, Clouds and Optics (LACO) of UMBC developed the Hyper Angular Rainbow Polarimeter (HARP): a very simple but highly effective sensor that can simultaneously measure 3 angles of polarization, at 4 different wavelengths, to observe the same target with up to 60 viewing angles, with no moving parts. The HARP-Cubesat mission will fly next January, with the main objective of proving the on-flight capabilities of a highly accurate wide-FOV hyperangle imaging polarimeter for characterizing aerosol and cloud properties. AirHARP is an exact copy of the HARP sensor but adapted to fly on aircraft. Here we report on preliminary aerosol data analysis from its first measurements during the Lake Michigan Ozone Study (LMOS) field campaign last June. We will discuss how the polarization measurements are inverted using the GRASP (Generalized Retrieval of Aerosol and Surface Properties) inversion algorithm to obtain the aerosol size distribution, complex index of refraction and sphericity. For the flights on June 8th and 12th, we will compare the retrievals with those from the Aeronet station LMOS-ZION, specially set up for the campaign.
Development of a high resolution optical-fiber tilt sensor by F-P filter
NASA Astrophysics Data System (ADS)
Pan, Jianjun; Nan, Qiuming; Li, Shujie; Hao, Zhonghua
2017-04-01
A high-resolution tilt sensor is developed, composed of a pair of optical-fiber collimators and a simple pendulum with an F-P filter. The tilt angle is measured by demodulating the shift of the center wavelength of the F-P filter, which is caused by the changing incidence angle. The relationship between tilt angle and center wavelength is deduced. Calibration experiments confirm the deduction and show that a high resolution is easily obtained. With the initial angle set to 6°, the measurement range is ±3°, the average sensitivity is 1104 pm/°, and the average resolution is as fine as 0.0009°.
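Using the reported average sensitivity, the wavelength shift converts linearly to tilt. This back-of-the-envelope sketch assumes local linearity of the calibration curve and a hypothetical 1 pm wavelength-demodulation resolution:

```python
def tilt_from_wavelength_shift(shift_pm, sensitivity_pm_per_deg=1104.0):
    """Tilt angle (degrees) from the F-P center-wavelength shift.

    Assumes the locally linear calibration implied by the reported
    average sensitivity of 1104 pm/degree; a real device would use its
    own calibration curve rather than a single constant.
    """
    return shift_pm / sensitivity_pm_per_deg

# A (hypothetical) 1 pm demodulation resolution then corresponds to
# about 1 / 1104 = 0.0009 degrees of tilt, consistent with the
# reported average resolution.
```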
View angle dependence of cloud optical thicknesses retrieved by MODIS
NASA Technical Reports Server (NTRS)
Marshak, Alexander; Varnai, Tamas
2005-01-01
This study examines whether cloud inhomogeneity influences the view-angle dependence of MODIS cloud optical thickness (tau) retrieval results. The degree of cloud inhomogeneity is characterized through the local gradient in 11 μm brightness temperature. The analysis of liquid-phase clouds in a one-year global dataset of Collection 4 MODIS data reveals that while optical thickness retrievals give remarkably consistent results for all view directions if clouds are homogeneous, they give much higher tau values for oblique views than for overhead views if clouds are inhomogeneous and the sun is fairly oblique. For solar zenith angles larger than 55 deg, the mean optical thickness retrieved for the most inhomogeneous third of cloudy pixels is more than 30% higher for oblique views than for overhead views. After considering a variety of possible scenarios, the paper concludes that the most likely reason for the increase lies in three-dimensional radiative interactions that are not considered in current one-dimensional retrieval algorithms. Namely, the radiative effect of cloud sides viewed at oblique angles seems to contribute most to the enhanced tau values. The results presented here will help understand cloud retrieval uncertainties related to cloud inhomogeneity. They complement the uncertainty estimates that will start accompanying MODIS cloud products in Collection 5 and may eventually help correct for the observed view-angle-dependent biases.
NASA Astrophysics Data System (ADS)
Castro, José J.; Pozo, Antonio M.; Rubiño, Manuel
2013-11-01
In this work we studied the dependence of color on horizontal viewing angle and the colorimetric characterization of two liquid-crystal displays (LCDs) with two different backlights: cold-cathode fluorescent lamps (CCFLs) and light-emitting diodes (LEDs). The LCDs studied had identical resolution, size, and technology (TFT, thin-film transistor). The colorimetric measurements were made with the SpectraScan PR-650 spectroradiometer following the procedure recommended in the European guideline EN 61747-6. For each display, we measured at the centre of the screen the chromaticity coordinates at horizontal viewing angles of 0, 20, 40, 60 and 80 degrees for the achromatic (A), red (R), green (G) and blue (B) channels. Results showed a greater color-gamut area for the display with the LED backlight than for the CCFL backlight, indicating a greater range of colors perceptible by human vision. This color-gamut area diminished with viewing angle for both displays. Larger differences between trends across viewing angles were observed for the LED backlight, especially for the R and G channels, demonstrating a higher variability of the chromaticity coordinates with viewing angle. The best additivity was reached by the LED-backlight display (a lower error percentage). Overall, the LED-backlight display provided better color performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davies, R.
The spatial autocorrelation functions of broad-band longwave and shortwave radiances measured by the Earth Radiation Budget Experiment (ERBE) are analyzed as a function of view angle in an investigation of the general effects of scene inhomogeneity on radiation. For nadir views, the correlation distance of the autocorrelation function is about 900 km for longwave radiance and about 500 km for shortwave radiance, consistent with higher degrees of freedom in shortwave reflection. Both functions rise monotonically with view angle, but there is a substantial difference in the relative angular dependence of the shortwave and longwave functions, especially for view angles less than 50 deg. In this range, the increase with angle of the longwave functions is found to depend only on the expansion of pixel area with angle, whereas the shortwave functions show an additional dependence on angle that is attributed to the occlusion of inhomogeneities by cloud height variations. Beyond a view angle of about 50 deg, both longwave and shortwave functions appear to be affected by cloud sides. The shortwave autocorrelation functions do not satisfy the principle of directional reciprocity, thereby proving that the average scene is horizontally inhomogeneous over the scale of an ERBE pixel (1500 sq km). Coarse stratification of the measurements by cloud amount, however, indicates that the average cloud-free scene does satisfy directional reciprocity on this scale.
NASA Astrophysics Data System (ADS)
Tanaka, Takuro; Takahashi, Hisashi
In some motor applications, it is very difficult to attach a position sensor to the motor inside its housing; the dental handpiece motor is one example. Such designs must run with high efficiency at low speed and under variable load without a position sensor. We developed a method to control a motor efficiently and smoothly at low speed without a position sensor. In this paper, a method in which a permanent-magnet synchronous motor is controlled smoothly and efficiently by means of torque-angle control in synchronized operation is presented. Its usefulness is confirmed by experimental results, which show that the proposed sensorless control method operates very efficiently and smoothly.
Gao, Xiang; Yan, Shenggang; Li, Bin
2017-01-01
Magnetic detection techniques have been widely used in many fields, such as virtual reality, surgical robotics systems, and so on. A large number of methods have been developed to obtain the position of a ferromagnetic target. However, the angular rotation of the target relative to the sensor is rarely studied. In this paper, a new method for localizing a moving object, determining both its position and its rotation angle with three magnetic sensors, is proposed. Trajectory localization estimates for collinear and noncollinear arrangements of the three magnetic sensors were obtained in simulations, and experimental results demonstrated that the position and rotation angle of a ferromagnetic target exhibiting roll, pitch or yaw in its movement could be calculated accurately and effectively with three noncollinear vector sensors. PMID:28892006
Evaluation of electrolytic tilt sensors for wind tunnel model angle-of-attack (AOA) measurements
NASA Technical Reports Server (NTRS)
Wong, Douglas T.
1991-01-01
The results of a laboratory evaluation of three types of electrolytic tilt sensors as potential candidates for model attitude or angle of attack (AOA) measurements in wind tunnel tests are presented. Their performance was also compared with that from typical servo accelerometers used for AOA measurements. Model RG-37 electrolytic tilt sensors were found to have the highest overall accuracy among the three types. Compared with the servo accelerometer, their accuracies are about one order of magnitude worse and each of them cost about two-thirds less. Therefore, the sensors are unsuitable for AOA measurements although they are less expensive. However, the potential for other applications exists where the errors resulting from roll interaction, vibration, and response time are less, and sensor temperature can be controlled.
NASA Astrophysics Data System (ADS)
Van Volkinburg, Kyle R.; Nguyen, Thao; Pegan, Jonathan D.; Khine, Michelle; Washington, Gregory N.
2016-04-01
The shape memory polymer polystyrene (PS) has been used to create complex hierarchical wrinkling in the fabrication of stretchable thin film bimetallic sensors ideal for wearable based gesture monitoring applications. The film has been bonded to the elastomer polydimethylsiloxane (PDMS) and operates as a strain gauge under the general notion of geometric piezoresistivity. The film was subject to tensile, cyclic, and step loading conditions in order to characterize its dynamic behavior. To measure the joint angle of the metacarpophalangeal (MCP) joint on the right index finger, the sensor was adhered to a fitted golf glove above said joint and a motion study was conducted. At maximum joint angle the sensor experienced roughly 23.5% strain. From the study it was found that two simple curves, one while the finger was in flexion and the other while the finger was in extension, were able to predict the joint angle from measured voltage with an average error of 2.99 degrees.
NASA Technical Reports Server (NTRS)
Bekdash, Omar; Norcross, Jason; McFarland, Shane
2015-01-01
Mobility tracking of human subjects while conducting suited operations still remains focused on the external movement of the suit and little is known about the human movement within it. For this study, accelerometers and bend sensitive resistors were integrated into a custom carrier glove to quantify range of motion and dexterity from within the pressurized glove environment as a first stage feasibility study of sensor hardware, integration, and reporting capabilities. Sensors were also placed on the exterior of the pressurized glove to determine if it was possible to compare a glove joint angle to the anatomical joint angle of the subject during tasks. Quantifying human movement within the suit was feasible, with accelerometers clearly detecting movements in the wrist and reporting expected joint angles at maximum flexion or extension postures with repeatability of plus or minus 5 degrees between trials. Bend sensors placed on the proximal interphalangeal and distal interphalangeal joints performed less well. It was not possible to accurately determine the actual joint angle using these bend sensors, but these sensors could be used to determine when the joint was flexed to its maximum and provide a general range of mobility needed to complete a task. Further work includes additional testing with accelerometers and the possible inclusion of hardware such as magnetometers or gyroscopes to more precisely locate the joint in 3D space. We hope to eventually expand beyond the hand and glove and develop a more comprehensive suit sensor suite to characterize motion across more joints (knee, elbow, shoulder, etc.) and fully monitor the human body operating within the suit environment.
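A common way to obtain static joint angles from accelerometers like those described is to project the measured gravity vector. The sketch below is a generic tilt computation under that assumption, not the study's actual processing; the two-sensor flexion difference is likewise a simplifying illustration.

```python
import math

def static_tilt_deg(ax, ay, az):
    """Static tilt angle (degrees) of a sensor from its gravity
    components, valid only when motion is quasi-static so that the
    measured acceleration is dominated by gravity.
    """
    return math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))

def flexion_deg(proximal_accel, distal_accel):
    """Relative flexion across a joint, taken as the difference in
    tilt between sensors mounted on the segments on either side of
    the joint (a simplifying assumption for illustration).
    """
    return static_tilt_deg(*distal_accel) - static_tilt_deg(*proximal_accel)
```

With one sensor flat (gravity entirely on its z axis) and the other rotated 90 degrees (gravity on its x axis), the computed flexion is 90 degrees, consistent with the plus-or-minus-few-degree repeatability reported for maximum flexion/extension postures.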
A new technique for the measurement of surface shear stress vectors using liquid crystal coatings
NASA Technical Reports Server (NTRS)
Reda, Daniel C.; Muratore, J. J., Jr.
1994-01-01
Research has recently shown that liquid crystal coating (LCC) color-change response to shear depends on both shear stress magnitude and direction. Additional research was thus conducted to extend the LCC method from a flow-visualization tool to a surface shear stress vector measurement technique. A shear-sensitive LCC was applied to a planar test surface and illuminated by white light from the normal direction. A fiber optic probe was used to capture light scattered by the LCC from a point on the centerline of a turbulent, tangential-jet flow. Both the relative shear stress magnitude and the relative in-plane view angle between the sensor and the centerline shear vector were systematically varied. A spectrophotometer was used to obtain scattered-light spectra, which were used to quantify the LCC color (dominant wavelength) as a function of shear stress magnitude and direction. At any fixed shear stress magnitude, the minimum dominant wavelength was measured when the shear vector was aligned with and directed away from the observer; changes in the relative in-plane view angle to either side of this vector/observer aligned position resulted in symmetric Gaussian increases in measured dominant wavelength. Based on these results, a vector measurement methodology, involving multiple oblique-view observations of the test surface, was formulated. Under present test conditions, the measurement resolution of this technique was found to be +/- 1 deg for vector orientations and +/- 5% for vector magnitudes. An approach to extend the present methodology to full-surface applications is proposed.
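Because the dominant wavelength rises symmetrically about the aligned direction, the shear-vector orientation can be located at the minimum of the response curve. The reduction below is hypothetical (the paper's multi-oblique-view procedure is only sketched in the abstract): it fits a parabola to dominant-wavelength measurements taken near the minimum and reads off the vertex.

```python
import numpy as np

def shear_direction_deg(view_angles_deg, dominant_wavelength_nm):
    """Estimate the in-plane shear-vector direction as the view angle
    minimizing dominant wavelength.

    Fits a parabola to measurements near the minimum of the symmetric
    response curve; this is an illustrative reduction, not the
    paper's exact calibration procedure.
    """
    a = np.asarray(view_angles_deg, dtype=float)
    w = np.asarray(dominant_wavelength_nm, dtype=float)
    c2, c1, _ = np.polyfit(a, w, 2)   # w ~ c2*a^2 + c1*a + c0
    return -c1 / (2 * c2)             # vertex of the fitted parabola
```

Given synthetic data with a known minimum, the fit recovers the orientation, mirroring the +/- 1 deg orientation resolution the authors report for real spectra.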
Attitude Estimation for Large Field-of-View Sensors
NASA Technical Reports Server (NTRS)
Cheng, Yang; Crassidis, John L.; Markley, F. Landis
2005-01-01
The QUEST measurement noise model for unit vector observations has been widely used in spacecraft attitude estimation for more than twenty years. It was derived under the approximation that the noise lies in the tangent plane of the respective unit vector and is axially symmetrically distributed about the vector. For large field-of-view sensors, however, this approximation may be poor, especially when the measurement falls near the edge of the field of view. In this paper a new measurement noise model is derived based on a realistic noise distribution in the focal-plane of a large field-of-view sensor, which shows significant differences from the QUEST model for unit vector observations far away from the sensor boresight. An extended Kalman filter for attitude estimation is then designed with the new measurement noise model. Simulation results show that with the new measurement model the extended Kalman filter achieves better estimation performance using large field-of-view sensor observations.
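For reference, the QUEST model discussed above assigns a unit-vector observation b a noise covariance confined to b's tangent plane, R = sigma^2 (I - b b^T). A minimal sketch of that construction (the paper's refined focal-plane model is more involved and is not reproduced here):

```python
import numpy as np

def quest_noise_cov(b, sigma):
    """QUEST measurement-noise covariance for a unit-vector observation.

    Under the QUEST approximation the noise lies in the tangent plane
    of the measured unit vector b and is axially symmetric about it,
    giving the (singular) covariance R = sigma**2 * (I - b b^T).
    The paper's point is that this approximation degrades for
    measurements near the edge of a large field of view.
    """
    b = np.asarray(b, dtype=float)
    b = b / np.linalg.norm(b)          # ensure unit length
    return sigma**2 * (np.eye(3) - np.outer(b, b))
```

The resulting matrix annihilates b (no noise along the boresight of the observation) and has trace 2*sigma**2, i.e. variance sigma**2 in each of the two tangent-plane directions.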
Low Frequency Radar Sensor Observations of Tropical Forests in the Panama Canal Area
NASA Technical Reports Server (NTRS)
Imhoff, M. L.; Lawrence, W.; Condit, R.; Wright, J.; Johnson, P.; Hyer, J.; May, L.; Carson, S.; Smith, David E. (Technical Monitor)
2000-01-01
A synthetic aperture radar sensor operating in 5 bands between 80 and 120 MHz was flown over forested areas in the canal zone of the Republic of Panama in an experiment to measure biomass in heavy tropical forests. The sensor is a pulse-coherent SAR flown on a small aircraft and oriented straight down. The doppler history is processed to collect data on the ground in rectangular cells of varying size over a range of incidence angles fore and aft of nadir (+45 to −45 degrees). Sensor data consist of 5 frequency bands with 20 incidence angles per band. Sensor data for 12+ sites were collected over forest stands having biomass densities ranging from 50 to 300 tons/ha dry above-ground biomass. Results are shown exploring the biomass saturation thresholds at these frequencies, the system design is explained, and preliminary attempts at data visualization using this unique sensor design are described.
Low-cost rapid miniature optical pressure sensors for blast wave measurements.
Wu, Nan; Wang, Wenhui; Tian, Ye; Zou, Xiaotian; Maffeo, Michael; Niezrecki, Christopher; Chen, Julie; Wang, Xingwei
2011-05-23
This paper presents an optical pressure sensor based on a Fabry-Perot (FP) interferometer formed by a 45° angle-polished single-mode fiber and an external silicon nitride diaphragm. The sensor comprises two V-shaped grooves with different widths on a silicon chip, a silicon nitride diaphragm released on the surface of the wider V-groove, and a 45° angle-polished single-mode fiber. The sensor is especially suitable for blast wave measurements: its compact structure ensures a high spatial resolution, and its thin-diaphragm design and optical demodulation scheme allow a fast response to the rapidly changing signals experienced during blast events. The sensor shows linearity with a correlation coefficient of 0.9999 as well as a hysteresis of less than 0.3%. A shock tube test demonstrated that the sensor has a rise time of less than 2 µs from 0 kPa to 140 kPa.
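A linearity figure like the 0.9999 correlation coefficient quoted above comes from an ordinary least-squares fit of sensor output against applied pressure. A sketch of that calibration step, with illustrative pressure/output pairs (not the paper's data):

```python
import numpy as np

# Hypothetical calibration points: applied pressure (kPa) vs. sensor output (arbitrary units)
pressure = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0, 120.0, 140.0])
output   = np.array([0.0, 10.1, 19.9, 30.2, 39.8, 50.1, 60.0, 69.9])

slope, intercept = np.polyfit(pressure, output, 1)   # least-squares calibration line
r = np.corrcoef(pressure, output)[0, 1]              # linearity: correlation coefficient
```

With the fitted slope and intercept, a pressure reading is recovered from a raw output as `(raw - intercept) / slope`.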
What convention is used for the illumination and view angles?
Atmospheric Science Data Center
2014-12-08
... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...
Digital 3D holographic display using scattering layers for enhanced viewing angle and image size
NASA Astrophysics Data System (ADS)
Yu, Hyeonseung; Lee, KyeoReh; Park, Jongchan; Park, YongKeun
2017-05-01
In digital 3D holographic displays, the generation of realistic 3D images has been hindered by limited viewing angle and image size. Here we demonstrate a digital 3D holographic display using volume speckle fields produced by scattering layers in which both the viewing angle and the image size are greatly enhanced. Although volume speckle fields exhibit random distributions, the transmitted speckle fields have a linear and deterministic relationship with the input field. By modulating the incident wavefront with a digital micro-mirror device, volume speckle patterns are controlled to generate 3D images of micrometer-size optical foci with 35° viewing angle in a volume of 2 cm × 2 cm × 2 cm.
Ostaszewski, Michal; Pauk, Jolanta
2018-05-16
Gait analysis is a useful tool for medical staff in supporting clinical decision making, yet there is still an urgent need for low-cost and unobtrusive mobile health monitoring systems. The goal of this study was twofold. First, a wearable sensor system composed of plantar pressure insoles and wearable sensors for joint angle measurement was developed. Second, the accuracy of the system in the measurement of ground reaction forces and joint moments was examined. The measurements included joint angles and plantar pressure distribution. To validate the wearable sensor system and examine the effectiveness of the proposed method for gait analysis, an experimental study on ten volunteer subjects was conducted. The accuracy of measurement of ground reaction forces and joint moments was validated against the results obtained from a reference motion capture system. Ground reaction forces and joint moments measured by the wearable sensor system showed a root mean square error of 1% for minimum GRF and 27.3% for knee extension moment, and the correlation coefficient with the stationary motion capture system was over 0.9. The study suggests that the wearable sensor system could be recommended for both research and clinical applications outside a typical gait laboratory.
Development of a Low-Cost Attitude Sensor for Agricultural Vehicles
USDA-ARS?s Scientific Manuscript database
The objective of this research was to develop a low-cost attitude sensor for agricultural vehicles. The attitude sensor was composed of three vibratory gyroscopes and two inclinometers. A sensor fusion algorithm was developed to estimate tilt angles (roll and pitch) by a least-squares method. In the a...
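The abstract is truncated, but the core idea of least-squares fusion of gyroscope-derived and inclinometer-derived tilt estimates can be sketched as a variance-weighted average, which is the least-squares solution for two independent measurements of the same angle. This is a simplification of whatever the authors' full algorithm does; all numbers are illustrative:

```python
def fuse_tilt(gyro_angle, incl_angle, var_gyro, var_incl):
    """Least-squares (variance-weighted) fusion of two independent tilt estimates.
    Returns the fused angle and its variance."""
    w_g, w_i = 1.0 / var_gyro, 1.0 / var_incl
    angle = (w_g * gyro_angle + w_i * incl_angle) / (w_g + w_i)
    var = 1.0 / (w_g + w_i)
    return angle, var

# Illustrative roll estimates (deg): drifting gyro integration vs. noisy inclinometer
roll, roll_var = fuse_tilt(5.2, 4.8, var_gyro=0.04, var_incl=0.01)
```

The fused variance is always smaller than either input variance, which is the practical payoff of combining the two low-cost sensor types.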
Method for measuring tri-axial lumbar motion angles using wearable sheet stretch sensors
Nakamoto, Hiroyuki; Yamaji, Tokiya; Ootaka, Hideo; Bessho, Yusuke; Nakamura, Ryo; Ono, Rei
2017-01-01
Background Body movements, such as trunk flexion and rotation, are risk factors for low back pain in occupational settings, especially in healthcare workers. Wearable motion capture systems are potentially useful for monitoring lower back movement in healthcare workers to help avoid these risk factors. In this study, we propose a novel system using sheet stretch sensors and investigate its validity for estimating lower back movement. Methods Six volunteers (female:male = 1:1, mean age 24.8 ± 4.0 years, height 166.7 ± 5.6 cm, weight 56.3 ± 7.6 kg) participated in test protocols that involved executing seven types of movements: three uniaxial trunk movements (trunk flexion-extension, trunk side-bending, and trunk rotation) and four multiaxial trunk movements (flexion + rotation, flexion + side-bending, side-bending + rotation, and moving around the cranial-caudal axis). Each trial lasted approximately 30 s. Four stretch sensors were attached to each participant's lower back. The lumbar motion angles were estimated by simple linear regression on the stretch sensor outputs and compared with those obtained by an optical motion capture system. Results The estimated lumbar motion angles showed a good correlation with the actual angles, with correlation values of r = 0.68 (SD = 0.35), r = 0.60 (SD = 0.19), and r = 0.72 (SD = 0.18) for the flexion-extension, side-bending, and rotation movements, respectively (all P < 0.05). The estimation errors in all three directions were less than 3°. Conclusion The stretch sensors mounted on the back provided reasonable estimates of the lumbar motion angles. The novel motion capture system provided three directional angles without capture-space limits, and the wearable system has great potential for monitoring lower back movement in healthcare workers and helping to prevent low back pain. PMID:29020053
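The regression step above can be sketched with a least-squares fit mapping stretch-sensor outputs to a reference angle. The paper fits simple (per-axis) linear regressions against optical motion capture; the sketch below uses a multivariate fit over all four sensors, and the training data are illustrative, not the study's:

```python
import numpy as np

# Hypothetical training data: rows are samples, columns are the four stretch-sensor outputs
X = np.array([[0.1, 0.0, 0.2, 0.1],
              [0.4, 0.1, 0.5, 0.3],
              [0.8, 0.2, 0.9, 0.6],
              [1.2, 0.3, 1.3, 0.9]])
y = np.array([5.0, 15.0, 30.0, 45.0])   # reference flexion angle (deg) from motion capture

A = np.hstack([X, np.ones((len(X), 1))])        # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares regression coefficients

predicted = A @ coef                            # angles estimated from sensor outputs
```

In deployment, each new sample of sensor outputs is multiplied by the same coefficients to give an angle estimate, which is how the system avoids the capture-space limits of an optical setup.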
NASA Astrophysics Data System (ADS)
Guo, Junpeng; Guo, Hong; Li, Zhitong
2016-09-01
In this work, a 2D metallic nano-trench array was fabricated on a gold metal surface using an e-beam lithography patterning and etching process. Optical reflectance from the device was measured at oblique angles of incidence for TE and TM polarization. Near-perfect light trapping was observed at different wavelengths for TE and TM polarization at oblique incidence. As the angle of incidence increases, the light-trapping wavelength red-shifts for TM polarization and blue-shifts for TE polarization. The fabricated nano-trench device was also investigated for chemical sensor applications. It was found that, as the angle of incidence is varied, the sensitivity changes with opposite trends for the two polarizations: sensor sensitivity increases for TM polarization and decreases for TE polarization with increasing oblique incidence angle.
10. 22"x34" original blueprint, Variable-Angle Launcher, "SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES" drawn at 1/2"=1'-0". (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Earth Observations taken by the Expedition 23 Crew
2010-05-04
ISS023-E-032397 (4 May 2010) --- The Gulf of Mexico oil spill is featured in this image photographed by an Expedition 23 crew member on the International Space Station. On April 20, 2010, the oil rig Deepwater Horizon suffered an explosion and sank two days later. Shortly thereafter oil began leaking into the Gulf of Mexico from ruptured pipes as safety cutoff mechanisms failed to operate. Automated nadir-viewing orbital NASA sensors have been tracking the growth of the oil spill as it has spread towards the northern Gulf Coast. This detailed photograph provides a different viewing perspective on the ongoing event. The image is oblique, meaning that it was taken with a sideways viewing angle from the space station, rather than the "straight down" or nadir view typical of automated satellite sensors. The view is towards the west; the ISS was located over the eastern edge of the Gulf of Mexico when the image was taken. The Mississippi River Delta and nearby Louisiana coast (top) appear dark in the sunglint that illuminates most of the image. This phenomenon is caused by sunlight reflecting off the water surface, much like a mirror, directly back towards the astronaut observer onboard the orbital complex. The sunglint improves the identification of the oil spill (colored dark to light gray), which creates a different water texture, and therefore a contrast, between the smooth and rougher water of the reflective ocean surface (colored silver to white). Wind and water current patterns have modified the oil spill's original shape into streamers and elongated masses. Efforts are ongoing to contain the spill and protect fragile coastal ecosystems and habitats such as the Chandeleur Islands (right center). Other features visible in the image include a solid field of low cloud cover at the lower left corner of the image. A part of one of the ISS solar arrays is visible at lower right. Wave patterns at lower right are most likely caused by tidal effects.
Interferometric rotation sensor
NASA Technical Reports Server (NTRS)
Walsh, T. M. (Inventor)
1973-01-01
An interferometric rotation sensor and control system is provided which includes a compound prism interferometer and an associated direction control system. Light entering the interferometer is split into two paths, reflected an unequal number of times in each path, and then recombined at an exit aperture in differing phase relationships. Incoming light deviated from the optical axis of the device by an angle alpha causes a corresponding displacement of the two component images at the exit aperture, which results in a fringe pattern. Fringe numbers are directly related to the angle alpha. Various control systems for the interferometer are given.
Closed-Form 3-D Localization for Single Source in Uniform Circular Array with a Center Sensor
NASA Astrophysics Data System (ADS)
Bae, Eun-Hyon; Lee, Kyun-Kyung
A novel closed-form algorithm is presented for estimating the 3-D location (azimuth angle, elevation angle, and range) of a single source in a uniform circular array (UCA) with a center sensor. Based on the centrosymmetry of the UCA and the noncircularity of the source, the proposed algorithm decouples and estimates the 2-D direction of arrival (DOA), i.e., azimuth and elevation angles, and then estimates the range of the source. Despite its low computational complexity, the proposed algorithm provides estimation performance close to that of the benchmark 3-D MUSIC estimator.
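The closed-form algorithm itself is not reproduced in the abstract; as a simple baseline for the 2-D DOA step, the azimuth and elevation of a narrowband source on a UCA with a center sensor can be recovered by a matched-steering-vector grid search. The geometry and far-field approximation below are illustrative assumptions, not the paper's method (which also estimates range, i.e., is a near-field formulation):

```python
import numpy as np

def uca_steering(az, el, n=8, radius=0.5, wavelength=1.0):
    """Far-field narrowband steering vector for a UCA of n sensors plus a center sensor.
    Center sensor (zero path difference) is the first element."""
    k = 2 * np.pi / wavelength
    phi = 2 * np.pi * np.arange(n) / n                  # sensor angular positions
    delays = radius * np.cos(el) * np.cos(az - phi)     # path differences vs. center
    return np.exp(1j * k * np.r_[0.0, delays])

# Simulate a noiseless source at azimuth 40 deg, elevation 20 deg, then recover it
true_az, true_el = np.deg2rad(40.0), np.deg2rad(20.0)
x = uca_steering(true_az, true_el)

best, best_val = (0.0, 0.0), -np.inf
for az in np.deg2rad(np.arange(0.0, 360.0, 2.0)):
    for el in np.deg2rad(np.arange(0.0, 90.0, 2.0)):
        val = abs(np.vdot(uca_steering(az, el), x))     # matched-filter correlation
        if val > best_val:
            best, best_val = (az, el), val

az_hat, el_hat = np.rad2deg(best)
```

A closed-form estimator such as the one in the paper replaces this brute-force search with direct algebraic expressions, which is where the low computational complexity comes from.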
Two Perspectives on Forest Fire
NASA Technical Reports Server (NTRS)
2002-01-01
Multi-angle Imaging Spectroradiometer (MISR) images of smoke plumes from wildfires in western Montana acquired on August 14, 2000. A portion of Flathead Lake is visible at the top, and the Bitterroot Range traverses the images. The left view is from MISR's vertical-viewing (nadir) camera; the right view is from the camera that looks forward at a steep angle (60 degrees). The smoke location and extent are far more visible at this highly oblique angle, although vegetation appears much darker in the forward view. A brown burn scar lies almost exactly at the center of the nadir image, while in the high-angle view it is shrouded in smoke. Also visible in the center and upper right of the images, and more obvious in the clearer nadir view, are checkerboard patterns on the surface associated with land-ownership boundaries and logging. Compare these images with the high-resolution infrared imagery captured nearby by Landsat 7 half an hour earlier. Images by NASA/GSFC/JPL, MISR Science Team.
Optoelectronic Sensor System for Guidance in Docking
NASA Technical Reports Server (NTRS)
Howard, Richard T.; Bryan, Thomas C.; Book, Michael L.; Jackson, John L.
2004-01-01
The Video Guidance Sensor (VGS) system is an optoelectronic sensor that provides automated guidance between two vehicles. In the original intended application, the two vehicles would be spacecraft docking together, but the basic principles of design and operation of the sensor are applicable to aircraft, robots, vehicles, or other objects that may be required to be aligned for docking, assembly, resupply, or precise separation. The system includes a sensor head containing a monochrome charge-coupled-device video camera and pulsed laser diodes mounted on the tracking vehicle, and passive reflective targets on the tracked vehicle. The lasers illuminate the targets, and the resulting video images of the targets are digitized. Then, from the positions of the digitized target images and known geometric relationships among the targets, the relative position and orientation of the vehicles are computed. As described thus far, the VGS system is based on the same principles as those of the system described in "Improved Video Sensor System for Guidance in Docking" (MFS-31150), NASA Tech Briefs, Vol. 21, No. 4 (April 1997), page 9a. However, the two systems differ in the details of design and operation. The VGS system is designed to operate with the target completely visible within a relative-azimuth range of +/-10.5 deg and a relative-elevation range of +/-8 deg. The VGS acquires and tracks the target within that field of view at any distance from 1.0 to 110 m and at any relative roll, pitch, and/or yaw angle within +/-10 deg. The VGS produces sets of distance and relative-orientation data at a repetition rate of 5 Hz. The software of this system also accommodates the simultaneous operation of two sensors for redundancy.
Optical inverse-square displacement sensor
Howe, R.D.; Kychakoff, G.
1989-09-12
This invention comprises an optical displacement sensor that uses the inverse-square attenuation of light reflected from a diffuse surface to calculate the distance from the sensor to the reflecting surface. Light emerging from an optical fiber or the like is directed onto the surface whose distance is to be measured. The intensity I of reflected light is angle-dependent, but within a sufficiently small solid angle it falls off as the inverse square of the distance from the surface. At least a pair of optical detectors are mounted to detect the reflected light within the small solid angle, their ends being at different distances R and R + ΔR from the surface. The distance R can then be found in terms of the ratio of the intensity measurements and the separation length as given in an equation. 10 figs.
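The equation referred to in the abstract is not reproduced there, but it follows directly from the inverse-square law: with detectors at R and R + ΔR, the intensity ratio is I1/I2 = ((R + ΔR)/R)^2, so R = ΔR / (sqrt(I1/I2) - 1). A sketch of that reconstruction (a plausible form, not a quotation from the patent):

```python
import math

def distance_from_ratio(i_near, i_far, delta_r):
    """Distance of the nearer detector from the surface via the inverse-square law.
    i_near: intensity at the detector at range R; i_far: at range R + delta_r."""
    return delta_r / (math.sqrt(i_near / i_far) - 1.0)

# Consistency check: synthetic intensities for detectors at R = 10 and R = 12 (delta_r = 2)
R_true = 10.0
i1 = 1.0 / R_true**2
i2 = 1.0 / (R_true + 2.0)**2
R_est = distance_from_ratio(i1, i2, 2.0)
```

Because only the ratio of the two intensities enters, the result is independent of surface reflectivity and source power, which is the point of using two detectors.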
NASA Technical Reports Server (NTRS)
Elmer, Nicholas J.; Berndt, Emily; Jedlovec, Gary J.
2016-01-01
Red-Green-Blue (RGB) composites (EUMETSAT User Services 2009) combine information from several channels into a single composite image. RGB composites contain the same information as the original channels but present it in a more efficient manner. However, RGB composites derived from infrared imagery of both polar-orbiting and geostationary sensors are adversely affected by the limb effect, which interferes with the qualitative interpretation of RGB composites at large viewing zenith angles. The limb effect, or limb-cooling, is a result of an increase in the optical path length of the absorbing atmosphere as the viewing zenith angle increases (Goldberg et al. 2001; Joyce et al. 2001; Liu and Weng 2007). As a result, greater atmospheric absorption occurs at the limb, causing the sensor to observe anomalously cool brightness temperatures. Figure 1 illustrates this effect. In general, limb-cooling results in a 4-11 K decrease in measured brightness temperature (Liu and Weng 2007), depending on the infrared band; for example, water vapor and ozone absorption channels display much larger limb-cooling than infrared window channels. Consequently, RGB composites created from infrared imagery not corrected for limb effects can only be reliably interpreted close to nadir, which reduces the spatial coverage of the available imagery. Elmer (2015) developed a reliable, operational limb correction technique for clear regions. However, many RGB composites are intended to be used and interpreted in cloudy regions, so a limb correction methodology valid for both clear and cloudy regions is needed. This paper presents such a technique, which is described in Section 2. Section 3 presents several RGB case studies demonstrating the improved functionality of limb-corrected RGBs in both clear and cloudy regions, and Section 4 summarizes the key conclusions of this work.
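Since limb-cooling grows with the optical path length, a first-order correction adds back a term proportional to sec(theta) - 1, where theta is the viewing zenith angle. This is a simplified sketch of the idea, not the paper's operational technique (which uses band- and latitude-dependent coefficients); the coefficient below is illustrative:

```python
import numpy as np

def limb_correct(bt, vza_deg, coeff=6.0):
    """First-order limb correction: add back modeled limb-cooling, taken as
    proportional to the extra optical path, sec(theta) - 1.
    coeff is an illustrative band-dependent coefficient in kelvin."""
    vza = np.deg2rad(np.asarray(vza_deg, dtype=float))
    return np.asarray(bt, dtype=float) + coeff * (1.0 / np.cos(vza) - 1.0)

# Two pixels at the same true temperature: one at nadir, one at 60 deg viewing zenith
bt_corrected = limb_correct([260.0, 255.0], [0.0, 60.0])
```

At nadir the correction vanishes (sec 0 - 1 = 0), so near-nadir pixels are untouched while limb pixels are warmed toward their nadir-equivalent brightness temperatures.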
Angle-independent pH-sensitive composites with natural gyroid structure
Xue, Ruiyang; Zhang, Wang; Sun, Peng; Zada, Imran; Guo, Cuiping; Liu, Qinglei; Gu, Jiajun; Su, Huilan; Zhang, Di
2017-01-01
The pH sensor is an important and practical device with wide applications in environmental protection and the biomedical industry. An efficient way to enhance the practicability of a pH sensor composed of an intelligent polymer is to refine the three-dimensional microstructure of the material, adding measurable features that visualize the output signal. In this work, C. rubi wing scales were combined with the pH-responsive smart polymer polymethylacrylic acid (PMAA) through polymerization to achieve a colour-tunable pH sensor with a natural gyroid structure. The morphology and reflection characteristics of the novel composites, named G-PMAA, are carefully investigated and compared with the original biotemplate, the C. rubi wing scales. The most remarkable property of G-PMAA is a single-valued correspondence between pH value and the reflection peak wavelength (λmax), with a colour distinction degree of 18 nm/pH, ensuring the accuracy and authenticity of the output. The pH sensor reported here is fully reversible, showing the same results after several detection cycles. In addition, G-PMAA is shown to be unaffected by the detection angle, which makes it a promising pH sensor with superb sensitivity, stability, and angle-independence. PMID:28165044
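The single-valued λmax-vs-pH relationship with an 18 nm/pH slope means pH can be read out by inverting a linear map. A minimal sketch, in which the reference point (λmax at a known pH) is an illustrative assumption rather than a value from the paper:

```python
def ph_from_peak(lambda_max_nm, lambda_ref_nm=560.0, ph_ref=4.0, slope_nm_per_ph=18.0):
    """Invert a linear peak-wavelength-vs-pH relationship (18 nm/pH as reported).
    The reference pair (lambda_ref_nm, ph_ref) is hypothetical calibration data."""
    return ph_ref + (lambda_max_nm - lambda_ref_nm) / slope_nm_per_ph

# A measured reflection peak 36 nm above the reference corresponds to +2 pH units
ph = ph_from_peak(596.0)
```

The angle-independence reported for G-PMAA matters precisely here: λmax can be read at any detection angle without recalibrating this mapping.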
2D tilting MEMS micro mirror integrating a piezoresistive sensor position feedback
NASA Astrophysics Data System (ADS)
Lani, S.; Bayat, D.; Despont, M.
2015-02-01
An integrated position sensor for a dual-axis electromagnetic tilting mirror is presented. The tilting mirror is composed of a silicon-based mirror directly assembled on a silicon membrane supported by flexible beams. The position sensor consists of four Wheatstone bridges of piezoresistors, fabricated by locally doping the flexible beams. A permanent magnet is attached to the membrane, and the scanner is mounted above planar coils deposited on a ceramic substrate to achieve electromagnetic actuation. The performance of the piezoresistive sensors is evaluated by measuring the output signal of the piezoresistors as a function of the tilt of the mirror and the temperature. White-light interferometry was performed for all measurements to determine the exact tilt angle. The minimum detectable angle with these sensors was 30 µrad (around 13 bits), at the limit of the interferometer's resolution. The tilt reproducibility was 0.0186%, obtained by measuring the tilt after repeated actuations with a coil current of 50 mA for 30 min, and the stability over time was 0.05% in 1 h without actuation. The maximum measured tilt angle was 6° (mechanical), limited by the nonlinearity of the MEMS system.
Optical flows method for lightweight agile remote sensor design and instrumentation
NASA Astrophysics Data System (ADS)
Wang, Chong; Xing, Fei; Wang, Hongjian; You, Zheng
2013-08-01
Lightweight agile remote sensors have become one of the most important payload types and are widely used in space reconnaissance and resource survey. These imaging sensors are designed to obtain imagery of high spatial, temporal, and spectral resolution. Key techniques in their instrumentation include flexible maneuvering, advanced imaging control algorithms, and integrative measuring techniques, which are closely coupled and can even become bottlenecks for one another; such mutually restrictive problems must be solved and optimized together. Optical flow is the critical model for fully representing information transfer as well as radiant energy flow in dynamic imaging. For agile sensors, especially those with wide fields of view, the imaging optical flow can distort and deviate seriously during large-angle attitude-maneuver imaging. This is mainly attributed to the geometrical characteristics of the three-dimensional Earth surface and to coupling effects from the complicated relative motion between sensor and scene. In these circumstances the velocity field is distributed nonlinearly, and the imagery may be badly smeared or geometrically distorted if the image-velocity matching errors are not eliminated. In this paper, a precise imaging optical flow model is established for agile remote sensors, in which the evolution of the optical flow is factored into two components, due respectively to translational motion and image shape change. Based on this model, agile remote sensor instrumentation is investigated. The main techniques involving optical flow modeling include integrative design with lightweight star sensors and micro inertial measurement units and the corresponding data fusion, focal-plane layout and control assemblies, and imagery post-processing for agile remote sensors. Experiments show that the optical analysis method is effective in eliminating limitations on the performance indexes and has been successfully applied to integrative system design. Finally, a principle prototype of an agile remote sensor designed by this method is discussed.
On Orbit Measurement of Response vs. Scan Angle for the Infrared Bands on TRMM/VIRS
NASA Technical Reports Server (NTRS)
Barnes, William L.; Lyu, Cheng-Hsuan; Barnes, Robert A.
1999-01-01
The Visible and Infrared Scanner on the Tropical Rainfall Measuring Mission (TRMM/VIRS) is a whiskbroom imaging radiometer with two reflected solar bands and three emissive infrared bands. All five detectors are on a single cooled focal plane. This configuration necessitated the use of a paddlewheel scan mirror to avoid the effects of focal-plane rotation that arise when using a scan mirror inclined to its axis of rotation. System radiometric requirements led to the need for protected silver as the mirror surface. Unfortunately, the SiO(x) coatings currently used to protect silver from oxidation introduce a change in reflectance with angle of incidence (AOI). This AOI dependence results in a modulation of system-level response with scan angle. Measurement of system response vs. scan angle (RVS) was not difficult for the VIRS reflected solar bands, but attaining the required accuracy for the IR bands in the laboratory was not possible without a large vacuum chamber and a considerable amount of custom-designed test apparatus. Therefore, the decision was made to conduct the measurement on-orbit. On three separate occasions, the TRMM spacecraft was rotated about its pitch axis and, after the nadir view passed over the Earth's limb, the VIRS performed several thousand scans while viewing deep space. The resulting data have been analyzed, and the RVS curves generated for the three IR bands are being used in the VIRS radiometric calibration algorithm. This is, to our knowledge, the first time this measurement has been made on-orbit. Similar measurements are planned for the EOS-AM and EOS-PM MODIS sensors and are being considered for several systems under development. The VIRS on-orbit results will be compared to VIRS and MODIS system-level laboratory measurements, MODIS scan-mirror witness sample measurements, and modeled data.
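In a calibration algorithm, an RVS curve typically enters as a scan-angle-dependent divisor on the instrument response, normalized to unity at some reference angle. A generic sketch of that usage (the functional form and coefficients are illustrative, not the VIRS curves):

```python
import numpy as np

def calibrated_radiance(counts, gain, offset, rvs_coeffs, scan_angle_deg):
    """Linear calibration with a response-vs-scan-angle (RVS) correction:
    L = (counts - offset) / (gain * RVS(theta)).
    rvs_coeffs: polynomial in scan angle (highest power first), RVS(0) normalized to 1."""
    rvs = np.polyval(rvs_coeffs, scan_angle_deg)
    return (counts - offset) / (gain * rvs)

# Illustrative quadratic RVS: a small falloff away from nadir, RVS(0 deg) = 1
rvs_coeffs = [-1e-6, 0.0, 1.0]
L_nadir = calibrated_radiance(1000.0, 10.0, 0.0, rvs_coeffs, 0.0)
L_edge = calibrated_radiance(1000.0, 10.0, 0.0, rvs_coeffs, 45.0)
```

Dividing by RVS means identical counts at the scan edge map to a slightly higher radiance than at nadir, compensating the mirror's reduced reflectance at large AOI.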
Smart CMOS sensor for wideband laser threat detection
NASA Astrophysics Data System (ADS)
Schwarze, Craig R.; Sonkusale, Sameer
2015-09-01
The proliferation of lasers has led to their widespread use in applications ranging from short-range standoff chemical detection to long-range lidar sensing and target designation, operating across the UV to LWIR spectrum. Recent advances in high-energy lasers have renewed the development of laser weapons systems. The ability to measure and assess laser source information is important both to identify a potential threat and to determine safety and the nominal hazard zone (NHZ). Laser detection sensors are required that provide high dynamic range, wide spectral coverage, pulsed and continuous-wave detection, and a large field of view. OPTRA, Inc. and Tufts have developed a custom ROIC smart-pixel imaging sensor architecture and wavelength-encoding optics for measurement of source wavelength, pulse length, pulse repetition frequency (PRF), irradiance, and angle of arrival. The smart architecture offers dual linear and logarithmic operating modes, providing more than eight orders of signal dynamic range and nanosecond pulse measurement capability, and can be hybridized with the appropriate detector array to provide UV through LWIR laser sensing. Recent advances in sputtering techniques provide the capability for post-processing CMOS dies from the foundry and patterning PbS and PbSe photoconductors directly on the chip to create a single monolithic sensor array architecture for measuring sources operating from 0.26 - 5.0 microns and 1 mW/cm2 - 2 kW/cm2.
Miniature Rotorcraft Flight Control Stabilization System
2008-05-30
The first algorithm is based on the well-known QUEST algorithm used for spacecraft and satellites. Due to large vibration in the sensors, a pseudo-measurement is developed from gyroscope measurements and rotational... using any valid set of orientation map. Note, in Eq. (6) Euler angles were used to describe . A common alternative to Euler angles is a quaternion
Wireless Orbiter Hang-Angle Inclinometer System
NASA Technical Reports Server (NTRS)
Lucena, Angel; Perotti, Jose; Green, Eric; Byon, Jonathan; Burns, Bradley; Mata, Carlos; Randazzo, John; Blalock, Norman
2011-01-01
A document describes a system to reliably gather the hang-angle inclination of the orbiter. The system comprises a wireless handheld master station (which contains the main station software) and a wireless remote station (which contains the inclinometer sensors, the RF transceivers, and the remote station software). The remote station is designed to provide redundancy to the system. It includes two RF transceivers, two power-management boards, and four inclinometer sensors.
NASA Technical Reports Server (NTRS)
Moes, Timothy R.; Whitmore, Stephen A.; Jordan, Frank L., Jr.
1993-01-01
A nonintrusive airdata-sensing system was calibrated in flight and wind-tunnel experiments to an angle of attack of 70 deg and to angles of sideslip of +/- 15 deg. Flight-calibration data have also been obtained to Mach 1.2. The sensor, known as the flush airdata sensor, was installed on the nosecap of an F-18 aircraft for flight tests and on a full-scale F-18 forebody for wind-tunnel tests. Flight tests occurred at the NASA Dryden Flight Research Facility, Edwards, California, using the F-18 High Alpha Research Vehicle. Wind-tunnel tests were conducted in the 30- by 60-ft wind tunnel at the NASA LaRC, Hampton, Virginia. The sensor consisted of 23 flush-mounted pressure ports arranged in concentric circles and located within 1.75 in. of the tip of the nosecap. An overdetermined mathematical model was used to relate the pressure measurements to the local airdata quantities. The mathematical model was based on potential flow over a sphere and was empirically adjusted based on flight and wind-tunnel data. For quasi-steady maneuvering, the mathematical model worked well throughout the subsonic, transonic, and low supersonic flight regimes. The model also worked well throughout the angle-of-attack and sideslip regions studied.
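For a fixed set of port incidence angles, the potential-flow pressure model described above is linear in the impact pressure and static pressure, so those two quantities can be recovered from the overdetermined port measurements by least squares. This sketch shows only that linear sub-step with a commonly used model form, p_i = qc(cos^2(theta_i) + eps*sin^2(theta_i)) + p_inf; the angles, eps, and truth values are illustrative, and the full flush-airdata problem (solving simultaneously for angle of attack, sideslip, and Mach) is nonlinear:

```python
import numpy as np

eps = 0.1                                      # empirical calibration parameter (illustrative)
theta = np.deg2rad([0.0, 20.0, 40.0, 60.0])    # assumed flow incidence angle at each port

# Synthetic "measurements" from an assumed truth: qc = 5.0 kPa, p_inf = 90.0 kPa
qc_true, p_inf_true = 5.0, 90.0
basis = np.cos(theta)**2 + eps * np.sin(theta)**2
p = qc_true * basis + p_inf_true

# Overdetermined linear system [basis, 1] @ [qc, p_inf] = p, solved by least squares
A = np.column_stack([basis, np.ones_like(theta)])
(qc_est, p_inf_est), *_ = np.linalg.lstsq(A, p, rcond=None)
```

Using more ports than unknowns, as the 23-port nosecap array does, makes the estimate robust to noise on individual pressure measurements.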
NASA Technical Reports Server (NTRS)
Davies, Roger
1994-01-01
The spatial autocorrelation functions of broad-band longwave and shortwave radiances measured by the Earth Radiation Budget Experiment (ERBE) are analyzed as a function of view angle in an investigation of the general effects of scene inhomogeneity on radiation. For nadir views, the correlation distance of the autocorrelation function is about 900 km for longwave radiance and about 500 km for shortwave radiance, consistent with higher degrees of freedom in shortwave reflection. Both functions rise monotonically with view angle, but there is a substantial difference in the relative angular dependence of the shortwave and longwave functions, especially for view angles less than 50 deg. In this range, the increase with angle of the longwave functions is found to depend only on the expansion of pixel area with angle, whereas the shortwave functions show an additional dependence on angle that is attributed to the occlusion of inhomogeneities by cloud height variations. Beyond a view angle of about 50 deg, both longwave and shortwave functions appear to be affected by cloud sides. The shortwave autocorrelation functions do not satisfy the principle of directional reciprocity, thereby proving that the average scene is horizontally inhomogeneous over the scale of an ERBE pixel (1500 sq km). Coarse stratification of the measurements by cloud amount, however, indicates that the average cloud-free scene does satisfy directional reciprocity on this scale.
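The correlation distances quoted above (about 900 km for longwave, 500 km for shortwave) can be defined operationally as the lag at which the normalized autocorrelation falls below a threshold. A minimal 1-D sketch of that computation (illustrative, not the ERBE processing, which works on 2-D pixel fields):

```python
import numpy as np

def correlation_distance(field, dx, threshold=np.exp(-1)):
    """Lag at which the normalized spatial autocorrelation of a 1-D transect
    first falls below a threshold (e-folding distance by default)."""
    f = np.asarray(field, dtype=float) - np.mean(field)
    ac = np.correlate(f, f, mode='full')[len(f) - 1:]   # non-negative lags only
    ac = ac / ac[0]                                     # normalize to 1 at zero lag
    below = np.nonzero(ac < threshold)[0]
    return below[0] * dx if below.size else np.inf

# A fully anti-correlated (alternating) transect decorrelates in a single sample
d = correlation_distance([1.0, -1.0] * 5, dx=2.0)
```

A smooth radiance field would instead stay above the threshold for many lags, giving the hundreds-of-kilometers scales reported for ERBE.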
Application of AI techniques to infer vegetation characteristics from directional reflectance(s)
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Smith, J. A.; Harrison, P. A.; Harrison, P. R.
1994-01-01
Traditionally, the remote sensing community has relied entirely on spectral knowledge to extract vegetation characteristics. However, there are other knowledge bases (KBs) that can be used to significantly improve the accuracy and robustness of inference techniques. Using AI (artificial intelligence) techniques, a KB system (VEG) was developed that integrates input spectral measurements with diverse KBs. These KBs consist of data sets of directional reflectance measurements, knowledge from the literature, and knowledge from experts, combined into an intelligent and efficient system for making vegetation inferences. VEG accepts spectral data of an unknown target as input, determines the best techniques for inferring the desired vegetation characteristic(s), applies the techniques to the target data, and provides a rigorous estimate of the accuracy of the inference. VEG was developed to: infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; infer percent ground cover from any combination of nadir and/or off-nadir view angles; infer unknown view angle(s) from known view angle(s) (view angle extension); and discriminate between user-defined vegetation classes using spectral and directional reflectance relationships developed by an automated learning algorithm. The errors for these techniques were generally small, ranging from 2 to 15% (proportional root mean square). The system is designed to aid scientists in developing, testing, and applying new inference techniques using directional reflectance data.
NASA Astrophysics Data System (ADS)
Venolia, Dan S.; Williams, Lance
1990-08-01
A range of stereoscopic display technologies exist which are no more intrusive, to the user, than a pair of spectacles. Combining such a display system with sensors for the position and orientation of the user's point-of-view results in a greatly enhanced depiction of three-dimensional data. As the point of view changes, the stereo display channels are updated in real time. The face of a monitor or display screen becomes a window on a three-dimensional scene. Motion parallax naturally conveys the placement and relative depth of objects in the field of view. Most of the advantages of "head-mounted display" technology are achieved with a less cumbersome system. To derive the full benefits of stereo combined with motion parallax, both stereo channels must be updated in real time. This may limit the size and complexity of data bases which can be viewed on processors of modest resources, and restrict the use of additional three-dimensional cues, such as texture mapping, depth cueing, and hidden surface elimination. Effective use of "full 3D" may still be undertaken in a non-interactive mode. Integral composite holograms have often been advanced as a powerful 3D visualization tool. Such a hologram is typically produced from a film recording of an object on a turntable, or a computer animation of an object rotating about one axis. The individual frames of film are multiplexed, in a composite hologram, in such a way as to be indexed by viewing angle. The composite may be produced as a cylinder transparency, which provides a stereo view of the object as if enclosed within the cylinder, which can be viewed from any angle. No vertical parallax is usually provided (this would require increasing the dimensionality of the multiplexing scheme), but the three dimensional image is highly resolved and easy to view and interpret. Even a modest processor can duplicate the effect of such a precomputed display, provided sufficient memory and bus bandwidth. 
This paper describes the components of a stereo display system with user point-of-view tracking for interactive 3D, and a digital realization of integral composite display which we term virtual integral holography. The primary drawbacks of holographic display - film processing turnaround time, and the difficulties of displaying scenes in full color -are obviated, and motion parallax cues provide easy 3D interpretation even for users who cannot see in stereo.
NASA Astrophysics Data System (ADS)
Mori, Hiroshi; Asahara, Yousuke
1996-03-01
We analyze the linearity and modulation depth of ac magnetic-field sensors or current sensors, using a ferrimagnetic or ferromagnetic film as the Faraday rotator and employing the detection of only the zeroth-order optical diffraction component from the rotator. It is theoretically shown that for this class of sensor the condition of a constant modulation depth and that of a constant ratio error give an identical series of curves for the relationship between the Faraday rotation angle Theta and the polarizer/analyzer relative angle Phi. We give some numerical examples to demonstrate the usefulness of the result with reference to a rare-earth iron garnet film as the rotator.
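The zeroth-order detection scheme can be sketched with an idealized Malus-law transmission model, I = I0 cos^2(Phi - Theta). This is a simplified stand-in for the paper's analysis; the rotation amplitude and analyzer angle used in the sketch are hypothetical inputs.

```python
import math

def transmitted_intensity(theta_f, phi, i0=1.0):
    """Malus-law intensity after the analyzer, for Faraday rotation
    theta_f and polarizer/analyzer relative angle phi (radians)."""
    return i0 * math.cos(phi - theta_f) ** 2

def modulation_depth(theta_amp, phi):
    """Peak-to-peak intensity swing, normalized by the mean, for an
    AC rotation of amplitude theta_amp about zero."""
    i_max = transmitted_intensity(theta_amp, phi)
    i_min = transmitted_intensity(-theta_amp, phi)
    return (i_max - i_min) / (0.5 * (i_max + i_min))
```

At the conventional bias angle phi = 45 deg, the modulation depth for small rotations is approximately 4*theta, which the sketch reproduces.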
Wide angle sun sensor. [consisting of cylinder, insulation and pair of detectors
NASA Technical Reports Server (NTRS)
Schumacher, L. L. (Inventor)
1975-01-01
A single-axis sun sensor consisting of a cylinder of insulating material, on which at least one pair of detectors is deposited along a circumference, is disclosed. At any time only one half of the cylinder is illuminated, so that the total resistance of the two detectors is constant. Because of the round surface on which the detectors are deposited, the sensor exhibits a linear response over a wide angle of + or - 50 deg to within an accuracy of about 2%. By depositing several pairs of detectors on adjacent circumferences, sufficient redundancy is realized to provide high reliability. A two-axis sensor is provided by depositing detectors on the surface of a sphere along at least two orthogonal great circles.
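The constant-sum property of the illuminated detector pair suggests a simple ratiometric readout. A minimal sketch, assuming an ideal linear response over the 50-degree range; the function name and scaling are illustrative, not taken from the patent.

```python
def sun_angle_estimate(r1, r2, full_scale_deg=50.0):
    """Estimate the sun angle from the two detector resistances.
    Assumes the deposited arcs respond linearly with angle and that
    r1 + r2 stays constant (only half the cylinder is lit)."""
    return full_scale_deg * (r1 - r2) / (r1 + r2)
```

The normalized difference cancels overall illumination level, which is why the constant total resistance matters for linearity.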
1999-08-24
One wide-angle and eight narrow-angle camera images of Miranda, taken by NASA Voyager 2, were combined in this view. The controlled mosaic was transformed to an orthographic view centered on the south pole.
NASA Astrophysics Data System (ADS)
Voss, K. J.; Morel, A.; Antoine, D.
2007-06-01
The radiance viewed from the ocean depends on the illumination and viewing geometry along with the water properties; this variation is called the bidirectional effect, or BRDF, of the water. This BRDF depends on the inherent optical properties of the water, including the volume scattering function, and is important when comparing data from different satellite sensors. The current model by Morel et al. (2002) depends on modeled water parameters and thus must be carefully validated. In this paper we combined upwelling radiance distribution data from several cruises, in varied water types and with a wide range of solar zenith angles. We found that the average error of the model, when compared to the data, was less than 1%, while the RMS difference between the model and data was on the order of 0.02-0.03. This is well within the statistical noise of the data, which was on the order of 0.04-0.05, due to environmental noise sources such as wave focusing.
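The two validation statistics quoted above (average error and RMS difference) can be computed as below. This is a minimal sketch; the study's actual binning and normalization may differ.

```python
import math

def bias_and_rms(model, data):
    """Mean relative (bias) error and RMS of model-minus-data
    residuals, the two statistics used in the BRDF validation."""
    n = len(model)
    bias = sum((m - d) / d for m, d in zip(model, data)) / n
    rms = math.sqrt(sum((m - d) ** 2 for m, d in zip(model, data)) / n)
    return bias, rms
```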
3D medical thermography device
NASA Astrophysics Data System (ADS)
Moghadam, Peyman
2015-05-01
In this paper, a novel handheld 3D medical thermography system is introduced. The proposed system consists of a thermal-infrared camera, a color camera and a depth camera rigidly attached in close proximity and mounted on an ergonomic handle. As a practitioner holding the device smoothly moves it around the human body parts, the proposed system generates and builds up a precise 3D thermogram model by incorporating information from each new measurement in real time. Because the data is acquired in motion, it provides multiple points of view. When processed, these multiple points of view are adaptively combined by taking into account the reliability of each individual measurement, which can vary with factors such as the angle of incidence, the distance between the device and the subject, and environmental conditions or other factors that influence confidence in the thermal-infrared data at capture time. Finally, several case studies are presented to support the usability and performance of the proposed system.
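The reliability-weighted combination of overlapping measurements reduces to a weighted mean per surface point. The weights below are a stand-in for whatever confidence model (angle of incidence, distance, etc.) the system actually uses.

```python
def fuse_measurements(values, weights):
    """Reliability-weighted fusion of repeated thermal measurements
    of the same surface point (illustrative scheme, not the paper's
    exact formulation)."""
    total = sum(weights)
    return sum(v * w for v, w in zip(values, weights)) / total
```

A measurement taken closer to normal incidence or at shorter range would carry a larger weight, pulling the fused value toward it.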
Gross, Lydwine; Frouin, Robert; Dupouy, Cécile; André, Jean Michel; Thiria, Sylvie
2004-07-10
A neural network is developed to retrieve chlorophyll a concentration from marine reflectance by use of the five visible spectral bands of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS). The network, dedicated to the western equatorial Pacific Ocean, is calibrated with synthetic data that vary in terms of atmospheric content, solar zenith angle, and secondary pigments. Pigment variability is based on in situ data collected in the study region and is introduced through nonlinear modeling of phytoplankton absorption as a function of chlorophyll a, b, and c and photosynthetic and photoprotectant carotenoids. Tests performed on simulated yet realistic data show that chlorophyll a retrievals are substantially improved by use of the neural network instead of classical algorithms, which are sensitive to spectrally uncorrelated effects. The methodology is general, i.e., is applicable to regions other than the western equatorial Pacific Ocean.
Enhanced Pedestrian Navigation Based on Course Angle Error Estimation Using Cascaded Kalman Filters
Park, Chan Gook
2018-01-01
An enhanced pedestrian dead reckoning (PDR) based navigation algorithm, which uses two cascaded Kalman filters (TCKF) for the estimation of course angle and navigation errors, is proposed. The proposed algorithm uses a foot-mounted inertial measurement unit (IMU), waist-mounted magnetic sensors, and a zero velocity update (ZUPT) based inertial navigation technique with TCKF. The first stage filter estimates the course angle error of a human, which is closely related to the heading error of the IMU. In order to obtain the course measurements, the filter uses magnetic sensors and a position-trace based course angle. For preventing magnetic disturbance from contaminating the estimation, the magnetic sensors are attached to the waistband. Because the course angle error is mainly due to the heading error of the IMU, and the characteristic error of the heading angle is highly dependent on that of the course angle, the estimated course angle error is used as a measurement for estimating the heading error in the second stage filter. At the second stage, an inertial navigation system-extended Kalman filter-ZUPT (INS-EKF-ZUPT) method is adopted. As the heading error is estimated directly by using course-angle error measurements, the estimation accuracy for the heading and yaw gyro bias can be enhanced, compared with the ZUPT-only case, which eventually enhances the position accuracy more efficiently. The performance enhancements are verified via experiments, and the way-point position error for the proposed method is compared with those for the ZUPT-only case and with other cases that use ZUPT and various types of magnetic heading measurements. The results show that the position errors are reduced by a maximum of 90% compared with the conventional ZUPT based PDR algorithms. PMID:29690539
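Each stage of the cascade reduces to repeated Kalman measurement updates. A scalar sketch of the update step follows; the paper's filters are multivariate EKFs, so this only illustrates the gain/correction mechanics.

```python
def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update: prior estimate x with
    variance p, measurement z with variance r. In the cascaded
    scheme, z would be a course-angle-error measurement feeding the
    heading-error estimate."""
    k = p / (p + r)          # Kalman gain
    x_new = x + k * (z - x)  # corrected estimate
    p_new = (1.0 - k) * p    # reduced uncertainty
    return x_new, p_new
```

With equal prior and measurement variances the estimate moves halfway toward the measurement and the variance halves, which is the intuition behind stacking a second filter on the first stage's output.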
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lv, Yang; Wang, Ruixing; Ma, Haotong
Purpose: Measurement based on the Shack-Hartmann wave-front sensor (WFS), which obtains both high- and low-order wave-front aberrations simultaneously and accurately, has been applied to the detection of human eye aberrations in recent years. However, its application is limited by a small field of view (FOV): slight eye movement causes the optical beacon image to exceed the lenslet array, which results in uncertain detection error. To overcome the difficulty of precise eye location, the capability to detect eye wave-front aberrations accurately and simultaneously over a FOV much larger than that of a single-conjugate Hartmann WFS is demanded. Methods: The plenoptic camera's lenslet array subdivides the aperture light field in the spatial frequency domain and captures the 4-D light-field information. Data recorded by plenoptic cameras can be used to extract the wave-front phases associated with the eye's aberrations. The corresponding theoretical model and simulation system are built up in this article to discuss wave-front measurement performance when utilizing a plenoptic camera as a wave-front sensor. Results: The simulation results indicate that the plenoptic wave-front method can obtain both high- and low-order eye wave-front aberrations with the same accuracy as a conventional system in single-visual-angle detection, and over a FOV much larger than that of a single-conjugate Hartmann system. Meanwhile, simulation results show that detection of eye aberration wave-fronts at different visual angles can be achieved effectively and simultaneously by the plenoptic method, with both point and extended optical beacons from the eye. Conclusion: The plenoptic wave-front method is feasible for eye aberration wave-front detection. With a larger FOV, the method can effectively reduce the detection error caused by imprecise eye location and simplify the eye aberration wave-front detection system compared with one based on a Shack-Hartmann WFS.
A unique advantage of the plenoptic method lies in obtaining wave-fronts at different visual angles simultaneously, which provides an approach to building up a 3-D model of the eye's refractor tomographically. Funded by the Key Laboratory of High Power Laser and Physics, CAS; Research Project of National University of Defense Technology No. JC13-07-01; National Natural Science Foundation of China No. 61205144.
Measuring the Viewing Angle of GW170817 with Electromagnetic and Gravitational Waves
NASA Astrophysics Data System (ADS)
Finstad, Daniel; De, Soumi; Brown, Duncan A.; Berger, Edo; Biwer, Christopher M.
2018-06-01
The joint detection of gravitational waves (GWs) and electromagnetic (EM) radiation from the binary neutron star merger GW170817 ushered in a new era of multi-messenger astronomy. Joint GW–EM observations can be used to measure the parameters of the binary with better precision than either observation alone. Here, we use joint GW–EM observations to measure the viewing angle of GW170817, the angle between the binary’s angular momentum and the line of sight. We combine a direct measurement of the distance to the host galaxy of GW170817 (NGC 4993) of 40.7 ± 2.36 Mpc with the Laser Interferometer Gravitational-wave Observatory (LIGO)/Virgo GW data and find that the viewing angle is 32 (+10, −13) ± 1.7 degrees (90% confidence, statistical and systematic errors). We place a conservative lower limit on the viewing angle of ≥13°, which is robust to the choice of prior. This measurement provides a constraint on models of the prompt γ-ray and radio/X-ray afterglow emission associated with the merger; for example, it is consistent with the off-axis viewing angle inferred for a structured jet model. We provide for the first time the full posterior samples from Bayesian parameter estimation of LIGO/Virgo data to enable further analysis by the community.
Method and apparatus for calibrating multi-axis load cells in a dexterous robot
NASA Technical Reports Server (NTRS)
Wampler, II, Charles W. (Inventor); Platt, Jr., Robert J. (Inventor)
2012-01-01
A robotic system includes a dexterous robot having robotic joints, angle sensors adapted for measuring joint angles at a corresponding one of the joints, load cells for measuring a set of strain values imparted to a corresponding one of the load cells during a predetermined pose of the robot, and a host machine. The host machine is electrically connected to the load cells and angle sensors, and receives the joint angle values and strain values during the predetermined pose. The robot presses together mating pairs of load cells to form the poses. The host machine executes an algorithm to process the joint angles and strain values, and from the set of all calibration matrices that minimize error in force balance equations, selects the set of calibration matrices that is closest in a value to a pre-specified value. A method for calibrating the load cells via the algorithm is also provided.
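The final selection step, choosing among the error-minimizing calibration matrices the one closest to a pre-specified value, can be sketched with a Frobenius-distance criterion. The distance metric is an assumption for illustration; the patent does not fix one.

```python
import math

def frobenius_distance(a, b):
    """Frobenius distance between two matrices given as nested lists."""
    return math.sqrt(sum((x - y) ** 2
                         for ra, rb in zip(a, b)
                         for x, y in zip(ra, rb)))

def select_calibration(candidates, reference):
    """Among candidate calibration matrices (all assumed to satisfy
    the force-balance equations equally well), pick the one closest
    to a pre-specified reference matrix."""
    return min(candidates, key=lambda c: frobenius_distance(c, reference))
```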
Design considerations for a backlight with switchable viewing angles
NASA Astrophysics Data System (ADS)
Fujieda, Ichiro; Takagi, Yoshihiko; Rahadian, Fanny
2006-08-01
Small-sized liquid crystal displays are widely used for mobile applications such as cell phones. Electronic control of a viewing angle range is desired in order to maintain privacy for viewing in public as well as to provide wide viewing angles for solitary viewing. Conventionally, a polymer-dispersed liquid crystal (PDLC) panel is inserted between a backlight and a liquid crystal panel. The PDLC layer either transmits or scatters the light from the backlight, thus providing an electronic control of viewing angles. However, such a display system is obviously thick and expensive. Here, we propose to place an electronically-controlled, light-deflecting device between an LED and a light-guide of a backlight. For example, a liquid crystal lens is investigated for other applications and its focal length is controlled electronically. A liquid crystal phase grating either transmits or diffracts an incoming light depending on whether or not a periodic phase distribution is formed inside its liquid crystal layer. A bias applied to such a device will control the angular distribution of the light propagating inside a light-guide. Output couplers built in the light-guide extract the propagating light to outside. They can be V-shaped grooves, pyramids, or any other structures that can refract, reflect or diffract light. When any of such interactions occur, the output couplers translate the changes in the propagation angles into the angular distribution of the output light. Hence the viewing-angle characteristic can be switched. The designs of the output couplers and the LC devices are important for such a backlight system.
In-plane omnidirectional magnetic field sensor based on Giant Magneto Impedance (GMI)
NASA Astrophysics Data System (ADS)
Díaz-Rubio, Ana; García-Miquel, Héctor; García-Chocano, Víctor Manuel
2017-12-01
In this work the design and characterization of an omnidirectional in-plane magnetic field sensor are presented. The sensor is based on the Giant Magneto Impedance (GMI) effect in glass-coated amorphous microwires of composition (Fe6Co94)72.5Si12.5B15. For the first time, a circular loop made with a microwire is used to give an omnidirectional response. In order to estimate the GMI response of the circular loop, we have used a theoretical model of GMI, determining the GMI response as the sum of longitudinal sections with different angles of incidence. As a consequence of the circular loop geometry, the GMI ratio of the sensor is reduced to 15%, compared with 100% for the axial GMI response of a straight microwire. The sensor response has been experimentally verified, and the GMI response of the circular loop has been studied as a function of the magnetic field, driving current, and frequency. First, we measured the GMI response of a longitudinal microwire for different angles of incidence, covering the full range between the tangential and perpendicular directions to the microwire axis. Then, using these results, we experimentally verified the decomposition of a microwire with circular shape into longitudinal segments with different angles of incidence. Finally, we designed a signal conditioning circuit for the omnidirectional magnetic field sensor, and its response has been studied as a function of the amplitude of the incident magnetic field.
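The decomposition of the loop into longitudinal segments with varying incidence angle can be checked numerically. The per-segment response below is a crude linear stand-in (magnitude of the axial field projection), not the measured GMI curves; it only demonstrates why summing around the loop yields a direction-independent response.

```python
import math

def loop_response(h_field, field_angle_deg, segments=360):
    """Approximate the response of a circular microwire loop as the
    average over short longitudinal segments, each sensing the
    magnitude of the axial projection of the applied in-plane field
    (illustrative segment model, not the full GMI physics)."""
    total = 0.0
    for i in range(segments):
        seg_angle = 2.0 * math.pi * i / segments
        incidence = seg_angle - math.radians(field_angle_deg)
        total += abs(h_field * math.cos(incidence))
    return total / segments
```

The average of |cos| over the loop is 2/pi, so the summed response is the same for any in-plane field direction, at a reduced amplitude relative to a straight axial wire, consistent with the reduced GMI ratio reported.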
NASA Technical Reports Server (NTRS)
Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing
2016-01-01
Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.
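The problem the "box method" addresses, selecting a scattering direction from a phase function with no analytically invertible CDF, is the classic rejection-sampling setting. A baseline sketch follows; the box method itself is an acceleration of this and is not reproduced here, and the phase function is an illustrative choice.

```python
import math
import random

def sample_scattering_angle(phase_function, p_max, rng):
    """Rejection-sample a polar scattering angle in [0, pi] from a
    phase function whose CDF cannot be inverted analytically."""
    while True:
        theta = rng.uniform(0.0, math.pi)
        if rng.uniform(0.0, p_max) <= phase_function(theta):
            return theta

def rayleigh_like(theta):
    """Illustrative phase function proportional to 1 + cos^2(theta)."""
    return 1.0 + math.cos(theta) ** 2
```

Every proposal costs one phase-function evaluation and may be rejected, which is the overhead a precomputed lookup ("box") scheme removes.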
Modeling contact angle hysteresis of a liquid droplet sitting on a cosine wave-like pattern surface.
Promraksa, Arwut; Chen, Li-Jen
2012-10-15
A liquid droplet sitting on a hydrophobic surface with a cosine wave-like square-array pattern in the Wenzel state is simulated by using the Surface Evolver to determine the contact angle. For a fixed drop volume, multiple metastable states are obtained at two different surface roughnesses. Unusual and non-circular shape of the three-phase contact line of a liquid droplet sitting on the model surface is observed due to corrugation and distortion of the contact line by structure of the roughness. The contact angle varies along the contact line for each metastable state. The maximum and minimum contact angles among the multiple metastable states at a fixed viewing angle correspond to the advancing and the receding contact angles, respectively. It is interesting to observe that the advancing/receding contact angles (and contact angle hysteresis) are a function of viewing angle. In addition, the receding (or advancing) contact angles at different viewing angles are determined at different metastable states. The contact angle of minimum energy among the multiple metastable states is defined as the most stable (equilibrium) contact angle. The Wenzel model is not able to describe the contact angle along the three-phase contact line. The contact angle hysteresis at different drop volumes is determined. The number of the metastable states increases with increasing drop volume. Drop volume effect on the contact angles is also discussed.
Sun glitter imaging analysis of submarine sand waves in HJ-1A/B satellite CCD images
NASA Astrophysics Data System (ADS)
Zhang, Huaguo; He, Xiekai; Yang, Kang; Fu, Bin; Guan, Weibing
2014-11-01
Submarine sand waves are a widespread bed form in tidal environments. They induce current convergence and divergence that modulate sea surface roughness, making the waves visible in sun glitter images, which have accordingly been employed for mapping sand wave topography. However, many factors affect sun glitter imaging of submarine sand waves, such as the imaging geometry and the dynamic environmental conditions. In this paper, several sun glitter images of the Taiwan Banks from HJ-1A/B are selected and used to examine sun glitter imaging characteristics under different sensor parameters and dynamic environmental conditions. The imaging characteristics are interpreted by calculating the sun glitter radiance and analyzing its spatial pattern over the sand waves in the different images. A simulation model based on sun glitter radiative transfer is adopted to further verify the analysis. Several results are drawn from the study. First, the sun glitter radiance is mainly determined by the sensor view angle. Second, the current is another key factor for the sun glitter: an opposite current direction causes the bright and dark stripes to exchange. Third, brightness reversal occurs at the critical angle. Therefore, when using sun glitter images for depth inversion, one is advised to exploit the image properties of the sand waves and to pay attention to the key dynamic environmental conditions and to brightness reversal.
Simulation of laser beam reflection at the sea surface
NASA Astrophysics Data System (ADS)
Schwenger, Frédéric; Repasi, Endre
2011-05-01
A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable both for calculating images of a SWIR (short-wave infrared) imaging sensor and for determining the total detected power of reflected laser light for a bistatic configuration of laser source and receiver under different atmospheric conditions. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of laser light reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves, and the propagation model for water waves is applied for sea surface animation. To predict the view of a camera in the SWIR band, the sea surface radiance must be calculated; this is done by considering the emitted sea surface radiance and the reflected sky radiance, calculated by MODTRAN. Additionally, the radiance of laser light specularly reflected at the wind-roughened sea surface is modeled in the SWIR band using an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). This BRDF model considers the slope statistics of the waves and accounts for slope shadowing, which especially occurs at flat incident angles of the laser beam and near-horizontal detection angles of the reflected irradiance at rough seas. Simulation results are presented showing the variation of the detected laser power as a function of the geometric configuration of laser and sensor and of the wind characteristics.
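Analytical glint BRDFs of this kind rest on Gaussian wave-slope statistics. A minimal sketch using the widely cited Cox-Munk clean-surface fit for the mean-square slope; the paper's exact slope model may differ.

```python
import math

def mean_square_slope(wind_speed):
    """Empirical Cox-Munk total mean-square slope for a clean sea
    surface, wind speed in m/s."""
    return 0.003 + 5.12e-3 * wind_speed

def slope_pdf(zx, zy, sigma2):
    """Isotropic Gaussian sea-surface slope probability density for
    slope components (zx, zy) and total mean-square slope sigma2."""
    return math.exp(-(zx ** 2 + zy ** 2) / sigma2) / (math.pi * sigma2)
```

The probability of a specular glint at a given geometry follows from evaluating this density at the facet slope that mirrors the source into the receiver.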
NASA Technical Reports Server (NTRS)
Sun, Junqiang; Xiong, Xiaoxiong; Waluschka, Eugene; Wang, Menghua
2016-01-01
The Visible Infrared Imaging Radiometer Suite (VIIRS) is one of five instruments onboard the Suomi National Polar-Orbiting Partnership (SNPP) satellite that launched from Vandenberg Air Force Base, California, on October 28, 2011. It is a whiskbroom radiometer that provides +/-56.28deg scans of the Earth view. It has 22 bands, among which 14 are reflective solar bands (RSBs). The RSBs cover a wavelength range from 410 to 2250 nm. The RSBs of a remote sensor are usually sensitive to the polarization of incident light. For VIIRS, it is specified that the polarization factor should be smaller than 3% for 410 and 862 nm bands and 2.5% for other RSBs for the scan angle within +/-45deg. Several polarization sensitivity tests were performed prelaunch for SNPP VIIRS. The first few tests either had large uncertainty or were less reliable, while the last one was believed to provide the more accurate information about the polarization property of the instrument. In this paper, the measured data in the last polarization sensitivity test are analyzed, and the polarization factors and phase angles are derived from the measurements for all the RSBs. The derived polarization factors and phase angles are band, detector, and scan angle dependent. For near-infrared bands, they also depend on the half-angle mirror side. Nevertheless, the derived polarization factors are all within the specification, although the strong detector dependence of the polarization parameters was not expected. Compared to the Moderate Resolution Imaging Spectroradiometer on both Aqua and Terra satellites, the polarization effect on VIIRS RSB is much smaller.
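Polarization factors and phase angles of the kind derived here are conventionally obtained by fitting a second-harmonic modulation to the detector response versus sheet-polarizer angle. The sketch below shows that generic Fourier fit, not the exact VIIRS test procedure.

```python
import math

def polarization_fit(angles_deg, responses):
    """Fit DN(a) = c0 * (1 + p * cos(2a - 2d)) to responses measured
    at evenly spaced polarizer angles; return the polarization factor
    p and phase angle d (degrees). Assumes a full 180-degree sweep so
    the harmonics are orthogonal."""
    n = len(responses)
    c0 = sum(responses) / n
    a = 2.0 / n * sum(r * math.cos(2.0 * math.radians(t))
                      for t, r in zip(angles_deg, responses))
    b = 2.0 / n * sum(r * math.sin(2.0 * math.radians(t))
                      for t, r in zip(angles_deg, responses))
    p = math.sqrt(a * a + b * b) / c0
    delta = 0.5 * math.degrees(math.atan2(b, a))
    return p, delta
```

With six angles spaced 30 degrees apart, a synthetic response with p = 0.02 and d = 10 degrees is recovered exactly by the fit.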
Liu, Lei; Bai, Yu-Guang; Zhang, Da-Li; Wu, Zhi-Gang
2013-01-01
The measurement and control strategy of a piezo-based platform by using strain gauge sensors (SGS) and a robust composite controller is investigated in this paper. First, the experimental setup is constructed by using a piezo-based platform, SGS sensors, an AD5435 platform and two voltage amplifiers. Then, the measurement strategy to measure the tip/tilt angles accurately in the order of sub-μrad is presented. A comprehensive composite control strategy design to enhance the tracking accuracy with a novel driving principle is also proposed. Finally, an experiment is presented to validate the measurement and control strategy. The experimental results demonstrate that the proposed measurement and control strategy provides accurate angle motion with a root mean square (RMS) error of 0.21 μrad, which is approximately equal to the noise level. PMID:23860316
Full-parallax 3D display from stereo-hybrid 3D camera system
NASA Astrophysics Data System (ADS)
Hong, Seokmin; Ansari, Amir; Saavedra, Genaro; Martinez-Corral, Manuel
2018-04-01
In this paper, we propose an innovative approach for producing the microimages ready to display on an integral-imaging monitor. Our main contribution is the use of a stereo-hybrid 3D camera system to pick up a pair of 3D data sets and compose a denser point cloud. An intrinsic difficulty is that the hybrid sensors have dissimilarities and therefore must be equalized. The processed data facilitate generating an integral image after computationally projecting the information through a virtual pinhole array. We illustrate this procedure with imaging experiments that provide microimages with enhanced quality. After projection of these microimages onto the integral-imaging monitor, 3D images are produced with large parallax and viewing angle.
A see-through holographic head-mounted display with the large viewing angle
NASA Astrophysics Data System (ADS)
Chen, Zhidong; sang, Xinzhu; Lin, Qiaojun; Li, Jin; Yu, Xunbo; Gao, Xin; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu; Xie, Songlin
2017-02-01
A novel solution for a large-viewing-angle holographic head-mounted display (HHMD) is presented. Divergent light is used for the hologram illumination to construct a large three-dimensional object outside the display at a short distance. A specially designed projection lens with a large numerical aperture projects the object constructed by the hologram to its real location. The presented solution can realize a compact HHMD system with a large field of view. The basic principle and the structure of the system are described. An augmented reality (AR) prototype with a size of 50 mm × 40 mm and a viewing angle above 60° is demonstrated.
DC-8 Scanning Lidar Characterization of Aircraft Contrails and Cirrus Clouds
NASA Technical Reports Server (NTRS)
Uthe, Edward E.; Nielsen, Norman B.; Oseberg, Terje E.
1998-01-01
An angular-scanning large-aperture (36 cm) backscatter lidar was developed and deployed on the NASA DC-8 research aircraft as part of the SUCCESS (Subsonic Aircraft: Contrail and Cloud Effects Special Study) program. The lidar viewing direction could be scanned continuously during aircraft flight from vertically upward to forward to vertically downward, or the viewing could be at fixed angles. Real-time pictorial displays generated from the lidar signatures were broadcast on the DC-8 video network and used to locate clouds and contrails above, ahead of, and below the DC-8 to depict their spatial structure and to help select DC-8 altitudes for achieving optimum sampling by onboard in situ sensors. Several lidar receiver systems and real-time data displays were evaluated to help extend in situ data into vertical dimensions and to help establish possible lidar configurations and applications on future missions. Digital lidar signatures were recorded on 8 mm Exabyte tape and generated real-time displays were recorded on 8mm video tape. The digital records were transcribed in a common format to compact disks to facilitate data analysis and delivery to SUCCESS participants. Data selected from the real-time display video recordings were processed for publication-quality displays incorporating several standard lidar data corrections. Data examples are presented that illustrate: (1) correlation with particulate, gas, and radiometric measurements made by onboard sensors, (2) discrimination and identification between contrails observed by onboard sensors, (3) high-altitude (13 km) scattering layer that exhibits greatly enhanced vertical backscatter relative to off-vertical backscatter, and (4) mapping of vertical distributions of individual precipitating ice crystals and their capture by cloud layers. An angular scan plotting program was developed that accounts for DC-8 pitch and velocity.
NASA Astrophysics Data System (ADS)
Tagesson, T.; Fensholt, R.; Huber, S.; Horion, S.; Guiro, I.; Ehammer, A.; Ardo, J.
2015-08-01
This paper investigates how hyperspectral reflectance (between 350 and 1800 nm) can be used to infer ecosystem properties for a semi-arid savanna grassland in West Africa using a unique in situ multi-angular data set of hemispherical conical reflectance factor (HCRF) measurements. Relationships between seasonal dynamics in hyperspectral HCRF and ecosystem properties (biomass, gross primary productivity (GPP), light use efficiency (LUE), and fraction of photosynthetically active radiation absorbed by vegetation (FAPAR)) were analysed. HCRF data (ρ) were used to study the relationship between normalised difference spectral indices (NDSIs) and the measured ecosystem properties. Finally, the effects of variable sun-sensor viewing geometry on different NDSI wavelength combinations were analysed. The wavelengths with the strongest correlation to seasonal dynamics in ecosystem properties were the shortwave infrared (biomass), the peak absorption band of chlorophyll a and b at 682 nm (GPP), the oxygen A band at 761 nm used for estimating chlorophyll fluorescence (GPP and LUE), and blue wavelengths (ρ412) (FAPAR). The NDSI with the strongest correlation to (i) biomass combined red-edge HCRF (ρ705) with green HCRF (ρ587), (ii) GPP combined wavelengths at the peak of green reflection (ρ518, ρ556), (iii) LUE combined red (ρ688) with blue HCRF (ρ436), and (iv) FAPAR combined blue (ρ399) and near-infrared (ρ1295) wavelengths. NDSIs combining near-infrared and shortwave infrared were strongly affected by solar zenith angle and sensor viewing geometry, as were many combinations of visible wavelengths. This study provides analyses based upon novel multi-angular hyperspectral data for validation of Earth-observation-based properties of semi-arid ecosystems, as well as insights for designing the spectral characteristics of future sensors for ecosystem monitoring.
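The NDSI band pairs reported above all share the standard normalized-difference form. A minimal sketch of that computation (the reflectance values below are illustrative placeholders, not data from the study):

```python
# Normalized difference spectral index (NDSI) of two reflectance (HCRF) values,
# the form used for all the band pairs in the abstract above.
def ndsi(rho_1, rho_2):
    return (rho_1 - rho_2) / (rho_1 + rho_2)

# Hypothetical red-edge/green pair (rho_705, rho_587) for the biomass index.
rho_705 = 0.18
rho_587 = 0.09
print(round(ndsi(rho_705, rho_587), 3))  # -> 0.333
```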
Utilization of optical sensors for phasor measurement units
Yao, Wenxuan; Wells, David; King, Daniel; ...
2017-11-10
With the help of GPS signals for synchronization, increasingly ubiquitous phasor measurement units (PMUs) provide power grid operators unprecedented system monitoring and control opportunities. However, the performance of PMUs is limited by inherent deficiencies in traditional transformers. To address these issues, an optical sensor is used in the PMU for signal acquisition, replacing the traditional transformers. This is the first reported use of an optical sensor in PMUs. The accuracy of frequency, angle, and amplitude is evaluated via experiments. The optical-sensor-based PMU achieves an accuracy of 9.03 × 10⁻⁴ Hz for frequency, 6.38 × 10⁻³ rad for angle, and 6.73 × 10⁻² V for amplitude with a real power grid signal, demonstrating the practicability of optical sensors in future PMUs.
D-shaped tilted fiber Bragg grating using magnetic fluid for magnetic field sensor
NASA Astrophysics Data System (ADS)
Ying, Yu; Zhang, Rui; Si, Guang-Yuan; Wang, Xin; Qi, Yuan-Wei
2017-12-01
In our work, a numerical investigation of a magnetic field sensor based on a D-shaped tilted fiber Bragg grating and magnetic fluid is performed. The sensing probe is constructed by placing a magnetic fluid film on the flat surface of the D-shaped tilted fiber Bragg grating. We investigate the resonance wavelengths of the proposed structure for grating tilt angles ranging from 0° to 20°, and analyze the magnetic field sensing characteristics. The simulation results show that the optical fiber sensor exhibits optimal transmission characteristics at a tilt angle of 8°. The wavelength sensitivity of the magnetic field sensor is as high as -0.18 nm/Oe in the range of 30 Oe to 270 Oe, with a linearity of R2 = 0.9998. Such a sensor has potential applications in magnetic field sensing.
Study on the Ag Nanowire/PDMS Pressure Sensors with Three-Layer and Back-to-Back Structures
NASA Astrophysics Data System (ADS)
Wu, Jianhao; Lan, Qiuming; Yang, Weijia; He, Xin; Yue, Yunting; Jiang, Jiayi; Jiang, Tinghui
2018-01-01
Ag nanowire (NW)/polydimethylsiloxane (PDMS) pressure sensors with three-layer and back-to-back structures were fabricated by a coating-peeling method. The bending and pressing responses of the sensors were comparatively investigated. The results reveal that the two kinds of pressure sensors show similar response linearity in the bending test over a bending angle of 0-180°. However, the response sensitivity of the three-layer pressure sensor is superior to that of the back-to-back one; its capacitance (Y) varies with bending angle (X) as Y = 0.01244X + 2.9763. Conversely, in the pressing test, the response sensitivity of the back-to-back sensor is better than that of the three-layer one. The relationship between its capacitance (Y) and the number of paper clips applied as load (X2) is Y = 0.09241X2 + 88.03597.
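The reported bending calibration of the three-layer sensor is a simple linear relation; a minimal sketch that evaluates it (the capacitance unit and the strict 0-180° validity range are assumptions, not stated in the abstract):

```python
# Reported linear bending response of the three-layer sensor:
# Y = 0.01244 * X + 2.9763, with X the bending angle in degrees
# (unit of Y assumed to be pF).
def capacitance(angle_deg):
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("fit reported for bending angles of 0-180 degrees")
    return 0.01244 * angle_deg + 2.9763

for x in (0, 90, 180):
    print(x, round(capacitance(x), 4))
```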
Visual Costs of the Inhomogeneity of Luminance and Contrast by Viewing LCD-TFT Screens Off-Axis.
Ziefle, Martina; Groeger, Thomas; Sommer, Dietmar
2003-01-01
In this study the anisotropic characteristics of TFT-LCD (Thin-Film-Transistor Liquid Crystal Display) screens were examined. Anisotropy occurs because the distribution of luminance and contrast changes over the screen surface at different viewing angles. On the basis of detailed photometric measurements, detection performance in a visual reaction task was measured under different viewing conditions. Viewing angle (0 degrees, frontal view; 30 degrees, off-axis; 50 degrees, off-axis) as well as ambient lighting (a dark or illuminated room) were varied. Reaction times and accuracy of detection performance were recorded. Results showed the TFT's anisotropy to be a crucial factor degrading performance: with increasing viewing angle, performance decreased. It is concluded that TFT anisotropy is a limiting factor for the overall suitability and usefulness of this new display technology.
Preferred viewing distance of liquid crystal high-definition television.
Lee, Der-Song
2012-01-01
This study explored the effect of TV size, illumination, and viewing angle on preferred viewing distance for high-definition liquid crystal display televisions (HDTV). Results showed that the mean preferred viewing distance was 2856 mm. TV size and illumination significantly affected preferred viewing distance. The larger the screen size, the greater the preferred viewing distance, at around 3-4 times the width of the screen (W). The greater the illumination, the greater the preferred viewing distance. Viewing angle also correlated significantly with preferred viewing distance: the further the view deviated from a direct frontal view, the shorter the preferred viewing distance. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
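The 3-4 W rule of thumb above is easy to apply; a sketch for a hypothetical 50-inch 16:9 set (the diagonal size and aspect ratio are illustrative assumptions, not cases from the study):

```python
import math

# Screen width from diagonal size for a 16:9 panel, then the 3-4x width band
# that the study reports as the preferred viewing distance.
def width_mm(diag_in, ar_w=16, ar_h=9):
    return diag_in * 25.4 * ar_w / math.hypot(ar_w, ar_h)

w = width_mm(50)  # ~1107 mm for a 50-inch 16:9 screen
print(round(3 * w), round(4 * w))  # preferred viewing distance band in mm
```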
MONET: multidimensional radiative cloud scene model
NASA Astrophysics Data System (ADS)
Chervet, Patrick
1999-12-01
All cloud fields exhibit variable structure (bulges) and heterogeneity in water distribution. With the development of multidimensional radiative models by the atmospheric community, it is now possible to describe horizontal heterogeneities of the cloud medium and to study their influence on radiative quantities. We have developed a complete radiative cloud scene generator, called MONET (French acronym for MOdelisation des Nuages En Tridim.), to compute radiative cloud scenes from visible to infrared wavelengths for various viewing and solar conditions, different spatial scales, and various locations on the Earth. MONET is composed of two parts: a cloud medium generator (CSSM -- Cloud Scene Simulation Model) developed by the Air Force Research Laboratory, and a multidimensional radiative code (SHDOM -- Spherical Harmonic Discrete Ordinate Method) developed at the University of Colorado by Evans. MONET computes images for scenarios defined by user inputs: date, location, viewing angles, wavelength, spatial resolution, and meteorological conditions (atmospheric profiles, cloud types). For the same cloud scene, it can output different viewing conditions and/or various wavelengths. Shadowing effects on clouds or the ground are taken into account. This code is useful for studying heterogeneity effects on satellite data for various cloud types and spatial resolutions, and for determining specifications of new imaging sensors.
Flight test of MMW radar for brown-out helicopter landing
NASA Astrophysics Data System (ADS)
Martin, Christopher A.; Kolinko, Vladimir; Otto, Gregory P.; Lovberg, John A.
2012-06-01
Trex Enterprises and the US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate developed and tested a helicopter radar to aid in brown-out landing situations. A brown-out occurs when sand and dust kicked up by the helicopter rotors impair the pilot's vision. Millimeter-wave (MMW) radiation penetrates sand and dust with little loss or scattering, and radar at this frequency can provide a pilot with an image of the intended landing zone. The Brown-out Situational Awareness System (BSAS) is a frequency-modulated continuous-wave radar that measures range to the ground across a conical field of view and uses that range information to create an image for the pilot. The BSAS collected imagery from a helicopter in a blowing-sand environment with obstacles including ditches, hills, posts, poles, wires, buildings, and vehicles. The BSAS demonstrated the capability to form images of the ground through heavy blowing sand and to resolve images of some obstacles. The BSAS also attempted to differentiate flat ground from bumpy ground, with limited success at some viewing angles. The BSAS test imagery includes some artifacts formed by high radar cross-section targets in the field of view or sidelobes. The paper discusses future improvements that could limit these artifacts.
High-precision angle sensor based on a Köster’s prism with absolute zero-point
NASA Astrophysics Data System (ADS)
Ullmann, V.; Oertel, E.; Manske, E.
2018-06-01
In this publication, a novel approach is presented that uses a compact white-light interferometer based on a Köster’s prism for angle measurements. Experiments show that the resolution of this angle interferometer is in the range of a commercial digital autocollimator with a focal length of f = 300 mm, but with clearly reduced signal noise and without the overshoot artifacts in the signal caused by digital filters. The angle detection of the reference mirror in the Köster’s interferometer is based on analysing the rotation angle of the fringe pattern, which is projected onto a CMOS matrix. The fringe pattern is generated by two displaced spherical wave fronts that originate from one fiber-coupled white-light source and are divided into a reference and a measurement beam by the Köster’s prism. The displacement correlates with the reference mirror angle in one linear direction on the CMOS sensor, and with the angle aberrations of the prism in the other, orthogonal direction. We present the experimental and optical setup, the method and algorithms for the image-to-angle processing, as well as the experimental results obtained in calibration and long-term measurements.
Hurwitz, David S; Pradhan, Anuj; Fisher, Donald L; Knodler, Michael A; Muttart, Jeffrey W; Menon, Rajiv; Meissner, Uwe
2010-04-01
Backing crash injuries can be severe; approximately 200 of the 2,500 reported injuries of this type per year to children under the age of 15 years result in death. Technology for assisting drivers when backing has had limited success in preventing backing crashes. Two questions are addressed: Why is the reduction in backing crashes moderate when rear-view cameras are deployed? Could rear-view cameras augment sensor systems? 46 drivers (36 experimental, 10 control) completed 16 parking trials over 2 days (eight trials per day) at a parking facility at UMass Amherst, USA. Experimental participants were provided with a sensor-camera system; controls were not. Three crash scenarios were introduced. The 46 drivers (33 men, 13 women), average age 29 years, were Massachusetts residents licensed within the USA for an average of 9.3 years, driving vehicles equipped with a rear-view camera and a sensor-system-based parking aid. Outcome measures were the subject's eye fixations while driving and the researcher's observation of collisions with objects during backing. Only 20% of drivers looked at the rear-view camera before backing, and 88% of those did not crash. Of those who did not look at the rear-view camera before backing, 46% looked after the sensor warned the driver. This study indicates that drivers not only attend to an audible warning, but will look at a rear-view camera if available. Evidence suggests that when used appropriately, rear-view cameras can mitigate the occurrence of backing crashes, particularly when paired with an appropriate sensor system.
NASA Astrophysics Data System (ADS)
Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.
2018-04-01
Cross-calibration has the advantages of high precision, low resource requirements, and simple implementation, and has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiometric and geometric matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of the WFV-OLI image pairs. A radiative transfer model was used to eliminate differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). The near-infrared band of the WFV1 camera encompasses water vapor absorption bands, so a look-up table (LUT) of SMF as a function of water vapor amount was established to estimate the water vapor effects. A surface synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Moes, Timothy R.
1991-01-01
The accuracy of a nonintrusive high-angle-of-attack flush airdata sensing (HI-FADS) system was verified for quasi-steady flight conditions up to 55 deg angle of attack during the F-18 High Alpha Research Vehicle (HARV) Program. The system is a matrix of nine pressure ports arranged in annular rings on the aircraft nose. The complete airdata set is estimated using nonlinear regression. Satisfactory frequency response was verified to the system Nyquist frequency (12.5 Hz). The effects of acoustical distortions within the individual pressure sensors of the nonintrusive pressure matrix on overall system performance are addressed. To quantify these effects, a frequency-response model describing the dynamics of acoustical distortion is developed and simple design criteria are derived. The model adjusts measured HI-FADS pressure data for the acoustical distortion and quantifies the effects of internal sensor geometries on system performance. Analysis results indicate that sensor frequency-response characteristics vary greatly with altitude, making it difficult to select a sensor geometry satisfactory for all altitudes. The solution used presample filtering to eliminate resonance effects and short pneumatic tubing sections to reduce lag effects. Without presample signal conditioning, the system designer must use the pneumatic transmission line to attenuate the resonances and accept the resulting altitude variability.
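The acoustic-resonance problem described above can be illustrated with the textbook quarter-wave resonance of a pneumatic tube closed at the sensor end. This is a generic estimate, not the paper's frequency-response model, and the tube length and sound speeds are assumed values:

```python
# First quarter-wave resonance of a pneumatic tube closed at the sensor end:
# f = c / (4 * L). The speed of sound c falls with altitude (temperature),
# which is one reason sensor frequency response varies with altitude.
def quarter_wave_hz(length_m, sound_speed_mps=340.0):
    return sound_speed_mps / (4.0 * length_m)

print(round(quarter_wave_hz(0.3), 1))         # 0.3 m tube, sea-level sound speed
print(round(quarter_wave_hz(0.3, 295.0), 1))  # same tube, colder high-altitude air
```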
NASA Astrophysics Data System (ADS)
Liu, Zhilong; Wang, Biao; Tong, Weichao
2015-08-01
This paper describes a solar automatic tracking wireless charging system based on a four-quadrant photoelectric sensor. The system tracks the sun's rays automatically in real time to receive the maximum energy and charges the load wirelessly through electromagnetic coupling. Because the four-quadrant photoelectric sensor responds to the solar spectrum, the system can determine the current azimuth and elevation angle of the light by calculating the distribution of solar energy incident on the sensor. The system drives the solar panel through a biaxial movement mechanism, rotating and tilting until the panel is perpendicular to the incident light. This maximizes the use of solar energy and requires no external power supply, achieving energy self-sufficiency. The collected solar energy can wirelessly charge portable devices and loads by near-field electromagnetic coupling. Experimental data show that the four-quadrant photoelectric sensor is highly sensitive in light-angle measurement: when tracking the sun, the azimuth deviation is less than 0.8° and the elevation angle deviation is less than 0.6°. The efficiency of a conventional fixed solar cell is only 10%-20%; the four-quadrant dual-axis tracking system raises the utilization rate to 25%-35%. The wireless charging electromagnetic coupling efficiency reaches 60%.
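The pointing-error computation for a four-quadrant detector can be sketched with the textbook relations; the paper does not give its exact formulas, so the quadrant layout and signal definitions below are assumptions:

```python
# Standard four-quadrant detector error signals (textbook relation, not the
# paper's exact formulas). A, B, C, D are the photocurrents of the upper-left,
# upper-right, lower-left, and lower-right quadrants.
def spot_offset(a, b, c, d):
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total  # horizontal error -> azimuth correction
    y = ((a + b) - (c + d)) / total  # vertical error -> elevation correction
    return x, y

# Equal illumination of all quadrants means the panel is aimed at the sun.
print(spot_offset(1.0, 1.0, 1.0, 1.0))  # -> (0.0, 0.0)
```

The biaxial drive would servo both error signals toward zero, which is the condition for the panel being perpendicular to the incident light.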
Development of the Multi-Angle Stratospheric Aerosol Radiometer (MASTAR) Instrument
NASA Astrophysics Data System (ADS)
DeLand, M. T.; Colarco, P. R.; Kowalewski, M. G.; Gorkavyi, N.; Ramos-Izquierdo, L.
2017-12-01
Aerosol particles in the stratosphere (15-25 km altitude), both produced naturally and perturbed by volcanic eruptions and anthropogenic emissions, continue to be a source of significant uncertainty in the Earth's energy budget. Stratospheric aerosols can offset some of the warming effects caused by greenhouse gases. These aerosols are currently monitored using measurements from the Ozone Mapping and Profiling Suite (OMPS) Limb Profiler (LP) instrument on the Suomi NPP satellite. In order to improve the sensitivity and spatial coverage of these aerosol data, we are developing an aerosol-focused compact version of the OMPS LP sensor, called the Multi-Angle Stratospheric Aerosol Radiometer (MASTAR), to fly on a 3U CubeSat, using a NASA Instrument Incubator Program (IIP) grant. This instrument will make limb-viewing measurements of the atmosphere in multiple directions simultaneously, and uses only a few selected wavelengths to reduce size and cost. An initial prototype version has been constructed using NASA GSFC internal funding and tested in the laboratory. Current design work is targeted towards a preliminary field test in Spring 2018. We will discuss the scientific benefits of MASTAR and the status of the project.
Bio-inspired multi-mode optic flow sensors for micro air vehicles
NASA Astrophysics Data System (ADS)
Park, Seokjun; Choi, Jaehyuk; Cho, Jihyun; Yoon, Euisik
2013-06-01
Monitoring wide-field surrounding information is essential for vision-based autonomous navigation in micro air vehicles (MAVs). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide-field-of-view optic flow estimation (μ-Compound eyes) and to attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor, which generates both optic flows and normal images in different modes for μ-Compound eyes and μ-Ocelli applications. The sensor employs a time-stamp-based optic flow algorithm, modified from the conventional EMD (Elementary Motion Detector) algorithm, to give an optimum partitioning of hardware blocks in the analog and digital domains as well as adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which may require huge hardware resources if implemented in the digital domain, is retained in a pixel-level analog processing unit. The rest of the blocks, including feature detection and time-stamp latching, are implemented using digital circuits in a column-parallel processing unit. Finally, time-stamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, thus significantly reducing core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows with the integrated mixed-mode algorithm core and 2-D optic flows with external time-stamp processing.
System For Characterizing Three-Phase Brushless dc Motors
NASA Technical Reports Server (NTRS)
Howard, David E.; Smith, Dennis A.
1996-01-01
System of electronic hardware and software developed to automate measurements and calculations needed to characterize electromechanical performances of three-phase brushless dc motors, associated shaft-angle sensors needed for commutation, and associated brushless tachometers. System quickly takes measurements on all three phases of motor, tachometer, and shaft-angle sensor simultaneously and processes measurements into performance data. Also useful in development and testing of motors with not only three phases but also two, four, or more phases.
Shamshirband, Shahaboddin; Banjanovic-Mehmedovic, Lejla; Bosankic, Ivan; Kasapovic, Suad; Abdul Wahab, Ainuddin Wahid Bin
2016-01-01
Intelligent Transportation Systems rely on understanding, predicting, and affecting the interactions between vehicles. The goal of this paper is to choose a small subset of the recorded parameters so that the resulting regression model is simple yet has good predictive ability for the speed of a vehicle agent relative to a vehicle intruder. The method of ANFIS (adaptive neuro-fuzzy inference system) was applied to the data resulting from these measurements. The ANFIS process for variable selection was implemented in order to detect the predominant variables affecting the prediction of agent speed relative to the intruder. This process includes several ways to discover a subset of the total set of recorded parameters showing good predictive capability. The ANFIS network was used to perform a variable search, and then to determine how nine parameters (intruder front sensors active (boolean), intruder rear sensors active (boolean), agent front sensors active (boolean), agent rear sensors active (boolean), RSSI signal strength (integer), elapsed time (s), distance between agent and intruder (m), angle of agent relative to intruder (°), and altitude difference between agent and intruder (m)) influence prediction of agent speed relative to the intruder. The results indicated that the distance between the vehicle agent and the vehicle intruder (m) and the angle of the vehicle agent relative to the vehicle intruder (°) are the most influential parameters for the agent's speed relative to the intruder.
Presentation of a new BRDF measurement device
NASA Astrophysics Data System (ADS)
Serrot, Gerard; Bodilis, Madeleine; Briottet, Xavier; Cosnefroy, Helene
1998-12-01
The bi-directional reflectance distribution function (BRDF) plays a major role in evaluating and analyzing signals reflected by the Earth in the solar spectrum. A BRDF measurement device that covers a large spectral and directional domain was recently developed by ONERA/DOTA. It was designed to allow both laboratory and outdoor measurements. Its main characteristics are: a spectral domain of 0.42-0.95 micrometers; a geometrical domain of 0-60 degrees for zenith angle and 0-180 degrees for azimuth; and a maximum target size for nadir measurements of 22 cm. For a given zenith angle of the source, the BRDF device needs about seven minutes to take measurements for a viewing zenith angle varying from 0-60 degrees and a relative azimuth angle varying from 0-180 degrees. The performance, imperfections, and properties of each component of the measurement chain are studied. Part of the work was devoted to characterizing the source precisely, particularly the spatial variability of the irradiance at the target level, the temporal stability, and the spectral profile of the lamp. Some of these imperfections are modeled and taken into account in corrections of the BRDF measurements. Concerning the sensor, a wavelength calibration was performed, and measurements were validated on a reference target whose bi-directional reflectance is well known. Software was developed to convert all the raw data acquired automatically into BRDF values. To illustrate measurements taken by this device, some results are also presented here, acquired over sand and short grass for different wavelengths and geometrical conditions.
Examining view angle effects on leaf N estimation in wheat using field reflectance spectroscopy
NASA Astrophysics Data System (ADS)
Song, Xiao; Feng, Wei; He, Li; Xu, Duanyang; Zhang, Hai-Yan; Li, Xiao; Wang, Zhi-Jie; Coburn, Craig A.; Wang, Chen-Yang; Guo, Tian-Cai
2016-12-01
Real-time, nondestructive monitoring of crop nitrogen (N) status is a critical factor for precision N management during wheat production. Over a 3-year period, we analyzed different wheat cultivars grown under different experimental conditions in China and Canada and studied the effects of viewing angle on the relationships between various vegetation indices (VIs) and leaf nitrogen concentration (LNC) using hyperspectral data from 11 field experiments. The objective was to improve prediction accuracy by minimizing the effects of viewing angle on LNC estimation and to construct a novel vegetation index (VI) for use under different experimental conditions. We examined the stability of previously reported optimum VIs, selected from 13 traditional indices for estimating LNC, at 13 viewing zenith angles (VZAs) in the solar principal plane (SPP). The backscattering direction showed better index performance than the forward scattering direction. Red-edge VIs, including the modified normalized difference vegetation index (mND705), the ratio index within the red-edge region (RI-1dB), and the normalized difference red edge index (NDRE), were highly correlated with LNC, as confirmed by high R2 determination coefficients. However, these common VIs tended to saturate, and the relationships strongly depended on experimental conditions. To overcome the influence of VZA on VIs, the chlorophyll- and LNC-sensitive NDRE index was divided by the floating-position water band index (FWBI) to generate an integrated narrow-band vegetation index. The highest correlation between the novel NDRE/FWBI parameter and LNC (R2 = 0.852) occurred at -10°, while the lowest correlation (R2 = 0.745) occurred at 60°. NDRE/FWBI was more highly correlated with LNC than existing commonly used VIs at identical viewing zenith angles.
Upon further analysis of angle combinations, our novel VI exhibited the best performance, with the best prediction accuracy at 0° to -20° (R2 = 0.838, RMSE = 0.360) and relatively good accuracy at 0° to -30° (R2 = 0.835, RMSE = 0.366). Because plant N status can be monitored over a wide range of angles using portable spectrometers, viewing angles of 0° to -30° are common. Consequently, we developed a unified model across angles of 0° to -30° to reduce the effects of viewing angle on LNC prediction in wheat. The proposed combined NDRE/FWBI parameter, designated the wide-angle-adaptability nitrogen index (WANI), is superior for estimating LNC in wheat on a regional scale in China and Canada.
Photometric normalization of LROC WAC images
NASA Astrophysics Data System (ADS)
Sato, H.; Denevi, B.; Robinson, M. S.; Hapke, B. W.; McEwen, A. S.; LROC Science Team
2010-12-01
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) acquires near-global coverage on a monthly basis. The WAC is a push-frame sensor with a 90° field of view (FOV) in BW mode and 60° FOV in 7-color mode (320 nm to 689 nm). WAC images are acquired during each orbit in 10° latitude segments with cross-track coverage of ~50 km. Before mosaicking, WAC images are radiometrically calibrated to remove instrumental artifacts and to convert at-sensor radiance to I/F. Images are also photometrically normalized to common viewing and illumination angles (30° phase), a challenge due to the wide-angle nature of the WAC, where large differences in phase angle are observed in a single image line (±30°). During a single month the equatorial incidence angle drifts about 28°, and over the course of ~1 year the lighting completes a 360° cycle. The light scattering properties of the lunar surface depend on incidence (i), emission (e), and phase (p) angles as well as soil properties such as single-scattering albedo and roughness that vary with terrain type and state of maturity [1]. We first tested a Lommel-Seeliger Correction (LSC) [cos(i)/(cos(i) + cos(e))] [2] with a phase function defined by an exponential decay plus a 4th-order polynomial term [3], which did not provide an adequate solution. Next we employed a LSC with an exponential 2nd-order decay phase correction, which was an improvement but still exhibited unacceptable frame-to-frame residuals. In both cases we fitted the LSC I/F vs. phase angle to derive the phase corrections. To date, the best results are with a lunar-Lambert function [4] with exponential 2nd-order decay phase correction (LLEXP2) [(A1exp(B1p)+A2exp(B2p)+A3) * cos(i)/(cos(e) + cos(i)) + B3cos(i)]. We derived the parameters for the LLEXP2 from repeat imaging of a small region and then corrected that region with excellent results.
When this correction was applied to the whole Moon the results were less than optimal, which is no surprise given the variability of the regolith from region to region. As the fitting area increases, the accuracy of curve fitting decreases due to the larger variety of albedo, topography, and composition. Thus we have adopted an albedo-dependent photometric normalization routine. Phase curves are derived for discrete bins of preliminary normalized reflectance calculated from the Clementine global mosaic in a fitting area that is composed of predominantly mare in Oceanus Procellarum. The global WAC mosaic was then corrected pixel-by-pixel according to its preliminary reflectance map with satisfactory results. We observed that the phase curves per normalized-reflectance bin become steeper as the reflectance value increases. Further filtering by FeO, TiO2, or optical maturity [5] for parameter calculations may help elucidate the effects of surface composition and maturity on photometric properties of the surface. [1] Hapke, B.W. (1993) Theory of Reflectance and Emittance Spectroscopy, Cambridge Univ. Press. [2] Schoenberg (1925) Ada. Soc. Febb., vol. 50. [3] Hillier et al. (1999) Icarus 141, 205-225. [4] McEwen (1991) Icarus 92, 298-311. [5] Lucey et al. (2000) JGR, v105, no E8, p20377-20386.
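The LLEXP2 normalization quoted above can be sketched numerically. The parameters A1..B3 below are illustrative placeholders (the fitted values are not given in the abstract), and the phase angle is assumed to enter the exponentials in degrees:

```python
import numpy as np

# Hypothetical LLEXP2 parameters for illustration only; the actual fitted
# values from the repeat-imaging region are not stated in the abstract.
A1, B1, A2, B2, A3, B3 = 0.1, -0.05, 0.2, -0.01, 0.3, 0.05

def llexp2(i_deg, e_deg, p_deg):
    """Lunar-Lambert term with 2nd-order exponential phase function (LLEXP2)."""
    i, e = np.radians(i_deg), np.radians(e_deg)
    phase = A1 * np.exp(B1 * p_deg) + A2 * np.exp(B2 * p_deg) + A3
    return phase * np.cos(i) / (np.cos(e) + np.cos(i)) + B3 * np.cos(i)

def normalize_iof(iof, i_deg, e_deg, p_deg):
    """Normalize observed I/F to the standard geometry (i=30°, e=0°, p=30°)."""
    return iof * llexp2(30.0, 0.0, 30.0) / llexp2(i_deg, e_deg, p_deg)

# At the standard geometry the correction factor is exactly 1.
print(normalize_iof(0.05, 30.0, 0.0, 30.0))  # 0.05
```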
Lee, Ji-Hoon; Lee, Jung Jin; Lim, Young Jin; Kundu, Sudarshan; Kang, Shin-Woong; Lee, Seung Hee
2013-11-04
Long-standing electro-optic problems of polymer-dispersed liquid crystals (PDLCs), such as low contrast ratio and decreased transmittance at oblique viewing angles, have been addressed with a mixture of dual-frequency liquid crystal (DFLC) and reactive mesogen (RM). The DFLC and RM molecules were vertically aligned and then photo-polymerized using UV light. In the scattering state under a 50 kHz electric field, the DFLC was switched to the planar state, giving a greater extraordinary refractive index than the normal PDLC cell. Consequently, the scattering intensity and the contrast ratio were increased compared to the conventional PDLC cell. In the transparent state under a 1 kHz electric field, the extraordinary refractive index of the DFLC was matched with the refractive index of the vertically aligned RM so that light scattering at oblique viewing angles was minimized, giving rise to high transmittance at all viewing angles.
NASA Astrophysics Data System (ADS)
Ermida, Sofia; DaCamara, Carlos C.; Trigo, Isabel F.; Pires, Ana C.; Ghent, Darren
2017-04-01
Land Surface Temperature (LST) is a key climatological variable and a diagnostic parameter of land surface conditions. Remote sensing constitutes the most effective method to observe LST over large areas and on a regular basis. Although LST estimation from remote sensing instruments operating in the infrared (IR) is widely used and has been performed for nearly three decades, several open issues remain. One of these is the dependence of LST on viewing and illumination geometry. This effect introduces significant discrepancies among LST estimates from different sensors overlapping in space and time that are not related to uncertainties in the methodologies or input data used. Furthermore, these directional effects cause LST products to deviate from an ideally defined LST, which should represent the ensemble of directional radiometric temperatures of all surface elements within the FOV. Angular effects on LST are here estimated by means of a kernel model of the surface thermal emission, which describes the angular dependence of LST as a function of viewing and illumination geometry. The model is calibrated using LST data as provided by a wide range of sensors to optimize spatial coverage, namely: 1) a LEO sensor, the Moderate Resolution Imaging Spectroradiometer (MODIS) on board NASA's TERRA and AQUA; and 2) three GEO sensors, the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board EUMETSAT's Meteosat Second Generation (MSG), the Japanese Meteorological Imager (JAMI) on board the Japan Meteorological Agency (JMA) Multifunction Transport SATellite (MTSAT-2), and NASA's Geostationary Operational Environmental Satellites (GOES). As shown in our previous feasibility studies, the sampling of illumination and view angles has a high impact on the obtained model parameters. This impact may be mitigated when the sampling size is increased by aggregating pixels with similar surface conditions.
Here we propose a methodology where the land surface is stratified by means of a cluster analysis using information on land cover type, fraction of vegetation cover, and topography. The kernel model is then adjusted to LST data corresponding to each cluster. It is shown that the quality of the cluster-based kernel model is very close to that of the pixel-based one. Furthermore, the reduced number of parameters (limited to the number of identified clusters, instead of a pixel-by-pixel model calibration) allows improving the kernel model through the incorporation of a seasonal component. The application of the procedure discussed here toward the harmonization of LST products from multiple sensors is carried out within the framework of the ESA DUE GlobTemperature project.
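As a rough illustration of the kernel idea (not the published formulation; the kernel shapes and parameter names here are invented for the sketch), the anisotropy parameters of one cluster can be estimated by ordinary least squares on LST observations at different viewing and illumination geometries:

```python
import numpy as np

# Illustrative sketch: model LST(v, s) ~ LST0 * (1 + A*k_emis(v) + D*k_solar(v, s)).
# The kernels below are assumptions standing in for the paper's emission/solar kernels.
def k_emis(view_deg):
    v = np.radians(view_deg)
    return 1.0 - np.cos(v)                  # grows with off-nadir viewing

def k_solar(view_deg, sun_deg):
    v, s = np.radians(view_deg), np.radians(sun_deg)
    return np.sin(v) * np.cos(s)            # simple view/illumination coupling

def fit_kernel_params(lst, lst0, view_deg, sun_deg):
    """Least-squares fit of the anisotropy coefficients (A, D) for one cluster."""
    y = lst / lst0 - 1.0
    X = np.column_stack([k_emis(view_deg), k_solar(view_deg, sun_deg)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic check: recover known parameters A = 0.05, D = 0.02.
view = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
sun = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 45.0])
lst0 = 300.0
lst = lst0 * (1.0 + 0.05 * k_emis(view) + 0.02 * k_solar(view, sun))
A_fit, D_fit = fit_kernel_params(lst, lst0, view, sun)
print(round(A_fit, 4), round(D_fit, 4))  # 0.05 0.02
```

Fitting per cluster rather than per pixel is what keeps the number of parameters small enough to add a seasonal component.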
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Vanderbilt, V. C.; Robinson, B. F.; Biehl, L. L.; Vanderbilt, A. S.
1981-01-01
The reflectance response of wheat with view angle was analyzed. The analysis, which assumes there are no atmospheric effects and otherwise simulates the response of a multispectral scanner, is based upon spectra taken continuously in wavelength from 0.45 to 2.4 micrometers at more than 1200 view/illumination directions using an Exotech model 20C spectroradiometer. Data were acquired six meters above four wheat canopies, each at a different growth stage. The analysis shows that the canopy reflective response is a pronounced function of illumination angle, scanner view angle, and wavelength. The variation is greater at low solar elevations than at high solar elevations.
Visual Image Sensor Organ Replacement
NASA Technical Reports Server (NTRS)
Maluf, David A.
2014-01-01
This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates visual and other sensors (i.e., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment the current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.
NASA Technical Reports Server (NTRS)
Swedberg, J. L.; Maschhogg, R. H.
1982-01-01
Characterization studies were performed on flight spare ERB wide field of view Earth flux sensors. Field of view sensitivity profiles were determined for total energy sensors with and without painted baffles. Similarly, sensors with filter domes were also characterized in terms of field of view. The transient response of sensors with filter domes was determined for both long wave and short wave radiation. Long wave radiation interacts directly with the quartz dome causing undesired responses. While short wave radiation was shown not to interact with the domes, modules as a whole exhibited a secondary response to bursts of short wave radiation indicative of a heating mechanism. How the results of this characterization can or should be applied to the data emanating from these sensors on ERB-6 and 7 is outlined.
Stone, T.C.
2008-01-01
With the increased emphasis on monitoring the Earth's climate from space, more stringent calibration requirements are being placed on the data products from remote sensing satellite instruments. Among these are stability over decade-length time scales and consistency across sensors and platforms. For radiometer instruments in the solar reflectance wavelength range (visible to shortwave infrared), maintaining calibration on orbit is difficult due to the lack of absolute radiometric standards suitable for flight use. The Moon presents a luminous source that can be viewed by all instruments in Earth orbit. Considered as a solar diffuser, the lunar surface is exceedingly stable. The chief difficulty with using the Moon is the strong variations in the Moon's brightness with illumination and viewing geometry. This mandates the use of a photometric model to compare lunar observations, either over time by the same instrument or between instruments. The U.S. Geological Survey in Flagstaff, Arizona, under NASA sponsorship, has developed a model for the lunar spectral irradiance that explicitly accounts for the effects of phase, the lunar librations, and the lunar surface reflectance properties. The model predicts variations in the Moon's brightness with precision ~1% over a continuous phase range from eclipse to the quarter lunar phases. Given a time series of Moon observations taken by an instrument, the geometric prediction capability of the lunar irradiance model enables sensor calibration stability with sub-percent per year precision. Cross-calibration of instruments with similar passbands can be achieved with precision comparable to the model precision. Although the Moon observations used for intercomparison can be widely separated in phase angle and/or time, SeaWiFS and MODIS have acquired lunar views closely spaced in time. These data provide an example to assess inter-calibration biases between these two instruments.
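The sub-percent-per-year stability tracking can be illustrated with a minimal sketch: the ratio of measured lunar irradiance to the model prediction is trended over time, and the slope gives the sensor drift. All numbers below are synthetic:

```python
import numpy as np

# Sketch: estimate sensor response drift from a time series of Moon observations.
# ratio = measured irradiance / model-predicted irradiance; a linear fit of the
# ratio vs. time yields the calibration drift in percent per year.
def calibration_drift(years, measured, modeled):
    ratio = np.asarray(measured) / np.asarray(modeled)
    slope, intercept = np.polyfit(years, ratio, 1)
    return 100.0 * slope / intercept   # percent per year, relative to epoch

# Synthetic example: a sensor losing 0.5% of its response per year.
t = np.arange(0.0, 10.0, 0.5)
model = np.ones_like(t)
meas = 1.0 - 0.005 * t
print(round(calibration_drift(t, meas, model), 3))  # -0.5
```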
Fast calibration of electromagnetically tracked oblique-viewing rigid endoscopes.
Liu, Xinyang; Rice, Christina E; Shekhar, Raj
2017-10-01
The oblique-viewing (i.e., angled) rigid endoscope is a commonly used tool in conventional endoscopic surgeries. The relative rotation between its two moveable parts, the telescope and the camera head, creates a rotation offset between the actual and the projected position of an object in the camera image. A calibration method tailored to compensate for this offset is needed. We developed a fast calibration method for oblique-viewing rigid endoscopes suitable for clinical use. In contrast to prior approaches based on optical tracking, we used electromagnetic (EM) tracking as the external tracking hardware to improve compactness and practicality. Two EM sensors were mounted on the telescope and the camera head, respectively, with considerations to minimize EM tracking errors. Single-image calibration was incorporated into the method, and a sterilizable plate, laser-marked with the calibration pattern, was also developed. Furthermore, we proposed a general algorithm to estimate the rotation center in the camera image. Formulas for updating the camera matrix in terms of clockwise and counterclockwise rotations were also developed. The proposed calibration method was validated using a conventional [Formula: see text], 5-mm laparoscope. Freehand calibrations were performed using the proposed method, and the calibration time averaged 2 min and 8 s. The calibration accuracy was evaluated in a simulated clinical setting with several surgical tools present in the magnetic field of EM tracking. The root-mean-square re-projection error averaged 4.9 pixels (range 2.4-8.5 pixels, with image resolution of [Formula: see text]) for rotation angles ranging from [Formula: see text] to [Formula: see text]. We developed a method for fast and accurate calibration of oblique-viewing rigid endoscopes. The method was also designed to be performed in the operating room and will therefore support clinical translation of many emerging endoscopic computer-assisted surgical systems.
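The core of the rotation-offset compensation is a 2D rotation of image points about the estimated rotation center. The sketch below shows that generic rotation only, not the paper's actual camera-matrix update formulas:

```python
import math

# Sketch: with the camera head rotated by phi relative to the telescope, an
# image point is rotated about the rotation center c in the image plane.
def compensate(point, center, phi_deg):
    phi = math.radians(phi_deg)
    dx, dy = point[0] - center[0], point[1] - center[1]
    return (center[0] + dx * math.cos(phi) - dy * math.sin(phi),
            center[1] + dx * math.sin(phi) + dy * math.cos(phi))

# Hypothetical example: rotation center at (320, 240), 90° camera-head rotation.
p = compensate((400.0, 300.0), (320.0, 240.0), 90.0)
print(round(p[0], 1), round(p[1], 1))  # 260.0 320.0
```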
Comparison of Sentinel-2A and Landsat-8 Nadir BRDF Adjusted Reflectance (NBAR) over Southern Africa
NASA Astrophysics Data System (ADS)
Li, J.; Roy, D. P.; Zhang, H.
2016-12-01
The Landsat satellites have been providing moderate resolution imagery of the Earth's surface for over 40 years with continuity provided by the Landsat 8 and planned Landsat 9 missions. The European Space Agency Sentinel-2 satellite was successfully launched into a polar sun-synchronous orbit in 2015 and carries the Multi Spectral Instrument (MSI) that has Landsat-like bands and acquisition coverage. These new sensors acquire images at view angles ± 7.5° (Landsat) and ± 10.3° (Sentinel-2) from nadir that result in small directional effects in the surface reflectance. When data from adjoining paths, or from long time series are used, a model of the surface anisotropy is required to adjust observations to a uniform nadir view (primarily for visual consistency, vegetation monitoring, or detection of subtle surface changes). Recently a generalized approach was published that provides consistent Landsat view angle corrections to provide nadir BRDF-adjusted reflectance (NBAR). Because the BRDF shapes of different terrestrial surfaces are sufficiently similar over the narrow 15° Landsat field of view, a fixed global set of MODIS BRDF spectral model parameters was shown to be adequate for Landsat NBAR derivation with little sensitivity to the land cover type, condition, or surface disturbance. This poster demonstrates the application of this methodology to Sentinel-2 data over a west-east transect across southern Africa. The reflectance differences between adjacent overlapping paths in the forward and backward scatter directions are quantified for both before and after BRDF correction. Sentinel-2 and Landsat-8 reflectance and NBAR inter-comparison results considering different stages of cloud and saturation filtering, and filtering to reduce surface state differences caused by acquisition time differences, demonstrate the utility of the approach. The relevance and limitations of the corrections for providing consistent moderate resolution reflectance are discussed.
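The NBAR derivation follows the c-factor idea: each observed reflectance is scaled by the ratio of BRDF-modeled nadir reflectance to BRDF-modeled reflectance at the observed geometry. A minimal sketch, with placeholder kernel values standing in for the fixed global MODIS BRDF parameters and precomputed RossThick/LiSparse kernel terms:

```python
# Sketch of the c-factor NBAR adjustment. f_iso, f_vol, f_geo are BRDF spectral
# model parameters; k_vol_*, k_geo_* are kernel values evaluated at the observed
# and nadir geometries. All numbers here are illustrative placeholders.
def nbar(observed_refl, f_iso, f_vol, f_geo,
         k_vol_obs, k_geo_obs, k_vol_nadir, k_geo_nadir):
    modeled_obs = f_iso + f_vol * k_vol_obs + f_geo * k_geo_obs
    modeled_nadir = f_iso + f_vol * k_vol_nadir + f_geo * k_geo_nadir
    c = modeled_nadir / modeled_obs
    return c * observed_refl

# With identical kernel values (already at nadir) the correction is the identity.
print(nbar(0.3, 0.2, 0.05, 0.02, 0.1, -0.5, 0.1, -0.5))  # 0.3
```

Because the Landsat and Sentinel-2 view angles stay within ~15° of nadir, the c-factor stays close to 1, which is why one fixed global parameter set suffices.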
NASA Astrophysics Data System (ADS)
Wolf, Kevin; Ehrlich, André; Hüneke, Tilman; Pfeilsticker, Klaus; Werner, Frank; Wirth, Martin; Wendisch, Manfred
2017-03-01
Spectral radiance measurements collected in nadir and sideward viewing directions by two airborne passive solar remote sensing instruments, the Spectral Modular Airborne Radiation measurement sysTem (SMART) and the Differential Optical Absorption Spectrometer (mini-DOAS), are used to compare the remote sensing results of cirrus optical thickness τ. The comparison is based on a sensitivity study using radiative transfer simulations (RTS) and on data obtained during three airborne field campaigns: the North Atlantic Rainfall VALidation (NARVAL) mission, the Mid-Latitude Cirrus Experiment (ML-CIRRUS) and the Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems (ACRIDICON) campaign. Radiative transfer simulations are used to quantify the sensitivity of measured upward radiance I with respect to τ, ice crystal effective radius reff, viewing angle of the sensor θV, spectral surface albedo α, and ice crystal shape. From the calculations it is concluded that sideward viewing measurements are generally better suited than radiance data from the nadir direction to retrieve τ of optically thin cirrus, especially at wavelengths larger than λ = 900 nm. Using sideward instead of nadir-directed spectral radiance measurements significantly improves the sensitivity and accuracy in retrieving τ, in particular for optically thin cirrus of τ ≤ 2. The comparison of retrievals of τ based on nadir and sideward viewing radiance measurements from SMART, mini-DOAS and independent estimates of τ from an additional active remote sensing instrument, the Water Vapor Lidar Experiment in Space (WALES), shows general agreement within the range of measurement uncertainties. For the selected example a mean τ of 0.54 ± 0.2 is derived from SMART and 0.49 ± 0.2 from the mini-DOAS nadir channels, while WALES obtained a mean value of τ = 0.32 ± 0.02 at 532 nm wavelength. The mean of τ derived from the sideward viewing mini-DOAS channels is 0.26 ± 0.2. 
For the few simultaneous measurements, the mini-DOAS sideward channel measurements systematically underestimate (-17.6 %) the nadir observations from SMART and mini-DOAS. The agreement between mini-DOAS sideward viewing channels and WALES is better, showing the advantage of using sideward viewing measurements for cloud remote sensing for τ ≤ 1. Therefore, we suggest sideward viewing measurements for retrievals of τ of thin cirrus because of the significantly enhanced capability of sideward viewing compared to nadir measurements.
Normalization of multidirectional red and NIR reflectances with the SAVI
NASA Technical Reports Server (NTRS)
Huete, A. R.; Hua, G.; Qi, J.; Chehbouni, A.; Van Leeuwen, W. J. D.
1992-01-01
Directional reflectance measurements were made over a semi-desert gramma grassland at various times of the growing season. View angle measurements from +40 to -40 degrees were made at various solar zenith angles and soil moisture conditions. The sensitivity of the Normalized Difference Vegetation Index (NDVI) and the Soil Adjusted Vegetation Index (SAVI) to bidirectional measurements was assessed for purposes of improving remote temporal monitoring of vegetation dynamics. The SAVI view angle response was found to be symmetric about nadir while the NDVI response was strongly anisotropic. This enabled the view angle behavior of the SAVI to be normalized with a cosine function. In contrast to the NDVI, the SAVI was able to minimize soil moisture and shadow influences for all measurement conditions.
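For reference, the two indices compared in the study are defined below; the soil adjustment factor L = 0.5 is the customary value, which the abstract does not state explicitly:

```python
# NDVI and the Soil Adjusted Vegetation Index (SAVI) from red and NIR
# reflectances; L = 0.5 is the commonly used soil adjustment factor (assumed).
def ndvi(nir, red):
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    return (1.0 + L) * (nir - red) / (nir + red + L)

# Illustrative reflectances for a partial canopy over bright soil:
print(round(ndvi(0.5, 0.1), 3), round(savi(0.5, 0.1), 3))  # 0.667 0.545
```

It is the soil adjustment term in the denominator that damps the soil and shadow contributions responsible for the NDVI's anisotropic view-angle response.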
Expansion of the visual angle of a car rear-view image via an image mosaic algorithm
NASA Astrophysics Data System (ADS)
Wu, Zhuangwen; Zhu, Liangrong; Sun, Xincheng
2015-05-01
The rear-view image system is one of the active safety devices in cars and is widely applied in all types of vehicles and traffic safety areas. However, previous studies were based on a single image capture device used while reversing, so a blind area still remained for drivers. Even when multiple cameras were used to expand the visual angle of the car's rear-view image, the blind area remained because the different source images were not mosaicked together. To acquire an expanded visual angle of a car rear-view image, two charge-coupled device cameras with optical axes angled at 30 deg were mounted below the left and right fenders of a car in three light conditions (sunny outdoors, cloudy outdoors, and an underground garage) to capture rear-view heterologous images of the car. Then these rear-view heterologous images were rapidly registered through the scale invariant feature transform algorithm. Combined with the random sample consensus algorithm, the two heterologous images were finally mosaicked using the linear weighted gradated in-and-out fusion algorithm, and a seamless, visual-angle-expanded rear-view image was acquired. The four-index test results showed that the algorithms can mosaic rear-view images well in the underground garage condition, where the average rate of correct matching was the lowest among the three conditions. The rear-view image mosaic algorithm presented had the best information preservation, the shortest computation time, and the most complete preservation of image detail features compared to the mean value method (MVM) and segmental fusion method (SFM); it also performed better in real time and provided more comprehensive image details than MVM and SFM. In addition, it had the most complete image preservation from the source images among the three algorithms. 
The method introduced by this paper provided the basis for researching the expansion of the visual angle of a car rear-view image in all-weather conditions.
Bidirectional Reflectance Functions for Application to Earth Radiation Budget Studies
NASA Technical Reports Server (NTRS)
Manalo-Smith, N.; Tiwari, S. N.; Smith, G. L.
1997-01-01
Reflected solar radiative fluxes emerging from the top of the Earth's atmosphere are inferred from satellite broadband radiance measurements by applying bidirectional reflectance functions (BDRFs) to account for the anisotropy of the radiation field. BDRFs depend upon the viewing geometry (i.e., solar zenith angle, view zenith angle, and relative azimuth angle), the amount and type of cloud cover, the condition of the intervening atmosphere, and the reflectance characteristics of the underlying surface. A set of operational Earth Radiation Budget Experiment (ERBE) BDRFs is available which was developed from the Nimbus 7 ERB (Earth Radiation Budget) scanner data for a three-angle grid system. An improved set of bidirectional reflectance functions is required for mission planning and data analysis of future Earth radiation budget instruments, such as the Clouds and the Earth's Radiant Energy System (CERES), and for the enhancement of existing radiation budget data products. This study presents an analytic expression for BDRFs formulated by applying a fit to the ERBE operational model tabulations. A set of model coefficients applicable to any viewing condition is computed for overcast and clear-sky scenes over four geographical surface types (ocean, land, snow, and desert) and for partly cloudy scenes over ocean and land. The models are smooth in terms of the directional angles and adhere to the principle of reciprocity, i.e., they are invariant with respect to the interchange of the incoming and outgoing directional angles. The analytic BDRFs and the radiance standard deviations are compared with the operational ERBE models and validated with ERBE data. The clear ocean model is validated with Dlhopolsky's clear ocean model, a BDRF of higher angular resolution for clear-sky ocean developed from ERBE radiances. Additionally, the effectiveness of the models in accounting for anisotropy for various viewing directions is tested with the ERBE along-track data. 
An area viewed from nadir and from the side give two different radiance measurements but should yield the same flux when converted by the BDRF. The analytic BDRFs are in very good qualitative agreement with the ERBE models. The overcast scenes exhibit constant retrieved albedo over viewing zenith angles for solar zenith angles less than 60 degrees. The clear ocean model does not produce constant retrieved albedo over viewing zenith angles but gives an improvement over the ERBE operational clear sky ocean BDRF.
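The consistency requirement stated above, that nadir and side views of the same area yield the same flux, can be sketched with the standard radiance-to-flux conversion, where R is the BDRF anisotropic factor for the viewing geometry (numbers illustrative):

```python
import math

# Sketch: convert a broadband radiance to top-of-atmosphere flux using the
# anisotropic factor R for the viewing geometry. The same scene viewed from two
# directions should give the same flux after dividing out the anisotropy.
def radiance_to_flux(radiance, anisotropic_factor):
    return math.pi * radiance / anisotropic_factor

# Hypothetical nadir (R = 1.05) and oblique (R = 0.80) views of one scene:
flux_nadir = radiance_to_flux(84.0, 1.05)
flux_side = radiance_to_flux(64.0, 0.80)
print(round(flux_nadir, 1), round(flux_side, 1))  # 251.3 251.3
```

A BDRF that fails this test produces the kind of view-zenith-dependent retrieved albedo reported for the ERBE operational clear-sky ocean model.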
THE VIEWING ANGLES OF BROAD ABSORPTION LINE VERSUS UNABSORBED QUASARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiPompeo, M. A.; Brotherton, M. S.; De Breuck, C.
2012-06-10
It was recently shown that there is a significant difference in the radio spectral index distributions of broad absorption line (BAL) quasars and unabsorbed quasars, with an overabundance of BAL quasars with steeper radio spectra. This result suggests that source orientation does play into the presence or absence of BAL features. In this paper, we provide more quantitative analysis of this result based on Monte Carlo simulations. While the relationship between viewing angle and spectral index does indeed contain a lot of scatter, the spectral index distributions are different enough to overcome that intrinsic variation. Utilizing two different models of the relationship between spectral index and viewing angle, the simulations indicate that the difference in spectral index distributions can be explained by allowing BAL quasar viewing angles to extend about 10° farther from the radio jet axis than non-BAL sources, though both can be seen at small angles. These results show that orientation cannot be the only factor determining whether BAL features are present, but it does play a role.
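The Monte Carlo argument can be sketched as follows: draw viewing angles uniformly in solid angle up to a maximum angle, map each angle to a spectral index with intrinsic scatter, and compare the two populations. The angle-to-index mapping and all numbers below are illustrative assumptions, not the paper's fitted models:

```python
import math, random

# Sketch of the viewing-angle Monte Carlo. Angles are drawn uniformly in solid
# angle (uniform in cos θ); the index mapping (flatter near the jet axis,
# steeper away from it) and the scatter are invented for illustration.
def draw_angles(n, theta_max_deg, rng):
    cos_max = math.cos(math.radians(theta_max_deg))
    return [math.degrees(math.acos(rng.uniform(cos_max, 1.0))) for _ in range(n)]

def spectral_index(theta_deg, rng):
    return -0.2 - 0.015 * theta_deg + rng.gauss(0.0, 0.2)

rng = random.Random(42)
# Non-BAL sources out to 35°; BAL sources allowed ~10° farther from the axis.
non_bal = [spectral_index(t, rng) for t in draw_angles(20000, 35.0, rng)]
bal = [spectral_index(t, rng) for t in draw_angles(20000, 45.0, rng)]
mean = lambda xs: sum(xs) / len(xs)
print(mean(bal) < mean(non_bal))  # BAL sample is steeper on average: True
```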
Barrier Coverage for 3D Camera Sensor Networks
Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao
2017-01-01
Barrier coverage, an important research area with respect to camera sensor networks, consists of a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage such as local face-view barrier coverage and full-view barrier coverage typically assume that each intruder is considered as a point. However, the crucial feature (e.g., size) of the intruder should be taken into account in the real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder’s face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage with more practical considerations is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks. PMID:28771167
An Algorithm for Converting Static Earth Sensor Measurements into Earth Observation Vectors
NASA Technical Reports Server (NTRS)
Harman, R.; Hashmall, Joseph A.; Sedlak, Joseph
2004-01-01
An algorithm has been developed that converts penetration angles reported by Static Earth Sensors (SESs) into Earth observation vectors. This algorithm allows compensation for variation in the horizon height, including that caused by Earth oblateness. It also allows pitch and roll to be computed using any number (greater than one) of simultaneous sensor penetration angles, simplifying processing during periods of Sun and Moon interference. The algorithm computes body-frame unit vectors through each SES cluster. It also computes GCI vectors from the spacecraft to the position on the Earth's limb where each cluster detects the limb. The body-frame vectors are used as sensor observation vectors and the GCI vectors as reference vectors in an attitude solution. The attitude, with the unobservable yaw discarded, is iteratively refined to provide the Earth observation vector solution.
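A minimal spherical-Earth sketch of the underlying geometry (ignoring the oblateness and horizon-height variation the algorithm actually compensates for; function names and the two-cluster pitch relation are illustrative assumptions):

```python
import math

# Half-angle of the Earth disc seen from orbit (spherical Earth assumption).
def horizon_half_angle(altitude_km, r_earth_km=6378.137):
    r = r_earth_km + altitude_km
    return math.degrees(math.asin(r_earth_km / r))

# Illustrative relation: with two opposing sensor clusters, equal penetration
# angles imply zero pitch, and the offset is half the penetration difference.
def pitch_from_penetrations(pen_fwd_deg, pen_aft_deg):
    return 0.5 * (pen_fwd_deg - pen_aft_deg)

print(round(horizon_half_angle(500.0), 2))       # Earth half-angle from 500 km
print(pitch_from_penetrations(31.0, 29.0))       # 1.0
```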
Fiber Optic Sensor System Using Birefringent Filters For Spectral Encoding
NASA Astrophysics Data System (ADS)
Dorsch, Friedhelm; Ulrich, Reinhard
1989-02-01
A system of multimode fiber optic sensors is described for the remote measurement of position, angle, force, pressure and other measurands that can be converted into a rotation of polarization. A birefringent filter encodes the polarization angle into the power ratio of two interleaved comb spectra or, in a modified implementation, into the absolute spectral position of a comb spectrum. By using identical filters in all transducers and in the evaluation unit, transducers for the same or different measurands become interchangeable. All sensors are of the incremental type, with accuracies reaching 0.5% of one period of the measurand, independent of variations in the attenuation of the fiber link of up to 20 dB.
NASA Astrophysics Data System (ADS)
Lu, Yanfang; Shen, Changyu; Chen, Debao; Chu, Jinlei; Wang, Qiang; Dong, Xinyong
2014-10-01
The transmission intensity of the tilted fiber Bragg grating (TFBG) is strongly dependent on the polarization properties of the TFBG. The polarization characteristic of the cladding modes can be used for twist measurement. In this paper, a highly sensitive fiber twist sensor is proposed. The transmission intensity at the strong-loss wavelength showed a quasi-sinusoidal variation with the twist angle over the range 0° to 180° for S- or P-polarized input. A high sensitivity of 0.299 dB/° is achieved, which is almost 17.9 times higher than that of a comparable existing twist sensor. The twist angle can be measured precisely with the matrix.
Proceedings of the 1997 Battlespace Atmospherics Conference 2-4 December 1997
1998-03-01
sensor capabilities are highlighted in our SSM/I section, where coincident passive microwave and visible/infrared products are created automatically ... field of view 60°. Figure 1. Effect of changing sensor field of view on received signal. The signal at 0° field of view is the directly transmitted ... not used here because the sensor is a photon counting device and irradiance does not add up spectrally in the same way as photon flux. In the UVS
Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles
de Ponte Müller, Fabian
2017-01-01
Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range and angle of view, and their susceptibility to blockage, can be addressed by a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, are reviewed. The survey also includes the research work on the fusion of cooperative and non-cooperative approaches to increase reliability and availability. PMID:28146129
Multi-phenology WorldView-2 imagery improves remote sensing of savannah tree species
NASA Astrophysics Data System (ADS)
Madonsela, Sabelo; Cho, Moses Azong; Mathieu, Renaud; Mutanga, Onisimo; Ramoelo, Abel; Kaszta, Żaneta; Kerchove, Ruben Van De; Wolff, Eléonore
2017-06-01
Biodiversity mapping in African savannah is important for monitoring changes and ensuring sustainable use of ecosystem resources. Biodiversity mapping can benefit from multi-spectral instruments such as WorldView-2, with very high spatial resolution and a spectral configuration encompassing important spectral regions not previously available for vegetation mapping. This study investigated i) the benefits of the eight-band WorldView-2 (WV-2) spectral configuration for discriminating tree species in Southern African savannah and ii) whether multiple images acquired at key points of the typical phenological development of savannahs (peak productivity, transition to senescence) improve tree species classification. We first assessed the discriminatory power of WV-2 bands using an interspecies Spectral Angle Mapper (SAM) via a Band Add-On procedure, and tested the spectral capability of WorldView-2 against simulated IKONOS data for tree species classification. The interspecies-SAM procedure identified the yellow and red bands as the most statistically significant contributors (p = 0.000251 and p = 0.000039, respectively) to the discriminatory power of WV-2 during the transition from wet to dry season (April). Using a Random Forest classifier, the classification scenarios investigated showed that i) the 8 bands of the WV-2 sensor achieved higher classification accuracy for the April date (transition from wet to dry season, senescence) than for the March date (peak productivity season), ii) the WV-2 spectral configuration systematically outperformed the IKONOS spectral configuration, and iii) the multi-temporal approach (March and April combined) improved the discrimination of tree species and produced the highest overall accuracy, at 80.4%. Consistent with the interspecies-SAM procedure, the yellow (605 nm) band also made a statistically significant contribution to the improved classification accuracy from WV-2.
These results highlight the mapping opportunities presented by WV-2 data for monitoring the distribution status of, for example, species often harvested by local communities (e.g. Sclerocarya birrea), encroaching species, or species-specific tree losses induced by elephants.
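The spectral angle underlying the interspecies-SAM procedure is simply the angle between two spectra viewed as vectors in band space; a minimal sketch (the two 8-band spectra below are hypothetical illustrations, not data from the study):

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM) distance: the angle (radians) between
    two spectra treated as vectors in band space. Because only direction
    matters, SAM is insensitive to overall brightness differences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical 8-band (WV-2-like) reflectance spectra for two tree species
species_a = [0.04, 0.06, 0.08, 0.10, 0.12, 0.30, 0.42, 0.45]
species_b = [0.05, 0.07, 0.10, 0.13, 0.14, 0.26, 0.35, 0.38]
sam = spectral_angle(species_a, species_b)
```

A Band Add-On style test then asks whether adding a candidate band (e.g. yellow) significantly increases this interspecies angle.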
Lidar on small UAV for 3D mapping
NASA Astrophysics Data System (ADS)
Tulldahl, H. Michael; Larsson, Hâkan
2014-10-01
Small UAVs (Unmanned Aerial Vehicles) are currently in an explosive phase of technical development. The performance of UAV system components such as inertial navigation sensors, propulsion, control processors and algorithms is gradually improving. Simultaneously, lidar technologies are continuously developing in terms of reliability, accuracy, and speed of data collection, storage and processing. The lidar development towards miniature systems with high data rates, together with recent UAV development, has great potential for new three-dimensional (3D) mapping capabilities. Compared to lidar mapping from manned full-size aircraft, a small unmanned aircraft can be cost-efficient over small areas and more flexible to deploy. An advantage of high-resolution lidar over 3D mapping from passive (multi-angle) photogrammetry is the ability to penetrate vegetation and detect partially obscured targets. Another advantage is the ability to obtain 3D data over the whole survey area, without the limited performance of passive photogrammetry in low-contrast areas. The purpose of our work is to demonstrate 3D lidar mapping capability from a small multirotor UAV. We present the first experimental results and the mechanical and electrical integration of the Velodyne HDL-32E lidar on a six-rotor aircraft with a total weight of 7 kg. The rotating lidar is mounted at an angle of 20 degrees from the horizontal plane, giving a vertical field of view of 10-50 degrees below the horizon in the aircraft's forward direction. For absolute positioning of the 3D data, accurate positioning and orientation of the lidar sensor are of high importance. We evaluate the lidar data position accuracy both on inertial navigation system (INS) data alone and on INS data combined with lidar data. The INS sensors consist of accelerometers, gyroscopes, GPS, magnetometers, and a pressure sensor for altimetry.
The lidar range resolution and accuracy are documented, as is the capability for estimating target surface reflectivity from measurements on calibration standards. Initial results on the general mapping capability, including detection through partially obscured environments, are demonstrated through field data collection and analysis.
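The mounting geometry quoted above is easy to check: pitching the scanner down by the mount angle shifts its intrinsic vertical field of view. A sketch, assuming the HDL-32E's nominal vertical span of roughly +10 to -30 degrees about its own axis (an assumed nominal value, not stated in the abstract):

```python
def mounted_fov_below_horizon(mount_down_deg, fov_up_deg, fov_down_deg):
    """Return (min, max) angles below the horizon covered after the
    scanner is pitched down by mount_down_deg. fov_up_deg/fov_down_deg
    are the scanner's intrinsic vertical limits above/below its axis."""
    return (mount_down_deg - fov_up_deg, mount_down_deg + fov_down_deg)

# 20-deg mount with an assumed +10/-30 deg intrinsic span -> 10-50 deg
span = mounted_fov_below_horizon(20.0, 10.0, 30.0)
```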
GEONEX: Land Monitoring From a New Generation of Geostationary Satellite Sensors
NASA Technical Reports Server (NTRS)
Nemani, Ramakrishna; Lyapustin, Alexei; Wang, Weile; Wang, Yujie; Hashimoto, Hirofumi; Li, Shuang; Ganguly, Sangram; Michaelis, Andrew; Higuchi, Atsushi; Takaneka, Hideaki;
2017-01-01
The latest generation of geostationary satellites carries sensors such as ABI (the Advanced Baseline Imager on GOES-16) and AHI (the Advanced Himawari Imager on Himawari) that closely mimic the spatial and spectral characteristics of MODIS, the Earth Observing System flagship, for monitoring land surface conditions. More importantly, they provide observations at 5-15 minute intervals. Such high-frequency data offer exciting possibilities for producing robust estimates of land surface conditions by overcoming cloud cover, enabling studies of diurnally varying local-to-regional biosphere-atmosphere interactions, and supporting operational decision-making in agriculture, forestry and disaster management. But the data come with challenges that need special attention. For instance, geostationary data feature a changing sun angle at constant view angle for each pixel, which is reciprocal to sun-synchronous observations and thus requires careful adaptation of EOS algorithms. Our goal is to produce a set of land surface products from geostationary sensors by leveraging NASA's investments in EOS algorithms and in the NEX data/compute facility. The land surface variables of interest include atmospherically corrected surface reflectances, snow cover, vegetation indices, leaf area index (LAI)/fraction of photosynthetically absorbed radiation (FPAR), land surface temperature and fires. To prepare for producing operational products over the US from GOES-16 starting in 2018, we used 18 months of Himawari AHI data over Australia to test the production pipeline and the performance of various algorithms.
The end-to-end processing pipeline consists of a suite of modules that (a) perform calibration and automatic georeference correction of the AHI L1b data, (b) adapt the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm to produce surface spectral reflectances along with compositing schemes and QA, and (c) modify relevant EOS retrieval algorithms (e.g., LAI/FPAR, GPP) for subsequent science product generation. Initial evaluation of Himawari AHI products against standard MODIS products indicates general agreement, suggesting that data from geostationary sensors can augment low Earth orbit (LEO) satellite observations.
GEONEX: Land monitoring from a new generation of geostationary satellite sensors
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Lyapustin, A.; Wang, W.; Ganguly, S.; Wang, Y.; Michaelis, A.; Hashimoto, H.; Li, S.; Higuchi, A.; Huete, A. R.; Yeom, J. M.; camacho De Coca, F.; Lee, T. J.; Takenaka, H.
2017-12-01
The latest generation of geostationary satellites carries sensors such as ABI (the Advanced Baseline Imager on GOES-16) and AHI (the Advanced Himawari Imager on Himawari) that closely mimic the spatial and spectral characteristics of MODIS, the Earth Observing System flagship, for monitoring land surface conditions. More importantly, they provide observations at 5-15 minute intervals. Such high-frequency data offer exciting possibilities for producing robust estimates of land surface conditions by overcoming cloud cover, enabling studies of diurnally varying local-to-regional biosphere-atmosphere interactions, and supporting operational decision-making in agriculture, forestry and disaster management. But the data come with challenges that need special attention. For instance, geostationary data feature a changing sun angle at constant view angle for each pixel, which is reciprocal to sun-synchronous observations and thus requires careful adaptation of EOS algorithms. Our goal is to produce a set of land surface products from geostationary sensors by leveraging NASA's investments in EOS algorithms and in the NEX data/compute facility. The land surface variables of interest include atmospherically corrected surface reflectances, snow cover, vegetation indices, leaf area index (LAI)/fraction of photosynthetically absorbed radiation (FPAR), land surface temperature and fires. To prepare for producing operational products over the US from GOES-16 starting in 2018, we used 18 months of Himawari AHI data over Australia to test the production pipeline and the performance of various algorithms.
The end-to-end processing pipeline consists of a suite of modules that (a) perform calibration and automatic georeference correction of the AHI L1b data, (b) adapt the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm to produce surface spectral reflectances along with compositing schemes and QA, and (c) modify relevant EOS retrieval algorithms (e.g., LAI/FPAR, GPP) for subsequent science product generation. Initial evaluation of Himawari AHI products against standard MODIS products indicates general agreement, suggesting that data from geostationary sensors can augment low Earth orbit (LEO) satellite observations.
4. Elevation view of Bunker 104 with ultrawide angle lens ...
4. Elevation view of Bunker 104 with ultrawide angle lens shows about 70 percent of east facade including entire south end with steps and doors. View shows slope of south end and vegetation growing atop building. See also photo WA-203-C-3. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA
NASA Astrophysics Data System (ADS)
Baumgart, Marcus; Tortschanoff, Andreas
2013-05-01
A tilt mirror's deflection-angle tracking setup is examined from a theoretical point of view. The proposed setup is based on a simple optical approach and is easily scalable; the principle is therefore especially interesting for small, fast-oscillating MEMS/MOEMS tilt mirrors. An experimentally established optical scheme is used as the starting point for accurate and fast mirror angle-position detection. This approach uses an additional layer positioned under the MOEMS mirror's backside, consisting of a light source in the center and two photodetectors positioned symmetrically around it. The mirror's back surface is illuminated by the light source, and the intensity change due to mirror tilting is tracked via the photodiodes. The challenge of this method is to obtain a linear relation between the measured intensity and the current mirror tilt angle even for larger angles: state-of-the-art MOEMS mirrors achieve angles up to ±30°, which exceeds the range of linear angle approximations. An LED, small laser diode or VCSEL is an appropriate light source because of its small size and low cost. Such light sources typically emit light with a Gaussian intensity distribution, which makes an analytical prediction of the expected detector signal quite complicated. In this publication an analytical simulation model is developed to evaluate the influence of the main parameters of this optical mirror tilt-sensor design. A value that is easy and fast to calculate and directly linked to the mirror's tilt angle is the "relative differential intensity" (RDI = (I1 - I2) / (I1 + I2)). Evaluation of its slope and nonlinearity error highlights dependencies between the identified parameters for best SNR and linearity. The fraction of the emitted energy that falls on the detector area is also taken into account. Design optimization rules are proposed and discussed based on these theoretical considerations.
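The RDI figure of merit defined above can be explored with a toy model: a Gaussian spot whose position over the two symmetric detectors shifts with mirror tilt (the geometry and parameter values below are illustrative assumptions, not the paper's model):

```python
import numpy as np

def spot_intensity(offset, width=2.0):
    # Gaussian beam intensity sampled at a small detector at `offset`
    return np.exp(-(offset / width) ** 2)

def rdi(tilt_deg, det_pos=1.0, lever=0.05):
    """Relative differential intensity RDI = (I1 - I2) / (I1 + I2) for a
    toy geometry in which the reflected spot shifts laterally by
    lever * tan(2 * tilt), since the beam deflects by twice the tilt."""
    shift = lever * np.tan(np.radians(2.0 * tilt_deg))
    i1 = spot_intensity(det_pos - shift)
    i2 = spot_intensity(-det_pos - shift)
    return (i1 - i2) / (i1 + i2)
```

By symmetry the RDI vanishes at zero tilt and is odd in the tilt angle; the paper's analysis concerns how far this relation stays linear as the tilt grows.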
NASA Technical Reports Server (NTRS)
Angal, Amit; Xiong, Xiaoxiong; Wu, Aisheng; Chen, Hongda; Geng, Xu; Link, Daniel; Li, Yonghong; Wald, Andrew; Brinkmann, Jake
2016-01-01
The Moderate Resolution Imaging Spectroradiometer (MODIS) is the keystone instrument for NASA's EOS Terra and Aqua missions, designed to extend and improve heritage sensor measurements and data records of the land, oceans and atmosphere. The reflective solar bands (RSB) of MODIS, covering wavelengths from 0.41 to 2.2 micrometers, are calibrated on-orbit using a solar diffuser (SD), with its on-orbit bi-directional reflectance factor (BRF) changes tracked using a solar diffuser stability monitor (SDSM). MODIS is a scanning radiometer that uses a two-sided paddle-wheel mirror to collect earth-view (EV) data over a range of ±55 deg off instrument nadir. In addition to the solar calibration provided by the SD and SDSM system, lunar observations at nearly constant phase angles are regularly scheduled to monitor the RSB calibration stability. For both Terra and Aqua MODIS, the SD and lunar observations are used together to track the on-orbit changes of RSB response versus scan angle (RVS), as the SD and space-view (SV) ports are viewed at different angles of incidence (AOI) on the scan mirror. The MODIS Level 1B (L1B) Collection 6 (C6) algorithm incorporated several enhancements over its predecessor, the Collection 5 (C5) algorithm. A notable improvement was the use of EV response trends from pseudo-invariant desert targets to characterize the on-orbit RVS for select RSB (Terra bands 1-4, 8, 9 and Aqua bands 8, 9) and the time-, AOI-, and wavelength-dependent uncertainty. The MODIS Characterization Support Team (MCST) has been maintaining and enhancing the C6 algorithm since its first update in November 2011 for Aqua MODIS and February 2012 for Terra MODIS. Several calibration improvements have been incorporated, including extending the EV-based RVS approach to other RSB, additional correction for SD degradation at SWIR wavelengths, and alternative approaches for on-orbit RVS characterization.
In addition to the on-orbit performance of the MODIS RSB, this paper also discusses in detail the recent calibration improvements implemented in the MODIS L1B C6.
What is MISR? MISR Instrument? MISR Project?
Atmospheric Science Data Center
2014-12-08
... to improve our understanding of the Earth's environment and climate. Viewing the sunlit Earth simultaneously at nine widely-spaced angles, ... types of atmospheric particles and clouds on climate. The change in reflection at different view angles affords the means to distinguish ...
Friedrich, D T; Sommer, F; Scheithauer, M O; Greve, J; Hoffmann, T K; Schuler, P J
2017-12-01
Objective Advanced transnasal sinus and skull base surgery remains a challenging discipline for head and neck surgeons. Restricted access and limited space for instrumentation can impede advanced interventions. We therefore present the combination of an innovative robotic endoscope guidance system and a specific endoscope with an adjustable viewing angle to facilitate transnasal surgery in a human cadaver model. Materials and Methods The applicability of the robotic endoscope guidance system with a custom foot-pedal controller was tested for advanced transnasal surgery on a fresh-frozen human cadaver head. Visualization was enabled by a commercially available endoscope with an adjustable viewing angle (15-90 degrees). Results Visualization and instrumentation of all paranasal sinuses, including the anterior and middle skull base, were feasible with the presented setup. Control of the robotic endoscope guidance system was precise and effective, and the adjustable endoscope lens extended the view in the surgical field without the endoscope changes commonly needed with fixed-viewing-angle endoscopes. Conclusion The combination of a robotic endoscope guidance system and an advanced endoscope with an adjustable viewing angle enables bimanual surgery in transnasal interventions of the paranasal sinuses and the anterior skull base in a human cadaver model. The adjustable lens allows fixed-angle endoscopes to be abandoned, saving time and resources without reducing imaging quality.
Multi-viewer tracking integral imaging system and its viewing zone analysis.
Park, Gilbae; Jung, Jae-Hyun; Hong, Keehoon; Kim, Yunhee; Kim, Young-Hoon; Min, Sung-Wook; Lee, Byoungho
2009-09-28
We propose a multi-viewer tracking integral imaging system with improved viewing angle and viewing zone. In the tracking integral imaging system, the pickup angles of each elemental lens in the lens array are determined by the positions of the viewers, which means the elemental images can be generated for each viewer to provide a wider viewing angle and a larger viewing zone. Our tracking integral imaging system is implemented with an infrared camera and infrared light-emitting diodes, which track the viewers' exact positions robustly. For multiple viewers to watch integrated three-dimensional images in the tracking integral imaging system, the relationship between the multiple viewers' positions and the elemental images must be formulated. We analyzed this relationship and the conditions for multiple viewers, and verified them by implementing a two-viewer tracking integral imaging system.
Proposals for the implementation of the variants of automatic control of the telescope AZT-2
NASA Astrophysics Data System (ADS)
Shavlovskyi, V. I.; Puha, S. P.; Vidmachenko, A. P.; Volovyk, D. V.; Puha, G. P.; Obolonskyi, V. O.; Kratko, O. O.; Stefurak, M. V.
2018-05-01
Based on experience from astronomical observations, structural features, and a review of the technical state of the mechanics of the AZT-2 telescope at the Main Astronomical Observatory of the NAS of Ukraine, it was decided in 2012 to modernize the telescope. To this end, it was proposed that the telescope control system consist of angle sensors on the hour axis "alpha" and the declination axis "delta", a personal computer (PC), corresponding software, a power control unit, and the telescope's rotation system. The angle sensors should be absolute, with a resolution better than 10 arcminutes. The PC processes the data from the angle sensors and controls the power unit. The developed software allows the operator to point the telescope automatically and to set the necessary system parameters. Through the PC, the power control unit directly drives the motor of the rotation system.
77 FR 11789 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-28
...-icing system for the angle of attack sensor, the total air temperature, and the pitot probes. We are proposing this AD to prevent ice from forming on air data system sensors and consequent loss of or... receive about this proposed AD. Discussion The air data sensor heating system, when ON, heats the pitot...
Internal Acoustics Measurements of a Full Scale Advanced Ducted Propulsor Demonstrator
NASA Technical Reports Server (NTRS)
Santa Maria, O. L.; Soderman, P. T.; Horne, W. C.; Jones, M. G.; Bock, L. A.
1995-01-01
Acoustic measurements of a Pratt & Whitney full-scale ADP (Advanced Ducted Propulsor), an ultrahigh-bypass-ratio engine, were conducted in the NASA Ames 40- by 80-Foot Wind Tunnel. This paper presents data from sensors on a fan exit guide vane in the ADP. Data from two sensors, one at mid-span and the other at the tip of the fan exit guide vane, are presented. At the blade passage frequency (BPF), the levels observed at the various engine and wind speeds were higher at the mid-span sensor than at the tip sensor. The coherence between these internal sensors and external microphones was calculated and plotted as a function of angle (from 5 degrees to 160 degrees) relative to the ADP longitudinal axis. At the highest engine and wind speeds, the coherence between the tip sensor and the external microphones was observed to decrease at higher multiples of the BPF. These results suggest that the rotor-stator interaction tones are stronger in the mid-span region than at the tip.
Tracking a convoy of multiple targets using acoustic sensor data
NASA Astrophysics Data System (ADS)
Damarla, T. R.
2003-08-01
In this paper we present an algorithm to track a convoy of several targets in a scene using acoustic sensor array data. The tracking algorithm is based on a template of the direction-of-arrival (DOA) angles of the leading target. Often the first target is the closest to the sensor array and hence the loudest, with a good signal-to-noise ratio. Several rules are used to generate the DOA-angle template for the leading target, namely: (a) the angle at the present instant should be close to the angle at the previous instant, and (b) the angle at the present instant should be within error bounds of the value predicted from previous values. Once the template of the leading target's DOA angles is developed, it is used to predict the DOA-angle tracks of the remaining targets. To generate these tracks, a track is established if its initial angles correspond to the initial values of the first target's track. The time delays between the first track and the remaining tracks are then estimated at the points of highest correlation between them. As the vehicles move at different speeds, the tracks either compress or expand depending on whether a target moves faster or slower than the first target. The expansion and compression ratios are estimated and used to predict the DOA angles of the remaining targets. Based on these predictions, the DOA angles obtained from MVDR or incoherent MUSIC are assigned to the proper tracks. Several other rules were developed to avoid mixing the tracks. The algorithm was tested on data collected at Aberdeen Proving Ground with convoys of 3, 4 and 5 vehicles, some tracked and some wheeled. The tracking results were found to be good. The results will be presented at the conference and in the paper.
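Rules (a) and (b) for building the leading target's template amount to gated prediction; a minimal sketch (the gate size, the constant-rate predictor, and the synthetic detections are illustrative assumptions, not the paper's exact rules):

```python
def build_lead_template(doa_frames, gate_deg=5.0):
    """Track the leading (loudest) target's DOA: in each frame accept the
    detection closest to a linear prediction from the last two accepted
    angles, provided it lies within the gate; otherwise coast on the
    prediction. doa_frames[i] lists candidate DOA angles (deg), with the
    leading target assumed first in frame 0."""
    track = [doa_frames[0][0]]
    for detections in doa_frames[1:]:
        pred = 2 * track[-1] - track[-2] if len(track) >= 2 else track[-1]
        best = min(detections, key=lambda a: abs(a - pred))
        track.append(best if abs(best - pred) <= gate_deg else pred)
    return track

# Leading target sweeping 1 deg/frame, a trailing target, and fixed clutter
frames = [[10, 2, 40], [11, 3, 40], [12, 4, 40], [13, 5, 40], [14, 6, 40]]
lead_track = build_lead_template(frames)
```

In the full algorithm this template is then time-shifted and stretched by the estimated delay and compression ratio to predict the remaining targets' tracks.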
Comparison of two on-orbit attitude sensor alignment methods
NASA Technical Reports Server (NTRS)
Krack, Kenneth; Lambertson, Michael; Markley, F. Landis
1990-01-01
Compared here are two methods of on-orbit alignment of vector attitude sensors. The first method uses the angular difference between simultaneous measurements from two or more sensors. These angles are compared to the angular differences between the respective reference positions of the sensed objects. The alignments of the sensors are adjusted to minimize the difference between the two sets of angles. In the second method, the sensor alignment is part of a state vector that includes the attitude. The alignments are adjusted along with the attitude to minimize all observation residuals. It is shown that the latter method can result in much less alignment uncertainty when gyroscopes are used for attitude propagation during the alignment estimation. The additional information for this increased accuracy comes from knowledge of relative attitude obtained from the spacecraft gyroscopes. The theoretical calculations of this difference in accuracy are presented. Also presented are numerical estimates of the alignment uncertainties of the fixed-head star trackers on the Extreme Ultraviolet Explorer spacecraft using both methods.
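The first method's residual is the difference between the measured and reference inter-sensor angles; a minimal sketch (the vectors and function names are illustrative):

```python
import numpy as np

def angle_between(u, v):
    """Angle (radians) between two direction vectors."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def alignment_residual(meas_a, meas_b, ref_a, ref_b):
    """Residual minimized by the first method: the angle between two
    sensors' measured vectors is attitude-independent, so any difference
    from the reference-vector angle reflects relative sensor
    misalignment (plus measurement noise)."""
    return angle_between(meas_a, meas_b) - angle_between(ref_a, ref_b)
```

The second method instead estimates the alignments jointly with the attitude state, exploiting gyro-propagated relative attitude, which is why it can achieve lower alignment uncertainty.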
NASA Technical Reports Server (NTRS)
Imhoff, Marc; Lawrence, William; Condit, Richard; Wright, Joseph; Johnson, Patrick; Holford, Warren; Hyer, Joseph; May, Lisa; Carson, Steven
2000-01-01
A synthetic aperture radar sensor operating in 5 bands between 80 and 120 MHz was flown over forested areas in the canal zone of the Republic of Panama in an experiment to measure biomass in heavy tropical forests. The sensor is a pulse-coherent SAR flown on a small aircraft and oriented straight down. The Doppler history is processed to collect data on the ground in rectangular cells of varying size over a range of incidence angles fore and aft of nadir (+45 to -45 degrees). Sensor data consist of 5 frequency bands with 20 incidence angles per band. Data for more than 12 sites were collected, with forest stands having biomass densities ranging from 50 to 300 tons/ha of dry above-ground biomass. Results exploring the biomass saturation thresholds at these frequencies are shown, the system design is explained, and preliminary attempts at data visualization using this unique sensor design are described.
Sul, Onejae; Lee, Seung-Beck
2017-01-01
In this article, we report on a flexible sensor based on a sandpaper-molded elastomer that simultaneously detects planar displacement, rotation angle, and vertical contact pressure. When displacement, rotation, or contact pressure is applied, the contact area between the translating top elastomer electrode and the three stationary bottom electrodes changes characteristically depending on the movement, making it possible to distinguish between them. The sandpaper-molded undulating surface of the elastomer reduces friction at the contact, so the sensor does not affect the movement during measurement. The sensor showed a 0.25 mm−1 displacement sensitivity with ±33 μm accuracy, a 0.027 degree−1 rotation sensitivity with ~0.95 degree accuracy, and a 4.96 kPa−1 pressure sensitivity. For possible application to joint-movement detection, we demonstrated that our sensor effectively detected the up-and-down motion of a human forefinger and the bending and straightening of a human arm. PMID:28878166
Choi, Eunsuk; Sul, Onejae; Lee, Seung-Beck
2017-09-06
In this article, we report on a flexible sensor based on a sandpaper-molded elastomer that simultaneously detects planar displacement, rotation angle, and vertical contact pressure. When displacement, rotation, or contact pressure is applied, the contact area between the translating top elastomer electrode and the three stationary bottom electrodes changes characteristically depending on the movement, making it possible to distinguish between them. The sandpaper-molded undulating surface of the elastomer reduces friction at the contact, so the sensor does not affect the movement during measurement. The sensor showed a 0.25 mm−1 displacement sensitivity with ±33 μm accuracy, a 0.027 degree−1 rotation sensitivity with ~0.95 degree accuracy, and a 4.96 kPa−1 pressure sensitivity. For possible application to joint-movement detection, we demonstrated that our sensor effectively detected the up-and-down motion of a human forefinger and the bending and straightening of a human arm.
Stability test of the silicon Fiber Bragg Grating embroidered on textile for joint angle measurement
NASA Astrophysics Data System (ADS)
Apiwattanadej, Thanit; Chun, Byung Jae; Lee, Hyub; Li, King Ho Holden; Kim, Young-Jin
2017-06-01
Recently, fiber Bragg grating (FBG) sensors have been used for motion-tracking applications. However, the sensitivity, linearity and stability of such systems have not been fully studied. Herein, an optical FBG embroidered on a stretchable supportive textile for elbow-movement measurement was developed. The sensing principle is based on the shift of the Bragg wavelength due to strain from elbow movements. The relationship between elbow movement and reflected Bragg wavelength was found to be linear. The dynamic range of the FBG sensor on the elbow support is 0 to 120 degrees. Finally, the stability of the FBG sensor on the supportive textile was tested during exercise and during cleaning with water. The sensitivity of FBG sensors for joint-angle measurement, and the effect of movement and cleaning on the FBG signals after real-world use, will provide the basic knowledge for the design and implementation of future optical-fiber-based wearable devices.
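The sensing principle above is the first-order FBG strain response; a sketch with typical silica-fiber values (the Bragg wavelength and photo-elastic coefficient below are common textbook assumptions, not values from the abstract):

```python
def bragg_shift_nm(strain, lam_bragg_nm=1550.0, p_e=0.22):
    """First-order FBG strain response: the Bragg wavelength shifts by
    lam * (1 - p_e) * strain, where p_e is the effective photo-elastic
    coefficient of the fiber (~0.22 for silica)."""
    return lam_bragg_nm * (1.0 - p_e) * strain

# 1000 microstrain from elbow flexion -> about 1.2 nm shift at 1550 nm
shift = bragg_shift_nm(1e-3)
```

Calibrating this wavelength shift against the elbow angle gives the linear joint-angle response the study reports.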
Asymmetrical dual tapered fiber Mach-Zehnder interferometer for fiber-optic directional tilt sensor.
Lee, Cheng-Ling; Shih, Wen-Cheng; Hsu, Jui-Ming; Horng, Jing-Shyang
2014-10-06
This work proposes a novel, highly sensitive and directional fiber tilt sensor based on an asymmetrical dual tapered fiber Mach-Zehnder interferometer (ADTFMZI). The fiber-optic tilt sensor consists of two abrupt tapers with different tapered waists, into which a set of iron spheres is incorporated to generate an asymmetrical strain in the ADTFMZI that is correlated with the tilt angle and the direction of inclination. Owing to the asymmetrical structure of the dual tapers, the proposed sensor can detect whether a structure is horizontal or non-horizontal, and whether it is tilted clockwise or counterclockwise, by measuring the spectral responses. Experimental results show that the spectral wavelengths are blue-shifted when the sensor tilts clockwise (-θ) and red-shifted when it tilts counterclockwise (+θ). Tilt-angle sensitivities of about 335 pm/deg and 125 pm/deg are achieved in the -θ and +θ directions, respectively, with the proposed sensing scheme.
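Given the direction-dependent sensitivities reported, the tilt angle can be recovered from the sign and magnitude of the wavelength shift; a first-order sketch using the quoted values (assuming a purely linear response, which the abstract only states approximately):

```python
def tilt_deg_from_shift(shift_pm, s_cw=335.0, s_ccw=125.0):
    """Invert the reported linear responses: a blue shift (negative
    shift_pm) indicates clockwise tilt (-theta) at ~335 pm/deg, a red
    shift indicates counterclockwise tilt (+theta) at ~125 pm/deg."""
    return shift_pm / (s_cw if shift_pm < 0 else s_ccw)
```

The asymmetry of the two sensitivities is exactly what makes the sensor directional: sign and scale of the response together identify the inclination direction.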
Buffet induced structural/flight-control system interaction of the X-29A aircraft
NASA Technical Reports Server (NTRS)
Voracek, David F.; Clarke, Robert
1991-01-01
High angle-of-attack flight regime research is currently being conducted for modern fighter aircraft at the NASA Ames Research Center's Dryden Flight Research Facility. This flight regime provides enhanced maneuverability to fighter pilots in combat situations. Flight research data are being acquired to compare and validate advanced computational fluid dynamic solutions and wind-tunnel models. High angle-of-attack flight creates unique aerodynamic phenomena including wing rock and buffet on the airframe. These phenomena increase the level of excitation of the structural modes, especially on the vertical and horizontal stabilizers. With high gain digital flight-control systems, this structural response may result in an aeroservoelastic interaction. A structural interaction on the X-29A aircraft was observed during high angle-of-attack flight testing. The roll and yaw rate gyros sensed the aircraft's structural modes at 11, 13, and 16 Hz. The rate gyro output signals were then amplified through the flight-control laws and sent as commands to the flaperons and rudder. The flight data indicated that as the angle of attack increased, the amplitude of the buffet on the vertical stabilizer increased, which resulted in more excitation to the structural modes. The flight-control system sensors and command signals showed this increase in modal power at the structural frequencies up to a 30 degree angle-of-attack. Beyond a 30 degree angle-of-attack, the vertical stabilizer response, the feedback sensor amplitude, and control surface command signal amplitude remained relatively constant. Data are presented that show the increased modal power in the aircraft structural accelerometers, the feedback sensors, and the command signals as a function of angle of attack. This structural interaction is traced from the aerodynamic buffet to the flight-control surfaces.
Ground Optical Signal Processing Architecture for Contributing SSA Space Based Sensor Data
NASA Astrophysics Data System (ADS)
Koblick, D.; Klug, M.; Goldsmith, A.; Flewelling, B.; Jah, M.; Shanks, J.; Piña, R.
2014-09-01
The main objective of the DARPA program Orbit Outlook (O^2) is to improve the metric tracking and detection performance of the Space Surveillance Network (SSN) by adding a diverse, low-cost network of contributing sensors to the Space Situational Awareness (SSA) mission. To accomplish this objective, not only must a sensor be in constant communication with a planning and scheduling system to process tasking requests, but there must also be an underlying framework to provide useful data products, such as angles-only measurements. Existing optical signal processing implementations such as the Optical Processing Architecture at Lincoln (OPAL) are capable of converting mission data collections to angles-only observations, but may be difficult for many users to obtain, support, and customize for low-cost missions and demonstration programs. The Ground Optical Signal Processing Architecture (GOSPA) will ingest raw imagery and telemetry data from a space-based electro-optical sensor and perform a background removal process that removes anomalous pixels and dominant temporal noise and interpolates over bad pixels. After background removal, the streak end points and target centroids are located using a corner detection algorithm developed by the Air Force Research Laboratory. These identified streak locations are then fused with the corresponding spacecraft telemetry data to determine the Right Ascension and Declination measurements with respect to time. To demonstrate the performance of GOSPA, non-rate-tracking collections against a satellite in geosynchronous orbit are simulated from a visible optical imaging sensor in a polar low Earth orbit. Stars, noise, and bad pixels are added to the simulated images based on look angles and sensor parameters.
These collections are run through the GOSPA framework to provide angles-only measurements to the Air Force Research Laboratory Constrained Admissible Region Multiple Hypothesis Filter (CAR-MHF), in which an initial orbit determination is performed and compared to truth data.
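The background removal step described above (temporal-noise removal, anomalous-pixel rejection, and bad-pixel interpolation) can be sketched as follows. The function name, the k-sigma rejection threshold, and the 3x3 interpolation window are illustrative assumptions, not the actual GOSPA implementation:

```python
import numpy as np

def remove_background(frames, bad_pixel_mask, k=5.0):
    """Illustrative background removal for a stack of sensor frames.

    frames: (N, H, W) array of raw counts; bad_pixel_mask: (H, W) bool.
    Subtracts the per-pixel temporal median (the dominant temporal
    background), zeroes pixels deviating beyond k sigma (anomalous
    pixels), and interpolates over known bad pixels with a local mean.
    """
    median = np.median(frames, axis=0)
    residual = frames - median                      # remove temporal background
    sigma = np.std(residual, axis=0) + 1e-9
    anomalous = np.abs(residual) > k * sigma[None]  # per-frame outlier pixels
    cleaned = np.where(anomalous, 0.0, residual)
    # replace each known bad pixel with the mean of its 3x3 neighborhood
    for y, x in zip(*np.nonzero(bad_pixel_mask)):
        y0, y1 = max(y - 1, 0), min(y + 2, cleaned.shape[1])
        x0, x1 = max(x - 1, 0), min(x + 2, cleaned.shape[2])
        cleaned[:, y, x] = cleaned[:, y0:y1, x0:x1].mean(axis=(1, 2))
    return cleaned
```

Streak endpoint detection and the fusion with telemetry would then operate on the cleaned residual frames.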
NASA Technical Reports Server (NTRS)
Pang, Yong; Lefsky, Michael; Sun, Guoqing; Ranson, Jon
2011-01-01
A spaceborne lidar mission could serve multiple scientific purposes, including remote sensing of ecosystem structure, carbon storage, terrestrial topography, and ice sheet monitoring. The measurement requirements of these different goals will require compromises in sensor design. Footprint diameters that would be larger than optimal for vegetation studies have been proposed. Some spaceborne lidar mission designs include the possibility that a lidar sensor would share a platform with another sensor, which might require off-nadir pointing at angles of up to 16°. To resolve multiple mission goals and sensor requirements, detailed knowledge of the sensitivity of sensor performance to these aspects of mission design is required. This research used a radiative transfer model to investigate the sensitivity of forest height estimates to footprint diameter, off-nadir pointing, and their interaction over a range of forest canopy properties. An individual-based forest model was used to simulate stands of mixed conifer forest in the Tahoe National Forest (Northern California, USA) and stands of deciduous forests in the Bartlett Experimental Forest (New Hampshire, USA). Waveforms were simulated for stands generated by a forest succession model using footprint diameters of 20 m to 70 m. Off-nadir angles of 0° to 16° were considered for a 25 m footprint diameter. Footprint diameters in the range of 25 m to 30 m were optimal for estimates of maximum forest height (R² of 0.95 and RMSE of 3 m). As expected, the contribution of vegetation height to the vertical extent of the waveform decreased with larger footprints, while the contribution of terrain slope increased. Precision of estimates decreased with an increasing off-nadir pointing angle, but off-nadir pointing had less impact on height estimates in deciduous forests than in coniferous forests.
When pointing off-nadir, the decrease in precision depended on the local incidence angle (the angle between the off-nadir beam and a line normal to the terrain surface), which in turn depends on the off-nadir pointing angle, terrain slope, and the difference between the laser pointing azimuth and terrain aspect; the effect was larger when the sensor azimuth was aligned with the terrain aspect, but when aspect and azimuth were opposed there was virtually no effect on R² or RMSE. A second effect of off-nadir pointing is that the laser beam intersects individual crowns, and the canopy as a whole, from a different angle, which had a distinct effect on the precision of lidar height estimates, decreasing R² and increasing RMSE; the effect was most pronounced for coniferous crowns.
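The local incidence angle defined above can be computed from the off-nadir pointing angle, terrain slope, and azimuth-aspect difference with the standard spherical-trigonometry relation; this is a textbook remote-sensing formula, not code from the study:

```python
import math

def local_incidence_angle(off_nadir_deg, slope_deg, azimuth_deg, aspect_deg):
    """Angle (degrees) between an off-nadir beam and the terrain surface
    normal, from cos(i) = cos(p)cos(s) + sin(p)sin(s)cos(az - aspect),
    where p is the off-nadir pointing angle and s the terrain slope."""
    p, s = math.radians(off_nadir_deg), math.radians(slope_deg)
    dphi = math.radians(azimuth_deg - aspect_deg)
    cos_i = math.cos(p) * math.cos(s) + math.sin(p) * math.sin(s) * math.cos(dphi)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_i))))
```

For a 16° off-nadir beam over a 10° slope, the incidence angle ranges from 6° (beam azimuth matching the aspect) to 26° (opposed), which is the geometry the sensitivity analysis explores.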
NASA Astrophysics Data System (ADS)
Florio, Christopher J.; Cota, Steve A.; Gaffney, Stephanie K.
2010-08-01
In a companion paper presented at this conference we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) may be used in conjunction with a limited number of runs of AFRL's MODTRAN4 radiative transfer code to quickly predict the top-of-atmosphere (TOA) radiance received in the visible through midwave IR (MWIR) by an earth-viewing sensor, for any arbitrary combination of solar and sensor elevation angles. The method is particularly useful for large-scale scene simulations where each pixel could have a unique value of reflectance/emissivity and temperature, making the run-time required for direct prediction via MODTRAN4 prohibitive. To be self-consistent, the method requires an atmospheric model (defined, at a minimum, as a set of vertical temperature, pressure, and water vapor profiles) that is consistent with the average scene temperature. MODTRAN4 provides only six model atmospheres, ranging from sub-arctic winter to tropical conditions - too few to cover, with sufficient temperature resolution, the full range of average scene temperatures that might be of interest. Model atmospheres consistent with intermediate temperature values can be difficult to come by, and in any event their use would be too cumbersome in trade studies involving a large number of average scene temperatures. In this paper we describe and assess a method for predicting TOA radiance for any arbitrary average scene temperature, starting from only a limited number of model atmospheres.
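A minimal sketch of the kind of lookup such a method enables, assuming TOA radiances have already been precomputed by MODTRAN4 at a set of anchor scene temperatures; the function, its inputs, and the linear interpolation are illustrative assumptions, not the paper's actual algorithm:

```python
def toa_radiance(scene_temp_K, anchors):
    """Interpolate TOA radiance between the two precomputed MODTRAN
    runs (anchor atmospheres) that bracket the average scene
    temperature. `anchors` is a list of (temperature_K, radiance)
    pairs; linear interpolation is an illustrative simplification."""
    anchors = sorted(anchors)
    for (t0, L0), (t1, L1) in zip(anchors, anchors[1:]):
        if t0 <= scene_temp_K <= t1:
            w = (scene_temp_K - t0) / (t1 - t0)   # fractional position
            return (1 - w) * L0 + w * L1
    raise ValueError("scene temperature outside anchor range")
```

The point of such a scheme is that the expensive radiative transfer runs happen once per anchor, after which per-pixel evaluation is trivially cheap.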
4MOST optical system: presentation and design details
NASA Astrophysics Data System (ADS)
Azaïs, Nicolas; Frey, Steffen; Bellido, Olga; Winkler, Roland
2017-09-01
The 4-meter Multi-Object Spectroscopic Telescope (4MOST) is a wide-field, high-multiplex spectroscopic survey facility under development for the Visible and Infrared Survey Telescope for Astronomy (VISTA) 4-meter telescope of the European Southern Observatory (ESO) at Cerro Paranal. The objective of 4MOST is to enable the simultaneous spectroscopy of a significant number of targets within a 2.5° diameter field of view, to allow high-efficiency all-sky spectroscopic surveys. A wide field corrector (WFC) is needed to couple targets across the 2.5° field diameter with the exit pupil concentric with the spherical focal surface where 2400 fibres are configured by a fibre positioner (AESOP). For optimal fibre optic coupling and active optics wavefront sensing, the WFC will correct optical aberrations of the primary (M1) and secondary (M2) VISTA optics across the full field of view and provide a well-defined and stable focal surface to which the acquisition/guiding sensors, wavefront sensors, and fibre positioner are interfaced. It will also compensate for the effects of atmospheric dispersion, allowing good chromatic coupling of stellar images with the fibre apertures over a wide range of telescope zenith angles (ZD). The fibres feed three spectrographs: two-thirds of the fibres will feed two low-resolution spectrographs and the remaining 812 fibres will feed a high-resolution spectrograph. The three spectrographs are fixed-configuration with three channels each. We present the 4MOST optical system together with optical simulation of subsystems.
Digital data from shuttle photography: The effects of platform variables
NASA Technical Reports Server (NTRS)
Davis, Bruce E.
1987-01-01
Two major criticisms of using Shuttle hand-held photography as an Earth science sensor are that it is nondigital and nonquantitative, and that it has inconsistent platform characteristics, e.g., variable look angles, especially as compared to remote sensing satellites such as LANDSAT and SPOT. However, these criticisms are assumptions and have not been systematically investigated. The spectral effects of off-nadir views of hand-held photography from the Shuttle, and their role in interpretation of lava flow morphology on the island of Hawaii, are studied. Digitization of the photography at JSC and the use of LIPS image analysis software in obtaining data are discussed. Preliminary interpretative results for one flow are given. Most of the time was spent in developing procedures and overcoming equipment problems. Preliminary data are satisfactory for detailed analysis.
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
NASA Technical Reports Server (NTRS)
Ranniger, C. U.; Sorenson, E. A.; Akin, D. L.
1995-01-01
The University of Maryland Space Systems Laboratory, as a participant in NASA's INSTEP program, is developing a non-invasive, self-contained sensor system which can provide quantitative measurements of joint angles and muscle fatigue in the hand and forearm. The goal of this project is to develop a system with which hand/forearm motion and fatigue metrics can be determined in various terrestrial and zero-G work environments. A preliminary study of the prototype sensor systems and data reduction techniques for the fatigue measurement system is presented. The sensor systems evaluated include fiberoptics, used to measure joint angle; surface electrodes, which measure the electrical signals created in muscle as it contracts; microphones, which measure the noise made by contracting muscle; and accelerometers, which measure the lateral muscle acceleration during contraction. The prototype sensor systems were used to monitor joint motion of the metacarpophalangeal joint and muscle fatigue in flexor digitorum superficialis and flexor carpi ulnaris in subjects performing gripping tasks. Subjects were asked to sustain a 60-second constant-contraction (isometric) exercise and subsequently to perform a repetitive handgripping task to failure. Comparison of the electrical and mechanical signals of the muscles during the different tasks will be used to evaluate the applicability of muscle signal measurement techniques developed for isometric contraction tasks to fatigue prediction in quasi-dynamic exercises. Potential data reduction schemes are presented.
Chan, Woei-Leong; Hsiao, Fei-Bin
2011-01-01
This paper presents a complete procedure for sensor compatibility correction of a fixed-wing Unmanned Air Vehicle (UAV). The sensors consist of a differential air pressure transducer for airspeed measurement, two airdata vanes installed on an airdata probe for angle of attack (AoA) and angle of sideslip (AoS) measurement, and an Attitude and Heading Reference System (AHRS) that provides attitude angles, angular rates, and acceleration. The procedure is mainly based on a two pass algorithm called the Rauch-Tung-Striebel (RTS) smoother, which consists of a forward pass Extended Kalman Filter (EKF) and a backward recursion smoother. On top of that, this paper proposes the implementation of the Wiener Type Filter prior to the RTS in order to avoid the complicated process noise covariance matrix estimation. Furthermore, an easy to implement airdata measurement noise variance estimation method is introduced. The method estimates the airdata and subsequently the noise variances using the ground speed and ascent rate provided by the Global Positioning System (GPS). It incorporates the idea of data regionality by assuming that some sort of statistical relation exists between nearby data points. Root mean square deviation (RMSD) is being employed to justify the sensor compatibility. The result shows that the presented procedure is easy to implement and it improves the UAV sensor data compatibility significantly. PMID:22163819
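The forward-filter/backward-smoother structure of the RTS algorithm described above can be illustrated with a scalar linear Kalman filter. The paper's forward pass is an EKF over the full airdata/attitude state; the linear, scalar sketch below, with assumed noise parameters, is shown only to make the two-pass structure concrete:

```python
import numpy as np

def rts_smoother(z, F=1.0, H=1.0, Q=1e-3, R=1e-1, x0=0.0, P0=1.0):
    """Scalar linear Kalman forward pass followed by the
    Rauch-Tung-Striebel backward recursion. z: 1-D measurement array."""
    n = len(z)
    xf = np.zeros(n); Pf = np.zeros(n)   # filtered state / covariance
    xp = np.zeros(n); Pp = np.zeros(n)   # predicted state / covariance
    x, P = x0, P0
    for k in range(n):                   # forward pass (Kalman filter)
        xp[k], Pp[k] = F * x, F * P * F + Q
        K = Pp[k] * H / (H * Pp[k] * H + R)          # Kalman gain
        x = xp[k] + K * (z[k] - H * xp[k])
        P = (1 - K * H) * Pp[k]
        xf[k], Pf[k] = x, P
    xs = xf.copy()
    for k in range(n - 2, -1, -1):       # backward pass (RTS smoothing)
        C = Pf[k] * F / Pp[k + 1]
        xs[k] = xf[k] + C * (xs[k + 1] - xp[k + 1])
    return xs
```

Because the backward pass uses future measurements, the smoothed estimate is less noisy than the filtered one, which is why the procedure is applied offline for sensor compatibility correction.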
Omni-Directional Scanning Localization Method of a Mobile Robot Based on Ultrasonic Sensors.
Mu, Wei-Yi; Zhang, Guang-Peng; Huang, Yu-Mei; Yang, Xin-Gang; Liu, Hong-Yan; Yan, Wen
2016-12-20
Improved ranging accuracy is obtained through the development of a novel ultrasonic sensor ranging algorithm that, unlike the conventional ranging algorithm, considers the divergence angle and the incidence angle of the ultrasonic sensor simultaneously. An ultrasonic sensor scanning method is developed based on this algorithm for the recognition of an inclined plate and for obtaining the localization of the ultrasonic sensor relative to the inclined plate reference frame. The ultrasonic sensor scanning method is then leveraged for the omni-directional localization of a mobile robot: the ultrasonic sensors are installed on the mobile robot and follow its spin, the inclined plate is recognized, and the position and posture of the robot are acquired with respect to the coordinate system of the inclined plate, realizing the localization of the robot. Finally, the localization method is implemented in an omni-directional scanning localization experiment with the independently researched and developed mobile robot. Localization accuracies of up to ±3.33 mm for the front, ±6.21 mm for the lateral, and ±0.20° for the posture are obtained, verifying the correctness and effectiveness of the proposed localization method.
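As a toy illustration of inclined-plate recognition (not the paper's divergence/incidence-angle algorithm), two ranges measured a known baseline apart already determine a plate's inclination and mid-point distance:

```python
import math

def plate_pose_from_two_ranges(d1, d2, baseline):
    """Toy geometry: ultrasonic ranges d1 and d2, taken at two sensor
    positions `baseline` apart along the robot, give the plate's
    inclination (degrees) and the distance at the midpoint. A
    simplification for illustration only."""
    incline = math.degrees(math.atan2(d2 - d1, baseline))
    return incline, (d1 + d2) / 2.0
```

A scanning sensor generalizes this idea by sweeping many such range samples across the plate and fitting the plate plane, which is what makes the omni-directional pose estimate possible.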
High speed three-dimensional laser scanner with real time processing
NASA Technical Reports Server (NTRS)
Lavelle, Joseph P. (Inventor); Schuet, Stefan R. (Inventor)
2008-01-01
A laser scanner computes a range from a laser line to an imaging sensor. The laser line illuminates a detail within an area covered by the imaging sensor, the area having a first dimension and a second dimension. The detail has a dimension perpendicular to the area. A traverse moves a laser emitter coupled to the imaging sensor, at a height above the area. The laser emitter is positioned at an offset along the scan direction with respect to the imaging sensor, and is oriented at a depression angle with respect to the area. The laser emitter projects the laser line along the second dimension of the area at a position where an image frame is acquired. The imaging sensor is sensitive to laser reflections from the detail produced by the laser line. The imaging sensor images the laser reflections from the detail to generate the image frame. A computer having a pipeline structure is connected to the imaging sensor for reception of the image frame, and for computing the range to the detail using height, depression angle and/or offset. The computer displays the range to the area and detail thereon covered by the image frame.
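The underlying triangulation geometry can be sketched as follows, assuming a nadir-looking camera with a calibrated metres-per-pixel scale; this is a simplification of the patent's range computation, not its actual pipeline:

```python
import math

def detail_height(shift_m, depression_angle_deg):
    """Laser-line triangulation: a surface detail of height z displaces
    the projected line along the scan direction by shift = z / tan(a),
    where a is the laser depression angle, so z = shift * tan(a).
    `shift_m` is the observed line displacement converted to metres."""
    return shift_m * math.tan(math.radians(depression_angle_deg))
```

A steeper depression angle trades measurement range for height resolution: the same physical detail produces a smaller line shift, which is one reason the emitter offset and angle appear together in the range computation.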
2011-12-19
NASA acquired November 24, 2011 From its vantage 824 kilometers (512 miles) above Earth, the Visible Infrared Imager Radiometer Suite (VIIRS) on the NPOESS Preparatory Project (NPP) satellite gets a complete view of our planet every day. This image from November 24, 2011, is the first complete global image from VIIRS. The NPP satellite launched on October 28, 2011, and VIIRS acquired its first measurements on November 21. To date, the images are preliminary, used to gauge the health of the sensor as engineers continue to power it up for full operation. Rising from the south and setting in the north on the daylight side of Earth, VIIRS images the surface in long wedges measuring 3,000 kilometers (1,900 miles) across. The swaths from each successive orbit overlap one another, so that at the end of the day, the sensor has a complete view of the globe. The Arctic is missing because it is too dark to view in visible light during the winter. The NPP satellite was placed in a Sun-synchronous orbit, a unique path that takes the satellite over the equator at the same local (ground) time in every orbit. So, when NPP flies over Kenya, it is about 1:30 p.m. on the ground. When NPP reaches Gabon—about 3,000 kilometers to the west—on the next orbit, it is close to 1:30 p.m. on the ground. This orbit allows the satellite to maintain the same angle between the Earth and the Sun so that all images have similar lighting. The consistent lighting is evident in the daily global image. Stripes of sunlight (sunglint) reflect off the ocean in the same place on the left side of every swath. The consistent angle is important because it allows scientists to compare images from year to year without worrying about extreme changes in shadows and lighting. The image also shows a band of haze along the right side of every orbit swath. When light travels through the atmosphere, it bounces off particles or scatters, making the atmosphere look hazy. 
The scattering effect is most pronounced along the edge of the swath, where the sensor is looking at an angle through more of the atmosphere. Scientists can correct for this scattering effect, but need measurements from a range of wavelengths to do so. The degree to which light scatters depends partly on the wavelength of the light. Blue light scatters more than red light, for example, which is why the sky is blue. VIIRS measures 22 different wavelengths of light, but not all of the sensor’s detectors are operating at peak performance yet. Those measuring thermal infrared light are not yet cold enough to collect reliable measurements. Once VIIRS begins full operations, it will produce a range of measurements from ocean temperature to clouds to the locations of fires. These measurements will help extend the record from earlier sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS). VIIRS is very similar to MODIS, but flies at a higher altitude to measure the whole planet without gaps. (MODIS daily measurements have gaps at the equator. See the MODIS image from November 24.) VIIRS also sees the Earth in less detail, 375 meters per pixel, compared to 250 meters per pixel for MODIS. Image by NASA’s NPP Land Product Evaluation and Testing Element. Caption by Holli Riebeek. Credit: NASA Earth Observatory
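The wavelength dependence noted above follows, for Rayleigh scattering, an inverse fourth power of wavelength; the example band centers below are generic illustrative values, not VIIRS channel definitions:

```python
def rayleigh_ratio(lambda_blue_nm, lambda_red_nm):
    """Ratio of Rayleigh scattering intensity at a shorter (blue)
    wavelength relative to a longer (red) one: intensity scales as
    wavelength**-4, which is why swath-edge haze is strongest in
    blue bands."""
    return (lambda_red_nm / lambda_blue_nm) ** 4
```

For example, 450 nm light scatters roughly (650/450)^4, about 4.4 times more strongly than 650 nm light, which is why multi-wavelength measurements are needed to separate the haze from the surface signal.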
NASA Astrophysics Data System (ADS)
Dowdy, Josh; Anderson, Derek T.; Luke, Robert H.; Ball, John E.; Keller, James M.; Havens, Timothy C.
2016-05-01
Explosive hazards in current and former conflict zones are a threat to both military and civilian personnel. As a result, much effort has been dedicated to identifying automated algorithms and systems to detect these threats. However, robust detection is complicated due to factors like the varied composition and anatomy of such hazards. In order to solve this challenge, a number of platforms (vehicle-based, handheld, etc.) and sensors (infrared, ground penetrating radar, acoustics, etc.) are being explored. In this article, we investigate the detection of side attack explosive ballistics via a vehicle-mounted acoustic sensor. In particular, we explore three acoustic features, one in the time domain and two on synthetic aperture acoustic (SAA) beamformed imagery. The idea is to exploit the varying acoustic frequency profile of a target due to its unique geometry and material composition with respect to different viewing angles. The first two features build their angle specific frequency information using a highly constrained subset of the signal data and the last feature builds its frequency profile using all available signal data for a given region of interest (centered on the candidate target location). Performance is assessed in the context of receiver operating characteristic (ROC) curves on cross-validation experiments for data collected at a U.S. Army test site on different days with multiple target types and clutter. Our preliminary results are encouraging and indicate that the top performing feature is the unrolled two dimensional discrete Fourier transform (DFT) of SAA beamformed imagery.
Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan
2017-02-20
In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Starting from the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. Computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, the proposed method does not require complex operations in the frequency domain and reduces computational complexity.
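The conventional complex-acoustic-intensity baseline mentioned above reduces, in its simplest time-domain form, to averaging pressure-velocity products and taking an arctangent; this is a textbook baseline, not the paper's matched-filtering method:

```python
import numpy as np

def azimuth_from_avs(p, vx, vy):
    """Conventional acoustic-intensity azimuth estimate for a single
    acoustic vector sensor: time-average the products of pressure p
    with the two orthogonal particle-velocity channels vx, vy, then
    take the arctangent of the resulting intensity components."""
    Ix = np.mean(p * vx)   # x component of active intensity
    Iy = np.mean(p * vy)   # y component of active intensity
    return np.degrees(np.arctan2(Iy, Ix))
```

The paper's contribution is to sharpen this estimate for active sonar by matched filtering the echo before forming the intensity average, which improves precision at low signal-to-noise ratio.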
Analysis of load monitoring system in hydraulic mobile cranes
NASA Astrophysics Data System (ADS)
Kalairassan, G.; Boopathi, M.; Mohan, Rijo Mathew
2017-11-01
Load moment limiters, or safe load control systems, are very important in crane safety. The system detects the moment of the lifted load and compares this actual moment with the rated moment. The system uses multiple sensors, such as a boom angle sensor, a boom length sensor for telescopic booms, pressure transducers for measuring the load, an anti-two-block switch, and roller switches. The system works both on rubber and on outriggers. The sensors measure the boom extension, boom angle, and load, which are given as inputs to the central processing unit; this calculates the safe working load range for that particular configuration of the crane and compares it with the predetermined safe load. If the load exceeds the safe load, actions that reduce the load moment will be taken, namely boom telescopic retraction and boom lifting. The anti-two-block switch is used to prevent the two-blocking condition. The system is calibrated and load tested for utmost precision.
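The core comparison the system performs can be sketched as follows. The gravity constant and the simple rigid-boom geometry are assumptions for illustration; a real limiter consults manufacturer load charts per crane configuration rather than a single rated moment:

```python
import math

def safe_to_lift(load_kg, boom_len_m, boom_angle_deg, rated_moment_Nm):
    """Compare the actual load moment (load weight times horizontal
    radius) with the rated moment, as the monitoring system does.
    The boom is modelled as a rigid link pivoting at its base."""
    radius = boom_len_m * math.cos(math.radians(boom_angle_deg))  # horizontal reach
    moment = load_kg * 9.81 * radius
    return moment <= rated_moment_Nm
```

Note that raising the boom angle shrinks the horizontal radius and hence the moment, which is why boom lifting is one of the two corrective actions the text names.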
Impedance spectroscopy of undoped and Cr-doped ZnO gas sensors under different oxygen concentrations
NASA Astrophysics Data System (ADS)
Al-Hardan, N.; Abdullah, M. J.; Aziz, A. Abdul
2011-08-01
Thin films of undoped and chromium (Cr)-doped zinc oxide (ZnO) were synthesized by RF reactive co-sputtering for oxygen gas sensing applications. The prepared films showed a highly c-axis-oriented phase with a dominant (0 0 2) peak that appeared at a Bragg angle of around 34.13°, lower than that of the standard ZnO powder reference (34.42°). The peak shifted to a slightly higher angle with Cr doping. The operating temperature of the ZnO gas sensor was around 350 °C, which shifted to around 250 °C with Cr doping. The response of the sensor to oxygen gas was enhanced by doping ZnO with 1 at.% Cr. Impedance spectroscopy analysis showed that the resistance due to grain boundaries contributed significantly to the characteristics of the gas sensor.
79. VIEW OF VAL FIRING RANGE LOOKING SOUTHWEST SHOWING LAUNCHER ...
79. VIEW OF VAL FIRING RANGE LOOKING SOUTHWEST SHOWING LAUNCHER BRIDGE, BARGES, SONAR BUOY RANGE AND MORRIS DAM IN BACKGROUND, June 10, 1948. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Mobile Robot Localization by Remote Viewing of a Colored Cylinder
NASA Technical Reports Server (NTRS)
Volpe, R.; Litwin, T.; Matthies, L.
1995-01-01
A system was developed for the Mars Pathfinder rover in which the rover checks its position by viewing the angle back to a colored cylinder with different colors for different angles. The rover determines distance by the apparent size of the cylinder.
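The apparent-size range estimate described above reduces to the pinhole-camera relation; the parameter names and values below are illustrative, not Pathfinder calibration data:

```python
def cylinder_distance(true_diameter_m, focal_length_px, apparent_width_px):
    """Pinhole-camera range from apparent size: an object of known
    physical width subtending `apparent_width_px` pixels, with the
    camera focal length expressed in pixels, lies at
    distance = width * focal / apparent_width."""
    return true_diameter_m * focal_length_px / apparent_width_px
```

The colored bands then resolve the remaining unknown, bearing: the visible color combination encodes which side of the cylinder the rover is viewing.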
New developments of a knowledge based system (VEG) for inferring vegetation characteristics
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Harrison, P. A.; Harrison, P. R.
1992-01-01
An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).
An approach to improving the signal-to-optical-noise ratio of pulsed magnetic field photonic sensors
NASA Astrophysics Data System (ADS)
Wang, Jiang-ping; Li, Yu-quan
2008-12-01
In recent years, interest in pulsed magnetic field sensors has increased widely; magnetic field measurement plays a critical part in various scientific and technical areas. Research on pulsed magnetic field characteristics, and on the corresponding measuring and defending means, requires a sensor with high immunity to electrical noise, high sensitivity, high accuracy, and wide dynamic range. Conventional magnetic field measurement systems currently use active metallic probes, which can disturb the magnetic field being measured and make the sensor very sensitive to electromagnetic noise. Photonic magnetic field sensors exhibit great advantages with respect to electronic ones: very good galvanic insulation, high sensitivity, and very wide bandwidth. Photonic sensing technology is therefore well suited to pulsed magnetic field measurement. A type of pulsed magnetic field photonic sensor has been designed, analyzed, and tested. The effect of the cross polarization angle of the photonic sensor on the signal-to-optical-noise ratio is theoretically analyzed in this paper, and a novel approach for improving the signal-to-optical-noise ratio of pulsed magnetic field sensors is proposed. Experiments have proved that this approach is practical. The theoretical analysis and simulation results show that the signal-to-optical-noise ratio can be considerably improved by setting a suitable cross polarization angle.
Bergamini, Elena; Ligorio, Gabriele; Summa, Aurora; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2014-10-09
Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of stochastic (Extended Kalman Filter) and complementary (non-linear observer) filtering, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angle) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading, and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.
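The complementary-filtering idea that the non-linear observer represents can be illustrated with a minimal pitch estimator blending integrated gyro rate with the accelerometer's gravity reference; the blending constant and update form are generic assumptions, not values or code from the study:

```python
import math

def complementary_pitch(gyro_rates, accels, dt, alpha=0.98):
    """Minimal complementary filter for pitch (rad). gyro_rates: pitch
    rates (rad/s); accels: (ax, ay, az) samples in m/s^2. The gyro
    integral supplies the fast dynamics; the accelerometer's gravity
    direction corrects the slow drift."""
    pitch = 0.0
    out = []
    for q, (ax, ay, az) in zip(gyro_rates, accels):
        acc_pitch = math.atan2(-ax, math.hypot(ay, az))  # gravity-referenced pitch
        pitch = alpha * (pitch + q * dt) + (1 - alpha) * acc_pitch
        out.append(pitch)
    return out
```

The stochastic alternative (an Extended Kalman Filter) replaces the fixed blending constant with a gain computed from modelled noise covariances; the study's finding is that both families behave similarly under the tested conditions.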
Xiang, Yun; Yan, Lei; Zhao, Yun-sheng; Gou, Zhi-yang; Chen, Wei
2011-12-01
Polarized reflectance is influenced by factors such as the target's physical and chemical properties, the viewing geometry (composed of the light incidence zenith, viewing zenith, and viewing azimuth relative to the light incidence), surface roughness and texture, surface density, detection wavelength, and polarization phase angle. In the present paper, the influence of surface roughness on the variation of the degree of polarization (DOP) of biotite plagioclase gneiss with viewing angle was investigated and analyzed quantitatively. The polarized spectra were measured by an ASD FS3 spectrometer on the goniometer located at Northeast Normal University. With the incident zenith angle fixed at 50°, rock surfaces with different roughness showed that, in the specular reflection direction, the DOP spectrum within 350-2500 nm first increased to a peak and then declined as the viewing zenith angle varied from 0° to 80°. The characteristic band (520 ± 10) nm was selected for further analysis. Correlation analysis between the peak DOP value over zenith angle and the surface roughness showed a power-function relationship, with the regression equation y = 0.604x^(-0.297), R² = 0.9854. The correlation model between the angle at which the peak occurs and the surface roughness is y = 3.4194x + 51.584 (y < 90°), R² = 0.8177. As the detection azimuth moves farther from the 180° azimuth where the maximum DOP exists, the DOP lowers gradually and tends to 0. In the 180° detection azimuth, correlation analysis between the peak DOP values in the (520 ± 10) nm band for five rocks and their surface roughness indicates a power function, with the regression equation y = 0.5822x^(-0.333), R² = 0.9843. F-tests of the above regression models indicate that the peak value and its corresponding viewing angle correlate strongly with surface roughness.
The study provides a theoretical basis for polarization remote sensing and supports rock and urban-architecture discrimination and mineral mapping.
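Power-function regressions of the form y = a*x^b, like those reported above, can be fitted by ordinary least squares in log-log space. A minimal sketch (an illustrative method, not the authors' fitting code):

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by linear least squares on (log x, log y):
    log y = b * log x + log a."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b
```

For data that truly follow a power law, the fitted coefficients reproduce the generating values, and R² of the log-log fit plays the role of the R² values quoted in the abstract.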
Dual light field and polarization imaging using CMOS diffractive image sensors.
Jayasuriya, Suren; Sivaramakrishnan, Sriram; Chuang, Ellen; Guruaribam, Debashree; Wang, Albert; Molnar, Alyosha
2015-05-15
In this Letter we present, to the best of our knowledge, the first integrated CMOS image sensor that can simultaneously perform light field and polarization imaging without the use of external filters or additional optical elements. Previous work has shown how photodetectors with two stacks of integrated metal gratings above them (called angle sensitive pixels) diffract light in a Talbot pattern to capture four-dimensional light fields. We show, in addition to diffractive imaging, that these gratings polarize incoming light and characterize the response of these sensors to polarization and incidence angle. Finally, we show two applications of polarization imaging: imaging stress-induced birefringence and identifying specular reflections in scenes to improve light field algorithms for these scenes.
NASA Astrophysics Data System (ADS)
Silva, Ana S.; Catarino, André; Correia, Miguel V.; Frazão, Orlando
2013-12-01
The work presented here describes the development and characterization of intensity fiber optic sensor integrated in a specifically designed piece of garment to measure elbow flexion. The sensing head is based on macrobending incorporated in the garment, and the increase of curvature number was studied in order to investigate which scheme provided a good result in terms of sensitivity and repeatability. Results showed the configuration that assured a higher sensitivity (0.644 dBm/deg) and better repeatability was the one with four loops. Ultimately, this sensor can be used for rehabilitation purposes to monitor human joint angles, namely, elbow flexion on stroke survivors while performing the reach functional task, which is the most common upper-limb human gesture.
Multi-angle Imaging SpectroRadiometer (MISR) Design Issues Influenced by Performance Requirements
NASA Technical Reports Server (NTRS)
Bruegge, C. J.; White, M. L.; Chrien, N. C. L.; Villegas, E. B.; Raouf, N.
1993-01-01
The design of an Earth Remote Sensing Sensor, such as the Multi-angle Imaging SpectroRadiometer (MISR), begins with a set of science requirements and is quickly followed by a set of instrument specifications.
Integrated multi sensors and camera video sequence application for performance monitoring in archery
NASA Astrophysics Data System (ADS)
Taha, Zahari; Arif Mat-Jizat, Jessnor; Amirul Abdullah, Muhammad; Muazu Musa, Rabiu; Razali Abdullah, Mohamad; Fauzi Ibrahim, Mohamad; Hanafiah Shaharudin, Mohd Ali
2018-03-01
This paper explains the development of a comprehensive archery performance monitoring software package consisting of three camera views and five body sensors. The five body sensors evaluate biomechanics-related variables: flexor and extensor muscle activity, heart rate, postural sway, and bow movement during archery performance. The three camera views and the five body sensors are integrated into a single computer application, which enables the user to view all the data in a single user interface. The five body sensors' data are displayed numerically and graphically in real time. The information transmitted by the body sensors is processed by an embedded algorithm that automatically computes a summary of the athlete's biomechanical performance and displays it in the application interface. This performance is later compared with the psycho-fitness performance pre-computed from data prefilled into the application. All the data (camera views, body sensor readings, and performance computations) are recorded for further analysis by a sports scientist. The developed application serves as a powerful tool for helping the coach and athletes observe and identify any incorrect technique employed during training, which gives room for correction and re-evaluation to improve overall performance in the sport of archery.
Multi-Beam Approach for Accelerating Alignment and Calibration of HyspIRI-Like Imaging Spectrometers
NASA Technical Reports Server (NTRS)
Eastwood, Michael L.; Green, Robert O.; Mouroulis, Pantazis; Hochberg, Eric B.; Hein, Randall C.; Kroll, Linley A.; Geier, Sven; Coles, James B.; Meehan, Riley
2012-01-01
A paper describes an optical stimulus that produces more consistent results, and can be automated for unattended, routine generation of data analysis products needed by the integration and testing team assembling a high-fidelity imaging spectrometer system. One key attribute of the system is an arrangement of pick-off mirrors that provides multiple input beams (five in this implementation) to simultaneously provide stimulus light to several field angles along the field of view of the sensor under test, allowing one data set to contain all the information that previously required five data sets to be separately collected. This stimulus can also be fed by quickly reconfigured sources that ultimately provide three data set types that would previously be collected separately using three different setups: Spectral Response Function (SRF), Cross-track Response Function (CRF), and Along-track Response Function (ARF), respectively. This method also lends itself to expansion of the number of field points if less interpolation across the field of view is desirable. An absolute minimum of three is required at the beginning stages of imaging spectrometer alignment.
NASA Astrophysics Data System (ADS)
Voss, K. J.; Morel, A.; Antoine, D.
2007-09-01
The radiance viewed from the ocean depends on the illumination and viewing geometry along with the water properties; this variation is called the bidirectional effect. The bidirectional effect depends on the inherent optical properties of the water, including the volume scattering function, and is important when comparing data from different satellite sensors. The current f/Q model of Morel et al. (2002), which describes the bidirectional effect, depends on modeled rather than measured water parameters and thus must be carefully validated. In this paper we combined upwelling radiance distribution data from several cruises, in varied water types and with a wide range of solar zenith angles. We compared modeled and measured Lview/Lnadir and found that the average difference between the model and the data was less than 0.01, while the RMS difference was on the order of 0.02-0.03. This is well within the statistical noise of the data, which was on the order of 0.04-0.05 due to environmental noise sources such as wave focusing.
Live Aircraft Encounter Visualization at FutureFlight Central
NASA Technical Reports Server (NTRS)
Murphy, James R.; Chinn, Fay; Monheim, Spencer; Otto, Neil; Kato, Kenji; Archdeacon, John
2018-01-01
Researchers at the National Aeronautics and Space Administration (NASA) have developed an aircraft data streaming capability that can be used to visualize live aircraft in near real time. During a joint Federal Aviation Administration (FAA)/NASA Airborne Collision Avoidance System flight series, test sorties between unmanned aircraft and manned intruder aircraft were shown in real time at NASA Ames' FutureFlight Central tower facility as a virtual representation of the encounter. This capability leveraged existing live surveillance, video, and audio data streams distributed through a Live, Virtual, Constructive test environment, then depicted the encounter from the point of view of any aircraft in the system, showing the proximity of the other aircraft. For the demonstration, position report data were sent to the ground from on-board sensors on the unmanned aircraft. The point of view can be changed dynamically, allowing encounters to be observed from all angles. Visualizing the encounters in real time provides a safe and effective method for observing live flight testing and a strong alternative to traveling to the remote test range.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasmussen, Andrew P.; Hale, Layton; Kim, Peter
Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning the multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
Hunter, Steven L.; Boro, Carl O.; Farris, Alvis
2002-01-01
A tiltmeter device having a pair of orthogonally disposed tilt sensors that are levelable within an inner housing containing the sensors. An outer housing can be rotated to level at least one of the sensor pair while the inner housing can be rotated to level the other sensor of the pair. The sensors are typically rotated up to about plus or minus 100 degrees. The device is effective for measuring tilts in a wide range of angles of inclination of wells and can be employed to level a platform containing a third sensor.
NASA Astrophysics Data System (ADS)
Senn, S.; Liewald, M.
2017-09-01
Deep-drawn parts often have complex designs and therefore must be trimmed or punched subsequently in a second stage. Due to the complex part geometry, most punching areas exhibit a critical slant angle (the angle between the part surface and the ram movement direction) that differs from the perpendicular. Piercing within a critical range of slant angles may lead to severe damage of the cutting tool. Consequently, expensive cam units are required to transform the ram moving direction so that the piercing process is performed perpendicularly to the local part surface. For modern sheet metals, however, this critical angle of attack has not been investigated adequately until now; as a result, cam units are also used in cases in which regular piercing at a high slant angle would be possible. The purpose of this study is to investigate the factors influencing punch damage during slant-angle piercing of high-strength steels and their effects. To this end, a modular shearing tool was designed that allows die parts to be switched easily to vary the cutting clearance and cutting angle. The target quantity of the study is the lateral deviation of the punch, which is monitored by an eddy current sensor. The sensor is located in the downholder and measures the lateral punch deviation in-line during manufacturing. The deviation is mainly influenced by the slant angle of the workpiece surface. Relative to slant angle and sheet thickness, the clearance has only a small influence on the measured punch deflection.
NASA Technical Reports Server (NTRS)
Gatebe, C. K.; King, M. D.; Tsay, S.-C.; Ji, Q.; Arnold, T.
2000-01-01
In this sensitivity study, we examined the ratio technique, the official method for remote sensing of aerosols over land from Moderate Resolution Imaging Spectroradiometer (MODIS) data, for view angles from nadir to 65 deg off-nadir using Cloud Absorption Radiometer (CAR) data collected during the Smoke, Clouds, and Radiation-Brazil (SCAR-B) experiment conducted in 1995. For the data analyzed and for the view angles tested, the results suggest that the reflectances ρ0.47 and ρ0.67 are predictable from ρ2.1 using ρ0.47 = ρ2.1/6, which is a slight modification of the standard ratio, and ρ0.67 = ρ2.1/2. These results hold for targets viewed from the backscattering direction, but not for the forward direction.
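The reported ratios are simple enough to state as code. A minimal sketch using the coefficients quoted in this study (the function name is illustrative):

```python
def predict_visible_reflectance(rho_2p1):
    """Ratio technique: predict visible-band surface reflectance from
    the 2.1 um channel reflectance, using this study's coefficients."""
    rho_0p47 = rho_2p1 / 6.0  # the study's slight modification of the usual ratio
    rho_0p67 = rho_2p1 / 2.0
    return rho_0p47, rho_0p67
```

As the abstract cautions, these relations were found to hold for backscattering viewing geometries only.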
Transmission-grating-based wavefront tilt sensor.
Iwata, Koichi; Fukuda, Hiroki; Moriwaki, Kousuke
2009-07-10
We propose a new type of tilt sensor. It consists of a grating and an image sensor, and it detects the tilt of a collimated wavefront reflected from a plane mirror. Its principle is described and analyzed based on wave optics. Experimental results show its validity. Simulations of an ordinary autocollimator and the proposed tilt sensor show that the effect of noise on the measured angle is smaller for the latter. These results show the possibility of making a smaller and simpler tilt sensor.
Tomographic wavefront retrieval by combined use of geometric and plenoptic sensors
NASA Astrophysics Data System (ADS)
Trujillo-Sevilla, J. M.; Rodríguez-Ramos, L. F.; Fernández-Valdivia, Juan J.; Marichal-Hernández, José G.; Rodríguez-Ramos, J. M.
2014-05-01
Modern astronomic telescopes take advantage of multi-conjugate adaptive optics, in which wavefront sensors play a key role. A single sensor capable of measuring wavefront phases at any angle of observation would be helpful when improving atmospheric tomographic reconstruction. A new sensor combining both geometric and plenoptic arrangements is proposed, and a simulation demonstrating its working principle is also shown. Results show that this sensor is feasible, and also that single extended objects can be used to perform tomography of atmospheric turbulence.
A novel method to detect ignition angle of diesel
NASA Astrophysics Data System (ADS)
Li, Baofu; Peng, Yong; Huang, Hongzhong
2018-04-01
This paper is based on the combustion signal collected by a piezomagnetic combustion sensor, taking the determination of when the diesel fuel begins to combust as its starting point. It analyzes the operating principle and pressure response of the combustion sensor, the compression peak signal of the diesel engine during the compression process, and several common methods. The authors put forward a new idea: the ignition angle timing can be determined more accurately by the compression-peak decomposition method. The method is then compared with several common methods.
A new method for determining which stars are near a star sensor field-of-view
NASA Technical Reports Server (NTRS)
Yates, Russell E., Jr.; Vedder, John D.
1991-01-01
A new method is described for determining which stars in a navigation star catalog are near a star sensor field of view (FOV). This method assumes that an estimate of the spacecraft inertial attitude is known. Vector component ranges for the star sensor FOV are computed, so that stars whose vector components lie within these ranges are near the star sensor FOV. This method requires no presorting of the navigation star catalog and is more efficient than traditional methods.
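The component-range idea can be sketched as a prefilter: any unit vector within a half-angle θ of the boresight differs from it by at most 2·sin(θ/2) in norm, hence in each component. This is an illustrative approximation of the approach, not the paper's exact bounds:

```python
import numpy as np

def stars_near_fov(catalog_vectors, boresight, half_angle_rad):
    """Prefilter a star catalog with per-component bounds.

    A star unit vector within half_angle_rad of the boresight satisfies
    |u - b| <= 2*sin(half_angle/2), so each component of u must lie in
    [b_i - tol, b_i + tol]; stars outside these ranges are rejected
    without any presorting of the catalog.
    """
    v = np.asarray(catalog_vectors, dtype=float)
    b = np.asarray(boresight, dtype=float)
    b = b / np.linalg.norm(b)
    tol = 2.0 * np.sin(half_angle_rad / 2.0)
    mask = np.all(np.abs(v - b) <= tol, axis=1)
    return np.nonzero(mask)[0]
```

The ranges are a necessary (not sufficient) condition, so survivors of this cheap test can be checked exactly with a dot product against cos(half_angle).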
Wang, Xingliang; Zhang, Youan; Wu, Huali
2016-03-01
The problem of impact angle control guidance for a field-of-view constrained missile against non-maneuvering or maneuvering targets is solved by using the sliding mode control theory. The existing impact angle control guidance laws with field-of-view constraint are only applicable against stationary targets and most of them suffer abrupt-jumping of guidance command due to the application of additional guidance mode switching logic. In this paper, the field-of-view constraint is handled without using any additional switching logic. In particular, a novel time-varying sliding surface is first designed to achieve zero miss distance and zero impact angle error without violating the field-of-view constraint during the sliding mode phase. Then a control integral barrier Lyapunov function is used to design the reaching law so that the sliding mode can be reached within finite time and the field-of-view constraint is not violated during the reaching phase as well. A nonlinear extended state observer is constructed to estimate the disturbance caused by unknown target maneuver, and the undesirable chattering is alleviated effectively by using the estimation as a compensation item in the guidance law. The performance of the proposed guidance law is illustrated with simulations. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
74. DETAIL VIEW OF INSIDE THE LAUNCHING BRIDGE LOOKING SOUTHWEST ...
74. DETAIL VIEW OF INSIDE THE LAUNCHING BRIDGE LOOKING SOUTHWEST SHOWING ADJUSTABLE STAIRS ON THE LEFT AND LAUNCHING TUBE ON THE RIGHT, Date unknown, circa 1948. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
A miniature pressure sensor for blast event evaluation
NASA Astrophysics Data System (ADS)
Wu, Nan; Wang, Wenhui; Tian, Ye; Niezrecki, Christopher; Wang, Xingwei
2011-06-01
Traumatic brain injury (TBI) is a serious potential threat to people who deal with explosive devices, and protection from TBI has attracted increasing interest. Considerable effort has been devoted to understanding the propagation of blast events and their effect on TBI. However, one of the biggest challenges is that currently available pressure sensors are not fast enough to capture the blast wave, especially its transient period. This paper reports an ultrafast pressure sensor that could be very useful for analyzing rapidly changing blast signals. The sensor is based on the Fabry-Perot (FP) principle. It uses a 45° angle-polished fiber sitting in a V-groove on a silicon chip. The end face of the angle-polished fiber and a diaphragm lifted off on the side wall of the V-groove form the FP cavity. The sensor is very small and can be mounted at different locations on a helmet to measure blast pressure simultaneously. Tests were conducted at the Natick Soldier Research, Development, and Engineering Center (NSRDEC) in Natick, MA. The sensors were mounted in a shock tube, side by side with reference sensors, to measure a rapidly increasing pressure. The results demonstrated that our sensors' responses agreed well with those of the electrical reference sensors, with comparable response times.
Park, Jae Byung; Lee, Seung Hun; Lee, Il Jae
2009-01-01
In this study, we propose a precise 3D lug pose detection sensor for automatic robot welding of a lug to a huge steel plate used in shipbuilding, where the lug is a handle for carrying the plate. The proposed sensor consists of a camera and four laser line diodes, and its design parameters are determined by analyzing its detectable range and resolution. For lug pose acquisition, four laser lines are projected on both the lug and the plate, and the projected lines are detected by the camera. For robust detection of the projected lines against illumination change, vertical threshold, thinning, Hough transform, and separated Hough transform algorithms are successively applied to the camera image. The lug pose acquisition is carried out in two stages: top view alignment and side view alignment. Top view alignment detects the coarse lug pose relatively far from the lug, and side view alignment detects the fine lug pose close to the lug. After top view alignment, the robot is controlled to move close to the side of the lug for side view alignment. In this way, the precise 3D lug pose can be obtained. Finally, experiments with the sensor prototype are carried out to verify the feasibility and effectiveness of the proposed sensor. PMID:22400007
NASA Astrophysics Data System (ADS)
LIM, M.; PARK, Y.; Jung, H.; SHIN, Y.; Rim, H.; PARK, C.
2017-12-01
Measuring all components of a physical property, for example the magnetic field, is more useful for subsequent interpretation and application than measuring its magnitude alone. To convert a property measured in three components on an arbitrary coordinate system (for example, a moving magnetic sensor body's coordinate system) into three components on a fixed coordinate system (for example, the geographical coordinate system) by rotations through Euler angles, the attitude of the sensor body is needed as a time series, which could be acquired by an INS-GNSS system with its axes installed coincident with those of the sensor body. But if magnetic sensors are to be installed in an array on the sea floor, without any attitude-acquisition facility, to monitor the variation of the magnetic field in time, then the relation between the geographical coordinate system and each sensor body's coordinate system must be estimated by comparing the vectors measured on both coordinate systems, on the assumption that the directions of the measured magnetic field are the same in both. For that estimation there are at least three approaches. The first is to calculate the three Euler angles phi, theta, psi from the equation V_geograph = Rx(phi) Ry(theta) Rz(psi) V_random, where V_geograph is the vector in the geographical coordinate system, Rx(phi) is the rotation matrix around the x axis by the angle phi, and so on. The second is to calculate the difference in inclination and declination between the two vectors in a spherical coordinate system. The third, used in this study, is to calculate the angle of rotation along a great circle around a rotation axis, together with the direction of that axis. We installed two FVM-400 fluxgate magnetometers (no. 1 and no. 2) in an array near Cheongyang Geomagnetic Observatory (IAGA code CYG) and acquired time series of magnetic fields at CYG and at the two magnetometers. Once the angle of rotation and the direction of the rotation axis were estimated for each pair (CYG and no. 1; CYG and no. 2), we rotated the measured time series of vectors using quaternion rotation to obtain three time series of magnetic fields, all on the geographical coordinate system, which were used to trace moving magnetic bodies over time in that area.
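The third approach, a single great-circle rotation applied via quaternions, can be sketched as follows. This is an illustrative implementation under the abstract's assumption that the field direction is shared between coordinate systems; function names are assumptions, not the authors' code:

```python
import numpy as np

def rotation_between(v_from, v_to):
    """Axis and angle of the single great-circle rotation taking
    v_from onto v_to (both are normalized internally)."""
    a = np.asarray(v_from, float)
    a = a / np.linalg.norm(a)
    b = np.asarray(v_to, float)
    b = b / np.linalg.norm(b)
    axis = np.cross(a, b)
    s = np.linalg.norm(axis)
    angle = np.arctan2(s, np.dot(a, b))
    # Parallel vectors: rotation is identity, axis is arbitrary
    return (axis / s if s > 1e-12 else np.array([1.0, 0.0, 0.0])), angle

def quat_rotate(axis, angle, vectors):
    """Rotate a time series of vectors by the unit quaternion
    q = (cos(angle/2), axis*sin(angle/2)), expanded as
    v' = v + 2w(u x v) + 2u x (u x v)."""
    w = np.cos(angle / 2.0)
    u = np.asarray(axis, float) * np.sin(angle / 2.0)
    v = np.asarray(vectors, float)
    t = 2.0 * np.cross(u, v)
    return v + w * t + np.cross(u, t)
```

Estimating the axis and angle once from a pair of simultaneous vectors, then applying `quat_rotate` to the whole time series, mirrors the procedure described for bringing each sea-floor sensor's record onto the geographical coordinate system.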
Optimal design of wide-view-angle waveplate used for polarimetric diagnosis of lithography system
NASA Astrophysics Data System (ADS)
Gu, Honggang; Jiang, Hao; Zhang, Chuanwei; Chen, Xiuguo; Liu, Shiyuan
2016-03-01
The diagnosis and control of polarization aberrations is one of the main concerns in a hyper numerical aperture (NA) lithography system. Waveplates are basic and indispensable optical components in the polarimetric diagnosis tools for the immersion lithography system. The retardance of a birefringent waveplate is highly sensitive to the incident angle of the light, which makes the conventional waveplate unsuitable for polarimetric diagnosis in an immersion lithography system with a hyper NA. In this paper, we propose a method for the optimal design of a wide-view-angle waveplate by combining two positive waveplates made from magnesium fluoride (MgF2) and two negative waveplates made from sapphire, using the simulated annealing algorithm. Theoretical derivations and numerical simulations are performed, and the results demonstrate that the maximum variation in the retardance of the optimally designed wide-view-angle waveplate is less than +/- 0.35° over a wide view-angle range of +/- 20°.
Satellite Ocean-Color Validation Using Ships of Opportunity. Chapter 5
NASA Technical Reports Server (NTRS)
Frouin, Robert; Cutchin, David L.; Gross-Colzy, Lydwine; Poteau, Antoine; Deschamps, Pierre-Yves
2003-01-01
The investigation's main objective is to collect from platforms of opportunity (merchant ships, research vessels) concomitant normalized water-leaving radiance and aerosol optical thickness data over the world's oceans. A global, long-term data set of these variables is needed to verify whether satellite retrievals of normalized water-leaving radiance are within acceptable error limits and, eventually, to adjust atmospheric correction schemes. To achieve this objective, volunteer officers, technicians, and scientists onboard the selected ships collect data with portable SIMBAD and Advanced SIMBAD (SIMBADA) radiometers. These instruments are specifically designed for evaluation of satellite-derived ocean color. They measure radiance in spectral bands typical of ocean-color sensors: the SIMBAD version in 5 spectral bands centered at 443, 490, 560, 670, and 870 nm, and the Advanced SIMBAD version in 11 spectral bands centered at 350, 380, 412, 443, 490, 510, 565, 620, 670, 750, and 870 nm. Aerosol optical thickness is obtained by viewing the sun disk, like a classic sun photometer. Normalized water-leaving radiance, or marine reflectance, is obtained by viewing the ocean surface through a vertical polarizer in a specific geometry (nadir angle of 45° and relative azimuth angle of 135°) to minimize direct sun glint and reflected sky radiation. The SIMBAD and SIMBADA data, after proper quality control and processing, are delivered to the SIMBIOS project office for inclusion in the SeaBASS archive. They complement data collected in a similar way by the Laboratoire d'Optique Atmospherique of the University of Lille, France. The SIMBAD and SIMBADA data are used to check the radiometric calibration of satellite ocean-color sensors after launch and to evaluate derived ocean-color variables (i.e., normalized water-leaving radiance, aerosol optical thickness, and aerosol type).
Analysis of the SIMBAD and SIMBADA data provides information on the accuracy of satellite retrievals of normalized water-leaving radiance, an understanding of the discrepancies between satellite and in situ data, and algorithms that reduce the discrepancies, contributing to more accurate and consistent global ocean color data sets.
View-angle-dependent AIRS Cloudiness and Radiance Variance: Analysis and Interpretation
NASA Technical Reports Server (NTRS)
Gong, Jie; Wu, Dong L.
2013-01-01
Upper tropospheric clouds play an important role in the global energy budget and hydrological cycle. Significant view-angle asymmetry has been observed in upper-level tropical clouds derived from eight years of Atmospheric Infrared Sounder (AIRS) 15 um radiances. Here, we find that the asymmetry also exists in the extra-tropics. It is larger during day than during night, more prominent near elevated terrain, and closely associated with deep convection and wind shear. The cloud radiance variance, a proxy for cloud inhomogeneity, shows asymmetry characteristics consistent with those of the AIRS cloudiness. The leading causes of the view-dependent cloudiness asymmetry are the local time difference and small-scale organized cloud structures. The local time difference (1-1.5 hr) of upper-level (UL) clouds between the two AIRS outermost views can create part of the observed asymmetry. On the other hand, small-scale tilted and banded structures of the UL clouds can induce about half of the observed view-angle dependent differences in the AIRS cloud radiances and their variances. This estimate is inferred from an analogous study using Microwave Humidity Sounder (MHS) radiances observed during the period when there were simultaneous measurements at two different view angles from the NOAA-18 and -19 satellites. The existence of tilted cloud structures and asymmetric 15 um and 6.7 um cloud radiances implies that cloud statistics are view-angle dependent and should be taken into account in radiative transfer calculations, measurement uncertainty evaluations, and cloud climatology investigations. In addition, the momentum forcing in the upper troposphere from tilted clouds is also likely asymmetric, which can affect atmospheric circulation anisotropically.
2013-09-01
ORGANIZATION REPORT NUMBER: ARL-TR-6576. [Report documentation page and list-of-figures residue; recoverable figure topics: estimated angle-of-attack component history and comparison of angle-of-attack component estimates for projectile no. 2.]
Forest height Mapping using the fusion of Lidar and MULTI-ANGLE spectral data
NASA Astrophysics Data System (ADS)
Pang, Y.; Li, Z.
2016-12-01
Characterizing forest ecosystems over large areas is highly complex. Light Detection and Ranging (Lidar) approaches have demonstrated a high capacity to estimate forest structural parameters accurately. A number of satellite mission concepts have been proposed to fuse Lidar with other optical imagery, allowing multi-angle spectral observations to capture the Bidirectional Reflectance Distribution Function (BRDF) characteristics of forests. China is developing the concept of the Chinese Terrestrial Carbon Mapping Satellite, with a multi-beam waveform Lidar as the main sensor and a multi-angle imaging system as the spatial mapping sensor. In this study, we explore the potential of fusing Lidar and multi-angle spectral data to estimate forest height across different scales. We flew intensive airborne Lidar and multi-angle hyperspectral campaigns at the Genhe Forest Ecological Research Station, Northeast China, and then extended the spatial scale with long transect flights to cover more forest structures. Forest height derived from the airborne Lidar data was used as reference data, and the multi-angle hyperspectral data were used as model inputs. Our results demonstrate that the multi-angle spectral data can be used to estimate forest height with an RMSE of 1.1 m and an R² of approximately 0.8.
A novel camera localization system for extending three-dimensional digital image correlation
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher
2018-03-01
The monitoring of civil, mechanical, and aerospace structures is important, especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration must be performed to obtain the vision system's extrinsic and intrinsic parameters; that is, the position of the cameras relative to each other (separation distance, camera angle, etc.) must be determined. Typically, the cameras are placed on a rigid bar to prevent any relative motion between them. This constraint limits the utility of the 3D-DIC technique, especially as it is applied to monitor large structures from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor to measure the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability to determine the cameras' positions in space for accurate 3D-DIC calibration and measurements.
The big picture: effects of surround on immersion and size perception.
Baranowski, Andreas M; Hecht, Heiko
2014-01-01
Despite the fear of the entertainment industry that illegal downloads of films might ruin their business, going to the movies continues to be a popular leisure activity. One reason why people prefer to watch movies in cinemas may be the surround of the movie screen or its physically huge size. To disentangle the factors that might contribute to the size impression, we tested several measures of subjective size and immersion in different viewing environments. For this purpose we built a model cinema that provided visual angle information comparable with that of a real cinema. Subjects watched identical movie clips in a real cinema, a model cinema, and on a display monitor in isolation. Whereas the isolated display monitor was inferior, the addition of a contextual model improved the viewing immersion to the extent that it was comparable with the movie theater experience, provided the viewing angle remained the same. In a further study we built an identical but even smaller model cinema to unconfound visual angle and viewing distance. Both model cinemas produced similar results. There was a trend for the larger screen to be more immersive; however, viewing angle did not play a role in how the movie was evaluated.
22. VAL, VIEW OF PROJECTILE LOADING DECK LOOKING NORTHEAST TOWARD ...
22. VAL, VIEW OF PROJECTILE LOADING DECK LOOKING NORTHEAST TOWARD TOP OF CONCRETE 'A' FRAME STRUCTURE SHOWING DRIVE CABLES, DRIVE GEAR, BOTTOM OF CAMERA TOWER AND 'CROWS NEST' CONTROL ROOM. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Wide angle view of the Flight control room of Mission control center
1984-10-06
Wide angle view of the flight control room (FCR) of the Mission Control Center (MCC). Some of the STS 41-G crew can be seen on a large screen at the front of the MCC along with a map tracking the progress of the orbiter.
Experimental Investigation of Turbulent Flames in Hypersonic Flows
2015-09-01
kg and 1 MPa at a stagnation condition. A settling chamber upstream of the C/D nozzle has a pressure sensor and an optical access window for ... are recorded by a pressure sensor attached on the reservoir. Overall fuel equivalence ratio () in the combustor is estimated by the ratio of ... freestream flow direction and 22.5° ramp (back step) angle. Five pressure sensors (Kulite) and five temperature sensors (MEDTHERM coaxial thermocouple
An Automated Method for Navigation Assessment for Earth Survey Sensors Using Island Targets
NASA Technical Reports Server (NTRS)
Patt, F. S.; Woodward, R. H.; Gregg, W. W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalogue of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean colour sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
Object positioning in storages of robotized workcells using LabVIEW Vision
NASA Astrophysics Data System (ADS)
Hryniewicz, P.; Banaś, W.; Sękala, A.; Gwiazda, A.; Foit, K.; Kost, G.
2015-11-01
During the manufacturing process, each task is developed in advance and adapted to the conditions and capabilities of the manufacturing plant. The production process is supervised by a team of specialists, because any downtime causes a great loss of time and hence financial loss. Sensors used in industry for tracking and supervising the various stages of a production process make it much easier to keep it continuous. One group of sensors used in industrial applications is non-contact sensors, which include light barriers, optical sensors, rangefinders, vision systems, and ultrasonic sensors. Owing to the rapid development of electronics, vision systems have become widespread as the most flexible type of non-contact sensor. These systems consist of cameras, devices for data acquisition and data analysis, and specialized software. Vision systems work well both as sensors that control the production process itself and as sensors that control product quality. LabVIEW, together with LabVIEW Vision and LabVIEW Builder, provides an environment for programming such process- and quality-control systems. The paper presents an application for positioning elements in a robotized workcell. Based on the geometric parameters of a manipulated object, or on a previously developed graphical pattern, it is possible to determine the position of particular manipulated elements. The application can work in automatic mode and in real time, cooperating with the robot control system, which makes the workcell more autonomous.
Wide angle view of Mission Control Center during Apollo 14 transmission
1971-01-31
S71-17122 (31 Jan. 1971) --- A wide angle overall view of the Mission Operations Control Room (MOCR) in the Mission Control Center at the Manned Spacecraft Center. This view was photographed during the first color television transmission from the Apollo 14 Command Module. Projected on the large screen at the right front of the MOCR is a view of the Apollo 14 Lunar Module, still attached to the Saturn IVB stage. The Command and Service Modules were approaching the LM/S-IVB during transposition and docking maneuvers.
Spectral sea surface reflectance of skylight.
Zhang, Xiaodong; He, Shuangyan; Shabani, Afshin; Zhai, Peng-Wang; Du, Keping
2017-02-20
In examining the dependence of the sea surface reflectance of skylight ρs on sky conditions, wind speed, solar zenith angle, and viewing geometry, Mobley [Appl. Opt. 38, 7442 (1999); doi:10.1364/AO.38.007442] assumed ρs is independent of wavelength. Lee et al. [Opt. Express 18, 26313 (2010); doi:10.1364/OE.18.026313] showed experimentally that ρs does vary spectrally due to the spectral difference of sky radiance coming from different directions, which was ignored in Mobley's study. We simulated ρs from 350 nm to 1000 nm by explicitly accounting for spectral variations of skylight distribution and Fresnel reflectance. Furthermore, we separated sun glint from sky glint because of significant differences in magnitude, spectrum, and polarization state between direct sunlight and skylight. The results confirm that the spectral variation of ρs(λ) arises mainly from the spectral distribution of skylight and varies from slightly bluish, due to normal dispersion of the refractive index of water, to neutral and then to reddish with increasing wind speeds and decreasing solar zenith angles. Polarization moderately increases sky glint, by 8-20% at 400 nm but only 0-10% at 1000 nm. Sun glint is inherently reddish and becomes significant (>10% of sky glint) when the sun is at the zenith with moderate winds, or when the sea is roughened (wind speeds > 10 m s-1) with solar zenith angles < 20°. We recommend a two-step procedure: first correct the glint due to direct sunlight, which is unpolarized, then remove the glint due to diffuse, polarized skylight. The simulated ρs(λ) as a function of wind speed, sun angle, and aerosol concentration for the currently recommended sensor-sun geometry (zenith angle = 40°, azimuth angle relative to the sun = 45°) is available upon request.
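The recommended two-step correction can be sketched as a pair of spectral subtractions. The function name, argument layout, and the idea of passing the direct-sun path term as a separate input are illustrative assumptions; the abstract specifies only the order of the two steps, not an interface.

```python
import numpy as np

def remove_surface_glint(L_total, L_sun, rho_sun, L_sky, rho_sky):
    """Two-step surface-glint correction sketched from the abstract.

    L_total : measured above-water radiance spectrum
    L_sun   : direct-sun reflected-path term (hypothetical input)
    rho_sun : sun-glint reflectance factor (scalar; direct sunlight is unpolarized)
    L_sky   : sky radiance toward the specular direction
    rho_sky : spectral sky-glint reflectance rho_s(lambda)
    """
    # Step 1: subtract the unpolarized direct-sun glint.
    L_nosun = L_total - rho_sun * L_sun
    # Step 2: subtract the diffuse, polarized sky glint with a spectral rho_s.
    return L_nosun - rho_sky * L_sky
```

In practice rho_sky would be taken from tables such as the simulated ρs(λ) the authors offer, indexed by wind speed, sun angle, and aerosol load.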
Effect of phase advance on the brushless dc motor torque speed respond
NASA Astrophysics Data System (ADS)
Mohd, M. S.; Karsiti, M. N.; Mohd, M. S.
2015-12-01
Brushless direct current (BLDC) motors are widely used in small and medium-sized electric vehicles, as they exhibit the highest specific power and thermal efficiency compared with induction motors. The permanent magnets of a BLDC rotor create a constant magnetic flux, which limits the motor's top speed: as the back electromotive force (EMF) increases proportionally with rotational speed and approaches the amplitude of the input voltage, the phase current amplitude falls to zero. By advancing the phase current, it is possible to extend the maximum speed of a BLDC motor beyond its rated top speed. This allows smaller BLDC motors to be used in small electric vehicles (EVs) and, in larger applications, allows BLDC motors to operate at high speed without a multispeed transmission unit. However, phase advance affects the torque-speed response: torque output decreases as speed increases. Adjusting the phase angle affects the speed of the motor because each coil is energized earlier than the corresponding rise in its back EMF. This paper discusses a phase advance strategy for a brushless DC motor based on phase angle manipulation using external Hall sensors. Tests were performed at different phase angles, in both advance and retard positions, for different applied voltage levels. The objectives are to build the external Hall sensor system to commutate the BLDC motor, to establish phase advance by varying the phase angle through external Hall sensor manipulation, and to observe the response of the motor under the resulting phase advance.
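The timing behind "energized earlier than the corresponding rise in the back EMF" is simple arithmetic: an advance of a given number of electrical degrees maps to a time shift of each Hall-triggered commutation event. The helper below is a hypothetical sketch of that conversion, not the authors' controller code.

```python
def advance_delay(speed_rpm, pole_pairs, advance_deg):
    """Time by which each commutation event is shifted earlier (sketch).

    The electrical period is T_e = 60 / (speed_rpm * pole_pairs) seconds,
    so an advance of `advance_deg` electrical degrees corresponds to a
    shift of (advance_deg / 360) * T_e seconds ahead of the Hall edge.
    """
    T_e = 60.0 / (speed_rpm * pole_pairs)   # electrical period in seconds
    return (advance_deg / 360.0) * T_e
```

Because the shift scales with the electrical period, a fixed advance angle demands tighter timing as speed rises, which is why the advance is usually scheduled from the measured Hall-edge interval.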
NASA Astrophysics Data System (ADS)
Augere, B.; Besson, B.; Fleury, D.; Goular, D.; Planchat, C.; Valla, M.
2016-05-01
Lidar (light detection and ranging) is a well-established method for probing atmospheric motion through velocity measurements. Recent advances in 1.5 μm lidar show that the technology is mature, offers great ease of use, and is reliable and compact. A 1.5 μm airborne lidar therefore appears to be a good candidate for airborne in-flight measurement systems: it measures remotely, outside the aircraft's aerodynamic disturbance, and provides absolute airspeed (no calibration needed) with great precision over the whole aircraft flight envelope. In the framework of the EU AIM2 project, ONERA's task consisted of developing and testing a 1.5 μm anemometer sensor for in-flight airspeed measurements. The objective of this work is to demonstrate that the 1.5 μm lidar sensor can increase the quality of the data acquisition procedure for aircraft flight test certification. This article presents the 1.5 μm anemometer sensor dedicated to in-flight airspeed measurements and describes the flight tests performed successfully on board the Piaggio P180 aircraft. Lidar air data were compared graphically to the air data provided by the aircraft flight test instrumentation (FTI) in the reference frame of the lidar sensor head. Very good agreement was observed: true airspeed (TAS) to within a fraction of a m s-1, and angle of sideslip (AOS) and angle of attack (AOA) to within a fraction of a degree.
Optical system design of CCD star sensor with large aperture and wide field of view
NASA Astrophysics Data System (ADS)
Wang, Chao; Jiang, Lun; Li, Ying-chao; Liu, Zhuang
2017-10-01
The star sensor is one of the sensors used to determine the spatial attitude of a space vehicle. An optical system for a star sensor with a large aperture and wide field of view was designed in this paper. The effective focal length of the optics is 16 mm, the F-number is 1.2, and the field of view is 20°. The working spectrum is 500 to 800 nm. The lens system adopts a modified Petzval structure with a special glass combination and achieves high imaging quality over the whole spectral range. For each field-of-view point, the modulation transfer function at 50 cycles/mm is higher than 0.3. On the detecting plane, the encircled energy within a circle of 14 μm diameter reaches 80% of the total energy. Over the whole field of view, the spot diameter in the imaging plane is no larger than 13 μm. The full-field distortion is less than 0.1%, which helps to obtain the accurate location of the reference star from the image captured by the star sensor. The lateral chromatic aberration is less than 2 μm over the whole spectral range.
Optimal directional view angles for remote-sensing missions
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Holben, B. N.; Tucker, C. J.; Newcomb, W. W.
1984-01-01
The present investigation is concerned with the directional, off-nadir viewing of terrestrial scenes using remote-sensing systems from aircraft and satellite platforms, taking into account advantages of such an approach over strictly nadir viewing systems. Directional reflectance data collected for bare soil and several different vegetation canopies in NOAA-7 AVHRR bands 1 and 2 were analyzed. Optimum view angles were recommended for two strategies. The first strategy views the utility of off-nadir measurements as extending spatial and temporal coverage of the target area. The second strategy views the utility of off-nadir measurements as providing additional information about the physical characteristics of the target. Conclusions regarding the two strategies are discussed.
NASA Astrophysics Data System (ADS)
Tanabe, Ichiro; Tanaka, Yoshito Y.; Ryoki, Takayuki; Watari, Koji; Goto, Takeyoshi; Kikawada, Masakazu; Inami, Wataru; Kawata, Yoshimasa; Ozaki, Yukihiro
2016-09-01
We investigated the surface plasmon resonance (SPR) of aluminum (Al) thin films while varying the refractive index of the environment near the films in the far-ultraviolet (FUV, ≤200 nm) and deep-ultraviolet (DUV, ≤300 nm) regions. Using our original FUV-DUV spectrometer, which adopts an attenuated total reflectance (ATR) system, the measurable wavelength range extended down to 180 nm, and the environment near the Al surface could be controlled. In addition, this spectrometer was equipped with a variable-incident-angle apparatus, which enabled us to measure FUV-DUV reflectance spectra (170-450 nm) at incident angles ranging from 45° to 85°. From the obtained spectra, the dispersion relation of the Al SPR in the FUV and DUV regions was derived. In the presence of various liquids (HFIP, water, alcohols, etc.) on the Al film, the SPR angle and wavelength became larger and longer, respectively, compared with those in air (i.e., with no material on the film). These shifts agree well with simulations based on the Fresnel equations and can be exploited in SPR sensors. FUV-DUV SPR sensors (in particular, FUV SPR sensors) with tunable incident-light wavelength have three experimental advantages over conventional visible SPR sensors, as discussed on the basis of the Fresnel equations: higher sensitivity, more tightly confined surface measurement, and better material selectivity.
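The Fresnel-equation simulations the authors compare against are, at their simplest, a three-layer (prism / metal film / environment) reflectance calculation in the Kretschmann ATR geometry. The sketch below implements that standard model for p-polarized light; the optical constants in the test are illustrative placeholders, not the paper's Al data.

```python
import numpy as np

def atr_reflectance_p(n_prism, eps_metal, n_env, d_metal, wavelength, theta_deg):
    """p-polarized ATR reflectance of a prism/metal-film/environment stack
    (standard three-layer Fresnel model; all lengths in meters)."""
    k0 = 2 * np.pi / wavelength
    eps = [n_prism**2, eps_metal, n_env**2]          # layer permittivities
    kx = n_prism * k0 * np.sin(np.radians(theta_deg))  # conserved tangential wavevector
    kz = [np.sqrt(e * k0**2 - kx**2 + 0j) for e in eps]

    def r_p(i, j):  # p-polarization interface reflection coefficient
        return (eps[j] * kz[i] - eps[i] * kz[j]) / (eps[j] * kz[i] + eps[i] * kz[j])

    phase = np.exp(2j * kz[1] * d_metal)             # round trip through the film
    r = (r_p(0, 1) + r_p(1, 2) * phase) / (1 + r_p(0, 1) * r_p(1, 2) * phase)
    return abs(r)**2
```

Scanning theta_deg reproduces the characteristic reflectance dip whose angular position shifts when n_env changes, which is exactly the sensing effect described above.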
NASA Astrophysics Data System (ADS)
Chen, Shimeng; Liu, Yun; Gao, Xiaotong; Liu, Xiuxin; Peng, Wei
2014-11-01
We present a wavelength-tunable tapered optical fiber surface plasmon resonance (SPR) sensor fabricated by polishing the end faces of multimode fibers (MMF). Two hard-plastic-clad optical fibers, joined closely, are used as the light input and output channels. Their end faces are polished to produce two oblique planes, which are coated with gold film to form the sensing surface and the front mirror. The tapered geometry formed by the two oblique planes in orthogonal directions makes it possible to adjust the incident angle by changing the tilt angles of the two end faces, thereby tuning the SPR coupling wavelength-angle pair. Compared with previous work based on a tapered single-fiber probe, this approach theoretically increases the signal-to-noise ratio (SNR) by separating the incident and emergent light into different fibers. Because the sensing surface and the front mirror are fabricated on two fibers instead of a single fiber tip, more incident light can reach the sensing surface and excite the SPR effectively. In addition, this structure offers a large grinding and sensing area, which leads to high sensitivity and a simple manufacturing process. Experimental measurements demonstrate that the sensor has a favorable SPR resonance absorption and can measure the refractive index (RI) of aqueous solutions. This novel tapered SPR sensor has potential applications in biological sensing.
Zhao, Anbang; Ma, Lin; Ma, Xuefei; Hui, Juan
2017-01-01
In this paper, an improved azimuth angle estimation method with a single acoustic vector sensor (AVS) is proposed based on matched filtering theory. The proposed method is mainly applied in an active sonar detection system. Building on the conventional passive method based on complex acoustic intensity measurement, the mathematical and physical model of the proposed method is described in detail. Computer simulation and lake experiment results indicate that this method can realize azimuth angle estimation with high precision using only a single AVS. Compared with the conventional method, the proposed method achieves better estimation performance. Moreover, it does not require complex operations in the frequency domain and achieves a reduction in computational complexity. PMID:28230763
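The conventional passive baseline the paper improves on is easy to state: with collocated pressure and particle-velocity channels, the time-averaged acoustic intensity components give the bearing directly. The sketch below shows that baseline estimator on a synthetic plane wave; the signal parameters are illustrative, and the paper's matched-filtering refinement is not reproduced here.

```python
import numpy as np

def avs_azimuth(p, vx, vy):
    """Azimuth estimate from a single acoustic vector sensor via
    time-averaged acoustic intensity (conventional passive method)."""
    Ix = np.mean(p * vx)   # x component of the active intensity
    Iy = np.mean(p * vy)   # y component of the active intensity
    return np.degrees(np.arctan2(Iy, Ix))

# Synthetic plane wave arriving from 35 degrees (hypothetical test signal):
t = np.linspace(0.0, 1.0, 4000)
s = np.cos(2 * np.pi * 100 * t)
az = np.radians(35.0)
p, vx, vy = s, s * np.cos(az), s * np.sin(az)
```

In the active-sonar setting of the paper, the channels would first be matched-filtered against the transmitted pulse before forming the intensity averages, which is where the claimed precision gain comes from.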
Development and validation of a Kalman filter-based model for vehicle slip angle estimation
NASA Astrophysics Data System (ADS)
Gadola, M.; Chindamo, D.; Romano, M.; Padula, F.
2014-01-01
It is well known that vehicle slip angle is one of the most difficult parameters to measure on a vehicle during testing or racing activities. Moreover, the appropriate sensor is very expensive and it is often difficult to fit to a car, especially on race cars. We propose here a strategy to eliminate the need for this sensor by using a mathematical tool which gives a good estimation of the vehicle slip angle. A single-track car model, coupled with an extended Kalman filter, was used in order to achieve the result. Moreover, a tuning procedure is proposed that takes into consideration both nonlinear and saturation characteristics typical of vehicle lateral dynamics. The effectiveness of the proposed algorithm has been proven by both simulation results and real-world data.
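The estimator idea can be illustrated with a deliberately simplified one-state filter (the paper uses a single-track vehicle model inside an extended Kalman filter; the kinematic relation, function name, and tuning constants below are illustrative assumptions, not the authors' implementation).

```python
import numpy as np

def slip_angle_kf(ay, r, vx, beta_meas, dt, q=1e-4, R_var=1e-2):
    """Minimal 1-state Kalman filter for sideslip angle beta (sketch).

    Prediction uses the kinematic relation beta_dot = ay/vx - r
    (lateral acceleration, yaw rate, longitudinal speed); the update
    fuses a noisy pseudo-measurement of beta, e.g. GPS course minus
    heading. q and R_var are illustrative tuning values.
    """
    beta, P = 0.0, 1.0
    out = []
    for k in range(len(ay)):
        beta = beta + dt * (ay[k] / vx[k] - r[k])   # predict
        P += q
        K = P / (P + R_var)                         # Kalman gain
        beta += K * (beta_meas[k] - beta)           # update
        P *= (1 - K)
        out.append(beta)
    return np.array(out)
```

The paper's contribution sits on top of this skeleton: a nonlinear tire model in the prediction step and a tuning procedure that respects the saturation behavior of lateral dynamics.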
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilieva, Yordanka; Allison, Lee; Cao, Tongtong
Here we present preliminary results for the gain performance of commercially available 3-μm and 6-μm pore-size single-anode microchannel-plate photomultipliers (MCP PMTs) in magnetic fields up to 5 T and for various orientations of the sensor relative to the field direction. The measurements were performed at Thomas Jefferson National Accelerator Facility in Newport News, VA. Our results show that smaller-pore-size PMTs have better gain performance in magnetic fields. At various angles, the shape of the gain dependence on the strength of the magnetic field strongly depends on the type of the sensor. Also, for each sensor, the azimuthal dependence is strongly correlated with the polar angle. Overall, the sensors exhibit a reasonable performance up to 2 T, although that upper limit depends on the sensor, the applied high voltage, and the orientation of the sensor relative to the field. To optimize the operational and design parameters of MCP PMTs for performance in high magnetic fields, further measurements and simulation studies will be pursued. Furthermore, our studies are part of an R&D for development of a Detector of Internally Reflected Cherenkov Light for the central detector of a future U.S. Electron Ion Collider.
NASA Astrophysics Data System (ADS)
Lee, Jun Ho; Hwang, Sunglyoung; Jeong, Dohwan; Hong, Jinsuk; Kim, Youngsoo; Kim, Yeonsoo; Kim, Hyunsook
2017-09-01
We report an innovative simple alignment method for a VNIR spectrometer in the wavelength region of 400-900 nm; this device is later combined with fore-optics (a telescope) to form a f/2.5 hyperspectral imaging spectrometer with a field of view of +/-7.68°. The detector at the final image plane is a 640×480 charge-coupled device with a 24 μm pixel size. We first assembled the fore-optics and the spectrometer separately and then combined them via a slit co-located on the image plane of the fore-optics and the object plane of the spectrometer. The spectrometer was assembled in three steps. In the initial step, the optics was simply assembled with an optical axis guiding He-Ne laser. In the second step, we located a pin-hole on the slit plane and a Shack-Hartmann sensor on the detector plane. The wavefront errors over the full field were scanned simply by moving the point source along the slit direction while the Shack-Hartmann sensor was constantly conjugated to the pin-hole position by a motorized stage. Optimal alignment was then performed based on the reverse sensitivity method. In the final stage, the pin-hole and the Shack-Hartmann sensor were exchanged with an equispaced 10 pin-hole slit called a field identifier and a detector. The light source was also changed from the laser (single wavelength source) to a krypton lamp (discrete multi-wavelength source). We were then easily able to calculate the distortion and keystone on the detector plane without any scanning or moving optical components; rather, we merely calculated the spectral centroids of the 10 pin-holes on the detector. We then tuned the clocking angles of the convex grating and the detector to minimize the distortion and keystone. The final assembly was tested and found to have an RMS WFE < 90 nm over the entire field of view, a keystone of 0.08 pixels, a smile of 1.13 pixels and a spectral resolution of 4.32 nm.
A mechanical simulator of cardiac wall kinematics.
Cutrì, Elena; Bagnoli, Paola; Marcelli, Emanuela; Biondi, Federico; Cercenelli, Laura; Costantino, Maria Laura; Plicchi, Gianni; Fumero, Roberto
2010-01-01
Aim of this study is to develop a mechanical simulator (MS) reproducing cardiac wall kinematics [i.e., radial (R), longitudinal (L) and rotational (RT) motions] to test piezoelectric gyroscopic sensors (GS) that are able to measure cardiac torsion that has proved to be a sensitive index of cardiac performance. The MS consists of three brushless motors controlled by a dedicated software either separately or simultaneously reproducing the three main cardiac wall movements (R, L, RT) obtained by implementing different physiologic or pathologic velocity profiles derived from in vivo data. GS accuracy (max % error) was experimentally tested by connecting it to the MS driven in velocity in different working conditions [i.e., cardiac period (515-1030 ms), RT angle (4-16 degrees), GS axis inclination (0-90 degrees) with respect to the cardiac rotation axis]. The MS reproduced the tested velocity profiles well. The GS showed high accuracy in measuring both physiologic and pathologic RT velocity profiles, whereas they proved insensitive to R and L motions. GS axis inclination influenced measurements; however, it was possible to correct this taking the inclination angle cosine into account. The MS proved to be a useful tool to study cardiac wall kinematics and test GS reliability with a view to in vivo application.
Optimal angle of needle insertion for fluoroscopy-guided transforaminal epidural injection of L5.
Ra, In-Hoo; Min, Woo-Kie
2015-06-01
Unlike at other sites, performing TFESI at the L5-S1 level is difficult because the iliac crest is an obstacle to needle placement. The objective of this study was to identify the optimal fluoroscopy angle for insertion and advancement of the needle during L5 TFESI. We conducted an observational study of patients undergoing fluoroscopy-guided L5 TFESI in the prone position. A total of 80 patients (40 men and 40 women) with radiating pain of the lower limbs were enrolled. During TFESI, we measured the angle at which the L5 vertebral body forms a rectangular shape and compared men and women. We then measured the area of the safe triangle at fluoroscopy tilt angles from 15° to 35° and again compared men and women. The mean cephalocaudal angle at which the vertebral body takes the shape of a rectangle was 11.0° in men and 13.9° in women (P = 0.007). In men, the triangular area was maximal at 18.3 mm² with an oblique view angle of 25°; in women, it was maximal at 23.6 mm² with an oblique view angle of 30°. At oblique view angles of 30° and 35°, the area was significantly greater in women (P < 0.05). When TFESI is performed at the L5 region in the prone position, placement of the fluoroscope at a cephalocaudal angle of 11.0° and an oblique angle of 25° in men, and a cephalocaudal angle of 13.9° and an oblique angle of 30° in women, would be most reasonable. © 2014 World Institute of Pain.
Erdenebat, Munkh-Uchral; Kwon, Ki-Chul; Yoo, Kwan-Hee; Baasantseren, Ganbat; Park, Jae-Hyeung; Kim, Eun-Soo; Kim, Nam
2014-04-15
We propose a 360-degree integral-floating display with an enhanced vertical viewing angle. The system projects two-dimensional elemental image arrays via a high-speed digital micromirror device projector and reconstructs them into 3D perspectives with a lens array. Double floating lenses relay the initial 3D perspectives to the center of a vertically curved convex mirror. An anamorphic optic system tailors the initial 3D perspectives so that light rays are dispersed more widely in the vertical direction. With the proposed method, the entire 3D image provides both monocular and binocular depth cues, a full-parallax demonstration with high angular ray density, and an enhanced vertical viewing angle.
Detection Angle Calibration of Pressure-Sensitive Paints
NASA Technical Reports Server (NTRS)
Bencic, Timothy J.
2000-01-01
Uses of the pressure-sensitive paint (PSP) techniques in areas other than external aerodynamics continue to expand. The NASA Glenn Research Center has become a leader in the application of the global technique to non-conventional aeropropulsion applications including turbomachinery testing. The use of the global PSP technique in turbomachinery applications often requires detection of the luminescent paint in confined areas. With the limited viewing usually available, highly oblique illumination and detection angles are common in the confined areas in these applications. This paper will describe the results of pressure, viewing and excitation angle dependence calibrations using three popular PSP formulations to get a better understanding of the errors associated with these non-traditional views.
An Integrated Wireless Wearable Sensor System for Posture Recognition and Indoor Localization.
Huang, Jian; Yu, Xiaoqiang; Wang, Yuan; Xiao, Xiling
2016-10-31
In order to provide better monitoring for the elderly or patients, we developed an integrated wireless wearable sensor system that can perform posture recognition and indoor localization in real time. Five custom sensor nodes, fixed on the lower limbs, and a standard Kalman filter are used to acquire basic attitude data. After the attitude angles of five body segments (two thighs, two shanks, and the waist) are obtained, the pitch angles of the left thigh and waist are used for posture recognition. From these segment attitude angles we can also calculate the coordinates of the six lower-limb joints (two hip joints, two knee joints, and two ankle joints). A novel relative localization algorithm based on step length is then proposed to realize indoor localization of the user. Several sparsely distributed active Radio Frequency Identification (RFID) tags are used to correct the accumulative error of the relative localization algorithm, and a set-membership filter is applied for data fusion. The experimental results verify the effectiveness of the proposed algorithms.
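The joint-coordinate step, computing knee and ankle positions from segment attitude angles, can be sketched as plain forward kinematics. The 2D sagittal-plane simplification, function name, and segment lengths below are illustrative assumptions; the system itself works with full 3D attitudes.

```python
import numpy as np

def leg_joint_positions(hip, thigh_pitch, shank_pitch, l_thigh=0.4, l_shank=0.4):
    """Sagittal-plane knee and ankle positions from segment pitch angles (sketch).

    hip         : (x, z) position of the hip joint
    thigh_pitch : thigh pitch in radians, measured from the vertical
    shank_pitch : shank pitch in radians, measured from the vertical
    Segment lengths are illustrative defaults in meters.
    """
    hx, hz = hip
    kx = hx + l_thigh * np.sin(thigh_pitch)   # knee: one thigh length down the thigh
    kz = hz - l_thigh * np.cos(thigh_pitch)
    ax = kx + l_shank * np.sin(shank_pitch)   # ankle: one shank length further
    az = kz - l_shank * np.cos(shank_pitch)
    return (kx, kz), (ax, az)
```

Chaining such segment vectors over a gait cycle is also what makes the step-length-based relative localization possible: the horizontal displacement of the ankles per stride gives the step length.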
Garcia-Pozuelo, Daniel; Yunta, Jorge; Olatunbosun, Oluremi; Yang, Xiaoguang; Diaz, Vicente
2017-04-16
Tires equipped with sensors, the so-called "intelligent tires", can provide vital information for control systems, drivers and external users. In this research, tire dynamic strain characteristics in cornering conditions are collected and analysed in relation to the variation of tire working conditions, such as inflation pressure, rolling speed, vertical load and slip angle. An experimental tire strain-based prototype and an indoor tire test rig are used to demonstrate the suitability of strain sensors to establish relations between strain data and lateral force. The results of experiments show that strain values drop sharply when lateral force is decreasing, which can be used to predict tire slip conditions. As a first approach to estimate some tire working conditions, such as the slip angle and vertical load, a fuzzy logic method has been developed. The simulation and test results confirm the feasibility of strain sensors and the proposed computational model to solve the non-linearity characteristics of the tires' parameters and turn tires into a source of useful information.
Development of low cost and accurate homemade sensor system based on Surface Plasmon Resonance (SPR)
NASA Astrophysics Data System (ADS)
Laksono, F. D.; Supardianningsih; Arifin, M.; Abraha, K.
2018-04-01
In this paper, we developed a homemade, computerized sensor system based on Surface Plasmon Resonance (SPR). The developed system consists of a mechanical instrument, a laser power sensor, and a user interface. The mechanical design, which uses anti-backlash gears, improved the angular resolution of the laser's angle of incidence to 0.01°. The laser detector acquisition system and the stepper motor controller use an Arduino Uno, which is easy to program, flexible, and low cost. Furthermore, we employed a LabVIEW user interface as the virtual instrument to facilitate sample measurement and to record data directly in digital form. Test results using a gold-deposited half-cylinder prism showed a Total Internal Reflection (TIR) angle of 41.34° ± 0.01° and an SPR angle of 44.20° ± 0.01°. These results demonstrate that the developed system reduces measurement duration and the data recording errors caused by human error, and that its measurements are repeatable and accurate.
Concept development for the ITER equatorial port visible/infrared wide angle viewing system.
Reichle, R; Beaumont, B; Boilson, D; Bouhamou, R; Direz, M-F; Encheva, A; Henderson, M; Huxford, R; Kazarian, F; Lamalle, Ph; Lisgo, S; Mitteau, R; Patel, K M; Pitcher, C S; Pitts, R A; Prakash, A; Raffray, R; Schunke, B; Snipes, J; Diaz, A Suarez; Udintsev, V S; Walker, C; Walsh, M
2012-10-01
The ITER equatorial port visible/infrared wide angle viewing system concept is developed from the measurement requirements. The proposed solution situates 4 viewing systems in the equatorial ports 3, 9, 12, and 17 with 4 views each (looking at the upper target, the inner divertor, and tangentially left and right). This gives sufficient coverage. The spatial resolution of the divertor view is 2 times higher than that of the other views. For compensation of vacuum-vessel movements, an optical hinge concept is proposed. Compactness and low neutron streaming are achieved by orienting the port plug doglegs horizontally. Calibration methods, risks, and R&D topics are outlined.
NASA Technical Reports Server (NTRS)
Mueller, James L.
2001-01-01
This Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) contract supports acquisition of match-up radiometric and bio-optical data for validation of the Sea-Viewing Wide Field-of-view Sensor (SeaWiFS) and other ocean color satellites, and evaluation of uncertainty budgets and protocols for in situ measurements of normalized water-leaving radiances.
Eliminating Deadbands In Resistive Angle Sensors
NASA Technical Reports Server (NTRS)
Salomon, Phil M.; Allen, Russell O.; Marchetto, Carl A.
1992-01-01
Proposed shaft-angle-measuring circuit provides continuous indication of angle of rotation from 0 degree to 360 degrees. Sensing elements are two continuous-rotation potentiometers, and associated circuitry eliminates deadband that occurs when wiper contact of potentiometer crosses end contacts near 0 degree position of circular resistive element. Used in valve-position indicator or similar device in which long operating life and high angular precision not required.
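A sketch of the selection logic implied by the two-potentiometer arrangement: mount the second potentiometer 180° out of phase, and read whichever unit is currently far from its own end-contact deadband (the 180° offset and the deadband width used here are assumptions for illustration):

```python
def shaft_angle(pot_a_deg, pot_b_deg, deadband_halfwidth=20.0):
    """Combine two continuous-rotation potentiometers mounted 180 deg
    apart so the deadband of one is always covered by the other.

    pot_a_deg, pot_b_deg: raw readings in degrees, each in [0, 360).
    Returns the shaft angle in degrees in [0, 360).
    """
    # How far pot A's reading is from its 0/360-deg deadband region.
    dist_a = min(pot_a_deg, 360.0 - pot_a_deg)
    if dist_a > deadband_halfwidth:
        return pot_a_deg % 360.0
    # Near A's deadband, pot B (offset 180 deg) is far from its own;
    # shift its reading back by the mounting offset.
    return (pot_b_deg + 180.0) % 360.0
```

For example, near the 0° crossing of pot A (say A reads 2°, B reads 182°), the circuit switches to pot B and still reports 2° continuously.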
Model-Scale Experiment of the Seakeeping Performance for R/V Melville, Model 5720
2012-07-01
[Sensor-table excerpt: a Bourns 6574S-1-103 rotary potentiometer (angle, deg); a KVH C-100 fluxgate compass (heading, deg magnetic) with a calculated heading channel; a bow bottom tracker; and a 3DM-3XI unit combining three axes of angular rate gyros, accelerometers, and magnetometers to provide gyro-stabilized Euler angles.]
81. VIEW OF VAL LOOKING NORTH AS SEEN FROM THE RESERVOIR SHOWING TWO LAUNCHING TUBES ON THE LAUNCHER BRIDGE, Date unknown, circa 1952. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
63. VIEW LOOKING DOWN VAL LAUNCHING SLAB SHOWING DRIVE GEARS, CABLES, LAUNCHER RAILS, PROJECTILE CAR AND SUPPORT CARRIAGE, April 8, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
NASA Astrophysics Data System (ADS)
Bialke, Bill
1992-05-01
In order to satisfy the stringent cost and power requirements of small satellites, an advanced SCANWHEEL was designed, built, and qualified by ITHACO, Inc. The T-SCANWHEEL is a modular momentum/reaction wheel with an integral conical Earth scanner. The momentum wheel provides momentum bias and control torques about the pitch axis of a spacecraft. An angled scan mirror coupled to the rotating shaft of the momentum wheel provides a conical scan of the field-of-view of an infrared sensor to provide pitch-and-roll attitude information. By using the same motor and bearings for the momentum wheel and Earth scanner, the overall power consumption is reduced and the system reliability is enhanced. The evolution of the T-SCANWHEEL is presented, including design ground rules, tradeoff analyses, and performance results.
Limb Correction of Polar-Orbiting Imagery for the Improved Interpretation of RGB Composites
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Elmer, Nicholas
2016-01-01
Red-Green-Blue (RGB) composite imagery combines information from several spectral channels into one image to aid in the operational analysis of atmospheric processes. However, infrared channels are adversely affected by the limb effect, the result of an increase in optical path length of the absorbing atmosphere between the satellite and the earth as viewing zenith angle increases. This paper reviews a newly developed technique to quickly correct for limb effects in both clear and cloudy regions using latitudinally and seasonally varying limb correction coefficients for real-time applications. These limb correction coefficients account for the increase in optical path length in order to produce limb-corrected RGB composites. The improved utility of a limb-corrected Air Mass RGB composite from the application of this approach is demonstrated using Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. However, the limb correction can be applied to any polar-orbiting sensor infrared channels, provided the proper limb correction coefficients are calculated. Corrected RGB composites provide multiple advantages over uncorrected RGB composites, including increased confidence in the interpretation of RGB features, improved situational awareness for operational forecasters, and the ability to use RGB composites from multiple sensors jointly to increase the temporal frequency of observations.
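The shape of such a correction can be sketched as follows. The actual technique uses latitudinally and seasonally varying coefficients per channel; the single scalar coefficient and the sec(VZA) − 1 path-length term below are assumptions chosen for brevity, a common first-order form rather than the authors' exact parameterization:

```python
import numpy as np

def limb_correct(bt_kelvin, vza_deg, coeff):
    """First-order limb correction for infrared brightness temperatures.

    The absorbing optical path grows roughly as sec(VZA), so the
    limb-induced cooling is added back as
        BT_corrected = BT_observed + coeff * (sec(VZA) - 1)
    where coeff would come from latitude/season lookup tables.
    """
    vza = np.radians(np.asarray(vza_deg, dtype=float))
    return np.asarray(bt_kelvin, dtype=float) + coeff * (1.0 / np.cos(vza) - 1.0)

# Nadir pixels are untouched; a 60-deg pixel (sec VZA = 2) is warmed by coeff:
bt = limb_correct([250.0, 250.0], [0.0, 60.0], coeff=5.0)
```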
NASA Astrophysics Data System (ADS)
Rowlands, Neil; Hutchings, John; Murowinski, Richard G.; Alexander, Russ
2003-03-01
Instrumentation for the Next Generation Space Telescope (NGST) is currently in the Phase A definition stage. We have developed a concept for the NGST Fine Guidance Sensor (FGS). The FGS is a detector-array-based imager which resides in the NGST focal plane. We report here on tradeoff studies aimed at defining an overall configuration of the FGS which will meet the performance and interface requirements. A key performance requirement is a noise equivalent angle of 3 milli-arcseconds, to be achieved with 95% probability for any pointing of the observatory on the celestial sphere. A key interface requirement is compatibility with the architecture of the Integrated Science Instrument Module (ISIM). The concept developed consists of two independent and redundant FGS modules, each with a 4' x 2' field of view covered by two 2048 x 2048 infrared detector arrays, providing 60 milli-arcsecond sampling. Performance modeling supporting the choice of this architecture and the trade space considered is presented. Each module has a set of readout electronics which perform star detection, pixel-by-pixel correction, and, in fine guiding mode, centroid calculation. These readout electronics communicate with the ISIM Command & Data Handling Units, where the FGS control software is based. Rationale for this choice of architecture is also presented.
Low-latitude Ionospheric Research using the CIRCE Mission
NASA Astrophysics Data System (ADS)
Dymond, K.; Nicholas, A. C.; Budzien, S. A.; Stephan, A. W.
2016-12-01
The Coordinated Ionospheric Reconstruction Cubesat Experiment (CIRCE) is a dual-satellite mission consisting of two 6U CubeSats actively maintaining a lead-follow configuration in the same orbit, with a launch planned for the 2018-2019 time frame. These nano-satellites will each feature two 1U ultraviolet photometers, observing the 135.6 nm emission of atomic oxygen at nighttime. The primary objective is to characterize the two-dimensional distribution of electrons in the Equatorial Ionization Anomaly (EIA). The methodology used to reconstruct the nighttime ionosphere employs continuous UV photometry from four distinct viewing angles in combination with an additional data source, such as in situ plasma density measurements or wide-band beacon data, with advanced image-space reconstruction algorithm tomography techniques. The COSMIC/FORMOSAT-3 (CF3) constellation featured six Tiny Ionospheric Photometers, a compact UV sensor design which served as the pathfinder for the CIRCE instruments. The TIP instruments on the CF3 satellites demonstrated detection of ionospheric bubbles before they had penetrated the peak of the F-region ionosphere. We present our mission concept, simulations illustrating the imaging capability of the sensor suite, and a range of science questions addressable using such a system.
NASA Astrophysics Data System (ADS)
Zhang, Jialin; Chen, Qian; Sun, Jiasong; Li, Jiaji; Zuo, Chao
2018-01-01
Lensfree holography provides a new way to effectively bypass the intrinsic trade-off between the spatial resolution and field-of-view (FOV) of conventional lens-based microscopes. Unfortunately, due to the limited sensor pixel-size, unpredictable disturbance during image acquisition, and sub-optimum solutions to the phase retrieval problem, typical lensfree microscopes only produce compromised imaging quality in terms of lateral resolution and signal-to-noise ratio (SNR). In this paper, we propose an adaptive pixel-super-resolved lensfree imaging (APLI) method that addresses the pixel aliasing problem by Z-scanning only, without resorting to subpixel shifting or beam-angle manipulation. Furthermore, an automatic positional error correction algorithm and an adaptive relaxation strategy are introduced to significantly enhance the robustness and SNR of the reconstruction. Based on APLI, we perform full-FOV reconstruction of a USAF resolution target across a wide imaging area of 29.85 mm² and achieve a half-pitch lateral resolution of 770 nm, surpassing the theoretical Nyquist-Shannon sampling resolution limit imposed by the sensor pixel-size (1.67 μm) by a factor of 2.17. A full-FOV imaging result of a typical dicot root is also provided to demonstrate the method's promising potential applications in biological imaging.
Floristic composition and across-track reflectance gradient in Landsat images over Amazonian forests
NASA Astrophysics Data System (ADS)
Muro, Javier; doninck, Jasper Van; Tuomisto, Hanna; Higgins, Mark A.; Moulatlet, Gabriel M.; Ruokolainen, Kalle
2016-09-01
Remotely sensed image interpretation or classification of tropical forests can be severely hampered by the effects of the bidirectional reflection distribution function (BRDF). Even for narrow swath sensors like Landsat TM/ETM+, the influence of reflectance anisotropy can be sufficiently strong to introduce a cross-track reflectance gradient. If the BRDF could be assumed to be linear for the limited swath of Landsat, it would be possible to remove this gradient during image preprocessing using a simple empirical method. However, the existence of natural gradients in reflectance caused by spatial variation in floristic composition of the forest can restrict the applicability of such simple corrections. Here we use floristic information over Peruvian and Brazilian Amazonia acquired through field surveys, complemented with information from geological maps, to investigate the interaction of real floristic gradients and the effect of reflectance anisotropy on the observed reflectances in Landsat data. In addition, we test the assumption of linearity of the BRDF for a limited swath width, and whether different primary non-inundated forest types are characterized by different magnitudes of the directional reflectance gradient. Our results show that a linear function is adequate to empirically correct for view angle effects, and that the magnitude of the across-track reflectance gradient is independent of floristic composition in the non-inundated forests we studied. This makes a routine correction of view angle effects possible. However, floristic variation complicates the issue, because different forest types have different mean reflectances. This must be taken into account when deriving the correction function in order to avoid eliminating natural gradients.
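The simple empirical correction the study validates can be sketched as a linear regression of reflectance on view angle, removing the fitted gradient while preserving the scene mean (synthetic data below; not the authors' exact procedure):

```python
import numpy as np

def correct_view_angle_gradient(reflectance, view_angle_deg):
    """Remove a linear across-track reflectance gradient: regress
    reflectance against view angle and subtract the fitted slope,
    keeping the scene mean unchanged."""
    r = np.asarray(reflectance, dtype=float)
    v = np.asarray(view_angle_deg, dtype=float)
    slope, intercept = np.polyfit(v, r, 1)
    return r - slope * (v - v.mean())

# Synthetic forest strip: uniform 0.30 reflectance plus a linear
# view-angle gradient of 0.002 per degree across a +/-7.5 deg swath
# (roughly Landsat's half-swath in view angle).
v = np.linspace(-7.5, 7.5, 11)
r = 0.30 + 0.002 * v
corrected = correct_view_angle_gradient(r, v)
```

The caveat in the abstract applies directly: if different forest types with different mean reflectances occupy different parts of the swath, the fitted slope will absorb that natural gradient unless the regression is restricted to a floristically uniform area.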
Yang, Pao-Keng
2017-08-01
By using a light-emitting diode as the probing light source and a Shack-Hartmann wavefront sensor as the recorder of the wavefront surface in a relative measurement, we present a useful method for determining the small wedge angle and optical homogeneity of a nominally planar glass plate from wavefront measurements. The measured wavefront surface from the light source was first calibrated to be a horizontal plane before the plate under test was inserted. The wedge angle of the plate can be determined from the inclination of the regression plane fitted to the wavefront surface measured after the plate is inserted between the light source and the wavefront sensor. Despite time-dependent altitude fluctuations in the measured wavefront topography, the optical homogeneity of the plate can be estimated from the increase in the average variance of the wavefront surface about its regression plane after the light passes through the plate, using the Bienaymé formula.
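The regression-plane step can be sketched as follows: fit a plane to the measured wavefront samples and convert its inclination to a wedge angle using the thin-prism deviation (n − 1)·α. The (n − 1) conversion and the synthetic data are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def wedge_angle_from_wavefront(x, y, w, n=1.5):
    """Estimate a glass plate's wedge angle from the tilt of the
    regression plane fitted to the measured wavefront surface w(x, y).

    A thin wedge of index n deviates the wavefront by (n - 1) * wedge,
    so the wedge angle is the plane's inclination divided by (n - 1).
    """
    # Least-squares fit of w = a*x + b*y + c.
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, w, rcond=None)
    tilt = np.arctan(np.hypot(a, b))   # wavefront inclination (rad)
    return tilt / (n - 1.0)            # wedge angle (rad)

# Synthetic wavefront: a pure 1 mrad tilt along x, as produced by an
# n = 1.5 plate with a 2 mrad wedge (small-angle regime, illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
w = 0.001 * x
wedge = wedge_angle_from_wavefront(x, y, w, n=1.5)
```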
Quantification of Finger-Tapping Angle Based on Wearable Sensors
Djurić-Jovičić, Milica; Jovičić, Nenad S.; Roby-Brami, Agnes; Popović, Mirjana B.; Kostić, Vladimir S.; Djordjević, Antonije R.
2017-01-01
We propose a novel simple method for quantitative and qualitative finger-tapping assessment based on miniature inertial sensors (3D gyroscopes) placed on the thumb and index-finger. We propose a simplified description of the finger tapping by using a single angle, describing rotation around a dominant axis. The method was verified on twelve subjects, who performed various tapping tasks, mimicking impaired patterns. The obtained tapping angles were compared with results of a motion capture camera system, demonstrating excellent accuracy. The root-mean-square (RMS) error between the two sets of data is, on average, below 4°, and the intraclass correlation coefficient is, on average, greater than 0.972. Data obtained by the proposed method may be used together with scores from clinical tests to enable a better diagnostic. Along with hardware simplicity, this makes the proposed method a promising candidate for use in clinical practice. Furthermore, our definition of the tapping angle can be applied to all tapping assessment systems. PMID:28125051
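The core reduction — a single tapping angle from 3D gyroscope rates — can be sketched as projecting the rate vectors onto a dominant rotation axis and integrating. Using PCA to find that axis is an assumption for this sketch, as the abstract does not specify how the dominant axis is chosen:

```python
import numpy as np

def tapping_angle(gyro_rates_dps, dt):
    """Reduce 3D gyroscope rates (deg/s, one row per sample) to a single
    tapping angle: project onto the dominant rotation axis and integrate."""
    rates = np.asarray(gyro_rates_dps, dtype=float)
    # Dominant rotation axis = first principal component of the samples.
    _, _, vt = np.linalg.svd(rates - rates.mean(axis=0), full_matrices=False)
    axis = vt[0]
    # Fix the sign so the largest rate sample projects positively.
    idx = np.argmax(np.linalg.norm(rates, axis=1))
    if rates[idx] @ axis < 0:
        axis = -axis
    projected = rates @ axis           # deg/s about the dominant axis
    return np.cumsum(projected) * dt   # integrated angle in degrees

# Synthetic tap: a half-sine rotation about the x-axis peaking at
# 60 deg/s, lasting 0.5 s, sampled at 100 Hz (illustrative only).
t = np.arange(0.0, 0.5, 0.01)
rates = np.zeros((len(t), 3))
rates[:, 0] = 60.0 * np.sin(np.pi * t / 0.5)
angle = tapping_angle(rates, dt=0.01)
```

The half-sine integrates to roughly 60 * (2/π) * 0.25 ≈ 19° of total excursion, the kind of thumb-index opening angle such a system would track.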
Multi-Parameter Scattering Sensor and Methods
NASA Technical Reports Server (NTRS)
Greenberg, Paul S. (Inventor); Fischer, David G. (Inventor)
2016-01-01
Methods, detectors, and systems detect particles and/or measure particle properties. According to one embodiment, a detector for detecting particles comprises a sensor for receiving radiation scattered by an ensemble of particles, and a processor for determining a physical parameter for the detector, or an optimal detection angle (or a bound on such an angle), for measuring at least one moment or integrated moment of the ensemble of particles. The physical parameter, detection angle, or detection angle bound is determined based on one or more of the following properties, or ranges of them: (a) the wavelength of light incident on the particles; (b) the count median diameter or other characteristic size parameter of the particle size distribution; (c) the standard deviation or other characteristic width parameter of the particle size distribution; and (d) the refractive index of the particles.
NASA Astrophysics Data System (ADS)
Miller, I.; Forster, B. C.; Laffan, S. W.
2012-07-01
Spectral reflectance characteristics of substrates in a coral reef environment are often measured in the field by viewing a substrate at nadir. However, viewing a substrate from multiple angles would likely result in different spectral characteristics for most coral reef substrates and provide valuable information on structural properties. To understand the relationship between the morphology of a substrate and its spectral response, it is necessary to correct the observed above-water radiance for the effects of atmospheric and water attenuation at a number of view and azimuth angles; in this way the actual surface reflectance can be determined. This research examines the air-water surface interaction for two hypothetical atmospheric conditions (clear Rayleigh scattering and totally cloud-covered) and the global irradiance reaching the benthic surface. It accounts for both water scattering and absorption, with simplifications for shallow-water conditions, as well as the additive effect of background reflectance being reflected at the water-air interface at angles greater than the critical refraction angle (~48°). A model was developed to correct the measured above-water radiance along the refracted view angle for its decrease due to path attenuation and the "n squared law of radiance", and for the additive surface reflectance. This allows bidirectional benthic surface reflectance and nadir-normalised reflectance to be determined. These theoretical models were adapted to incorporate above-water measurements relative to a standard, diffuse, white reference panel. The derived spectral signatures of a number of coral and non-coral benthic surfaces compared well with other published results, and the signatures and nadir-normalised reflectance of the corals and other benthic surface classes indicate good class separation.
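The geometry above rests on three standard relations: Snell refraction of the view angle at the air-water interface, the in-water critical angle (~48° for n ≈ 1.34), and the "n squared law of radiance" for radiance crossing the interface. A sketch under assumed index and transmittance values (illustrative figures, not the paper's):

```python
import math

def refracted_view_angle(theta_air_deg, n_water=1.34):
    """Snell's law: in-water view angle for an above-water sensor angle."""
    s = math.sin(math.radians(theta_air_deg)) / n_water
    return math.degrees(math.asin(s))

def critical_angle(n_water=1.34):
    """Beyond this in-water angle, upwelling light is totally internally
    reflected at the water-air interface (~48 deg for n = 1.34)."""
    return math.degrees(math.asin(1.0 / n_water))

def radiance_above_water(L_water, t=0.98, n_water=1.34):
    """'n squared law of radiance': radiance crossing the water-air
    interface is reduced by a factor t / n^2, for interface
    transmittance t (t = 0.98 is an assumed near-normal value)."""
    return L_water * t / (n_water ** 2)
```

For instance, a sensor viewing 30° off nadir above the water corresponds to an in-water path of only about 22°, which is why above-water measurements must be traced along the refracted, not the apparent, view angle.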