Partially-overlapped viewing zone based integral imaging system with super wide viewing angle.
Xiong, Zhao-Long; Wang, Qiong-Hua; Li, Shu-Li; Deng, Huan; Ji, Chao-Chao
2014-09-22
In this paper, we analyze the relationship between the viewer and the viewing zones of an integral imaging (II) system and present a partially-overlapped viewing zone (POVZ) based II system with a super wide viewing angle. In the proposed system, the viewing angle is wider than that of the conventional tracking-based II system. In addition, the POVZ eliminates the flipping effect and the time delay of the 3D scene. The proposed II system achieves a super wide viewing angle of 120° without the flipping effect, about twice as wide as that of the conventional system.
Sun-view angle effects on reflectance factors of corn canopies
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Daughtry, C. S. T.; Biehl, L. L.; Bauer, M. E.
1985-01-01
The effects of sun and view angles on reflectance factors of corn (Zea mays L.) canopies ranging from the six-leaf stage to harvest maturity were studied at the Purdue University Agronomy Farm with a multiband radiometer. The two methods of acquiring spectral data, the truck system and the tower system, are described. The analysis of the spectral data is presented in three parts: solar angle effects on reflectance factors viewed at nadir; solar angle effects on reflectance factors viewed at a fixed view angle; and the effects of both sun and view angles on reflectance factors. The analysis revealed that for nadir-viewed reflectance factors there is a strong solar angle dependence in all spectral bands for canopies with low leaf area index. Reflectance factors observed at a fixed sun angle from different view azimuth angles showed that the position of the sensor relative to the sun is important in determining angular reflectance characteristics. For both sun and view angles, reflectance factors are maximized when the sensor view direction is toward the sun.
Multi-viewer tracking integral imaging system and its viewing zone analysis.
Park, Gilbae; Jung, Jae-Hyun; Hong, Keehoon; Kim, Yunhee; Kim, Young-Hoon; Min, Sung-Wook; Lee, Byoungho
2009-09-28
We propose a multi-viewer tracking integral imaging system for viewing angle and viewing zone improvement. In the tracking integral imaging system, the pickup angles of each elemental lens in the lens array are determined by the positions of the viewers, which means the elemental image can be made for each viewer to provide a wider viewing angle and a larger viewing zone. Our tracking integral imaging system is implemented with an infrared camera and infrared light-emitting diodes, which can track the viewers' exact positions robustly. For multiple viewers to watch integrated three-dimensional images in the tracking integral imaging system, it is necessary to formulate the relationship between the multiple viewers' positions and the elemental images. We analyzed this relationship and the conditions for multiple viewers, and verified them by implementing a two-viewer tracking integral imaging system.
NASA Astrophysics Data System (ADS)
Baek, Jong-In; Kim, Ki-Han; Kim, Jae Chang; Yoon, Tae-Hoon
2010-01-01
This paper proposes a method of omni-directional viewing-angle switching by controlling the beam diverging angle (BDA) in a liquid crystal (LC) panel. LCs aligned randomly by in-cell polymer structures diffuse the collimated backlight for the bright state of the wide viewing-angle mode. We align the LCs homogeneously by applying an in-plane field for the narrow viewing-angle mode; scattering is then significantly reduced, so the small BDA is maintained as the light passes through the LC layer. The dark state is obtained by aligning the LCs homeotropically with a vertical electric field. We experimentally demonstrated omni-directional viewing-angle switching without an additional panel or backlight system.
Friedrich, D T; Sommer, F; Scheithauer, M O; Greve, J; Hoffmann, T K; Schuler, P J
2017-12-01
Objective: Advanced transnasal sinus and skull base surgery remains a challenging discipline for head and neck surgeons. Restricted access and space for instrumentation can impede advanced interventions. Thus, we present the combination of an innovative robotic endoscope guidance system and a specific endoscope with an adjustable viewing angle to facilitate transnasal surgery in a human cadaver model. Materials and Methods: The applicability of the robotic endoscope guidance system with a custom foot pedal controller was tested for advanced transnasal surgery on a fresh-frozen human cadaver head. Visualization was enabled using a commercially available endoscope with an adjustable viewing angle (15-90 degrees). Results: Visualization and instrumentation of all paranasal sinuses, including the anterior and middle skull base, were feasible with the presented setup. Control of the robotic endoscope guidance system was precise, and the adjustable endoscope lens extended the view in the surgical field without the frequent endoscope changes common with fixed-viewing-angle endoscopes. Conclusion: The combination of a robotic endoscope guidance system and an advanced endoscope with an adjustable viewing angle enables bimanual surgery in transnasal interventions of the paranasal sinuses and the anterior skull base in a human cadaver model. The adjustable lens allows fixed-angle endoscopes to be abandoned, saving time and resources without reducing the quality of imaging.
Optics of wide-angle panoramic viewing system-assisted vitreous surgery.
Chalam, Kakarla V; Shah, Vinay A
2004-01-01
The purpose of this article is to describe the optics of a contact wide-angle lens system with a stereo-reinverter for vitreous surgery. The panoramic viewing system is made up of two components: an indirect ophthalmoscopy lens system for fundus image viewing, which is placed on the patient's cornea as a contact lens, and a separate removable prism system for reinversion of the image, mounted on the microscope above the zoom system. The system provides a 104-degree field of view in a phakic emmetropic eye with minification, which can be magnified by the operating microscope. It permits a binocular stereoscopic view even through a pupil as small as 3 mm. In an air-filled phakic eye, the field of view increases to approximately 130 degrees. The image of the patient's fundus is reinverted into a true, erect, stereoscopic image by the reinversion system. In conclusion, this system permits a wide-angle panoramic view of the surgical field. The contact lens neutralizes optical irregularities of the corneal surface and allows improved visualization in eyes with irregular astigmatism induced by corneal scars. Excellent visualization is achieved in complex clinical situations such as miotic pupils, lenticular opacities, and air-filled phakic eyes.
Qin, Zong; Wang, Kai; Chen, Fei; Luo, Xiaobing; Liu, Sheng
2010-08-02
In this research, the condition for uniform lighting generated by an array of LEDs with a large view angle was studied. The luminous intensity distribution of such an LED is not monotonically decreasing with view angle. An LED with a freeform lens was designed as an example for analysis. For a system based on these LEDs, with a thickness of 20 mm and a rectangular arrangement, the condition for uniform lighting was derived, and the analytical results demonstrated that the uniformity does not decrease monotonically with increasing LED-to-LED spacing. Illuminance uniformities were calculated with Monte Carlo ray-tracing simulations, and the uniformity was found, anomalously, to increase with increasing LED-to-LED spacing over certain ranges. Another type of large-view-angle LED and different arrangements were also discussed. Both analysis and simulation results showed that the method is applicable to lighting system design based on large-view-angle LED arrays.
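The spacing-versus-uniformity relation described above can be explored numerically. The sketch below is a simplified stand-in, assuming an idealized generalized-Lambertian LED (intensity proportional to cos^m of the emission angle) rather than the paper's freeform-lens profile; the pitch, panel thickness, and evaluation grid are illustrative values only.

```python
import numpy as np

def illuminance_map(positions, m, z, grid):
    """Illuminance on a target plane a distance z below an LED array.
    Each LED is modeled with a generalized Lambertian intensity pattern
    I(theta) = I0 * cos(theta)**m, with the inverse-square law included."""
    X, Y = grid
    E = np.zeros_like(X)
    for lx, ly in positions:
        r2 = (X - lx) ** 2 + (Y - ly) ** 2 + z ** 2
        cos_t = z / np.sqrt(r2)
        # E += I(theta) * cos(theta) / r**2
        E += cos_t ** m * cos_t / r2
    return E

# 2x2 LED array with 40 mm pitch and 20 mm panel thickness (illustrative)
pitch, z, m = 40.0, 20.0, 1.0
positions = [(sx * pitch / 2, sy * pitch / 2) for sx in (-1, 1) for sy in (-1, 1)]
xs = np.linspace(-40.0, 40.0, 81)
X, Y = np.meshgrid(xs, xs)
E = illuminance_map(positions, m, z, (X, Y))
uniformity = E.min() / E.max()
```

Sweeping `pitch` and recomputing `uniformity` reproduces the kind of spacing-versus-uniformity curve the authors analyze, including its non-monotonic behavior for broad intensity profiles.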
Application of AI techniques to infer vegetation characteristics from directional reflectance(s)
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Smith, J. A.; Harrison, P. A.; Harrison, P. R.
1994-01-01
Traditionally, the remote sensing community has relied entirely on spectral knowledge to extract vegetation characteristics. However, there are other knowledge bases (KBs) that can be used to significantly improve the accuracy and robustness of inference techniques. Using AI (artificial intelligence) techniques, a KB system (VEG) was developed that integrates input spectral measurements with diverse KBs. These KBs consist of data sets of directional reflectance measurements, knowledge from the literature, and knowledge from experts, combined into an intelligent and efficient system for making vegetation inferences. VEG accepts spectral data of an unknown target as input, determines the best techniques for inferring the desired vegetation characteristic(s), applies the techniques to the target data, and provides a rigorous estimate of the accuracy of the inference. VEG was developed to: infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; infer percent ground cover from any combination of nadir and/or off-nadir view angles; infer unknown view angle(s) from known view angle(s) (known as view angle extension); and discriminate between user-defined vegetation classes using spectral and directional reflectance relationships developed from an automated learning algorithm. The errors for these techniques were generally small, ranging from 2% to 15% (proportional root mean square). The system is designed to aid scientists in developing, testing, and applying new inference techniques using directional reflectance data.
A see-through holographic head-mounted display with the large viewing angle
NASA Astrophysics Data System (ADS)
Chen, Zhidong; Sang, Xinzhu; Lin, Qiaojun; Li, Jin; Yu, Xunbo; Gao, Xin; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu; Xie, Songlin
2017-02-01
A novel solution for a large-view-angle holographic head-mounted display (HHMD) is presented. Divergent light is used for the hologram illumination to construct a large three-dimensional object outside the display at a short distance. A specially designed projection lens with a large numerical aperture projects the object constructed by the hologram to its real location. The presented solution can realize a compact HHMD system with a large field of view. The basic principle and the structure of the system are described. An augmented reality (AR) prototype with a size of 50 mm × 40 mm and a view angle above 60° is demonstrated.
New developments of a knowledge based system (VEG) for inferring vegetation characteristics
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Harrison, P. A.; Harrison, P. R.
1992-01-01
An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).
Concept development for the ITER equatorial port visible/infrared wide angle viewing system.
Reichle, R; Beaumont, B; Boilson, D; Bouhamou, R; Direz, M-F; Encheva, A; Henderson, M; Huxford, R; Kazarian, F; Lamalle, Ph; Lisgo, S; Mitteau, R; Patel, K M; Pitcher, C S; Pitts, R A; Prakash, A; Raffray, R; Schunke, B; Snipes, J; Diaz, A Suarez; Udintsev, V S; Walker, C; Walsh, M
2012-10-01
The ITER equatorial port visible/infrared wide angle viewing system concept is developed from the measurement requirements. The proposed solution situates 4 viewing systems in the equatorial ports 3, 9, 12, and 17 with 4 views each (looking at the upper target, the inner divertor, and tangentially left and right). This gives sufficient coverage. The spatial resolution of the divertor system is 2 times higher than the other views. For compensation of vacuum-vessel movements, an optical hinge concept is proposed. Compactness and low neutron streaming is achieved by orienting port plug doglegs horizontally. Calibration methods, risks, and R&D topics are outlined.
Optimal design of wide-view-angle waveplate used for polarimetric diagnosis of lithography system
NASA Astrophysics Data System (ADS)
Gu, Honggang; Jiang, Hao; Zhang, Chuanwei; Chen, Xiuguo; Liu, Shiyuan
2016-03-01
The diagnosis and control of polarization aberrations is one of the main concerns in a hyper-numerical-aperture (NA) lithography system. Waveplates are basic and indispensable optical components in the polarimetric diagnosis tools for the immersion lithography system. The retardance of a birefringent waveplate is highly sensitive to the incident angle of the light, which makes a conventional waveplate unsuitable for polarimetric diagnosis in an immersion lithography system with a hyper NA. In this paper, we propose a method for the optimal design of a wide-view-angle waveplate by combining two positive waveplates made from magnesium fluoride (MgF2) and two negative waveplates made from sapphire using the simulated annealing algorithm. Theoretical derivations and numerical simulations are performed, and the results demonstrate that the maximum variation in the retardance of the optimally designed wide-view-angle waveplate is less than ±0.35° over a view-angle range of ±20°.
Gaze and viewing angle influence visual stabilization of upright posture
Ustinova, KI; Perkins, J
2011-01-01
Focusing gaze on a target helps stabilize upright posture. We investigated how this visual stabilization can be affected by observing a target presented under different gaze and viewing angles. In a series of 10-second trials, participants (N = 20, 29.3 ± 9 years of age) stood on a force plate and fixed their gaze on a figure presented on a screen at a distance of 1 m. The figure changed position (gaze angle: eye level (0°), 25° up or down), vertical body orientation (viewing angle: at eye level but rotated 25° as if leaning toward or away from the participant), or both (gaze and viewing angle: 25° up or down with the rotation equivalent of a natural visual perspective). Amplitude of participants’ sagittal displacement, surface area, and angular position of the center of gravity (COG) were compared. Results showed decreased COG velocity and amplitude for up and down gaze angles. Changes in viewing angles resulted in altered body alignment and increased amplitude of COG displacement. No significant changes in postural stability were observed when both gaze and viewing angles were altered. Results suggest that both the gaze angle and viewing perspective may be essential variables of the visuomotor system modulating postural responses. PMID:22398978
Erdenebat, Munkh-Uchral; Kwon, Ki-Chul; Yoo, Kwan-Hee; Baasantseren, Ganbat; Park, Jae-Hyeung; Kim, Eun-Soo; Kim, Nam
2014-04-15
We propose a 360-degree integral-floating display with an enhanced vertical viewing angle. The system projects two-dimensional elemental image arrays via a high-speed digital micromirror device projector and reconstructs them into 3D perspectives with a lens array. Double floating lenses relay the initial 3D perspectives to the center of a vertically curved convex mirror. An anamorphic optic system tailors the initial 3D perspectives horizontally and disperses the light rays more widely in the vertical direction. By the proposed method, the entire 3D image provides both monocular and binocular depth cues, a full-parallax demonstration with high angular ray density, and an enhanced vertical viewing angle.
Large-viewing-angle electroholography by space projection
NASA Astrophysics Data System (ADS)
Sato, Koki; Obana, Kazuki; Okumura, Toshimichi; Kanaoka, Takumi; Nishikawa, Satoko; Takano, Kunihiko
2004-06-01
A hologram reconstructs a full-parallax 3D image; the image appears more natural because focusing and convergence coincide with each other. We aim at a practical electro-holography system, since in conventional electro-holography the image viewing angle is very small owing to the limited display pixel size. We are developing a new method that achieves a large viewing angle by space projection. A white laser illuminates a single DMD panel (displaying time-shared CGHs for the three RGB colors). A 3D space screen formed from very small water particles reconstructs the 3D image with a large viewing angle through scattering by the water particles.
Takaki, Yasuhiro; Hayashi, Yuki
2008-07-01
The narrow viewing zone angle is one of the problems associated with electronic holography. We propose a technique that enables the ratio of horizontal and vertical resolutions of a spatial light modulator (SLM) to be altered. This technique increases the horizontal resolution of a SLM several times, so that the horizontal viewing zone angle is also increased several times. A SLM illuminated by a slanted point light source array is imaged by a 4f imaging system in which a horizontal slit is located on the Fourier plane. We show that the horizontal resolution was increased four times and that the horizontal viewing zone angle was increased approximately four times.
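The resolution-to-viewing-angle tradeoff above follows from the grating equation: the maximum diffraction half-angle of an SLM with pixel pitch p is arcsin(λ/2p), so quartering the horizontal pitch roughly quadruples the horizontal viewing zone angle for small angles. The pitch and wavelength below are illustrative assumptions, not the paper's parameters.

```python
import math

def viewing_zone_angle(wavelength, pitch):
    """Full viewing zone angle (radians) of an SLM with a given pixel pitch:
    theta = 2 * arcsin(lambda / (2 * p)), from the grating equation."""
    return 2.0 * math.asin(wavelength / (2.0 * pitch))

wl = 532e-9                                  # green laser, illustrative
theta_orig = viewing_zone_angle(wl, 8e-6)    # original horizontal pitch
theta_4x = viewing_zone_angle(wl, 2e-6)      # pitch after 4x resolution gain
ratio = theta_4x / theta_orig                # ~4 in the small-angle regime
```

The ratio departs from exactly 4 only as the angles grow large enough for the arcsin nonlinearity to matter, which is why the paper reports the zone angle increasing "approximately" four times.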
Integrated large view angle hologram system with multi-slm
NASA Astrophysics Data System (ADS)
Yang, ChengWei; Liu, Juan
2017-10-01
Recently, holographic display has attracted much attention for its ability to generate real-time 3D reconstructed images. CGH provides an effective way to produce holograms, and a spatial light modulator (SLM) is used to reconstruct the image. However, the reconstruction system is usually heavy and complex, and the view angle is limited by the pixel size and the spatial bandwidth product (SBP) of the SLM. In this paper, a light, portable holographic display system is proposed by integrating the optical elements and host computer units, which significantly reduces the space taken in the horizontal direction. The CGH is computed based on Fresnel diffraction and the point-source method. To reduce memory usage and image distortion, we use an optimized accurate compressed look-up table method (AC-LUT) to compute the hologram. In the system, six SLMs are concatenated into a curved plane, each loading a phase-only hologram of the object from a different angle, so the horizontal view angle of the reconstructed image can be expanded to about 21.8°.
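The point-source method mentioned above can be illustrated with a direct, unaccelerated superposition. This is a minimal sketch of the plain point-source approach, not the paper's AC-LUT acceleration; the wavelength, pixel pitch, and object points are hypothetical.

```python
import numpy as np

def point_source_hologram(points, wavelength, pitch, shape):
    """Naive point-source CGH: superpose a spherical wavefront from each
    object point on the hologram plane and keep only the phase, as loaded
    onto a phase-only SLM."""
    k = 2.0 * np.pi / wavelength
    ny, nx = shape
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    field = np.zeros(shape, dtype=complex)
    for px, py, pz, amp in points:
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += amp * np.exp(1j * k * r) / r
    return np.angle(field)  # phase-only hologram, values in (-pi, pi]

# Two hypothetical object points: (x, y, z in meters, amplitude)
points = [(0.0, 0.0, 0.10, 1.0), (1e-3, 0.0, 0.12, 0.8)]
phase = point_source_hologram(points, 532e-9, 8e-6, (128, 128))
```

A look-up-table method like AC-LUT precomputes and compresses these per-point fringe patterns so the superposition becomes table lookups instead of repeated transcendental evaluations.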
ERIC Educational Resources Information Center
Hsu, Wen-Chun; Shih, Ju-Ling
2016-01-01
In this study, the routine of Tantui, a branch of martial arts, was taken as the object of research. Fitts' stages of motor learning and augmented reality (AR) were applied to a 3D mobile-assisted learning system for martial arts, characterized by free viewing angles. With the new system, learners could rotate the viewing angle of…
Optimal directional view angles for remote-sensing missions
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Holben, B. N.; Tucker, C. J.; Newcomb, W. W.
1984-01-01
The present investigation is concerned with the directional, off-nadir viewing of terrestrial scenes using remote-sensing systems from aircraft and satellite platforms, taking into account advantages of such an approach over strictly nadir viewing systems. Directional reflectance data collected for bare soil and several different vegetation canopies in NOAA-7 AVHRR bands 1 and 2 were analyzed. Optimum view angles were recommended for two strategies. The first strategy views the utility of off-nadir measurements as extending spatial and temporal coverage of the target area. The second strategy views the utility of off-nadir measurements as providing additional information about the physical characteristics of the target. Conclusions regarding the two strategies are discussed.
Design considerations for a backlight with switchable viewing angles
NASA Astrophysics Data System (ADS)
Fujieda, Ichiro; Takagi, Yoshihiko; Rahadian, Fanny
2006-08-01
Small-sized liquid crystal displays are widely used for mobile applications such as cell phones. Electronic control of a viewing angle range is desired in order to maintain privacy for viewing in public as well as to provide wide viewing angles for solitary viewing. Conventionally, a polymer-dispersed liquid crystal (PDLC) panel is inserted between a backlight and a liquid crystal panel. The PDLC layer either transmits or scatters the light from the backlight, thus providing an electronic control of viewing angles. However, such a display system is obviously thick and expensive. Here, we propose to place an electronically-controlled, light-deflecting device between an LED and a light-guide of a backlight. For example, a liquid crystal lens is investigated for other applications and its focal length is controlled electronically. A liquid crystal phase grating either transmits or diffracts an incoming light depending on whether or not a periodic phase distribution is formed inside its liquid crystal layer. A bias applied to such a device will control the angular distribution of the light propagating inside a light-guide. Output couplers built in the light-guide extract the propagating light to outside. They can be V-shaped grooves, pyramids, or any other structures that can refract, reflect or diffract light. When any of such interactions occur, the output couplers translate the changes in the propagation angles into the angular distribution of the output light. Hence the viewing-angle characteristic can be switched. The designs of the output couplers and the LC devices are important for such a backlight system.
Directional infrared temperature and emissivity of vegetation: Measurements and models
NASA Technical Reports Server (NTRS)
Norman, J. M.; Castello, S.; Balick, L. K.
1994-01-01
Directional thermal radiance from vegetation depends on many factors, including the architecture of the plant canopy, thermal irradiance, emissivity of the foliage and soil, view angle, slope, and the kinetic temperature distribution within the vegetation-soil system. A one-dimensional model, which includes the influence of topography, indicates that the thermal emissivity of vegetation canopies may remain constant with view angle, or may increase or decrease as the view angle from nadir increases. Typically, variations of emissivity with view angle are less than 0.01. As the view angle increases away from nadir, the directional infrared canopy temperature usually decreases but may remain nearly constant or even increase. Variations in directional temperature with view angle may be 5°C or more. Model predictions of directional emissivity are compared with field measurements in corn canopies and over a bare soil using a method that requires two infrared thermometers, one sensitive to the 8 to 14 micrometer wavelength band and a second to the 14 to 22 micrometer band. After correction for CO2 absorption by the atmosphere, a directional canopy emissivity can be obtained as a function of view angle in the 8 to 14 micrometer band to an accuracy of about 0.005. Modeled and measured canopy emissivities for corn varied slightly with view angle (0.990 at nadir and 0.982 at a 75 deg view zenith angle) and did not appear to vary significantly with view angle for the bare soil. Canopy emissivity is generally nearer to unity than leaf emissivity. At high spectral resolution, canopy thermal emissivity may vary by 0.02 with wavelength even though leaf emissivity may vary by 0.07. The one-dimensional model provides reasonably accurate predictions of infrared temperature and can be used to study the dependence of infrared temperature on various plant, soil, and environmental factors.
Mobile Robot Localization by Remote Viewing of a Colored Cylinder
NASA Technical Reports Server (NTRS)
Volpe, R.; Litwin, T.; Matthies, L.
1995-01-01
A system was developed for the Mars Pathfinder rover in which the rover checks its position by viewing the angle back to a colored cylinder with different colors for different angles. The rover determines distance by the apparent size of the cylinder.
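The ranging principle described above reduces to similar triangles in a pinhole camera model: distance equals focal length times true size divided by apparent size. The function and numbers below are hypothetical illustrations, not Pathfinder calibration values.

```python
def distance_from_apparent_size(true_height_m, image_height_px, focal_length_px):
    """Pinhole-camera range estimate from apparent size: d = f * H / h,
    where f is the focal length in pixels, H the object's true height in
    meters, and h its imaged height in pixels (similar triangles)."""
    return focal_length_px * true_height_m / image_height_px

# A 0.2 m tall cylinder imaged 50 px tall by a camera with f = 500 px
d = distance_from_apparent_size(0.2, 50.0, 500.0)  # -> 2.0 m
```

The bearing to the cylinder comes from the color band observed, and the range from this apparent-size relation; together they localize the rover relative to the landmark.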
Holographic elements and curved slit used to enlarge field of view in rocket detection system
NASA Astrophysics Data System (ADS)
Breton, Mélanie; Fortin, Jean; Lessard, Roger A.; Châteauneuf, Marc
2006-09-01
Rocket detection over a wide field of view is an important issue in the protection of light armored vehicles. Traditionally, detection occurs in the UV band, but recent studies have shown significant emission peaks in the visible and near infrared at rocket launch time. Using the visible region is attractive as a way to reduce the weight and cost of such systems. Current methods of detecting those specific peaks involve interferometric filters; however, these fail to combine a wide angle with wavelength selectivity. A linear array of volume holographic elements combined with a curved exit slit is proposed for the development of a wide-field-of-view sensor for the detection of solid-propellant motor launch flash. The sensor is envisaged to trigger an active protection system. On the basis of geometric theory, a system has been designed. It consists of a collector, a linear array of holographic elements, a curved slit, and a detector. The collector is an off-axis parabolic mirror. The holographic elements are recorded by subdividing a holographic film into regions, each individually exposed with a different incidence angle. All regions have a common diffraction angle, and the incidence angle determines the instantaneous field of view of each element. The volume hologram separates and focuses the diffracted beam on an image plane to achieve wavelength filtering, and the conical diffraction property is used to enlarge the field of view in elevation. A curved slit was designed to match the oblique incidence of the holographic linear array; it sits at the image plane and filters the diffracted spectrum toward the detector. The field of view of the design was calculated to be 34 degrees. This was validated with a prototype tested during a field trial; results are presented and analyzed. The system succeeded in detecting the rocket launch flash at the desired fields of view.
Bidirectional Reflectance Functions for Application to Earth Radiation Budget Studies
NASA Technical Reports Server (NTRS)
Manalo-Smith, N.; Tiwari, S. N.; Smith, G. L.
1997-01-01
Reflected solar radiative fluxes emerging from the top of the Earth's atmosphere are inferred from satellite broadband radiance measurements by applying bidirectional reflectance functions (BDRFs) to account for the anisotropy of the radiation field. BDRFs depend on the viewing geometry (i.e., solar zenith angle, view zenith angle, and relative azimuth angle), the amount and type of cloud cover, the condition of the intervening atmosphere, and the reflectance characteristics of the underlying surface. A set of operational Earth Radiation Budget Experiment (ERBE) BDRFs is available, developed from Nimbus 7 ERB (Earth Radiation Budget) scanner data for a three-angle grid system. An improved set of bidirectional reflectance models is required for mission planning and data analysis of future earth radiation budget instruments, such as the Clouds and the Earth's Radiant Energy System (CERES), and for the enhancement of existing radiation budget data products. This study presents an analytic expression for BDRFs formulated by fitting the ERBE operational model tabulations. A set of model coefficients applicable to any viewing condition is computed for overcast and clear-sky scenes over four geographical surface types (ocean, land, snow, and desert) and for partly cloudy scenes over ocean and land. The models are smooth in the directional angles and adhere to the principle of reciprocity, i.e., they are invariant with respect to the interchange of the incoming and outgoing directional angles. The analytic BDRFs and the radiance standard deviations are compared with the operational ERBE models and validated with ERBE data. The clear ocean model is validated against Dlhopolsky's clear ocean model, a BDRF of higher angular resolution for clear-sky ocean developed from ERBE radiances. Additionally, the effectiveness of the models in accounting for anisotropy at various viewing directions is tested with ERBE along-track data.
An area viewed from nadir and from the side gives two different radiance measurements, but both should yield the same flux when converted by the BDRF. The analytic BDRFs are in very good qualitative agreement with the ERBE models. The overcast scenes exhibit constant retrieved albedo over viewing zenith angles for solar zenith angles less than 60 degrees. The clear ocean model does not produce constant retrieved albedo over viewing zenith angles, but it is an improvement over the ERBE operational clear-sky ocean BDRF.
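The nadir/oblique consistency check described above rests on the standard radiance-to-flux inversion F = πL/R, where R is the anisotropic factor given by the BDRF for the viewing geometry. A minimal sketch, with hypothetical radiance and anisotropic-factor values:

```python
import math

def toa_flux(radiance, anisotropic_factor):
    """Radiance-to-flux inversion used in ERBE-style processing:
    F = pi * L / R, where R is the BDRF anisotropic factor for the
    viewing geometry (R = 1 for a perfectly Lambertian scene)."""
    return math.pi * radiance / anisotropic_factor

# Hypothetical values: the same scene viewed at nadir and obliquely.
# For a fixed flux, the observed radiance scales with R, so both views
# should invert to the same flux.
L_nadir, R_nadir = 80.0, 1.05
R_oblique = 1.25
L_oblique = L_nadir * (R_oblique / R_nadir)
F_nadir = toa_flux(L_nadir, R_nadir)
F_oblique = toa_flux(L_oblique, R_oblique)
```

If the BDRF misrepresents the scene's anisotropy, the two views invert to different fluxes, which is exactly the retrieved-albedo inconsistency the abstract reports for the clear ocean model.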
Limited Angle Dual Modality Breast Imaging
NASA Astrophysics Data System (ADS)
More, Mitali J.; Li, Heng; Goodale, Patricia J.; Zheng, Yibin; Majewski, Stan; Popov, Vladimir; Welch, Benjamin; Williams, Mark B.
2007-06-01
We are developing a dual modality breast scanner that can obtain x-ray transmission and gamma ray emission images in succession at multiple viewing angles with the breast held under mild compression. These views are reconstructed and fused to obtain three-dimensional images that combine structural and functional information. Here, we describe the dual modality system and present results of phantom experiments designed to test the system's ability to obtain fused volumetric dual modality data sets from a limited number of projections, acquired over a limited (less than 180 degrees) angular range. We also present initial results from phantom experiments conducted to optimize the acquisition geometry for gamma imaging. The optimization parameters include the total number of views and the angular range over which these views should be spread, while keeping the total number of detected counts fixed. We have found that in general, for a fixed number of views centered around the direction perpendicular to the direction of compression, in-plane contrast and SNR are improved as the angular range of the views is decreased. The improvement in contrast and SNR with decreasing angular range is much greater for deeper lesions and for a smaller number of views. However, the z-resolution of the lesion is significantly reduced with decreasing angular range. Finally, we present results from limited angle tomography scans using a system with dual, opposing heads.
View angle effect in LANDSAT imagery
NASA Technical Reports Server (NTRS)
Kaneko, T.; Engvall, J. L.
1977-01-01
The view angle effect in LANDSAT 2 imagery was investigated. The LANDSAT multispectral scanner scans over a range of view angles of -5.78 to 5.78 degrees. The view angle effect, which is caused by differing view angles, could be studied by comparing data collected at different view angles over a fixed location at a fixed time. Since such LANDSAT data are not available, consecutive-day acquisition data were used as a substitute: they were collected over the same geographical location, acquired 24 hours apart, with a view angle change of 7 to 8 degrees at a latitude of 35 to 45 degrees. It is shown that there is approximately a 5% reduction in the average sensor response on the second-day acquisitions as compared with the first-day acquisitions, and that the view angle effect differs from field to field and from crop to crop. On false infrared color pictures the view angle effect causes changes primarily in brightness and to a lesser degree in color (hue and saturation). An implication is that caution must be taken when images with different view angles are combined for classification, and a signature extension technique needs to take the view angle effect into account.
Development of scanning holographic display using MEMS SLM
NASA Astrophysics Data System (ADS)
Takaki, Yasuhiro
2016-10-01
Holography is an ideal three-dimensional (3D) display technique, because it produces 3D images that naturally satisfy human 3D perception including physiological and psychological factors. However, its electronic implementation is quite challenging because ultra-high resolution is required for display devices to provide sufficient screen size and viewing zone. We have developed holographic display techniques to enlarge the screen size and the viewing zone by use of microelectromechanical systems spatial light modulators (MEMS-SLMs). Because MEMS-SLMs can generate hologram patterns at a high frame rate, the time-multiplexing technique is utilized to virtually increase the resolution. Three kinds of scanning systems have been combined with MEMS-SLMs: the screen scanning system, the viewing-zone scanning system, and the 360-degree scanning system. The screen scanning system reduces the hologram size to enlarge the viewing zone, and the reduced hologram patterns are scanned on the screen to increase the screen size: a color display system with a screen size of 6.2 in. and a viewing zone angle of 11° was demonstrated. The viewing-zone scanning system increases the screen size, and the reduced viewing zone is scanned to enlarge the viewing zone: a screen size of 2.0 in. and a viewing zone angle of 40° were achieved. A two-channel system increased the screen size to 7.4 in. The 360-degree scanning system increases the screen size, and the reduced viewing zone is scanned circularly: a display system having a flat screen with a diameter of 100 mm was demonstrated, which generates 3D images viewed from any direction around the flat screen.
NASA Astrophysics Data System (ADS)
Je, Uikyu; Cho, Hyosung; Lee, Minsik; Oh, Jieun; Park, Yeonok; Hong, Daeki; Park, Cheulkyu; Cho, Heemoon; Choi, Sungil; Koo, Yangseo
2014-06-01
Recently, reducing radiation doses has become an issue of critical importance in the broader radiological community. As a possible technical approach, especially, in dental cone-beam computed tomography (CBCT), reconstruction from limited-angle view data (< 360°) would enable fast scanning with reduced doses to the patient. In this study, we investigated and implemented an efficient reconstruction algorithm based on compressed-sensing (CS) theory for the scan geometry and performed systematic simulation works to investigate the image characteristics. We also performed experimental works by applying the algorithm to a commercially-available dental CBCT system to demonstrate its effectiveness for image reconstruction in incomplete data problems. We successfully reconstructed CBCT images with incomplete projections acquired at selected scan angles of 120, 150, 180, and 200° with a fixed angle step of 1.2° and evaluated the reconstruction quality quantitatively. Both simulation and experimental demonstrations of the CS-based reconstruction from limited-angle view data show that the algorithm can be applied directly to current dental CBCT systems for reducing the imaging doses and further improving the image quality.
The influence of radiographic viewing perspective and demographics on the Critical Shoulder Angle
Suter, Thomas; Popp, Ariane Gerber; Zhang, Yue; Zhang, Chong; Tashjian, Robert Z.; Henninger, Heath B.
2014-01-01
Background: Accurate assessment of the critical shoulder angle (CSA) is important in clinical evaluation of degenerative rotator cuff tears. This study analyzed the influence of radiographic viewing perspective on the CSA, developed a classification system to identify malpositioned radiographs, and assessed the relationship between the CSA and demographic factors. Methods: Glenoid height, width and retroversion were measured on 3D CT reconstructions of 68 cadaver scapulae. A digitally reconstructed radiograph was aligned perpendicular to the scapular plane, and retroversion was corrected to obtain a true antero-posterior (AP) view. In 10 scapulae, incremental anteversion/retroversion and flexion/extension views were generated. The CSA was measured and a clinically applicable classification system was developed to detect views with >2° change in CSA versus true AP. Results: The average CSA was 33±4°. Intra- and inter-observer reliability was high (ICC≥0.81) but decreased with increasing viewing angle. Views beyond 5° anteversion, 8° retroversion, 15° flexion and 26° extension resulted in >2° deviation of the CSA compared to true AP. The classification system was capable of detecting aberrant viewing perspectives with sensitivity of 95% and specificity of 53%. Correlations between glenoid size and CSA were small (R≤0.3), and CSA did not vary by gender (p=0.426) or side (p=0.821). Conclusions: The CSA was most susceptible to malposition in ante/retroversion. Deviations as little as 5° in anteversion resulted in a CSA >2° from true AP. A new classification system refines the ability to collect true AP radiographs of the scapula. The CSA was unaffected by demographic factors. PMID:25591458
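The CSA itself is a planar angle between two landmark lines, so measuring it reduces to a vector computation. A minimal sketch (the coordinates and helper are illustrative; the clinical landmarks are the inferior and superior glenoid rim and the most lateral acromion point):

```python
import math

def critical_shoulder_angle_deg(glenoid_sup, glenoid_inf, acromion_lat):
    """CSA on a true AP view: the angle at the inferior glenoid rim between
    the glenoid line (inferior -> superior rim) and the line to the most
    lateral point of the acromion. Points are (x, y) image coordinates;
    values below are illustrative, not clinical data."""
    def angle(p, q, r):
        v1 = (q[0] - p[0], q[1] - p[1])
        v2 = (r[0] - p[0], r[1] - p[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n = math.hypot(*v1) * math.hypot(*v2)
        return math.degrees(math.acos(dot / n))
    return angle(glenoid_inf, glenoid_sup, acromion_lat)

# A vertical glenoid line with the acromion 20 mm lateral and 35 mm up
# gives a CSA of atan(20/35), about 29.7 deg.
csa = critical_shoulder_angle_deg((0.0, 35.0), (0.0, 0.0), (20.0, 35.0))
```

This also makes the paper's sensitivity finding intuitive: an out-of-plane view shifts the projected landmark coordinates, which shifts the angle.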
Objective for monitoring the corona discharge
NASA Astrophysics Data System (ADS)
Obrezkov, Andrey; Rodionov, Andrey Yu.; Pisarev, Viktor N.; Chivanov, Alexsey N.; Baranov, Yuri P.; Korotaev, Valery V.
2016-04-01
Remote optoelectronic probing is one of the most relevant aspects of overhead electric line maintenance. By installing such systems on a helicopter, for example, it becomes possible to monitor the status of overhead transmission lines and to search for damaged parts of the lines. Thermal and UV cameras are used for more effective diagnostics. UV systems are fitted with filters that attenuate the visible spectrum, which is an undesired type of signal. These systems also have a wide view angle for better coverage and proper diagnostics. For even greater effectiveness, it is better to use several spectral channels, such as UV and IR; such spectral selection provides good noise reduction. This report provides experimental results on the spectral parameters of the wide-view-angle multispectral objective for such systems, together with point spread function data, UV and IR scattering index data, and technical requirements for detectors.
Characteristics of mist 3D screen for projection type electro-holography
NASA Astrophysics Data System (ADS)
Sato, Koki; Okumura, Toshimichi; Kanaoka, Takumi; Koizumi, Shinya; Nishikawa, Satoko; Takano, Kunihiko
2006-01-01
A hologram reproduces a full-parallax 3D image, which appears more natural because the focus and convergence cues coincide. We aim at a practical electro-holography system, since the image viewing angle of conventional electro-holography is very small due to the limited display pixel size. We are therefore developing a new space-projection method for a large viewing angle: a white-color laser illuminates a single DMD panel (displaying a time-shared CGH of the three RGB colors), and a 3D space screen formed by very small water particles reconstructs the 3D image with a large viewing angle through scattering from the particles.
Hologram generation by horizontal scanning of a high-speed spatial light modulator.
Takaki, Yasuhiro; Okada, Naoya
2009-06-10
In order to increase the image size and the viewing zone angle of a hologram, a high-speed spatial light modulator (SLM) is imaged as a vertically long image by an anamorphic imaging system, and this image is scanned horizontally by a galvano scanner. The reduction in horizontal pixel pitch of the SLM provides a wide viewing zone angle. The increased image height and horizontal scanning increased the image size. We demonstrated the generation of a hologram having a 15 degrees horizontal viewing zone angle and an image size of 3.4 inches with a frame rate of 60 Hz using a digital micromirror device with a frame rate of 13.333 kHz as a high-speed SLM.
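The link between pixel pitch and viewing-zone angle that motivates the anamorphic demagnification can be sketched with the grating relation theta = 2*asin(lambda/(2p)); the wavelength and pitch values below are illustrative, not the paper's parameters:

```python
import math

def viewing_zone_angle_deg(wavelength_m: float, pixel_pitch_m: float) -> float:
    """Full viewing-zone angle (degrees) set by the maximum diffraction
    angle of an SLM with the given pixel pitch: theta = 2*asin(lambda/(2*p))."""
    return 2.0 * math.degrees(math.asin(wavelength_m / (2.0 * pixel_pitch_m)))

# Halving the effective horizontal pixel pitch (as anamorphic demagnification
# does) roughly doubles the viewing-zone angle for small angles.
wide = viewing_zone_angle_deg(532e-9, 2e-6)    # pitch reduced by the optics
narrow = viewing_zone_angle_deg(532e-9, 4e-6)  # native pitch
```

The trade-off described in the abstract follows directly: shrinking the horizontal pitch widens the zone but shrinks the image, which the galvano scanning then recovers.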
On the viewing angle dependence of blazar variability
NASA Astrophysics Data System (ADS)
Eldar, Avigdor; Levinson, Amir
2000-05-01
Internal shocks propagating through an ambient radiation field are subject to a radiative drag that, under certain conditions, can significantly affect their dynamics, and consequently the evolution of the beaming cone of emission produced behind the shocks. The resultant change of the Doppler factor combined with opacity effects leads to a strong dependence on the viewing angle of the variability pattern produced by such systems; specifically, the shape of the light curves and the characteristics of correlated emission. One implication is that objects oriented at relatively large viewing angles to the observer should exhibit a higher level of activity at high synchrotron frequencies (above the self-absorption frequency), and also at gamma-ray energies below the threshold energy of pair production, than at lower (radio/millimetre) frequencies.
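The viewing-angle sensitivity described here enters through the relativistic Doppler factor, delta = 1/(Gamma*(1 - beta*cos(theta))); a small sketch with illustrative values shows how steeply delta falls off the jet axis:

```python
import math

def doppler_factor(gamma: float, theta_deg: float) -> float:
    """Relativistic Doppler factor delta = 1 / (Gamma * (1 - beta*cos(theta)))
    for a flow with bulk Lorentz factor Gamma viewed at angle theta."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return 1.0 / (gamma * (1.0 - beta * math.cos(math.radians(theta_deg))))

# Observed flux scales steeply with delta (e.g. ~delta**3 for a moving blob),
# so even a modest change in viewing angle strongly reshapes the light curve.
aligned = doppler_factor(10.0, 2.0)     # nearly on-axis blazar
misaligned = doppler_factor(10.0, 20.0)
```

Gamma = 10 and the two angles are assumptions for illustration; the abstract's point is that radiative drag changes Gamma, and hence delta, differently for observers at different theta.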
Adjustable Bracket For Entry Of Welding Wire
NASA Technical Reports Server (NTRS)
Gilbert, Jeffrey L.; Gutow, David A.
1993-01-01
Wire-entry bracket on welding torch in robotic welding system provides for adjustment of angle of entry of welding wire over range of plus or minus 30 degrees from nominal entry angle. Wire positioned so it does not hide weld joint in view of through-the-torch computer-vision system part of robot-controlling and -monitoring system. Swiveling bracket also used on nonvision torch on which wire-feed-through tube interferes with workpiece. Angle simply changed to one giving sufficient clearance.
Complete 360° circumferential SSOCT gonioscopy of the iridocorneal angle
NASA Astrophysics Data System (ADS)
McNabb, Ryan P.; Kuo, Anthony N.; Izatt, Joseph A.
2014-02-01
The ocular iridocorneal angle is generally an optically inaccessible area when viewed directly through the cornea due to the high angle of incidence required and the large index of refraction difference between air and cornea (n_air = 1.000 and n_cornea = 1.376) resulting in total internal reflection. Gonioscopy allows for viewing of the angle by removing the air-cornea interface through the use of a special contact lens on the eye. Gonioscopy is used clinically to visualize the angle directly but only en face. Optical coherence tomography (OCT) has been used to image the angle and deeper structures via an external approach. Typically, this imaging technique is performed by utilizing a conventional anterior segment OCT scanning system. However, instead of imaging the apex of the cornea, either the scanner or the subject is tilted such that the corneoscleral limbus is orthogonal to the optical axis of the scanner, requiring multiple volumes to obtain complete circumferential coverage of the ocular angle. We developed a novel gonioscopic OCT (GOCT) system that images the entire ocular angle within a single volume via an "internal" approach through the use of a custom radially symmetric gonioscopic contact lens. We present, to our knowledge, the first complete 360° circumferential volumes of the iridocorneal angle from a direct, internal approach.
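The quoted indices fix the critical angle beyond which light from the iridocorneal angle is trapped by total internal reflection; a two-line check:

```python
import math

# Critical angle for total internal reflection at the cornea-air interface,
# using the indices quoted in the abstract (n_air = 1.000, n_cornea = 1.376).
n_air, n_cornea = 1.000, 1.376
theta_c_deg = math.degrees(math.asin(n_air / n_cornea))
# Rays from the angle strike the inner corneal surface beyond ~46.6 deg and
# are totally internally reflected; a gonioscopy lens removes the air-cornea
# interface so they can escape to the detector.
```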
Description of a landing site indicator (LASI) for light aircraft operation
NASA Technical Reports Server (NTRS)
Fuller, H. V.; Outlaw, B. K. E.
1976-01-01
An experimental cockpit mounted head-up type display system was developed and evaluated by LaRC pilots during the landing phase of light aircraft operations. The Landing Site Indicator (LASI) system display consists of angle of attack, angle of sideslip, and indicated airspeed images superimposed on the pilot's view through the windshield. The information is made visible to the pilot by means of a partially reflective viewing screen which is suspended directly in front of the pilot's eyes. Synchro transmitters are operated by vanes, located at the left wing tip, which sense angle of attack and sideslip angle. Information is presented near the center of the display in the form of a moving index on a fixed grid. The airspeed is sensed by a pitot-static pressure transducer and is presented in numerical form at the top center of the display.
Preferred viewing distance and screen angle of electronic paper displays.
Shieh, Kong-King; Lee, Der-Song
2007-09-01
This study explored the viewing distance and screen angle for electronic paper (E-Paper) displays under various light sources, ambient illuminations, and character sizes. Data analysis showed that the mean viewing distance and screen angle were 495 mm and 123.7 degrees. The mean viewing distance for the Kolin cholesteric liquid crystal display was 500 mm, significantly longer than for the Sony electronic ink display, 491 mm. The screen angle for Kolin was 127.4 degrees, significantly greater than that of Sony, 120.0 degrees. Various light sources revealed no significant effect on viewing distances; nevertheless, they showed a significant effect on screen angles. The screen angle for the sunlight lamp (D65) was similar to that of the fluorescent lamp (TL84), but greater than that of the tungsten lamp (F). Ambient illumination and E-Paper type had significant effects on viewing distance and screen angle. The higher the ambient illumination, the longer the viewing distance and the smaller the screen angle. Character size had a significant effect on viewing distances: the larger the character size, the longer the viewing distance. The results of this study indicated that the viewing distance for E-Paper was similar to that of a visual display terminal (VDT), at around 500 mm, but greater than for normal paper, at about 360 mm. The mean screen angle was around 123.7 degrees, which in terms of viewing angle is 29.5 degrees below horizontal eye level. This result is similar to the generally suggested viewing angle of between 20 degrees and 50 degrees below the horizontal line of sight.
Microwave Brightness Temperatures of Tilted Convective Systems
NASA Technical Reports Server (NTRS)
Hong, Ye; Haferman, Jeffrey L.; Olson, William S.; Kummerow, Christian D.
1998-01-01
Aircraft and ground-based radar data from the Tropical Ocean and Global Atmosphere Coupled-Ocean Atmosphere Response Experiment (TOGA COARE) show that convective systems are not always vertical. Instead, many are tilted from vertical. Satellite passive microwave radiometers observe the atmosphere at a viewing angle. For example, the Special Sensor Microwave/Imager (SSM/I) on Defense Meteorological Satellite Program (DMSP) satellites and the Tropical Rainfall Measurement Mission (TRMM) Microwave Imager (TMI) on the TRMM satellite have an incident angle of about 50 deg. Thus, the brightness temperature measured from one direction of tilt may be different than that viewed from the opposite direction due to the different optical depth. This paper presents the investigation of passive microwave brightness temperatures of tilted convective systems. To account for the effect of tilt, a 3-D backward Monte Carlo radiative transfer model has been applied to a simple tilted cloud model and a dynamically evolving cloud model to derive the brightness temperature. The radiative transfer results indicate that brightness temperature varies when the viewing angle changes because of the different optical depth. The tilt increases the displacement between the high 19 GHz brightness temperature (Tb_19) due to liquid emission from the lower level of the cloud and the low 85 GHz brightness temperature (Tb_85) due to ice scattering from the upper level of the cloud. As the resolution degrades, the difference in brightness temperature due to the change of viewing angle decreases dramatically. The dislocation between Tb_19 and Tb_85, however, remains prominent.
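The viewing-angle dependence enters through the slant optical depth, tau/cos(theta); a toy plane-parallel, non-scattering sketch (the temperatures and tau below are illustrative, not values from the Monte Carlo model):

```python
import math

def brightness_temp(tau_nadir: float, view_angle_deg: float,
                    t_cloud: float = 273.0, t_surface: float = 290.0) -> float:
    """Plane-parallel, non-scattering sketch: the slant optical depth grows
    as tau/cos(theta), so an off-nadir view sees more cloud emission and
    less surface emission."""
    tau_slant = tau_nadir / math.cos(math.radians(view_angle_deg))
    trans = math.exp(-tau_slant)
    return t_surface * trans + t_cloud * (1.0 - trans)

nadir = brightness_temp(0.5, 0.0)
oblique = brightness_temp(0.5, 50.0)  # SSM/I-like incidence angle
```

A tilted cloud breaks the symmetry: the slant path through the cloud differs between the two azimuthal look directions, which is the asymmetry the 3-D Monte Carlo model quantifies.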
Zheng, Yongbin; Chen, Huimin; Zhou, Zongtan
2018-05-23
The accurate angle measurement of objects outside the linear field of view (FOV) is a challenging task for a strapdown semi-active laser seeker and is not yet well resolved. Considering the fact that the strapdown semi-active laser seeker is equipped with GPS and an inertial navigation system (INS) on a missile, in this work, we present an angle measurement method based on the fusion of the seeker’s data and GPS and INS data for a strapdown semi-active laser seeker. When an object is in the nonlinear FOV or outside the FOV, by solving the problems of space consistency and time consistency, the pitch angle and yaw angle of the object can be calculated via the fusion of the last valid angles measured by the seeker and the corresponding GPS and INS data. The numerical simulation results demonstrate the correctness and effectiveness of the proposed method.
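The space-consistency step amounts to rotating a world-frame line of sight into the missile body frame using the INS attitude and then reading off pitch and yaw; a hedged sketch (the NED frame, Z-Y-X Euler convention, and function name are assumptions, not the paper's exact formulation):

```python
import math

def body_los_angles(target_ned, missile_ned, yaw, pitch, roll):
    """Rotate the world-frame (NED) line of sight into the body frame using
    INS attitude angles (radians, Z-Y-X convention), then extract the
    target's pitch and yaw as a seeker would report them. Illustrative."""
    dx, dy, dz = (t - m for t, m in zip(target_ned, missile_ned))
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # World -> body: apply yaw, then pitch, then roll (passive rotations).
    x1, y1, z1 = cy * dx + sy * dy, -sy * dx + cy * dy, dz
    x2, y2, z2 = cp * x1 - sp * z1, y1, sp * x1 + cp * z1
    xb, yb, zb = x2, cr * y2 + sr * z2, -sr * y2 + cr * z2
    los_yaw = math.atan2(yb, xb)
    los_pitch = math.atan2(-zb, math.hypot(xb, yb))
    return los_pitch, los_yaw

# Target dead ahead with zero attitude -> zero pitch and yaw.
p, y = body_los_angles((1000.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.0, 0.0, 0.0)
```

When the object leaves the seeker's linear FOV, the paper's method effectively holds the last valid seeker angles and propagates them with updates of this transform as GPS/INS data arrive.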
Bidirectional measurements of surface reflectance for view angle corrections of oblique imagery
NASA Technical Reports Server (NTRS)
Jackson, R. D.; Teillet, P. M.; Slater, P. N.; Fedosejevs, G.; Jasinski, Michael F.
1990-01-01
An apparatus for acquiring bidirectional reflectance-factor data was constructed and used over four surface types. Data sets were obtained over a headed wheat canopy, bare soil having several different roughness conditions, playa (dry lake bed), and gypsum sand. Results are presented in terms of relative bidirectional reflectance factors (BRFs) as a function of view angle at a number of solar zenith angles, nadir BRFs as a function of solar zenith angles, and, for wheat, vegetation indices as related to view and solar zenith angles. The wheat canopy exhibited the largest BRF changes with view angle. BRFs for the red and the NIR bands measured over wheat did not have the same relationship with view angle. NIR/Red ratios calculated from nadir BRFs changed by nearly a factor of 2 when the solar zenith angle changed from 20 to 50 degs. BRF versus view angle relationships were similar for soils having smooth and intermediate rough surfaces but were considerably different for the roughest surface. Nadir BRF versus solar-zenith angle relationships were distinctly different for the three soil roughness levels. Of the various surfaces, BRFs for gypsum sand changed the least with view angle (10 percent at 30 degs).
Color image generation for screen-scanning holographic display.
Takaki, Yasuhiro; Matsumoto, Yuji; Nakajima, Tatsumi
2015-10-19
Horizontally scanning holography using a microelectromechanical system spatial light modulator (MEMS-SLM) can provide reconstructed images with an enlarged screen size and an increased viewing zone angle. Herein, we propose techniques to enable color image generation for a screen-scanning display system employing a single MEMS-SLM. Higher-order diffraction components generated by the MEMS-SLM for R, G, and B laser lights were coupled by providing proper illumination angles on the MEMS-SLM for each color. An error diffusion technique to binarize the hologram patterns was developed, in which the error diffusion directions were determined for each color. Color reconstructed images with a screen size of 6.2 in. and a viewing zone angle of 10.2° were generated at a frame rate of 30 Hz.
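The paper's binarization is an error-diffusion variant with per-color diffusion directions; as a baseline, classic Floyd-Steinberg error diffusion looks like this (pure-Python sketch, not the authors' exact algorithm):

```python
def binarize_error_diffusion(img):
    """Floyd-Steinberg error diffusion: binarize a 2D grayscale image
    (values in [0, 1]) while pushing each pixel's quantization error onto
    its unprocessed neighbors, preserving local average intensity. For a
    binary hologram, "1" corresponds to an open MEMS-SLM mirror pixel."""
    h, w = len(img), len(img[0])
    buf = [row[:] for row in img]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = 1 if old >= 0.5 else 0
            out[y][x] = new
            err = old - new
            # Standard 7/16, 3/16, 5/16, 1/16 weights to the right and below.
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return out

halftone = binarize_error_diffusion([[0.5] * 8 for _ in range(8)])
```

A constant 0.5 input comes out as a roughly half-on binary pattern, which is why diffusion preserves the hologram's low-frequency content while quantizing it.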
Study of the retardance of a birefringent waveplate at tilt incidence by Mueller matrix ellipsometer
NASA Astrophysics Data System (ADS)
Gu, Honggang; Chen, Xiuguo; Zhang, Chuanwei; Jiang, Hao; Liu, Shiyuan
2018-01-01
Birefringent waveplates are indispensable optical elements for polarization state modification in various optical systems. The retardance of a birefringent waveplate will change significantly when the incident angle of the light varies. Therefore, it is of great importance to study such field-of-view errors on the polarization properties, especially the retardance of a birefringent waveplate, for the performance improvement of the system. In this paper, we propose a generalized retardance formula at arbitrary incidence and azimuth for a general plane-parallel composite waveplate consisting of multiple aligned single waveplates. An efficient method and corresponding experimental set-up have been developed to characterize the retardance versus the field-of-view angle based on a constructed spectroscopic Mueller matrix ellipsometer. Both simulations and experiments on an MgF2 biplate over an incident angle of 0°-8° and an azimuthal angle of 0°-360° are presented as an example, and the dominant experimental errors are discussed and corrected. The experimental results strongly agree with the simulations with a maximum difference of 0.15° over the entire field of view, which indicates the validity and great potential of the presented method for birefringent waveplate characterization at tilt incidence.
Yamashita, Wakayo; Wang, Gang; Tanaka, Keiji
2010-01-01
One usually fails to recognize an unfamiliar object across changes in viewing angle when it has to be discriminated from similar distractor objects. Previous work has demonstrated that after long-term experience in discriminating among a set of objects seen from the same viewing angle, immediate recognition of the objects across 30-60 degrees changes in viewing angle becomes possible. The capability for view-invariant object recognition should develop during the within-viewing-angle discrimination, which includes two kinds of experience: seeing individual views and discriminating among the objects. The aim of the present study was to determine the relative contribution of each factor to the development of view-invariant object recognition capability. Monkeys were first extensively trained in a task that required view-invariant object recognition (Object task) with several sets of objects. The animals were then exposed to a new set of objects over 26 days in one of two preparatory tasks: one in which each object view was seen individually, and a second that required discrimination among the objects at each of four viewing angles. After the preparatory period, we measured the monkeys' ability to recognize the objects across changes in viewing angle, by introducing the object set to the Object task. Results indicated significant view-invariant recognition after the second but not first preparatory task. These results suggest that discrimination of objects from distractors at each of several viewing angles is required for the development of view-invariant recognition of the objects when the distractors are similar to the objects.
Dual-view-zone tabletop 3D display system based on integral imaging.
He, Min-Yang; Zhang, Han-Le; Deng, Huan; Li, Xiao-Wei; Li, Da-Hai; Wang, Qiong-Hua
2018-02-01
In this paper, we propose a dual-view-zone tabletop 3D display system based on integral imaging by using a multiplexed holographic optical element (MHOE) that has the optical properties of two sets of microlens arrays. The MHOE is recorded by a reference beam using the single-exposure method. The reference beam records the wavefronts of a microlens array from two different directions. Thus, when the display beam is projected on the MHOE, two wavefronts with the different directions will be rebuilt and the 3D virtual images can be reconstructed in two viewing zones. The MHOE has angle and wavelength selectivity. Under the conditions of the matched wavelength and the angle of the display beam, the diffraction efficiency of the MHOE is greatest. Because the unmatched light just passes through the MHOE, the MHOE has the advantage of a see-through display. The experimental results confirm the feasibility of the dual-view-zone tabletop 3D display system.
Expansion of the visual angle of a car rear-view image via an image mosaic algorithm
NASA Astrophysics Data System (ADS)
Wu, Zhuangwen; Zhu, Liangrong; Sun, Xincheng
2015-05-01
The rear-view image system is one of the active safety devices in cars and is widely applied in all types of vehicles and traffic safety areas. However, previous studies by both domestic and foreign researchers were based on a single image-capture device used while reversing, so a blind area still remained for drivers. Even when multiple cameras were used to expand the visual angle of the car's rear-view image, the blind area remained because the different source images were not mosaicked together. To acquire an expanded visual angle of a car rear-view image, two charge-coupled device cameras with optical axes angled at 30 deg were mounted below the left and right fenders of a car in three light conditions (sunny outdoors, cloudy outdoors, and an underground garage) to capture rear-view heterologous images of the car. These rear-view heterologous images were then rapidly registered through the scale invariant feature transform (SIFT) algorithm. Combined with the random sample consensus (RANSAC) algorithm, the two heterologous images were finally mosaicked using the linear weighted gradated in-and-out fusion algorithm, and a seamless, visual-angle-expanded rear-view image was acquired. The four-index test results showed that the algorithms can mosaic rear-view images well in the underground garage condition, where the average rate of correct matching was the lowest among the three conditions. The rear-view image mosaic algorithm presented had the best information preservation, the shortest computation time, and the most complete preservation of image detail features compared to the mean value method (MVM) and segmental fusion method (SFM); it also performed better in real time, provided more comprehensive image details, and had the most complete image preservation from source images among the three algorithms.
The method introduced by this paper provides the basis for researching the expansion of the visual angle of a car rear-view image in all-weather conditions.
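The final fusion step named here, linear weighted gradated in-and-out blending, simply cross-fades the two registered images across their overlap; a one-row sketch (array values and the weight ramp are illustrative):

```python
def blend_overlap(left_row, right_row, overlap):
    """Linear weighted 'gradated in-and-out' fusion of two image rows that
    share `overlap` pixels: the left image's weight ramps 1 -> 0 across the
    overlap while the right image's ramps 0 -> 1, hiding the seam."""
    out = left_row[:-overlap] if overlap else left_row[:]
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # right-image weight grows across the seam
        out.append((1 - w) * left_row[len(left_row) - overlap + i]
                   + w * right_row[i])
    out.extend(right_row[overlap:])
    return out

# Two constant rows (100 and 200) joined over a 4-pixel overlap: values ramp
# smoothly from 100 to 200 instead of jumping at the seam.
row = blend_overlap([100.0] * 6, [200.0] * 6, 4)
```

In the full pipeline this runs after SIFT matching and RANSAC have estimated the homography that aligns the two heterologous images.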
Flow visualization and characterization of evaporating liquid drops
NASA Technical Reports Server (NTRS)
Chao, David F. (Inventor); Zhang, Nengli (Inventor)
2004-01-01
An optical system, consisting of drop-reflection imaging, reflection-refracted shadowgraphy and top-view photography, is used to measure the spreading and instant dynamic contact angle of a volatile-liquid drop on a non-transparent substrate. The drop-reflection image and the shadowgraph are shown by projecting the images of a collimated laser beam partially reflected by the drop and partially passing through the drop onto a screen, while the top-view photograph is separately viewed by use of a camera, video recorder and monitor. For a transparent liquid on a reflective solid surface, thermocapillary convection in the drop, induced by evaporation, can be viewed nonintrusively, and the drop real-time profile data are synchronously recorded by video recording systems. Experimental results obtained from this technique clearly reveal that evaporation and thermocapillary convection greatly affect the spreading process and the characteristics of the dynamic contact angle of the drop.
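Once the drop profile is recovered, the contact angle of a small sessile drop is often estimated from the spherical-cap relation theta = 2*atan(h/r); this approximation is my illustration of the geometry, not the patent's optical derivation:

```python
import math

def contact_angle_deg(cap_height: float, base_radius: float) -> float:
    """Contact angle of a sessile drop treated as a spherical cap:
    theta = 2 * atan(h / r), with h the apex height and r the contact
    (base) radius. Valid when gravity flattening is negligible
    (drop much smaller than the capillary length)."""
    return math.degrees(2.0 * math.atan(cap_height / base_radius))

theta = contact_angle_deg(0.5, 1.0)  # e.g. h = 0.5 mm, r = 1.0 mm
```

A hemispherical drop (h = r) gives 90 degrees, the usual sanity check for this formula.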
Interactive stereo electron microscopy enhanced with virtual reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E.Wes; Bastacky, S.Jacob; Schwartz, Kenneth S.
2001-12-17
An analytical system is presented that is used to take measurements of objects perceived in stereo image pairs obtained from a scanning electron microscope (SEM). Our system operates by presenting a single stereo view that contains stereo image data obtained from the SEM, along with geometric representations of two types of virtual measurement instruments, a "protractor" and a "caliper". The measurements obtained from this system are an integral part of a medical study evaluating surfactant, a liquid coating the inner surface of the lung which makes possible the process of breathing. Measurements of the curvature and contact angle of submicron diameter droplets of a fluorocarbon deposited on the surface of airways are performed in order to determine surface tension of the air/liquid interface. This approach has been extended to a microscopic level from the techniques of traditional surface science by measuring submicrometer rather than millimeter diameter droplets, as well as the lengths and curvature of cilia responsible for movement of the surfactant, the airway's protective liquid blanket. An earlier implementation of this approach for taking angle measurements from objects perceived in stereo image pairs using a virtual protractor is extended in this paper to include distance measurements and to use a unified view model. The system is built around a unified view model that is derived from microscope-specific parameters, such as focal length, visible area and magnification. The unified view model ensures that the underlying view models and resultant binocular parallax cues are consistent between synthetic and acquired imagery. When the view models are consistent, it is possible to take measurements of features that are not constrained to lie within the projection plane. The system is first calibrated using non-clinical data of known size and resolution.
Using the SEM, stereo image pairs of grids and spheres of known resolution are created to calibrate the measurement system. After calibration, the system is used to take distance and angle measurements of clinical specimens.
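Off-plane distance measurement from an SEM stereo pair classically rests on the parallax relation h = p / (2*sin(alpha/2)); a hedged sketch of that standard photogrammetry formula (the numbers are illustrative, and this paper's unified view model generalizes beyond it):

```python
import math

def height_from_parallax(parallax: float, tilt_deg: float) -> float:
    """Classic SEM stereo-photogrammetry relation: a feature's height above
    the projection plane is h = p / (2 * sin(alpha / 2)), where p is the
    parallax measured between the two images of a eucentric-tilt stereo pair
    and alpha is the total tilt angle between them."""
    return parallax / (2.0 * math.sin(math.radians(tilt_deg) / 2.0))

h = height_from_parallax(0.10, 6.0)  # 0.10 um of parallax at 6 deg tilt
```

The formula makes concrete why consistent view models matter: any mismatch between the synthetic instrument's parallax cues and the acquired pair's tilt geometry corrupts h directly.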
New three-dimensional visualization system based on angular image differentiation
NASA Astrophysics Data System (ADS)
Montes, Juan D.; Campoy, Pascual
1995-03-01
This paper presents a new auto-stereoscopic system capable of reproducing static or moving 3D images by projection with horizontal parallax or with horizontal and vertical parallaxes. The working principle is based on the angular differentiation of the images, which are projected onto the back side of the new patented screen. The most important features of this new system are: (1) Images can be seen by the naked eye, without the use of glasses or any other aid. (2) The 3D view angle is not restricted by the angle of the optics making up the screen. (3) Fine tuning is not necessary, independently of the parallax and of the size of the 3D view angle. (4) Coherent light is not necessary either in capturing the image or in its reproduction; standard cameras and projectors suffice. (5) Since the images are projected, the size and depth of the reproduced scene are unrestricted. (6) Manufacturing cost is not excessive, due to the use of optics of large focal length, to the lack of fine tuning, and to the use of the same screen for several reproduction systems. (7) This technology can be used for any projection system: slides, movies, TV projectors, etc. A first prototype for static images has been developed and tested, with a 3D view angle of 90 degrees and a photographic resolution over a planar screen of 900 mm diagonal length. Present developments have achieved a dramatic size reduction of the projecting system and of its cost. In parallel, a prototype for 3D moving images is under development.
Nguyen, Dorothy; Vedamurthy, Indu; Schor, Clifton
2008-03-01
Accommodation and convergence systems are cross-coupled so that stimulation of one system produces responses by both systems. Ideally, the cross-coupled responses of accommodation and convergence match their respective stimuli. When expressed in diopters and meter angles, respectively, stimuli for accommodation and convergence are equal in the mid-sagittal plane when viewed with symmetrical convergence, where historically, the gains of the cross coupling (AC/A and CA/C ratios) have been quantified. However, targets at non-zero azimuth angles, when viewed with asymmetric convergence, present unequal stimuli for accommodation and convergence. Are the cross-links between the two systems calibrated to compensate for stimulus mismatches that increase with gaze-azimuth? We measured the response AC/A and stimulus CA/C ratios at zero azimuth, 17.5 and 30 deg of rightward gaze eccentricities with a Badal Optometer and Wheatstone-mirror haploscope. AC/A ratios were measured under open-loop convergence conditions along the iso-accommodation circle (locus of points that stimulate approximately equal amounts of accommodation to the two eyes at all azimuth angles). CA/C ratios were measured under open-loop accommodation conditions along the iso-vergence circle (locus of points that stimulate constant convergence at all azimuth angles). Our results show that the gain of accommodative-convergence (AC/A ratio) decreased and the bias of convergence-accommodation increased at the 30 deg gaze eccentricity. These changes are in directions that compensate for stimulus mismatches caused by spatial-viewing geometry during asymmetric convergence.
Analysis of the restricting factors of laser countermeasure active detection technology
NASA Astrophysics Data System (ADS)
Zhang, Yufa; Sun, Xiaoquan
2016-07-01
The detection performance of a laser active detection system is affected by many factors. In view of the application requirements of laser active detection, these influencing factors are analyzed. A mathematical model of cat-eye target detection distance is built, and the influence of the laser detection system parameters and the environment on detection range and detection efficiency is analyzed. The constraints that the various parameters place on detection performance are simulated. The results show that the discovery distance of laser active detection is affected by the laser divergence angle, the incident angle, and the atmospheric visibility. For a given detection range, the laser divergence angle and the detection efficiency are mutually constrained. Therefore, for a specific application environment, appropriate laser detection parameters must be selected to achieve the optimal detection effect.
On-orbit Characterization of RVS for MODIS Thermal Emissive Bands
NASA Technical Reports Server (NTRS)
Xiong, X.; Salomonson, V.; Chiang, K.; Wu, A.; Guenther, B.; Barnes, W.
2004-01-01
Response versus scan angle (RVS) is a key calibration parameter for remote sensing radiometers that make observations using a scanning optical system, such as a scan mirror in MODIS and GLI or a rotating telescope in SeaWiFS and VIIRS, since the calibration is typically performed at a fixed viewing angle while the Earth scene observations are made over a range of viewing angles. Terra MODIS has been in operation for more than four years since its launch in December 1999. It has 36 spectral bands covering the spectral range from visible (VIS) to long-wave infrared (LWIR). It is a cross-track scanning radiometer using a two-sided paddle wheel scan mirror, making observations over a wide field of view (FOV) of ±55° from the instrument nadir. This paper describes on-orbit characterization of MODIS RVS for its thermal emissive bands (TEB), using the Earth view data collected during Terra spacecraft deep space maneuvers (DSM). Comparisons with pre-launch analysis and early on-orbit measurements are also provided.
Outer planets mission television subsystem optics study
NASA Technical Reports Server (NTRS)
1972-01-01
An optics study was performed to establish a candidate optical system design for the proposed NASA Mariner Jupiter/Saturn 77 mission. The study was performed over the 6-month period from January through June 1972. The candidate optical system contains both a wide angle (A) and a narrow angle (B) lens. An additional feature is a transfer mirror mechanism that allows image transfer from the B lens to the vidicon initially used for the A lens. This feature adds an operational redundancy to the optical system in allowing for narrow angle viewing if the narrow angle vidicon were to fail. In this failure mode, photography in the wide angle mode would be discontinued. The structure of the candidate system consists mainly of aluminum with substructures of Invar for athermalization. The total optical system weighs (excluding vidicons) approximately 30 pounds and has overall dimensions of 26.6 by 19.5 by 12.3 inches.
Jiao, Leizi; Dong, Daming; Zhao, Xiande; Han, Pengcheng
2016-12-01
In this study, we propose an animal surface temperature measurement method based on a Kinect sensor and an infrared thermal imager to facilitate the screening of animals with febrile diseases. Because animals move randomly and their surface temperature variations are small, the influence of the angle of view on temperature measurement is significant. The proposed method compensates for the temperature measurement error caused by the angle of view. First, we analyzed the relationship between measured temperature and angle of view and established a mathematical model compensating for the influence of the angle of view, with a correlation coefficient above 0.99. Second, a fusion method for depth and infrared thermal images was established for synchronous image capture with the Kinect sensor and the infrared thermal imager, and the angle of view of each pixel was calculated. According to the experimental results, without compensation, the temperature image measured at angles of view of 74° to 76° differed by more than 2°C from that measured at an angle of view of 0°. After compensation, the temperature difference was only 0.03-1.2°C. This method is applicable to real-time compensation of errors caused by the angle of view during temperature measurement with an infrared thermal imager. Copyright © 2016 Elsevier Ltd. All rights reserved.
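The compensation idea in this abstract can be sketched numerically. The paper does not publish the form of its angle model, so the synthetic directional falloff, the 38°C surface temperature, and the cubic polynomial fit below are all illustrative assumptions, not the authors' method:

```python
import numpy as np

# Sketch of angle-of-view compensation for IR temperature readings.
# Calibration step: measure (or here, synthesize) temperature vs. angle
# and fit the angle-dependent ratio f(theta) = T_meas / T_true.
angles = np.deg2rad(np.arange(0, 80, 5))                  # calibration angles
t_true = 38.0                                             # assumed true temp, deg C
t_meas = t_true * (1.0 - 0.15 * (1.0 - np.cos(angles)))   # synthetic readings

coeffs = np.polyfit(angles, t_meas / t_true, deg=3)       # fitted model f(theta)

def compensate(t_measured, theta_rad):
    """Invert the fitted model: T_corrected = T_measured / f(theta)."""
    return t_measured / np.polyval(coeffs, theta_rad)
```

With per-pixel viewing angles from the depth image, the same inversion could be applied pixel-wise to a whole thermal frame.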
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y; Yin, F; Ren, L
Purpose: To develop an adaptive prior-knowledge-based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam together with orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scan angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired over a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and BEV MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. The volume percentage difference (VPD) and center-of-mass shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively.
Conclusion: The technique enables LIVE to accurately reconstruct 4D-CBCT images using only an orthogonal 6° scan angle, which greatly improves the efficiency and reduces the imaging dose of LIVE for intrafraction verification.
Dual-mode switching of a liquid crystal panel for viewing angle control
NASA Astrophysics Data System (ADS)
Baek, Jong-In; Kwon, Yong-Hoan; Kim, Jae Chang; Yoon, Tae-Hoon
2007-03-01
The authors propose a method to control the viewing angle of a liquid crystal (LC) panel using dual-mode switching. To realize both wide viewing angle (WVA) characteristics and narrow viewing angle (NVA) characteristics with a single LC panel, the authors use two different dark states. The LC layer can be aligned homogeneously parallel to the transmission axis of the bottom polarizer for WVA dark state operation, while it can be aligned vertically for NVA dark state operation. The authors demonstrated that viewing angle control can be achieved with a single panel without any loss of contrast at the front.
Okamura, Jun-ya; Yamaguchi, Reona; Honda, Kazunari; Tanaka, Keiji
2014-01-01
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops the partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn bases the monkeys' emergent capability to discriminate the objects across changes in viewing angle. PMID:25378169
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomassen, K I
The SSPX thermistor is a glass-encapsulated bead thermistor made by Thermometrics, a BR 14 P A 103 J. BR means ruggedized bead structure, 14 is the nominal bead diameter in mils, P refers to opposite-end leads, A is the material system code letter, 103 refers to its 10 kΩ zero-power resistance at 25 C, and the tolerance letter J indicates ±5% at 25 C. It is football shaped, with height ℓ, and is viewed through a slot of height h = 0.01 inches. The slot is perpendicular to the long axis of the bead and is a distance s ≈ 0.775 cm in front of the thermistor. The plasma is therefore viewed over a large angle along the slot, but over a small angle α perpendicular to the slot. The angle α is given by 2s tan α = ℓ + h.
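As a quick numerical illustration, the viewing half-angle implied by the slot geometry (2s tan α = bead height + h) can be evaluated directly. The bead height is not stated in the record, so the 14 mil nominal bead diameter is assumed below as a stand-in:

```python
import math

# Slot geometry from the record: 2*s*tan(alpha) = bead_h + h, where s is
# the slot-to-bead distance and h the slot height. The bead height is an
# assumption (14 mil nominal bead diameter), not a value from the record.
MIL_TO_CM = 2.54e-3            # 1 mil = 0.001 inch
s = 0.775                      # slot-to-thermistor distance, cm
h = 0.01 * 2.54                # slot height: 0.01 inch in cm
bead_h = 14 * MIL_TO_CM        # assumed bead height, cm

alpha = math.atan((bead_h + h) / (2.0 * s))   # viewing half-angle, radians
alpha_deg = math.degrees(alpha)               # on the order of a few degrees
```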
Polarimetric Imaging for the Detection of Disturbed Surfaces
2009-06-01
[List-of-figures fragment: Figure 4, Rayleigh roughness criterion as a function of incident angle; Figure 5, definition of geometrical terms (after Egan & Hallock, 1966); Figure 6, Haleakala ash depolarization at 0° and 60° viewing angles (from Egan et al., 1968); Figure 7, basalt depolarization at 0° viewing angle and ...]
Inventory and monitoring of natural vegetation and related resources in an arid environment
NASA Technical Reports Server (NTRS)
Schrumpf, B. J. (Principal Investigator); Johnson, J. R.; Mouat, D. A.
1973-01-01
The author has identified the following significant results. A vegetation classification has been established for the test site (approx. 8300 sq km); 31 types are recognized. Some relationships existing among vegetation types and associated terrain features have been characterized. Terrain features can be used to discriminate vegetation types. Macrorelief interpretations on ERTS-1 imagery can be performed with greater accuracy when using high sun angle stereoscopic viewing rather than low sun angle monoscopic viewing. Some plant phenological changes are being recorded by the MSS system.
Scheduling Randomly-Deployed Heterogeneous Video Sensor Nodes for Reduced Intrusion Detection Time
NASA Astrophysics Data System (ADS)
Pham, Congduc
This paper proposes to use video sensor nodes to provide an efficient intrusion detection system. We use a scheduling mechanism that takes into account the criticality of the surveillance application and present a performance study of various cover set construction strategies that take into account cameras with heterogeneous angle of view and those with very small angle of view. We show by simulation how a dynamic criticality management scheme can provide fast event detection for mission-critical surveillance applications by increasing the network lifetime and providing low stealth time of intrusions.
Upper wide-angle viewing system for ITER.
Lasnier, C J; McLean, A G; Gattuso, A; O'Neill, R; Smiley, M; Vasquez, J; Feder, R; Smith, M; Stratton, B; Johnson, D; Verlaan, A L; Heijmans, J A C
2016-11-01
The Upper Wide Angle Viewing System (UWAVS) will be installed on five upper ports of ITER. This paper shows major requirements, gives an overview of the preliminary design with reasons for some design choices, examines self-emitted IR light from UWAVS optics and its effect on accuracy, and shows calculations of signal-to-noise ratios for the two-color temperature output as a function of integration time and divertor temperature. Accurate temperature output requires correction for vacuum window absorption vs. wavelength and for self-emitted IR, which requires good measurement of the temperature of the optical components. The anticipated signal-to-noise ratio using presently available IR cameras is adequate for the required 500 Hz frame rate.
NASA Astrophysics Data System (ADS)
Teng, Dongdong; Liu, Lilin; Zhang, Yueli; Pang, Zhiyong; Wang, Biao
2014-09-01
Through the creative use of a shiftable cylindrical lens, a wide-viewing-angle holographic display system is developed for displaying medical objects in real three-dimensional (3D) space, based on a time-multiplexing method. The two-dimensional (2D) source images for all the computer-generated holograms (CGHs) needed by the display system are just one group of computerized tomography (CT) or magnetic resonance imaging (MRI) slices from the scanning device; complicated 3D reconstruction on the computer is not necessary. A pelvis is taken as the target medical object to demonstrate the method, and the obtained horizontal viewing angle reaches 28°.
On techniques for angle compensation in nonideal iris recognition.
Schuckers, Stephanie A C; Schmid, Natalia A; Abhyankar, Aditya; Dorairaj, Vivekanand; Boyce, Christopher K; Hornak, Lawrence A
2007-10-01
The popularity of the iris biometric has grown considerably over the past two to three years. Most research has been focused on the development of new iris processing and recognition algorithms for frontal view iris images. However, a few challenging directions in iris research have been identified, including processing of a nonideal iris and iris at a distance. In this paper, we describe two nonideal iris recognition systems and analyze their performance. The word "nonideal" is used in the sense of compensating for off-angle occluded iris images. The system is designed to process nonideal iris images in two steps: 1) compensation for off-angle gaze direction and 2) processing and encoding of the rotated iris image. Two approaches are presented to account for angular variations in the iris images. In the first approach, we use Daugman's integrodifferential operator as an objective function to estimate the gaze direction. After the angle is estimated, the off-angle iris image undergoes geometric transformations involving the estimated angle and is further processed as if it were a frontal view image. The encoding technique developed for a frontal image is based on the application of the global independent component analysis. The second approach uses an angular deformation calibration model. The angular deformations are modeled, and calibration parameters are calculated. The proposed method consists of a closed-form solution, followed by an iterative optimization procedure. The images are projected on the plane closest to the base calibrated plane. Biorthogonal wavelets are used for encoding to perform iris recognition. We use a special dataset of the off-angle iris images to quantify the performance of the designed systems. A series of receiver operating characteristics demonstrate various effects on the performance of the nonideal-iris-based recognition system.
Kim, Hwi; Hahn, Joonku; Choi, Hee-Jin
2011-04-10
We investigate the viewing angle enhancement of a lenticular three-dimensional (3D) display with a triplet lens array. The theoretical limitations of the viewing angle and view number of the lenticular 3D display with the triplet lens array are analyzed numerically. For this, the genetic-algorithm-based design method of the triplet lens is developed. We show that a lenticular 3D display with viewing angle of 120° and 144 views without interview cross talk can be realized with the use of an optimally designed triplet lens array. © 2011 Optical Society of America
Voyager spacecraft images of Jupiter and Saturn
NASA Technical Reports Server (NTRS)
Birnbaum, M. M.
1982-01-01
The Voyager imaging system is described, noting that it is made up of a narrow-angle and a wide-angle TV camera, each in turn consisting of optics, a filter wheel and shutter assembly, a vidicon tube, and an electronics subsystem. The narrow-angle camera has a focal length of 1500 mm; its field of view is 0.42 deg and its focal ratio is f/8.5. For the wide-angle camera, the focal length is 200 mm, the field of view 3.2 deg, and the focal ratio of f/3.5. Images are exposed by each camera through one of eight filters in the filter wheel on the photoconductive surface of a magnetically focused and deflected vidicon having a diameter of 25 mm. The vidicon storage surface (target) is a selenium-sulfur film having an active area of 11.14 x 11.14 mm; it holds a frame consisting of 800 lines with 800 picture elements per line. Pictures of Jupiter, Saturn, and their moons are presented, with short descriptions given of the area being viewed.
Efficient fabrication method of nano-grating for 3D holographic display with full parallax views.
Wan, Wenqiang; Qiao, Wen; Huang, Wenbin; Zhu, Ming; Fang, Zongbao; Pu, Donglin; Ye, Yan; Liu, Yanhua; Chen, Linsen
2016-03-21
Without any special glasses, multiview 3D displays based on diffractive optics can present high-resolution, full-parallax 3D images over an ultra-wide viewing angle. The enabling optical component, namely the phase plate, can produce arbitrarily distributed view zones by carefully designing the orientation and the period of each nano-grating pixel. However, such 3D display screens have been restricted to a limited size by the time-consuming process of fabricating the nano-gratings on the phase plate. In this paper, we proposed and developed a lithography system that can fabricate the phase plate efficiently. We made two phase plates with full nano-grating pixel coverage at a speed of 20 mm²/min, a 500-fold increase in efficiency compared with E-beam lithography. One 2.5-inch phase plate generated 9-view 3D images with horizontal parallax, while the other 6-inch phase plate produced 64-view 3D images with full parallax. The angular divergence in the horizontal and vertical axes was 1.5 degrees and 1.25 degrees, respectively, slightly larger than the value of 1.2 degrees simulated by the finite-difference time-domain (FDTD) method. The intensity variation was less than 10% for each viewpoint, consistent with the simulation results. On top of each phase plate, a high-resolution binary masking pattern containing the amplitude information of all viewing zones was well aligned. We achieved a resolution of 400 pixels/inch and a viewing angle of 40 degrees for 9-view 3D images with horizontal parallax. In another prototype, the resolution of each view was 160 pixels/inch and the view angle was 50 degrees for 64-view 3D images with full parallax. As demonstrated in the experiments, the homemade lithography system provides the key fabrication technology for multiview 3D holographic displays.
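The view-steering role of the nano-grating pixels can be illustrated with the scalar grating equation for first-order diffraction at normal incidence: the period sets the deflection (elevation) angle and the in-plane orientation sets the azimuth. The wavelength and the normal-incidence assumption below are ours, not parameters from the paper:

```python
import math

# Illustrative first-order grating design at normal incidence:
# sin(theta) = wavelength / period  =>  period = wavelength / sin(theta).
WAVELENGTH = 532e-9  # assumed green illumination, metres

def grating_for_view(elevation_deg, azimuth_deg):
    """Return (period_m, orientation_deg) that steer first-order light
    toward the requested view direction (sketch only)."""
    theta = math.radians(elevation_deg)
    if not 0.0 < theta < math.pi / 2:
        raise ValueError("elevation must lie in (0, 90) degrees")
    period = WAVELENGTH / math.sin(theta)
    return period, azimuth_deg % 360.0
```

A phase plate would evaluate something like this once per pixel, mapping each pixel to its assigned view zone.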
Jo, Jaehyuck; Moon, Byung Gil; Lee, Joo Yong
2017-12-01
To report the outcome of scleral buckling using a non-contact wide-angle viewing system with a 25-gauge chandelier endoilluminator. Retrospective analyses of medical records were performed for 17 eyes of 16 patients with primary rhegmatogenous retinal detachment (RRD) without proliferative vitreoretinopathy who had undergone conventional scleral buckling with cryoretinopexy using the combination of a non-contact wide-angle viewing system and chandelier endoillumination. The patients were eight males and five females with a mean age of 26.8 ± 10.2 (range, 11 to 47) years. The mean follow-up period was 7.3 ± 3.1 months. Baseline best-corrected visual acuity was 0.23 ± 0.28 logarithm of the minimum angle of resolution units. Best-corrected visual acuity at the final visit showed improvement (0.20 ± 0.25 logarithm of the minimum angle of resolution units), but the improvement was not statistically significant (p = 0.722). As a surgery-related complication, there was vitreous loss at the end of surgery in one eye. As a postoperative complication, increased intraocular pressure (four cases) and herpes simplex epithelial keratitis (one case) were controlled postoperatively with eye drops. One case of persistent RRD after primary surgery needed additional vitrectomy, and the retina was postoperatively attached. Scleral buckling with chandelier illumination as a surgical technique for RRD has the advantages of relieving the surgeon's neck pain from prolonged use of the indirect ophthalmoscope and sharing the surgical procedure with another surgical team member. In addition, fine retinal breaks that are hard to identify using an indirect ophthalmoscope can be easily found under the microscope by direct endoillumination. © 2017 The Korean Ophthalmological Society
Okamura, Jun-Ya; Yamaguchi, Reona; Honda, Kazunari; Wang, Gang; Tanaka, Keiji
2014-11-05
One fails to recognize an unfamiliar object across changes in viewing angle when it must be discriminated from similar distractor objects. View-invariant recognition gradually develops as the viewer repeatedly sees the objects in rotation. It is assumed that different views of each object are associated with one another while their successive appearance is experienced in rotation. However, natural experience of objects also contains ample opportunities to discriminate among objects at each of the multiple viewing angles. Our previous behavioral experiments showed that after experiencing a new set of object stimuli during a task that required only discrimination at each of four viewing angles at 30° intervals, monkeys could recognize the objects across changes in viewing angle up to 60°. By recording activities of neurons from the inferotemporal cortex after various types of preparatory experience, we here found a possible neural substrate for the monkeys' performance. For object sets that the monkeys had experienced during the task that required only discrimination at each of four viewing angles, many inferotemporal neurons showed object selectivity covering multiple views. The degree of view generalization found for these object sets was similar to that found for stimulus sets with which the monkeys had been trained to conduct view-invariant recognition. These results suggest that the experience of discriminating new objects in each of several viewing angles develops the partially view-generalized object selectivity distributed over many neurons in the inferotemporal cortex, which in turn bases the monkeys' emergent capability to discriminate the objects across changes in viewing angle. Copyright © 2014 the authors 0270-6474/14/3415047-13$15.00/0.
McNabb, Ryan P.; Challa, Pratap; Kuo, Anthony N.; Izatt, Joseph A.
2015-01-01
Clinically, gonioscopy is used to provide en face views of the ocular angle. The angle has been imaged with optical coherence tomography (OCT) through the corneoscleral limbus, but OCT has not previously been able to image the angle from within the ocular anterior chamber. We developed a novel gonioscopic OCT system that images the angle circumferentially from inside the eye through a custom, radially symmetric gonioscopic contact lens. We present, to our knowledge, the first 360° circumferential volumes (two normal subjects, two subjects with pathology) of peripheral iris and iridocorneal angle structures obtained via an internal approach not typically available in the clinic. PMID:25909021
Scalable screen-size enlargement by multi-channel viewing-zone scanning holography.
Takaki, Yasuhiro; Nakaoka, Mitsuki
2016-08-08
Viewing-zone scanning holographic displays can enlarge both the screen size and the viewing zone. However, limitations exist in the screen size enlargement process even if the viewing zone is effectively enlarged. This study proposes a multi-channel viewing-zone scanning holographic display comprising multiple projection systems and a planar scanner to enable the scalable enlargement of the screen size. Each projection system produces an enlarged image of the screen of a MEMS spatial light modulator. The multiple enlarged images produced by the multiple projection systems are seamlessly tiled on the planar scanner. This screen size enlargement process reduces the viewing zones of the projection systems, which are horizontally scanned by the planar scanner comprising a rotating off-axis lens and a vertical diffuser to enlarge the viewing zone. A screen size of 7.4 in. and a viewing-zone angle of 43.0° are demonstrated.
Improved flight-simulator viewing lens
NASA Technical Reports Server (NTRS)
Kahlbaum, W. M.
1979-01-01
Triplet lens system uses two acrylic plastic double-convex lenses and one polystyrene plastic single convex lens to reduce chromatic distortion and lateral aberration, especially at large field angles within in-line systems of flight simulators.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D.
Lasnier, C J; Allen, S L; Ellis, R E; Fenstermacher, M E; McLean, A G; Meyer, W H; Morris, K; Seppala, L G; Crabtree, K; Van Zeeland, M A
2014-11-01
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
Wide-angle ITER-prototype tangential infrared and visible viewing system for DIII-D
Lasnier, Charles J.; Allen, Steve L.; Ellis, Ronald E.; ...
2014-08-26
An imaging system with a wide-angle tangential view of the full poloidal cross-section of the tokamak in simultaneous infrared and visible light has been installed on DIII-D. The optical train includes three polished stainless steel mirrors in vacuum, which view the tokamak through an aperture in the first mirror, similar to the design concept proposed for ITER. A dichroic beam splitter outside the vacuum separates visible and infrared (IR) light. Spatial calibration is accomplished by warping a CAD-rendered image to align with landmarks in a data image. The IR camera provides scrape-off layer heat flux profile deposition features in diverted and inner-wall-limited plasmas, such as heat flux reduction in pumped radiative divertor shots. Demonstration of the system to date includes observation of fast-ion losses to the outer wall during neutral beam injection, and shows reduced peak wall heat loading with disruption mitigation by injection of a massive gas puff.
New ways in creating pixelgram images
NASA Astrophysics Data System (ADS)
Malureanu, Radu; Di Fabrizio, Enzo
2006-09-01
Since diffraction gratings were invented, their use in various security systems has been exploited. Their big advantages are low production cost and, at the same time, the difficulty of replicating them. Most present-day security systems use such gratings to prove originality; they can be seen on CDs, DVDs, most major credit cards, and even on wine bottles. In this article we present a new way of making such gratings that leaves the production steps unchanged but generates an item that is even more difficult to replicate. The new approach consists not only in changing the grating period, so that various false colours can be seen, but also in changing the grating orientation, so that a complete check of the grating requires viewing it within a certain solid angle. At the same time, the possibility of varying the grating period is retained, so that various colours can be seen at each angle. By combining the two techniques (varying the period and varying the orientation) one can create a different image for each view angle and thus increase the security of the object. From the fabrication point of view, no further complications appear: the production steps are identical, the only difference being the pattern, and the resolution of the grating is not necessarily increased.
General view of the flight deck of the Orbiter Discovery ...
General view of the flight deck of the Orbiter Discovery looking from a low angle up and aft from approximately behind the commander's station. In the view you can see the overhead aft observation windows, the payload operations work area and in this view the payload bay observation windows have protective covers on them. This view was taken at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
NASA Technical Reports Server (NTRS)
Bolton, Matthew L.; Bass, Ellen J.; Comstock, James R., Jr.
2006-01-01
Synthetic Vision Systems (SVS) depict computer generated views of terrain surrounding an aircraft. In the assessment of textures and field of view (FOV) for SVS, no studies have directly measured the 3 levels of spatial awareness: identification of terrain, its relative spatial location, and its relative temporal location. This work introduced spatial awareness measures and used them to evaluate texture and FOV in SVS displays. Eighteen pilots made 4 judgments (relative angle, distance, height, and abeam time) regarding the location of terrain points displayed in 112 5-second, non-interactive simulations of a SVS heads down display. Texture produced significant main effects and trends for the magnitude of error in the relative distance, angle, and abeam time judgments. FOV was significant for the directional magnitude of error in the relative distance, angle, and height judgments. Pilots also provided subjective terrain awareness ratings that were compared with the judgment based measures. The study found that elevation fishnet, photo fishnet, and photo elevation fishnet textures best supported spatial awareness for both the judgments and the subjective awareness measures.
A Low-Cost PC-Based Image Workstation for Dynamic Interactive Display of Three-Dimensional Anatomy
NASA Astrophysics Data System (ADS)
Barrett, William A.; Raya, Sai P.; Udupa, Jayaram K.
1989-05-01
A system for interactive definition, automated extraction, and dynamic interactive display of three-dimensional anatomy has been developed and implemented on a low-cost PC-based image workstation. An iconic display is used for staging predefined image sequences through specified increments of tilt and rotation over a solid viewing angle. Use of a fast processor facilitates rapid extraction and rendering of the anatomy into predefined image views. These views are formatted into a display matrix in a large image memory for rapid interactive selection and display of arbitrary spatially adjacent images within the viewing angle, thereby providing motion parallax depth cueing for efficient and accurate perception of true three-dimensional shape, size, structure, and spatial interrelationships of the imaged anatomy. The visual effect is that of holding and rotating the anatomy in the hand.
2007-03-01
front of a large area blackbody as background. The viewing angle, defined as the angle between the surface normal and the camera line of sight, was varied by... and polarization angle were derived from the Stokes parameters. The dependence of these polarization characteristics on viewing angle was investigated
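The viewing-angle definition quoted in this record (the angle between the surface normal and the camera line of sight) can be computed directly from two direction vectors; a minimal sketch, with made-up vectors for illustration:

```python
import math

def viewing_angle_deg(normal, line_of_sight):
    """Angle between the surface normal and the camera line of sight,
    i.e. the viewing-angle definition used in the polarization study."""
    dot = sum(a * b for a, b in zip(normal, line_of_sight))
    nn = math.sqrt(sum(a * a for a in normal))
    nl = math.sqrt(sum(b * b for b in line_of_sight))
    return math.degrees(math.acos(dot / (nn * nl)))

print(viewing_angle_deg((0, 0, 1), (0, 0, 1)))  # camera along the normal
print(viewing_angle_deg((0, 0, 1), (1, 0, 1)))  # oblique view
```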
Upper wide-angle viewing system for ITER
Lasnier, C. J.; McLean, A. G.; Gattuso, A.; ...
2016-08-15
The Upper Wide Angle Viewing System (UWAVS) will be installed on five upper ports of ITER. Here, this paper shows major requirements, gives an overview of the preliminary design with reasons for some design choices, examines self-emitted IR light from UWAVS optics and its effect on accuracy, and shows calculations of signal-to-noise ratios for the two-color temperature output as a function of integration time and divertor temperature. Accurate temperature output requires correction for vacuum window absorption vs. wavelength and for self-emitted IR, which requires good measurement of the temperature of the optical components. The anticipated signal-to-noise ratio using presently available IR cameras is adequate for the required 500 Hz frame rate.
2015-10-15
NASA's Cassini spacecraft zoomed by Saturn's icy moon Enceladus on Oct. 14, 2015, capturing this stunning image of the moon's north pole. A companion view from the wide-angle camera (PIA20010) shows a zoomed out view of the same region for context. Scientists expected the north polar region of Enceladus to be heavily cratered, based on low-resolution images from the Voyager mission, but high-resolution Cassini images show a landscape of stark contrasts. Thin cracks cross over the pole -- the northernmost extent of a global system of such fractures. Before this Cassini flyby, scientists did not know if the fractures extended so far north on Enceladus. North on Enceladus is up. The image was taken in visible green light with the Cassini spacecraft narrow-angle camera. The view was acquired at a distance of approximately 4,000 miles (6,000 kilometers) from Enceladus and at a Sun-Enceladus-spacecraft, or phase, angle of 9 degrees. Image scale is 115 feet (35 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19660
NASA Technical Reports Server (NTRS)
Simard, M.; Riel, Bryan; Hensley, S.; Lavalle, Marco
2011-01-01
Radar backscatter data contain both geometric and radiometric distortions due to underlying topography and the radar viewing geometry. Our objective is to develop a radiometric correction algorithm specific to the UAVSAR system configuration that would improve retrieval of forest structure parameters. UAVSAR is an airborne L-band radar capable of repeat-pass interferometry producing images with a spatial resolution of 5 m. It is characterized by an electronically steerable antenna to compensate for aircraft attitude. Thus, the computation of viewing angles (i.e. look, incidence and projection) must include aircraft attitude angles (i.e. yaw, pitch and roll) in addition to the antenna steering angle. In this presentation, we address two components of radiometric correction: area projection and vegetation reflectivity. The first correction is applied by normalization of the radar backscatter by the local ground area illuminated by the radar beam. The second is a correction due to changes in vegetation reflectivity with viewing geometry.
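The area-projection correction described above can be sketched with a toy flat-facet model (an illustrative assumption only; the actual UAVSAR processor folds in aircraft attitude and antenna steering as the abstract notes):

```python
import math

def area_normalized_backscatter(beta0, local_incidence_deg):
    """Sketch of the area-projection step: scale radar brightness
    (beta0) by the local illuminated ground area, here modeled as a
    single flat facet whose tilt sets the local incidence angle.
    This is a toy model, not the UAVSAR algorithm itself."""
    return beta0 * math.sin(math.radians(local_incidence_deg))

# A facet tilted toward the radar (smaller local incidence angle)
# spreads the same echo over more ground, lowering the normalized value.
print(area_normalized_backscatter(1.0, 30.0))
print(area_normalized_backscatter(1.0, 90.0))
```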
2015-11-09
Although Epimetheus appears to be lurking above the rings here, it's actually just an illusion resulting from the viewing angle. In reality, Epimetheus and the rings both orbit in Saturn's equatorial plane. Inner moons and rings orbit very near the equatorial plane of each of the four giant planets in our solar system, but more distant moons can have orbits wildly out of the equatorial plane. It has been theorized that the highly inclined orbits of the outer, distant moons are remnants of the random directions from which they approached the planets they orbit. This view looks toward the unilluminated side of the rings from about 0.3 degrees below the ring plane. The image was taken in visible light with the Cassini spacecraft narrow-angle camera on July 26, 2015. The view was obtained at a distance of approximately 500,000 miles (800,000 kilometers) from Epimetheus and at a Sun-Epimetheus-spacecraft, or phase, angle of 62 degrees. Image scale is 3 miles (5 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18342
Emissive and reflective properties of curved displays in relation to image quality
NASA Astrophysics Data System (ADS)
Boher, Pierre; Leroux, Thierry; Bignon, Thibault; Collomb-Patton, Véronique; Blanc, Pierre; Sandré-Chardonnal, Etienne
2016-03-01
Different aspects of the characterization of curved displays are presented. The limit of validity of viewing angle measurements without angular distortion on such displays using a goniometer or Fourier optics viewing angle instrument is given. If the condition cannot be fulfilled, the measurement can be corrected using a general angular distortion formula, as demonstrated experimentally using a Samsung Galaxy S6 edge phone display. The reflective properties of the display are characterized by measuring the spectral BRDF using a multispectral Fourier optics viewing angle system. The surface of a curved OLED TV has been measured. The BRDF patterns show a mirror-like behavior with an additional strong diffraction along the pixel lines and columns that affects the quality of the display when observed under parasitic lighting. These diffraction effects are very common on OLED surfaces. We finally introduce a commercial ray tracing software that can use directly the measured emissive and reflective properties of the display to make realistic simulations under any lighting environment.
Joanny, M; Salasca, S; Dapena, M; Cantone, B; Travère, J M; Thellier, C; Fermé, J J; Marot, L; Buravand, O; Perrollaz, G; Zeile, C
2012-10-01
ITER first mirrors (FMs), as the first components of most ITER optical diagnostics, will be exposed to high plasma radiation flux and neutron load. To reduce the FMs heating and optical surface deformation induced during ITER operation, the use of relevant materials and cooling system are foreseen. The calculations led on different materials and FMs designs and geometries (100 mm and 200 mm) show that the use of CuCrZr and TZM, and a complex integrated cooling system can limit efficiently the FMs heating and reduce their optical surface deformation under plasma radiation flux and neutron load. These investigations were used to evaluate, for the ITER equatorial port visible∕infrared wide angle viewing system, the impact of the FMs properties change during operation on the instrument main optical performances. The results obtained are presented and discussed.
Touch-screen tablet user configurations and case-supported tilt affect head and neck flexion angles.
Young, Justin G; Trudeau, Matthieu; Odell, Dan; Marinelli, Kim; Dennerlein, Jack T
2012-01-01
The aim of this study was to determine how head and neck postures vary when using two media tablet (slate) computers in four common user configurations. Fifteen experienced media tablet users completed a set of simulated tasks with two media tablets in four typical user configurations. The four configurations were: on the lap and held with the user's hands, on the lap and in a case, on a table and in a case, and on a table and in a case set at a high angle for watching movies. An infra-red LED marker based motion analysis system measured head/neck postures. Head and neck flexion significantly varied across the four configurations and across the two tablets tested. Head and neck flexion angles during tablet use were greater, in general, than angles previously reported for desktop and notebook computing. Postural differences between tablets were driven by case designs, which provided significantly different tilt angles, while postural differences between configurations were driven by gaze and viewing angles. Head and neck posture during tablet computing can be improved by placing the tablet higher to avoid low gaze angles (i.e. on a table rather than on the lap) and through the use of a case that provides optimal viewing angles.
NASA Astrophysics Data System (ADS)
Xu, F.; Diner, D. J.; Seidel, F. C.; Dubovik, O.; Zhai, P.
2014-12-01
A vector Markov chain radiative transfer method was developed for forward modeling of radiance and polarization fields in a coupled atmosphere-ocean system. The method was benchmarked against an independent Successive Orders of Scattering code and linearized through the use of Jacobians. Incorporated with the multi-patch optimization algorithm and look-up-table method, simultaneous aerosol and ocean color retrievals were performed using imagery acquired by the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) when it was operated in step-and-stare mode with 9 viewing angles ranging between ±67°. Data from channels near 355, 380, 445, 470*, 555, 660*, and 865* nm were used in the retrievals, where the asterisk denotes the polarimetric bands. Retrievals were run for AirMSPI overflights over Southern California and Monterey Bay, CA. For the relatively high aerosol optical depth (AOD) case (~0.28 at 550 nm), the retrieved aerosol concentration, size distribution, water-leaving radiance, and chlorophyll concentration were compared to those reported by the USC SeaPRISM AERONET-OC site off the coast of Southern California on 6 February 2013. For the relatively low AOD case (~0.08 at 550 nm), the retrieved aerosol concentration and size distribution were compared to those reported by the Monterey Bay AERONET site on 28 April 2014. Further, we evaluate the benefits of multi-angle and polarimetric observations by performing the retrievals using (a) all view angles and channels; (b) all view angles but radiances only (no polarization); (c) the nadir view angle only with both radiance and polarization; and (d) the nadir view angle without polarization. Optimized retrievals using different initial guesses were performed to provide a measure of retrieval uncertainty. Removal of multi-angular or polarimetric information resulted in increases in both parameter uncertainty and systematic bias. 
Potential accuracy improvements afforded by applying constraints on the surface and aerosol parametric models will also be discussed.
C-arm technique using distance driven method for nephrolithiasis and kidney stones detection
NASA Astrophysics Data System (ADS)
Malalla, Nuhad; Sun, Pengfei; Chen, Ying; Lipkin, Michael E.; Preminger, Glenn M.; Qin, Jun
2016-04-01
The distance-driven approach is a state-of-the-art method used for reconstruction in x-ray imaging. C-arm tomography is an x-ray imaging technique that provides three-dimensional information about the object by moving the C-shaped gantry around the patient. With a limited view angle, the C-arm system was investigated to generate volumetric data of the object with low radiation dosage and examination time. This paper is a new simulation study of two reconstruction methods based on distance-driven projection: the simultaneous algebraic reconstruction technique (SART) and maximum-likelihood expectation maximization (MLEM). The distance-driven method is efficient, with low computational cost, and is free of the artifacts associated with other approaches such as ray-driven and pixel-driven methods. Projection images of spherical objects were simulated with a virtual C-arm system with a total view angle of 40 degrees. Results show the ability of the limited-angle C-arm technique to generate three-dimensional images with distance-driven reconstruction.
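Of the two reconstruction methods named, MLEM admits a compact sketch. The toy dense-matrix version below is illustrative only (the paper couples MLEM with distance-driven projectors, not a stored system matrix); it shows the multiplicative update that drives the iteration:

```python
def mlem(A, y, n_iter=50):
    """Maximum-likelihood expectation maximization for y ~ A x,
    with A a small dense system matrix (toy stand-in for the
    distance-driven forward/back projectors used in the paper)."""
    m, n = len(A), len(A[0])
    x = [1.0] * n  # uniform positive initial image
    sens = [sum(A[i][j] for i in range(m)) for j in range(n)]  # A^T 1
    for _ in range(n_iter):
        fp = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        ratio = [y[i] / fp[i] if fp[i] > 0 else 0.0 for i in range(m)]
        bp = [sum(A[i][j] * ratio[i] for i in range(m)) for j in range(n)]
        x = [x[j] * bp[j] / sens[j] for j in range(n)]
    return x

# Two-pixel, two-ray toy system; the noiseless data y = A @ [2, 4]
# are recovered by the iteration.
A = [[1.0, 0.0], [0.5, 0.5]]
y = [2.0, 3.0]
print(mlem(A, y))
```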
NASA Technical Reports Server (NTRS)
Donovan, Sheila
1985-01-01
A full evaluation of the bidirectional reflectance properties of different vegetated surfaces was limited in past studies by instrumental inadequacies. With the development of the PARABOLA, it is now possible to sample reflectances from a large number of view angles in a short period of time, maintaining an almost constant solar zenith angle. PARABOLA data collected over five different canopies in Texas are analyzed. The objective of this investigation was to evaluate the intercanopy and intracanopy differences in bidirectional reflectance patterns. Particular attention was given to the separability of canopy types using different view angles for the red and the near infrared (NIR) spectral bands. Comparisons were repeated for different solar zenith angles. Statistical and other quantitative techniques were used to assess these differences. For the canopies investigated, the greatest reflectances were found in the backscatter direction for both bands. Canopy discrimination was found to vary with both view angle and the spectral reflectance band considered, the forward scatter view angles being most suited to observations in the NIR and backscatter view angles giving better results in the red band. Because of different leaf angle distribution characteristics, discrimination was found to be better at small solar zenith angles in both spectral bands.
Military display performance parameters
NASA Astrophysics Data System (ADS)
Desjardins, Daniel D.; Meyer, Frederick
2012-06-01
The military display market is analyzed in terms of four of its segments: avionics, vetronics, dismounted soldier, and command and control. Requirements are summarized for a number of technology-driving parameters, to include luminance, night vision imaging system compatibility, gray levels, resolution, dimming range, viewing angle, video capability, altitude, temperature, shock and vibration, etc., for direct-view and virtual-view displays in cockpits and crew stations. Technical specifications are discussed for selected programs.
Collins Aerodyne VTOL aircraft investigations
1960-01-11
Collins Aerodyne vertical take-off and landing (VTOL) aircraft investigations. Ground plane support system. 3/4 front view. Dave Koening (from Collins Aerodyne) in photo. Mounted on variable height struts, ground board system, zero degree angle of attack. 01/11/1960
High-efficiency directional backlight design for an automotive display.
Chen, Bo-Tsuen; Pan, Jui-Wen
2018-06-01
We propose a high-efficiency directional backlight module (DBM) for automotive display applications. The DBM is composed of light sources, a light guide plate (LGP), and an optically patterned plate (OPP). The LGP has a collimator on the input surface that serves to control the angle of the emitted light in the horizontal direction. The OPP has an inverse prism to adjust the light emission angle in the vertical direction. The DBM has a simple structure and high optical efficiency. Compared with conventional backlight systems, the DBM has higher optical efficiency and a suitable viewing angle: a 2.6-fold improvement in normalized on-axis luminous intensity and a twofold improvement in optical efficiency. The viewing angles are 100° in the horizontal direction and 35° in the vertical direction. The angle of half-luminous intensity is 72° in the horizontal direction and 20° in the vertical direction. The uniformity of the illuminance reaches 82%. The DBM is suitable for use in the center information displays of automobiles.
Alam, Md Ashraful; Piao, Mei-Lan; Bang, Le Thanh; Kim, Nam
2013-10-01
Viewing-zone control of integral imaging (II) displays using a directional projection and elemental image (EI) resizing method is proposed. Directional projection of EIs with the same size of microlens pitch causes an EI mismatch at the EI plane. In this method, EIs are generated computationally using a newly introduced algorithm: the directional elemental image generation and resizing algorithm considering the directional projection geometry of each pixel as well as an EI resizing method to prevent the EI mismatch. Generated EIs are projected as a collimated projection beam with a predefined directional angle, either horizontally or vertically. The proposed II display system allows reconstruction of a 3D image within a predefined viewing zone that is determined by the directional projection angle.
BOREAS RSS-2 Level-1B ASAS Image Data: At-Sensor Radiance in BSQ Format
NASA Technical Reports Server (NTRS)
Russell, C.; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Dabney, P. W.; Kovalick, W.; Graham, D.; Bur, Michael; Irons, James R.; Tierney, M.
2000-01-01
The BOREAS RSS-2 team used the ASAS instrument, mounted on the NASA C-130 aircraft, to create at-sensor radiance images of various sites as a function of spectral wavelength, view geometry (combinations of view zenith angle, view azimuth angle, solar zenith angle, and solar azimuth angle), and altitude. The level-1b ASAS images of the BOREAS study areas were collected from April to September 1994 and March to July 1996.
NASA Astrophysics Data System (ADS)
Chen, J. M.; He, L.; Chou, S.; Ju, W.; Zhang, Y.; Joiner, J.; Liu, J.; Mo, G.
2017-12-01
Sun-induced chlorophyll fluorescence (SIF) measured from plant canopies originates mostly from sunlit leaves. Observations of SIF by satellite sensors, such as GOME-2 and GOSAT, are often made over large view zenith angle ranges, causing large changes in the viewed sunlit leaf fraction across the scanning swath. Although observations made by OCO-2 are near nadir, the observed sunlit leaf fraction could still vary greatly due to changes in the solar zenith angle with latitude and time of overpass. To demonstrate the importance of considering the satellite-target-view geometry in using SIF for assessing vegetation productivity, we conducted multi-angle measurements of SIF using a hyperspectral sensor mounted on an automated rotating system over a rice field near Nanjing, China. A method is developed to separate SIF measurements at each angle into sunlit and shaded leaf components, and an angularly normalized canopy-level SIF is obtained as the weighted sum of sunlit and shaded SIF. This normalized SIF is shown to be a much better proxy of GPP of the rice field measured by an eddy covariance system than the unnormalized SIF observations. The same normalization scheme is also applied to the far-red GOME-2 SIF observations on sunny days, and we found that the normalized SIF is better correlated with model-simulated GPP than the original SIF observations. The coefficient of determination (R2) is improved by 0.07±0.04 on global average using the normalization scheme. The most significant improvement in R2 by 0.09±0.04 is found in deciduous broadleaf forests, where the observed sunlit leaf fraction is highly sensitive to solar zenith angle.
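The angular-normalization idea above, canopy SIF as the weighted sum of sunlit and shaded leaf components, can be sketched as follows. The sunlit-fraction model used here is a textbook spherical-leaf-angle approximation, not the paper's retrieval scheme, and the SIF values are illustrative:

```python
import math

def observed_sif(sif_sunlit, sif_shaded, sunlit_fraction):
    """Canopy-level SIF as the weighted sum of sunlit and shaded
    leaf contributions (the decomposition described in the abstract)."""
    return sunlit_fraction * sif_sunlit + (1 - sunlit_fraction) * sif_shaded

def sunlit_fraction(lai, sza_deg, g=0.5):
    """Sunlit leaf-area fraction for a canopy of leaf area index `lai`
    at solar zenith angle `sza_deg`, assuming a spherical leaf-angle
    distribution (projection coefficient g = 0.5). Textbook model,
    used only to illustrate the geometry dependence."""
    mu = math.cos(math.radians(sza_deg))
    sunlit_lai = (1 - math.exp(-g * lai / mu)) * mu / g
    return sunlit_lai / lai

# The same canopy (same sunlit/shaded SIF) yields different observed
# totals at different solar zenith angles, which is what the angular
# normalization removes.
for sza in (20, 40, 60):
    f = sunlit_fraction(3.0, sza)
    print(sza, observed_sif(2.0, 0.5, f))
```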
Berry phase and Hannay’s angle in the Born–Oppenheimer hybrid systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, H.D.; Yi, X.X.; Fu, L.B., E-mail: lbfu.iapcm@gmail.com
2013-12-15
In this paper, we investigate the Berry phase and Hannay's angle in the Born–Oppenheimer (BO) hybrid systems and obtain their algebraic expressions in terms of a one-form connection. The semiclassical relation of the Berry phase and Hannay's angle is discussed. We find that, besides the usual connection term, the Berry phase of the quantum BO composite system also contains a novel term brought forth by the coupling-induced effective gauge potential. This quantum modification can be viewed as an effective Aharonov–Bohm effect. Moreover, a similar phenomenon is found in Hannay's angle of the classical BO composite system, which indicates that the Berry phase and Hannay's angle possess the same relation as the usual one. An example is used to illustrate our theory. This scheme can be used to generate artificial gauge potentials for neutral atoms. Besides, the quantum–classical hybrid BO system is also studied to compare with the results in full quantum and full classical composite systems. -- Highlights: •We have derived the Berry phase and Hannay's angle in BO hybrid systems. •The Berry phase contains a novel term brought by the effective gauge potential. •This mechanism can be used to generate artificial gauge potentials for neutral atoms. •The relation between Hannay's angles and Berry phases is established.
Structural colored liquid membrane without angle dependence.
Takeoka, Yukikazu; Honda, Masaki; Seki, Takahiro; Ishii, Masahiko; Nakamura, Hiroshi
2009-05-01
We have demonstrated for the first time that condensed gel particle suspensions in amorphous-like states display structural color with low angle dependence. This finding is in contrast to the common understanding that a periodic dielectric structure is fundamental to photonic band gap (PBG) production, and it validates the theory that a "tight bonding model" that is applicable to semiconductor systems can also be applied to photonic systems. More practically, this structural colored suspension represents a promising new material for the manufacture of reflective full-color displays with a wide viewing angle and nonfading color materials. This liquid system shows promise as a display material because electronic equipment used for display systems can easily be filled with the liquid in the same way that liquid crystals are currently used.
Xu, Chunyun; Cheng, Haobo; Feng, Yunpeng; Jing, Xiaoli
2016-09-01
A type of laser semiactive angle measurement system is designed for target detecting and tracking. Only one detector is used to detect target location from four distributed aperture optical systems through a 4×1 imaging fiber bundle. A telecentric optical system in image space is designed to increase the efficiency of imaging fiber bundles. According to the working principle of a four-quadrant (4Q) detector, fiber diamond alignment is adopted between an optical system and a 4Q detector. The structure of the laser semiactive angle measurement system is, we believe, novel. Tolerance analysis is carried out to determine tolerance limits of manufacture and installation errors of the optical system. The performance of the proposed method is identified by computer simulations and experiments. It is demonstrated that the linear region of the system is ±12°, with measurement error of better than 0.2°. In general, this new system can be used with large field of view and high accuracy, providing an efficient, stable, and fast method for angle measurement in practical situations.
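The four-quadrant (4Q) detector working principle mentioned above reduces to the standard quadrant arithmetic; a minimal sketch (the quadrant labels and sign convention are a common choice, not necessarily this system's exact implementation):

```python
def quad_cell_position(qa, qb, qc, qd):
    """Normalized spot position on a four-quadrant detector from the
    four quadrant signals, labeled A = upper-right, B = upper-left,
    C = lower-left, D = lower-right (a common convention)."""
    total = qa + qb + qc + qd
    x = ((qa + qd) - (qb + qc)) / total  # right minus left
    y = ((qa + qb) - (qc + qd)) / total  # top minus bottom
    return x, y

print(quad_cell_position(1.0, 1.0, 1.0, 1.0))  # centered spot
print(quad_cell_position(1.0, 0.0, 0.0, 0.0))  # spot fully in quadrant A
```

In an angle-measurement system like the one described, the (x, y) spot offset maps to the incoming beam angle through the optics' calibration.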
NASA Technical Reports Server (NTRS)
Zhang, Neng-Li; Chao, David F.
2001-01-01
A new hybrid optical system, consisting of reflection-refracted shadowgraphy and top-view photography, is used to visualize flow phenomena and simultaneously measure the spreading and instant dynamic contact angle in a volatile-liquid drop on a nontransparent substrate. Thermocapillary convection in the drop, induced by evaporation, and the drop real-time profile data are synchronously recorded by video recording systems. Experimental results obtained from this unique technique clearly reveal that thermocapillary convection strongly affects the spreading process and the characteristics of dynamic contact angle of the drop. Comprehensive information of a sessile drop, including the local contact angle along the periphery, the instability of the three-phase contact line, and the deformation of the drop shape is obtained and analyzed.
All-around viewing display system for group activity on life review therapy
NASA Astrophysics Data System (ADS)
Sakamoto, Kunio; Okumura, Mitsuru
2009-10-01
This paper describes a 360-degree viewing display system that can be viewed from any direction. A conventional monitor is viewed from one direction; that is, the display has a narrow viewing angle and observers cannot view the screen from the opposite side. To solve this problem, we developed a 360-degree viewing display for collaborative tasks at a round table. The developed system has a liquid crystal display screen and a motor-driven 360-degree rotating table. The principle is very simple: the monitor screen rotates at a uniform speed, and optical techniques are also utilized. Moreover, we have developed a floating 360-degree viewing display that can be viewed from any direction. This new viewing system has a display screen, a rotating table, and dual parabolic mirrors. In order to float only the image screen above the table, the rotating mechanism works inside the parabolic mirrors. Because the dual parabolic mirrors generate a "mirage" image above the upper mirror, observers can view a floating 2D image on the virtual screen in front of them. The observer can then view the monitor screen from any position around the round table.
Simulations of Convection Zone Flows and Measurements from Multiple Viewing Angles
NASA Technical Reports Server (NTRS)
Duvall, Thomas L.; Hanasoge, Shravan
2011-01-01
A deep-focusing time-distance measurement technique has been applied to linear acoustic simulations of a solar interior perturbed by convective flows. The simulations are for the full sphere for r/R greater than 0.2. From these it is straightforward to simulate the observations from different viewing angles and to test how multiple viewing angles enhance detectability. Some initial results will be presented.
Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Yanfei, E-mail: ymao@ucair.med.utah.edu; Yu, Zhicong; Zeng, Gengsheng L.
2015-09-15
Purpose: This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to the rotational SPECT, a stationary system has a benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is to provide sufficient projection view-angles. Methods: A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of limited number of view-angles and data truncation were carefully evaluated in the paper. Results: Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. 
Conclusions: The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac SPECT system with segmented slant-hole collimators. The proposed collimator consists of combined parallel and slant holes, and the image on the detector is not reduced in size.
A multi-camera system for real-time pose estimation
NASA Astrophysics Data System (ADS)
Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin
2007-04-01
This paper presents a multi-camera system that performs face detection and pose estimation in real-time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates upon the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
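The yaw-from-triangle geometry described above can be sketched as follows. The calibration ratio, function name, and pixel values are hypothetical; the paper derives its own projection equations, of which this cosine-foreshortening relation is only the simplest instance:

```python
import math

def estimate_yaw_deg(eye_sep_px, eye_mouth_height_px, frontal_ratio=1.0):
    """Yaw estimate from the projected eyes-mouth triangle: on a
    spherical head rotating about the vertical axis, the horizontal
    eye separation foreshortens by cos(yaw) while the eye-mouth
    height is unchanged, so their ratio recovers the yaw magnitude.
    `frontal_ratio` is the eye-separation / eye-mouth-height ratio
    at yaw = 0 (an assumed per-subject calibration constant)."""
    r = (eye_sep_px / eye_mouth_height_px) / frontal_ratio
    r = max(-1.0, min(1.0, r))  # clamp against measurement noise
    return math.degrees(math.acos(r))

print(estimate_yaw_deg(100, 100))  # frontal face
print(estimate_yaw_deg(50, 100))   # eye separation halved by rotation
```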
NASA Astrophysics Data System (ADS)
Zhou, Shudao; Ma, Zhongliang; Wang, Min; Peng, Shuling
2018-05-01
This paper proposes a novel alignment system based on the measurement of optical path using a light beam scanning mode in a transmissometer. The system controls both the probe beam and the receiving field of view while scanning in two vertical directions. The system then calculates the azimuth angle of the transmitter and the receiver to determine the precise alignment of the optical path. Experiments show that this method can determine the alignment angles in less than 10 min with errors smaller than 66 μrad in the azimuth. This system also features high collimation precision, process automation and simple installation.
Impact of large field angles on the requirements for deformable mirror in imaging satellites
NASA Astrophysics Data System (ADS)
Kim, Jae Jun; Mueller, Mark; Martinez, Ty; Agrawal, Brij
2018-04-01
For certain imaging satellite missions, a large aperture with wide field-of-view is needed. In order to achieve diffraction limited performance, the mirror surface Root Mean Square (RMS) error has to be less than 0.05 waves. In the case of visible light, it has to be less than 30 nm. This requirement is difficult to meet as the large aperture will need to be segmented in order to fit inside a launch vehicle shroud. To reduce this requirement and to compensate for the residual wavefront error, Micro-Electro-Mechanical System (MEMS) deformable mirrors can be considered in the aft optics of the optical system. MEMS deformable mirrors are affordable and consume low power, but are small in size. Due to the major reduction in pupil size for the deformable mirror, the effective field angle is magnified by the diameter ratio of the primary and deformable mirror. For wide field of view imaging, the required deformable mirror correction is field angle dependent, impacting the required parameters of a deformable mirror such as size, number of actuators, and actuator stroke. In this paper, a representative telescope and deformable mirror system model is developed and the deformable mirror correction is simulated to study the impact of the large field angles in correcting a wavefront error using a deformable mirror in the aft optics.
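The field-angle magnification mentioned above follows directly from pupil demagnification: angles scale by the ratio of pupil diameters. A minimal sketch (the diameters below are illustrative, not mission values):

```python
def effective_field_angle(theta_field_deg, d_primary_m, d_dm_m):
    """Field angle seen at the deformable mirror after the pupil is
    demagnified from the primary diameter down to the DM diameter."""
    return theta_field_deg * (d_primary_m / d_dm_m)

# e.g. a 0.05 deg field angle on a 1 m primary relayed to a 10 mm MEMS DM
print(effective_field_angle(0.05, 1.0, 0.01))  # about 5 deg at the DM
```

A 100:1 pupil reduction turns a modest telescope field into a steep angle at the small mirror, which is why wide-field correction drives DM size and actuator requirements.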
Esthetic smile preferences and the orientation of the maxillary occlusal plane.
Kattadiyil, Mathew T; Goodacre, Charles J; Naylor, W Patrick; Maveli, Thomas C
2012-12-01
The anteroposterior orientation of the maxillary occlusal plane has an important role in the creation, assessment, and perception of an esthetic smile. However, the effect of the angle at which this plane is visualized (the viewing angle) in a broad smile has not been quantified. The purpose of this study was to assess the esthetic preferences of dental professionals and nondentists by using 3 viewing angles of the anteroposterior orientation of the maxillary occlusal plane. After Institutional Review Board approval, standardized digital photographic images of the smiles of 100 participants were recorded by simultaneously triggering 3 cameras set at different viewing angles. The top camera was positioned 10 degrees above the occlusal plane (camera #1, Top view); the center camera was positioned at the level of the occlusal plane (camera #2, Center view); and the bottom camera was located 10 degrees below the occlusal plane (camera #3, Bottom view). Forty-two dental professionals and 31 nondentists (persons from the general population) independently evaluated digital images of each participant's smile captured from the Top view, Center view, and Bottom view. The 73 evaluators were asked individually through a questionnaire to rank the 3 photographic images of each patient as 'most pleasing,' 'somewhat pleasing,' or 'least pleasing,' with most pleasing being the most esthetic view and the preferred orientation of the occlusal plane. The resulting esthetic preferences were statistically analyzed by using the Friedman test. In addition, the participants were asked to rank their own images from the 3 viewing angles as 'most pleasing,' 'somewhat pleasing,' and 'least pleasing.' The 73 evaluators found statistically significant differences in the esthetic preferences between the Top and Bottom views and between the Center and Bottom views (P<.001). No significant differences were found between the Top and Center views. 
The Top position was marginally preferred over the Center, and both were significantly preferred over the Bottom position. When the participants evaluated their own smiles, a significantly greater number (P< .001) preferred the Top view over the Center or the Bottom views. No significant differences were found in preferences based on the demographics of the evaluators when comparing age, education, gender, profession, and race. The esthetic preference for the maxillary occlusal plane was influenced by the viewing angle with the higher (Top) and center views preferred by both dental and nondental evaluators. The participants themselves preferred the higher view of their smile significantly more often than the center or lower angle views (P<.001). Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
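The Friedman test used above operates on within-evaluator ranks of the three views. A self-contained sketch of the statistic (no tie correction; the rank table is invented for illustration, not the study's data):

```python
def friedman_stat(rank_table):
    """Friedman chi-square statistic for a table of within-row ranks
    (rows = evaluators/blocks, columns = treatments). No tie correction."""
    n, k = len(rank_table), len(rank_table[0])
    col_sums = [sum(row[j] for row in rank_table) for j in range(k)]
    return 12.0 * sum(r * r for r in col_sums) / (n * k * (k + 1)) - 3.0 * n * (k + 1)

# six hypothetical evaluators all ranking Top > Center > Bottom (3 = most pleasing)
ranks = [[3, 2, 1]] * 6
print(friedman_stat(ranks))  # 12.0, the maximum for n=6, k=3
```

The statistic is compared against a chi-square distribution with k-1 degrees of freedom; perfect agreement among evaluators, as in the toy table, maximizes it.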
An all-reflective wide-angle flat-field telescope for space
NASA Technical Reports Server (NTRS)
Hallam, K. L.; Howell, B. J.; Wilson, M. E.
1984-01-01
An all-reflective wide-angle flat-field telescope (WAFFT) designed and built at Goddard Space Flight Center demonstrates the markedly improved wide-angle imaging capability which can be achieved with a design based on a recently announced class of unobscured 3-mirror optical systems. Astronomy and earth observation missions in space dictate the necessity or preference for wide-angle all-reflective systems which can provide UV through IR wavelength coverage and tolerate the space environment. An initial prototype unit has been designed to meet imaging requirements suitable for monitoring the ultraviolet sky from space. The unobscured f/4, 36 mm efl system achieves a full 20 x 30 deg field of view with resolution over a flat focal surface that is well matched for use with advanced ultraviolet image array detectors. Aspects of the design and fabrication approach, which have especially important bearing on the system solution, are reviewed; and test results are compared with the analytic performance predictions. Other possible applications of the WAFFT class of imaging system are briefly discussed. The exceptional wide-angle, high quality resolution, and very wide spectral coverage of the WAFFT-type optical system could make it a very important tool for future space research.
C-band backscattering from corn canopies
NASA Technical Reports Server (NTRS)
Daughtry, C. S. T.; Ranson, K. J.; Biehl, L. L.
1991-01-01
A frequency-modulated continuous-wave C-band (4.8 GHz) scatterometer was mounted on an aerial lift truck, and backscatter coefficients of corn (Zea mays L.) were acquired as functions of polarizations, view angles, and row directions. As phytomass and green-leaf area index increased, the backscatter also increased. Near anthesis, when the canopies were fully developed, the major scattering elements were located in the upper 1 m of the 2.8 m tall canopy and little backscatter was measured below that level for view angles of 30 deg or greater. C-band backscatter data could provide information to monitor tillage operations at small view zenith angles and vegetation at large view zenith angles.
The effect of viewing angle on the spectral behavior of a Gd plasma source near 6.7 nm
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Gorman, Colm; Li Bowen; Cummins, Thomas
2012-04-02
We have demonstrated the effect of viewing angle on the extreme ultraviolet (EUV) emission spectra of gadolinium (Gd) near 6.7 nm. The spectra are shown to have a strong dependence on viewing angle when produced with a laser pulse duration of 10 ns, which may be attributed to absorption by low ion stages of Gd and an angular variation in the ion distribution. Absorption effects are less pronounced at a 150-ps pulse duration due to reduced opacity resulting from plasma expansion. Thus for evaluating source intensity, it is necessary to allow for variation with both viewing angle and target orientation.
A Robust Mechanical Sensing System for Unmanned Sea Surface Vehicles
NASA Technical Reports Server (NTRS)
Kulczycki, Eric A.; Magnone, Lee J.; Huntsberger, Terrance; Aghazarian, Hrand; Padgett, Curtis W.; Trotz, David C.; Garrett, Michael S.
2009-01-01
The need for autonomous navigation and intelligent control of unmanned sea surface vehicles requires a mechanically robust sensing architecture that is watertight, durable, and insensitive to vibration and shock loading. The sensing system developed here comprises four black and white cameras and a single color camera. The cameras are rigidly mounted to a camera bar that can be reconfigured to mount multiple vehicles, and act as both navigational cameras and application cameras. The cameras are housed in watertight casings to protect them and their electronics from moisture and wave splashes. Two of the black and white cameras are positioned to provide lateral vision. They are angled away from the front of the vehicle at horizontal angles to provide ideal fields of view for mapping and autonomous navigation. The other two black and white cameras are positioned at an angle into the color camera's field of view to support vehicle applications. These two cameras provide an overlap, as well as a backup to the front camera. The color camera is positioned directly in the middle of the bar, aimed straight ahead. This system is applicable to any sea-going vehicle, both on Earth and in space.
FlySPEX: a flexible multi-angle spectropolarimetric sensing system
NASA Astrophysics Data System (ADS)
Snik, Frans; Keller, Christoph U.; Wijnen, Merijn; Peters, Hubert; Derks, Roy; Smulders, Edwin
2016-05-01
Accurate multi-angle spectropolarimetry permits the detailed and unambiguous characterization of a wide range of objects. Science cases and commercial applications include atmospheric aerosol studies, biomedical sensing, and food quality control. We introduce the FlySPEX spectropolarimetric fiber-head that constitutes the essential building block of a novel multi-angle sensing system. A combination of miniaturized standard polarization optics inside every fiber-head encodes the full linear polarization information as a spectral modulation of the light that enters two regular optical fibers. By orienting many FlySPEX fiber-heads in any desired set of directions, a fiber bundle contains the complete instantaneous information on polarization as a function of wavelength and as a function of the set of viewing directions. This information is to be recorded by one or several multi-fiber spectrometers. Not only is this system flexible in the number of viewing directions and their configuration, it also permits multiplexing different wavelength ranges and spectral resolutions by implementing different spectrometers. We present the design and prototyping for a FlySPEX fiber-head that is optimized for both polarimetric accuracy and commercial series production. We integrate the polarimetric calibration of each FlySPEX fiber-head in the manufacturing process.
Automated comprehensive Adolescent Idiopathic Scoliosis assessment using MVC-Net.
Wu, Hongbo; Bailey, Chris; Rasoulinejad, Parham; Li, Shuo
2018-05-18
Automated quantitative estimation of spinal curvature is an important task for the ongoing evaluation and treatment planning of Adolescent Idiopathic Scoliosis (AIS). It addresses the widely accepted disadvantages of manual Cobb angle measurement (time-consuming and unreliable), which is currently the gold standard for AIS assessment. Attempts have been made to improve the reliability of automated Cobb angle estimation. However, it is very challenging to achieve accurate and robust estimation of Cobb angles due to the need for correctly identifying all the required vertebrae in both Anterior-posterior (AP) and Lateral (LAT) view x-rays. The challenge is especially evident in LAT x-rays, where occlusion of vertebrae by the ribcage occurs. We therefore propose a novel Multi-View Correlation Network (MVC-Net) architecture that can provide a fully automated end-to-end framework for spinal curvature estimation in multi-view (both AP and LAT) x-rays. The proposed MVC-Net uses our newly designed multi-view convolution layers to incorporate joint features of multi-view x-rays, which allows the network to mitigate the occlusion problem by utilizing the structural dependencies of the two views. The MVC-Net consists of three closely-linked components: (1) a series of X-modules for joint representation of spinal structure, (2) a Spinal Landmark Estimator network for robust spinal landmark estimation, and (3) a Cobb Angle Estimator network for accurate Cobb angle estimation. By utilizing an iterative multi-task training algorithm to train the Spinal Landmark Estimator and Cobb Angle Estimator in tandem, the MVC-Net leverages the multi-task relationship between landmark and angle estimation to reliably detect all the required vertebrae for accurate Cobb angle estimation. 
Experimental results on 526 x-ray images from 154 patients show an impressive 4.04° Circular Mean Absolute Error (CMAE) in AP Cobb angle and 4.07° CMAE in LAT Cobb angle estimation, which demonstrates the MVC-Net's capability of robust and accurate estimation of Cobb angles in multi-view x-rays. Our method therefore provides clinicians with a framework for efficient, accurate, and reliable estimation of spinal curvature for comprehensive AIS assessment. Copyright © 2018. Published by Elsevier B.V.
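The CMAE metric reported above averages angular errors after wrapping each difference onto the circle. A straightforward sketch (this is the common definition of circular MAE, which may differ in detail from the paper's exact formulation):

```python
def circular_mae(pred_deg, true_deg):
    """Circular mean absolute error in degrees: each difference is
    wrapped to [0, 180] before averaging, so 359 deg vs 1 deg counts as 2 deg."""
    errs = []
    for p, t in zip(pred_deg, true_deg):
        d = abs(p - t) % 360.0
        errs.append(min(d, 360.0 - d))
    return sum(errs) / len(errs)

print(circular_mae([359.0, 10.0], [1.0, 10.0]))  # 1.0
```

Wrapping matters for angle regression: a naive MAE would score the 359° vs 1° pair as a 358° error even though the directions are nearly identical.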
Digital mammography: comparative performance of color LCD and monochrome CRT displays.
Samei, Ehsan; Poolla, Ananth; Ulissey, Michael J; Lewin, John M
2007-05-01
To evaluate the comparative performance of high-fidelity liquid crystal display (LCD) and cathode ray tube (CRT) devices for mammography applications, and to assess the impact of LCD viewing angle on detection accuracy. Ninety 1 k x 1 k images were selected from a database of digital mammograms: 30 without any abnormality present, 30 with subtle masses, and 30 with subtle microcalcifications. The images were used with waived informed consent, Health Insurance Portability and Accountability Act compliance, and Institutional Review Board approval. With postprocessing presentation identical to those of the commercial mammography system used, 1 k x 1 k sections of images were viewed on a monochrome CRT and a color LCD in native grayscale, and with a grayscale representative of images viewed from a 30 degrees or 50 degrees off-normal viewing angle. Randomized images were independently scored by four experienced breast radiologists for the presence of lesions using a 0-100 grading scale. To compare diagnostic performance of the display modes, observer scores were analyzed using receiver operating characteristic (ROC) and analysis of variance. For masses and microcalcifications, the detection rate in terms of the area under the ROC curve (A(z)) showed a 2% increase and a 4% decrease from CRT to LCD, respectively. However, differences were not statistically significant (P > .05). The viewing angle data showed better microcalcification detection but lower mass detection at 30 degrees viewing orientation. The overall results varied notably from observer to observer yielding no statistically discernible trends across all observers, suggesting that within the 0-50 degrees viewing angle range and in a controlled observer experiment, the variation in the contrast response of the LCD has little or no impact on the detection of mammographic lesions. 
Although CRTs and LCDs differ in terms of angular response, resolution, noise, and color, these characteristics seem to have little influence on the detection of mammographic lesions. The results suggest comparable performance in clinical applications of the two devices.
View angle dependence of cloud optical thicknesses retrieved by MODIS
NASA Technical Reports Server (NTRS)
Marshak, Alexander; Varnai, Tamas
2005-01-01
This study examines whether cloud inhomogeneity influences the view angle dependence of MODIS cloud optical thickness (tau) retrieval results. The degree of cloud inhomogeneity is characterized through the local gradient in 11-μm brightness temperature. The analysis of liquid phase clouds in a one year long global dataset of Collection 4 MODIS data reveals that while optical thickness retrievals give remarkably consistent results for all view directions if clouds are homogeneous, they give much higher tau-values for oblique views than for overhead views if clouds are inhomogeneous and the sun is fairly oblique. For solar zenith angles larger than 55 deg, the mean optical thickness retrieved for the most inhomogeneous third of cloudy pixels is more than 30% higher for oblique views than for overhead views. After considering a variety of possible scenarios, the paper concludes that the most likely reason for the increase lies in three-dimensional radiative interactions that are not considered in current, one-dimensional retrieval algorithms. Namely, the radiative effect of cloud sides viewed at oblique angles seems to contribute most to the enhanced tau-values. The results presented here will help understand cloud retrieval uncertainties related to cloud inhomogeneity. They complement the uncertainty estimates that will start accompanying MODIS cloud products in Collection 5 and may eventually help correct for the observed view angle dependent biases.
NASA Astrophysics Data System (ADS)
Castro, José J.; Pozo, Antonio M.; Rubiño, Manuel
2013-11-01
In this work we studied the dependence of color on horizontal viewing angle and performed a colorimetric characterization of two liquid-crystal displays (LCDs) with two different backlights: Cold Cathode Fluorescent Lamps (CCFLs) and light-emitting diodes (LEDs). The LCDs studied had identical resolution, size, and technology (TFT, thin film transistor). The colorimetric measurements were made with the SpectraScan PR-650 spectroradiometer following the procedure recommended in the European guideline EN 61747-6. For each display, we measured at the centre of the screen the chromaticity coordinates at horizontal viewing angles of 0, 20, 40, 60 and 80 degrees for the achromatic (A), red (R), green (G) and blue (B) channels. Results showed a greater color-gamut area for the display with LED backlight, compared with the CCFL backlight, indicating a greater range of colors perceptible by human vision. This color-gamut area diminished with viewing angle for both displays. Larger differences between trends across viewing angles were observed in the LED backlight, especially for the R and G channels, demonstrating a higher variability of the chromaticity coordinates with viewing angle. The best additivity was reached by the LED-backlight display (a lower error percentage). Overall, the LED-backlight display provided better color performance.
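The color-gamut area compared above is simply the area of the triangle spanned by the R, G and B primaries in (x, y) chromaticity space, computable with the shoelace formula. The primaries below are illustrative sRGB-like values, not the measured displays:

```python
def gamut_area(red_xy, green_xy, blue_xy):
    """Area of the RGB primary triangle in CIE (x, y) chromaticity
    coordinates, via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = red_xy, green_xy, blue_xy
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# illustrative sRGB-like primaries
print(gamut_area((0.64, 0.33), (0.30, 0.60), (0.15, 0.06)))
```

Repeating this computation with the chromaticity coordinates measured at each viewing angle gives the gamut-shrinkage curves the study compares between backlight types.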
Modular multiapertures for light sensors
NASA Technical Reports Server (NTRS)
Rizzo, A. A.
1977-01-01
Process involves electroplating multiaperture masks as a unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems when a precise, wide-angle field of view is needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davies, R.
The spatial autocorrelation functions of broad-band longwave and shortwave radiances measured by the Earth Radiation Budget Experiment (ERBE) are analyzed as a function of view angle in an investigation of the general effects of scene inhomogeneity on radiation. For nadir views, the correlation distance of the autocorrelation function is about 900 km for longwave radiance and about 500 km for shortwave radiance, consistent with higher degrees of freedom in shortwave reflection. Both functions rise monotonically with view angle, but there is a substantial difference in the relative angular dependence of the shortwave and longwave functions, especially for view angles less than 50 deg. In this range, the increase with angle of the longwave functions is found to depend only on the expansion of pixel area with angle, whereas the shortwave functions show an additional dependence on angle that is attributed to the occlusion of inhomogeneities by cloud height variations. Beyond a view angle of about 50 deg, both longwave and shortwave functions appear to be affected by cloud sides. The shortwave autocorrelation functions do not satisfy the principle of directional reciprocity, thereby proving that the average scene is horizontally inhomogeneous over the scale of an ERBE pixel (1500 sq km). Coarse stratification of the measurements by cloud amount, however, indicates that the average cloud-free scene does satisfy directional reciprocity on this scale.
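A correlation distance like the ones quoted above can be extracted from a 1-D radiance series by finding the lag at which the normalized autocorrelation first falls below a threshold. The sketch below uses an illustrative 1/e criterion and toy data, not ERBE's exact definition or measurements:

```python
import math

def correlation_distance(values, spacing_km, threshold=1.0 / math.e):
    """Lag distance at which the normalized autocorrelation of a 1-D
    series first drops below `threshold` (illustrative definition)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    var = sum(d * d for d in dev) / n
    for lag in range(1, n):
        c = sum(dev[i] * dev[i + lag] for i in range(n - lag)) / ((n - lag) * var)
        if c < threshold:
            return lag * spacing_km
    return n * spacing_km  # never decorrelated within the series

# a rapidly alternating field decorrelates within one 25 km step
print(correlation_distance([1.0, -1.0] * 8, 25.0))  # 25.0
```

Smooth, slowly varying scenes stay correlated over many lags and so return a long distance, which is the behavior behind the ~900 km longwave versus ~500 km shortwave contrast.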
Optical parameters of TN display with dichroic dye
NASA Astrophysics Data System (ADS)
Olifierczuk, Marek; Zielinski, Jerzy; Perkowski, Pawel
2000-05-01
The present work contains studies of the optical parameters (contrast ratio, viewing angle, birefringence and brightness) of a twisted nematic display with a black dichroic dye, designed for application in large-area information and advertising systems. A numerical optimization of the display with the dye has been done. The absorption characteristic of the dye has been obtained. The birefringence Δn of the doped mixtures has been measured. The contrast ratio of the doped mixtures has been measured over a wide temperature range from -25 °C to +70 °C. The angular characteristics of the contrast ratio at +20 °C have been obtained. Detailed results describing the effect of the dye on the temperature dependence of birefringence and contrast ratio, as well as its effect on the viewing angle for the first and second transmission minima, are presented. Additionally, the dielectric characteristics of different mixtures are shown.
What convention is used for the illumination and view angles?
Atmospheric Science Data Center
2014-12-08
... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...
Digital 3D holographic display using scattering layers for enhanced viewing angle and image size
NASA Astrophysics Data System (ADS)
Yu, Hyeonseung; Lee, KyeoReh; Park, Jongchan; Park, YongKeun
2017-05-01
In digital 3D holographic displays, the generation of realistic 3D images has been hindered by limited viewing angle and image size. Here we demonstrate a digital 3D holographic display using volume speckle fields produced by scattering layers in which both the viewing angle and the image size are greatly enhanced. Although volume speckle fields exhibit random distributions, the transmitted speckle fields have a linear and deterministic relationship with the input field. By modulating the incident wavefront with a digital micro-mirror device, volume speckle patterns are controlled to generate 3D images of micrometer-size optical foci with 35° viewing angle in a volume of 2 cm × 2 cm × 2 cm.
Airborne system for multispectral, multiangle polarimetric imaging.
Bowles, Jeffrey H; Korwan, Daniel R; Montes, Marcos J; Gray, Deric J; Gillis, David B; Lamela, Gia M; Miller, W David
2015-11-01
In this paper, we describe the design, fabrication, calibration, and deployment of an airborne multispectral polarimetric imager. The motivation for the development of this instrument was to explore its ability to provide information about water constituents, such as particle size and type. The instrument is based on four 16 MP cameras and uses wire grid polarizers (aligned at 0°, 45°, 90°, and 135°) to provide the separation of the polarization states. A five-position filter wheel provides for four narrow-band spectral filters (435, 550, 625, and 750 nm) and one blocked position for dark-level measurements. When flown, the instrument is mounted on a programmable stage that provides control of the view angles. View angles that range to ±65° from the nadir have been used. Data processing provides a measure of the polarimetric signature as a function of both the view zenith and view azimuth angles. As a validation of our initial results, we compare our measurements, over water, with the output of a Monte Carlo code, both of which show neutral points off the principle plane. The locations of the calculated and measured neutral points are compared. The random error level in the measured degree of linear polarization (8% at 435) is shown to be better than 0.25%.
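With analyzers at 0°, 45°, 90° and 135°, the linear Stokes parameters and degree of linear polarization follow from simple intensity sums and differences. A minimal sketch of that standard reduction (not the instrument's calibrated processing pipeline):

```python
import math

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters, degree of linear polarization (DoLP),
    and angle of polarization (degrees) from four analyzer intensities."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs vertical excess
    s2 = i45 - i135                      # +45 vs -45 excess
    dolp = math.hypot(s1, s2) / s0
    aop_deg = 0.5 * math.degrees(math.atan2(s2, s1))
    return s0, s1, s2, dolp, aop_deg

# fully horizontally polarized unit-intensity beam
print(linear_stokes(1.0, 0.5, 0.0, 0.5))  # (1.0, 1.0, 0.0, 1.0, 0.0)
```

The four-analyzer layout over-determines the three linear Stokes components, which is what lets an instrument like this check its own random error level.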
Investigation of microwave hologram techniques for application to earth resources
NASA Technical Reports Server (NTRS)
Larson, R. W.; Bayma, R. W.; Evans, M. B.; Zelenka, J. S.; Doss, H. W.; Ferris, J. E.
1974-01-01
An investigation of microwave hologram techniques for application to earth resources was conducted during the period from June 1971 to November 1972. The objective of this investigation has been to verify the feasibility of an orbital microwave holographic radar experiment. The primary advantage of microwave hologram radar (MHR) over the side-looking airborne radar (SLAR) is that of aspect or viewing angle; the MHR has a viewing angle identical with that of photography and IR systems. The combination of these systems can thus extend the multispectral analysis concept to span optical through microwave wavelengths. Another advantage is the capacity of the MHR system to generate range contours by operating in a two-frequency mode. It should be clear that along-track resolution of an MHR can be comparable with SLAR systems, but cross-track resolution will be approximately an order of magnitude coarser than the range resolution achievable with an arbitrary SLAR system. An advantage of the MHR over the SLAR is that less average transmitter power is required. This reduction in power results from the much larger receiving apertures associated with MHR systems.
Wide-angle vision for road views
NASA Astrophysics Data System (ADS)
Huang, F.; Fehrs, K.-K.; Hartmann, G.; Klette, R.
2013-03-01
The field-of-view of a wide-angle image is greater than (say) 90 degrees, and so contains more information than available in a standard image. A wide field-of-view is more advantageous than standard input for understanding the geometry of 3D scenes, and for estimating the poses of panoramic sensors within such scenes. Thus, wide-angle imaging sensors and methodologies are commonly used in various road-safety, street surveillance, street virtual touring, or street 3D modelling applications. The paper reviews related wide-angle vision technologies by focusing on mathematical issues rather than on hardware.
Theoretical Limits of Lunar Vision Aided Navigation with Inertial Navigation System
2015-03-26
camera model. Light reflected or projected from objects in the scene of the outside world is taken in by the aperture (or opening) shaped as a double...model’s analog aspects with an analog-to-digital interface converting raw images of the outside world scene into digital information a computer can use to...Figure 2.7. Digital Image Coordinate System. Used with permission [30]. Angular Field of View. The angular field of view is the angle of the world scene
Optic for industrial endoscope/borescope with narrow field of view and low distortion
Stone, Gary F.; Trebes, James E.
2005-08-16
An optic for the imaging optics on the distal end of a flexible fiberoptic endoscope or rigid borescope inspection tool. The image coverage is over a narrow (<20 degrees) field of view with very low optical distortion (<5% pincushion or barrel distortion), compared to the typical <20% distortion. The optic will permit non-contact surface roughness measurements using optical techniques. This optic will permit simultaneous collection of selected image plane data, which data can then be subsequently optically processed. The image analysis will yield non-contact surface topology data for inspection where access to the surface does not permit verification of surface topology with a mechanical stylus profilometer. The optic allows a very broad spectral band or range of optical inspection. It is capable of spectroscopic imaging and fluorescence-induced imaging when a scanning illumination source is used. The total viewing angle for this optic is 10 degrees for the full field of view, compared to the 40-70 degrees full-angle field of view of conventional gradient index (GRIN) lens systems.
MESSENGER Reveals Mercury in New Detail
2008-01-16
As NASA's MESSENGER spacecraft approached Mercury on January 14, 2008, its Narrow-Angle Camera on the Mercury Dual Imaging System (MDIS) instrument captured this view of the planet's rugged, cratered landscape illuminated obliquely by the Sun.
10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Two Perspectives on Forest Fire
NASA Technical Reports Server (NTRS)
2002-01-01
Multi-angle Imaging Spectroradiometer (MISR) images of smoke plumes from wildfires in western Montana acquired on August 14, 2000. A portion of Flathead Lake is visible at the top, and the Bitterroot Range traverses the images. The left view is from MISR's vertical-viewing (nadir) camera. The right view is from the camera that looks forward at a steep angle (60 degrees). The smoke location and extent are far more visible when seen at this highly oblique angle. However, vegetation is much darker in the forward view. A brown burn scar is located nearly in the exact center of the nadir image, while in the high-angle view it is shrouded in smoke. Also visible in the center and upper right of the images, and more obvious in the clearer nadir view, are checkerboard patterns on the surface associated with land ownership boundaries and logging. Compare these images with the high resolution infrared imagery captured nearby by Landsat 7 half an hour earlier. Images by NASA/GSFC/JPL, MISR Science Team.
System requirements for head down and helmet mounted displays in the military avionics environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flynn, M.F.; Kalmanash, M.; Sethna, V.
1996-12-31
The introduction of flat panel display technologies into the military avionics cockpit is a challenging proposition, due to the very difficult system-level requirements which must be met. These relate to environmental extremes (temperature and vibration), severe ambient lighting conditions (10,000 fL down to nighttime viewing), night vision system compatibility, and wide viewing angle. At the same time, the display system must be packaged in minimal space and use minimal power. The authors will present details on the display system requirements for both head down and helmet mounted systems, as well as information on how these challenges may be overcome.
NASA Astrophysics Data System (ADS)
Tate, Tyler H.; McGregor, Davis; Barton, Jennifer K.
2017-02-01
The optical design for a dual modality endoscope based on piezo scanning fiber technology is presented including a novel technique to combine forward-viewing navigation and side viewing OCT. Potential applications include navigating body lumens such as the fallopian tube, biliary ducts and cardiovascular system. A custom cover plate provides a rotationally symmetric double reflection of the OCT beam to deviate and focus the OCT beam out the side of the endoscope for cross-sectional imaging of the tubal lumen. Considerations in the choice of the scanning fiber are explored and a new technique to increase the divergence angle of the scanning fiber to improve system performance is presented. Resolution and the necessary scanning density requirements to achieve Nyquist sampling of the full image are considered. The novel optical design lays the groundwork for a new approach integrating side-viewing OCT into multimodality endoscopes for small lumen imaging.
Arabi, Hossein; Kamali Asl, Ali Reza; Ay, Mohammad Reza; Zaidi, Habib
2015-07-01
The purpose of this work is to evaluate the impact of optimization of magnification on performance parameters of the variable resolution X-ray (VRX) CT scanner. A realistic model based on an actual VRX CT scanner was implemented in the GATE Monte Carlo simulation platform. To evaluate the influence of system magnification, spatial resolution, field-of-view (FOV) and scatter-to-primary ratio of the scanner were estimated for both fixed and optimum object magnification at each detector rotation angle. Comparison and inference between these performance parameters were performed angle by angle to determine appropriate object position at each opening half angle. Optimization of magnification resulted in a trade-off between spatial resolution and FOV of the scanner at opening half angles of 90°-12°, where the spatial resolution increased up to 50% and the scatter-to-primary ratio decreased from 4.8% to 3.8% at a detector angle of about 90° for the same FOV and X-ray energy spectrum. The disadvantage of magnification optimization at these angles is the significant reduction of the FOV (up to 50%). Moreover, magnification optimization was definitely beneficial for opening half angles below 12° improving the spatial resolution from 7.5 cy/mm to 20 cy/mm. Meanwhile, the FOV increased by more than 50% at these angles. It can be concluded that optimization of magnification is essential for opening half angles below 12°. For opening half angles between 90° and 12°, the VRX CT scanner magnification should be set according to the desired spatial resolution and FOV. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Cardiac motion correction based on partial angle reconstructed images in x-ray CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Seungeon; Chang, Yongjin; Ra, Jong Beom, E-mail: jbra@kaist.ac.kr
2015-05-15
Purpose: Cardiac x-ray CT imaging is still challenging due to heart motion, which cannot be ignored even with the current rotation speed of the equipment. In response, many algorithms have been developed to compensate remaining motion artifacts by estimating the motion using projection data or reconstructed images. In these algorithms, accurate motion estimation is critical to the compensated image quality. In addition, since the scan range is directly related to the radiation dose, it is preferable to minimize the scan range in motion estimation. In this paper, the authors propose a novel motion estimation and compensation algorithm using a sinogram with a rotation angle of less than 360°. The algorithm estimates the motion of the whole heart area using two opposite 3D partial angle reconstructed (PAR) images and compensates the motion in the reconstruction process. Methods: A CT system scans the thoracic area including the heart over an angular range of 180° + α + β, where α and β denote the detector fan angle and an additional partial angle, respectively. The obtained cone-beam projection data are converted into cone-parallel geometry via row-wise fan-to-parallel rebinning. Two conjugate 3D PAR images, whose center projection angles are separated by 180°, are then reconstructed with an angular range of β, which is considerably smaller than a short scan range of 180° + α. Although these images include limited view angle artifacts that disturb accurate motion estimation, they have considerably better temporal resolution than a short scan image. Hence, after preprocessing these artifacts, the authors estimate a motion model during a half rotation for a whole field of view via nonrigid registration between the images. Finally, motion-compensated image reconstruction is performed at a target phase by incorporating the estimated motion model.
The target phase is selected as that corresponding to a view angle that is orthogonal to the center view angles of two conjugate PAR images. To evaluate the proposed algorithm, digital XCAT and physical dynamic cardiac phantom datasets are used. The XCAT phantom datasets were generated with heart rates of 70 and 100 bpm, respectively, by assuming a system rotation time of 300 ms. A physical dynamic cardiac phantom was scanned using a slowly rotating XCT system so that the effective heart rate would be 70 bpm for a system rotation speed of 300 ms. Results: In the XCAT phantom experiment, motion-compensated 3D images obtained from the proposed algorithm show coronary arteries with fewer motion artifacts for all phases. Moreover, object boundaries contaminated by motion are well restored. Even though object positions and boundary shapes are still somewhat different from the ground truth in some cases, the authors see that visibilities of coronary arteries are improved noticeably and motion artifacts are reduced considerably. The physical phantom study also shows that the visual quality of motion-compensated images is greatly improved. Conclusions: The authors propose a novel PAR image-based cardiac motion estimation and compensation algorithm. The algorithm requires an angular scan range of less than 360°. The excellent performance of the proposed algorithm is illustrated by using digital XCAT and physical dynamic cardiac phantom datasets.
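The angular budget of the method reduces to simple arithmetic: the total scan spans 180° + α + β, while each conjugate PAR image is reconstructed from only β of data, which is why its temporal resolution beats a short scan. A sketch with illustrative (not the authors') angle values:

```python
def total_scan_deg(fan_angle_deg, partial_angle_deg):
    # Total angular range used by the proposed method: 180° + α + β
    return 180.0 + fan_angle_deg + partial_angle_deg

def short_scan_deg(fan_angle_deg):
    # Conventional short-scan range: 180° + α
    return 180.0 + fan_angle_deg

alpha, beta = 52.0, 60.0                 # illustrative fan angle and partial angle
assert total_scan_deg(alpha, beta) < 360.0   # still below one full rotation
assert beta < short_scan_deg(alpha)          # each PAR image uses far less data,
                                             # hence better temporal resolution
```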
VIEW OF CABLES AND TAPES ASSOCIATED WITH A-DRIVE CONTROL ROD SYSTEM, AT LEVEL +15, DIRECTLY ABOVE PDP CONTROL ROOM, LOOKING NORTHWEST. THE CABLES FROM THE PDP ROOM GO THROUGH THE CONCRETE WALL, MAKE A RIGHT ANGLE TURN DOWNWARD, AND DESCEND INTO THE PDP CONTROL ROOM AS VERTICAL TAPES - Physics Assembly Laboratory, Area A/M, Savannah River Site, Aiken, Aiken County, SC
NASA Technical Reports Server (NTRS)
Davies, Roger
1994-01-01
The spatial autocorrelation functions of broad-band longwave and shortwave radiances measured by the Earth Radiation Budget Experiment (ERBE) are analyzed as a function of view angle in an investigation of the general effects of scene inhomogeneity on radiation. For nadir views, the correlation distance of the autocorrelation function is about 900 km for longwave radiance and about 500 km for shortwave radiance, consistent with higher degrees of freedom in shortwave reflection. Both functions rise monotonically with view angle, but there is a substantial difference in the relative angular dependence of the shortwave and longwave functions, especially for view angles less than 50 deg. In this range, the increase with angle of the longwave functions is found to depend only on the expansion of pixel area with angle, whereas the shortwave functions show an additional dependence on angle that is attributed to the occlusion of inhomogeneities by cloud height variations. Beyond a view angle of about 50 deg, both longwave and shortwave functions appear to be affected by cloud sides. The shortwave autocorrelation functions do not satisfy the principle of directional reciprocity, thereby proving that the average scene is horizontally inhomogeneous over the scale of an ERBE pixel (1500 sq km). Coarse stratification of the measurements by cloud amount, however, indicates that the average cloud-free scene does satisfy directional reciprocity on this scale.
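A correlation distance of the kind reported here (the separation at which the autocorrelation first falls to 1/e) can be estimated from a radiance series. A minimal 1-D sketch on synthetic red noise, assuming uniformly spaced pixels; this is an illustration of the statistic, not the ERBE processing:

```python
import numpy as np

def autocorrelation(x):
    """Normalized autocorrelation of a 1-D anomaly series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf / acf[0]

def correlation_distance(acf, pixel_km):
    """Distance at which the autocorrelation first drops below 1/e."""
    below = np.where(acf < 1.0 / np.e)[0]
    return below[0] * pixel_km if below.size else np.inf

rng = np.random.default_rng(0)
# Red-noise surrogate: an AR(1) process has an exponentially decaying autocorrelation.
x = np.zeros(5000)
for i in range(1, x.size):
    x[i] = 0.9 * x[i - 1] + rng.normal()
acf = autocorrelation(x)
d = correlation_distance(acf, pixel_km=40.0)   # assumed ~40 km pixel spacing
```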
Thermophysical Properties of Selected Aerospace Materials. Part 1. Thermal Radiative Properties
1976-01-01
Each material entry consists of a text discussion and a specification table. The text reviews and discusses the available data and information, and the theoretical guidelines and other factors on which the critical evaluation, analysis, and synthesis of the data were based. Nomenclature: θ′, zenith angle for viewing conditions; Δθ, half angle of acceptance of the optical system; κ, loss value factor; λ, wavelength; ρ, reflectance.
1999-08-24
One wide-angle and eight narrow-angle camera images of Miranda, taken by NASA Voyager 2, were combined in this view. The controlled mosaic was transformed to an orthographic view centered on the south pole.
Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish
2018-01-01
Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133
Measuring the Viewing Angle of GW170817 with Electromagnetic and Gravitational Waves
NASA Astrophysics Data System (ADS)
Finstad, Daniel; De, Soumi; Brown, Duncan A.; Berger, Edo; Biwer, Christopher M.
2018-06-01
The joint detection of gravitational waves (GWs) and electromagnetic (EM) radiation from the binary neutron star merger GW170817 ushered in a new era of multi-messenger astronomy. Joint GW-EM observations can be used to measure the parameters of the binary with better precision than either observation alone. Here, we use joint GW-EM observations to measure the viewing angle of GW170817, the angle between the binary's angular momentum and the line of sight. We combine a direct measurement of the distance to the host galaxy of GW170817 (NGC 4993) of 40.7 ± 2.36 Mpc with the Laser Interferometer Gravitational-wave Observatory (LIGO)/Virgo GW data and find that the viewing angle is 32 (+10/-13) ± 1.7 degrees (90% confidence, statistical, and systematic errors). We place a conservative lower limit on the viewing angle of ≥13°, which is robust to the choice of prior. This measurement provides a constraint on models of the prompt γ-ray and radio/X-ray afterglow emission associated with the merger; for example, it is consistent with the off-axis viewing angle inferred for a structured jet model. We provide for the first time the full posterior samples from Bayesian parameter estimation of LIGO/Virgo data to enable further analysis by the community.
Single DMD time-multiplexed 64-views autostereoscopic 3D display
NASA Astrophysics Data System (ADS)
Loreti, Luigi
2013-03-01
Based on a previous prototype of a real-time 3D holographic display developed last year, we developed a new concept of auto-stereoscopic multiview display (64 views): a wide-angle (90°), 3D, full-color display. The display is based on an RGB laser light source illuminating a DMD (Discovery 4100, 0.7") at 24,000 fps and an image deflection system made with an AOD (acousto-optic deflector) driven by a piezo-electric transducer generating a variable standing acoustic wave on the crystal, which acts as a phase grating. The DMD projects in fast sequence 64 points of view of the image onto the crystal cube. Depending on the frequency of the standing wave, the input picture sent by the DMD is deflected into a different angle of view. A holographic screen at the proper distance diffuses the rays in the vertical direction (60°) and horizontally selects (1°) only the rays directed to the observer. A telescope optical system enlarges the image to the right dimension. A VHDL firmware to render in real time (16 ms) 64 views (16-bit 4:2:2) of a CAD model (obj, dxf, or 3Ds) and depth-map-encoded video images was developed in the resident Virtex5 FPGA of the Discovery 4100 SDK, thus eliminating the need for image transfer and high-speed links.
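The time-multiplexing budget implied by these numbers is simple arithmetic: 24,000 DMD frames per second shared across 64 views sets the per-view refresh rate and the time to cycle the full view set.

```python
dmd_fps = 24_000                       # DMD frames per second
views = 64                             # time-multiplexed points of view
refresh_per_view = dmd_fps / views     # how often each view is redrawn
frame_time_ms = 1000.0 * views / dmd_fps   # time to cycle all 64 views once
assert refresh_per_view == 375.0           # each view refreshed 375 times/s
assert frame_time_ms < 16.0                # well inside the 16 ms render budget
```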
Water depth measurement using an airborne pulsed neon laser system
NASA Technical Reports Server (NTRS)
Hoge, F. E.; Swift, R. N.; Frederick, E. B.
1980-01-01
The paper presents the water depth measurement using an airborne pulsed neon laser system. The results of initial base-line field test results of NASA airborne oceanographic lidar in the bathymetry mode are given, with water-truth measurements of depth and beam attenuation coefficients by boat taken at the same time as overflights to aid in determining the system's operational performance. The nadir-angle tests and field-of-view data are presented; this laser bathymetry system is an improvement over prior models in that (1) the surface-to-bottom pulse waveform is digitally recorded on magnetic tape, and (2) wide-swath mapping data may be routinely acquired using a 30 deg full-angle conical scanner.
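The depth retrieval behind this bathymetry mode is a two-way time-of-flight computation on the recorded surface-to-bottom pulse separation. A sketch for the nadir case, with the speed of light in water approximated via an assumed refractive index of about 1.33:

```python
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33             # assumed refractive index of seawater

def depth_m(delta_t_ns):
    """Water depth from the surface-return/bottom-return time separation
    (nadir view): two-way path, light slowed by the water's refractive index."""
    v_water = C_VACUUM / N_WATER
    return v_water * (delta_t_ns * 1e-9) / 2.0

d = depth_m(88.7)          # ~88.7 ns separation corresponds to ~10 m of water
assert 9.9 < d < 10.1
```

Off-nadir scan angles would additionally require a geometric path correction, which this sketch omits.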
Modeling contact angle hysteresis of a liquid droplet sitting on a cosine wave-like pattern surface.
Promraksa, Arwut; Chen, Li-Jen
2012-10-15
A liquid droplet sitting on a hydrophobic surface with a cosine wave-like square-array pattern in the Wenzel state is simulated by using the Surface Evolver to determine the contact angle. For a fixed drop volume, multiple metastable states are obtained at two different surface roughnesses. Unusual and non-circular shape of the three-phase contact line of a liquid droplet sitting on the model surface is observed due to corrugation and distortion of the contact line by structure of the roughness. The contact angle varies along the contact line for each metastable state. The maximum and minimum contact angles among the multiple metastable states at a fixed viewing angle correspond to the advancing and the receding contact angles, respectively. It is interesting to observe that the advancing/receding contact angles (and contact angle hysteresis) are a function of viewing angle. In addition, the receding (or advancing) contact angles at different viewing angles are determined at different metastable states. The contact angle of minimum energy among the multiple metastable states is defined as the most stable (equilibrium) contact angle. The Wenzel model is not able to describe the contact angle along the three-phase contact line. The contact angle hysteresis at different drop volumes is determined. The number of the metastable states increases with increasing drop volume. Drop volume effect on the contact angles is also discussed. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.
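For contrast, the Wenzel model that the abstract finds insufficient predicts a single apparent angle from the Young angle and the roughness ratio r via cos θ_W = r cos θ_Y. A minimal sketch with illustrative values:

```python
import math

def wenzel_angle_deg(young_angle_deg, roughness):
    """Wenzel apparent contact angle: cos(theta_W) = r * cos(theta_Y).
    Valid only while |r * cos(theta_Y)| <= 1."""
    c = roughness * math.cos(math.radians(young_angle_deg))
    if abs(c) > 1.0:
        raise ValueError("Wenzel relation breaks down for this r, theta_Y")
    return math.degrees(math.acos(c))

# Roughness amplifies hydrophobicity: a 110° Young angle grows with r > 1.
assert abs(wenzel_angle_deg(110.0, 1.0) - 110.0) < 1e-9   # smooth surface: unchanged
assert wenzel_angle_deg(110.0, 1.5) > 110.0
```

Because it yields one angle per (r, θ_Y) pair, the Wenzel model cannot reproduce the contact-angle variation along the corrugated contact line that the simulations reveal.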
Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José
2015-06-04
In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters.
The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required.
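One way plant height could be read off a depth-derived point cloud is a robust upper percentile of the z coordinates above an assumed ground plane; the study's actual pipeline used Kinect Fusion, so this is only a hedged sketch of the idea:

```python
import numpy as np

def plant_height_m(points_xyz, ground_z=0.0, percentile=99.5):
    """Estimate plant height from a 3-D point cloud.
    A high percentile of z (rather than the max) resists depth-sensor outliers."""
    z = np.asarray(points_xyz)[:, 2]
    return float(np.percentile(z, percentile) - ground_z)

# Synthetic cloud: a ~1.2 m plant plus a few spurious high points.
rng = np.random.default_rng(1)
plant = rng.uniform([0.0, 0.0, 0.0], [0.3, 0.3, 1.2], size=(5000, 3))
noise = np.array([[0.1, 0.1, 2.5], [0.2, 0.0, 3.0]])   # simulated sensor artifacts
cloud = np.vstack([plant, noise])
h = plant_height_m(cloud)
assert 1.1 < h < 1.3             # close to the true 1.2 m despite outliers
assert cloud[:, 2].max() > 2.0   # the raw maximum would have been badly wrong
```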
Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report
NASA Technical Reports Server (NTRS)
Camperchioli, William
2005-01-01
A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.
Payload isolation and stabilization by a Suspended Experiment Mount (SEM)
NASA Technical Reports Server (NTRS)
Bailey, Wayne L.; Desanctis, Carmine E.; Nicaise, Placide D.; Schultz, David N.
1992-01-01
Many Space Shuttle and Space Station payloads can benefit from isolation from crew or attitude control system disturbances. Preliminary studies have been performed for a Suspended Experiment Mount (SEM) system that will provide isolation from accelerations and stabilize the viewing direction of a payload. The concept consists of a flexible suspension system and payload-mounted control moment gyros. The suspension system, which is rigidly locked for ascent and descent, isolates the payload from high frequency disturbances. The control moment gyros stabilize the payload orientation. The SEM will be useful for payloads that require a lower-g environment than a manned vehicle can provide, such as materials processing, and for payloads that require stabilization of pointing direction, but not large angle slewing, such as nadir-viewing earth observation or solar viewing payloads.
Three-dimensional face model reproduction method using multiview images
NASA Astrophysics Data System (ADS)
Nagashima, Yoshio; Agawa, Hiroshi; Kishino, Fumio
1991-11-01
This paper describes a method of reproducing three-dimensional face models using multi-view images for a virtual space teleconferencing system that achieves a realistic visual presence for teleconferencing. The goal of this research, as an integral component of a virtual space teleconferencing system, is to generate a three-dimensional face model from facial images, synthesize images of the model virtually viewed from different angles, and with natural shadow to suit the lighting conditions of the virtual space. The proposed method is as follows: first, front and side view images of the human face are taken by TV cameras. The 3D data of facial feature points are obtained from front- and side-views by an image processing technique based on the color, shape, and correlation of face components. Using these 3D data, the prepared base face models, representing typical Japanese male and female faces, are modified to approximate the input facial image. The personal face model, representing the individual character, is then reproduced. Next, an oblique view image is taken by TV camera. The feature points of the oblique view image are extracted using the same image processing technique. A more precise personal model is reproduced by fitting the boundary of the personal face model to the boundary of the oblique view image. The modified boundary of the personal face model is determined by using face direction, namely rotation angle, which is detected based on the extracted feature points. After the 3D model is established, the new images are synthesized by mapping facial texture onto the model.
NASA Astrophysics Data System (ADS)
Nikolashkin, S. V.; Reshetnikov, A. A.
2017-11-01
The system of video surveillance used during active rocket experiments at the Polar geophysical observatory "Tixie" and in studies of the effects of "Soyuz" vehicle launches from the "Vostochny" cosmodrome over the territory of the Republic of Sakha (Yakutia) is presented. The system consists of three AHD video cameras with different angles of view mounted on a common platform on a tripod, with the possibility of manual guiding. The main camera, with a high-sensitivity black-and-white CCD matrix (SONY EXview HADII), is equipped, depending on the task, with an "MTO-1000" (F = 1000 mm) or "Jupiter-21M" (F = 300 mm) lens and is designed for more detailed imaging of luminous formations. The second camera, of the same type but with a 30-degree angle of view, is intended for imaging the general scene and large objects, as well as for referencing object coordinates to the stars. The third, a color wide-angle camera (120 degrees), is designed to be referenced to landmarks in the daytime; the optical axis of this channel is directed 60 degrees downward. The data are recorded on the hard disk of a four-channel digital video recorder. Tests of the original two-channel version of the system were conducted during the launch of a geophysical rocket in Tixie in September 2015 and demonstrated its effectiveness.
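The quoted angles of view follow from lens focal length and sensor width via 2·arctan(w/2f). The sensor width below is an assumption (the source does not state it), so the numbers are only indicative:

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view: 2 * atan(sensor_width / (2 * f))."""
    return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

SENSOR_W = 4.8   # assumed 1/3-inch-class CCD width in mm (not given in the source)
narrow = angle_of_view_deg(SENSOR_W, 1000.0)   # with the MTO-1000 lens
medium = angle_of_view_deg(SENSOR_W, 300.0)    # with the Jupiter-21M lens
assert narrow < 0.5      # a fraction of a degree: suitable for detail imaging
assert narrow < medium   # shorter focal length widens the view
```

Achieving the 30-degree and 120-degree fields of the other two channels requires correspondingly much shorter focal lengths for the same sensor.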
11. View of south side of radar scanner building no. 104 showing personnel exit door at side building, showing DR 1 antenna from oblique angle on foundation berm with DR 2 and DR 3 antennae in background. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
NASA Technical Reports Server (NTRS)
Bauer, James M.; Grav, Tommy; Buratti, Bonnie J.; Hicks, Michael D.
2006-01-01
During its 2005 January opposition, the saturnian system could be viewed at an unusually low phase angle. We surveyed a subset of Saturn's irregular satellites to obtain their true opposition magnitudes, or nearly so, down to phase angle values of 0.01 deg. Combining our data taken at the Palomar 200-inch and Cerro Tololo Inter-American Observatory's 4-m Blanco telescope with those in the literature, we present the first phase curves for nearly half the irregular satellites originally reported by Gladman et al. [2001. Nature 412, 163-166], including Paaliaq (SXX), Siarnaq (SXXIX), Tarvos (SXXI), Ijiraq (SXXII), Albiorix (SXVI), and additionally Phoebe's narrowest angle brightness measured to date. We find centaur-like steepness in the phase curves or opposition surges in most cases with the notable exception of three, Albiorix and Tarvos, which are suspected to be of similar origin based on dynamical arguments, and Siarnaq.
Visual Costs of the Inhomogeneity of Luminance and Contrast by Viewing LCD-TFT Screens Off-Axis.
Ziefle, Martina; Groeger, Thomas; Sommer, Dietmar
2003-01-01
In this study the anisotropic characteristics of TFT-LCD (Thin-Film-Transistor-Liquid Crystal Display) screens were examined. Anisotropy occurs as the distribution of luminance and contrast changes over the screen surface due to different viewing angles. On the basis of detailed photometric measurements the detection performance in a visual reaction task was measured in different viewing conditions. Viewing angle (0 degrees, frontal view; 30 degrees, off-axis; 50 degrees, off-axis) as well as ambient lighting (a dark or illuminated room) were varied. Reaction times and accuracy of detection performance were recorded. Results showed TFT's anisotropy to be a crucial factor deteriorating performance. With an increasing viewing angle performance decreased. It is concluded that TFT's anisotropy is a limiting factor for overall suitability and usefulness of this new display technology.
NASA Astrophysics Data System (ADS)
Gao, Xin; Sang, Xinzhu; Yu, Xunbo; Zhang, Wanlu; Yan, Binbin; Yu, Chongxiu
2018-06-01
A floating 3D display system based on a Tessar lens array and a directional diffuser screen is proposed. The directional diffuser screen smooths the gaps of the lens array and makes the 3D image's brightness continuous. The optical structure and aberration characteristics of the floating three-dimensional (3D) display system are analyzed. Simulation and experiment show that the 3D image quality deteriorates as the image plane moves farther away and the viewing angle increases. To suppress the aberrations, the Tessar array is proposed according to the aberration characteristics of the floating 3D display system. A 3840 × 2160 liquid crystal display (LCD) panel with a size of 23.6 inches, a directional diffuser screen, and a Tessar array are used to display the final 3D images. The aberrations are reduced and the definition is improved compared with a display using a single-lens array. A display depth of more than 20 cm and a viewing angle of more than 45° can be achieved.
Performance evaluation of stereo endoscopic imaging system incorporating TFT-LCD.
Song, C-G; Park, S-K
2005-01-01
This paper presents a 3D endoscopic video system designed to improve visualization and enhance the ability of the surgeon to perform delicate endoscopic surgery. In a comparison of the polarized and electric shutter-type stereo imaging systems, the former was found to be superior in terms of both accuracy and speed for knot-tying and for the loop pass test. The results of our experiments show that the proposed 3D endoscopic system has a sufficiently wide viewing angle and zone for multi-viewing, and that it provides better image quality and more stable optical performance compared with the electric shutter-type.
Preferred viewing distance of liquid crystal high-definition television.
Lee, Der-Song
2012-01-01
This study explored the effect of TV size, illumination, and viewing angle on preferred viewing distance in high-definition liquid crystal display televisions (HDTV). Results showed that the mean preferred viewing distance was 2856 mm. TV size and illumination significantly affected preferred viewing distance. The larger the screen size, the greater the preferred viewing distance, at around 3-4 times the width of the screen (W). The greater the illumination, the greater the preferred viewing distance. Viewing angle also correlated significantly with preferred viewing distance. The more deflected from direct frontal view, the shorter the preferred viewing distance seemed to be. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
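The reported "3-4 times the screen width" relationship reduces to simple arithmetic once screen width is derived from the diagonal. The 16:9 aspect ratio and the 3.5-width midpoint below are assumptions for illustration, not the study's model:

```python
import math

def screen_width_mm(diagonal_in, aspect=(16, 9)):
    """Width of a screen from its diagonal in inches, for a given aspect ratio."""
    w, h = aspect
    return 25.4 * diagonal_in * w / math.hypot(w, h)

def preferred_distance_mm(diagonal_in, widths=3.5):
    """Rule-of-thumb preferred viewing distance: ~3-4 screen widths (midpoint here)."""
    return widths * screen_width_mm(diagonal_in)

w42 = screen_width_mm(42.0)
assert 925 < w42 < 935                            # a 42-inch 16:9 panel is ~930 mm wide
assert 2700 < preferred_distance_mm(42.0) < 3400  # bracketing the reported ~2856 mm mean
```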
Examining view angle effects on leaf N estimation in wheat using field reflectance spectroscopy
NASA Astrophysics Data System (ADS)
Song, Xiao; Feng, Wei; He, Li; Xu, Duanyang; Zhang, Hai-Yan; Li, Xiao; Wang, Zhi-Jie; Coburn, Craig A.; Wang, Chen-Yang; Guo, Tian-Cai
2016-12-01
Real-time, nondestructive monitoring of crop nitrogen (N) status is a critical factor for precision N management during wheat production. Over a 3-year period, we analyzed different wheat cultivars grown under different experimental conditions in China and Canada and studied the effects of viewing angle on the relationships between various vegetation indices (VIs) and leaf nitrogen concentration (LNC) using hyperspectral data from 11 field experiments. The objective was to improve the prediction accuracy by minimizing the effects of viewing angle on LNC estimation to construct a novel vegetation index (VI) for use under different experimental conditions. We examined the stability of previously reported optimum VIs obtained from 13 traditional indices for estimating LNC at 13 viewing zenith angles (VZAs) in the solar principal plane (SPP). Backscattering direction showed better index performance than forward scattering direction. Red-edge VIs including modified normalized difference vegetation index (mND705), ratio index within the red edge region (RI-1dB) and normalized difference red edge index (NDRE) were highly correlated with LNC, as confirmed by high R2 determination coefficients. However, these common VIs tended to saturation, as the relationships strongly depended on experimental conditions. To overcome the influence of VZA on VIs, the chlorophyll- and LNC-sensitive NDRE index was divided by the floating-position water band index (FWBI) to generate the integrated narrow-band vegetation index. The highest correlation between the novel NDRE/FWBI parameter and LNC (R2 = 0.852) occurred at -10°, while the lowest correlation (R2 = 0.745) occurred at 60°. NDRE/FWBI was more highly correlated with LNC than existing commonly used VIs at an identical viewing zenith angle. 
Upon further analysis of angle combinations, our novel VI exhibited the best performance, with the best prediction accuracy at 0° to -20° (R2 = 0.838, RMSE = 0.360) and relatively good accuracy at 0° to -30° (R2 = 0.835, RMSE = 0.366). As it is possible to monitor plant N status over a wide range of angles using portable spectrometers, viewing angles of as much as 0° to -30° are common. Consequently, we developed a united model across angles of 0° to -30° to reduce the effects of viewing angle on LNC prediction in wheat. The proposed combined NDRE/FWBI parameter, designated the wide-angle-adaptability nitrogen index (WANI), is superior for estimating LNC in wheat on a regional scale in China and Canada.
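NDRE is a standard red-edge normalized difference; the FWBI form below (a reference-band reflectance over the floating minimum within a water absorption window) is an assumed construction, since the abstract does not give band definitions. A hedged sketch on a toy spectrum:

```python
import numpy as np

def ndre(refl, wl, nir=790, red_edge=720):
    """Normalized difference red edge index (standard band choices assumed)."""
    r = lambda w: float(np.interp(w, wl, refl))
    return (r(nir) - r(red_edge)) / (r(nir) + r(red_edge))

def fwbi(refl, wl, ref_band=850, search=(930, 980)):
    """Floating-position water band index (assumed form): reference-band
    reflectance over the minimum reflectance in the water absorption window."""
    wl = np.asarray(wl, dtype=float)
    mask = (wl >= search[0]) & (wl <= search[1])
    return float(np.interp(ref_band, wl, refl)) / float(np.min(np.asarray(refl)[mask]))

# Toy canopy spectrum on a 400-1000 nm grid (illustrative shape only).
wl = np.arange(400, 1001, 10, dtype=float)
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 715) / 15.0))   # red-edge sigmoid
refl[wl >= 930] *= 0.9                                     # mild water absorption
combined = ndre(refl, wl) / fwbi(refl, wl)   # the paper's combined NDRE/FWBI predictor
assert ndre(refl, wl) > 0    # vegetation: NIR above red edge
assert fwbi(refl, wl) > 1    # reference band above the absorption minimum
```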
2017-05-30
Before NASA's Cassini entered its Grand Finale orbits, it acquired unprecedented views of the outer edges of the main ring system. For example, this close-up view of the Keeler Gap, which is near the outer edge of Saturn's main rings, shows in great detail just how much the moon Daphnis affects the edges of the gap. Daphnis creates waves in the edges of the gap through its gravitational influence. Some clumping of ring particles can be seen in the perturbed edge, similar to what was seen on the edges of the Encke Gap back when Cassini arrived at Saturn in 2004. This view looks toward the sunlit side of the rings from about 3 degrees above the ring plane. The view was acquired at a distance of approximately 18,000 miles (30,000 kilometers) from Daphnis and at a Sun-Daphnis-spacecraft, or phase, angle of 69 degrees. Image scale is 581 feet (177 meters) per pixel. The image was taken in visible light with the Cassini spacecraft narrow-angle camera on Jan. 16, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21329
Lee, Ji-Hoon; Lee, Jung Jin; Lim, Young Jin; Kundu, Sudarshan; Kang, Shin-Woong; Lee, Seung Hee
2013-11-04
Long-standing electro-optic problems of polymer-dispersed liquid crystals (PDLCs), such as low contrast ratio and a decrease in transmittance at oblique viewing angles, were addressed with a mixture of dual-frequency liquid crystal (DFLC) and reactive mesogen (RM). The DFLC and RM molecules were vertically aligned and then photo-polymerized using UV light. In the scattering state, under a 50 kHz electric field, the DFLC was switched to the planar state, giving a greater extraordinary refractive index than in a normal PDLC cell. Consequently, the scattering intensity and the contrast ratio were increased compared to a conventional PDLC cell. In the transparent state, under a 1 kHz electric field, the extraordinary refractive index of the DFLC was matched with the refractive index of the vertically aligned RM, so that light scattering at oblique viewing angles was minimized, giving rise to high transmittance at all viewing angles.
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Vanderbilt, V. C.; Robinson, B. F.; Biehl, L. L.; Vanderbilt, A. S.
1981-01-01
The reflectance response of wheat as a function of view angle was analyzed. The analysis, which assumes there are no atmospheric effects and otherwise simulates the response of a multispectral scanner, is based on spectra taken continuously in wavelength from 0.45 to 2.4 micrometers at more than 1200 view/illumination directions using an Exotech Model 20C spectroradiometer. Data were acquired six meters above four wheat canopies, each at a different growth stage. The analysis shows that the canopy reflective response is a pronounced function of illumination angle, scanner view angle, and wavelength. The variation is greater at low solar elevations than at high solar elevations.
2017-11-27
These two images illustrate just how far Cassini traveled to get to Saturn. On the left is one of the earliest images Cassini took of the ringed planet, captured during the long voyage from the inner solar system. On the right is one of Cassini's final images of Saturn, showing the site where the spacecraft would enter the atmosphere on the following day. In the left image, taken in 2001, about six months after the spacecraft passed Jupiter for a gravity assist flyby, the best view of Saturn using the spacecraft's high-resolution (narrow-angle) camera was on the order of what could be seen using the Earth-orbiting Hubble Space Telescope. At the end of the mission (at right), from close to Saturn, even the lower resolution (wide-angle) camera could capture just a tiny part of the planet. The left image looks toward Saturn from 20 degrees below the ring plane and was taken on July 13, 2001 in wavelengths of infrared light centered at 727 nanometers using the Cassini spacecraft narrow-angle camera. The view at right is centered on a point 6 degrees north of the equator and was taken in visible light using the wide-angle camera on Sept. 14, 2017. The view on the left was acquired at a distance of approximately 317 million miles (510 million kilometers) from Saturn. Image scale is about 1,900 miles (3,100 kilometers) per pixel. The view at right was acquired at a distance of approximately 360,000 miles (579,000 kilometers) from Saturn. Image scale is 22 miles (35 kilometers) per pixel. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA21353
Fiber optics welder having movable aligning mirror
Higgins, Robert W.; Robichaud, Roger E.
1981-01-01
A system for welding fiber optic waveguides together. The ends of the two fibers to be joined together are accurately, collinearly aligned in a vertical orientation and subjected to a controlled, diffuse arc to effect welding and thermal conditioning. A front-surfaced mirror mounted at a 45° angle to the optical axis of a stereomicroscope mounted for viewing the junction of the ends provides two orthogonal views of the interface during the alignment operation.
Higgins, R.W.; Robichaud, R.E.
A system is described for welding fiber optic waveguides together. The ends of the two fibers to be joined together are accurately, collinearly aligned in a vertical orientation and subjected to a controlled, diffuse arc to effect welding and thermal conditioning. A front-surfaced mirror mounted at a 45° angle to the optical axis of a stereomicroscope mounted for viewing the junction of the ends provides two orthogonal views of the interface during the alignment operation.
Normalization of multidirectional red and NIR reflectances with the SAVI
NASA Technical Reports Server (NTRS)
Huete, A. R.; Hua, G.; Qi, J.; Chehbouni, A.; Van Leeuwen, W. J. D.
1992-01-01
Directional reflectance measurements were made over a semi-desert gramma grassland at various times of the growing season. View angle measurements from +40 to -40 degrees were made at various solar zenith angles and soil moisture conditions. The sensitivity of the Normalized Difference Vegetation Index (NDVI) and the Soil Adjusted Vegetation Index (SAVI) to bidirectional measurements was assessed for purposes of improving remote temporal monitoring of vegetation dynamics. The SAVI view angle response was found to be symmetric about nadir while the NDVI response was strongly anisotropic. This enabled the view angle behavior of the SAVI to be normalized with a cosine function. In contrast to the NDVI, the SAVI was able to minimize soil moisture and shadow influences for all measurement conditions.
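The two indices compared above have standard closed forms; a minimal sketch follows. The cosine view-angle normalization is written as a hypothetical form, since the abstract reports only that a cosine function was usable, not its exact parameterization:

```python
import math

def ndvi(nir, red):
    # Normalized Difference Vegetation Index (standard definition)
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    # Soil Adjusted Vegetation Index with soil adjustment factor L
    return (1.0 + L) * (nir - red) / (nir + red + L)

def savi_to_nadir(savi_off_nadir, view_zenith_deg, k=1.0):
    # Hypothetical cosine normalization of the symmetric SAVI view-angle
    # response back to nadir; the exponent k would be fitted to the data
    return savi_off_nadir / math.cos(math.radians(view_zenith_deg)) ** k
```

Because the NDVI response was found to be strongly anisotropic, no comparably simple symmetric correction would apply to it.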
Effects of changing canopy directional reflectance on feature selection
NASA Technical Reports Server (NTRS)
Smith, J. A.; Oliver, R. E.; Kilpela, O. E.
1973-01-01
The use of a Monte Carlo model for generating sample directional reflectance data for two simplified target canopies at two different solar positions is reported. Successive iterations through the model permit the calculation of a mean vector and covariance matrix for canopy reflectance for varied sensor view angles. These data may then be used to calculate the divergence between the target distributions for various wavelength combinations and for these view angles. Results of a feature selection analysis indicate that different sets of wavelengths are optimum for target discrimination depending on sensor view angle and that the targets may be more easily discriminated for some scan angles than others. The time-varying behavior of these results is also pointed out.
THE VIEWING ANGLES OF BROAD ABSORPTION LINE VERSUS UNABSORBED QUASARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiPompeo, M. A.; Brotherton, M. S.; De Breuck, C.
2012-06-10
It was recently shown that there is a significant difference in the radio spectral index distributions of broad absorption line (BAL) quasars and unabsorbed quasars, with an overabundance of BAL quasars with steeper radio spectra. This result suggests that source orientation does play into the presence or absence of BAL features. In this paper, we provide a more quantitative analysis of this result based on Monte Carlo simulations. While the relationship between viewing angle and spectral index does indeed contain a lot of scatter, the spectral index distributions are different enough to overcome that intrinsic variation. Utilizing two different models of the relationship between spectral index and viewing angle, the simulations indicate that the difference in spectral index distributions can be explained by allowing BAL quasar viewing angles to extend about 10° farther from the radio jet axis than non-BAL sources, though both can be seen at small angles. These results show that orientation cannot be the only factor determining whether BAL features are present, but it does play a role.
Photogrammetry System and Method for Determining Relative Motion Between Two Bodies
NASA Technical Reports Server (NTRS)
Miller, Samuel A. (Inventor); Severance, Kurt (Inventor)
2014-01-01
A photogrammetry system and method provide for determining the relative position between two objects. The system utilizes one or more imaging devices, such as high speed cameras, that are mounted on a first body, and three or more photogrammetry targets of a known location on a second body. The system and method can be utilized with cameras having fish-eye, hyperbolic, omnidirectional, or other lenses. The system and method do not require overlapping fields-of-view if two or more cameras are utilized. The system and method derive relative orientation by equally weighting information from an arbitrary number of heterogeneous cameras, all with non-overlapping fields-of-view. Furthermore, the system can make the measurements with arbitrary wide-angle lenses on the cameras.
Modeling of the ITER-like wide-angle infrared thermography view of JET.
Aumeunier, M-H; Firdaouss, M; Travère, J-M; Loarer, T; Gauthier, E; Martin, V; Chabaud, D; Humbert, E
2012-10-01
Infrared (IR) thermography systems are mandatory to ensure safe plasma operation in fusion devices. However, IR measurements are much more complicated in a metallic environment because of the spurious contributions of reflected fluxes. This paper presents a fully predictive photonic simulation able to accurately assess the surface temperature measured with classical IR thermography for a given plasma scenario, taking into account the optical properties of PFC materials. The simulation has been carried out for the ITER-like wide-angle infrared camera view of JET and compared with experimental data. The consequences and effects of the low emissivity and of the bidirectional reflectivity distribution function used in the model for the metallic PFCs on the contribution of the reflected flux in the analysis are discussed.
MODIS and SeaWIFS on-orbit lunar calibration
Sun, Jielun; Eplee, R.E.; Xiong, X.; Stone, T.; Meister, G.; McClain, C.R.
2008-01-01
The Moon plays an important role in the radiometric stability monitoring of the NASA Earth Observing System (EOS) remote sensors. MODIS and SeaWiFS are two of the key instruments of NASA's EOS missions. The MODIS Protoflight Model (PFM) on board the Terra spacecraft and the MODIS Flight Model 1 (FM1) on board the Aqua spacecraft were launched on December 18, 1999 and May 4, 2002, respectively. They view the Moon through the Space View (SV) port approximately once a month to monitor the long-term radiometric stability of their Reflective Solar Bands (RSB). SeaWiFS was launched on board the OrbView-2 spacecraft on August 1, 1997. The SeaWiFS lunar calibrations are obtained once a month at a nominal phase angle of 7°. The lunar irradiance observed by these instruments depends on the viewing geometry. The USGS photometric model of the Moon (the ROLO model) has been developed to provide geometric corrections for the lunar observations. For MODIS, the lunar-view responses, with corrections for the viewing geometry, are used to track the gain change of its reflective solar bands (RSB). They trend the system response degradation at the Angle of Incidence (AOI) of the sensor's SV port. With both the lunar observations and the on-board Solar Diffuser (SD) calibration, it is shown that the MODIS system response degradation is wavelength, mirror side, and AOI dependent. Time-dependent Response Versus Scan angle (RVS) Look-Up Tables (LUTs) are applied in MODIS RSB calibration, and lunar observations play a key role in the RVS derivation. The corrections provided by the RVS in the Terra and Aqua MODIS data for the 412 nm band are as large as 16% and 13%, respectively. For SeaWiFS lunar calibrations, the spacecraft is pitched across the Moon so that the instrument views the Moon near nadir through the same optical path as it views the Earth. The SeaWiFS system gain changes for its eight bands are calibrated using the geometrically corrected lunar observations.
The radiometric corrections to the SeaWiFS data, after more than ten years on orbit, are 19% at 865 nm, 8% at 765 nm, and 1-3% in the other bands. In this report, the lunar calibration algorithms are reviewed and the RSB gain changes observed in the lunar observations are shown for all three sensors. The lunar observations of the three instruments are compared using the USGS photometric model. The USGS lunar model facilitates the cross-calibration of instruments with different spectral bandpasses whose measurements of the Moon differ in time and observing geometry.
4. Elevation view of Bunker 104 with ultrawide angle lens ...
4. Elevation view of Bunker 104 with ultrawide angle lens shows about 70 percent of east facade including entire south end with steps and doors. View shows slope of south end and vegetation growing atop building. See also photo WA-203-C-3. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA
What is MISR? MISR Instrument? MISR Project?
Atmospheric Science Data Center
2014-12-08
... to improve our understanding of the Earth's environment and climate. Viewing the sunlit Earth simultaneously at nine widely-spaced angles, ... types of atmospheric particles and clouds on climate. The change in reflection at different view angles affords the means to distinguish ...
Array Of Sensors Measures Broadband Radiation
NASA Technical Reports Server (NTRS)
Hoffman, James W.; Grush, Ronald G.
1994-01-01
Multiple broadband radiation sensors aimed at various portions of total field of view. All sensors mounted in supporting frame, serving as common heat sink and temperature reference. Each sensor includes heater winding and differential-temperature-sensing bridge circuit. Power in heater winding adjusted repeatedly in effort to balance bridge circuit. Intended to be used aboard satellite in orbit around Earth to measure total radiation emitted, at various viewing angles, by mosaic of "footprint" areas (each defined by its viewing angle) on surface of Earth. Modified versions of array useful for angle-resolved measurements of broadband radiation in laboratory and field settings on Earth.
NASA Technical Reports Server (NTRS)
Fulton, C. L.; Harris, R. L., Jr.
1980-01-01
Factors that can affect oculometer measurements of pupil diameter are: the horizontal (azimuth) and vertical (elevation) viewing angle of the pilot; refraction by the eye and cornea; changes in the distance from eye to camera; the illumination intensity of light on the eye; and the counting sensitivity of the scan lines used to measure diameter and output voltage. To estimate the accuracy of the measurements, an artificial eye was designed and a series of runs performed with the oculometer system. When refraction effects are included, results show that pupil diameter is a parabolic function of the azimuth angle, similar to the cosine function predicted by theory; this error can be accounted for by a correction equation, reducing it from 6% to 1.5% of the actual diameter. Elevation angle and illumination effects were found to be negligible. The effects of counting sensitivity and output voltage can be calculated directly from the system documentation. The overall accuracy of the unmodified system is about 6%. After correcting for the azimuth-angle errors, the overall accuracy is approximately 2%.
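The cosine-type foreshortening described above can be sketched as a simple correction, assuming the measured pupil image shrinks as the cosine of the azimuth viewing angle. This is the theoretical approximation the study mentions; the study itself fits a parabolic correction equation to its calibration data:

```python
import math

def correct_pupil_diameter(measured_mm, azimuth_deg):
    # The pupil image foreshortens roughly as cos(azimuth) when the eye
    # is viewed off-axis; dividing by the cosine recovers the diameter
    return measured_mm / math.cos(math.radians(azimuth_deg))
```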
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...
1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
System of technical vision for autonomous unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Bondarchuk, A. S.
2018-05-01
This paper is devoted to the implementation of an image recognition algorithm using the LabVIEW software. The created virtual instrument is designed to detect objects in frames from a camera mounted on a UAV. The trained classifier is invariant to changes in rotation, as well as to small changes in the camera's viewing angle. Finding objects in the image using particle analysis allows regions of different sizes to be classified. This method allows the system of technical vision to determine more accurately the location of the objects of interest and their movement relative to the camera.
A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.
Qian, Shuo; Sheng, Yang
2011-11-01
Photogrammetry has become an effective method for determining electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies are all based on either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study presents a novel photogrammetry system that can acquire multi-angle head images simultaneously from a single camera position. By aligning two planar mirrors at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. The elapsed time of the whole localization procedure is about 3 min, and the camera calibration computation takes about 1 min after the measurement of the calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.
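The 51.4° mirror angle follows from elementary mirror geometry: two planar mirrors at angle θ produce 360/θ − 1 reflected images, which together with the direct view give 360/θ views in total. A sketch of the relationship (the formula is standard kaleidoscope geometry, not quoted from the paper):

```python
def total_views(mirror_angle_deg):
    # Two planar mirrors at angle theta yield 360/theta - 1 reflected
    # images; adding the direct view gives 360/theta views in total
    return round(360.0 / mirror_angle_deg)

def mirror_angle_for(n_views):
    # Inverse relation: the mirror angle needed for a desired view count
    return 360.0 / n_views
```

With θ = 51.4°, this gives the seven simultaneous views the system uses.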
Comparison of Angle of Attack Measurements for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Jones, Thomas W.; Hoppe, John C.
2001-01-01
Two optical systems capable of measuring model attitude and deformation were compared to inertial devices employed to acquire wind tunnel model angle-of-attack measurements during testing of the sting-mounted, full-span, 30% geometric scale flexible configuration of the Northrop Grumman Unmanned Combat Air Vehicle (UCAV) installed in the NASA Langley Transonic Dynamics Tunnel (TDT). The overall purpose of the test at TDT was to evaluate smart materials and structures adaptive wing technology. The optical techniques compared to the inertial devices for this test were: (1) an Optotrak® system, an optical system consisting of two sensors, each containing a pair of orthogonally oriented linear arrays, to compute spatial positions of a set of active markers; and (2) a Video Model Deformation (VMD) system, providing a single view of passive targets using a constrained photogrammetric solution, whose primary function was to measure wing and control-surface deformations. The Optotrak system was installed for this test for the first time at TDT in order to assess the usefulness of the system for future static and dynamic deformation measurements.
A novel screen design for anti-ambient light front projection display with angle-selective absorber
NASA Astrophysics Data System (ADS)
Liao, Tianju; Chen, Weigang; He, Kebo; Zhang, Zhaoyu
2016-03-01
Ambient light degrades the contrast ratio of reflective front-projection systems, which strongly affects image quality. In contrast to conventional front projection, short-throw projection has an advantage in rejecting ambient light. A Fresnel lens-shaped reflection layer is adopted to direct light arriving at a large angle, owing to the low lens throw ratio, toward the viewing area. This structure separates the paths of the ambient light and the projection light, creating the chance to solve the problem of ambient light mixing with projection light. However, the lens-shaped reflection layer alone is not enough to improve the contrast ratio, because the scattering layer, which provides the necessarily wide viewing angle, can interfere with both light paths before they hit the layer. We therefore propose a new design that sets the draft-angle surface with an absorption layer and adds an angle-selective absorber to separate these two kinds of light. The absorber is designed to fit the direction of the projection light, leading to a small absorption cross section for the projection light and a correspondingly large absorption cross section for the ambient light. We have simulated the design with TracePro, a ray-tracing program, and find a nearly 8-fold contrast ratio improvement over the current design in theory. This design can hopefully provide an efficient display in brightly lit situations with better viewer satisfaction.
Multi-angle lensless digital holography for depth resolved imaging on a chip.
Su, Ting-Wei; Isikman, Serhan O; Bishara, Waheb; Tseng, Derek; Erlinger, Anthony; Ozcan, Aydogan
2010-04-26
A multi-angle lensfree holographic imaging platform that can accurately characterize both the axial and lateral positions of cells located within multi-layered micro-channels is introduced. In this platform, lensfree digital holograms of the micro-objects on the chip are recorded at different illumination angles using partially coherent illumination. These digital holograms start to shift laterally on the sensor plane as the illumination angle of the source is tilted. Since the exact amount of this lateral shift of each object hologram can be calculated with an accuracy that beats the diffraction limit of light, the height of each cell from the substrate can be determined over a large field of view without the use of any lenses. We demonstrate the proof of concept of this multi-angle lensless imaging platform by using light-emitting diodes to characterize various sized microparticles located on a chip with sub-micron axial and lateral localization over an approximately 60 mm² field of view. Furthermore, we successfully apply this lensless imaging approach to simultaneously characterize blood samples located in multi-layered micro-channels in terms of the counts, individual thicknesses, and volumes of the cells at each layer. Because this platform does not require any lenses, lasers, or other bulky optical/mechanical components, it provides a compact and high-throughput alternative to conventional approaches for cytometry and diagnostics applications involving lab-on-a-chip systems.
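The height-from-shift principle described above can be sketched with simple geometry, assuming the lateral hologram shift of an object at height h above the sensor grows as h·tan(θ) with illumination angle θ. This is a simplified geometric model of the measurement, not the paper's full reconstruction algorithm:

```python
import math

def height_from_shift(shift, angle1_deg, angle2_deg):
    # Tilting the illumination from angle1 to angle2 shifts the hologram
    # of an object at height h by h * (tan a2 - tan a1) on the sensor,
    # so the height follows from the measured lateral shift
    return shift / (math.tan(math.radians(angle2_deg)) -
                    math.tan(math.radians(angle1_deg)))
```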
79. VIEW OF VAL FIRING RANGE LOOKING SOUTHWEST SHOWING LAUNCHER ...
79. VIEW OF VAL FIRING RANGE LOOKING SOUTHWEST SHOWING LAUNCHER BRIDGE, BARGES, SONAR BUOY RANGE AND MORRIS DAM IN BACKGROUND, June 10, 1948. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
NASA Technical Reports Server (NTRS)
1977-01-01
A preliminary design for a helicopter/VSTOL wide angle simulator image generation display system is studied. The visual system is to become part of a simulator capability to support Army aviation systems research and development within the near term. As required for the Army to simulate a wide range of aircraft characteristics, versatility and ease of changing cockpit configurations were primary considerations of the study. Due to the Army's interest in low altitude flight and descents into and landing in constrained areas, particular emphasis is given to wide field of view, resolution, brightness, contrast, and color. The visual display study includes a preliminary design, demonstrated feasibility of advanced concepts, and a plan for subsequent detail design and development. Analysis and tradeoff considerations for various visual system elements are outlined and discussed.
Xiang, Yun; Yan, Lei; Zhao, Yun-sheng; Gou, Zhi-yang; Chen, Wei
2011-12-01
Polarized reflectance is influenced by factors such as the physical and chemical properties of the target; the viewing geometry composed of light incident zenith, viewing zenith, and viewing azimuth relative to the light incidence; surface roughness and texture; surface density; detection wavelength; polarization phase angle; and so on. In the present paper, the influence of surface roughness on the degree of polarization (DOP) of biotite plagioclase gneiss varying with viewing angle was investigated and analyzed quantitatively. The polarized spectra were measured with an ASD FS3 spectrometer on the goniometer located at Northeast Normal University. With the incident zenith angle fixed at 50 degrees, it was shown that on rock surfaces with different roughness, in the specular reflection direction, the DOP spectrum within 350-2500 nm first increased to its highest value and then began to decline as the viewing zenith angle varied from 0 to 80 degrees. The characteristic band at (520 ± 10) nm was selected for further analysis. Correlation analysis between the peak DOP value over zenith angle and surface roughness showed a power-function relationship, with the regression equation y = 0.604x^(-0.297), R2 = 0.9854. The model correlating the angle at which the peak occurs with surface roughness is y = 3.4194x + 51.584, y < 90 degrees, R2 = 0.8177. As the detection azimuth moves away from the 180-degree azimuth, where the maximum DOP occurs, the DOP gradually decreases and tends to 0. In the 180-degree detection azimuth, correlation analysis between the peak DOP values in the (520 ± 10) nm band for five rocks and their surface roughness indicates a power function, with the regression equation y = 0.5822x^(-0.333), R2 = 0.9843. F tests of the above regression models indicate that the peak value and its corresponding viewing angle correlate strongly with surface roughness.
The study provides a theoretical basis for polarization remote sensing, and supports rock and urban-architecture discrimination and mineral mapping.
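The reported regression models can be evaluated directly; the sketch below simply encodes the fitted equations quoted in the abstract, with their stated validity ranges noted in comments:

```python
def peak_dop(roughness):
    # Fitted power law for the specular-direction peak DOP at (520 ± 10) nm
    # versus surface roughness: y = 0.604 * x**(-0.297), R2 = 0.9854
    return 0.604 * roughness ** (-0.297)

def peak_view_zenith_deg(roughness):
    # Fitted linear model for the viewing zenith angle of the DOP peak:
    # y = 3.4194 * x + 51.584, reported valid only for y < 90 degrees
    return 3.4194 * roughness + 51.584
```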
Image dissector camera system study
NASA Technical Reports Server (NTRS)
Howell, L.
1984-01-01
Various aspects of a rendezvous and docking system using an image dissector detector, as compared to a GaAs detector, were discussed. An investigation into a gimbaled scanning system is also covered, and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range rate, angle, and angle rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine-wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, ±10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-03-17
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing.
NASA Technical Reports Server (NTRS)
Gatebe, C. K.; King, M. D.; Tsay, S.-C.; Ji, Q.; Arnold, T.
2000-01-01
In this sensitivity study, we examined the ratio technique, the official method for remote sensing of aerosols over land from Moderate Resolution Imaging Spectroradiometer (MODIS) data, for view angles from nadir to 65 deg. off-nadir using Cloud Absorption Radiometer (CAR) data collected during the Smoke, Clouds, and Radiation-Brazil (SCAR-B) experiment conducted in 1995. For the data analyzed and for the view angles tested, the results suggest that the reflectances (rho)0.47 and (rho)0.67 are predictable from (rho)2.1 using (rho)0.47 = (rho)2.1/6, which is a slight modification, and (rho)0.67 = (rho)2.1/2. These results hold for targets viewed from the backscattering direction, but not from the forward direction.
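The ratio technique quoted above reduces to two fixed ratios; a minimal sketch follows. The /6 factor for the blue band is the study's modification for these data (the variable name below is illustrative):

```python
def visible_from_swir(rho_2p1):
    # Ratio technique: predict visible-band surface reflectance from the
    # 2.1 um channel reflectance; rho_0.47 = rho_2.1 / 6 (this study's
    # modification) and rho_0.67 = rho_2.1 / 2
    return rho_2p1 / 6.0, rho_2p1 / 2.0
```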
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth of field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience, and interest.
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-01-01
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth of field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience, and interest. PMID:27589768
Wang, Xingliang; Zhang, Youan; Wu, Huali
2016-03-01
The problem of impact angle control guidance for a field-of-view constrained missile against non-maneuvering or maneuvering targets is solved by using sliding mode control theory. The existing impact angle control guidance laws with a field-of-view constraint are only applicable against stationary targets, and most of them suffer abrupt jumps in the guidance command due to the use of additional guidance-mode switching logic. In this paper, the field-of-view constraint is handled without any additional switching logic. In particular, a novel time-varying sliding surface is first designed to achieve zero miss distance and zero impact angle error without violating the field-of-view constraint during the sliding mode phase. Then a control integral barrier Lyapunov function is used to design the reaching law, so that the sliding mode can be reached within finite time and the field-of-view constraint is not violated during the reaching phase either. A nonlinear extended state observer is constructed to estimate the disturbance caused by unknown target maneuvers, and the undesirable chattering is alleviated effectively by using the estimate as a compensation term in the guidance law. The performance of the proposed guidance law is illustrated with simulations.
74. DETAIL VIEW OF INSIDE THE LAUNCHING BRIDGE LOOKING SOUTHWEST ...
74. DETAIL VIEW OF INSIDE THE LAUNCHING BRIDGE LOOKING SOUTHWEST SHOWING ADJUSTABLE STAIRS ON THE LEFT AND LAUNCHING TUBE ON THE RIGHT, Date unknown, circa 1948. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Neutral particle beam sensing and steering
Maier, II, William B.; Cobb, Donald D.; Robiscoe, Richard T.
1991-01-01
The direction of a neutral particle beam (NPB) is determined by detecting Lyman-alpha (Ly-α) radiation emitted during motional quenching of excited H(2S) atoms in the beam as the atoms move through a magnetic field. At least one detector is placed adjacent to the beam exit to define an optical axis that intercepts the beam at a viewing angle chosen to include a volume generating a selected number of photons for detection. The detection system includes a lens with an area that is small relative to the NPB area and a pixel array located in the focal plane of the lens. The lens viewing angle and pixel array area are selected to optimize the beam-tilt sensitivity. In one embodiment, two detectors are placed coplanar with the beam axis to generate a difference signal that is insensitive to beam variations other than beam tilt.
A Neural-Dynamic Architecture for Concurrent Estimation of Object Pose and Identity
Lomp, Oliver; Faubel, Christian; Schöner, Gregor
2017-01-01
Handling objects or interacting with a human user about objects on a shared tabletop requires that objects be identified after learning from a small number of views and that object pose be estimated. We present a neurally inspired architecture that learns object instances by storing features extracted from a single view of each object. Input features are color and edge histograms from a localized area that is updated during processing. The system finds the best-matching view for the object in a novel input image while concurrently estimating the object’s pose, aligning the learned view with current input. The system is based on neural dynamics, computationally operating in real time, and can handle dynamic scenes directly off live video input. In a scenario with 30 everyday objects, the system achieves recognition rates of 87.2% from a single training view for each object, while also estimating pose quite precisely. We further demonstrate that the system can track moving objects, and that it can segment the visual array, selecting and recognizing one object while suppressing input from another known object in the immediate vicinity. Evaluation on the COIL-100 dataset, in which objects are depicted from different viewing angles, revealed recognition rates of 91.1% on the first 30 objects, each learned from four training views. PMID:28503145
On Orbit Measurement of Response vs. Scan Angle for the Infrared Bands on TRMM/VIRS
NASA Technical Reports Server (NTRS)
Barnes, William L.; Lyu, Cheng-Hsuan; Barnes, Robert A.
1999-01-01
The Visible and Infrared Scanner on the Tropical Rainfall Measuring Mission (TRMM/VIRS) is a whiskbroom imaging radiometer with two reflected solar bands and three emissive infrared bands. All five detectors are on a single cooled focal plane. This configuration necessitated the use of a paddlewheel scan mirror to avoid the effects of focal plane rotation that arise when using a scan mirror that is inclined to its axis of rotation. System radiometric requirements led to the need for protected silver as the mirror surface. Unfortunately, the SiO(x) coatings currently used to protect silver from oxidation introduce a change in reflectance with angle of incidence (AOI). This AOI dependence results in a modulation of system-level response with scan angle. Measurement of system response vs. scan angle (RVS) was not difficult for the VIRS reflected solar bands, but attaining the required accuracy for the IR bands in the laboratory was not possible without a large vacuum chamber and a considerable amount of custom-designed testing apparatus. Therefore, the decision was made to conduct the measurement on-orbit. On three separate occasions, the TRMM spacecraft was rotated about its pitch axis and, after the nadir view passed over the Earth's limb, the VIRS performed several thousand scans while viewing deep space. The resulting data have been analyzed, and the RVS curves generated for the three IR bands are being used in the VIRS radiometric calibration algorithm. This is, to our knowledge, the first time this measurement has been made on-orbit. Similar measurements are planned for the EOS-AM and EOS-PM MODIS sensors and are being considered for several systems under development. The VIRS on-orbit results will be compared to VIRS and MODIS system-level laboratory measurements, MODIS scan mirror witness sample measurements and modeled data.
NASA Astrophysics Data System (ADS)
Hedman, Matthew M.; Burns, Joseph A.; Nicholson, Philip D.; Tiscareno, Matthew S.; Evans, Michael W.; Baker, Emily
2017-10-01
Around the start of Cassini's Grand Finale, the spacecraft passed a dozen times through Saturn's shadow, enabling its cameras and spectrometers to observe the ring system at extremely high phase angles. These opportunities yielded the best combination of signal-to-noise and resolution for many parts of Saturn's fainter dusty rings, and allowed the main rings to be viewed from previously inaccessible lighting geometries. We will highlight some of the surprising features found in the data obtained by Cassini's Imaging Science Subsystem (ISS) and Visual and Infrared Mapping Spectrometer (VIMS) during these time periods, and discuss what they might be able to tell us about the structure and dynamics of Saturn's various ring systems. For example, ISS captured global views of the entire ring system that reveal previously unseen structures in dust-filled regions like the D ring and the zone between Saturn's F and G rings, as well as novel fine-scale structures in the core of the E ring near Enceladus' orbit. These structures provide new insights into the forces that sculpt these tenuous rings. ISS and VIMS also detected an unexpected brightening and highly unusual spectra of the main rings at extremely high phase angles. These data may provide novel information about the distribution of small grains and particles in these denser rings.
Imaging system for cardiac planar imaging using a dedicated dual-head gamma camera
Majewski, Stanislaw [Morgantown, VA; Umeno, Marc M [Woodinville, WA
2011-09-13
A cardiac imaging system employing dual gamma imaging heads co-registered with one another to provide two dynamic simultaneous views of the heart sector of a patient torso. A first gamma imaging head is positioned in a first orientation with respect to the heart sector and a second gamma imaging head is positioned in a second orientation with respect to the heart sector. An adjustment arrangement is capable of adjusting the distance between the separate imaging heads and the angle between the heads. With the angle between the imaging heads set to 180 degrees, operating in a range of 140-159 keV and at a rate of up to 500 kHz, the imaging heads are co-registered to produce simultaneous dynamic recording of two stereotactic views of the heart. The use of co-registered imaging heads maximizes the uniformity of detection sensitivity of blood flow in and around the heart over the whole heart volume and minimizes radiation absorption effects. A normalization/image-fusion technique is implemented pixel by corresponding pixel to increase the signal for any cardiac region viewed in the two images obtained from the two opposed detector heads for the same time bin. The imaging system is capable of producing enhanced first-pass studies, blood-pool studies including planar, gated and non-gated EKG studies, planar EKG perfusion studies, and planar hot-spot imaging.
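The pixel-by-corresponding-pixel normalization and fusion step described in this record can be sketched as follows. This is a minimal illustration under assumed inputs (two already co-registered count images and per-head sensitivity maps for the same time bin), not the patented implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def fuse_opposed_views(img_a, img_b, flat_a, flat_b):
    """Fuse two co-registered, opposed detector views for one time bin.

    Each head's image is first normalized by its sensitivity (flat-field)
    map, then corresponding pixels are summed so that counts from the
    same cardiac region in the two views reinforce each other.  All
    arrays are assumed co-registered: pixel (i, j) views the same region
    of the heart in both heads.
    """
    a = img_a / np.clip(flat_a, 1e-9, None)   # sensitivity-normalize head A
    b = img_b / np.clip(flat_b, 1e-9, None)   # sensitivity-normalize head B
    return a + b                              # pixel-by-corresponding-pixel sum
```

The clip guards against division by zero in dead-pixel regions of the sensitivity map.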
Evaluation of GPS Coverage for the X-33 Michael-6 Trajectory
NASA Technical Reports Server (NTRS)
Lundberg, John B.
1998-01-01
The onboard navigational system for the X-33 test flights will be based on the use of measurements collected from the Embedded Global Positioning System (GPS)/INS system. Some of the factors which will affect the quality of the GPS contribution to the navigational solution will be the number of pseudorange measurements collected at any instant in time, the distribution of the GPS satellites within the field of view, and the inherent noise level of the GPS receiver. The distribution of GPS satellites within the field of view of the receiver's antenna will depend on the receiver's position, the time of day, pointing direction of the antenna, and the effective cone angle of the antenna. The number of pseudorange measurements collected will depend upon these factors as well as the time required to lock onto a GPS satellite signal once the GPS satellite comes into the field of view of the antenna and the number of available receiver channels. The objective of this study is to evaluate the GPS coverage resulting from the proposed antenna pointing directions, the proposed antenna cone angles, and the effects due to the time of day for the X-33 Michael-6 trajectory from launch at Edwards AFB, California, to the start of the Terminal Area Energy Management (TAEM) phase on approach to Michael AAF, Utah.
Ultra-widefield retinal MHz-OCT imaging with up to 100 degrees viewing angle.
Kolb, Jan Philip; Klein, Thomas; Kufner, Corinna L; Wieser, Wolfgang; Neubauer, Aljoscha S; Huber, Robert
2015-05-01
We evaluate strategies to maximize the field of view (FOV) of in vivo retinal OCT imaging of human eyes. Three imaging modes are tested: single volume imaging with 85° FOV as well as with 100°, and stitching of five 60° images to a 100° mosaic (measured from the nodal point). We employ a MHz-OCT system based on a 1060 nm Fourier domain mode locked (FDML) laser with a depth scan rate of 1.68 MHz. The high speed is essential for dense isotropic sampling of the large areas. Challenges caused by the wide FOV are discussed and solutions to most issues are presented. Detailed information on the design and characterization of our sample arm optics is given. We investigate the origin of an angle-dependent signal fall-off which we observe towards larger imaging angles. It is present in our 85° and 100° single volume images, but not in the mosaic. Our results suggest that 100° FOV OCT is possible with current swept source OCT technology.
Response Versus Scan-Angle Corrections for MODIS Reflective Solar Bands Using Deep Convective Clouds
NASA Technical Reports Server (NTRS)
Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun
2016-01-01
The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the performance of the SD over time, provides the absolute reference for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the zero input radiance, respectively. The MODIS instrument views the Earth's surface through a two-sided scan mirror, whose reflectance is a function of angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different assigned RVS positions. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two RVS positions. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for selected short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent in the left side Earth-view scans.
Response versus scan-angle corrections for MODIS reflective solar bands using deep convective clouds
NASA Astrophysics Data System (ADS)
Bhatt, Rajendra; Angal, Amit; Doelling, David R.; Xiong, Xiaoxiong; Wu, Aisheng; Haney, Conor O.; Scarino, Benjamin R.; Gopalan, Arun
2016-05-01
The absolute radiometric calibration of the reflective solar bands (RSBs) of Aqua- and Terra-MODIS is performed using on-board calibrators. A solar diffuser (SD) panel along with a solar diffuser stability monitor (SDSM) system, which tracks the degradation of the SD over time, provides the baseline for calibrating the MODIS sensors. MODIS also views the moon and deep space through its space view (SV) port for lunar-based calibration and computing the background, respectively. The MODIS instrument views the Earth's surface using a two-sided scan mirror, whose reflectance is a function of the angle of incidence (AOI) and is described by response versus scan-angle (RVS). The RVS for both MODIS instruments was characterized prior to launch. MODIS also views the SD and the moon at two different AOIs. There is sufficient evidence that the RVS is changing on orbit over time and as a function of wavelength. The SD and lunar observation scans can only track the RVS variation at two AOIs. Consequently, the MODIS Characterization Support Team (MCST) developed enhanced approaches that supplement the onboard calibrator measurements with responses from the pseudo-invariant desert sites. This approach has been implemented in Level 1B (L1B) Collection 6 (C6) for select short-wavelength bands. This paper presents an alternative approach of characterizing the mirror RVS to derive the time-dependent RVS correction factors for MODIS RSBs using tropical deep convective cloud (DCC) targets. An initial assessment of the DCC response from Aqua-MODIS band 1 C6 data indicates evidence of RVS artifacts, which are not uniform across the scans and are more prevalent at the beginning of the earth-view scan.
View-angle-dependent AIRS Cloudiness and Radiance Variance: Analysis and Interpretation
NASA Technical Reports Server (NTRS)
Gong, Jie; Wu, Dong L.
2013-01-01
Upper tropospheric clouds play an important role in the global energy budget and hydrological cycle. Significant view-angle asymmetry has been observed in upper-level tropical clouds derived from eight years of Atmospheric Infrared Sounder (AIRS) 15 µm radiances. Here, we find that the asymmetry also exists in the extra-tropics. It is larger during day than during night, more prominent near elevated terrain, and closely associated with deep convection and wind shear. The cloud radiance variance, a proxy for cloud inhomogeneity, shows asymmetry characteristics consistent with those of the AIRS cloudiness. The leading causes of the view-dependent cloudiness asymmetry are the local time difference and small-scale organized cloud structures. The local time difference (1-1.5 hr) of upper-level (UL) clouds between the two AIRS outermost views can create part of the observed asymmetry. On the other hand, small-scale tilted and banded structures of the UL clouds can induce about half of the observed view-angle-dependent differences in the AIRS cloud radiances and their variances. This estimate is inferred from an analogous study using Microwave Humidity Sounder (MHS) radiances observed during the period when there were simultaneous measurements at two different view angles from the NOAA-18 and -19 satellites. The existence of tilted cloud structures and asymmetric 15 µm and 6.7 µm cloud radiances implies that cloud statistics are view-angle dependent and should be taken into account in radiative transfer calculations, measurement uncertainty evaluations and cloud climatology investigations. In addition, the momentum forcing in the upper troposphere from tilted clouds is also likely asymmetric, which can affect atmospheric circulation anisotropically.
On-Orbit Cross-Calibration of AM Satellite Remote Sensing Instruments using the Moon
NASA Technical Reports Server (NTRS)
Butler, James J.; Kieffer, Hugh H.; Barnes, Robert A.; Stone, Thomas C.
2003-01-01
On April 14, 2003, three Earth remote sensing spacecraft were maneuvered so that six satellite instruments operating in the visible through shortwave infrared wavelength region could view the Moon for purposes of on-orbit cross-calibration. These instruments included the Moderate Resolution Imaging Spectroradiometer (MODIS), the Multi-angle Imaging SpectroRadiometer (MISR), and the Advanced Spaceborne Thermal Emission and Reflection (ASTER) radiometer on the Earth Observing System (EOS) Terra spacecraft; the Advanced Land Imager (ALI) and Hyperion instrument on the Earth Observing-1 (EO-1) spacecraft; and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) on the SeaStar spacecraft. Observations of the Moon were compared using a spectral photometric model of lunar irradiance developed by the Robotic Lunar Observatory (ROLO) project located at the United States Geological Survey in Flagstaff, Arizona. The ROLO model effectively accounts for variations in lunar irradiance with lunar phase and libration angles, allowing intercomparison of observations made by instruments on different spacecraft under different time and location conditions. The spacecraft maneuvers necessary to view the Moon are briefly described, and results of using the lunar irradiance model to compare the radiometric calibration scales of the six satellite instruments are presented here.
1986-01-24
P-29516 BW Range: 125,000 kilometers (78,000 miles) Voyager 2's wide-angle camera captured this view of the outer part of the Uranian ring system just 11 minutes before passing through the ring plane. The resolution in this clear-filter view is slightly better than 9 km (6 mi). The brightest, outermost ring is known as epsilon. Interior to epsilon lie (from top) the newly discovered 10th ring of Uranus--designated 1986UR1 and barely visible here--and then the delta, gamma and eta rings.
The big picture: effects of surround on immersion and size perception.
Baranowski, Andreas M; Hecht, Heiko
2014-01-01
Despite the fear of the entertainment industry that illegal downloads of films might ruin their business, going to the movies continues to be a popular leisure activity. One reason why people prefer to watch movies in cinemas may be the surround of the movie screen or its physically huge size. To disentangle the factors that might contribute to the size impression, we tested several measures of subjective size and immersion in different viewing environments. For this purpose we built a model cinema that provided visual angle information comparable with that of a real cinema. Subjects watched identical movie clips in a real cinema, a model cinema, and on a display monitor in isolation. Whereas the isolated display monitor was inferior, the addition of a contextual model improved the viewing immersion to the extent that it was comparable with the movie theater experience, provided the viewing angle remained the same. In a further study we built an identical but even smaller model cinema to unconfound visual angle and viewing distance. Both model cinemas produced similar results. There was a trend for the larger screen to be more immersive; however, viewing angle did not play a role in how the movie was evaluated.
22. VAL, VIEW OF PROJECTILE LOADING DECK LOOKING NORTHEAST TOWARD ...
22. VAL, VIEW OF PROJECTILE LOADING DECK LOOKING NORTHEAST TOWARD TOP OF CONCRETE 'A' FRAME STRUCTURE SHOWING DRIVE CABLES, DRIVE GEAR, BOTTOM OF CAMERA TOWER AND 'CROWS NEST' CONTROL ROOM. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Wide angle view of the Flight control room of Mission control center
1984-10-06
Wide angle view of the flight control room (FCR) of the Mission Control Center (MCC). Some of the STS 41-G crew can be seen on a large screen at the front of the MCC along with a map tracking the progress of the orbiter.
Effect of vision angle on the phase transition in flocking behavior of animal groups
NASA Astrophysics Data System (ADS)
Nguyen, P. The; Lee, Sang-Hee; Ngo, V. Thanh
2015-09-01
The nature of the phase transition in a system of self-propelled particles has been extensively studied during the past few decades. A theoretical model was proposed by [T. Vicsek et al., Phys. Rev. Lett. 75, 1226 (1995), 10.1103/PhysRevLett.75.1226] with a simple rule for updating the direction of motion of each particle. Based on the model of Vicsek et al., in this paper we consider a group of animals as particles moving freely in a two-dimensional space. Because the viewable area of animals depends on the species, we restrict the interaction of each individual to neighbors within an angle φ = ϕ/2 (ϕ is called the angle of view) of a circle of radius R centered at its position. We obtain a phase diagram in the (φ, ηc) plane, with ηc being the critical noise. We show that the phase transition exists only for a wide viewing angle, φ ≥ 0.5π. The flocking of animals is a universal behavior of prey species but not of predators. Our simulation results are in good agreement with the experimental observations of [C. Becco et al., Physica A 367, 487 (2006), 10.1016/j.physa.2005.11.041].
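The restricted-view variant of the Vicsek model summarized above can be sketched in a few lines of NumPy. The parameter values, the uniform angular noise, and the function names below are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def vicsek_step(pos, theta, L=10.0, R=1.0, v=0.03, eta=0.1, phi=np.pi, rng=None):
    """One update of a Vicsek-style model in a periodic L x L box, where
    each particle averages headings only over neighbors within radius R
    AND within a half-angle phi of its own heading (phi = view angle / 2)."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(pos)
    new_theta = np.empty(n)
    for i in range(n):
        d = pos - pos[i]
        d -= L * np.round(d / L)                      # nearest periodic image
        dist = np.hypot(d[:, 0], d[:, 1])
        ang_to = np.arctan2(d[:, 1], d[:, 0])
        rel = np.angle(np.exp(1j * (ang_to - theta[i])))  # wrap to (-pi, pi]
        vis = (dist < R) & (np.abs(rel) <= phi)       # neighbors in the view cone
        vis[i] = True                                 # a particle always counts itself
        mean = np.angle(np.mean(np.exp(1j * theta[vis])))  # circular mean heading
        new_theta[i] = mean + eta * np.pi * (2.0 * rng.random() - 1.0)  # add noise
    step = v * np.column_stack([np.cos(new_theta), np.sin(new_theta)])
    return (pos + step) % L, new_theta
```

The polar order parameter |⟨exp(iθ)⟩| can then be tracked over iterations to locate the transition as φ and the noise amplitude are varied.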
Wide angle view of Mission Control Center during Apollo 14 transmission
1971-01-31
S71-17122 (31 Jan. 1971) --- A wide-angle overall view of the Mission Operations Control Room (MOCR) in the Mission Control Center at the Manned Spacecraft Center. This view was photographed during the first color television transmission from the Apollo 14 Command Module. Projected on the large screen at the right front of the MOCR is a view of the Apollo 14 Lunar Module, still attached to the Saturn IVB stage. The Command and Service Modules were approaching the LM/S-IVB during transposition and docking maneuvers.
Waveguide detection of right-angle-scattered light in flow cytometry
Mariella, Jr., Raymond P.
2000-01-01
A transparent flow cell is used as an index-guided optical waveguide. A detector coupled to the flow cell, but not to the liquid stream, detects the right-angle-scattered (RAS) light exiting from one end of the flow cell. The detector(s) can view the trapped RAS light from the flow cell either directly or through intermediate optical light guides. If the light exits one end of the flow cell, then the other end of the flow cell can be given a high-reflectivity coating to approximately double the amount of light collected. This system is more robust in its alignment than traditional flow cytometry systems, which use imaging optics such as microscope objectives.
First NAC Image Obtained in Mercury Orbit
2017-12-08
NASA image acquired: March 29, 2011 This is the first image of Mercury taken from orbit with MESSENGER’s Narrow Angle Camera (NAC). MESSENGER’s camera system, the Mercury Dual Imaging System (MDIS), has two cameras: the Narrow Angle Camera and the Wide Angle Camera (WAC). Comparison of this image with MESSENGER’s first WAC image of the same region shows the substantial difference between the fields of view of the two cameras. At 1.5°, the field of view of the NAC is seven times smaller than the 10.5° field of view of the WAC. This image was taken using MDIS’s pivot. MDIS is mounted on a pivoting platform and is the only instrument in MESSENGER’s payload capable of movement independent of the spacecraft. The other instruments are fixed in place, and most point down the spacecraft’s boresight at all times, relying solely on the guidance and control system for pointing. The 90° range of motion of the pivot gives MDIS a much-needed extra degree of freedom, allowing MDIS to image the planet’s surface at times when spacecraft geometry would normally prevent it from doing so. The pivot also gives MDIS additional imaging opportunities by allowing it to view more of the surface than that at which the boresight-aligned instruments are pointed at any given time. On March 17, 2011 (March 18, 2011, UTC), MESSENGER became the first spacecraft ever to orbit the planet Mercury. The mission is currently in the commissioning phase, during which spacecraft and instrument performance are verified through a series of specially designed checkout activities. In the course of the one-year primary mission, the spacecraft's seven scientific instruments and radio science investigation will unravel the history and evolution of the Solar System's innermost planet.
Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington
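The factor of seven quoted in the caption above is the ratio of the two cameras' linear fields of view; the corresponding ratio of sky area covered is its square. A quick arithmetic check (variable names are illustrative):

```python
nac_fov_deg = 1.5    # Narrow Angle Camera field of view, degrees
wac_fov_deg = 10.5   # Wide Angle Camera field of view, degrees

linear_ratio = wac_fov_deg / nac_fov_deg  # the factor quoted in the caption
area_ratio = linear_ratio ** 2            # ratio of solid angle (small-angle approx.)
```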
Bright field segmentation tomography (BFST) for use as surface identification in stereomicroscopy
NASA Astrophysics Data System (ADS)
Thiesse, Jacqueline R.; Namati, Eman; de Ryk, Jessica; Hoffman, Eric A.; McLennan, Geoffrey
2004-07-01
Stereomicroscopy is an important method for image acquisition because it provides a 3D image of an object when other microscopic techniques can only provide a 2D image. One challenge faced with this type of imaging is determining the top surface of a sample that has otherwise indistinguishable surface and planar characteristics. We have developed a system that creates oblique illumination and, in conjunction with image processing, allows the top surface to be viewed. The BFST consists of a Leica MZ12 stereomicroscope with a unique attached lighting source. The lighting source consists of eight light-emitting diodes (LEDs) separated by 45-degree angles. Each LED in this system illuminates with a 20-degree viewing angle once per cycle, casting a shadow over the rest of the sample. Subsequently, eight segmented images are taken per cycle. After the images are captured, they are stacked through image addition to achieve the full field of view, and the surface is then easily identified. Image processing techniques such as skeletonization can be used for further enhancement and measurement. With the use of BFST, advances can be made in detecting surface features from metals to tissue samples, such as in the analytical assessment of pulmonary emphysema using the technique of mean linear intercept.
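The stack-by-addition step of the BFST cycle can be illustrated with a toy example; the eight-image list, the simple threshold for surface identification, and the function names are assumptions for this sketch, not the authors' processing pipeline.

```python
import numpy as np

def bfst_stack(segmented_images):
    """Add the obliquely-illuminated, segmented images from one LED cycle
    back together; pixels lit under any of the eight illumination
    directions survive, recovering the full field of view."""
    return np.sum(np.stack(segmented_images, axis=0), axis=0)

def surface_mask(stack, thresh):
    """Minimal surface identification: pixels bright across the cycle
    (summed intensity at or above the threshold) are taken as top surface."""
    return stack >= thresh
```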
Optimal angle of needle insertion for fluoroscopy-guided transforaminal epidural injection of L5.
Ra, In-Hoo; Min, Woo-Kie
2015-06-01
Unlike other sites, there is difficulty in performing TFESI at the L5-S1 level because the iliac crest is an obstacle to needle placement. The objective of this study was to identify the optimal angle of fluoroscopy for insertion and advancement of a needle during L5 TFESI. We conducted an observational study of patients undergoing fluoroscopy-guided L5 TFESI in the prone position. A total of 80 patients (40 men and 40 women) with radiating pain of the lower limbs were enrolled. During TFESI, we measured the angle at which the L5 vertebral body forms a rectangular shape and compared men and women. We then measured the area of the safe triangle at fluoroscopy tilt angles from 15° to 35° and compared men and women. The mean cephalocaudal angle, where the vertebral body takes the shape of a rectangle, was 11.0° in men and 13.9° in women (P = 0.007). In men, the triangular area was maximal at 18.3 mm² with an oblique view angle of 25°. In women, the area was maximal at 23.6 mm² with an oblique view angle of 30°. At oblique view angles of 30° and 35°, the area was significantly greater in women (P < 0.05). When TFESI is performed at the L5 region in the prone position, placement of the fluoroscope at a cephalocaudal angle of 11.0° and an oblique angle of 25° in men, and a cephalocaudal angle of 13.9° and an oblique angle of 30° in women, would be most reasonable. © 2014 World Institute of Pain.
Detection Angle Calibration of Pressure-Sensitive Paints
NASA Technical Reports Server (NTRS)
Bencic, Timothy J.
2000-01-01
Uses of the pressure-sensitive paint (PSP) techniques in areas other than external aerodynamics continue to expand. The NASA Glenn Research Center has become a leader in the application of the global technique to non-conventional aeropropulsion applications including turbomachinery testing. The use of the global PSP technique in turbomachinery applications often requires detection of the luminescent paint in confined areas. With the limited viewing usually available, highly oblique illumination and detection angles are common in the confined areas in these applications. This paper will describe the results of pressure, viewing and excitation angle dependence calibrations using three popular PSP formulations to get a better understanding of the errors associated with these non-traditional views.
Radiometric sensitivity comparisons of multispectral imaging systems
NASA Technical Reports Server (NTRS)
Lu, Nadine C.; Slater, Philip N.
1989-01-01
Multispectral imaging systems provide much of the basic data used by the land and ocean civilian remote-sensing community. Numerous multispectral imaging systems have been and are being developed. A common way to compare the radiometric performance of these systems is to examine their noise-equivalent change in reflectance, NE Delta-rho. The NE Delta-rho of a system is the reflectance difference that is equal to the noise in the recorded signal. A comparison is made of the noise-equivalent change in reflectance of seven different multispectral imaging systems (AVHRR, AVIRIS, ETM, HIRIS, MODIS-N, SPOT-1 HRV, and TM) for a set of three atmospheric conditions (continental aerosol with 23-km visibility, continental aerosol with 5-km visibility, and a Rayleigh atmosphere), five values of ground reflectance (0.01, 0.10, 0.25, 0.50, and 1.00), a nadir viewing angle, and a solar zenith angle of 45 deg.
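As a rough sketch of the comparison metric, NE Delta-rho can be estimated from two calibration measurements under an assumed linear signal-versus-reflectance response: it is the noise standard deviation divided by the radiometric gain. The function name and all numbers below are hypothetical.

```python
def noise_equivalent_delta_rho(signal_lo, signal_hi, rho_lo, rho_hi, noise_sigma):
    """NE Delta-rho under an assumed linear signal-vs-reflectance response.

    NE Delta-rho is the reflectance change whose signal change equals one
    noise standard deviation, i.e. sigma divided by the radiometric gain.
    """
    gain = (signal_hi - signal_lo) / (rho_hi - rho_lo)  # counts per unit reflectance
    return noise_sigma / gain

# Hypothetical calibration: 100 counts at rho=0.10, 400 counts at rho=0.25,
# with 2 counts of noise.
ne_drho = noise_equivalent_delta_rho(100.0, 400.0, 0.10, 0.25, 2.0)
```

A smaller NE Delta-rho means the system can resolve finer reflectance differences.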
Dallaire, Xavier; Thibault, Simon
2017-04-01
Plenoptic imaging has been used in the past decade mainly for 3D reconstruction or digital refocusing. It was also shown that this technology has potential for correcting monochromatic aberrations in a standard optical system. In this paper, we present an algorithm for reconstructing images using a projection technique while correcting defects present in them; the approach applies to chromatic aberrations and to wide-angle optical systems. We show that the impact of noise on the reconstruction procedure is minimal. Trade-offs between the sampling of the optical system needed for characterization and image quality are presented. Examples are shown for aberrations in a classic optical system and for chromatic aberrations. The technique is also applied to a wide-angle optical system with a full field of view of 140° (FFOV 140°). This technique could be used to further simplify or minimize optical systems.
An automatic calibration procedure for remote eye-gaze tracking systems.
Model, Dmitri; Guestrin, Elias D; Eizenman, Moshe
2009-01-01
Remote gaze estimation systems use calibration procedures to estimate subject-specific parameters that are needed for the calculation of the point-of-gaze. In these procedures, subjects are required to fixate on a specific point or points at specific time instances. Advanced remote gaze estimation systems can estimate the optical axis of the eye without any personal calibration procedure, but use a single calibration point to estimate the angle between the optical axis and the visual axis (line-of-sight). This paper presents a novel automatic calibration procedure that does not require active user participation. To estimate the angles between the optical and visual axes of each eye, this procedure minimizes the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display (e.g., watching a video clip). Simulation results demonstrate that the performance of the algorithm improves as the range of viewing angles increases. For a subject sitting 75 cm in front of an 80 cm x 60 cm display (40" TV) the standard deviation of the error in the estimation of the angles between the optical and visual axes is 0.5 degrees.
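The idea of choosing per-eye axis offsets that bring the two eyes' gaze points together can be illustrated with a deliberately simplified one-dimensional toy model. The grid search and the constant-offset parameterization below are stand-ins for the paper's estimation of the 3D angles between optical and visual axes; all names and numbers are hypothetical.

```python
import numpy as np

def calibrate_offsets(left_pog, right_pog, offset_grid):
    """Pick per-eye offsets minimizing the left/right gaze-point distance.

    `left_pog` and `right_pog` are 1-D arrays of optical-axis intersection
    points on the display recorded while the subject views it naturally.
    A constant horizontal offset per eye stands in for the angle between
    the optical and visual axes; the real method estimates 3-D angles.
    """
    best = None
    for dl in offset_grid:
        for dr in offset_grid:
            dist = np.mean(np.abs((left_pog + dl) - (right_pog + dr)))
            if best is None or dist < best[0]:
                best = (dist, dl, dr)
    return best[1], best[2]

# Toy data: the left eye's uncorrected gaze lands 1.0 unit to the right of
# the right eye's, so offsets differing by -1.0 align them.
left = np.array([1.0, 2.0, 3.0])
right = np.array([0.0, 1.0, 2.0])
dl, dr = calibrate_offsets(left, right, [-1.0, -0.5, 0.0, 0.5, 1.0])
```

As in the paper, a wider range of natural viewing angles gives the minimization more leverage to separate the two offsets.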
Mitchnick, Krista A; Wideman, Cassidy E; Huff, Andrew E; Palmer, Daniel; McNaughton, Bruce L; Winters, Boyer D
2018-05-15
The capacity to recognize objects from different view-points or angles, referred to as view-invariance, is an essential process that humans engage in daily. Currently, the ability to investigate the neurobiological underpinnings of this phenomenon is limited, as few ethologically valid view-invariant object recognition tasks exist for rodents. Here, we report two complementary, novel view-invariant object recognition tasks in which rodents physically interact with three-dimensional objects. Prior to experimentation, rats and mice were given extensive experience with a set of 'pre-exposure' objects. In a variant of the spontaneous object recognition task, novelty preference for pre-exposed or new objects was assessed at various angles of rotation (45°, 90° or 180°); unlike control rodents, for whom the objects were novel, rats and mice tested with pre-exposed objects did not discriminate between rotated and un-rotated objects in the choice phase, indicating substantial view-invariant object recognition. Secondly, using automated operant touchscreen chambers, rats were tested on pre-exposed or novel objects in a pairwise discrimination task, where the rewarded stimulus (S+) was rotated (180°) once rats had reached acquisition criterion; rats tested with pre-exposed objects re-acquired the pairwise discrimination following S+ rotation more effectively than those tested with new objects. Systemic scopolamine impaired performance on both tasks, suggesting involvement of acetylcholine at muscarinic receptors in view-invariant object processing. These tasks present novel means of studying the behavioral and neural bases of view-invariant object recognition in rodents. Copyright © 2018 Elsevier B.V. All rights reserved.
MISR Global Images See the Light of Day
NASA Technical Reports Server (NTRS)
2002-01-01
As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude.The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaiced to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another.The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes. 
Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is in front of you. Relative to the nadir view, this geometry accentuates the appearance of polar clouds, and can even reveal clouds that are invisible in the nadir direction. In relatively clear ocean areas, the oblique-angle composite is generally brighter than its nadir counterpart due to enhanced reflection of light by atmospheric particulates. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
A Summer View of Russia's Lena Delta and Olenek
NASA Technical Reports Server (NTRS)
2004-01-01
These views of the Russian Arctic were acquired by NASA's Multi-angle Imaging SpectroRadiometer (MISR) instrument on July 11, 2004, when the brief arctic summer had transformed the frozen tundra and the thousands of lakes, channels, and rivers of the Lena Delta into a fertile wetland, and when the usual blanket of thick snow had melted from the vast plains and taiga forests. This set of three images covers an area in the northern part of the Eastern Siberian Sakha Republic. The Olenek River wends northeast from the bottom of the images to the upper left, and the top portions of the images are dominated by the delta into which the mighty Lena River empties when it reaches the Laptev Sea. At left is a natural color image from MISR's nadir (vertical-viewing) camera, in which the rivers appear murky due to the presence of sediment, and photosynthetically-active vegetation appears green. The center image is also from MISR's nadir camera, but is a false color view in which the predominant red color is due to the brightness of vegetation at near-infrared wavelengths. The most photosynthetically active parts of this area are the Lena Delta, in the lower half of the image, and throughout the great stretch of land that curves across the Olenek River and extends northeast beyond the relatively barren ranges of the Volyoi mountains (the pale tan-colored area to the right of image center). The right-hand image is a multi-angle false-color view made from the red band data of the 60° backward, nadir, and 60° forward cameras, displayed as red, green and blue, respectively. Water appears blue in this image because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. Much of the landscape and many low clouds appear purple since these surfaces are both forward and backward scattering, and clouds that are further from the surface appear in a different spot for each view angle, creating a rainbow-like appearance. 
However, the vegetated region that is darker green in the natural color nadir image also appears to exhibit a faint greenish hue in the multi-angle composite. A possible explanation for this subtle green effect is that the taiga forest trees (or dwarf-shrubs) are not too dense here. Since the nadir camera is more likely to observe any gaps between the trees or shrubs, and since the vegetation is not as bright (in the red band) as the underlying soil or surface, the brighter underlying surface results in an area that is relatively brighter at the nadir view angle. Accurate maps of vegetation structural units are an essential part of understanding the seasonal exchanges of energy and water at the Earth's surface, and of preserving the biodiversity in these regions. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82° north and 82° south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 24273. The panels cover an area of about 230 kilometers x 420 kilometers, and utilize data from blocks 30 to 34 within World Reference System-2 path 134. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
81. VIEW OF VAL LOOKING NORTH AS SEEN FROM THE ...
81. VIEW OF VAL LOOKING NORTH AS SEEN FROM THE RESERVOIR SHOWING TWO LAUNCHING TUBES ON THE LAUNCHER BRIDGE, Date unknown, circa 1952. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
63. VIEW LOOKING DOWN VAL LAUNCHING SLAB SHOWING DRIVE GEARS, ...
63. VIEW LOOKING DOWN VAL LAUNCHING SLAB SHOWING DRIVE GEARS, CABLES, LAUNCHER RAILS, PROJECTILE CAR AND SUPPORT CARRIAGE, April 8, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA
Goodsitt, Mitchell M.; Helvie, Mark A.; Zelakiewicz, Scott; Schmitz, Andrea; Noroozian, Mitra; Paramagul, Chintana; Roubidoux, Marilyn A.; Nees, Alexis V.; Neal, Colleen H.; Carson, Paul; Lu, Yao; Hadjiiski, Lubomir; Wei, Jun
2014-01-01
Purpose To investigate the dependence of microcalcification cluster detectability on tomographic scan angle, angular increment, and number of projection views acquired with digital breast tomosynthesis (DBT). Materials and Methods A prototype DBT system operated in step-and-shoot mode was used to image breast phantoms. Four 5-cm-thick phantoms embedded with 81 simulated microcalcification clusters of three speck sizes (subtle, medium, and obvious) were imaged by using a rhodium target and rhodium filter with 29 kV, 50 mAs, and seven acquisition protocols. Fixed angular increments were used in four protocols (denoted as scan angle, angular increment, and number of projection views, respectively: 16°, 1°, and 17; 24°, 3°, and nine; 30°, 3°, and 11; and 60°, 3°, and 21), and variable increments were used in three (40°, variable, and 13; 40°, variable, and 15; and 60°, variable, and 21). The reconstructed DBT images were interpreted by six radiologists who located the microcalcification clusters and rated their conspicuity. Results The mean sensitivity for detection of subtle clusters ranged from 80% (22.5 of 28) to 96% (26.8 of 28) for the seven DBT protocols; the highest sensitivity was achieved with the 16°, 1°, and 17 protocol (96%), but the difference was significant only for the 60°, 3°, and 21 protocol (80%, P < .002) and did not reach significance for the other five protocols (P = .01–.15). The mean sensitivity for detection of medium and obvious clusters ranged from 97% (28.2 of 29) to 100% (24 of 24), but the differences fell short of significance (P = .08 to >.99). 
The conspicuity of subtle and medium clusters with the 16°, 1°, and 17 protocol was rated higher than those with other protocols; the differences were significant for subtle clusters with the 24°, 3°, and nine protocol and for medium clusters with the 24°, 3°, and nine; 30°, 3°, and 11; 60°, 3°, and 21; and 60°, variable, and 21 protocols (P < .002). Conclusion With imaging that did not include x-ray source motion or patient motion during acquisition of the projection views, narrow-angle DBT provided higher sensitivity and conspicuity than wide-angle DBT for subtle microcalcification clusters. © RSNA, 2014 PMID:25007048
1986-01-24
Range: 236,000 km (147,000 mi). Resolution: 33 km (20 mi). P-29525B/W This Voyager 2 image reveals a continuous distribution of small particles throughout the Uranus ring system. This unique geometry, the highest phase angle at which Voyager imaged the rings, allows us to see lanes of fine dust particles not visible from other viewing angles. All the previously known rings are visible; however, some of the brightest features in the image are bright dust lanes not previously seen. The combination of this unique geometry and a long, 96-second exposure allowed this spectacular observation, acquired through the clear filter of Voyager 2's wide-angle camera. The long exposure produced a noticeable, non-uniform smear, as well as streaks due to trailed stars.
On the location of the Io plasma torus: Voyager 1 observations
NASA Astrophysics Data System (ADS)
Volwerk, Martin
2018-06-01
The Voyager 1 outbound ultraviolet observations of the Io plasma torus are used to determine the location of the ansae, to obtain a third viewing angle of this structure in the Jovian magnetosphere. At an angle of -114° with respect to the Sun-Jupiter line, or a Jovian local time of 04:30 LT, the Voyager 1 data deliver a distance of 5.74±0.10 RJ for the approaching and 5.83±0.15 RJ for the receding ansa. Various periodicities in the radial distance, brightness and width of the ansae are seen with respect to system III longitude and Io phase angle. The torus ribbon feature does not appear in all ansa scans.
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-01-01
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing. PMID:28304371
View angle effects on relationships between leaf area index in wheat and vegetation indices
NASA Astrophysics Data System (ADS)
Chen, H.; Li, W.; Huang, W.; Niu, Z.
2016-12-01
The effects of plant type and view angle on the canopy-reflected spectrum cannot be ignored when estimating leaf area index (LAI) with remote sensing vegetation indices. Vegetation indices derived from nadir-viewing remote sensors are insufficient for LAI estimation because they misinterpret structural characteristics, whereas indices derived from multi-angular remote sensors have the potential to improve LAI detection. However, view angle effects on the relationships between these indices and LAI for low standing crops (e.g., wheat) have not been fully evaluated, which limits their application to consistent and accurate vegetation monitoring. View angle effects for two types of winter wheat (wheat 411, erectophile; and wheat 9507, planophile) on the relationship between LAI and spectral reflectance are assessed and compared in this study. An evaluation is conducted with in-situ measurements of LAI and bidirectional reflectance in the principal plane from -60° (back-scattering direction) to 60° (forward-scattering direction) over the growth cycle of winter wheat. A variety of published vegetation indices (VIs) are calculated from the BRDF data. Additionally, all band combinations are used to calculate normalized difference spectral indices (NDSI) and simple subtraction indices (SSI). The performance of these indices, along with raw reflectance and reflectance derivatives, for LAI estimation is examined based on a linearity comparison. The results will be helpful in further developing multi-angle remote sensing models for accurate LAI evaluation.
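The NDSI and SSI band-pair screening mentioned above can be sketched directly from the standard two-band definitions; the wavelengths and reflectance values below are hypothetical, and the subsequent regression of each index against measured LAI is not shown.

```python
def ndsi(r_i, r_j):
    """Normalized difference spectral index of two reflectance bands."""
    return (r_i - r_j) / (r_i + r_j)

def ssi(r_i, r_j):
    """Simple subtraction index of two reflectance bands."""
    return r_i - r_j

def all_band_pairs(reflectance):
    """NDSI and SSI for every unordered band pair.

    `reflectance` maps band-center wavelength (nm) to reflectance; the
    study would then compare each index against measured LAI.
    """
    bands = sorted(reflectance)
    out = {}
    for a in bands:
        for b in bands:
            if a < b:
                out[("NDSI", a, b)] = ndsi(reflectance[a], reflectance[b])
                out[("SSI", a, b)] = ssi(reflectance[a], reflectance[b])
    return out

# Hypothetical red (670 nm) and near-infrared (800 nm) reflectances.
indices = all_band_pairs({670: 0.05, 800: 0.45})
```

Screening all pairs this way at each view angle is what makes the angular comparison in the study tractable.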
NASA Astrophysics Data System (ADS)
Miller, I.; Forster, B. C.; Laffan, S. W.
2012-07-01
Spectral reflectance characteristics of substrates in a coral reef environment are often measured in the field by viewing a substrate at nadir. However, viewing a substrate from multiple angles would likely result in different spectral characteristics for most coral reef substrates and provide valuable information on structural properties. To understand the relationship between the morphology of a substrate and its spectral response it is necessary to correct the observed above-water radiance for the effects of atmosphere and water attenuation, at a number of view and azimuth angles. In this way the actual surface reflectance can be determined. This research examines the air-water surface interaction for two hypothetical atmospheric conditions (clear Rayleigh scattering and totally cloud-covered) and the global irradiance reaching the benthic surface. It accounts for both water scattering and absorption, with simplifications for shallow water conditions, as well as the additive effect of background reflectance being reflected at the water-air surface at angles greater than the critical refraction angle (~48°). A model was developed to correct measured above-water radiance along the refracted view angle for its decrease due to path attenuation and the "n squared law of radiance" and the additive surface reflectance. This allows bidirectional benthic surface reflectance and nadir-normalised reflectance to be determined. These theoretical models were adapted to incorporate above-water measures relative to a standard, diffuse, white reference panel. The derived spectral signatures of a number of coral and non-coral benthic surfaces compared well with other published results, and the signatures and nadir-normalised reflectance of the corals and other benthic surface classes indicate good class separation.
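A minimal sketch of the correction chain (Snell refraction of the view angle, the "n squared law of radiance", and exponential attenuation along the refracted slant path) might look like the following. It simplifies the model in the abstract by treating the additive surface-reflected component as a constant and the interface transmittance as unity; all parameter names are illustrative.

```python
import math

def benthic_radiance(above_water_radiance, view_angle_deg, depth_m,
                     attenuation_coeff, n_water=1.34, surface_term=0.0):
    """Correct above-water radiance to radiance leaving the benthic surface.

    Applies Snell refraction of the view angle, the n-squared radiance
    change at the water-air interface, and exponential attenuation along
    the refracted slant path through the water column.
    """
    # Refracted in-water view angle from Snell's law.
    theta_w = math.asin(math.sin(math.radians(view_angle_deg)) / n_water)
    # Slant path length through the water column.
    path = depth_m / math.cos(theta_w)
    # Remove the additive surface term, undo the n^2 radiance change,
    # then undo attenuation along the path.
    water_leaving = above_water_radiance - surface_term
    return water_leaving * n_water ** 2 * math.exp(attenuation_coeff * path)

# Nadir view at zero depth: only the interface term remains.
L_b = benthic_radiance(1.0, 0.0, 0.0, 0.5)
```

Dividing the corrected radiance by the panel-referenced irradiance would then give the bidirectional benthic reflectance described in the abstract.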
Soybean canopy reflectance as a function of view and illumination geometry
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator); Ranson, K. J.; Vanderbilt, V. C.; Biehl, L. L.; Robinson, B. F.
1982-01-01
The results of an experiment designed to characterize a soybean field by its reflectance at various view and illumination angles and by its physical and agronomic attributes are presented. Reflectances were calculated from measurements at four wavelength bands through eight view azimuth and seven view zenith directions for various solar zenith and azimuth angles during portions of three days. An ancillary data set consisting of the agronomic and physical characteristics of the soybean field is described. The results indicate that the distribution of reflectance from a soybean field is a function of the solar illumination and viewing geometry, wavelength and row direction, as well as the state of development of the canopy. Shadows between rows greatly affected the reflectance in the visible wavelength bands and to a lesser extent in the near infrared wavelengths. A model is proposed that describes the reflectance variation as a function of projected solar and projected viewing angles. The model appears to approximate the reflectance variations in the visible wavelength bands from a canopy with well defined row structure.
Thermal IR exitance model of a plant canopy
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Smith, J. A.; Link, L. E.
1981-01-01
A thermal IR exitance model of a plant canopy based on a mathematical abstraction of three horizontal layers of vegetation was developed. Canopy geometry within each layer is quantitatively described by the foliage and branch orientation distributions and number density. Given this geometric information for each layer and the driving meteorological variables, a system of energy budget equations was determined and solved for average layer temperatures. These estimated layer temperatures, together with the angular distributions of radiating elements, were used to calculate the emitted thermal IR radiation as a function of view angle above the canopy. The model was applied to a lodgepole pine (Pinus contorta) canopy over a diurnal cycle. Simulated and measured radiometric average temperatures of the midcanopy layer agreed to within 2 °C. Simulation results suggested that canopy geometry can significantly influence the effective radiant temperature recorded at varying sensor view angles.
Subpixel area-based evaluation for crosstalk suppression in quasi-three-dimensional displays.
Zhuang, Zhenfeng; Surman, Phil; Cheng, Qijia; Thibault, Simon; Zheng, Yuanjin; Sun, Xiao Wei
2017-07-01
A subpixel area-based evaluation method for an improved slanted lenticular film that minimizes the crosstalk in a quasi-three-dimensional (Q3D) display is proposed in this paper. To identify an optimal slant angle of the film, a subpixel area-based measurement is derived to evaluate the crosstalk among viewing regions of the intended subpixel and adjacent unintended subpixel by taking the real subpixel shape and black matrix into consideration. The subpixel mapping, which corresponds to the optimal slant angle of the film, can then be determined. Meanwhile, the viewing zone characteristics are analyzed to balance the light intensity in both right and left eye channels. A compact and portable Q3D system has been built and appropriate experiments have been applied. The results indicate that significant improvements in both crosstalk and resolution can be obtained with the proposed technique.
Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array
NASA Astrophysics Data System (ADS)
Houben, Sebastian
2015-03-01
The variety of vehicle-mounted sensors needed to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping field of view of a multi-camera fisheye surround view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss the causes of the caveats shown and how to avoid them, and present first results on a prototype topview setup.
Evaluation of imaging quality for flat-panel detector based low dose C-arm CT system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Chang-Woo; Cha, Bo Kyung; Jeon, Sungchae
The image quality associated with the extent of the angle of gantry rotation, the number of projection views, and the X-ray radiation dose was investigated in a flat-panel detector (FPD) based C-arm cone-beam computed tomography (CBCT) system for medical applications. A prototype CBCT system used an X-ray tube (A-132, Varian Inc.) with a rhenium-tungsten molybdenum target and a flat-panel a-Si X-ray detector (PaxScan 4030CB, Varian Inc.) with a 397 x 298 mm active area, 388 μm pixel pitch, and 1024 x 768 pixels in 2 by 2 binning mode. The comparison of X-ray imaging quality was carried out using the Feldkamp, Davis, and Kress (FDK) reconstruction algorithm under different projection-acquisition conditions. In this work, head-and-dental (75 kVp/20 mA) and chest (90 kVp/25 mA) phantoms were used to evaluate image quality. For the 3D reconstruction, 361 projections (30 fps x 12 s) were acquired during a 360° gantry rotation at 1° intervals. A Parker weighting function was applied to handle redundant data and improve the reconstructed image quality in a mobile C-arm system with limited rotation angles. The reconstructed 3D images were compared qualitatively in terms of scan protocols (projection views, rotation angles, and exposure dose). Furthermore, the image quality will be evaluated with respect to X-ray dose and limited projection data for an FPD-based mobile C-arm CBCT system. (authors)
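The Parker weighting mentioned above has a standard closed form for short-scan fan-beam data; the sketch below shows that standard weight, which may differ in detail from the exact variant used in this system.

```python
import math

def parker_weight(beta, gamma, gamma_m):
    """Standard Parker weight for a short-scan fan-beam projection.

    beta: projection (gantry) angle in radians, 0 <= beta <= pi + 2*gamma_m
    gamma: fan angle of the ray in radians, |gamma| < gamma_m
    gamma_m: half fan angle in radians
    The taper makes each pair of redundant rays sum to unit weight.
    """
    if beta < 0.0 or beta > math.pi + 2.0 * gamma_m:
        return 0.0
    if beta <= 2.0 * (gamma_m - gamma):
        return math.sin(math.pi / 4.0 * beta / (gamma_m - gamma)) ** 2
    if beta <= math.pi - 2.0 * gamma:
        return 1.0
    return math.sin(math.pi / 4.0 * (math.pi + 2.0 * gamma_m - beta)
                    / (gamma_m + gamma)) ** 2
```

Each projection row is multiplied by this weight before FDK filtering and backprojection, so the redundantly measured rays at the start and end of the scan do not double-count.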
Robust design study on the wide angle lens with free distortion for mobile lens
NASA Astrophysics Data System (ADS)
Kim, Taeyoung; Yong, Liu; Xu, Qing
2017-10-01
Recently, a trend toward wide-angle mobile imaging lenses has been attracting attention. In particular, customer demand for capturing wider scenes requires a lens field of view wider than 100°, which calls for introducing a retro-focus type lens into mobile imaging. However, a mobile phone imaging lens always faces many constraints, such as low total length, low F/#, and high performance, and fabrication sensitivity becomes more severe with a wide-angle FOV. In this paper, we investigate an optical lens design that satisfies all requirements for a mobile imaging lens. To achieve low cost and a small optical system depth, we used plastic materials for all elements and considered productivity for realization. The lateral color is minimized to less than 2 pixels and optical distortion to less than 5%. We also divided the optical system into two parts for robust design: compensation between the two groups can help increase yield in practice. Two-group alignment may thus be a promising solution for high-yield wide-angle lenses.
Observing System Simulations for Small Satellite Formations Estimating Bidirectional Reflectance
NASA Technical Reports Server (NTRS)
Nag, Sreeja; Gatebe, Charles K.; de Weck, Olivier
2015-01-01
The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicing), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF and airborne instruments are limited in the spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each satellite with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: number of satellites, measurement spread in the view zenith and relative azimuth with respect to solar plane, solar zenith angle, BRDF models and wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to the variables allows simplification of the OSSE, enabling its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.
Observing system simulations for small satellite formations estimating bidirectional reflectance
NASA Astrophysics Data System (ADS)
Nag, Sreeja; Gatebe, Charles K.; Weck, Olivier de
2015-12-01
The bidirectional reflectance distribution function (BRDF) gives the reflectance of a target as a function of illumination geometry and viewing geometry, hence carries information about the anisotropy of the surface. BRDF is needed in remote sensing for the correction of view and illumination angle effects (for example in image standardization and mosaicing), for deriving albedo, for land cover classification, for cloud detection, for atmospheric correction, and other applications. However, current spaceborne instruments provide sparse angular sampling of BRDF and airborne instruments are limited in the spatial and temporal coverage. To fill the gaps in angular coverage within spatial, spectral and temporal requirements, we propose a new measurement technique: use of small satellites in formation flight, each satellite with a VNIR (visible and near infrared) imaging spectrometer, to make multi-spectral, near-simultaneous measurements of every ground spot in the swath at multiple angles. This paper describes an observing system simulation experiment (OSSE) to evaluate the proposed concept and select the optimal formation architecture that minimizes BRDF uncertainties. The variables of the OSSE are identified: number of satellites, measurement spread in the view zenith and relative azimuth with respect to solar plane, solar zenith angle, BRDF models and wavelength of reflection. Analyzing the sensitivity of BRDF estimation errors to the variables allows simplification of the OSSE, enabling its use to rapidly evaluate formation architectures. A 6-satellite formation is shown to produce lower BRDF estimation errors, purely in terms of angular sampling as evaluated by the OSSE, than a single spacecraft with 9 forward-aft sensors. We demonstrate the ability to use OSSEs to design small satellite formations as complements to flagship mission data. The formations can fill angular sampling gaps and enable better BRDF products than currently possible.
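The core OSSE idea above, judging an angular sampling pattern by how well it constrains a reflectance model, can be illustrated with a toy two-parameter model fit. The cosine model and all numbers are stand-ins for the actual BRDF kernel models and formation geometries in the paper.

```python
import math
import random

def toy_osse(sample_angles_deg, true_coeffs, noise_sigma=0.0, seed=0):
    """Fit a two-parameter reflectance model from a set of view angles.

    Truth is r(theta) = c0 + c1*cos(theta), a stand-in for a real BRDF
    kernel model; `sample_angles_deg` plays the role of one formation
    architecture's view zenith angles. Recovery error indicates how well
    that angular sampling constrains the model.
    """
    rng = random.Random(seed)
    design = [(1.0, math.cos(math.radians(t))) for t in sample_angles_deg]
    obs = [true_coeffs[0] + true_coeffs[1] * x1 + rng.gauss(0.0, noise_sigma)
           for _, x1 in design]
    # Two-parameter least squares via the normal equations.
    s11 = sum(a * a for a, _ in design)
    s12 = sum(a * b for a, b in design)
    s22 = sum(b * b for _, b in design)
    t1 = sum(a * y for (a, _), y in zip(design, obs))
    t2 = sum(b * y for (_, b), y in zip(design, obs))
    det = s11 * s22 - s12 * s12
    return (s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det

# Noise-free sampling at four hypothetical view zenith angles.
c0, c1 = toy_osse([0.0, 20.0, 40.0, 60.0], (0.3, 0.2))
```

Repeating the fit with noise over many candidate angle sets, as the OSSE does with full BRDF models, ranks the formation architectures by estimation error.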
Ogawa, Hiroyuki; Hasegawa, Seiichirou; Tsukada, Sachiyuki; Matsubara, Masaaki
2018-06-01
We developed an acetabular cup placement device, the AR-HIP system, using augmented reality (AR). The AR-HIP system allows the surgeon to view an acetabular cup image superimposed in the surgical field through a smartphone. The smartphone also shows the placement angle of the acetabular cup. This preliminary study was performed to assess the accuracy of the AR-HIP system for acetabular cup placement during total hip arthroplasty (THA). We prospectively measured the placement angles using both a goniometer and AR-HIP system in 56 hips of 54 patients undergoing primary THA. We randomly determined the order of intraoperative measurement using the 2 devices. At 3 months after THA, the placement angle of the acetabular cup was measured on computed tomography images. The primary outcome was the absolute value of the difference between intraoperative and postoperative computed tomography measurements. The measurement angle using AR-HIP was significantly more accurate in terms of radiographic anteversion than that using a goniometer (2.7° vs 6.8°, respectively; mean difference 4.1°; 95% confidence interval, 3.0-5.2; P < .0001). There was no statistically significant difference in terms of radiographic inclination (2.1° vs 2.6°; mean difference 0.5°; 95% confidence interval, -1.1 to 0.1; P = .13). In this pilot study, the AR-HIP system provided more accurate information regarding acetabular cup placement angle than the conventional method. Further studies are required to confirm the utility of the AR-HIP system as a navigation tool. Copyright © 2018 Elsevier Inc. All rights reserved.
Word Knowledge in a Theory of Reading Comprehension
ERIC Educational Resources Information Center
Perfetti, Charles; Stafura, Joseph
2014-01-01
We reintroduce a wide-angle view of reading comprehension, the Reading Systems Framework, which places word knowledge in the center of the picture, taking into account the progress made in comprehension research and theory. Within this framework, word-to-text integration processes can serve as a model for the study of local comprehension…
Equity and Segregation in the Spanish Education System
ERIC Educational Resources Information Center
Ferrer, Ferran; Ferrer, Gerard; Baldellou, Jose Luis Castel
2006-01-01
This article discusses educational inequalities within the territorial context of Spain, and more particularly in the autonomous community of Catalonia. The analysis, which takes a comparative international approach, looks at the question from two points of view. First, from the angle of students, an analysis is made of the impact produced by…
Height and Motion of the Chikurachki Eruption Plume
NASA Technical Reports Server (NTRS)
2003-01-01
The height and motion of the ash and gas plume from the April 22, 2003, eruption of the Chikurachki volcano are portrayed in these views from the Multi-angle Imaging SpectroRadiometer (MISR). Situated within the northern portion of the volcanically active Kuril Island group, the Chikurachki volcano is an active stratovolcano on Russia's Paramushir Island (just south of the Kamchatka Peninsula). In the upper panel of the still image pair, this scene is displayed as a natural-color view from MISR's vertical-viewing (nadir) camera. The white and brownish-grey plume streaks several hundred kilometers from the eastern edge of Paramushir Island toward the southeast. The darker areas of the plume typically indicate volcanic ash, while the white portions of the plume indicate entrained water droplets and ice. According to the Kamchatkan Volcanic Eruptions Response Team (KVERT), the temperature of the plume near the volcano on April 22 was -12° C. The lower panel shows heights derived from automated stereoscopic processing of MISR's multi-angle imagery, in which the plume is determined to reach heights of about 2.5 kilometers above sea level. Heights for clouds above and below the eruption plume were also retrieved, including the high-altitude cirrus clouds in the lower left (orange pixels). The distinctive patterns of these features provide sufficient spatial contrast for MISR's stereo height retrieval to perform automated feature matching between the images acquired at different view angles. Places where clouds or other factors precluded a height retrieval are shown in dark gray. The multi-angle 'fly-over' animation (below) allows the motion of the plume and of the surrounding clouds to be directly observed. 
The frames of the animation consist of data acquired by the 70-degree, 60-degree, 46-degree and 26-degree forward-viewing cameras in sequence, followed by the images from the nadir camera and each of the four backward-viewing cameras, ending with the view from the 70-degree backward camera. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously from pole to pole, and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17776. The panels cover an area of approximately 296 kilometers x 216 kilometers (still images) and 185 kilometers x 154 kilometers (animation), and utilize data from blocks 50 to 51 within World Reference System-2 path 100. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology. [figure removed for brevity, see original site]
Imaging of trabecular meshwork using Bessel-Gauss light sheet with fluorescence
NASA Astrophysics Data System (ADS)
Jie Jeesmond Hong, Xun; Shinoj, V. K.; Murukeshan, V. M.; Baskaran, M.; Aung, Tin
2017-03-01
Ocular imaging technology that holds promise for both fundamental investigation and clinical detection of glaucoma is still a challenging research area. A direct view of the trabecular meshwork (TM) with high resolution is not generally possible because the iridocorneal angle region is obstructed by the sclera overlap. The best approach to observe the aqueous outflow system (AOS) is therefore to view from the opposite angle. In this research work, we developed two imaging systems for the high resolution ex vivo studies of the AOS inside porcine eye, based on a Gaussian illuminated and a digitally scanned Bessel-Gauss beam light sheet fluorescence configurations. The digitally scanned Bessel-Gauss beam is able to overcome the trade-off between the length and thickness of the Gaussian light sheet to give better imaging performance. It has adequate spatial resolution to resolve critical anatomical structures such as the TM, thereby enabling objective information about the AOS. This non-contact and non-invasive imaging methodology with excellent safety profile is expected to be well received by vision researchers and clinicians in the evaluation and management of glaucoma.
NASA Astrophysics Data System (ADS)
Diallo, Mamadou S.; Glinka, Charles J.; Goddard, William A.; Johnson, James H.
2005-10-01
Fulvic acids (FA) and humic acids (HA) constitute 30-50% of dissolved organic matter in natural aquatic systems. In aqueous solutions, a commonly accepted view is that FA and HA exist as soluble macroligands at low concentration and as supramolecular aggregates at higher concentration. The size, shape and structure of these aggregates are still the subject of ongoing debate in the environmental chemistry literature. In this article, we use small angle neutron scattering (SANS) to assess the effects of solute concentration, solution pH and background electrolyte (NaCl) concentration on the structures of Suwannee River FA (SRFA) aggregates in D2O. The qualitative features of the SANS curves and data analysis are not consistent with the viewpoint that SRFA forms micelle-like aggregates as its concentration in aqueous solution increases. We find that SRFA forms fractal aggregates in D2O with size greater than 242 nm. The SRFA aggregates undergo a significant degree of restructuring in compactness as solution pH, solute concentration and NaCl concentration increase.
57. INTERIOR VIEW OF VAL BRIDGE STRUCTURE SHOWING LAUNCHING TUBE, ...
57. INTERIOR VIEW OF VAL BRIDGE STRUCTURE SHOWING LAUNCHING TUBE, STAIRS AND PORTION OF LAUNCHING DECK. NOTE SUPPORT CARRIAGE ASSEMBLY IN DISTANCE. Date unknown, circa March 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
2016-10-31
Saturn appears as a serene globe amid tranquil rings in this view from NASA's Cassini spacecraft. In reality, the planet's atmosphere is an ever-changing scene of high-speed winds and evolving weather patterns, punctuated by occasional large storms (see PIA14901). The rings consist of countless icy particles, which are continually colliding. Such collisions play a key role in the rings' numerous waves and wakes, which are the manifestation of the subtle influence of Saturn's moons and, indeed, the planet itself. The long duration of the Cassini mission has allowed scientists to study how the atmosphere and rings of Saturn change over time, providing much-needed insights into this active planetary system. The view looks toward the sunlit side of the rings from about 41 degrees above the ring plane. The image was taken with the Cassini spacecraft wide-angle camera on July 16, 2016 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 752,000 miles (1.21 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 6 degrees. Image scale is 45 miles (72 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20502
Inter-Comparison of MODIS and VIIRS Vegetation Indices Using One-Year Global Data
NASA Astrophysics Data System (ADS)
Miura, T.; Muratsuchi, J.; Obata, K.; Kato, A.; Vargas, M.; Huete, A. R.
2016-12-01
The Visible Infrared Imaging Radiometer Suite (VIIRS) sensor series of the Joint Polar Satellite System program is slated to continue the highly calibrated data stream initiated with the Earth Observing System Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. A number of geophysical products are being or will be produced from VIIRS data, including the "Top-of-the-Atmosphere (TOA)" Normalized Difference Vegetation Index (NDVI), the "Top-of-Canopy (TOC)" Enhanced Vegetation Index (EVI), and the TOC NDVI. In this study, we cross-compared vegetation indices (VIs) from the first VIIRS sensor aboard the Suomi National Polar-orbiting Partnership satellite with their Aqua MODIS counterparts using one year of global data. This study was aimed at developing a thorough understanding of the radiometric compatibility between the two VI datasets across the globe, seasons, a range of viewing angles, and land cover types. VIIRS and MODIS VI data from January-December 2015 were obtained at monthly intervals when their orbital tracks coincided. These data were projected and spatially aggregated onto a 0.0036-degree grid while screening for cloud and aerosol contamination using their respective quality flags. VIIRS-MODIS observation pairs with near-identical sun-target-view angles were extracted from each of these monthly image pairs for cross-comparison. The four VIs of TOA NDVI, TOC NDVI, TOC EVI, and TOC EVI2 (a two-band version of the EVI) were analyzed. Between MODIS and VIIRS, TOA NDVI, TOC NDVI, and TOC EVI2 had very small overall mean differences (MDs) of 0.014, 0.013, and 0.013 VI units, respectively, whereas TOC EVI had a slightly larger overall MD of 0.023 EVI units, attributed to the disparate blue bands of the two sensors. These systematic differences were consistent across the one-year period. With respect to sun-target-view geometry, MDs were also consistent across the view zenith angle range, but were always lower for forward- than backward-viewing geometry. 
MDs showed large land cover dependencies for TOA NDVI and TOC NDVI, varying 10-fold from 0.002 for forests to 0.02 for sparsely vegetated areas. They were consistent across land cover types for TOC EVI and TOC EVI2. Future studies should address the impact of sun-target-view geometry on cross-sensor VI comparisons.
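For reference, the vegetation indices compared above have standard closed forms; a minimal sketch with hypothetical reflectance values for a vegetated pixel is:

```python
def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    # Enhanced Vegetation Index (MODIS-heritage form with a blue band)
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def evi2(nir, red):
    # two-band EVI, usable when sensors' blue bands differ
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

# hypothetical surface reflectances for a vegetated pixel
nir, red, blue = 0.45, 0.08, 0.04
print(ndvi(nir, red), evi(nir, red, blue), evi2(nir, red))
```

EVI2 is designed to track EVI without using a blue band, which is why it is a useful bridge index when two sensors' blue bands differ, as noted above for MODIS and VIIRS.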
Design of the computerized 3D endoscopic imaging system for delicate endoscopic surgery.
Song, Chul-Gyu; Kang, Jin U
2011-02-01
This paper describes a 3D endoscopic video system designed to improve visualization and enhance the ability of the surgeon to perform delicate endoscopic surgery. In a comparison of the polarized and conventional electric shutter-type stereo imaging systems, the former was found to be superior in terms of both accuracy and speed for suturing and for the loop pass test. Among the groups performing loop passing and suturing, there was no significant difference in task performance between the 2D and 3D modes; however, suturing was performed 15% faster (p < 0.05) in 3D mode by both groups. The results of our experiments show that the proposed 3D endoscopic system has a sufficiently wide viewing angle and zone for multi-viewing.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Chen, Jing; Zhang, Yongguang; Qiu, Feng; Fan, Weiliang; Ju, Weimin
2017-04-01
The gross primary production (GPP) of terrestrial ecosystems constitutes the largest global land carbon flux and exhibits significant spatial and temporal variations. Owing to its wide spatial coverage, remote sensing technology has proven useful for improving the estimation of GPP in combination with light use efficiency (LUE) models. Accurate estimation of LUE is essential for calculating GPP using remote sensing data and LUE models at regional and global scales. A promising method for estimating LUE is the photochemical reflectance index (PRI = (R531-R570)/(R531 + R570), where R531 and R570 are reflectances at wavelengths 531 and 570 nm) through remote sensing. However, it has been documented that there are certain issues with PRI at the canopy scale, which need to be considered systematically. For this purpose, an improved tower-based automatic canopy multi-angle hyperspectral observation system has been operating at the Qianyanzhou flux station in China since January 2013. In each 15-minute observation cycle, PRI was observed at four view zenith angles fixed at the solar zenith angle and (37°, 47°, 57°) or (42°, 52°, 62°), over the azimuth angle range from 45° to 325° (defined from geodetic north). To improve the ability of directional PRI observations to track canopy LUE, the canopy is treated as two big leaves: sunlit and shaded. On the basis of a geometrical optical model, the observed canopy reflectance for each view angle is separated into four components, i.e. sunlit and shaded leaves and sunlit and shaded backgrounds. To determine the fractions of these four components at each view angle, three models based on different theories are tested for simulating the fraction of sunlit leaves. Finally, a ratio of canopy reflectance to leaf reflectance is used to represent the fraction of sunlit leaves, and the fraction of shaded leaves is calculated with the four-scale geometrical optical model. 
Thus, sunlit and shaded PRI are estimated using least squares regression with multi-angle observations. At both half-hourly and daily time steps, the canopy-level two-leaf PRI (PRIt) effectively enhances (by >50% and >35%, respectively) the correlation between PRI and LUE derived from the tower flux measurements, relative to the big-leaf PRI (PRIb) taken as the arithmetic average of the multi-angle measurements in a given time interval. PRIt is very effective in detecting low-to-moderate drought stress on LUE at half-hourly time steps, but ineffective in detecting severe atmospheric water and heat stresses, probably because of an alternative radiative energy sink, i.e. photorespiration. Overall, the two-leaf approach effectively overcomes some external effects (e.g. sun-target-view geometry) that interfere with PRI signals.
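The two-leaf unmixing step described above (estimating sunlit and shaded PRI by least squares from multi-angle observations) can be sketched as follows; the sunlit fractions and component PRI values here are assumed for illustration, not taken from the study:

```python
import numpy as np

def pri(r531, r570):
    # photochemical reflectance index as defined in the abstract
    return (r531 - r570) / (r531 + r570)

# hypothetical sunlit-leaf fractions at four view zenith angles; in the
# study these come from the four-scale geometrical-optical model
f_sunlit = np.array([0.75, 0.60, 0.45, 0.30])
f_shaded = 1.0 - f_sunlit

pri_sun_true, pri_shade_true = -0.02, 0.01   # assumed component PRIs
pri_obs = f_sunlit * pri_sun_true + f_shaded * pri_shade_true

# least-squares unmixing of the multi-angle canopy PRI into components
A = np.column_stack([f_sunlit, f_shaded])
(pri_sun, pri_shade), *_ = np.linalg.lstsq(A, pri_obs, rcond=None)
print(pri_sun, pri_shade)
```

With noise-free synthetic data the regression recovers the assumed component values exactly; with real multi-angle observations the same system is solved in a least-squares sense.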
Active control of acoustic field-of-view in a biosonar system.
Yovel, Yossi; Falk, Ben; Moss, Cynthia F; Ulanovsky, Nachum
2011-09-01
Active-sensing systems abound in nature, but little is known about the systematic strategies that these systems use to scan the environment. Here, we addressed this question by studying echolocating bats, animals that have the ability to point their biosonar beam at a confined region of space. We trained Egyptian fruit bats to land on a target, under conditions of varying levels of environmental complexity, and measured their echolocation and flight behavior. The bats modulated the intensity of their biosonar emissions, and the spatial region they sampled, in a task-dependent manner. We report here that Egyptian fruit bats selectively change the emission intensity and the angle between the beam axes of sequentially emitted clicks, according to the distance to the target and the level of environmental complexity. In so doing, they effectively adjusted the spatial sector sampled by a pair of clicks, the "field-of-view." We suggest that the exact point within the beam that is directed towards an object (e.g., the beam's peak, maximal slope, etc.) is influenced by three competing task demands: detection, localization, and angular scanning, where the third factor is modulated by the field-of-view. Our results suggest that lingual echolocation (based on tongue clicks) is in fact much more sophisticated than previously believed. They also reveal a new parameter under active control in animal sonar: the angle between consecutive beams. Our findings suggest that acoustic scanning of space by mammals is highly flexible and modulated much more selectively than previously recognized.
NASA Astrophysics Data System (ADS)
Venolia, Dan S.; Williams, Lance
1990-08-01
A range of stereoscopic display technologies exist which are no more intrusive, to the user, than a pair of spectacles. Combining such a display system with sensors for the position and orientation of the user's point-of-view results in a greatly enhanced depiction of three-dimensional data. As the point of view changes, the stereo display channels are updated in real time. The face of a monitor or display screen becomes a window on a three-dimensional scene. Motion parallax naturally conveys the placement and relative depth of objects in the field of view. Most of the advantages of "head-mounted display" technology are achieved with a less cumbersome system. To derive the full benefits of stereo combined with motion parallax, both stereo channels must be updated in real time. This may limit the size and complexity of databases that can be viewed on processors of modest resources, and restrict the use of additional three-dimensional cues, such as texture mapping, depth cueing, and hidden surface elimination. Effective use of "full 3D" may still be undertaken in a non-interactive mode. Integral composite holograms have often been advanced as a powerful 3D visualization tool. Such a hologram is typically produced from a film recording of an object on a turntable, or a computer animation of an object rotating about one axis. The individual frames of film are multiplexed, in a composite hologram, in such a way as to be indexed by viewing angle. The composite may be produced as a cylinder transparency, which provides a stereo view of the object as if enclosed within the cylinder, which can be viewed from any angle. No vertical parallax is usually provided (this would require increasing the dimensionality of the multiplexing scheme), but the three-dimensional image is highly resolved and easy to view and interpret. Even a modest processor can duplicate the effect of such a precomputed display, provided sufficient memory and bus bandwidth. 
This paper describes the components of a stereo display system with user point-of-view tracking for interactive 3D, and a digital realization of integral composite display which we term virtual integral holography. The primary drawbacks of holographic display (film processing turnaround time and the difficulty of displaying scenes in full color) are obviated, and motion parallax cues provide easy 3D interpretation even for users who cannot see in stereo.
Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System
NASA Astrophysics Data System (ADS)
Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki
In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is implemented in an originally proposed dual-focal active camera system. In this system, a telephoto camera is equipped as an assistant to a wide-angle camera. The telephoto camera can capture a high-accuracy image of an object of interest in the field of view of the wide-angle camera. The image from the telephoto camera provides enough information for recognition when the resolution of the traffic sign in the wide-angle image is too low. In the proposed system, traffic sign detection and classification are processed separately on the images from the wide-angle camera and the telephoto camera. In addition, in order to detect traffic signs against complex backgrounds under different lighting conditions, we propose a color transformation that is invariant to lighting changes. This color transformation highlights the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide-angle camera. After detection, the system actively captures a high-accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on information from the wide-angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-accuracy image from the telephoto camera. Finally, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
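The paper's lighting-invariant color transformation is its own design; a generic illustration of the underlying idea, using simple rg-chromaticity normalization (which cancels a uniform change in illuminant intensity), might look like:

```python
import numpy as np

def chromaticity(rgb):
    # normalized rg chromaticity: a uniform scaling of all channels by
    # the illuminant intensity cancels in the ratio, so the output is
    # (approximately) invariant to the overall light level
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True)
    return rgb[..., :2] / np.where(s == 0.0, 1.0, s)

bright = chromaticity([200, 40, 40])  # red sign patch in direct sun
dark = chromaticity([50, 10, 10])     # same patch dimmed 4x (shadow)
print(bright, dark)
```

Both patches map to the same chromaticity coordinates, so a detector trained in this space sees the same sign signature in sun and in shadow.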
NASA Technical Reports Server (NTRS)
McFarland, Shane M.
2008-01-01
Field of view has always been a design feature paramount to helmet design, and in particular space suit design, where the helmet must provide an adequate field of view for a large range of activities, environments, and body positions. For Project Constellation, a slightly different approach to helmet requirement maturation was utilized: one that was less a direct function of body position and suit pressure and more a function of the mission segment in which the field of view is required. Through taxonomization of various parameters that affect suited FOV, as well as consideration of possible nominal and contingency operations during each mission segment, a reduction process was able to condense the large number of possible outcomes to only six unique field of view angle requirements that still captured all necessary variables without sacrificing fidelity. The specific field of view angles were defined by considering mission segment activities, historical performance of other suits, comparison between similar requirements (pressure visor up versus down, etc.), estimated field of view requirements from other teams (Orion, Altair, EVA), previous field of view tests, medical data for shirtsleeve field of view performance, and mapping of visual field data to generate 45-degree off-axis field of view requirements. Full resolution of several specific field of view angle requirements warranted further work, which consisted of low- and medium-fidelity field of view testing in the rear-entry I-Suit and DO27 helmet prototype. This paper serves to document this reduction process and the follow-up testing employed to write the Constellation requirements for helmet field of view.
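As a geometric aside, a field-of-view angle requirement of the kind discussed reduces, for a flat aperture and a centered eye point, to simple trigonometry; the visor width and eye relief below are illustrative values only, not Constellation numbers:

```python
import math

def fov_angle_deg(aperture_width, eye_distance):
    # full angular field of view through a flat aperture for an eye
    # point centered behind it (thin-aperture approximation)
    return 2.0 * math.degrees(math.atan((aperture_width / 2.0) / eye_distance))

# illustrative numbers only: 0.30 m visor opening, eye 0.10 m behind it
print(fov_angle_deg(0.30, 0.10))
```

The relation makes the design tension explicit: moving the eye point back (larger eye relief) shrinks the angular field unless the visor opening grows with it.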
Biophysical and spectral modeling for crop identification and assessment
NASA Technical Reports Server (NTRS)
Goel, N. S. (Principal Investigator)
1984-01-01
The development of a technique for estimating all canopy parameters occurring in a canopy reflectance (CR) model from measured CR data is summarized. The Suits and SAIL models for a uniform and homogeneous crop canopy were used to determine whether the leaf area index and the leaf angle distribution could be estimated. Optimal solar/view angles for measuring CR were also investigated. The use of CR in many wavelengths or spectral bands and of linear and nonlinear transforms of CRs for various solar/view angles and various spectral bands is discussed, as well as the inversion of radiance data inside the canopy, angle transforms for filtering out terrain slope effects, and modification of one-dimensional models.
Morizane, Kazuki; Takahashi, Toshiaki; Konishi, Fumihiko; Yamamoto, Haruyasu
2011-12-01
A new radiographic method using the anterior and posterior femoral condyles as landmarks to determine the rotational alignment of the femoral component in TKA has been developed. The new radiograph presents an axial view of the distal femur. The patients were asked to lie in the supine position and flex the knee approximately 120° to 130°. Radiographs were taken at an inclination angle of 20° to 30°. The condylar twist angle (CTA), i.e. the external rotation angle between the posterior condylar (PC) line and the clinical transepicondylar axis (TEA), and the trochlear line angle (TLA), i.e. the internal rotation angle between the anterior trochlear line and the clinical TEA, were measured. Images were taken of 129 knees in 87 patients with osteoarthritis of the knee. The measurement values obtained using our method were compared with those obtained using 3D reconstructed images from a 3-dimensional helical CT system (n = 35). The average CTA was 5.7° ± 2.8° and the average TLA was -5.6° ± 3.2°. The CTA was negatively correlated with the tibiofemoral angle (TFA), and the TLA was positively correlated with the TFA. The average difference between the TLA values obtained with this view and those obtained using 3D-CT was 0.5° ± 1.6°. The correlation between the radiograph and 3D-CT was higher for the TLA than for the CTA. This radiographic technique allows easy and simultaneous measurement of the CTA and TLA and may provide an alternative method for assessing the TEA of the femur during preoperative planning for TKA.
Toward a dose reduction strategy using model-based reconstruction with limited-angle tomosynthesis
NASA Astrophysics Data System (ADS)
Haneda, Eri; Tkaczyk, J. E.; Palma, Giovanni; Iordache, Rǎzvan; Zelakiewicz, Scott; Muller, Serge; De Man, Bruno
2014-03-01
Model-based iterative reconstruction (MBIR) is an emerging technique for several imaging modalities and applications including medical CT, security CT, PET, and microscopy. Its success derives from an ability to preserve image resolution and perceived diagnostic quality under impressively reduced signal levels. MBIR typically uses a cost optimization framework that models system geometry, photon statistics, and prior knowledge of the reconstructed volume. The challenge of tomosynthetic geometries is that the inverse problem becomes more ill-posed due to the limited angles, meaning the volumetric image solution is not uniquely determined by the incompletely sampled projection data. Furthermore, low signal levels introduce additional challenges due to noise. A fundamental strength of MBIR for limited-view, limited-angle geometries is that it provides a framework for constraining the solution consistent with prior knowledge of expected image characteristics. In this study, we analyze through simulation the capability of MBIR with respect to prior modeling components for limited-view, limited-angle digital breast tomosynthesis (DBT) under low dose conditions. A comparison to ground truth phantoms shows that MBIR with regularization achieves a higher level of fidelity and lower levels of blurring and streaking artifacts compared to other state-of-the-art iterative reconstructions, especially for high contrast objects. The benefit of contrast preservation along with fewer artifacts may lead to improved detectability of microcalcifications for more accurate cancer diagnosis.
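The role of regularization in stabilizing an ill-posed, incompletely sampled inversion can be shown with a toy linear problem standing in for limited-angle projection data; this is a generic Tikhonov sketch, not the MBIR cost function or DBT geometry of the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-posed problem: fewer measurements than unknowns, standing in
# for projection data that undersamples the volume at limited angles.
n_vox, n_meas = 40, 25
A = rng.normal(size=(n_meas, n_vox))          # hypothetical system matrix
x_true = np.zeros(n_vox)
x_true[10:15] = 1.0                           # simple piecewise "object"
y = A @ x_true + rng.normal(0, 0.01, n_meas)  # noisy measurements

def reconstruct(lam):
    # Tikhonov-regularized least squares: argmin ||Ax - y||^2 + lam ||x||^2
    return np.linalg.solve(A.T @ A + lam * np.eye(n_vox), A.T @ y)

err_weak = np.linalg.norm(reconstruct(1e-8) - x_true)
err_reg = np.linalg.norm(reconstruct(1.0) - x_true)
print(err_weak, err_reg)
```

With essentially no regularization, infinitely many volumes fit the data equally well; the prior term selects a stable solution, which is the same role the prior model plays in MBIR.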
Ash from Kilauea Eruption Viewed by NASA's MISR
Atmospheric Science Data Center
2018-06-07
The Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite captured this view of the island as it passed overhead.
Use of geographic information management systems (GIMS) for nitrogen management
NASA Astrophysics Data System (ADS)
Diker, Kenan
1998-11-01
Geographic Information Management Systems (GIMS) were investigated in this study to develop an efficient nitrogen management scheme for corn. The study was conducted on two experimental corn sites. The first site consisted of six non-replicated plots where the canopy reflectance of corn at six nitrogen fertilizer levels was investigated. The reflectance measurements were conducted for nadir and 75° view angles. Data from these plots were used to develop relationships between reflectance data and soil and plant parameters. The second site had four corn plots fertilized by different methods, such as spoon-fed, pre-plant and side-dress, which created nitrogen variability within the field. Soil and plant nitrogen as well as leaf area, biomass, percent cover measurements, and canopy reflectance data were collected at various growth stages from both sites during the 1995 and 1996 growing seasons. Relationships were developed between the Nitrogen Reflectance Index (NRI) developed by Bausch et al. (1994) and soil and plant variables. Spatial dependence of the data was determined by geostatistical methods; variability was mapped in ArcView. Results of this study indicated that the NRI is a better estimator of plant nitrogen status than chlorophyll meter measurements. The NRI can successfully be used to estimate the spatial distribution of soil nitrogen through the plant nitrogen status, as well as plant parameters and yield potential. GIS mapping of measured and estimated soil nitrogen agreed except in locations where hot spots were measured. The NRI value of 0.95 seemed to be the critical value for plant nitrogen status, especially for the 75° view. The nadir view tended to underestimate plant and soil parameters, whereas the 75° view slightly overestimated these parameters. If available, the 75° view data should be used before the tasseling stage for reflectance measurements to reduce the soil background effect. However, it is sensitive to windy conditions. 
After tasseling, the nadir view should be used because the 75° view is obstructed by tassels. Total soil nitrogen at the V6 growth stage was underestimated by the NRI for both view angles. Results also indicated that a nitrogen prescription could be estimated at various growth stages.
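One common formulation of the NRI normalizes a target's NIR/green reflectance ratio by that of a well-fertilized reference strip, with values below about 0.95 (the critical value reported above) flagging nitrogen stress. The formulation and all reflectance values below are assumed for illustration; the study uses the definition of Bausch et al. (1994):

```python
def nri(nir_t, green_t, nir_ref, green_ref):
    # Nitrogen Reflectance Index: the target's NIR/green ratio
    # normalized by a well-fertilized reference (one common
    # formulation, assumed here)
    return (nir_t / green_t) / (nir_ref / green_ref)

# hypothetical canopy reflectances (NIR, green)
well_fed = nri(0.50, 0.10, 0.50, 0.10)   # matches the reference
stressed = nri(0.42, 0.10, 0.50, 0.10)   # lower NIR/green ratio
needs_n = stressed < 0.95                # threshold reported in the study
print(well_fed, stressed, needs_n)
```

Mapping such an index per management zone, as done here in ArcView, turns the spectral measurement directly into a spatial nitrogen prescription.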
3. Elevation view of entire midsection using ultrawide angle lens. ...
3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA
The pigeon's distant visual acuity as a function of viewing angle.
Uhlrich, D J; Blough, P M; Blough, D S
1982-01-01
Distant visual acuity was determined for several viewing angles in two restrained White Carneaux pigeons. The behavioral technique was a classical conditioning procedure that paired presentation of sinusoidal gratings with shock. A conditioned heart rate acceleration during the grating presentation indicated resolution of the grating. The birds' acuity was fairly uniform across a large range of their lateral visual field; performance decreased slightly for posterior stimulus placements and sharply for frontal placements. The data suggest that foveal viewing is relatively less advantageous for acuity in pigeons than in humans. The data are also consistent with the current view that pigeons are myopic in frontal vision.
Krotkov, N A; Vasilkov, A P
2000-03-20
Use of a vertical polarizer has been suggested to reduce the effects of surface reflection in the above-water measurements of marine reflectance. We suggest using a similar technique for airborne or spaceborne sensors when atmospheric scattering adds its own polarization signature to the upwelling radiance. Our own theoretical sensitivity study supports the recommendation of Fougnie et al. [Appl. Opt. 38, 3844 (1999)] (40-50 degrees vertical angle and azimuth angle near 135 degrees, polarizer parallel to the viewing plane) for above-water measurements. However, the optimal viewing directions (and the optimal orientation of the polarizer) change with altitude above the sea surface, solar angle, and atmospheric vertical optical structure. A polarization efficiency function is introduced, which shows the maximal possible polarization discrimination of the background radiation for an arbitrary altitude above the sea surface, viewing direction, and solar angle. Our comment is meant to encourage broader application of airborne and spaceborne polarization sensors in remote sensing of water and sea surface properties.
A Wide Field of View Plasma Spectrometer
Skoug, Ruth M.; Funsten, Herbert O.; Moebius, Eberhard; ...
2016-07-01
Here we present a fundamentally new type of space plasma spectrometer, the wide field of view plasma spectrometer, whose field of view is >1.25π ster using fewer resources than traditional methods. The enabling component is analogous to a pinhole camera with an electrostatic energy-angle filter at the image plane. Particle energy-per-charge is selected with a tunable bias voltage applied to the filter plate relative to the pinhole aperture plate. For a given bias voltage, charged particles from different directions are focused by different angles to different locations. Particles with appropriate locations and angles can transit the filter plate and are measured using a microchannel plate detector with a position-sensitive anode. Full energy and angle coverage are obtained using a single high-voltage power supply, resulting in considerable resource savings and allowing measurements at fast timescales. Lastly, we present laboratory prototype measurements and simulations demonstrating the instrument concept and discuss optimizations of the instrument design for application to space measurements.
Virtual interface environment workstations
NASA Technical Reports Server (NTRS)
Fisher, S. S.; Wenzel, E. M.; Coler, C.; Mcgreevy, M. W.
1988-01-01
A head-mounted, wide-angle, stereoscopic display system controlled by operator position, voice and gesture has been developed at NASA's Ames Research Center for use as a multipurpose interface environment. This Virtual Interface Environment Workstation (VIEW) system provides a multisensory, interactive display environment in which a user can virtually explore a 360-degree synthesized or remotely sensed environment and can viscerally interact with its components. Primary applications of the system are in telerobotics, management of large-scale integrated information systems, and human factors research. System configuration, research scenarios, and research directions are described.
Concentrating Solar Power Projects - Puerto Errado 1 Thermosolar Power
Puerto Errado 1 is a linear Fresnel reflector system developed by Novatec Solar España S.L. (100%). Technology: linear Fresnel reflector. Turbine capacity: 1.4 MW (gross). Status: operational (status date: September 7, 2011). Location: Calasparra, Murcia, Spain. A photo shows an aerial view of the plant at an angle.
Multiview robotic microscope reveals the in-plane kinematics of amphibian neurulation.
Veldhuis, Jim H; Brodland, G Wayne; Wiebe, Colin J; Bootsma, Gregory J
2005-06-01
A new robotic microscope system, called the Frogatron 3000, was developed to collect time-lapse images from arbitrary viewing angles over the surface of live embryos. Embryos are mounted at the center of a horizontal, fluid-filled, cylindrical glass chamber around which a camera with special optics traverses. To hold them at the center of the chamber and revolve them about a vertical axis, the embryos are placed on the end of a small vertical glass tube that is rotated under computer control. To demonstrate operation of the system, it was used to capture time-lapse images of developing axolotl (amphibian) embryos from 63 viewing angles during the process of neurulation, and the in-plane kinematics of the epithelia visible at the center of each view were calculated. The motions of points on the surface of the embryo were determined by digital tracking of their natural surface texture, and a least-squares algorithm was developed to calculate the deformation-rate tensor from the motions of these surface points. Principal strain rates and directions were extracted from this tensor using decomposition and eigenvector techniques. The highest observed principal true strain rate was 28 +/- 5% per hour, along the midline of the neural plate during developmental stage 14, while the greatest contractile true strain rate was -35 +/- 5% per hour, normal to the embryo midline during stage 15.
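The least-squares step described above can be sketched generically: fit an affine velocity field v ≈ v0 + L·x to the tracked surface points, take the symmetric part of the velocity gradient L as the deformation-rate tensor, and eigen-decompose it for principal strain rates and directions. This is a standard reconstruction under those assumptions, not the authors' exact algorithm:

```python
import numpy as np

def deformation_rate(points, velocities):
    """Least-squares fit of the in-plane velocity-gradient tensor L from
    tracked surface points (v ~ v0 + L @ x), then return the symmetric
    deformation-rate tensor D and its principal strain rates/directions."""
    pts = np.asarray(points, float)       # (n, 2) point positions
    vel = np.asarray(velocities, float)   # (n, 2) point velocities
    A = np.hstack([pts, np.ones((len(pts), 1))])    # unknowns: L rows + v0
    coef, *_ = np.linalg.lstsq(A, vel, rcond=None)  # (3, 2) solution
    L = coef[:2].T                     # velocity gradient (2, 2)
    D = 0.5 * (L + L.T)                # symmetric deformation-rate tensor
    rates, dirs = np.linalg.eigh(D)    # principal strain rates (ascending)
    return D, rates, dirs
```

For a pure stretch of 28%/h along x and a contraction of 35%/h along y, the recovered principal rates are -0.35 and 0.28 per hour, matching the imposed field.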
The neural code for face orientation in the human fusiform face area.
Ramírez, Fernando M; Cichy, Radoslaw M; Allefeld, Carsten; Haynes, John-Dylan
2014-09-03
Humans recognize faces and objects with high speed and accuracy regardless of their orientation. Recent studies have proposed that orientation invariance in face recognition involves an intermediate representation where neural responses are similar for mirror-symmetric views. Here, we used fMRI, multivariate pattern analysis, and computational modeling to investigate the neural encoding of faces and vehicles at different rotational angles. Corroborating previous studies, we demonstrate a representation of face orientation in the fusiform face-selective area (FFA). We go beyond these studies by showing that this representation is category-selective and tolerant to retinal translation. Critically, by controlling for low-level confounds, we found the representation of orientation in FFA to be compatible with a linear angle code. Aspects of mirror-symmetric coding cannot be ruled out when FFA mean activity levels are considered as a dimension of coding. Finally, we used a parametric family of computational models, involving a biased sampling of view-tuned neuronal clusters, to compare different face angle encoding models. The best fitting model exhibited a predominance of neuronal clusters tuned to frontal views of faces. In sum, our findings suggest a category-selective and monotonic code of face orientation in the human FFA, in line with primate electrophysiology studies that observed mirror-symmetric tuning of neural responses at higher stages of the visual system, beyond the putative homolog of human FFA.
NASA Technical Reports Server (NTRS)
Pina, J. F.; House, F. B.
1975-01-01
Radiometers on earth-orbiting satellites measure the exchange of radiant energy between the earth-atmosphere (E-A) system and space at observation points in space external to the E-A system. Observations by wide-angle, spherical and flat radiometers are analyzed and interpreted with regard to the general problem of the earth energy budget (EEB) and to the problem of determining the energy budget of regions smaller than the field of view (FOV) of these radiometers.
Thin plate spline feature point matching for organ surfaces in minimally invasive surgery imaging
NASA Astrophysics Data System (ADS)
Lin, Bingxiong; Sun, Yu; Qian, Xiaoning
2013-03-01
Robust feature point matching for images with large view angle changes in Minimally Invasive Surgery (MIS) is a challenging task due to low texture and specular reflections in these images. This paper presents a new approach that can improve feature matching performance by exploiting the inherent geometric property of the organ surfaces. Recently, intensity-based template image tracking using a Thin Plate Spline (TPS) model has been extended to 3D surface tracking with stereo cameras. The intensity-based tracking is also used here for 3D reconstruction of internal organ surfaces. First, to overcome the small displacement requirement of intensity-based tracking, feature point correspondences are used for proper initialization of the nonlinear optimization in the intensity-based method. Second, we generate simulated images from the reconstructed 3D surfaces under all potential view positions and orientations, and then extract feature points from these simulated images. The obtained feature points are then filtered and re-projected to the common reference image. The descriptors of the feature points under different view angles are stored to ensure that the proposed method can tolerate a large range of view angles. We evaluate the proposed method with silicone phantoms and in vivo images. The experimental results show that our method is much more robust with respect to view angle changes than other state-of-the-art methods.
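A thin plate spline of the kind used above maps control points exactly while minimizing bending energy. A compact 2-D sketch of fitting and evaluating such a warp (this is generic TPS interpolation, not the paper's full intensity-based tracker):

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2-D thin plate spline that maps control points src onto dst.
    Solves the standard TPS linear system [[K P],[P^T 0]] [w; a] = [dst; 0],
    with radial kernel U(r) = r^2 log r^2."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    K = np.where(d2 > 0, d2 * np.log(np.maximum(d2, 1e-300)), 0.0)
    P = np.hstack([np.ones((n, 1)), src])           # affine part [1, x, y]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return src, np.linalg.solve(A, b)               # (controls, coefficients)

def tps_warp(model, pts):
    """Evaluate the fitted spline at arbitrary query points."""
    ctrl, coef = model
    pts = np.asarray(pts, float)
    d2 = ((pts[:, None, :] - ctrl[None, :, :]) ** 2).sum(-1)
    U = np.where(d2 > 0, d2 * np.log(np.maximum(d2, 1e-300)), 0.0)
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return U @ coef[:len(ctrl)] + P @ coef[len(ctrl):]
```

Because the affine term is included, a purely translational correspondence set is reproduced exactly: warping any interior point simply translates it.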
Li, Xiaofang; Deng, Linhong; Lu, Hu; He, Bin
2014-08-01
A measurement system based on image processing technology and developed in LabVIEW was designed to quickly obtain the range of motion (ROM) of the spine. The NI Vision module was used to pre-process the original images and calculate the angles of the marker needles in order to obtain ROM data. Six human cadaveric thoracic spine segments (T7-T10) were selected and subjected to six kinds of loads: left/right lateral bending, flexion, extension, and clockwise/counterclockwise torsion. The system was used to measure the ROM of segment T8-T9 under loads from 1 Nm to 5 Nm. The experimental results showed that the system is able to measure the ROM of the spine accurately and quickly, providing a simple and reliable tool for spine biomechanics investigators.
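The angle-measurement step can be sketched as recovering a needle's orientation from two digitized endpoints and taking the ROM as the change in that orientation between postures. Function names are illustrative, and this omits the NI-Vision image pre-processing entirely:

```python
import math

def needle_angle(p1, p2):
    """Orientation (degrees) of a marker needle from two image endpoints."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def range_of_motion(neutral_angle, loaded_angle):
    """Signed angular change from the neutral to the loaded posture,
    wrapped into (-180, 180] so crossings of the +/-180 line are handled."""
    d = (loaded_angle - neutral_angle) % 360.0
    return d - 360.0 if d > 180.0 else d
```

The wrap matters when a needle orientation jumps across the +/-180° discontinuity between frames; a naive subtraction would report a spurious 340° motion instead of 20°.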
Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K
2014-07-01
We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space.
Snowstorm Along the China-Mongolia-Russia Borders
NASA Technical Reports Server (NTRS)
2004-01-01
Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera. About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. 
The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate rougher ice or snow, which scatters more light in the backward direction. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire Earth between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 22525. The panels cover an area of about 355 kilometers x 380 kilometers, and utilize data from blocks 50 to 52 within World Reference System-2 path 126. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Changes in reflectance anisotropy of wheat crop during different phenophases
NASA Astrophysics Data System (ADS)
Lunagaria, Manoj M.; Patel, Haridas R.
2017-04-01
The canopy structure of wheat changes significantly with growth stage, leading to changes in reflectance anisotropy. The bidirectional reflectance distribution function characterises the reflectance anisotropy of a target and can be approximated from multiangular measurements. Spectrodirectional reflectance measurements of the wheat crop were acquired using a field goniometer system. The bidirectional reflectance spectra were acquired at 54 view angles to cover the hemispheric span up to a 60° view zenith. The observations were made from early growth stages until maturity of the crop. The anisotropy was not constant across wavelengths, and the anisotropy factors clearly revealed a spectral dependence that was more pronounced near the principal plane. In the near infrared, the wheat canopy expressed less reflectance anisotropy because of higher multiple scattering. A broad hotspot signature was noticeable in the canopy reflectance whenever the view and solar angles were close. Distinct changes in the bidirectional reflectance distribution function were observed from booting to flowering, as the canopy gains uniformity and height and heads emerge. The function clearly reveals a bowl shape during the heading to early milk growth stages of the crop. Late growth stages show less prominent gap and shadow effects. The anisotropy index revealed that wheat exhibits changes in reflectance anisotropy with phenological development and with spectral band.
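Anisotropy factors of the kind discussed here are commonly defined as the directional reflectance factor normalized by the nadir value, so a factor of 1 means no deviation from the nadir response. A one-line sketch under that assumed definition:

```python
import numpy as np

def anisotropy_factor(brf, nadir_brf):
    """Per-band anisotropy factor (ANIF): directional bidirectional
    reflectance factors divided by the nadir BRF. With brf shaped
    (n_view_angles, n_bands) and nadir_brf shaped (n_bands,), numpy
    broadcasting normalizes every view angle band-by-band."""
    return np.asarray(brf, float) / np.asarray(nadir_brf, float)
```

For the 54-angle goniometer sweep above, `brf` would be a (54, n_bands) array; values above 1 flag view directions (e.g. near the hotspot) that appear brighter than nadir.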
Wide-angle lens for miniature capsule endoscope
NASA Astrophysics Data System (ADS)
Ou-Yang, Mang; Chen, Yung-Lin; Lee, Hsin-Hung; LU, Shih-chieh; Wu, Hsien-Ming
2006-02-01
In recent years, using the capsule endoscope to inspect pathological changes of the digestive system and intestine has been a great breakthrough in medical engineering. However, some problems remain to be overcome. One is that the field of view is not wide enough; the other is that the image quality is not good enough. These drawbacks make it difficult for medical professionals to examine digestive diseases clearly and unambiguously. To solve these problems, this paper presents a novel miniature lens system with a wide field of view and good imaging quality. The lens system employed in the capsule endoscope consists of one plastic aspherical lens and one glass lens, compacted into a 9.8 mm (W) x 9.8 mm (L) x 10.7 mm (H) volume. Taking the white LED light source and the 10 μm pixel size of the 256 x 256 CMOS sensor into consideration, the field of view of the lens system reaches 86 degrees, and the MTF reaches 37% at a spatial frequency of 50 lp/mm. The experimental data prove that the design is consistent with the finished prototype.
Practical system for generating digital mixed reality video holograms.
Song, Joongseok; Kim, Changseob; Park, Hanhoon; Park, Jong-Il
2016-07-10
We propose a practical system that can effectively mix the depth data of real and virtual objects by using a Z buffer and can quickly generate digital mixed reality video holograms by using multiple graphics processing units (GPUs). In an experiment, we verify that real objects and virtual objects can be merged naturally at free viewing angles and that the occlusion problem is well handled. Furthermore, we demonstrate that the proposed system can generate mixed reality video holograms at 7.6 frames per second. Finally, the system performance is further supported by users' subjective evaluations.
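The Z-buffer mixing of real and virtual depth data can be sketched per pixel: whichever scene is closer to the camera supplies the output color, which is what resolves occlusion. This is a generic illustration of the depth test, not the paper's GPU hologram pipeline:

```python
import numpy as np

def zbuffer_mix(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel merge of a real and a virtual scene. The surface with the
    smaller depth value (closer to the camera) wins both the color and the
    merged depth, so virtual objects correctly occlude and are occluded."""
    near_virtual = virt_depth < real_depth                 # boolean mask
    out_rgb = np.where(near_virtual[..., None], virt_rgb, real_rgb)
    out_depth = np.minimum(real_depth, virt_depth)
    return out_rgb, out_depth
```

The merged RGB-plus-depth frame is what a hologram generator would then consume; here a nearer real pixel keeps its color while a nearer virtual pixel overwrites it.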
NPP VIIRS on-orbit calibration and characterization using the moon
NASA Astrophysics Data System (ADS)
Sun, J.; Xiong, X.; Butler, J.
2012-09-01
The Visible Infrared Imager Radiometer Suite (VIIRS) is one of five instruments on-board the Suomi National Polar-orbiting Partnership (NPP) satellite that launched from Vandenberg Air Force Base, Calif., on Oct. 28, 2011. VIIRS has been scheduled to view the Moon approximately monthly with a spacecraft roll maneuver after its nadir door opened on November 21, 2011. To reduce the uncertainty of the radiometric calibration due to the view geometry, the lunar phase angles of the scheduled lunar observations were confined in the range from -56° to -55° in the first three scheduled lunar observations and then changed to the range from -51.5° to -50.5°, where the negative sign for the phase angles indicates that the VIIRS views a waxing moon. Unlike the MODIS lunar observations, most scheduled VIIRS lunar views occur on the day side of the Earth. For the safety of the instrument, the roll angles of the scheduled VIIRS lunar observations are required to be within [-14°, 0°] and the aforementioned change of the phase angle range was aimed to further minimize the roll angle required for each lunar observation while keeping the number of months in which the moon can be viewed by the VIIRS instrument each year unchanged. The lunar observations can be used to identify if there is crosstalk in VIIRS bands and to track on-orbit changes in VIIRS Reflective Solar Bands (RSB) detector gains. In this paper, we report our results using the lunar observations to examine the on-orbit crosstalk effects among NPP VIIRS bands, to track the VIIRS RSB gain changes in first few months on-orbit, and to compare the gain changes derived from lunar and SD/SDSM calibration.
NASA Technical Reports Server (NTRS)
Valdez, P. F.; Donohoe, G. W.
1997-01-01
Statistical classification of remotely sensed images attempts to discriminate between surface cover types on the basis of the spectral response recorded by a sensor. It is well known that surfaces reflect incident radiation as a function of wavelength, producing a spectral signature specific to the material under investigation. Multispectral and hyperspectral sensors sample the spectral response over tens and even hundreds of wavelength bands to capture the variation of spectral response with wavelength. Classification algorithms then exploit these differences in spectral response to distinguish between materials of interest. Sensors of this type, however, collect detailed spectral information from one direction (usually nadir) and consequently do not consider the directional nature of reflectance potentially detectable at different sensor view angles. Improvements in sensor technology have resulted in remote sensing platforms capable of detecting reflected energy across wavelengths (spectral signatures) and from multiple view angles (angular signatures) in the fore and aft directions. Sensors of this type include the moderate resolution imaging spectroradiometer (MODIS), the multiangle imaging spectroradiometer (MISR), and the airborne solid-state array spectroradiometer (ASAS). A goal of this paper, then, is to explore the utility of Bidirectional Reflectance Distribution Function (BRDF) models in the selection of optimal view angles for the classification of remotely sensed images by employing a strategy of searching for the maximum difference between surface BRDFs. After a brief discussion of directional reflectance in Section 2, attention is directed to the Beard-Maxwell BRDF model and its use in predicting the bidirectional reflectance of a surface. The selection of optimal viewing angles is addressed in Section 3, followed by conclusions and future work in Section 4.
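The view-angle selection strategy, searching for the angle that maximizes the difference between two surface BRDFs, can be sketched with toy one-dimensional BRDFs. The stand-in functions below are illustrative placeholders (a flat Lambertian response and a hypothetical lobed response), not the Beard-Maxwell model:

```python
import numpy as np

def best_view_angle(brdf_a, brdf_b, angles_deg):
    """Return the candidate view zenith angle at which two surface BRDFs
    differ the most, i.e. the angle expected to best separate the classes."""
    diffs = [abs(brdf_a(t) - brdf_b(t)) for t in angles_deg]
    return angles_deg[int(np.argmax(diffs))]

def lambertian(theta_deg):
    # Angle-independent (Lambertian) surface: constant reflectance.
    return 0.30

def lobed(theta_deg):
    # Hypothetical surface with a reflectance lobe peaking 30 deg off-nadir.
    return 0.30 + 0.20 * np.exp(-((theta_deg - 30.0) / 10.0) ** 2)

angles = np.arange(0.0, 61.0, 5.0)  # candidate view zenith angles (deg)
```

At nadir the two toy surfaces are indistinguishable, so a nadir-only sensor would confuse them; the search instead picks the 30° view where the lobe makes them maximally separable.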
Volume-holographic memory for laser threat discrimination
NASA Astrophysics Data System (ADS)
Delong, Mark L.; Duncan, Bradley D.; Parker, Jack H., Jr.
1996-10-01
Using conventional volume-holographic angle multiplexing in an Fe:LiNbO3 crystal, we have developed a compact laser threat discriminator, intended for aircraft integration, that optically detects laser spatial coherence and angle of arrival while simultaneously rejecting incoherent background sources, such as the Sun. The device is intended to counter a specific type of psychophysical laser attack against U.S. Air Force pilots, namely, third-world-country exploitation of inexpensive and powerful cw Ar-ion or doubled Nd:YAG lasers in the visible spectrum to blind or disorient U.S. pilots. The component does not solve the general tactical laser weapon situation, which includes identifying precision-guided munitions, range finders, and lidar systems that use pulsed infrared lasers. These are fundamentally different threats requiring different detector solutions. The device incorporates a sequence of highly redundant, simple black-and-white warning patterns that are keyed to be reconstructed as the incident laser threat, playing the role of an uncooperative probe beam, changes angle with respect to the crystal. The device tracks both azimuth and elevation, using a nonconventional hologram viewing system. Recording and playback conditions are simplified because nonzero cross talk is a desirable feature of this discriminator, inasmuch as our application requires a nonzero probability of detection for arbitrary directions of arrival within the sensor's field of view. The device can exploit phase-matched grating trade-off with probe-beam wavelength, accommodating wavelength-tunable threats, while still maintaining high direction-of-arrival tracking accuracy.
NASA Technical Reports Server (NTRS)
1992-01-01
This view of the north polar region of the Moon was obtained by Galileo's camera during the spacecraft's flyby of the Earth-Moon system on December 7 and 8, 1992. The north pole is to the lower right of the image. The view in the upper left is toward the horizon across the volcanic lava plains of Mare Imbrium. The prominent crater with the central peak is Pythagoras, an impact crater some 130 kilometers (80 miles) in diameter. The image was taken at a distance of 121,000 kilometers (75,000 miles) from the Moon through the violet filter of Galileo's imaging system. According to team scientists, the viewing geometry provided by the spacecraft's pass over the north pole and the low sun-angle illumination provide a unique opportunity to assess the geologic relationships among the smooth plains, cratered terrain and impact ejecta deposits in this region of the Moon. JPL manages the Galileo Project for NASA's Office of Space Science and Applications.
Measuring contact angle and meniscus shape with a reflected laser beam.
Eibach, T F; Fell, D; Nguyen, H; Butt, H J; Auernhammer, G K
2014-01-01
Side-view imaging of the contact angle between an extended planar solid surface and a liquid is problematic. Even when the view is aligned perfectly parallel to the contact line, focusing on one point of the contact line is not possible. We describe a new measurement technique for determining contact angles from the reflection of a widened laser sheet off a moving contact line. We verified this new technique by measuring the contact angle on a cylinder rotating partially immersed in a liquid. A laser sheet is inclined at an angle φ to the unperturbed liquid surface and is reflected off the meniscus. Collected on a screen, the reflection image contains the information needed to determine the contact angle. When the laser sheet is divided into an array of laser rays by placing a mesh into the beam path, the shape of the meniscus can be reconstructed from the reflection image. We verified the method by measuring the receding contact angle versus speed for aqueous cetyltrimethyl ammonium bromide solutions on a smooth hydrophobized as well as on a rough polystyrene surface.
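Under an idealized reflection geometry (flat screen, perfect alignment, small angles), tilting the local liquid surface by α rotates the reflected ray by 2α, so a spot displacement s on a screen at distance d gives α = atan(s/d)/2. The sketch below encodes only this textbook law-of-reflection relation, not the paper's calibrated optical setup; the vertical-wall assumption in `contact_angle` is ours:

```python
import math

def surface_tilt_from_spot(displacement, screen_distance):
    """Local surface tilt (degrees) inferred from the reflected-spot shift:
    a tilt of alpha deviates the reflected ray by 2*alpha, so
    alpha = atan(s / d) / 2 under the idealized geometry stated above."""
    return math.degrees(math.atan2(displacement, screen_distance)) / 2.0

def contact_angle(meniscus_tilt_deg, solid_tilt_deg=90.0):
    """Contact angle between the meniscus and the solid wall, assuming a
    vertical wall (90 deg) by default; purely illustrative geometry."""
    return solid_tilt_deg - meniscus_tilt_deg
```

With the mesh-generated ray array described above, applying the tilt relation ray by ray yields a set of local slopes from which the meniscus profile can be integrated.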
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility and will be available to users via a web portal that facilitates highly parallelized analysis.
Fabrication of multi-focal microlens array on curved surface for wide-angle camera module
NASA Astrophysics Data System (ADS)
Pan, Jun-Gu; Su, Guo-Dung J.
2017-08-01
In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye. It contains a curved hexagonal microlens array and a spherical lens. Normal mobile phone cameras usually need no fewer than four lenses, but our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm, and the diagonal full field of view is about 100 degrees. To make the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.
NASA Technical Reports Server (NTRS)
Gatebe, C. K.; King, M. D.; Tsay, S.-C.; Ji, Q.
2000-01-01
Remote sensing of aerosol over land from MODIS will be based on dark targets using the mid-IR channels at 2.1 and 3.9 micron. This approach was developed by Kaufman et al. (1997), who suggested that dark surface reflectance in the red (0.66 micron, ρ_0.66) channel is half of that at 2.2 micron (ρ_2.2), and the reflectance in the blue (0.49 micron, ρ_0.49) channel is a quarter of that at 2.2 micron. Using this relationship, the surface reflectance in the visible channels can be predicted within Δρ_0.49 ≈ Δρ_0.66 ≈ 0.006 from ρ_2.2 for ρ_2.2 <= 0.10. This was half the error obtained using the 3.75 micron channel and corresponds to an error in aerosol optical thickness of Δτ ≈ 0.06. These results, though applicable to several biomes (e.g., forests and brighter lower canopies), have only been tested at one view angle, the nadir (θ = 0°). Considering the importance of these results in remote sensing of aerosols over land surfaces from space, we are validating the relationships for off-nadir view angles using Cloud Absorption Radiometer (CAR) data. The CAR data are available for channels between 0.3 and 2.3 micron and for different surface types and conditions: forest, tundra, ocean, sea ice, swamp, grassland, and areas covered with smoke. In this study we analyzed data collected during the Smoke, Clouds, and Radiation - Brazil (SCAR-B) experiment to validate Kaufman et al.'s (1997) results for non-nadir view angles. We will show the correlation between ρ_0.472, ρ_0.675, and ρ_2.2 for view angles between nadir (0°) and 55° off-nadir, and for different viewing directions in the backscatter and forward scatter directions.
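The Kaufman et al. (1997) dark-target relations quoted above reduce to two divisions; a direct sketch, with a validity guard reflecting the stated ρ_2.2 <= 0.10 condition (the function name is ours):

```python
def predict_visible_reflectance(rho_2p2):
    """Kaufman et al. (1997) dark-target relations: red (0.66 micron)
    surface reflectance is half of the 2.2-micron reflectance, and blue
    (0.49 micron) is a quarter of it. Only claimed valid for dark targets
    with rho_2.2 <= 0.10, so anything brighter is rejected."""
    if rho_2p2 > 0.10:
        raise ValueError("dark-target relation only holds for rho_2.2 <= 0.10")
    rho_red = rho_2p2 / 2.0    # predicted rho at 0.66 micron
    rho_blue = rho_2p2 / 4.0   # predicted rho at 0.49 micron
    return rho_red, rho_blue
```

For example, a dark canopy with ρ_2.2 = 0.08 is predicted to have ρ_0.66 = 0.04 and ρ_0.49 = 0.02, within the quoted ±0.006 uncertainty at nadir.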
Self-contained eye-safe laser radar using an erbium-doped fiber laser
NASA Astrophysics Data System (ADS)
Driscoll, Thomas A.; Radecki, Dan J.; Tindal, Nan E.; Corriveau, John P.; Denman, Richard
2003-07-01
An Eye-safe Laser Radar has been developed under White Sands Missile Range sponsorship. The SEAL system, the Self-contained Eyesafe Autonomous Laser system, is designed to measure target position within a 0.5 meter box. Targets are augmented with Scotchlite for ranging out to 6 km and augmented with a retroreflector for targets out to 20 km. The data latency is less than 1.5 ms, and the position update rate is 1 kHz. The system is air-cooled, contained in a single 200-lb, 6-cubic-foot box, and uses less than 600 watts of prime power. The angle-angle-range data will be used to measure target dynamics and to control a tracking mount. The optical system is built around a diode-pumped, erbium-doped fiber laser rated at 1.5 watts average power at 10 kHz repetition rate with 25 nsec pulse duration. An 8 inch-diameter, F/2.84 telescope is relayed to a quadrant detector at F/0.85 giving a 5 mrad field of view. Two detectors have been evaluated, a Germanium PIN diode and an Intevac TE-IPD. The receiver electronics uses a DSP network of 6 SHARC processors to implement ranging and angle error algorithms along with an Optical AGC, including beam divergence/FOV control loops. Laboratory measurements of the laser characteristics, and system range and angle accuracies will be compared to simulations. Field measurements against actual targets will be presented.
On Local Ionization Equilibrium and Disk Winds in QSOs
NASA Astrophysics Data System (ADS)
Pereyra, Nicolas A.
2014-11-01
We present theoretical C IV λλ1548,1550 absorption line profiles for QSOs calculated assuming the accretion disk wind (ADW) scenario. The results suggest that the multiple absorption troughs seen in many QSOs may be due to discontinuities in the ion balance of the wind (caused by X-rays), rather than discontinuities in the density/velocity structure. The profiles are calculated from a 2.5-dimensional time-dependent hydrodynamic simulation of a line-driven disk wind for a typical QSO black hole mass, a typical QSO luminosity, and for a standard Shakura-Sunyaev disk. We include the effects of ionizing X-rays originating from within the inner disk radius by assuming that the wind is shielded from the X-rays from a certain viewing angle up to 90° ("edge on"). In the shielded region, we assume constant ionization equilibrium, and thus constant line-force parameters. In the non-shielded region, we assume that both the line-force and the C IV populations are nonexistent. The model can account for P-Cygni absorption troughs (produced at edge-on viewing angles), multiple absorption troughs (produced at viewing angles close to the angle that separates the shielded region and the non-shielded region), and for detached absorption troughs (produced at an angle in between the first two absorption line types); that is, the model can account for the general types of broad absorption lines seen in QSOs as a viewing angle effect. The steady nature of ADWs, in turn, may account for the steady nature of the absorption structure observed in multiple-trough broad absorption line QSOs. The model parameters are M_bh = 10^9 M_⊙ and L_disk = 10^47 erg s^-1.
Active Control of Acoustic Field-of-View in a Biosonar System
Yovel, Yossi; Falk, Ben; Moss, Cynthia F.; Ulanovsky, Nachum
2011-01-01
Active-sensing systems abound in nature, but little is known about the systematic strategies that these systems use to scan the environment. Here, we addressed this question by studying echolocating bats, animals that have the ability to point their biosonar beam to a confined region of space. We trained Egyptian fruit bats to land on a target, under conditions of varying levels of environmental complexity, and measured their echolocation and flight behavior. The bats modulated the intensity of their biosonar emissions, and the spatial region they sampled, in a task-dependent manner. We report here that Egyptian fruit bats selectively change the emission intensity and the angle between the beam axes of sequentially emitted clicks, according to the distance to the target, and depending on the level of environmental complexity. In so doing, they effectively adjusted the spatial sector sampled by a pair of clicks—the “field-of-view.” We suggest that the exact point within the beam that is directed towards an object (e.g., the beam's peak, maximal slope, etc.) is influenced by three competing task demands: detection, localization, and angular scanning—where the third factor is modulated by field-of-view. Our results suggest that lingual echolocation (based on tongue clicks) is in fact much more sophisticated than previously believed. They also reveal a new parameter under active control in animal sonar—the angle between consecutive beams. Our findings suggest that acoustic scanning of space by mammals is highly flexible and modulated much more selectively than previously recognized. PMID:21931535
Multi-view line-scan inspection system using planar mirrors
NASA Astrophysics Data System (ADS)
Holländer, Branislav; Štolc, Svorad; Huber-Mörk, Reinhold
2013-04-01
We demonstrate the design, setup, and results for a line-scan stereo image acquisition system using a single area-scan sensor, a single lens, and two planar mirrors attached to the acquisition device. The acquired object moves relative to the acquisition device and is observed under three different angles at the same time. Depending on the specific configuration it is possible to observe the object under a straight view (i.e., looking along the optical axis) and two skewed views. The relative motion between an object and the acquisition device automatically fulfills the epipolar constraint in stereo vision. The choice of lines to be extracted from the CMOS sensor depends on various factors such as the number, position and size of the mirrors, the optical and sensor configuration, or other application-specific parameters like the desired depth resolution. The acquisition setup presented in this paper is suitable for the inspection of printed matter, small parts, or security features such as optically variable devices and holograms. The image processing pipeline applied to the extracted sensor lines is explained in detail. The effective depth resolution achieved by the presented system, assembled from only off-the-shelf components, is approximately equal to the spatial resolution and can be smoothly controlled by changing the positions and angles of the mirrors. Actual performance of the device is demonstrated on a 3D-printed ground-truth object as well as two real-world examples: (i) the EUR-100 banknote - a high-quality printed matter, and (ii) the hologram on the EUR-50 banknote - an optically variable device.
Fluctuations of Lake Eyre, South Australia
NASA Technical Reports Server (NTRS)
2002-01-01
Lake Eyre is a large salt lake situated between two deserts in one of Australia's driest regions. However, this low-lying lake attracts run-off from one of the largest inland drainage systems in the world. The drainage basin is very responsive to rainfall variations, and changes dramatically with Australia's inter-annual weather fluctuations. When Lake Eyre fills, as it did in 1989, it is temporarily Australia's largest lake, and becomes dense with birds, frogs and colorful plant life. The Lake responds to extended dry periods (often associated with El Nino events) by drying completely. These four images from the Multi-angle Imaging SpectroRadiometer contrast the lake area at the start of the austral summers of 2000 and 2002. The top two panels portray the region as it appeared on December 9, 2000. Heavy rains in the first part of 2000 caused both the north and south sections of the lake to fill partially, and the northern part of the lake still contained significant standing water by the time these data were acquired. The bottom panels were captured on November 29, 2002. Rainfall during 2002 was significantly below average ( http://www.bom.gov.au/ ), although showers occurring in the week before the image was acquired helped alleviate this condition slightly. The left-hand panels portray the area as it appeared to MISR's vertical-viewing (nadir) camera, and are false-color views comprised of data from the near-infrared, green and blue channels. Here, wet and/or moist surfaces appear blue-green, since water selectively absorbs longer wavelengths such as near-infrared. The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree forward, nadir and 60-degree backward-viewing cameras, displayed as red, green and blue, respectively.
In these multi-angle composites, color variations serve as a proxy for changes in angular reflectance, and indicate textural properties of the surface related to roughness and/or moisture content. Data from the two dates were processed identically to preserve relative variations in brightness between them. Wet surfaces or areas with standing water appear green due to the effect of sunglint at the nadir camera view angle. Dry, salt-encrusted parts of the lake appear bright white or gray. Purple areas have enhanced forward scattering, possibly as a result of surface moistness. Some variations exhibited by the multi-angle composites are not discernible in the nadir multi-spectral images and vice versa, suggesting that the combination of angular and spectral information is a more powerful diagnostic of surface conditions than either technique by itself. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbits 5194 and 15679. The panels cover an area of 146 kilometers x 122 kilometers, and utilize data from blocks 113 to 114 within World Reference System-2 path 100. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
new PAL with a total viewing angle of around 80° and suitable for foveal vision, it turned out that the optical design program ZEMAX-EE we intended to...use was not capable of the optimization. The reason was that ZEMAX-EE and all present optical design programs are based on see-through-window (STW
Coherent lidar wind measurements from the Space Station base using 1.5 m all-reflective optics
NASA Technical Reports Server (NTRS)
Bilbro, J. W.; Beranek, R. G.
1987-01-01
This paper discusses the space-based measurement of atmospheric winds from the point of view of the requirements of the optical system of a coherent CO2 lidar. A brief description of the measurement technique is given and a discussion of previous study results is provided. The telescope requirements for a Space Station based lidar are arrived at through discussions of the desired system sensitivity and the need for lag angle compensation.
Samborsky, James K.
1993-01-01
A device for the purpose of monitoring light transmissions in optical fibers comprises a fiber optic tap that optically diverts a fraction of a transmitted optical signal without disrupting the integrity of the signal. The diverted signal is carried, preferably by the fiber optic tap, to a lens or lens system that disperses the light over a solid angle that facilitates viewing. The dispersed light indicates whether or not the monitored optical fiber or system of optical fibers is currently transmitting optical information.
There is no bidirectional hot-spot in Sentinel-2 data
NASA Astrophysics Data System (ADS)
Li, Z.; Roy, D. P.; Zhang, H.
2017-12-01
The Sentinel-2 multi-spectral instrument (MSI) acquires reflective wavelength observations with directional effects due to surface reflectance anisotropy, often described by the bidirectional reflectance distribution function (BRDF). Recently, we quantified Sentinel-2A (S2A) BRDF effects for 20° × 10° of southern Africa sensed in January and in April 2016 and found maximum BRDF effects for the January data and at the western scan edge, i.e., in the back-scatter direction (Roy et al. 2017). The hot-spot is the term used to describe the increased directional reflectance that occurs over most surfaces when the solar and viewing directions coincide, and has been observed in wide field-of-view data such as MODIS. Recently, we observed that Landsat data will not have a hot-spot because the global annual minimum solar zenith angle is more than twice the maximum view zenith angle (Zhang et al. 2016). This presentation examines whether there is an S2A hot-spot, which may be possible as the MSI has a wider field of view (20.6°) and a higher orbit (786 km) than Landsat. We examined a global year of S2A metadata extracted using the Committee on Earth Observation Satellites Visualization Environment (COVE) tool, computed the solar zenith angles in the acquisition corners, and ranked the acquisitions by the solar zenith angle in the back-scatter direction. The available image data for the 10 acquisitions with the smallest solar zenith angle over the year were ordered from the ESA and their geometries examined in detail. The acquisition closest to the hot-spot had a maximum scattering angle of 173.61° on its western edge (view zenith angle 11.91°, solar zenith angle 17.97°) and was acquired over 60.80°W 24.37°N on June 2nd 2016. Given that hot-spots are only apparent when the scattering angle is close to 180°, we conclude from this global annual analysis that there is no hot-spot in Sentinel-2 data.
Roy, D. P., Li, J., Zhang, H. K., Yan, L., Huang, H., Li, Z., 2017. Examination of Sentinel-2A multi-spectral instrument (MSI) reflectance anisotropy and the suitability of a general method to normalize MSI reflectance to nadir BRDF adjusted reflectance. Remote Sensing of Environment, 199, 25-38.
Zhang, H. K., Roy, D. P., Kovalskyy, V., 2016. Optimal solar geometry definition for global long-term Landsat time series bi-directional reflectance normalization. IEEE Transactions on Geoscience and Remote Sensing, 54(3), 1410-1418.
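The hot-spot test in the Sentinel-2 analysis above comes down to a scattering-angle computation from the solar and view zenith angles and their relative azimuth. A minimal sketch follows; sign and azimuth conventions vary between authors, so this assumes a relative azimuth of 0° places the sensor in the back-scatter direction, and the function name is ours.

```python
import math

def scattering_angle(sza_deg, vza_deg, raa_deg):
    """Angle between the incoming solar beam and the view direction.

    sza_deg: solar zenith angle; vza_deg: view zenith angle;
    raa_deg: relative azimuth (0 deg = sensor in the back-scatter direction).
    Returns degrees; 180 deg corresponds to the exact hot-spot geometry.
    """
    sza, vza, raa = map(math.radians, (sza_deg, vza_deg, raa_deg))
    # Angle between the sun-direction and view-direction unit vectors.
    cos_gamma = (math.cos(sza) * math.cos(vza)
                 + math.sin(sza) * math.sin(vza) * math.cos(raa))
    gamma = math.degrees(math.acos(max(-1.0, min(1.0, cos_gamma))))
    return 180.0 - gamma

# Geometry of the acquisition closest to the hot-spot in the abstract:
# view zenith 11.91 deg, solar zenith 17.97 deg.  With perfectly aligned
# azimuths the scattering angle would be 180 - (17.97 - 11.91) = 173.94 deg;
# the reported 173.61 deg implies a small residual azimuth offset.
print(round(scattering_angle(17.97, 11.91, 0.0), 2))  # 173.94
```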
Brain activation in parietal area during manipulation with a surgical robot simulator.
Miura, Satoshi; Kobayashi, Yo; Kawamura, Kazuya; Nakashima, Yasutaka; Fujie, Masakatsu G
2015-06-01
We present an evaluation method to quantify the embodiment caused by the physical difference between master-slave surgical robots, by measuring the activation of the intraparietal sulcus in the user's brain during surgical robot manipulation. We show the change of embodiment based on the change of the optical axis-to-target view angle in the surgical simulator, which changes the manipulator's appearance in the monitor in terms of hand-eye coordination. The objective is to explore the change of brain activation according to the change of the optical axis-to-target view angle. In the experiments, we used a functional near-infrared spectroscopic topography (f-NIRS) brain imaging device to measure the brain activity of seven subjects while they moved the hand controller to insert a curved needle into a target using the manipulator in a surgical simulator. The experiment was carried out several times with a variety of optical axis-to-target view angles. Some participants showed a significant peak (P value = 0.037, F value = 2.841) when the optical axis-to-target view angle was 75°. The positional relationship between the manipulators and endoscope at 75° would be the closest to the human physical relationship between the hands and eyes.
NASA Astrophysics Data System (ADS)
Kim, Jung-Bum; Lee, Jeong-Hwan; Moon, Chang-Ki; Kim, Jang-Joo
2014-02-01
We report a highly efficient phosphorescent green inverted top-emitting organic light-emitting diode with excellent color stability, using a 1,4,5,8,9,11-hexaazatriphenylene-hexacarbonitrile/indium zinc oxide top electrode and bis(2-phenylpyridine)iridium(III) acetylacetonate as the emitter in an exciplex-forming co-host system. The device shows a high external quantum efficiency of 23.4% at 1000 cd/m2, corresponding to a current efficiency of 110 cd/A, a low efficiency roll-off of 21% at 10 000 cd/m2, and a low turn-on voltage of 2.4 V. Notably, the device showed a very small color change of Δx = 0.02, Δy = 0.02 in the CIE 1931 coordinates as the viewing angle changes from 0° to 60°. The performance of the device is superior to that of the metal/metal cavity structured device.
[Study on the characteristics of radiance calibration using nonuniformity extended source].
Wang, Jian-Wei; Huang, Min; Xiangli, Bin; Tu, Xiao-Long
2013-07-01
An integrating sphere or a diffuser is typically used as an extended source, and the two have different effects on the radiance calibration of an imaging spectrometer depending on their parameters. In the present paper, a mathematical model based on the theory of radiative transfer and the calibration principle is founded to calculate the irradiance and calibration coefficients on the CCD, taking a light-board calibration system with relatively poor uniformity as an example. The effects of the nonuniformity on the calibration were analyzed, establishing the correlation between the calibration coefficient matrices under ideal and non-ideal conditions. The results show that, because of the nonuniformity, the viewing angle and the position of the point of intersection of the optical axis and the diffuse reflection plate have relatively large effects on the calibration, while the effect of the observing distance is small; under different viewing angles, a deviation value can be found that makes the calibration results closest to the desired results. Thus, the calibration error can be reduced by choosing an appropriate deviation value.
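As a toy illustration of why source nonuniformity matters (not the paper's radiative-transfer model), the synthetic example below computes per-pixel calibration coefficients against an extended source and shows the residual error left by wrongly assuming the source is uniform. All numbers and variable names are invented for the sketch.

```python
import numpy as np

# Synthetic, noise-free flat-field calibration of a 16-pixel line detector.
rng = np.random.default_rng(0)

true_gain = rng.uniform(0.8, 1.2, size=16)               # DN per radiance unit
L_true = 100.0 * (1.0 - 0.05 * np.linspace(-1, 1, 16))   # nonuniform source radiance
dn = true_gain * L_true                                   # recorded counts

# Coefficients computed with the actual per-pixel radiance are exact ...
coeff_exact = L_true / dn
# ... while assuming a perfectly uniform source leaves a residual error
coeff_uniform = L_true.mean() / dn

err = np.abs(coeff_uniform - coeff_exact) / coeff_exact
print(err.max() > 0.01)  # True: a 5% source gradient propagates into the coefficients
```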
Feasibility study of low-dose intra-operative cone-beam CT for image-guided surgery
NASA Astrophysics Data System (ADS)
Han, Xiao; Shi, Shuanghe; Bian, Junguo; Helm, Patrick; Sidky, Emil Y.; Pan, Xiaochuan
2011-03-01
Cone-beam computed tomography (CBCT) has been increasingly used during surgical procedures for providing accurate three-dimensional anatomical information for intra-operative navigation and verification. High-quality CBCT images are in general obtained through reconstruction from projection data acquired at hundreds of view angles, which is associated with a non-negligible amount of radiation exposure to the patient. In this work, we have applied a novel image-reconstruction algorithm, the adaptive-steepest-descent-POCS (ASD-POCS) algorithm, to reconstruct CBCT images from projection data at a significantly reduced number of view angles. Preliminary results from experimental studies involving both simulated data and real data show that images of comparable quality to those presently available in clinical image-guidance systems can be obtained by use of the ASD-POCS algorithm from a fraction of the projection data that are currently used. The result implies potential value of the proposed reconstruction technique for low-dose intra-operative CBCT imaging applications.
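The alternation at the heart of ASD-POCS, a data-consistency (POCS-like) step interleaved with steepest descent on a total-variation term, can be sketched on a toy underdetermined linear problem. This is a minimal illustration of the idea, not the published algorithm (which adaptively balances the two steps and operates on real projection geometry); the problem sizes and step lengths here are arbitrary.

```python
import numpy as np

# Toy few-view problem: recover a piecewise-constant signal from 12
# random linear measurements of a 32-sample unknown.
rng = np.random.default_rng(1)
n = 32
x_true = np.zeros(n)
x_true[10:20] = 1.0                       # piecewise-constant "image"
A = rng.standard_normal((12, n))          # under-determined "few views"
b = A @ x_true

x = np.zeros(n)
step_data = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(200):
    # POCS-like step: move toward the set consistent with the data A x = b
    x = x + step_data * (A.T @ (b - A @ x))
    # steepest-descent step on a smoothed total-variation term
    g = np.gradient(x)
    tv_grad = -np.gradient(g / np.sqrt(g * g + 1e-8))
    x = x - 0.02 * tv_grad

print(np.linalg.norm(A @ x - b) < np.linalg.norm(b))  # True: residual reduced
```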
Phase transition in the parametric natural visibility graph.
Snarskii, A A; Bezsudnov, I V
2016-10-01
We investigate time series by mapping them to complex networks using a parametric natural visibility graph (PNVG) algorithm that generates graphs depending on an arbitrary continuous parameter, the angle of view. We study the behavior of the relative number of clusters in the PNVG near the critical value of the angle of view. Artificial and experimental time series of different nature are used for numerical PNVG investigations to find the critical exponents above and below the critical point, as well as the exponent in the finite-size scaling regime. Altogether, they allow us to find the critical exponent of the correlation length for the PNVG. The set of calculated critical exponents satisfies the basic Widom relation. The PNVG is found to demonstrate scaling behavior. Our results reveal the similarity between the behavior of the relative number of clusters in the PNVG and the order parameter in the theory of second-order phase transitions. We show that the PNVG is another example of a system (in addition to magnetic, percolation, superconductivity, etc.) exhibiting a second-order phase transition.
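The standard natural visibility graph is easy to state in code, and a continuous "angle of view" parameter can be layered on top of it. The sketch below filters visibility edges by the elevation angle of the connecting chord; this angle criterion is one plausible reading of the PNVG construction, not necessarily the authors' exact definition.

```python
import math

def pnvg_edges(y, alpha_deg):
    """Sketch of a parametric natural visibility graph (PNVG).

    Two samples are linked if they satisfy the standard natural-visibility
    criterion AND the elevation angle of the connecting chord stays within
    the view-angle parameter alpha (an assumed parameterization).
    """
    edges = set()
    n = len(y)
    for a in range(n):
        for b in range(a + 1, n):
            # Natural visibility: every intermediate sample lies strictly
            # below the straight line joining samples a and b.
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            angle = math.degrees(math.atan2(y[b] - y[a], b - a))
            if visible and abs(angle) <= alpha_deg:
                edges.add((a, b))
    return edges

series = [1.0, 3.0, 2.0, 4.0, 1.5]
wide = pnvg_edges(series, 90.0)    # alpha = 90 deg recovers the plain NVG
narrow = pnvg_edges(series, 10.0)  # shrinking alpha prunes steep chords
print(len(wide), len(narrow))      # 5 0
```

Sweeping `alpha_deg` and watching graph statistics (such as the relative number of clusters) is the kind of parametric study the abstract describes.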
NASA Technical Reports Server (NTRS)
Deepak, A.; Box, M. A.
1978-01-01
The paper presents a parametric study of the forward-scattering corrections for experimentally measured optical extinction coefficients in polydisperse particulate media, since some forward-scattered light invariably enters, along with the direct beam, into the finite aperture of the detector. Forward-scattering corrections are computed by two methods: (1) using the exact Mie theory, and (2) using the approximate Rayleigh diffraction formula for spherical particles. A parametric study of the dependence of the corrections on mode radii, real and imaginary parts of the complex refractive index, and half-angle of the detector's view cone has been carried out for three different size distribution functions of the modified gamma type. In addition, a study has been carried out to investigate the range of these parameters in which the approximate formulation is valid. The agreement is especially good for small view-cone angles and large particles, and improves significantly for slightly absorbing aerosol particles. Also discussed is the dependence of these corrections on the experimental design of the transmissometer systems.
NASA Astrophysics Data System (ADS)
Pravdivtsev, Andrey V.
2012-06-01
The article presents an approach to the design of wide-angle optical systems with special illumination and instantaneous field of view (IFOV) requirements. Unevenness of illumination reduces the dynamic range of the system, which negatively influences the system's ability to perform its task. The resulting illumination on the detector depends, among other factors, on changes in the IFOV. It is also necessary to consider the IFOV in the synthesis of data-processing algorithms, as it directly affects the potential signal-to-background ratio for the case of statistically homogeneous backgrounds. A numerical-analytical approach that simplifies the design of wide-angle optical systems with special illumination and IFOV requirements is presented. The solution can be used for optical systems whose field of view is greater than 180 degrees. Illumination calculation in optical CAD is based on computationally expensive tracing of a large number of rays. The author proposes to use analytical expressions for some of the characteristics on which illumination depends. The remaining characteristics are determined numerically using less computationally expensive operands, and the calculation is not performed at every optimization step. The results of the analytical calculation are inserted into the merit function of the optical CAD optimizer. As a result, the optimizer load is reduced, since less computationally expensive operands are used. This reduces the time and resources required to develop a system with the desired characteristics. The proposed approach simplifies the creation and understanding of the requirements for the quality of the optical system and allows creating more efficient EOS.
MSDS sky reference and preamplifier study
NASA Technical Reports Server (NTRS)
Larsen, L.; Stewart, S.; Lambeck, P.
1974-01-01
The major goals in re-designing the Multispectral Scanner and Data System (MSDS) sky reference are: (1) to remove the sun-elevation angle and aircraft-attitude angle dependence from the solar-sky illumination measurement, and (2) to obtain data on the optical state of the atmosphere. The present sky reference is dependent on solar elevation and provides essentially no information on important atmospheric parameters. Two sky reference designs were tested. One system is built around a hyperbolic mirror and the reflection approach. A second approach to a sky reference utilizes a fish-eye lens to obtain a 180 deg field of view. A detailed re-design of the present sky reference around the fish-eye approach, even with its limitations, is recommended for the MSDS system. A preamplifier study was undertaken to find ways of improving the noise-equivalent reflectance by reducing the noise level for silicon detector channels on the MSDS.
5. VIEW OF FRONT (WEST AND SOUTH SIDES) TO NORTHEAST. ...
5. VIEW OF FRONT (WEST AND SOUTH SIDES) TO NORTHEAST. VIEW TO NORTHEAST. NOTE THAT LARGE TREES PREVENT MORE COMPLETE VIEW FROM BETTER ANGLE. FOR MORE COMPLETE VIEW, SEE PHOTOGRAPHIC COPY OF 1916 PHOTO, NO. ID-17-C-35. - Boise Project, Boise Project Office, 214 Broadway, Boise, Ada County, ID
Virtual displays for 360-degree video
NASA Astrophysics Data System (ADS)
Gilbert, Stephen; Boonsuk, Wutthigrai; Kelly, Jonathan W.
2012-03-01
In this paper we describe a novel approach for comparing users' spatial cognition when using different depictions of 360-degree video on a traditional 2D display. By using virtual cameras within a game engine and texture mapping of these camera feeds to an arbitrary shape, we were able to offer users a 360-degree interface composed of four 90-degree views, two 180-degree views, or one 360-degree view of the same interactive environment. An example experiment is described using these interfaces. This technique for creating alternative displays of wide-angle video facilitates the exploration of how compressed or fish-eye distortions affect spatial perception of the environment and can benefit the creation of interfaces for surveillance and remote system teleoperation.
Kuchin, I; Starov, V
2015-05-19
A theory of contact angle hysteresis of liquid droplets on smooth, homogeneous solid substrates is developed in terms of the shape of the disjoining/conjoining pressure isotherm and quasi-equilibrium phenomena. It is shown that all contact angles, θ, in the range θr < θ < θa, which are different from the unique equilibrium contact angle θ ≠ θe, correspond to the state of slow "microscopic" advancing or receding motion of the liquid if θe < θ < θa or θr < θ < θe, respectively. This "microscopic" motion almost abruptly becomes fast "macroscopic" advancing or receding motion after the contact angle reaches the critical values θa or θr, correspondingly. The values of the static receding, θr, and static advancing, θa, contact angles in cylindrical capillaries were calculated earlier, based on the shape of the disjoining/conjoining pressure isotherm. It is now shown that (i) both advancing and receding contact angles of a droplet on a smooth, homogeneous solid substrate can be calculated based on the shape of the disjoining/conjoining pressure isotherm, and (ii) both advancing and receding contact angles depend on the drop volume and are not unique characteristics of the liquid-solid system. The latter is different from advancing/receding contact angles in thin capillaries. It is also shown that the receding contact angle is much closer to the equilibrium contact angle than the advancing contact angle. The latter conclusion is unexpected and is in contradiction with the commonly accepted view that the advancing contact angle can be taken as the first approximation for the equilibrium contact angle. The dependency of hysteresis contact angles on the drop volume has direct experimental confirmation.
Emission Patterns of Solar Type III Radio Bursts: Stereoscopic Observations
NASA Technical Reports Server (NTRS)
Thejappa, G.; MacDowall, R.; Bergamo, M.
2012-01-01
Simultaneous observations of solar type III radio bursts obtained by the STEREO A, B, and WIND spacecraft at low frequencies from different vantage points in the ecliptic plane are used to determine their directivity. The heliolongitudes of the sources of these bursts, estimated at different frequencies by assuming that they are located on the Parker spiral magnetic field lines emerging from the associated active regions into the spherically symmetric solar atmosphere, and the heliolongitudes of the spacecraft are used to estimate the viewing angle, which is the angle between the direction of the magnetic field at the source and the line connecting the source to the spacecraft. The normalized peak intensities at each spacecraft, R_j = I_j / ΣI_j (the subscript j corresponds to the spacecraft STEREO A, B, and WIND), which are defined as the directivity factors, are determined using the time profiles of the type III bursts. It is shown that the distribution of the viewing angles divides the type III bursts into: (1) bursts emitting into a very narrow cone centered around the tangent to the magnetic field with angular width of approximately 2 deg, and (2) bursts emitting into a wider cone with angular width spanning from approximately -100 deg to approximately 100 deg. The plots of the directivity factors versus the viewing angles of the sources from all three spacecraft indicate that the type III emissions are very intense along the tangent to the spiral magnetic field lines at the source, and steadily fall as the viewing angles increase to higher values. The comparison of these emission patterns with the computed distributions of the ray trajectories indicates that the intense bursts visible in a narrow range of angles around the magnetic field directions are probably emitted in the fundamental mode, whereas the relatively weaker bursts visible over a wide range of angles are probably emitted in the harmonic mode.
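The directivity factors defined in the abstract are simply the peak intensities normalized across the three spacecraft, R_j = I_j / ΣI_j. A small sketch makes the definition concrete; the intensity values below are made up for illustration.

```python
# Directivity factors: normalized peak intensities across spacecraft.
def directivity_factors(peaks):
    total = sum(peaks.values())
    return {sc: i / total for sc, i in peaks.items()}

# Hypothetical peak intensities (arbitrary units) for one burst.
R = directivity_factors({"STEREO-A": 4.0, "STEREO-B": 1.0, "WIND": 0.5})
print(round(sum(R.values()), 6))  # 1.0 by construction
print(R["STEREO-A"])              # 4.0 / 5.5, approximately 0.727
```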
59. VIEW FROM THE NORTHEAST IN THE NORTHEAST QUADRANT. GENERAL ...
59. VIEW FROM THE NORTHEAST IN THE NORTHEAST QUADRANT. GENERAL VIEW OF THE RIGHT FLANK WALL. RIGHT SHOULDER ANGLE IS INCLUDED ON THE RIGHT SIDE OF THE PHOTOGRAPH. - Fort Sumter, Charleston, Charleston County, SC
Atmospheric Science Data Center
2013-04-16
Unique Views of a Shattered Ice Shelf: views of the breakup of the northern section of the Larsen B ice shelf are shown in this image pair from the Multi-angle Imaging ...
Rotationally Invariant Image Representation for Viewing Direction Classification in Cryo-EM
Zhao, Zhizhen; Singer, Amit
2014-01-01
We introduce a new rotationally invariant viewing angle classification method for identifying, among a large number of cryo-EM projection images, similar views without prior knowledge of the molecule. Our rotationally invariant features are based on the bispectrum. Each image is denoised and compressed using steerable principal component analysis (PCA) such that rotating an image is equivalent to phase shifting the expansion coefficients. Thus we are able to extend the theory of bispectrum of 1D periodic signals to 2D images. The randomized PCA algorithm is then used to efficiently reduce the dimensionality of the bispectrum coefficients, enabling fast computation of the similarity between any pair of images. The nearest neighbors provide an initial classification of similar viewing angles. In this way, rotational alignment is only performed for images with their nearest neighbors. The initial nearest neighbor classification and alignment are further improved by a new classification method called vector diffusion maps. Our pipeline for viewing angle classification and alignment is experimentally shown to be faster and more accurate than reference-free alignment with rotationally invariant K-means clustering, MSA/MRA 2D classification, and their modern approximations. PMID:24631969
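The shift-invariance that underlies the rotation-invariant features above is easiest to see for 1D periodic signals, where the bispectrum B(k1, k2) = F(k1) F(k2) F*(k1 + k2) is unchanged by a circular shift because the linear phase factors cancel. A quick numerical check (our own minimal implementation, not the paper's steerable-PCA pipeline):

```python
import numpy as np

def bispectrum(x):
    """Full bispectrum of a 1D periodic signal: B(k1,k2) = F(k1)F(k2)F*(k1+k2)."""
    f = np.fft.fft(x)
    n = len(x)
    k = np.arange(n)
    return f[:, None] * f[None, :] * np.conj(f[(k[:, None] + k[None, :]) % n])

rng = np.random.default_rng(2)
x = rng.standard_normal(16)
b1 = bispectrum(x)
b2 = bispectrum(np.roll(x, 5))      # cyclically shifted copy
print(np.allclose(b1, b2))          # True: the bispectrum is shift-invariant
```

For images expanded in a steerable basis, in-plane rotation plays the role of the circular shift, which is how the paper obtains rotation invariance.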
2015-08-20
This view from NASA's Cassini spacecraft looks toward Saturn's icy moon Dione, with giant Saturn and its rings in the background, just prior to the mission's final close approach to the moon on August 17, 2015. At lower right is the large, multi-ringed impact basin named Evander, which is about 220 miles (350 kilometers) wide. The canyons of Padua Chasma, features that form part of Dione's bright, wispy terrain, reach into the darkness at left. Imaging scientists combined nine visible light (clear spectral filter) images to create this mosaic view: eight from the narrow-angle camera and one from the wide-angle camera, which fills in an area at lower left. The scene is an orthographic projection centered on terrain at 0.2 degrees north latitude, 179 degrees west longitude on Dione. An orthographic view is most like the view seen by a distant observer looking through a telescope. North on Dione is up. The view was acquired at distances ranging from approximately 106,000 miles (170,000 kilometers) to 39,000 miles (63,000 kilometers) from Dione and at a sun-Dione-spacecraft, or phase, angle of 35 degrees. Image scale is about 1,500 feet (450 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA19650
NASA Astrophysics Data System (ADS)
Wang, Z.; Roman, M. O.; Pahlevan, N.; Stachura, M.; McCorkel, J.; Bland, G.; Schaaf, C.
2016-12-01
Albedo is a key climate forcing variable that governs the absorption of incoming solar radiation and its ultimate transfer to the atmosphere. Albedo contributes significant uncertainty to the simulation of climate change and, as such, is defined by the Global Climate Observing System (GCOS) as a terrestrial essential climate variable (ECV) required by global and regional climate and biogeochemical models. NASA's Goddard Space Flight Center's Multi AngLe Imaging Bidirectional Reflectance Distribution Function small-UAS (MALIBU) is part of a series of pathfinder missions to develop enhanced multi-angular remote sensing techniques using small Unmanned Aircraft Systems (sUAS). The MALIBU instrument package includes two multispectral imagers oriented at two different viewing geometries (i.e., port and starboard sides) to capture vegetation optical properties and structural characteristics. This is achieved by analyzing the surface reflectance anisotropy signal (i.e., BRDF shape) obtained from the combination of surface reflectance from different view-illumination angles and spectral channels. Satellite measures of surface albedo from MODIS, VIIRS, and Landsat have been evaluated by comparison with spatially representative albedometer data from sparsely distributed flux towers at fixed heights. However, the mismatch between the footprint of ground measurements and the satellite footprint challenges efforts at validation, especially for heterogeneous landscapes. The BRDF (Bidirectional Reflectance Distribution Function) models of surface anisotropy have only been evaluated with airborne BRDF data over a very few locations.
The MALIBU platform, which acquires extremely high resolution, sub-meter measures of surface anisotropy and surface albedo, can thus serve as an important source of reference data to enable global land product validation efforts, and to resolve the errors and uncertainties in the various existing products generated by NASA and its national and international partners.
NASA Astrophysics Data System (ADS)
Chaban, R.; Pace, D. C.; Marcy, G. R.; Taussig, D.
2016-10-01
Energetic ion losses must be minimized in burning plasmas to maintain fusion power, and existing tokamaks provide access to energetic ion parameter regimes that are relevant to burning machines. A new Fast Ion Loss Detector (FILD) probe on the DIII-D tokamak has been optimized to resolve beam ion losses across a range of 30 - 90 keV in energy and 40° to 80° in pitch angle, thereby providing valuable measurements during many different experiments. The FILD is a magnetic spectrometer; once inserted into the tokamak, the magnetic field allows energetic ions to pass through a collimating aperture and strike a scintillator plate that is imaged by a wide view camera and narrow view photomultiplier tubes (PMTs). The design involves calculating scintillator strike patterns while varying probe geometry. Calculated scintillator patterns are then used to design an optical system that allows adjustment of the focus regions for the 1 MS/s resolved PMTs. A synthetic diagnostic will be used to determine the energy and pitch angle resolution that can be attained in DIII-D experiments. Work supported in part by US DOE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.
Asymmetry in the Outburst of SN 1987A Detected Using Light Echo Spectroscopy
NASA Astrophysics Data System (ADS)
Sinnott, B.; Welch, D. L.; Rest, A.; Sutherland, P. G.; Bergmann, M.
2013-04-01
We report direct evidence for asymmetry in the early phases of SN 1987A via optical spectroscopy of five fields of its light echo system. The light echoes allow the first few hundred days of the explosion to be reobserved, with different position angles providing different viewing angles to the supernova. Light echo spectroscopy therefore allows a direct spectroscopic comparison of light originating from different regions of the photosphere during the early phases of SN 1987A. Gemini multi-object spectroscopy of the light echo fields shows fine structure in the Hα line as a smooth function of position angle on the near-circular light echo rings. Hα profiles originating from the northern hemisphere of SN 1987A show an excess in redshifted emission and a blue knee, while southern hemisphere profiles show an excess of blueshifted Hα emission and a red knee. This fine structure is reminiscent of the "Bochum event" originally observed for SN 1987A, but in an exaggerated form. Maximum deviation from symmetry in the Hα line is observed at position angles 16° and 186°, consistent with the major axis of the expanding elongated ejecta. The asymmetry signature observed in the Hα line smoothly diminishes as a function of viewing angle away from the poles of the elongated ejecta. We propose an asymmetric two-sided distribution of 56Ni most dominant in the southern far quadrant of SN 1987A as the most probable explanation of the observed light echo spectra. This is evidence that the asymmetry of high-velocity 56Ni in the first few hundred days after explosion is correlated to the geometry of the ejecta some 25 years later.
A Macintosh-Based Scientific Images Video Analysis System
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Friedland, Peter (Technical Monitor)
1994-01-01
A set of experiments was designed at MIT's Man-Vehicle Laboratory in order to evaluate the effects of zero gravity on the human orientation system. During many of these experiments, the movements of the eyes are recorded on high quality video cassettes. The images must be analyzed off-line to calculate the position of the eyes at every moment in time. To this end, I have implemented a simple, inexpensive computerized system which measures the angle of rotation of the eye from digitized video images. The system is implemented on a desktop Macintosh computer, processes one play-back frame per second and exhibits adequate levels of accuracy and precision. The system uses LabVIEW, a digital output board, and a video input board to control a VCR, digitize video images, analyze them, and provide a user-friendly interface for the various phases of the process. The system uses the Concept Vi LabVIEW library (Graftek's Image, Meudon la Foret, France) for image grabbing and displaying as well as translation to and from LabVIEW arrays. Graftek's software layer drives an Image Grabber board from Neotech (Eastleigh, United Kingdom). A Colour Adapter box from Neotech provides adequate video signal synchronization. The system also requires a LabVIEW driven digital output board (MacADIOS II from GW Instruments, Cambridge, MA) controlling a slightly modified VCR remote control used mainly to advance the video tape frame by frame.
Perceived orientation, spatial layout and the geometry of pictures
NASA Technical Reports Server (NTRS)
Goldstein, E. Bruce
1989-01-01
The purpose is to discuss the role of geometry in determining the perception of spatial layout and perceived orientation in pictures viewed at an angle. This discussion derives from Cutting's (1988) suggestion, based on his analysis of some of the author's data (Goldstein, 1987), that the changes in perceived orientation that occur when pictures are viewed at an angle can be explained in terms of geometrically produced changes in the picture's virtual space.
Variation in spectral response of soybeans with respect to illumination, view, and canopy geometry
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Biehl, L. L.; Bauer, M. E.
1984-01-01
Comparisons of the spectral response for incomplete (well-defined row structure) and complete (overlapping row structure) canopies of soybeans indicated a greater dependence on Sun and view geometry for the incomplete canopies. Red and near-IR reflectance for the incomplete canopy decreased as solar zenith angle increased for a nadir view angle until the soil between the plant rows was completely shaded. Thereafter, for increasing solar zenith angle, the red reflectance leveled off and the near-IR reflectance increased. A 'hot spot' effect was evident for the red and near-IR reflectance factors. The 'hot spot' effect was more pronounced for the red band based on relative reflectance value changes. The ratios of off-nadir to nadir acquired data reveal that off-nadir red band reflectance factors more closely approximated straight-down measurements for time periods away from solar noon. Normalized difference generally approximated straight-down measurements during the middle portion of the day.
Effect of structured visual environments on apparent eye level.
Stoper, A E; Cohen, M M
1989-11-01
Each of 12 subjects set a binocularly viewed target to apparent eye level; the target was projected on the rear wall of an open box, the floor of which was horizontal or pitched up and down at angles of 7.5 degrees and 15 degrees. Settings of the target were systematically biased by 60% of the pitch angle when the interior of the box was illuminated, but by only 5% when the interior of the box was darkened. Within-subjects variability of the settings was less under illuminated viewing conditions than in the dark, but was independent of box pitch angle. In a second experiment, 11 subjects were tested with an illuminated pitched box, yielding biases of 53% and 49% for binocular and monocular viewing conditions, respectively. The results are discussed in terms of individual and interactive effects of optical, gravitational, and extraretinal eye-position information in determining judgements of eye level.
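The reported gains imply a simple linear model of the setting bias; a tiny worked example (the 60% and 5% gains come from the abstract, the linear form is our reading of it):

```python
def predicted_bias(pitch_deg, illuminated):
    """Predicted bias of apparent-eye-level settings as a fixed fraction
    of the box pitch angle: 60% with the interior lit, 5% darkened
    (gains from experiment 1 of the abstract)."""
    gain = 0.60 if illuminated else 0.05
    return gain * pitch_deg

print(predicted_bias(15.0, True))   # about 9 deg for a lit, 15-deg-pitched box
print(predicted_bias(15.0, False))  # about 0.75 deg in the dark
```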
Image quality improvement in MDCT cardiac imaging via SMART-RECON method
NASA Astrophysics Data System (ADS)
Li, Yinsheng; Cao, Ximiao; Xing, Zhanfeng; Sun, Xuguang; Hsieh, Jiang; Chen, Guang-Hong
2017-03-01
Coronary CT angiography (CCTA) is a challenging imaging task currently limited by the achievable temporal resolution of modern Multi-Detector CT (MDCT) scanners. In this paper, the recently proposed SMART-RECON method has been applied in MDCT-based CCTA imaging to improve the image quality without any prior knowledge of cardiac motion. After the prospective ECG-gated data acquisition from a short-scan angular span, the acquired data were sorted into several sub-sectors of view angles, each corresponding to one quarter of the short-scan angular range. Information about the cardiac motion was thus encoded into the data in each view angle sub-sector. The SMART-RECON algorithm was then applied to jointly reconstruct several image volumes, each of which is temporally consistent with the data acquired in the corresponding view angle sub-sector. Extensive numerical simulations were performed to validate the proposed technique and investigate the performance dependence.
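The view-angle sorting step described here can be sketched as follows; the 240-degree span and one-degree sampling are illustrative values, not scanner specifics:

```python
import numpy as np

def view_angle_subsectors(angles_deg, n_sectors=4):
    """Assign each projection view angle to one of n_sectors contiguous
    sub-sectors covering the short-scan range (a sketch of the data
    sorting step; sector boundaries are equal quarters of the span)."""
    a = np.asarray(angles_deg, dtype=float)
    lo, hi = a.min(), a.max()
    idx = np.minimum((n_sectors * (a - lo) / (hi - lo)).astype(int),
                     n_sectors - 1)
    return idx

# e.g. a 240-degree short scan sampled every degree
angles = np.arange(0.0, 240.0, 1.0)
sectors = view_angle_subsectors(angles)
print(np.bincount(sectors))  # 60 views land in each sub-sector
```

Each sub-sector then feeds one of the jointly reconstructed, temporally consistent image volumes.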
Research on visible and near infrared spectral-polarimetric properties of soil polluted by crude oil
NASA Astrophysics Data System (ADS)
Shen, Hui-yan; Zhou, Pu-cheng; Pan, Bang-long
2017-10-01
Hydrocarbon-contaminated soil can impose detrimental effects on forest health and the quality of agricultural products. To manage such consequences, oil leak indicators should be detected quickly by monitoring systems. Remote sensing is one of the most suitable techniques for such monitoring, especially for areas which are uninhabitable and difficult to access. The most available physical quantities in the optical remote sensing domain are the intensity and spectral information obtained by visible or infrared sensors. However, besides intensity and wavelength, polarization is another primary physical quantity associated with an optical field. When reflecting light waves, the surface of soil polluted by crude oil exhibits polarimetric properties related to its own nature. Thus, detection of the spectral-polarimetric properties of soil polluted by crude oil has become a new remote sensing monitoring method. In this paper, a multi-angle spectral-polarimetric instrument was used to obtain multi-angle visible and near infrared spectral-polarimetric characteristic data of soil polluted by crude oil. The variation of the polarimetric properties with different factors, such as viewing zenith angle, incidence zenith angle of the light source, relative azimuth angle, waveband of the detector, and grain size of the soil, was then discussed, so as to provide a scientific basis for research on polarization remote sensing of soil polluted by crude oil.
NASA Astrophysics Data System (ADS)
Makino, T.; Okamoto, H.; Sato, K.; Tanaka, K.; Nishizawa, T.; Sugimoto, N.; Matsui, I.; Jin, Y.; Uchiyama, A.; Kudo, R.
2014-12-01
We have developed a new type of ground-based lidar, the Multi-Field-of-view Multiple-Scattering Polarization Lidar (MFMSPL), to analyze the multiple-scattering contribution from low-level clouds. One issue with ground-based lidar is that strong attenuation of the lidar signals limits the observable optical thickness to about 3, so that only the cloud bottom can be observed. To overcome this problem, the MFMSPL has been designed to observe a degree of multiple-scattering contribution similar to that expected from the space-borne lidar CALIOP on the CALIPSO satellite. The system consists of eight detectors: four telescopes for parallel channels and four for perpendicular channels. The four pairs of telescopes are mounted at four different off-beam angles, ranging from -5 to 35 mrad, where the angle is defined between the direction of the laser beam and the direction of the telescope. Consequently, a large footprint (100 m), similar to that of CALIOP, is achieved in MFMSPL observations when clouds are located at an altitude of about 1 km. The use of multiple fields of view enables measurement of the depolarization ratio from optically thick clouds. The outer receivers, set at larger angles, generally detect backscattered signals from clouds located at higher altitudes due to the enhanced multiple scattering, compared with the inner receiver, which detects signals only from the cloud bottom. The MFMSPL is therefore expected to provide information on cloud microphysics from optically thicker regions than a conventional lidar with a small FOV. The MFMSPL has been operated continuously in Tsukuba, Japan since June 2014. Initial analyses have indicated the performance expected from theoretical estimates by backward Monte Carlo simulations. The depolarization ratio from deeper parts of the clouds, detected by the receivers with large off-beam angles, showed much larger values than that from the receiver with a small angle.
The calibration procedures and summary of initial observations will be presented. The observed data obtained by the MFMSPL will be used to develop and evaluate the retrieval algorithms for cloud microphysics applied to the CALIOP data.
Airborne Laser Polar Nephelometer
NASA Technical Reports Server (NTRS)
Grams, Gerald W.
1973-01-01
A polar nephelometer has been developed at NCAR to measure the angular variation of the intensity of light scattered by air molecules and particles. The system has been designed for airborne measurements using outside air ducted through a 5-cm diameter airflow tube; the sample volume is that which is common to the intersection of a collimated source beam and the detector field of view within the airflow tube. The source is a linearly polarized helium-neon laser beam. The optical system defines a collimated field of view (0.5° half-angle) through a series of diaphragms located behind a 172-mm focal length objective lens. A photomultiplier tube is located immediately behind an aperture in the focal plane of the objective lens. The laser beam is mechanically chopped (on-off) at a rate of 5 Hz; a two-channel pulse counter, synchronized to the laser output, measures the photomultiplier pulse rate with the light beam both on and off. The difference in these measured pulse rates is directly proportional to the intensity of the scattered light from the volume common to the intersection of the laser beam and the detector field of view. Measurements can be made at scattering angles from 15° to 165° with reference to the direction of propagation of the light beam. Intermediate angles are obtained by selecting the angular increments desired between these extreme angles (any multiple of 0.1° can be selected for the angular increment; 5° is used in normal operation). Pulses provided by digital circuits control a stepping motor which sequentially rotates the detector by pre-selected angular increments. The synchronous photon-counting system automatically begins measurement of the scattered-light intensity immediately after the rotation to a new angle has been completed. The instrument has been flown on the NASA Convair 990 airborne laboratory to obtain data on the complex index of refraction of atmospheric aerosols.
A particle impaction device is operated simultaneously to collect particles from the same airflow tube used to make the scattered-light measurements. A size distribution function is obtained by analysis of the particles collected by the impaction device. Calculated values of the angular variation of the scattered-light intensity are obtained by applying Mie scattering theory to the observed size distribution function and assuming different values of the complex index of refraction of the particles. The calculated values are then compared with data on the actual variation of the scattered-light intensity obtained with the polar nephelometer. The most probable value of the complex refractive index is that which provides the best fit between the experimental light scattering data and the parameters calculated from the observed size distribution function.
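The chopped-beam, two-channel counting scheme described above reduces to a background subtraction; the count rates below are purely illustrative:

```python
# Synchronous photon counting as described in the abstract: the
# scattered-light signal is the difference between photomultiplier count
# rates measured with the chopped laser beam on and off, which removes
# background light and detector dark counts (numbers are illustrative).
counts_on = 12450    # counts per second, beam on (signal + background)
counts_off = 2050    # counts per second, beam off (background + dark)
scattered_rate = counts_on - counts_off
print(scattered_rate)  # 10400 counts/s, proportional to scattered intensity
```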
Novel compact panomorph lens based vision system for monitoring around a vehicle
NASA Astrophysics Data System (ADS)
Thibault, Simon
2008-04-01
Automotive applications are one of the largest vision-sensor market segments and one of the fastest growing ones. The trend to use increasingly more sensors in cars is driven both by legislation and by consumer demands for higher safety and better driving experiences. Awareness of what directly surrounds a vehicle affects safe driving and manoeuvring of a vehicle. Consequently, panoramic 360° field-of-view imaging can contribute more to the driver's perception of the surrounding world than any other sensor. However, to obtain a complete view around the car, several sensor systems are necessary. To solve this issue, a customized imaging system based on a panomorph lens can provide maximum information for the driver with a reduced number of sensors. A panomorph lens is a hemispheric wide-angle anamorphic lens with enhanced resolution in predefined zones of interest. Because panomorph lenses are optimized to a custom angle-to-pixel relationship, vision systems provide ideal image coverage that reduces and optimizes the processing. We present various scenarios which may benefit from the use of a custom panoramic sensor. We also discuss the technical requirements of such a vision system. Finally, we demonstrate how the panomorph-based visual sensor is probably one of the most promising ways to fuse many sensors in one. For example, a single panoramic sensor on the front of a vehicle could provide all necessary information for assistance in crash avoidance, lane tracking, early warning, park aids, road sign detection, and various video monitoring views.
Design of the compact high-resolution imaging spectrometer (CHRIS), and future developments
NASA Astrophysics Data System (ADS)
Cutter, Mike; Lobb, Dan
2017-11-01
The CHRIS instrument was launched on ESA's PROBA platform in October 2001, and is providing hyperspectral images of selected ground areas at 17m ground sampling distance, in the spectral range 415nm to 1050nm. Platform agility allows image sets to be taken at multiple view angles in each overpass. The design of the instrument is briefly outlined, including design of optics, structures, detection and in-flight calibration system. Lessons learnt from construction and operation of the experimental system, and possible design directions for future hyperspectral systems, are discussed.
Soybean canopy reflectance as a function of view and illumination geometry
NASA Technical Reports Server (NTRS)
Ranson, K. J.; Vanderbilt, V. C.; Biehl, L. L.; Robinson, B. F.; Bauer, M. E.
1981-01-01
Reflectances were calculated from measurements at four wavelength bands through eight view azimuth and seven view zenith directions, for various solar zenith and azimuth angles over portions of three days, in an experimental characterization of a soybean field by means of its reflectances and physical and agronomic attributes. Results indicate that the distribution of reflectance from a soybean field is a function of the solar illumination and viewing geometry, wavelength, and row direction, as well as the state of canopy development. Shadows between rows were found to affect visible wavelength band reflectance to a greater extent than near-IR reflectance. A model describing reflectance variation as a function of projected solar and viewing angles is proposed, which approximates the visible wavelength band reflectance variations of a canopy with a well-defined row structure.
Extended Duration Orbiter (EDO) Improved Waste Collection System (IWCS)
1992-09-25
S92-46726 (November 1992) --- A high angle view of the Improved Waste Collection System (IWCS) scheduled to fly aboard NASA's Space Shuttle Endeavour for the STS-54 mission. Among the advantages the new IWCS is hoped to have over the current WCS are greater dependability, better hygiene, virtually unlimited capacity and more efficient preparation between Shuttle missions. Unlike the previous WCS, the improved version will not have to be removed from the spacecraft to be readied for the next flight.
LDEF grappled by remote manipulator system (RMS) during STS-32 retrieval
1990-01-20
This view, taken through overhead window W7 on the aft flight deck of Columbia, Orbiter Vehicle (OV) 102, shows the Long Duration Exposure Facility (LDEF) in the grasp of the remote manipulator system (RMS) during STS-32 retrieval activities. Other cameras at eye level were documenting the bus-sized spacecraft at various angles as the RMS manipulated LDEF for a lengthy photo survey. The glaring celestial body in the upper left is the sun, with the Earth's surface visible below.
Photogrammetric mobile satellite service prediction
NASA Technical Reports Server (NTRS)
Akturan, Riza; Vogel, Wolfhard J.
1994-01-01
Photographic images of the sky were taken with a camera through a fisheye lens with a 180 deg field-of-view. The images of rural, suburban, and urban scenes were analyzed on a computer to derive quantitative information about the elevation angles at which the sky becomes visible. Such knowledge is needed by designers of mobile and personal satellite communications systems and is desired by customers of these systems. The 90th percentile elevation angle of the skyline was found to be 10 deg, 17 deg, and 51 deg in the three environments. At 8 deg, 75 percent, 75 percent, and 35 percent of the sky was visible, respectively. The elevation autocorrelation fell to zero with a 72 deg lag in the rural and urban environment and a 40 deg lag in the suburb. Mean estimation errors are below 4 deg.
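The skyline statistics quoted here are order statistics of elevation-angle samples around the horizon; a sketch with synthetic data (the normal skyline model and its parameters are assumptions, not the paper's measurements):

```python
import numpy as np

# Hypothetical skyline elevation samples: the angle (degrees above the
# horizon) at which the sky first becomes visible, one per azimuth bin,
# drawn from an assumed distribution for illustration only.
rng = np.random.default_rng(1)
skyline = np.clip(rng.normal(8.0, 5.0, size=360), 0.0, 90.0)

p90 = np.percentile(skyline, 90)        # 90th-percentile skyline elevation
visible_at_8 = np.mean(skyline <= 8.0)  # fraction of azimuths with sky at 8 deg
print(round(p90, 1), round(visible_at_8, 2))
```

The paper's 10/17/51-degree figures are exactly this kind of 90th percentile computed per environment.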
64. VIEW FROM THE NORTHEAST IN THE NORTHEAST QUADRANT. DETAIL ...
64. VIEW FROM THE NORTHEAST IN THE NORTHEAST QUADRANT. DETAIL VIEW OF THE RIGHT FACE. A PORTION OF THE RIGHT SHOULDER ANGLE IS INCLUDED ON THE LEFT-SIDE OF THE IMAGE, WITH SCALE. - Fort Sumter, Charleston, Charleston County, SC
Vertical gaze angle: absolute height-in-scene information for the programming of prehension.
Gardner, P L; Mon-Williams, M
2001-02-01
One possible source of information regarding the distance of a fixated target is provided by the height of the object within the visual scene. It is accepted that this cue can provide ordinal information, but generally it has been assumed that the nervous system cannot extract "absolute" information from height-in-scene. In order to use height-in-scene, the nervous system would need to be sensitive to ocular position with respect to the head and to head orientation with respect to the shoulders (i.e. vertical gaze angle or VGA). We used a perturbation technique to establish whether the nervous system uses vertical gaze angle as a distance cue. Vertical gaze angle was perturbed using ophthalmic prisms with the base oriented either up or down. In experiment 1, participants were required to carry out an open-loop pointing task whilst wearing: (1) no prisms; (2) a base-up prism; or (3) a base-down prism. In experiment 2, the participants reached to grasp an object under closed-loop viewing conditions whilst wearing: (1) no prisms; (2) a base-up prism; or (3) a base-down prism. Experiment 1 and 2 provided clear evidence that the human nervous system uses vertical gaze angle as a distance cue. It was found that the weighting attached to VGA decreased with increasing target distance. The weighting attached to VGA was also affected by the discrepancy between the height of the target, as specified by all other distance cues, and the height indicated by the initial estimate of the position of the supporting surface. We conclude by considering the use of height-in-scene information in the perception of surface slant and highlight some of the complexities that must be involved in the computation of environmental layout.
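The geometry behind the VGA cue is d = h / tan(theta) for a fixated point on the ground, so a prism that adds a constant angle to the gaze declination biases the implied distance. A sketch with an assumed eye height and prism deflection (both values illustrative):

```python
import math

def distance_from_vga(eye_height_m, gaze_below_horizontal_deg):
    """Distance to a fixated point on the ground implied by vertical
    gaze angle (VGA): d = h / tan(theta), theta measured below the
    horizontal. A prism effectively adds a fixed angle to theta."""
    return eye_height_m / math.tan(math.radians(gaze_below_horizontal_deg))

h = 1.5                                     # assumed standing eye height (m)
true_d = distance_from_vga(h, 20.0)         # unperturbed VGA estimate
prism_d = distance_from_vga(h, 20.0 + 5.0)  # prism increasing the declination
print(round(true_d, 2), round(prism_d, 2))  # the perturbed estimate is shorter
```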
NASA Astrophysics Data System (ADS)
Maeda, M.; Yamamoto, K.; Mizokawa, T.; Saini, N. L.; Arita, M.; Namatame, H.; Taniguchi, M.; Tan, G.; Zhao, L. D.; Kanatzidis, M. G.
2018-03-01
We have studied the electronic structure of SnSe and Na-doped SnSe by means of angle-resolved photoemission spectroscopy. The valence-band top reaches the Fermi level by the Na doping, indicating that Na-doped SnSe can be viewed as a degenerate semiconductor. However, in the Na-doped system, the chemical potential shift with temperature is unexpectedly large and is apparently inconsistent with the degenerate semiconductor picture. The large chemical potential shift and anomalous spectral shape are key ingredients for an understanding of the novel metallic state with the large thermoelectric performance in Na-doped SnSe.
Lesion Quantification in Dual-Modality Mammotomography
NASA Astrophysics Data System (ADS)
Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.
2007-02-01
This paper describes a novel x-ray/SPECT dual-modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for the limited-angle artifact in reconstructed images and accurately estimate both the lesion size and radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time.
Wang, Wei; Chen, Xiyuan
2018-02-23
In view of the fact that the accuracy of the third-degree Cubature Kalman Filter (CKF) used for initial alignment under large misalignment angle conditions is insufficient, an improved fifth-degree CKF algorithm is proposed in this paper. To make full use of the innovation during filtering, the innovation covariance matrix is calculated recursively from the innovation sequence with an exponential fading factor. A new adaptive error covariance matrix scaling algorithm is then proposed. The Singular Value Decomposition (SVD) method is used to improve the numerical stability of the fifth-degree CKF. To avoid the overshoot caused by excessive scaling of the error covariance matrix during the convergence stage, the scaling scheme is terminated when the gradient of the azimuth estimate reaches its maximum. The experimental results show that the improved algorithm achieves better alignment accuracy under large misalignment angles than the traditional algorithm.
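Two of the ingredients named here — a fading-factor innovation covariance and SVD for numerical stability — can be sketched as below. This is one common form of each idea, not the authors' exact fifth-degree CKF recursion:

```python
import numpy as np

def fading_innovation_cov(C_prev, v, k, b=0.96):
    """One common fading-memory estimate of the innovation covariance:
    the weight d shrinks with k so that recent innovations v count more
    than old ones (an illustration; the paper's recursion may differ)."""
    d = (1.0 - b) / (1.0 - b ** (k + 1))
    return (1.0 - d) * C_prev + d * np.outer(v, v)

def svd_stabilize(P):
    """Rebuild a covariance from its SVD with singular values clamped
    away from zero, keeping it symmetric positive definite."""
    U, s, _ = np.linalg.svd((P + P.T) / 2.0)
    return (U * np.maximum(s, 1e-12)) @ U.T

rng = np.random.default_rng(2)
C = np.eye(2)
for k in range(200):
    v = rng.multivariate_normal([0.0, 0.0], [[4.0, 0.0], [0.0, 1.0]])
    C = fading_innovation_cov(C, v, k)
C = svd_stabilize(C)
print(np.round(np.diag(C), 1))  # tracks the true innovation variances
```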
Photographic measurement of head and cervical posture when viewing mobile phone: a pilot study.
Guan, Xiaofei; Fan, Guoxin; Wu, Xinbo; Zeng, Ying; Su, Hang; Gu, Guangfei; Zhou, Qi; Gu, Xin; Zhang, Hailong; He, Shisheng
2015-12-01
With the dramatic growth of mobile phone usage, concerns have been raised with regard to the adverse effects of mobile phone use on spinal posture. The aim of this study was to determine, by photogrammetry, the head and cervical postures adopted when viewing a mobile phone screen, compared with those in a neutral standing posture. A total of 186 subjects (81 females and 105 males) aged 17 to 31 years participated in this study. Subjects were instructed to stand neutrally and to use a mobile phone as in daily life. Using a photographic method, the sagittal head and cervical postures were assessed by head tilt angle, neck tilt angle, forward head shift and gaze angle. The photographic method showed high intra-rater and inter-rater reliability in measuring the sagittal posture of the cervical spine and the gaze angle (ICCs ranged from 0.80 to 0.99). When looking at a mobile phone, the head tilt angle significantly increased (from 74.55° to 95.22°, p = 0.000) and the neck tilt angle decreased (from 54.68° to 38.77°, p = 0.000). The forward head posture was also confirmed by the significantly increased head shift (from 10.90 to 13.85 cm, p = 0.000). The posture assumed during mobile phone use was significantly correlated with neutral posture (p < 0.05). Males displayed a more forward head posture than females (p < 0.05). The head tilt angle was positively correlated with the gaze angle (r = 0.616, p = 0.000), while the neck tilt angle was negatively correlated with the gaze angle (r = -0.628, p = 0.000). Photogrammetry is a reliable, quantitative method to evaluate head and cervical posture during mobile phone use. Compared to neutral standing, subjects display a more forward head posture when viewing the mobile phone screen, which is correlated with neutral posture, gaze angle and gender. Future studies will be needed to investigate a dose-response relationship between mobile phone use and assumed posture.
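Photographic posture angles such as these are typically computed from two digitized landmarks per line; a sketch with hypothetical marker coordinates (the landmark choice and values are illustrative, not necessarily the authors' protocol):

```python
import math

def angle_to_horizontal(p_from, p_to):
    """Angle (deg) between the line p_from -> p_to and the horizontal,
    the basic quantity behind photographic posture angles."""
    dx = p_to[0] - p_from[0]
    dy = p_to[1] - p_from[1]
    return math.degrees(math.atan2(dy, dx))

c7 = (0.0, 0.0)       # hypothetical C7 marker (image coordinates, y up)
tragus = (4.0, 8.0)   # hypothetical tragus marker
neck_tilt = angle_to_horizontal(c7, tragus)
print(round(neck_tilt, 1))  # 63.4 deg for these example markers
```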
Wide-angle Optical Telescope for the EUSO Experiments
NASA Technical Reports Server (NTRS)
Hillman, L. W.; Takahaski, Y.; Zuccaro, A.; Lamb, D.; Pitalo, K.; Lopado, A.; Keys, A.
2003-01-01
Future space-based air shower experiments, including the planned Extreme Universe Space Observatory (EUSO) mission, require a wide-angle telescope in the near-UV wavelengths of 330-400 nm. The widest possible target aperture of Earth's atmosphere, greater than 10^5 square kilometers sr, can be viewed within a field of view of 30 degrees from space. EUSO's optical design is required to be compact, being constrained by the mass and diameter allocated for use in space. Two double-sided Fresnel lenses of 2.5-m diameter are chosen for the baseline design, which satisfies the imaging resolution of 0.1 degree over the 30-degree field of view.
NASA Astrophysics Data System (ADS)
Roosjen, Peter P. J.; Brede, Benjamin; Suomalainen, Juha M.; Bartholomeus, Harm M.; Kooistra, Lammert; Clevers, Jan G. P. W.
2018-04-01
In addition to single-angle reflectance data, multi-angular observations can be used as an additional information source for the retrieval of properties of an observed target surface. In this paper, we studied the potential of multi-angular reflectance data for the improvement of leaf area index (LAI) and leaf chlorophyll content (LCC) estimation by numerical inversion of the PROSAIL model. The potential for improvement of LAI and LCC was evaluated for both measured data and simulated data. The measured data were collected on 19 July 2016 by a frame-camera mounted on an unmanned aerial vehicle (UAV) over a potato field, where eight experimental plots of 30 × 30 m were designed with different fertilization levels. Dozens of viewing angles, covering the hemisphere up to around 30° from nadir, were obtained by a large forward and sideways overlap of collected images. Simultaneously with the UAV flight, in situ measurements of LAI and LCC were performed. Inversion of the PROSAIL model was done based on nadir data and based on multi-angular data collected by the UAV. Inversion based on the multi-angular data performed slightly better than inversion based on nadir data, indicated by the decrease in RMSE from 0.70 to 0.65 m²/m² for the estimation of LAI, and from 17.35 to 17.29 μg/cm² for the estimation of LCC, when nadir data were used and when multi-angular data were used, respectively. In addition to inversions based on measured data, we simulated several datasets at different multi-angular configurations and compared the accuracy of the inversions of these datasets with the inversion based on data simulated at nadir position. In general, the results based on simulated (synthetic) data indicated that the most accurate estimations were obtained when more viewing angles, better-distributed viewing angles, and viewing angles extending to larger zenith angles were available for inversion.
Interestingly, spectra simulated at the multi-angular sampling configuration captured by the UAV platform (view zenith angles up to 30°) already yielded a substantial improvement over spectra simulated at nadir only. The results of this study show that the estimation of LAI and LCC by numerical inversion of the PROSAIL model can be improved when multi-angular observations are introduced. However, for the potato crop, PROSAIL inversion of the measured data showed only moderate accuracy and slight improvements.
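The lookup-table style of numerical inversion described above can be sketched in a few lines. The forward model below is a toy stand-in for PROSAIL (a sigmoidal red-edge, a chlorophyll absorption term, and a Beer's-law canopy gap fraction); all coefficients and the parameter grid are assumptions for illustration, not the study's configuration.

```python
import math

# Toy stand-in for the PROSAIL forward model (purely illustrative; the real
# model couples leaf optics with canopy bidirectional reflectance).
def toy_forward(lai, lcc, vza_deg):
    mu = math.cos(math.radians(vza_deg))
    spectrum = []
    for wl in range(450, 901, 50):                    # nm, coarse bands
        veg = 0.5 / (1 + math.exp(-(wl - 700) / 30))  # red-edge shape
        absorb = math.exp(-0.004 * lcc * max(700 - wl, 0) / 100)
        gap = math.exp(-0.5 * lai / mu)               # canopy gap fraction
        spectrum.append(veg * absorb * (1 - gap) + 0.08 * gap)
    return spectrum

# Lookup-table inversion: pick the (LAI, LCC) pair whose simulated
# multi-angle spectrum best matches the measurement in the RMSE sense.
def invert(measured, vzas):
    best, best_err = None, float("inf")
    for i in range(111):                              # LAI 0.5 .. 6.0
        for j in range(101):                          # LCC 10 .. 60
            lai, lcc = 0.5 + 0.05 * i, 10.0 + 0.5 * j
            sim = [v for a in vzas for v in toy_forward(lai, lcc, a)]
            err = math.sqrt(sum((s - m) ** 2 for s, m in zip(sim, measured))
                            / len(sim))
            if err < best_err:
                best, best_err = (lai, lcc), err
    return best

vzas = [0, 15, 30]                                    # nadir plus off-nadir views
measured = [v for a in vzas for v in toy_forward(3.0, 40.0, a)]
lai_hat, lcc_hat = invert(measured, vzas)
```

Stacking several view angles into one residual, as above, is what lets the multi-angular configuration constrain the inversion more tightly than nadir data alone.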
Pixel-level tunable liquid crystal lenses for auto-stereoscopic display
NASA Astrophysics Data System (ADS)
Li, Kun; Robertson, Brian; Pivnenko, Mike; Chu, Daping; Zhou, Jiong; Yao, Jun
2014-02-01
Mobile video and gaming are now widely used, and delivery of a glass-free 3D experience is of both research and development interest. The key drawbacks of a conventional 3D display based on a static lenticular lenslet array and parallax barriers are low resolution, limited viewing angle and reduced brightness, mainly because of the need of multiple-pixels for each object point. This study describes the concept and performance of pixel-level cylindrical liquid crystal (LC) lenses, which are designed to steer light to the left and right eye sequentially to form stereo parallax. The width of the LC lenses can be as small as 20-30 μm, so that the associated auto-stereoscopic display will have the same resolution as the 2D display panel in use. Such a thin sheet of tunable LC lens array can be applied directly on existing mobile displays, and can deliver 3D viewing experience while maintaining 2D viewing capability. Transparent electrodes were laser patterned to achieve the single pixel lens resolution, and a high birefringent LC material was used to realise a large diffraction angle for a wide field of view. Simulation was carried out to model the intensity profile at the viewing plane and optimise the lens array based on the measured LC phase profile. The measured viewing angle and intensity profile were compared with the simulation results.
Site selection and directional models of deserts used for ERBE validation targets
NASA Technical Reports Server (NTRS)
Staylor, W. F.
1986-01-01
Broadband shortwave and longwave radiance measurements obtained from the Nimbus 7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara, Gibson, and Saudi Deserts. These deserts will serve as in-flight validation targets for the Earth Radiation Budget Experiment being flown on the Earth Radiation Budget Satellite and two National Oceanic and Atmospheric Administration polar satellites. The directional reflectance model derived for the deserts was a function of the sum and product of the cosines of the solar and viewing zenith angles, and thus reciprocity existed between these zenith angles. The emittance model was related by a power law of the cosine of the viewing zenith angle.
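The abstract gives only the functional form of the desert models, not the fitted constants, so the following sketch uses hypothetical coefficients: a reflectance that depends on the sum and product of the cosines of the solar and viewing zenith angles (which makes it reciprocal by construction), and an emittance that follows a power law of the cosine of the viewing zenith angle.

```python
import math

# Illustrative constants only; the abstract does not report the fitted values.
A, B, C = 0.30, 0.05, 0.10   # reflectance model coefficients (assumed)
E0, N = 0.93, 0.08           # emittance power-law constants (assumed)

def reflectance(sza_deg, vza_deg):
    """Directional reflectance from the sum and product of cosines of the
    solar and viewing zenith angles, as the abstract describes."""
    mu_s = math.cos(math.radians(sza_deg))
    mu_v = math.cos(math.radians(vza_deg))
    return A + B * (mu_s + mu_v) + C * (mu_s * mu_v)

def emittance(vza_deg):
    """Emittance as a power law of the cosine of the viewing zenith angle."""
    return E0 * math.cos(math.radians(vza_deg)) ** N

# Reciprocity: swapping solar and viewing zenith angles leaves the
# reflectance unchanged, since the model is symmetric in the two cosines.
assert abs(reflectance(30, 60) - reflectance(60, 30)) < 1e-12
```

Reciprocity between the two zenith angles is not an extra assumption here; it falls out of building the model from symmetric combinations of the cosines.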
2015-08-20
NASA's Cassini spacecraft captured this parting view showing the rough and icy crescent of Saturn's moon Dione following the spacecraft's last close flyby of the moon on Aug. 17, 2015. Cassini obtained a similar crescent view in 2005 (see PIA07745). The earlier view has an image scale about four times higher, but does not show the moon's full crescent as this view does. Five visible light (clear spectral filter), narrow-angle camera images were combined to create this mosaic view. The scene is an orthographic projection centered on terrain at 0.4 degrees north latitude, 30.6 degrees west longitude on Dione. An orthographic view is most like the view seen by a distant observer looking through a telescope. The view was acquired at distances ranging from approximately 37,000 miles (59,000 kilometers) to 47,000 miles (75,000 kilometers) from Dione and at a sun-Dione-spacecraft, or phase, angle of 145 degrees. Image scale is about 1,300 feet (400 meters) per pixel. North on Dione is up and rotated 34 degrees to the right. http://photojournal.jpl.nasa.gov/catalog/PIA19649
Development of a 3-D visible limiter imaging system for the HSX stellarator
NASA Astrophysics Data System (ADS)
Buelo, C.; Stephey, L.; Anderson, F. S. B.; Eisert, D.; Anderson, D. T.
2017-12-01
A visible camera diagnostic has been developed to study the Helically Symmetric eXperiment (HSX) limiter plasma interaction. A straight line view from the camera location to the limiter was not possible due to the complex 3D stellarator geometry of HSX, so it was necessary to insert a mirror/lens system into the plasma edge. A custom support structure for this optical system tailored to the HSX geometry was designed and installed. This system holds the optics tube assembly at the required angle for the desired view to both minimize system stress and facilitate robust and repeatable camera positioning. The camera system has been absolutely calibrated and using Hα and C-III filters can provide hydrogen and carbon photon fluxes, which through an S/XB coefficient can be converted into particle fluxes. The resulting measurements have been used to obtain the characteristic penetration length of hydrogen and C-III species. The hydrogen λiz value shows reasonable agreement with the value predicted by a 1D penetration length calculation.
Atmospheric Science Data Center
2013-04-16
... Gujarat), and in areas close to the earthquake epicenter. Research uses the unique capabilities of the Multi-angle Imaging ... Indo-Pakistani border, which were not easily accessible to survey teams on the ground. Changes in reflection at different view angles ...
The near infrared imaging system for the real-time protection of the JET ITER-like wall
NASA Astrophysics Data System (ADS)
Huber, A.; Kinna, D.; Huber, V.; Arnoux, G.; Balboa, I.; Balorin, C.; Carman, P.; Carvalho, P.; Collins, S.; Conway, N.; McCullen, P.; Jachmich, S.; Jouve, M.; Linsmeier, Ch; Lomanowski, B.; Lomas, P. J.; Lowry, C. G.; Maggi, C. F.; Matthews, G. F.; May-Smith, T.; Meigs, A.; Mertens, Ph; Nunes, I.; Price, M.; Puglia, P.; Riccardo, V.; Rimini, F. G.; Sergienko, G.; Tsalas, M.; Zastrow, K.-D.; JET Contributors
2017-12-01
This paper describes the design, implementation and operation of the near infrared (NIR) imaging diagnostic system of the JET ITER-like wall (JET-ILW) plasma experiment and its integration into the existing JET protection architecture. The imaging system comprises four wide-angle views, four tangential divertor views, and two top views of the divertor, covering 66% of the first wall and up to 43% of the divertor. The operating temperature ranges that must be observed by the NIR protection cameras for the materials used on JET are: Be, 700-1400 °C; W coating, 700-1370 °C; bulk W, 700-1400 °C. The Real-Time Protection system has operated routinely since 2011 and has successfully demonstrated its capability to avoid overheating of the main-chamber beryllium wall as well as of the divertor W and W-coated carbon fibre composite (CFC) tiles. During this period, less than 0.5% of the terminated discharges were aborted by a malfunction of the system. About 2%-3% of the discharges were terminated due to the detection of actual hot spots.
Stokes-Doppler coherence imaging for ITER boundary tomography.
Howard, J; Kocan, M; Lisgo, S; Reichle, R
2016-11-01
An optical coherence imaging system is presently being designed for impurity transport studies and other applications on ITER. The wide variation in magnetic field strength and pitch angle (assumed known) across the field of view generates additional Zeeman-polarization-weighting information that can improve the reliability of tomographic reconstructions. Because background reflected light will be somewhat depolarized, analysis of only the polarized fraction may be enough to provide a level of background suppression. We present the principles behind these ideas and some simulations that demonstrate how the approach might work on ITER. The views and opinions expressed herein do not necessarily reflect those of the ITER Organization.
Characterization of crosstalk in stereoscopic display devices.
Zafar, Fahad; Badano, Aldo
2014-12-01
Many different types of stereoscopic display devices are used for commercial and research applications. Stereoscopic displays offer the potential to improve performance in detection tasks for medical imaging diagnostic systems. Due to the variety of stereoscopic display technologies, it remains unclear how these compare with each other for detection and estimation tasks. Different stereo devices have different performance trade-offs due to their display characteristics. Among them, crosstalk is known to affect observer perception of 3D content and might affect detection performance. We measured and report the detailed luminance output and crosstalk characteristics of three different types of stereoscopic display devices. We also recorded the effects of viewing angle, eyewear type, and screen location on the measured luminance profiles. Our results show that the crosstalk signature for viewing 3D content can vary considerably when using different types of 3D glasses with active stereo displays. We also show that significant differences are present in crosstalk signatures when the viewing angle varies from 0 to 20 degrees for a stereo-mirror 3D display device. Our detailed characterization can help emulate the effect of crosstalk in computational observer image quality assessments that minimize costly and time-consuming human reader studies.
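A commonly used definition of stereoscopic crosstalk is the unintended leakage luminance into one eye, normalized by the intended signal, both corrected for the display's black level. The sketch below uses that standard definition with made-up measurement values; it is not taken from the paper.

```python
def crosstalk(leakage_lum, signal_lum, black_lum):
    """Stereoscopic crosstalk as commonly defined: luminance leaking into
    one eye channel, normalized by the intended signal, with both terms
    corrected for the display black level (all luminances in cd/m^2)."""
    return (leakage_lum - black_lum) / (signal_lum - black_lum)

# Hypothetical measurement: luminance seen by the left eye while only the
# right channel shows full white, on a display with a 0.4 cd/m^2 black level.
print(f"{crosstalk(leakage_lum=4.2, signal_lum=120.0, black_lum=0.4) * 100:.1f}%")
```

Measuring this quantity at several viewing angles and with each pair of glasses is exactly the kind of characterization the abstract describes.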
Projection moire for remote contour analysis
NASA Technical Reports Server (NTRS)
Doty, J. L.
1983-01-01
Remote projection and viewing of moire contours are examined analytically for a system employing separate projection and viewing optics, with specific attention paid to the practical limitations imposed by the optical systems. It is found that planar contours are possible only when the optics are telecentric (exit pupil at infinity) but that the requirement for spatial separability of the contour fringes from extraneous fringes is independent of the specific optics and is a function only of the angle separating the two optic axes. In the nontelecentric case, the contour separation near the object is unchanged from that of the telecentric case, although the contours are distorted into low-eccentricity (near-circular) ellipses. Furthermore, the minimum contour spacing is directly related to the depth of focus through the resolution of the optics.
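For the telecentric case the abstract identifies, a common textbook relation gives the spacing between adjacent moiré contours from the grating pitch and the two axis angles. The symbols below are assumptions for illustration, not the paper's notation.

```python
import math

# Contour interval for projection moire with telecentric optics:
#   delta_z = p / (tan(theta_p) + tan(theta_v))
# where p is the grating pitch projected onto the object and theta_p,
# theta_v are the projection and viewing axis angles from the surface
# normal (assumed symbols; a standard result, not the paper's derivation).
def contour_interval(pitch_mm, theta_p_deg, theta_v_deg):
    return pitch_mm / (math.tan(math.radians(theta_p_deg)) +
                       math.tan(math.radians(theta_v_deg)))

# 0.5 mm projected pitch with axes 30 degrees either side of the normal:
dz = contour_interval(0.5, 30, 30)   # contour spacing in mm
```

The dependence on only the angle between the two optic axes mirrors the paper's finding that fringe separability is a function of that angle alone.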
2013-12-23
The globe of Saturn, seen here in natural color, is reminiscent of a holiday ornament in this wide-angle view from NASA's Cassini spacecraft. The characteristic hexagonal shape of Saturn's northern jet stream, somewhat yellow here, is visible. At the pole lies a Saturnian version of a high-speed hurricane, eye and all. This view is centered on terrain at 75 degrees north latitude, 120 degrees west longitude. Images taken using red, green and blue spectral filters were combined to create this natural-color view. The images were taken with the Cassini spacecraft wide-angle camera on July 22, 2013. This view was acquired at a distance of approximately 611,000 miles (984,000 kilometers) from Saturn. Image scale is 51 miles (82 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA17175
Smartphone-Guided Needle Angle Selection During CT-Guided Procedures.
Xu, Sheng; Krishnasamy, Venkatesh; Levy, Elliot; Li, Ming; Tse, Zion Tsz Ho; Wood, Bradford John
2018-01-01
In CT-guided intervention, translation from a planned needle insertion angle to the actual insertion angle is estimated only with the physician's visuospatial abilities. An iPhone app was developed to reduce reliance on operator ability to estimate and reproduce angles. The iPhone app overlays the planned angle on the smartphone's camera display in real-time based on the smartphone's orientation. The needle's angle is selected by visually comparing the actual needle with the guideline in the display. If the smartphone's screen is perpendicular to the planned path, the smartphone shows the Bull's-Eye View mode, in which the angle is selected after the needle's hub overlaps the tip in the camera view. In phantom studies, we evaluated the accuracies of the hardware, the Guideline mode, and the Bull's-Eye View mode and showed the app's clinical efficacy. A proof-of-concept clinical case was also performed. The hardware accuracy was 0.37° ± 0.27° (mean ± SD). The mean error and navigation time were 1.0° ± 0.9° and 8.7 ± 2.3 seconds for a senior radiologist with 25 years' experience and 1.5° ± 1.3° and 8.0 ± 1.6 seconds for a junior radiologist with 4 years' experience. The accuracy of the Bull's-Eye View mode was 2.9° ± 1.1°. Combined CT and smartphone guidance was significantly more accurate than CT-only guidance for the first needle pass (p = 0.046), which led to a smaller final targeting error (mean distance from needle tip to target, 2.5 vs 7.9 mm). Mobile devices can be useful for guiding needle-based interventions. The hardware is low cost and widely available. The method is accurate, effective, and easy to implement.
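The abstract does not publish the app's internals, but a generic way any phone app can track its own tilt is to read the gravity vector from the accelerometer and take an angle against a device axis. This sketch is an assumption about how such orientation sensing commonly works, not the actual implementation.

```python
import math

# Generic tilt estimate from a gravity reading (ax, ay, az), in any
# consistent units: the angle of the device's y-axis above the
# horizontal plane. Hypothetical sketch, not the app's code.
def tilt_deg(ax, ay, az):
    return math.degrees(math.atan2(ay, math.hypot(ax, az)))

# Device lying flat (gravity entirely along z): tilt is 0 degrees.
# Device held upright (gravity entirely along y): tilt is 90 degrees.
flat = tilt_deg(0.0, 0.0, 9.81)
upright = tilt_deg(0.0, 9.81, 0.0)
```

Comparing such a live angle against the planned insertion angle, and drawing the difference as an overlay, is the core of the guideline-overlay idea described above.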
Denize, Erin Stewart; McDonald, Fraser; Sherriff, Martyn
2014-01-01
Objective To evaluate the relative importance of bilabial prominence in relation to other facial profile parameters in a normal population. Methods Profile stimulus images of 38 individuals (28 female and 10 male; ages 19-25 years) were shown to an unrelated group of first-year students (n = 42; ages 18-24 years). The images were individually viewed on a 17-inch monitor. The observers received standardized instructions before viewing. A six-question questionnaire was completed using a Likert-type scale. The responses were analyzed by ordered logistic regression to identify associations between profile characteristics and observer preferences. The Bayesian Information Criterion was used to select variables that explained observer preferences most accurately. Results Nasal, bilabial, and chin prominences; the nasofrontal angle; and lip curls had the greatest effect on overall profile attractiveness perceptions. The lip-chin-throat angle and upper lip curl had the greatest effect on forehead prominence perceptions. The bilabial prominence, nasolabial angle (particularly the lower component), and mentolabial angle had the greatest effect on nasal prominence perceptions. The bilabial prominence, nasolabial angle, chin prominence, and submental length had the greatest effect on lip prominence perceptions. The bilabial prominence, nasolabial angle, mentolabial angle, and submental length had the greatest effect on chin prominence perceptions. Conclusions More prominent lips, within normal limits, may be considered more attractive in the profile view. Profile parameters have a greater influence on their neighboring aesthetic units but indirectly influence related profile parameters, endorsing the importance of achieving an aesthetic balance between relative prominences of all aesthetic units of the facial profile. PMID:25133133
Triangulation-based 3D surveying borescope
NASA Astrophysics Data System (ADS)
Pulwer, S.; Steglich, P.; Villringer, C.; Bauer, J.; Burger, M.; Franz, M.; Grieshober, K.; Wirth, F.; Blondeau, J.; Rautenberg, J.; Mouti, S.; Schrader, S.
2016-04-01
In this work, a measurement concept based on triangulation was developed for borescopic 3D surveying of surface defects. The integration of such a measurement system into a borescope environment requires excellent space utilization. The triangulation angle, the projected pattern, the numerical apertures of the optical system, and the viewing angle were calculated using partial coherence imaging and geometric optical raytracing methods. Additionally, optical aberrations and defocus were considered by the integration of Zernike polynomial coefficients. The measurement system is able to measure objects with a size of 50 μm in all dimensions with an accuracy of ±5 μm. To manage the issue of a low depth of field while using a high-resolution optical system, a wavelength-dependent aperture was integrated. Thereby, we are able to control the depth of field and resolution of the optical system and can use the borescope in measurement mode, with high resolution and low depth of field, or in inspection mode, with lower resolution and higher depth of field. First measurements of a demonstrator system are in good agreement with our simulations.
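The basic geometry behind any such triangulation sensor can be stated in one line: a surface-height change displaces the imaged spot laterally in proportion to the tangent of the triangulation angle. The relation and numbers below are a generic illustration, not the paper's calibrated optics.

```python
import math

# Basic laser-triangulation relation (generic sketch): a height change h
# shifts the imaged spot laterally by d = h * tan(theta) for triangulation
# angle theta, so h = d / tan(theta). Magnification is ignored here.
def height_from_shift(shift_um, triangulation_angle_deg):
    return shift_um / math.tan(math.radians(triangulation_angle_deg))

# A 5 um lateral spot shift observed at an assumed 30-degree angle:
h = height_from_shift(5.0, 30.0)   # surface height change, in um
```

The trade-off is visible directly in the formula: a larger triangulation angle gives more lateral shift per micrometer of height (better sensitivity) but is harder to fit inside a borescope's tight envelope.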
The solid angle hidden in polyhedron gravitation formulations
NASA Astrophysics Data System (ADS)
Werner, Robert A.
2017-03-01
Formulas of a homogeneous polyhedron's gravitational potential typically include two arctangent terms for every edge of every face and a special term to eliminate a possible facial singularity. However, the arctangent and singularity terms are equivalent to the face's solid angle viewed from the field point. A face's solid angle can be evaluated with a single arctangent, saving computation.
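The single-arctangent evaluation the abstract refers to can be illustrated with the Van Oosterom-Strackee formula for the solid angle of a triangle seen from the origin; a polygonal face is then fanned into triangles, each costing one arctangent.

```python
import math

# Solid angle of a triangle with vertex position vectors r1, r2, r3
# (relative to the field point at the origin), evaluated with a single
# arctangent via the Van Oosterom-Strackee formula.
def tri_solid_angle(r1, r2, r3):
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    # scalar triple product r1 . (r2 x r3)
    num = (r1[0] * (r2[1] * r3[2] - r2[2] * r3[1])
         - r1[1] * (r2[0] * r3[2] - r2[2] * r3[0])
         + r1[2] * (r2[0] * r3[1] - r2[1] * r3[0]))
    n1 = math.sqrt(dot(r1, r1))
    n2 = math.sqrt(dot(r2, r2))
    n3 = math.sqrt(dot(r3, r3))
    den = (n1 * n2 * n3 + dot(r1, r2) * n3
           + dot(r1, r3) * n2 + dot(r2, r3) * n1)
    return 2.0 * math.atan2(num, den)

# One octant of the unit sphere subtends 4*pi/8 = pi/2 steradians:
omega = tri_solid_angle((1, 0, 0), (0, 1, 0), (0, 0, 1))
```

Compared with the usual two arctangents per edge plus a singularity patch, this is exactly the kind of per-face saving the abstract describes.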
1986-08-01
PREDICTION OF THE AERODYNAMIC CHARACTERISTICS OF CRUCIFORM MISSILES INCLUDING EFFECTS OF ROLL ANGLE AND CONTROL DEFLECTION, by Daniel J. Lesieutre, Michael R. Mendenhall, and Susana M. Nazario. Nielsen Engineering & Research, Inc., Mountain View, CA 94043.
View Angle Effects on MODIS Snow Mapping in Forests
NASA Technical Reports Server (NTRS)
Xin, Qinchuan; Woodcock, Curtis E.; Liu, Jicheng; Tan, Bin; Melloh, Rae A.; Davis, Robert E.
2012-01-01
Binary snow maps and fractional snow cover data are provided routinely from MODIS (Moderate Resolution Imaging Spectroradiometer). This paper investigates how the wide observation angles of MODIS influence the current snow mapping algorithm in forested areas. Theoretical modeling results indicate that large view zenith angles (VZA) can lead to underestimation of fractional snow cover (FSC) by reducing the amount of the ground surface that is viewable through forest canopies, and by increasing uncertainties during the gridding of MODIS data. At the end of the MODIS scan line, the total modeled error can be as much as 50% for FSC. Empirical analysis of MODIS/Terra snow products in four forest sites shows high fluctuation in FSC estimates on consecutive days. In addition, the normalized difference snow index (NDSI) values, which are the primary input to the MODIS snow mapping algorithms, decrease as VZA increases at the site level. At the pixel level, NDSI values have higher variances, and are correlated with the normalized difference vegetation index (NDVI) in snow covered forests. These findings are consistent with our modeled results, and imply that consideration of view angle effects could improve MODIS snow monitoring in forested areas.
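Two pieces of the analysis above lend themselves to a short sketch: the NDSI (for MODIS, computed from the green and shortwave-infrared bands, conventionally bands 4 and 6, with 0.4 as the classic snow threshold) and the geometric effect that off-nadir views see less ground through a canopy because the path length grows with view zenith angle. The simple Beer's-law gap model and its extinction coefficient here are illustrative assumptions, not the paper's canopy model.

```python
import math

def ndsi(green, swir):
    """Normalized difference snow index from green and SWIR reflectance."""
    return (green - swir) / (green + swir)

def gap_fraction(lai, vza_deg, k=0.5):
    """Viewable ground fraction through a canopy: a simple Beer's-law gap
    model with assumed extinction coefficient k; the slant path through
    the canopy scales as 1/cos(view zenith angle)."""
    return math.exp(-k * lai / math.cos(math.radians(vza_deg)))

# Bright green, dark SWIR reflectance is characteristic of snow:
snow = ndsi(green=0.7, swir=0.1) > 0.4   # NDSI = 0.75, above the threshold
# Off-nadir views see less ground through the same canopy:
# gap_fraction(2, 60) < gap_fraction(2, 0)
```

The shrinking gap fraction at large view zenith angles is the mechanism by which wide MODIS scan angles can push fractional snow cover estimates downward in forests.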
Astronomy in Denver: Polarization of bow shock nebulae around massive stars
NASA Astrophysics Data System (ADS)
Shrestha, Manisha; Hoffman, Jennifer L.; Ignace, Richard; Neilson, Hilding; Richard Ignace
2018-06-01
Stellar wind bow shocks are structures created when stellar winds with supersonic relative velocities interact with the local interstellar medium (ISM). They can be studied to understand the properties of stars as well as the ISM. Since bow shocks are asymmetric, light becomes polarized by scattering in the regions of enhanced density they create. We use a Monte Carlo radiative transfer code called SLIP to simulate the polarization signatures produced by both resolved and unresolved bow shocks with analytically derived shapes and density structures. When electron scattering is the polarizing mechanism, we find that optical depth plays an important role in the polarization signatures. While results for low optical depths reproduce theoretical predictions, higher optical depths produce higher polarization and position angle rotations at specific viewing angles. This is due to the geometrical properties of the bow shock along with multiple scattering effects. For dust scattering, we find that the polarization signature is strongly affected by wavelength, dust size, dust composition, and viewing angle. Depending on the viewing angle, the polarization magnitude may increase or decrease as a function of wavelength. We will present results from these simulations and preliminary comparisons with observational data.
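The "polarization and position angle" quantities discussed above are conventionally derived from the Stokes parameters; the sketch below shows the standard definitions, with made-up Stokes values for illustration.

```python
import math

# Degree of polarization and position angle from Stokes parameters
# (standard definitions; the numbers below are illustrative only).
def pol_degree(i, q, u):
    """Fractional linear polarization P = sqrt(Q^2 + U^2) / I."""
    return math.hypot(q, u) / i

def position_angle_deg(q, u):
    """Polarization position angle chi = 0.5 * atan2(U, Q), in degrees."""
    return 0.5 * math.degrees(math.atan2(u, q))

p = pol_degree(1.0, 0.01, 0.02)       # about 2.2% polarized
chi = position_angle_deg(0.01, 0.02)  # position angle near 32 degrees
```

A "position angle rotation" with viewing angle, as reported for the higher optical depths, corresponds to the Q and U contributions shifting relative to one another as geometry and multiple scattering change.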
Guidance Of A Mobile Robot Using An Omnidirectional Vision Navigation System
NASA Astrophysics Data System (ADS)
Oh, Sung J.; Hall, Ernest L.
1987-01-01
Navigation and visual guidance are key topics in the design of a mobile robot. Omnidirectional vision using a very wide angle or fisheye lens provides a hemispherical view at a single instant that permits target location without mechanical scanning. The inherent image distortion with this view and the numerical errors accumulated from vision components can be corrected to provide accurate position determination for navigation and path control. The purpose of this paper is to present the experimental results and analyses of the imaging characteristics of the omnivision system including the design of robot-oriented experiments and the calibration of raw results. Errors less than one picture element on each axis were observed by testing the accuracy and repeatability of the experimental setup and the alignment between the robot and the sensor. Similar results were obtained for four different locations using corrected results of the linearity test between zenith angle and image location. Angular error of less than one degree and radial error of less than one Y picture element were observed at moderate relative speed. The significance of this work is that the experimental information and the test of coordinated operation of the equipment provide a greater understanding of the dynamic omnivision system characteristics, as well as insight into the evaluation and improvement of the prototype sensor for a mobile robot. Also, the calibration of the sensor is important, since the results provide a cornerstone for future developments. This sensor system is currently being developed for a robot lawn mower.
Video Mosaicking for Inspection of Gas Pipelines
NASA Technical Reports Server (NTRS)
Magruder, Darby; Chien, Chiun-Hong
2005-01-01
A vision system that includes a specially designed video camera and an image-data-processing computer is under development as a prototype of robotic systems for visual inspection of the interior surfaces of pipes and especially of gas pipelines. The system is capable of providing both forward views and mosaicked radial views that can be displayed in real time or after inspection. To avoid the complexities associated with moving parts and to provide simultaneous forward and radial views, the video camera is equipped with a wide-angle (>165°) fish-eye lens aimed along the axis of a pipe to be inspected. Nine white-light-emitting diodes (LEDs) placed just outside the field of view of the lens (see Figure 1) provide ample diffuse illumination for a high-contrast image of the interior pipe wall. The video camera contains a 2/3-in. (1.7-cm) charge-coupled-device (CCD) photodetector array and functions according to the National Television Standards Committee (NTSC) standard. The video output of the camera is sent to an off-the-shelf video capture board (frame grabber) by use of a peripheral component interconnect (PCI) interface in the computer, which is of the 400-MHz, Pentium II (or equivalent) class. Prior video-mosaicking techniques are applicable to narrow-field-of-view (low-distortion) images of evenly illuminated, relatively flat surfaces viewed along approximately perpendicular lines by cameras that do not rotate and that move approximately parallel to the viewed surfaces. One such technique for real-time creation of mosaic images of the ocean floor involves the use of visual correspondences based on area correlation, during both the acquisition of separate images of adjacent areas and the consolidation (equivalently, integration) of the separate images into a mosaic image, in order to ensure that there are no gaps in the mosaic image.
The data-processing technique used for mosaicking in the present system also involves area correlation, but with several notable differences: Because the wide-angle lens introduces considerable distortion, the image data must be processed to effectively unwarp the images (see Figure 2). The computer executes special software that includes an unwarping algorithm that takes explicit account of the cylindrical pipe geometry. To reduce the processing time needed for unwarping, parameters of the geometric mapping between the circular view of a fisheye lens and pipe wall are determined in advance from calibration images and compiled into an electronic lookup table. The software incorporates the assumption that the optical axis of the camera is parallel (rather than perpendicular) to the direction of motion of the camera. The software also compensates for the decrease in illumination with distance from the ring of LEDs.
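The precomputed-lookup-table idea described above can be sketched as follows: for each pixel of the unwarped pipe-wall image (azimuth around the pipe by axial distance along it), store the source coordinate in the fisheye image. An equidistant fisheye model (image radius proportional to off-axis angle) and a fixed pipe radius are assumed here for illustration; the real system determines this mapping from calibration images.

```python
import math

# Build a lookup table mapping each output pixel of the unwarped pipe-wall
# image to a source (x, y) coordinate in the fisheye image. Assumptions:
# equidistant fisheye projection r = f * theta, camera axis on the pipe
# axis, and known pipe radius; all parameters below are hypothetical.
def build_lut(out_w, out_h, pipe_radius, f, cx, cy, z0=10.0, dz=1.0):
    lut = []
    for row in range(out_h):                      # axial position along pipe
        z = z0 + row * dz
        for col in range(out_w):                  # azimuth around pipe wall
            phi = 2 * math.pi * col / out_w
            theta = math.atan2(pipe_radius, z)    # off-axis angle of wall point
            r = f * theta                         # equidistant projection
            lut.append((cx + r * math.cos(phi), cy + r * math.sin(phi)))
    return lut  # one (x, y) source coordinate per output pixel

lut = build_lut(out_w=360, out_h=100, pipe_radius=5.0, f=200.0, cx=320, cy=240)
```

At runtime, unwarping is then a single table lookup (plus interpolation) per output pixel, which is the speed advantage the text attributes to compiling the geometric mapping in advance.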
Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing
NASA Technical Reports Server (NTRS)
Pitone, D. S.; Klein, J. R.
1989-01-01
Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing
NASA Technical Reports Server (NTRS)
Pitone, D. S.; Klein, J. R.; Twambly, B. J.
1990-01-01
Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
Multi-Angle View of the Canary Islands
NASA Technical Reports Server (NTRS)
2000-01-01
A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
WindCam and MSPI: two cloud and aerosol instrument concepts derived from Terra/MISR heritage
NASA Astrophysics Data System (ADS)
Diner, David J.; Mischna, Michael; Chipman, Russell A.; Davis, Ab; Cairns, Brian; Davies, Roger; Kahn, Ralph A.; Muller, Jan-Peter; Torres, Omar
2008-08-01
The Multi-angle Imaging SpectroRadiometer (MISR) has been acquiring global cloud and aerosol data from polar orbit since February 2000. MISR acquires moderately high-resolution imagery at nine view angles from nadir to 70.5°, in four visible/near-infrared spectral bands. Stereoscopic parallax, time lapse among the nine views, and the variation of radiance with angle and wavelength enable retrieval of geometric cloud and aerosol plume heights, height-resolved cloud-tracked winds, and aerosol optical depth and particle property information. Two instrument concepts based upon MISR heritage are in development. The Cloud Motion Vector Camera, or WindCam, is a simplified version comprised of a lightweight, compact, wide-angle camera to acquire multiangle stereo imagery at a single visible wavelength. A constellation of three WindCam instruments in polar Earth orbit would obtain height-resolved cloud-motion winds with daily global coverage, making it a low-cost complement to a spaceborne lidar wind measurement system. The Multiangle SpectroPolarimetric Imager (MSPI) is aimed at aerosol and cloud microphysical properties, and is a candidate for the National Research Council Decadal Survey's Aerosol-Cloud-Ecosystem (ACE) mission. MSPI combines the capabilities of MISR with those of other aerosol sensors, extending the spectral coverage to the ultraviolet and shortwave infrared and incorporating high-accuracy polarimetric imaging. Based on requirements for the nonimaging Aerosol Polarimeter Sensor on NASA's Glory mission, a degree of linear polarization uncertainty of 0.5% is specified within a subset of the MSPI bands. We are developing a polarization imaging approach using photoelastic modulators (PEMs) to accomplish this objective.
Atmospheric Science Data Center
2014-05-15
Multi-angle views of the Appalachian Mountains, March 6, 2000. Atmospheric Science Data Center in Hampton, VA. Photo credit: NASA/GSFC/LaRC/JPL, MISR Science Team
Eyjafjallajökull Ash Plume Particle Properties
2010-04-21
As NASA Terra satellite flew over Iceland erupting Eyjafjallajökull volcano, its Multi-angle Imaging SpectroRadiometer instrument acquired 36 near-simultaneous images of the ash plume, covering nine view angles in each of four wavelengths.
Three paths toward the quantum angle operator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gazeau, Jean Pierre, E-mail: gazeau@apc.univ-paris7.fr; Szafraniec, Franciszek Hugon, E-mail: franciszek.szafraniec@uj.edu.pl
2016-12-15
We examine mathematical questions around the angle (or phase) operator associated with a number operator through a short list of basic requirements. We implement three methods of construction of the quantum angle. The first one is based on operator theory and parallels the definition of the angle for the upper half-circle through its cosine, completed by a sign inversion. The two other methods are integral quantizations generalizing in a certain sense the Berezin–Klauder approaches. One method pertains to Weyl–Heisenberg integral quantization of the plane viewed as the phase space of the motion on the line; it depends on a family of "weight" functions on the plane. The third method rests upon coherent state quantization of the cylinder viewed as the phase space of the motion on the circle. The construction of these coherent states depends on a family of probability distributions on the line.
Reproducing the hierarchy of disorder for Morpho-inspired, broad-angle color reflection
NASA Astrophysics Data System (ADS)
Song, Bokwang; Johansen, Villads Egede; Sigmund, Ole; Shin, Jung H.
2017-04-01
The scales of Morpho butterflies are covered with intricate, hierarchical ridge structures that produce a bright, blue reflection that remains stable across wide viewing angles. This effect has been researched extensively, and much understanding has been achieved using modeling that has focused on the positional disorder among the identical, multilayered ridges as the critical factor for producing angle-independent color. Realizing such positional disorder of identical nanostructures is difficult, which in turn has limited experimental verification of different physical mechanisms that have been proposed. In this paper, we suggest an alternative model of inter-structural disorder that can achieve the same broad-angle color reflection, and is applicable to wafer-scale fabrication using conventional thin film technologies. Fabrication of a thin film that produces pure, stable blue across a viewing angle of more than 120° is demonstrated, together with a robust, conformal color coating.
Shuttle imaging radar views the Earth from Challenger: The SIR-B experiment
NASA Technical Reports Server (NTRS)
Ford, J. P.; Cimino, J. B.; Holt, B.; Ruzek, M. R.
1986-01-01
In October 1984, SIR-B obtained digital image data of about 6.5 million km2 of the Earth's surface. The coverage is mostly of selected experimental test sites located between latitudes 60 deg north and 60 deg south. Programmed adjustments made to the look angle of the steerable radar antenna and to the flight attitude of the shuttle during the mission permitted collection of multiple-incidence-angle coverage or extended mapping coverage as required for the experiments. The SIR-B images included here are representative of the coverage obtained for scientific studies in geology, cartography, hydrology, vegetation cover, and oceanography. The relations between radar backscatter and incidence angle for discriminating various types of surfaces, and the use of multiple-incidence-angle SIR-B images for stereo measurement and viewing, are illustrated with examples. Interpretation of the images is facilitated by corresponding images or photographs obtained by different sensors or by sketch maps or diagrams.
Estimation of canopy carotenoid content of winter wheat using multi-angle hyperspectral data
NASA Astrophysics Data System (ADS)
Kong, Weiping; Huang, Wenjiang; Liu, Jiangui; Chen, Pengfei; Qin, Qiming; Ye, Huichun; Peng, Dailiang; Dong, Yingying; Mortimer, A. Hugh
2017-11-01
Precise estimation of carotenoid (Car) content in crops, using remote sensing data, could be helpful for agricultural resources management. Conventional methods for Car content estimation were mostly based on reflectance data acquired from nadir direction. However, reflectance acquired at this direction is highly influenced by canopy structure and soil background reflectance. Off-nadir observation is less impacted, and multi-angle viewing data are proven to contain additional information rarely exploited for crop Car content estimation. The objective of this study was to explore the potential of multi-angle observation data for winter wheat canopy Car content estimation. Canopy spectral reflectance was measured from nadir as well as from a series of off-nadir directions during different growing stages of winter wheat, with concurrent canopy Car content measurements. Correlation analyses were performed between Car content and the original and continuum removed spectral reflectance. Spectral features and previously published indices were derived from data obtained at different viewing angles and were tested for Car content estimation. Results showed that spectral features and indices obtained from backscattering directions between 20° and 40° view zenith angle had a stronger correlation with Car content than that from the nadir direction, and the strongest correlation was observed from about 30° backscattering direction. Spectral absorption depth at 500 nm derived from spectral data obtained from 30° backscattering direction was found to reduce the difference induced by plant cultivars greatly. It was the most suitable for winter wheat canopy Car estimation, with a coefficient of determination 0.79 and a root mean square error of 19.03 mg/m2. This work indicates the importance of taking viewing geometry effect into account when using spectral features/indices and provides new insight into the application of multi-angle remote sensing for the estimation of crop physiology.
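The "spectral absorption depth at 500 nm" used above is a continuum-removal measure. A minimal sketch of how such a band depth is computed follows; the shoulder wavelengths (460 and 560 nm) and the synthetic spectrum are illustrative assumptions, not the authors' exact definition:

```python
import numpy as np

def band_depth(wl, refl, left, right, center):
    """Continuum-removed absorption depth at `center`: fit a straight-line
    continuum between two shoulder wavelengths, then
    depth = 1 - R(center) / R_continuum(center)."""
    r_left = np.interp(left, wl, refl)
    r_right = np.interp(right, wl, refl)
    r_center = np.interp(center, wl, refl)
    # linear continuum evaluated at the band center
    rc = r_left + (r_right - r_left) * (center - left) / (right - left)
    return 1.0 - r_center / rc

# synthetic canopy reflectance with an absorption feature near 500 nm
wl = np.arange(400, 700, 5, dtype=float)
refl = 0.08 + 0.0004 * (wl - 400) - 0.03 * np.exp(-((wl - 500) / 25) ** 2)
print(round(band_depth(wl, refl, 460.0, 560.0, 500.0), 3))
```

A spectrum with no absorption feature (purely linear reflectance) gives a depth of zero, which is what makes the measure insensitive to overall brightness differences between cultivars.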
X-Ray Computed Tomography Monitors Damage in Composites
NASA Technical Reports Server (NTRS)
Baaklini, George Y.
1997-01-01
The NASA Lewis Research Center recently codeveloped a state-of-the-art x-ray CT facility (designated SMS SMARTSCAN model 100-112 CITA by Scientific Measurement Systems, Inc., Austin, Texas). This multipurpose, modularized, digital x-ray facility includes an imaging system for digital radiography, CT, and computed laminography. The system consists of a 160-kV microfocus x-ray source, a solid-state charge-coupled device (CCD) area detector, a five-axis object-positioning subassembly, and a Sun SPARCstation-based computer system that controls data acquisition and image processing. The x-ray source provides a beam spot size down to 3 microns. The area detector system consists of a 50- by 50- by 3-mm-thick terbium-doped glass fiber-optic scintillation screen, a right-angle mirror, and a scientific-grade, digital CCD camera with a resolution of 1000 by 1018 pixels and 10-bit digitization at ambient cooling. The digital output is recorded with a high-speed, 16-bit frame grabber that allows data to be binned. The detector can be configured to provide a small field-of-view, approximately 45 by 45 mm in cross section, or a larger field-of-view, approximately 60 by 60 mm in cross section. Whenever the highest spatial resolution is desired, the small field-of-view is used, and for larger samples with some reduction in spatial resolution, the larger field-of-view is used.
Minimum viewing angle for visually guided ground speed control in bumblebees.
Baird, Emily; Kornfeldt, Torill; Dacke, Marie
2010-05-01
To control flight, flying insects extract information from the pattern of visual motion generated during flight, known as optic flow. To regulate their ground speed, insects such as honeybees and Drosophila hold the rate of optic flow in the axial direction (front-to-back) constant. A consequence of this strategy is that its performance varies with the minimum viewing angle (the deviation from the frontal direction of the longitudinal axis of the insect) at which changes in axial optic flow are detected. The greater this angle, the later changes in the rate of optic flow, caused by changes in the density of the environment, will be detected. The aim of the present study is to examine the mechanisms of ground speed control in bumblebees and to identify the extent of the visual range over which optic flow for ground speed control is measured. Bumblebees were trained to fly through an experimental tunnel consisting of parallel vertical walls. Flights were recorded when (1) the distance between the tunnel walls was either 15 or 30 cm, (2) the visual texture on the tunnel walls provided either strong or weak optic flow cues and (3) the distance between the walls changed abruptly halfway along the tunnel's length. The results reveal that bumblebees regulate ground speed using optic flow cues and that changes in the rate of optic flow are detected at a minimum viewing angle of 23-30 deg., with a visual field that extends to approximately 155 deg. By measuring optic flow over a visual field that has a low minimum viewing angle, bumblebees are able to detect and respond to changes in the proximity of the environment well before they are encountered.
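The control strategy the study tests can be sketched with the standard optic-flow geometry for a textured wall: a feature seen at viewing angle θ from the flight direction, on a wall at lateral distance d, sweeps across the eye at rate ω = v·sin²θ/d. Holding ω at a set-point therefore makes ground speed proportional to wall distance. The tunnel widths and the angle below are illustrative numbers, not the paper's data:

```python
import math

def axial_flow_rate(v, d, theta_deg):
    # angular velocity (rad/s) of a wall feature seen at viewing angle
    # theta from the flight direction, for a wall at lateral distance d
    th = math.radians(theta_deg)
    return v * math.sin(th) ** 2 / d

def speed_for_setpoint(omega_set, d, theta_deg):
    # invert the flow equation: the ground speed that holds the axial
    # optic flow rate at omega_set for the current wall distance
    th = math.radians(theta_deg)
    return omega_set * d / math.sin(th) ** 2

theta = 25.0                                   # within the 23-30 deg estimate
omega_set = axial_flow_rate(0.5, 0.15, theta)  # set-point "learned" with walls 15 cm away
v_wide = speed_for_setpoint(omega_set, 0.15, theta)
v_narrow = speed_for_setpoint(omega_set, 0.075, theta)  # tunnel width halved
print(v_wide, v_narrow)  # flight speed halves when the tunnel narrows
```

The larger the minimum viewing angle, the further into the narrow section a feature must be before its flow is measured, which is why the paper's 23-30 deg. estimate implies early detection of environmental changes.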
EOSDIS Terra Data Sampler #1: Western US Wildfires 2000. 1.1
NASA Technical Reports Server (NTRS)
Perkins, Dorothy C. (Technical Monitor)
2000-01-01
This CD-ROM contains sample data in HDF-EOS format from the instruments on board the Earth Observing System (EOS) Terra satellite: (1) Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER); (2) Clouds and the Earth's Radiant Energy System (CERES); (3) Multi-angle Imaging Spectroradiometer (MISR); and (4) Moderate Resolution Imaging Spectroradiometer (MODIS). Data from the Measurements of Pollution in the Troposphere (MOPITT) instrument were not available for distribution (as of October 17, 2000). The remotely sensed, coincident data for the Western US wildfires were acquired August 30, 2000. This CD-ROM provides information about the Terra mission, instruments, data, and viewing tools. It also provides the Collage tool for viewing data, and links to Web sites containing other digital data processing software. Full granules of the data on this CD-ROM and other EOS Data and Information System (EOSDIS) data products are available from the NASA Distributed Active Archive Centers (DAACs).
Fundamental and practical limits of planar tracking solar concentrators.
Grede, Alex J; Price, Jared S; Giebink, Noel C
2016-12-26
Planar microtracking provides an alternate paradigm for solar concentration that offers the possibility of realizing high-efficiency embedded concentrating photovoltaic systems in the form factor of standard photovoltaic panels. Here, we investigate the thermodynamic limit of planar tracking optical concentrators and establish that they can, in principle, achieve the sine limit of their orientationally-tracked counterparts provided that the receiver translates a minimum distance set by the field of view half-angle. We develop a phase space methodology to optimize practical planar tracking concentrators and apply it to the design of a two-surface, catadioptric system that operates with > 90% optical efficiency over a 140° field of view at geometric gains exceeding 1000×. These results provide a reference point for subsequent developments in the field and indicate that planar microtracking can achieve the high optical concentration ratio required in commercial concentrating photovoltaic systems.
Hand biometric recognition based on fused hand geometry and vascular patterns.
Park, GiTae; Kim, Soowon
2013-02-28
A hand biometric authentication method based on measurements of the user's hand geometry and vascular pattern is proposed. To acquire the hand geometry, the thickness of the side view of the hand, the K-curvature with a hand-shaped chain code, the lengths and angles of the finger valleys, and the lengths and profiles of the fingers were used; for the vascular pattern, the direction-based vascular-pattern extraction method was used. Thus, a new multimodal biometric approach is proposed. The proposed multimodal biometric system uses only one image to extract the feature points, so it can be configured for low-cost devices. Our multimodal approach fuses hand geometry (from the side view and the back of the hand) and vascular-pattern recognition at the score level. The results of our study showed that the equal error rate of the proposed system was 0.06%.
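The K-curvature mentioned above is a standard contour measure: at each contour point, the angle between the vectors to the points k steps behind and k steps ahead, with sharp features (finger tips, valleys) showing up as small angles. A minimal sketch follows; the square toy contour and the value of k are illustrative, not the authors' hand data:

```python
import math

def k_curvature(contour, k):
    """Angle (degrees) at each point of a closed contour between the
    vectors to the points k steps behind and k steps ahead."""
    n = len(contour)
    angles = []
    for i in range(n):
        ax, ay = contour[i - k]            # point k steps behind (wraps)
        bx, by = contour[i]
        cx, cy = contour[(i + k) % n]      # point k steps ahead (wraps)
        v1 = (ax - bx, ay - by)
        v2 = (cx - bx, cy - by)
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        cos_a = max(-1.0, min(1.0, dot / norm))  # clamp for acos
        angles.append(math.degrees(math.acos(cos_a)))
    return angles

# toy closed contour: a square; corners give 90 deg, straight edges 180 deg
square = [(x, 0) for x in range(10)] + [(9, y) for y in range(1, 10)] + \
         [(x, 9) for x in range(8, -1, -1)] + [(0, y) for y in range(8, 0, -1)]
angles = k_curvature(square, 3)
print(min(angles))  # 90.0 at the corners (right angles)
```

On a hand silhouette, thresholding these angles and separating convex from concave points is one common way to locate finger tips and finger valleys.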
Digital Astronaut Photography: A Discovery Dataset for Archaeology
NASA Technical Reports Server (NTRS)
Stefanov, William L.
2010-01-01
Astronaut photography acquired from the International Space Station (ISS) using commercial off-the-shelf cameras offers a freely-accessible source for high to very high resolution (4-20 m/pixel) visible-wavelength digital data of Earth. Since ISS Expedition 1 in 2000, over 373,000 images of the Earth-Moon system (including land surface, ocean, atmospheric, and lunar images) have been added to the Gateway to Astronaut Photography of Earth online database (http://eol.jsc.nasa.gov ). Handheld astronaut photographs vary in look angle, time of acquisition, solar illumination, and spatial resolution. These attributes of digital astronaut photography result from a unique combination of ISS orbital dynamics, mission operations, camera systems, and the individual skills of the astronaut. The variable nature of astronaut photography makes the dataset uniquely useful for archaeological applications in comparison with more traditional nadir-viewing multispectral datasets acquired from unmanned orbital platforms. For example, surface features such as trenches, walls, ruins, urban patterns, and vegetation clearing and regrowth patterns may be accentuated by low sun angles and oblique viewing conditions (Fig. 1). High spatial resolution digital astronaut photographs can also be used with sophisticated land cover classification and spatial analysis approaches like Object Based Image Analysis, increasing the potential for use in archaeological characterization of landscapes and specific sites.
Laser interferometric high-precision angle monitor for JASMINE
NASA Astrophysics Data System (ADS)
Niwa, Yoshito; Arai, Koji; Sakagami, Masaaki; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki; Yano, Taihei
2006-06-01
The JASMINE instrument uses a beam combiner to observe two different fields of view separated by 99.5 degrees simultaneously. This angle is the so-called basic angle. The basic angle of JASMINE should be stabilized, and its fluctuations should be monitored with an accuracy of 10 microarcsec in root-mean-square over the satellite revolution period of 5 hours. For this purpose, a high-precision interferometric laser metrology system is employed. One of the available techniques for measuring the fluctuations of the basic angle is a method known as wave front sensing, using a Fabry-Perot type laser interferometer. This technique detects fluctuations of the basic angle as displacement of the optical axis in the Fabry-Perot cavity. One advantage of the technique is that the sensor is sensitive only to the relative fluctuations of the basic angle that JASMINE needs to measure and insensitive to common fluctuations; to enhance the optical axis displacement caused by relative motion, the Fabry-Perot cavity is formed by two mirrors with a long radius of curvature. To verify this principle, an experiment was performed using a 0.1 m-length Fabry-Perot cavity with a mirror curvature of 20 m. The mirrors of the cavity were artificially actuated in either a relative or a common way, and the resultant outputs from the sensor were compared.
High-speed large angle mammography tomosynthesis system
NASA Astrophysics Data System (ADS)
Eberhard, Jeffrey W.; Staudinger, Paul; Smolenski, Joe; Ding, Jason; Schmitz, Andrea; McCoy, Julie; Rumsey, Michael; Al-Khalidy, Abdulrahman; Ross, William; Landberg, Cynthia E.; Claus, Bernhard E. H.; Carson, Paul; Goodsitt, Mitchell; Chan, Heang-Ping; Roubidoux, Marilyn; Thomas, Jerry A.; Osland, Jacqueline
2006-03-01
A new mammography tomosynthesis prototype system that acquires 21 projection images over a 60 degree angular range in approximately 8 seconds has been developed and characterized. Fast imaging sequences are facilitated by a high power tube and generator for faster delivery of the x-ray exposure and a high speed detector read-out. An enhanced a-Si/CsI flat panel digital detector provides greater DQE at low exposure, enabling tomo image sequence acquisitions at total patient dose levels between 150% and 200% of the dose of a standard mammographic view. For clinical scenarios where a single MLO tomographic acquisition per breast may replace the standard CC and MLO views, total tomosynthesis breast dose is comparable to or below the dose in standard mammography. The system supports co-registered acquisition of x-ray tomosynthesis and 3-D ultrasound data sets by incorporating an ultrasound transducer scanning system that flips into position above the compression paddle for the ultrasound exam. Initial images acquired with the system are presented.
Design of large zoom for visible and infrared optical system in hemisphere space
NASA Astrophysics Data System (ADS)
Xing, Yang-guang; Li, Lin; Zhang, Juan
2018-01-01
In the field of space optics, the application of advanced optical instruments for target detection and identification has become an advanced technology in modern optics. To complete the tasks of search over a wide field of view and detailed investigation in a small field of view, it is necessary to use a zoom system to achieve better observation of important targets. The innovation of this paper lies in applying a zoom optical system to space detection, which for the first time meets the military needs of target search in a large field of view and target recognition in a small field of view. This paper also presents the first design of a variable-focus optical detection system covering the hemisphere; the zoom optical system works in the visible and infrared wavelength ranges, the perspective angle reaches 360°, and the zoom ratio of the visible system is up to 15. The visible system has a zoom range of 60-900 mm, a detection band of 0.48-0.70 μm, and an F-number of 2.0 to 5.0. The infrared system has a zoom range of 150-900 mm, a detection band of 8-12 μm, and an F-number of 1.2 to 3.0. The MTF of the visible zoom system is above 0.4 at a spatial frequency of 45 lp/mm, and that of the infrared zoom system is above 0.4 at a spatial frequency of 11 lp/mm. The design results show that the system has good image quality.
SPACE FOR AUDIO-VISUAL LARGE GROUP INSTRUCTION.
ERIC Educational Resources Information Center
GAUSEWITZ, CARL H.
With an increasing interest in and utilization of audio-visual media in education facilities, it is important that standards are established for estimating the space required for viewing these various media. This monograph suggests such standards for viewing areas, viewing angles, seating patterns, screen characteristics and equipment performances…
NASA Astrophysics Data System (ADS)
Wolf, Kevin; Ehrlich, André; Hüneke, Tilman; Pfeilsticker, Klaus; Werner, Frank; Wirth, Martin; Wendisch, Manfred
2017-03-01
Spectral radiance measurements collected in nadir and sideward viewing directions by two airborne passive solar remote sensing instruments, the Spectral Modular Airborne Radiation measurement sysTem (SMART) and the Differential Optical Absorption Spectrometer (mini-DOAS), are used to compare the remote sensing results of cirrus optical thickness τ. The comparison is based on a sensitivity study using radiative transfer simulations (RTS) and on data obtained during three airborne field campaigns: the North Atlantic Rainfall VALidation (NARVAL) mission, the Mid-Latitude Cirrus Experiment (ML-CIRRUS) and the Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems (ACRIDICON) campaign. Radiative transfer simulations are used to quantify the sensitivity of measured upward radiance I with respect to τ, ice crystal effective radius reff, viewing angle of the sensor θV, spectral surface albedo α, and ice crystal shape. From the calculations it is concluded that sideward viewing measurements are generally better suited than radiance data from the nadir direction to retrieve τ of optically thin cirrus, especially at wavelengths larger than λ = 900 nm. Using sideward instead of nadir-directed spectral radiance measurements significantly improves the sensitivity and accuracy in retrieving τ, in particular for optically thin cirrus of τ ≤ 2. The comparison of retrievals of τ based on nadir and sideward viewing radiance measurements from SMART, mini-DOAS and independent estimates of τ from an additional active remote sensing instrument, the Water Vapor Lidar Experiment in Space (WALES), shows general agreement within the range of measurement uncertainties. For the selected example, a mean τ of 0.54 ± 0.2 is derived from SMART and 0.49 ± 0.2 from the mini-DOAS nadir channels, while WALES obtained a mean value of τ = 0.32 ± 0.02 at 532 nm wavelength. The mean τ derived from the sideward viewing mini-DOAS channels is 0.26 ± 0.2. For the few simultaneous measurements, the mini-DOAS sideward channel measurements systematically underestimate (-17.6 %) the nadir observations from SMART and mini-DOAS. The agreement between mini-DOAS sideward viewing channels and WALES is better, showing the advantage of using sideward viewing measurements for cloud remote sensing for τ ≤ 1. Therefore, we suggest sideward viewing measurements for retrievals of τ of thin cirrus because of the significantly enhanced capability of sideward viewing compared to nadir measurements.
Multi-layer Clouds Over the South Indian Ocean
NASA Technical Reports Server (NTRS)
2003-01-01
The complex structure and beauty of polar clouds are highlighted by these images acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 23, 2003. These clouds occur at multiple altitudes and exhibit a noticeable cyclonic circulation over the Southern Indian Ocean, to the north of Enderby Land, East Antarctica. The image at left was created by overlaying a natural-color view from MISR's downward-pointing (nadir) camera with a color-coded stereo height field. MISR retrieves heights by a pattern recognition algorithm that utilizes multiple view angles to derive cloud height and motion. The opacity of the height field was then reduced until the field appears as a translucent wash over the natural-color image. The resulting purple, cyan and green hues of this aesthetic display indicate low, medium or high altitudes, respectively, with heights ranging from less than 2 kilometers (purple) to about 8 kilometers (green). In the lower right corner, the edge of the Antarctic coastline and some sea ice can be seen through some thin, high cirrus clouds. The right-hand panel is a natural-color image from MISR's 70-degree backward viewing camera. This camera looks backwards along the path of Terra's flight, and in the southern hemisphere the Sun is in front of this camera. This perspective causes the cloud-tops to be brightly outlined by the sun behind them, and enhances the shadows cast by clouds with significant vertical structure. An oblique observation angle also enhances the reflection of light by atmospheric particles, and accentuates the appearance of polar clouds. The dark ocean and sea ice that were apparent through the cirrus clouds at the bottom right corner of the nadir image are overwhelmed by the brightness of these clouds at the oblique view. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously from pole to pole, and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17794. The panels cover an area of 335 kilometers x 605 kilometers, and utilize data from blocks 142 to 145 within World Reference System-2 path 155. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
NASA Astrophysics Data System (ADS)
Humphries, T.; Winn, J.; Faridani, A.
2017-08-01
Recent work in CT image reconstruction has seen increasing interest in the use of total variation (TV) and related penalties to regularize problems involving reconstruction from undersampled or incomplete data. Superiorization is a recently proposed heuristic which provides an automatic procedure to ‘superiorize’ an iterative image reconstruction algorithm with respect to a chosen objective function, such as TV. Under certain conditions, the superiorized algorithm is guaranteed to find a solution that is as satisfactory as any found by the original algorithm with respect to satisfying the constraints of the problem; this solution is also expected to be superior with respect to the chosen objective. Most work on superiorization has used reconstruction algorithms which assume a linear measurement model, which in the case of CT corresponds to data generated from a monoenergetic x-ray beam. Many CT systems generate x-rays from a polyenergetic spectrum, however, in which the measured data represent an integral of object attenuation over all energies in the spectrum. This inconsistency with the linear model produces the well-known beam hardening artifacts, which impair analysis of CT images. In this work we superiorize an iterative algorithm for reconstruction from polyenergetic data, using both TV and an anisotropic TV (ATV) penalty. We apply the superiorized algorithm in numerical phantom experiments modeling both sparse-view and limited-angle scenarios. In our experiments, the superiorized algorithm successfully finds solutions which are as constraints-compatible as those found by the original algorithm, with significantly reduced TV and ATV values. The superiorized algorithm thus produces images with greatly reduced sparse-view and limited angle artifacts, which are also largely free of the beam hardening artifacts that would be present if a superiorized version of a monoenergetic algorithm were used.
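The superiorization idea described above can be sketched for a toy 1-D problem: interleave summable, TV-nonincreasing perturbations with the steps of a basic iterative solver. This is a minimal illustration of the heuristic only, not the authors' polyenergetic reconstruction algorithm; the Landweber basic step, step-size schedule, and toy signal are illustrative assumptions:

```python
import numpy as np

def tv(x):
    # 1-D total variation: sum of absolute neighbor differences
    return np.sum(np.abs(np.diff(x)))

def tv_subgradient(x):
    # a subgradient of tv() at x
    d = np.sign(np.diff(x))
    g = np.zeros_like(x)
    g[:-1] -= d
    g[1:] += d
    return g

def superiorized_landweber(A, b, iters=300, beta0=1.0, a=0.99):
    """Landweber iteration for A x = b, 'superiorized' by interleaving
    TV-reducing perturbations whose step sizes beta_k = beta0 * a**k are
    summable, so the basic algorithm's behavior is preserved."""
    lam = 1.0 / np.linalg.norm(A, 2) ** 2    # safe Landweber relaxation
    x = np.zeros(A.shape[1])
    beta = beta0
    for _ in range(iters):
        g = tv_subgradient(x)
        n = np.linalg.norm(g)
        if n > 0:
            cand = x - beta * (g / n)        # step toward lower TV
            if tv(cand) <= tv(x):            # keep only TV-nonincreasing moves
                x = cand
            beta *= a
        x = x + lam * (A.T @ (b - A @ x))    # basic (unperturbed) step
    return x

rng = np.random.default_rng(0)
x_true = np.concatenate([np.zeros(20), np.ones(20), np.zeros(20)])  # piecewise constant
A = rng.standard_normal((30, 60))            # under-determined "sparse-view" system
b = A @ x_true
x_sup = superiorized_landweber(A, b)
print(tv(x_sup))
```

In full superiorization frameworks the perturbation loop retries with smaller beta until a TV-nonincreasing point is found, and a perturbation-resilience argument guarantees the superiorized run is as constraints-compatible as the original; the sketch above keeps only the core interleaving.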
The moon illusion: a different view through the legs.
Coren, S
1992-12-01
The fact that the overestimation of the horizon moon is reduced when individuals bend over and view it through their legs has been used as support for theories of the moon illusion based upon angle of regard and vestibular inputs. Inversion of the visual scene, however, can also reduce the salience of depth cues, so illusion reduction might be consistent with size-constancy explanations. A sample of 70 subjects viewed normal and inverted pictorial arrays. The moon illusion was reduced in the inverted arrays, suggesting that the "through the legs" reduction of the moon illusion may reflect the alteration in perceived depth associated with scene inversion rather than angle of regard or vestibular effects.
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei
2016-01-01
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
Adiabatic Berry phase in an atom-molecule conversion system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu Libin; Center for Applied Physics and Technology, Peking University, Beijing 100084; Liu Jie, E-mail: liu_jie@iapcm.ac.c
2010-11-15
We investigate the Berry phase of adiabatic quantum evolution in the atom-molecule conversion system that is governed by a nonlinear Schrödinger equation. We find that the Berry phase consists of two parts: the usual Berry connection term and a novel term arising from the nonlinearity brought forth by the atom-molecule coupling. The total geometric phase can still be viewed as the flux of the magnetic field of a monopole through the surface enclosed by a closed path in parameter space. The charge of the monopole, however, is found to be one third of the elementary charge of the usual quantized monopole. We also derive the classical Hannay angle of a geometric nature associated with the adiabatic evolution. It exactly equals the negative of the Berry phase, indicating a novel connection between the Berry phase and the Hannay angle, in contrast to the usual derivative form.
Atmospheric Science Data Center
2013-04-17
article title: Coccoliths in the Celtic Sea. This image is a natural-color view of the Celtic Sea and English Channel regions, and was acquired by the Multi-angle Imaging ...
Active Planning, Sensing and Recognition Using a Resource-Constrained Discriminant POMDP
2014-06-28
classes of military vehicles, with sample images shown in Fig. 1. The vehicles were captured from various angles. 4785 images with depression angles 17° and 30° are used for training, and 4351 images with depression angles 15° and 45° are used for testing. The azimuth angles are quantized into 12 ... selection by collecting the engine sounds for the 8 vehicle classes from YouTube. The sounds are attenuated differently in 6 view directions
Geometry of the Large Magellanic Cloud Using Multi- wavelength Photometry of Classical Cepheids
NASA Astrophysics Data System (ADS)
Deb, Sukanta; Ngeow, Chow-Choong; Kanbur, Shashi M.; Singh, Harinder P.; Wysocki, Daniel; Kumar, Subhash
2018-05-01
We determine the geometrical and viewing angle parameters of the Large Magellanic Cloud (LMC) using the Leavitt law, based on a sample of more than 3500 common classical Cepheids (FU and FO) in optical (V, I), near-infrared (JHKs) and mid-infrared ([3.6] μm and [4.5] μm) photometric bands. The statistical reddening and a reddening-free distance modulus for each individual Cepheid are obtained from a simultaneous multi-band fit to the apparent distance moduli derived from the Leavitt laws in these seven photometric bands. A reddening map of the LMC obtained from the analysis shows good agreement with other maps available in the literature. Extinction-free distance measurements, along with the equatorial coordinates (α, δ) of the individual stars, are used to obtain the corresponding Cartesian coordinates with respect to the plane of the sky. By fitting a plane solution of the form z = f(x, y) to the observed three-dimensional distribution, the following viewing angle parameters of the LMC are obtained: inclination angle i = 25°.110 ± 0°.365, position angle of line of nodes θlon = 154°.702 ± 1°.378. Modelling the observed three-dimensional distribution of the Cepheids as a triaxial ellipsoid instead yields geometrical axes ratios of 1.000 ± 0.003 : 1.151 ± 0.003 : 1.890 ± 0.014, with viewing angle parameters: inclination angle i = 11°.920 ± 0°.315 of the longest axis with respect to the line of sight, and position angle of line of nodes θlon = 128°.871 ± 0°.569. The position angles are measured eastwards from north.
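The plane-fit step used for the viewing-angle determination can be sketched with synthetic data (the coefficients and point grid below are illustrative assumptions, not the paper's Cepheid sample): fit z = a·x + b·y + c by least squares, then convert the fitted normal into an inclination angle between the disk plane and the plane of the sky.

```python
import math

def fit_plane(pts):
    """Least-squares fit of z = a*x + b*y + c via the 3x3 normal equations."""
    Sxx = Sxy = Syy = Sx = Sy = Sxz = Syz = Sz = 0.0
    for x, y, z in pts:
        Sxx += x * x; Sxy += x * y; Syy += y * y
        Sx += x; Sy += y
        Sxz += x * z; Syz += y * z; Sz += z
    M = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, float(len(pts))]]
    v = [Sxz, Syz, Sz]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for cc in range(col, 3):
                M[r][cc] -= f * M[col][cc]
            v[r] -= f * v[col]
    sol = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        sol[r] = (v[r] - sum(M[r][cc] * sol[cc] for cc in range(r + 1, 3))) / M[r][r]
    return sol  # a, b, c

def inclination_deg(a, b):
    # angle between the plane normal (a, b, -1) and the line of sight (z-axis)
    return math.degrees(math.acos(1.0 / math.sqrt(1.0 + a * a + b * b)))

# Synthetic "star" positions on a known tilted plane (illustrative values)
a_true, b_true, c_true = 0.30, -0.20, 1.5
pts = [(x, y, a_true * x + b_true * y + c_true)
       for x in range(-5, 6) for y in range(-5, 6)]
a, b, c = fit_plane(pts)
```

Recovering the position angle of the line of nodes additionally requires the equatorial axis conventions described in the abstract, so it is omitted here.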
2017-09-15
As it glanced around the Saturn system one final time, NASA's Cassini spacecraft captured this view of the planet's giant moon Titan. Interest in mysterious Titan was a major motivating factor to return to Saturn with Cassini-Huygens following the Voyager mission flybys of the early 1980s. Cassini and its Huygens probe, supplied by European Space Agency, revealed the moon to be every bit as fascinating as scientists had hoped. These views were obtained by Cassini's narrow-angle camera on Sept. 13, 2017. They are among the last images Cassini sent back to Earth. This natural color view, made from images taken using red, green and blue spectral filters, shows Titan much as Voyager saw it -- a mostly featureless golden orb, swathed in a dense atmospheric haze. An enhanced-color view (Figure 1) adds to this color a separate view taken using a spectral filter (centered at 938 nanometers) that can partially see through the haze. The views were acquired at a distance of 481,000 miles (774,000 kilometers) from Titan. The image scale is about 3 miles (5 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21890
Earth orbiting Sisyphus system study
NASA Technical Reports Server (NTRS)
Jurkevich, I.; Krause, K. W.; Neste, S. L.; Soberman, R. K.
1971-01-01
The feasibility of employing an optical meteoroid detecting system, known as Sisyphus, to measure the near-earth particulates from an earth orbiting vehicle is considered. A Sisyphus system can discriminate between natural and man-made particles since the system measures the orbital characteristics of particles. A Sisyphus system constructed for the Pioneer F/G missions to Jupiter is used as the baseline, and is described. The amount of observing time which can be obtained by a Sisyphus instrument launched into various orbits is determined. Observation time is lost when: (1) the Sun is in or near the field of view, (2) the lighted Earth is in or near the field of view, (3) the instrument is eclipsed by the Earth, and (4) the phase angle measured at the particle between the forward scattering direction and the instrument is less than a certain critical value. The selection of the launch system and the instrument platform with a dedicated, attitude-controlled payload package is discussed. Examples of such systems are the SATS and SOLRAD 10(C) vehicles; other possibilities are the AVCO Corp. S4 system, the OWL system, and the Delta Payload Experiment Package.
Kanamori, Yoshiaki; Ozaki, Toshikazu; Hane, Kazuhiro
2014-10-20
We fabricated reflection color filters of the three primary colors with wide viewing angles using silicon two-dimensional subwavelength gratings on the same quartz substrate. The grating periods were 400, 340, and 300 nm for red, green, and blue filters, respectively. All of the color filters had the same grating thickness of 100 nm, which enabled simple fabrication of a color filter array. Reflected colors from the red, green, and blue filters under s-polarized white-light irradiation appeared in the respective colors at incident angles from 0 to 50°. By rigorous coupled-wave analysis, the dimensions of each color filter were designed, and the calculated reflectivity was compared with the measured reflectivity.
Axial Tomography from Digitized Real Time Radiography
DOE R&D Accomplishments Database
Zolnay, A. S.; McDonald, W. M.; Doupont, P. A.; McKinney, R. L.; Lee, M. M.
1985-01-18
Axial tomography from digitized real-time radiographs provides a useful tool for industrial radiography and tomography. The components of this system are: x-ray source, image intensifier, video camera, video line extractor and digitizer, and data storage and reconstruction computers. With this system it is possible to view a two-dimensional x-ray image in real time at each angle of rotation and to select the tomography plane of interest by choosing which video line to digitize. The digitization of a video line requires less than a second, making data acquisition relatively short. Further improvements to this system are planned, and initial results are reported.
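The plane-selection idea lends itself to a one-line sketch (the data layout below is a hypothetical stand-in; the real system digitizes the chosen video line in hardware): extracting the same video line from the frame captured at each rotation angle assembles the sinogram of the axial plane that line images.

```python
# Frames: one 2D radiograph (rows x cols) per rotation angle.
# Choosing a fixed video line across all angles yields the sinogram
# of the axial plane imaged by that line.
def sinogram_from_frames(frames, line):
    """frames: list (one per angle) of 2D images stored as lists of rows."""
    return [frame[line] for frame in frames]

# Tiny synthetic example: 3 angles, 4 x 5 frames whose values encode position
frames = [[[(a, r, c) for c in range(5)] for r in range(4)] for a in range(3)]
sino = sinogram_from_frames(frames, 2)  # tomography plane at video line 2
```

The resulting sinogram is then passed to a standard reconstruction step (e.g. filtered backprojection), which is outside the scope of this sketch.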
Design of collection optics and polychromators for a JT-60SA Thomson scattering system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tojo, H.; Hatae, T.; Sakuma, T.
2010-10-15
This paper presents designs of collection optics for a JT-60SA Thomson scattering system. By using tangential (to the toroidal direction) YAG laser injection, three collection-optics designs free of the strong chromatic aberration generated by the wide viewing angle and the small design volume were found to measure almost all of the radial space. For edge plasma measurements, the authors optimized the channel number and the wavelength ranges of the band-pass filters in a polychromator to reduce the relative error in T_e, considering all spatial channels and a double-pass laser system with different geometric parameters.
Suenaga, Hideyuki; Hoang Tran, Huy; Liao, Hongen; Masamune, Ken; Dohi, Takeyoshi; Hoshi, Kazuto; Mori, Yoshiyuki; Takato, Tsuyoshi
2013-01-01
To evaluate the feasibility and accuracy of a three-dimensional augmented reality system incorporating integral videography for imaging oral and maxillofacial regions, based on preoperative computed tomography data. Three-dimensional surface models of the jawbones, based on the computed tomography data, were used to create the integral videography images of a subject's maxillofacial area. The three-dimensional augmented reality system (integral videography display, computed tomography, a position tracker and a computer) was used to generate a three-dimensional overlay that was projected on the surgical site via a half-silvered mirror. Thereafter, a feasibility study was performed on a volunteer. The accuracy of this system was verified on a solid model while simulating bone resection. Positional registration was attained by identifying and tracking the patient/surgical instrument's position. Thus, integral videography images of jawbones, teeth and the surgical tool were superimposed in the correct position. Stereoscopic images viewed from various angles were accurately displayed. Change in the viewing angle did not negatively affect the surgeon's ability to simultaneously observe the three-dimensional images and the patient, without special glasses. The difference in three-dimensional position of each measuring point on the solid model and augmented reality navigation was almost negligible (<1 mm); this indicates that the system was highly accurate. This augmented reality system was highly accurate and effective for surgical navigation and for overlaying a three-dimensional computed tomography image on a patient's surgical area, enabling the surgeon to understand the positional relationship between the preoperative image and the actual surgical site, with the naked eye. PMID:23703710
A three dimensional point cloud registration method based on rotation matrix eigenvalue
NASA Astrophysics Data System (ADS)
Wang, Chao; Zhou, Xiang; Fei, Zixuan; Gao, Xiaofei; Jin, Rui
2017-09-01
In traditional optical three-dimensional measurement, an object usually has to be measured from multiple angles because of occlusion, and point cloud registration methods are then used to obtain the complete three-dimensional shape of the object. Turntable-based point cloud registration requires calculating the coordinate transformation matrix between the camera coordinate system and the turntable coordinate system. The traditional method calculates this transformation matrix by fitting the rotation center and the rotation axis normal of the turntable, which is limited by the measurement field of view: the exact feature points used for the fit are distributed within an arc of less than 120 degrees, resulting in low fitting accuracy. In this paper, we propose a better method based on the principle that the eigenvalues of a rotation matrix are invariant in the turntable coordinate system, together with the coordinate transformation matrix of corresponding points. First, we rotate a calibration plate through controlled angles with the turntable and calibrate the coordinate transformation matrix of the corresponding points using the least squares method. We then use eigendecomposition to calculate the coordinate transformation matrix between the camera coordinate system and the turntable coordinate system. Compared with the traditional method, this approach has higher accuracy and better robustness, and it is not affected by the camera field of view. With this method, the coincidence error of the corresponding points on the calibration plate after registration is less than 0.1 mm.
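The key fact exploited above — a rotation matrix always has eigenvalue 1, with the rotation axis as the corresponding eigenvector — can be illustrated in a few lines (pure Python; the axis and angle are illustrative values, and this is not the authors' full calibration pipeline). The axis is recovered here from the skew-symmetric part of R, which is equivalent to extracting the eigenvector for eigenvalue 1.

```python
import math

def rodrigues(axis, theta):
    """Rotation matrix about a unit axis by angle theta (Rodrigues' formula)."""
    kx, ky, kz = axis
    c, s = math.cos(theta), math.sin(theta)
    C = 1.0 - c
    return [
        [c + kx * kx * C,      kx * ky * C - kz * s, kx * kz * C + ky * s],
        [ky * kx * C + kz * s, c + ky * ky * C,      ky * kz * C - kx * s],
        [kz * kx * C - ky * s, kz * ky * C + kx * s, c + kz * kz * C],
    ]

def axis_angle_from_R(R):
    """Recover (unit axis, angle) from the skew part and trace of R.
    Valid for 0 < theta < pi."""
    v = [R[2][1] - R[1][2], R[0][2] - R[2][0], R[1][0] - R[0][1]]  # = 2 sin(theta) * axis
    nv = math.sqrt(sum(comp * comp for comp in v))
    axis = [comp / nv for comp in v]
    theta = math.acos(max(-1.0, min(1.0, (R[0][0] + R[1][1] + R[2][2] - 1.0) / 2.0)))
    return axis, theta

# Example: a turntable axis tilted in the camera frame (illustrative values)
k = [1.0, 2.0, 2.0]
nk = math.sqrt(sum(comp * comp for comp in k))
k = [comp / nk for comp in k]          # unit axis [1/3, 2/3, 2/3]
R = rodrigues(k, math.radians(40.0))

axis, theta = axis_angle_from_R(R)
# The rotation leaves its axis fixed: R k = k (eigenvalue 1)
Rk = [sum(R[i][j] * k[j] for j in range(3)) for i in range(3)]
```

Because the axis direction is a fixed point of R, it can be estimated from any calibrated pair of turntable poses, independently of where the feature points fall in the camera's field of view.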
In-Flight performance of MESSENGER's Mercury dual imaging system
Hawkins, S.E.; Murchie, S.L.; Becker, K.J.; Selby, C.M.; Turner, F.S.; Noble, M.W.; Chabot, N.L.; Choo, T.H.; Darlington, E.H.; Denevi, B.W.; Domingue, D.L.; Ernst, C.M.; Holsclaw, G.M.; Laslo, N.R.; Mcclintock, W.E.; Prockter, L.M.; Robinson, M.S.; Solomon, S.C.; Sterner, R.E.
2009-01-01
The Mercury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft, launched in August 2004 and planned for insertion into orbit around Mercury in 2011, has already completed two flybys of the innermost planet. The Mercury Dual Imaging System (MDIS) acquired nearly 2500 images from the first two flybys and viewed portions of Mercury's surface not viewed by Mariner 10 in 1974-1975. Mercury's proximity to the Sun and its slow rotation present challenges to the thermal design for a camera on an orbital mission around Mercury. In addition, strict limitations on spacecraft pointing and the highly elliptical orbit create challenges in attaining coverage at desired geometries and relatively uniform spatial resolution. The instrument designed to meet these challenges consists of dual imagers, a monochrome narrow-angle camera (NAC) with a 1.5° field of view (FOV) and a multispectral wide-angle camera (WAC) with a 10.5° FOV, co-aligned on a pivoting platform. The focal-plane electronics of each camera are identical and use a 1024 × 1024 charge-coupled device detector. The cameras are passively cooled but use diode heat pipes and phase-change-material thermal reservoirs to maintain the thermal configuration during the hot portions of the orbit. Here we present an overview of the instrument design and how the design meets its technical challenges. We also review results from the first two flybys, discuss the quality of MDIS data from the initial periods of data acquisition and how that compares with requirements, and summarize how in-flight tests are being used to improve the quality of the instrument calibration. © 2009 SPIE.
Wide-angle flat field telescope
NASA Technical Reports Server (NTRS)
Hallam, K. L.; Howell, B. J.; Wilson, M. E.
1986-01-01
Described is an unobscured three-mirror wide-angle telescopic imaging system comprising an input baffle which provides a 20 deg (Y axis) x 30 deg (X axis) field of view, a primary mirror having a convex spherical surface, a secondary mirror having a concave ellipsoidal reflecting surface, and a tertiary mirror having a concave spherical reflecting surface. The mirrors comprise mirror elements which are offset segments of parent mirrors whose axes and vertices commonly lie on the system's optical axis. An iris diaphragm forming an aperture stop is located between the secondary and tertiary mirrors, with its center also coincident with the optical axis and located at the beam waist of input light beams reflected from the primary and secondary mirror surfaces. At the system focus following the tertiary mirror is located a flat detector, which may be, for example, a TV imaging tube or photographic film. When desirable, a spectral transmission filter is placed in front of the detector in close proximity thereto.
Burwell, R G; Aujla, R K; Freeman, B J C; Dangerfield, P H; Cole, A A; Kirby, A S; Polak, F J; Pratt, R K; Moulton, A
2008-01-01
The deformity of the ribcage in thoracic adolescent idiopathic scoliosis (AIS) is viewed by most as being secondary to the spinal deformity, though a few consider it primary or involved in curve aggravation. Those who consider it primary ascribe pathogenetic significance to rib-vertebra angle asymmetry. In thoracic AIS, supra-apical rib-vertebra angle differences (RVADs) are reported to be associated with the severity of the Cobb angle. In this paper we attempt to evaluate rib and spinal pathomechanisms in thoracic and thoracolumbar AIS using spinal radiographs and real-time ultrasound. On the radiographs, costo-vertebral angle asymmetries (rib-vertebra angle differences, RVADs, and rib-spinal angle differences, RSADs), apical vertebral rotation (AVR) and apical vertebral translation (AVT) were measured; by ultrasound, spine-rib rotation differences (SRRDs) were estimated. RVADs are largest at two and three vertebral levels above the apex, where they correlate significantly and positively with Cobb angle and AVT but not AVR. In right thoracic AIS, the cause(s) of the RVA asymmetries is unknown: it may result from trunk muscle imbalance, or from ribs adjusting passively within the constraint of the fourth column of the spine to increasing spinal curvature from whatever cause. Several possible mechanisms may drive axial vertebral rotation, including biplanar spinal asymmetry, relative anterior spinal overgrowth, dorsal shear forces in the presence of normal vertebral axial rotation, asymmetry of rib linear growth, trunk muscle imbalance causing rib-vertebra angle asymmetry that weakens the spinal rotation-defending system of bipedal gait, and CNS mechanisms.
Ozmeric, A; Yucens, M; Gultaç, E; Açar, H I; Aydogan, N H; Gül, D; Alemdaroglu, K B
2015-05-01
We hypothesised that the anterior and posterior walls of the body of the first sacral vertebra could be visualised with two different angles of inlet view, owing to the conical shape of the sacrum. Six dry male cadavers with complete pelvic rings and eight dry sacrums with K-wires were used to study the effect of canting the fluoroscope (angling the C-arm) towards the head in 5° increments from 10° to 55°. Fluoroscopic images were taken in each position. Anterior and posterior angles of inclination were measured between the upper sacrum and the vertical line on the lateral view. Three authors separately selected the clearest image for overlapping anterior cortices and the upper sacral canal in the cadaveric models. The dry bone and K-wire models were scored by the authors, who also checked whether each K-wire was in or out. In the dry bone models the mean score of the relevant inlet position of the anterior or posterior inclination was 8.875 (standard deviation (sd) 0.35), compared with -5.75 (sd 4.59) for the inlet position of the opposite inclination. We found that two different inlet views, using the anterior and posterior inclination angles of the sacrum, should be used separately to evaluate the borders of the body of the sacrum during placement of iliosacral screws. ©2015 The British Editorial Society of Bone & Joint Surgery.
2010-05-26
NASA's Cassini spacecraft looks toward the limb of Saturn and, on the right of this image, views part of the rings through the planet's atmosphere. Saturn's atmosphere can distort the view of the rings from some angles.
Atmospheric Science Data Center
2014-05-15
article title: Los Alamos, New Mexico. Multi-angle views of the fire in Los Alamos, New Mexico, May 9, 2000. These true-color images covering north-central New Mexico ...
Challenging Popular Media's Control by Teaching Critical Viewing.
ERIC Educational Resources Information Center
Couch, Richard A.
The purpose of this paper is to express the importance of visual/media literacy and the teaching of critical television viewing. An awareness of the properties and characteristics of television--including camera angles and placement, editing, and emotionally involving subject matter--aids viewers in the critical viewing process. The knowledge of…
Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera
NASA Astrophysics Data System (ADS)
Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi
2016-11-01
This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical system uses a right-angle prism in front of the zoom lenses to rotate the optical path by 90°, resulting in a zoom lens length of 6 mm. The zoom lenses can therefore be embedded in a mobile phone with a thickness of 6 mm. The zoom lenses have three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.
2012-01-01
DM system with a detector field-of-view (FOV) of 24 × 30 cm and a source-to-image distance of 70 cm measured at the midpoint of the chest wall. In ... DNCs in frequency space have an opening angle spanning approximately -7.5° to +7.5° for measurements made near the midpoint of the chest wall. At ... conference abstract). 18. Ren B, Ruth C, Stein J, Smith A, Shaw I, Jing Z. Design and performance of the prototype full field breast tomosynthesis system
Summer Harvest in Saratov, Russia
NASA Technical Reports Server (NTRS)
2002-01-01
Russia's Saratov Oblast (province) is located in the southeastern portion of the East-European plain, in the Lower Volga River Valley. Southern Russia produces roughly 40 percent of the country's total agricultural output, and Saratov Oblast is the largest producer of grain in the Volga region. Vegetation changes in the province's agricultural lands between spring and summer are apparent in these images acquired on May 31 and July 18, 2002 (upper and lower image panels, respectively) by the Multi-angle Imaging SpectroRadiometer (MISR). The left-hand panels are natural color views acquired by MISR's vertical-viewing (nadir) camera. Less vegetation and more earth tones (indicative of bare soils) are apparent in the summer image (lower left). Farmers in the region utilize staggered sowing to help stabilize yields, and a number of different stages of crop maturity can be observed. The main crop is spring wheat, cultivated under non-irrigated conditions. A short growing season and relatively low and variable rainfall are the major limitations to production. Saratov city is apparent as the light gray pixels on the left (west) bank of the Volga River. Riparian vegetation along the Volga exhibits dark green hues, with some new growth appearing in summer. The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree backward, nadir and 60-degree forward-viewing cameras displayed as red, green and blue respectively. In these images, color variations serve as a proxy for changes in angular reflectance, and the spring and summer views were processed identically to preserve relative variations in brightness between the two dates. Urban areas and vegetation along the Volga banks look similar in the two seasonal multi-angle composites. The agricultural areas, on the other hand, look strikingly different. This can be attributed to differences in brightness and texture between bare soil and vegetated land.
The chestnut-colored soils in this region are brighter in MISR's red band than the vegetation. Because plants have vertical structure, the oblique cameras observe a greater proportion of vegetation relative to the nadir camera, which sees more soil. In spring, therefore, the scene is brightest in the vertical view and thus appears with an overall greenish hue. In summer, the soil characteristics play a greater role in governing the appearance of the scene, and the angular reflectance is now brighter at the oblique view angles (displayed as red and blue), thus imparting a pink color to much of the farmland and a purple color to areas along the banks of several narrow rivers. The unusual appearance of the clouds is due to geometric parallax which splits the imagery into spatially separated components as a consequence of their elevation above the surface. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously from pole to pole, and views almost the entire globe every 9 days. These images are a portion of the data acquired during Terra orbits 13033 and 13732, and cover an area of about 173 kilometers x 171 kilometers. They utilize data from blocks 49 to 50 within World Reference System-2 path 170. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Development of Human Posture Simulation Method for Assessing Posture Angles and Spinal Loads
Lu, Ming-Lun; Waters, Thomas; Werren, Dwight
2015-01-01
Video-based posture analysis employing a biomechanical model is gaining popularity for ergonomic assessments. A human posture simulation method of estimating multiple body postural angles and spinal loads from a video record was developed to expedite ergonomic assessments. The method was evaluated by a repeated measures study design with three trunk flexion levels, two lift asymmetry levels, three viewing angles and three trial repetitions as experimental factors. The study comprised two phases evaluating the accuracy of simulating one's own and other people's lifting posture via a proxy of a computer-generated humanoid. The mean values of the accuracy of simulating self and humanoid postures were 12° and 15°, respectively. The repeatability of the method for the same lifting condition was excellent (~2°). The least simulation error was associated with the side viewing angle. The estimated back compressive force and moment, calculated by a three-dimensional biomechanical model, exhibited a range of 5% underestimation. The posture simulation method enables researchers to simultaneously quantify body posture angles and spinal loading variables with accuracy and precision comparable to on-screen posture matching methods. PMID:26361435
NASA Astrophysics Data System (ADS)
Yi, Bo; Shen, Huifang
2018-01-01
Non-iridescent structural colors and the lotus effect, universally existing in nature, provide great inspiration for developing angle-independent and highly hydrophobic structurally colored films. To this end, a facile strategy is put forward for achieving superhydrophobic structurally colored films with wide viewing angles and high visibility based on bumpy melanin-like polydopamine-coated polystyrene particles. Here, hierarchical and amorphous structures are assembled in a self-driven manner due to the particles' protrusive surfaces. The superhydrophobicity of the structurally colored films, with water contact angle up to 151°, is realized by combining the hierarchical surface roughness with a dip-coating process in polydimethylsiloxane-hexane solution, while the angle-independence of the films is ascribed to the amorphous arrays. In addition, benefiting from the essential light-absorbing property and high refractive index of polydopamine, the visibility of the as-prepared colored films is fundamentally enhanced. Moreover, the mechanical robustness of the films is considerably boosted by incorporating 3-aminopropyltriethoxysilane. This fabrication strategy might provide an opportunity for promoting the open-air application of structurally colored coatings.
Impact of basic angle variations on the parallax zero point for a scanning astrometric satellite
NASA Astrophysics Data System (ADS)
Butkevich, Alexey G.; Klioner, Sergei A.; Lindegren, Lennart; Hobbs, David; van Leeuwen, Floor
2017-07-01
Context. Determination of absolute parallaxes by means of a scanning astrometric satellite such as Hipparcos or Gaia relies on the short-term stability of the so-called basic angle between the two viewing directions. Uncalibrated variations of the basic angle may produce systematic errors in the computed parallaxes. Aims: We examine the coupling between a global parallax shift and specific variations of the basic angle, namely those related to the satellite attitude with respect to the Sun. Methods: The changes in observables produced by small perturbations of the basic angle, attitude, and parallaxes were calculated analytically. We then looked for a combination of perturbations that had no net effect on the observables. Results: In the approximation of infinitely small fields of view, it is shown that certain perturbations of the basic angle are observationally indistinguishable from a global shift of the parallaxes. If these kinds of perturbations exist, they cannot be calibrated from the astrometric observations but will produce a global parallax bias. Numerical simulations of the astrometric solution, using both direct and iterative methods, confirm this theoretical result. For a given amplitude of the basic angle perturbation, the parallax bias is smaller for a larger basic angle and a larger solar aspect angle. In both these respects Gaia has a more favourable geometry than Hipparcos. In the case of Gaia, internal metrology is used to monitor basic angle variations. Additionally, Gaia has the advantage of detecting numerous quasars, which can be used to verify the parallax zero point.
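For reference, the degeneracy discussed in this abstract is often summarized, in the small-field approximation and with notation assumed here (basic angle Γ, solar aspect angle ξ, and a basic-angle perturbation of amplitude A varying with spin phase Ω), by a relation of the form:

```latex
\delta\Gamma(\Omega) = A\cos\Omega
\quad\Longrightarrow\quad
\delta\varpi \simeq \frac{A}{2\sin(\Gamma/2)\,\sin\xi}
```

This form is consistent with the abstract's statement that the parallax bias is smaller for a larger basic angle and a larger solar aspect angle; the exact coefficient is an assumption here and should be taken from the paper itself.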
Zhang, Yawei; Yin, Fang-Fang; Zhang, You; Ren, Lei
2017-05-07
The purpose of this study is to develop an adaptive prior knowledge guided image estimation technique to reduce the scan angle needed in the limited-angle intrafraction verification (LIVE) system for 4D-CBCT reconstruction. The LIVE system has been previously developed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the 4D-CBCT images for faster intrafraction verification. This technique uses free form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on kV-MV projections acquired in extremely limited angle (orthogonal 3°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of the respiratory motion. The 4D digital extended-cardiac-torso (XCAT) phantom and a CIRS 008A dynamic thoracic phantom were used to evaluate the effectiveness of this technique. The reconstruction accuracy of the technique was evaluated by calculating both the center-of-mass-shift (COMS) and 3D volume-percentage-difference (VPD) of the tumor in reconstructed images and the true on-board images. The performance of the technique was also assessed with varied breathing signals against scanning angle, lesion size, lesion location, projection sampling interval, and scanning direction. In the XCAT study, using orthogonal-view of 3° kV and portal MV projections, this technique achieved an average tumor COMS/VPD of 0.4 ± 0.1 mm/5.5 ± 2.2%, 0.6 ± 0.3 mm/7.2 ± 2.8%, 0.5 ± 0.2 mm/7.1 ± 2.6%, 0.6 ± 0.2 mm/8.3 ± 2.4%, for baseline drift, amplitude variation, phase shift, and patient breathing signal variation, respectively. 
In the CIRS phantom study, this technique achieved an average tumor COMS/VPD of 0.7 ± 0.1 mm/7.5 ± 1.3% for a 3 cm lesion and 0.6 ± 0.2 mm/11.4 ± 1.5% for a 2 cm lesion in the baseline drift case. The average tumor COMS/VPD was 0.5 ± 0.2 mm/10.8 ± 1.4%, 0.4 ± 0.3 mm/7.3 ± 2.9%, 0.4 ± 0.2 mm/7.4 ± 2.5%, and 0.4 ± 0.2 mm/7.3 ± 2.8% for the four real patient breathing signals, respectively. The results demonstrated that the adaptive prior-knowledge-guided image estimation technique in the LIVE system is robust against scanning angle, lesion size, lesion location, and scanning direction, and can estimate on-board images accurately with as few as 6 projections over an orthogonal-view 3° angle. In conclusion, the adaptive prior-knowledge-guided image reconstruction technique accurately estimates 4D-CBCT images from extremely limited angles and projections, greatly improving the efficiency and accuracy of the LIVE system for ultrafast 4D intrafraction verification of lung SBRT treatments.
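The two evaluation metrics the study reports, center-of-mass shift (COMS) and volume percentage difference (VPD), can be sketched for binary tumor masks. The definitions below are common conventions assumed for illustration; the paper's exact formulations may differ:

```python
import numpy as np

def tumor_metrics(est_mask, true_mask, voxel_mm=(1.0, 1.0, 1.0)):
    """COMS (mm) and VPD (%) between an estimated and a ground-truth
    binary tumor mask.  VPD is taken here as the non-overlapping
    volume relative to the true volume (an assumed convention)."""
    est = np.asarray(est_mask, bool)
    true = np.asarray(true_mask, bool)
    spacing = np.asarray(voxel_mm, float)
    com_est = np.argwhere(est).mean(axis=0) * spacing
    com_true = np.argwhere(true).mean(axis=0) * spacing
    coms = float(np.linalg.norm(com_est - com_true))
    vpd = 100.0 * np.logical_xor(est, true).sum() / true.sum()
    return coms, float(vpd)

# Identical masks give COMS = 0 mm and VPD = 0 %
m = np.zeros((8, 8, 8), bool)
m[2:5, 2:5, 2:5] = True
print(tumor_metrics(m, m))
```

Shifting the estimated mask by one voxel along an axis raises COMS to exactly one voxel spacing, which makes the metric easy to sanity-check.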
Telescope aperture optimization for space-based coherent wind lidar
NASA Astrophysics Data System (ADS)
Ge, Xian-ying; Zhu, Jun; Cao, Qipeng; Zhang, Yinchao; Yin, Huan; Dong, Xiaojing; Wang, Chao; Zhang, Yongchao; Zhang, Ning
2015-08-01
Many studies have indicated that the optimum approach for measuring winds from space is a pulsed coherent wind lidar, an active remote sensing tool offering high spatial and temporal resolution, real-time detection, high mobility, and ease of control. Because of their significant eye-safety, efficiency, size, and lifetime advantages, 2 μm solid-state laser lidar systems have attracted much attention in space-based wind lidar plans. In this paper, the theory of coherent detection is presented and a 2 μm solid-state laser lidar system is introduced; the ideal aperture is then calculated from a signal-to-noise ratio (SNR) viewpoint at an orbit of 400 km. In a real application, however, even if the lidar hardware is perfectly aligned, the directional jitter of the laser beam, the change in the lidar's attitude during the long round-trip time of the light through the atmosphere, and other factors can introduce a misalignment angle. The influence of this misalignment angle is therefore considered and calculated, and an optimum telescope diameter of 0.45 m is obtained for a misalignment angle of 4 μrad. Through this analysis of the optimum aperture required for a space-based coherent wind lidar system, we aim to provide design guidance for the telescope.
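The trade-off the abstract describes (a larger aperture collects more light but is more sensitive to misalignment) can be sketched with a toy SNR model. The Gaussian efficiency term below is an illustrative assumption, not the paper's full link-budget model, and this toy lands near 0.3 m rather than the paper's 0.45 m result:

```python
import math

def toy_snr(diameter_m, wavelength=2e-6, misalign_rad=4e-6):
    """Toy coherent-lidar SNR: collected power grows as D^2 while an
    assumed Gaussian heterodyne-efficiency term decays with the product
    of aperture diameter and misalignment angle."""
    loss = math.exp(-((math.pi * diameter_m * misalign_rad) / (2.0 * wavelength)) ** 2)
    return diameter_m * diameter_m * loss

# Scan candidate telescope diameters from 0.05 m to 1.0 m
candidates = [d / 1000.0 for d in range(50, 1001, 5)]
best = max(candidates, key=toy_snr)
print(f"toy optimum diameter ~ {best:.2f} m")
```

The qualitative behaviour matches the abstract: beyond a certain diameter, misalignment losses dominate the D² collection gain, so an interior optimum exists.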
3D bubble reconstruction using multiple cameras and space carving method
NASA Astrophysics Data System (ADS)
Fu, Yucheng; Liu, Yang
2018-07-01
An accurate measurement of bubble shape and size is of significant value in understanding the behavior of bubbles in many engineering applications. Past studies have usually used one or two cameras to estimate bubble volume, surface area, and other parameters; the 3D bubble shape and rotation angle are generally not available in these studies. To overcome this challenge and obtain more detailed information about individual bubbles, a 3D imaging system consisting of four high-speed cameras is developed in this paper, and the space carving method is used to reconstruct the 3D bubble shape from the recorded high-speed images taken at different view angles. The proposed method can reconstruct the bubble surface with minimal assumptions. A benchmarking test is performed in a 3 cm × 1 cm rectangular channel with stagnant water. The results show that the newly proposed method can measure the bubble volume with an error of less than 2% compared with the syringe reading, whereas the conventional two-camera system has an error of around 10% and the one-camera system an error greater than 25%. Visualization of a rising 3D bubble demonstrates the wall's influence on bubble rotation angle and aspect ratio, which also explains the large error in the single-camera measurement.
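The silhouette-intersection core of space carving can be sketched on a voxel grid. For simplicity this sketch uses three orthographic views along the grid axes rather than the paper's calibrated four-camera geometry, so it is an assumption-laden simplification of the method:

```python
import numpy as np

def space_carve(silhouettes):
    """Minimal space-carving sketch on an n^3 voxel grid: a voxel
    survives only if it projects inside the silhouette seen by every
    'camera' (here, orthographic views along the three grid axes)."""
    n = silhouettes[0].shape[0]
    vox = np.ones((n, n, n), dtype=bool)
    for axis, sil in enumerate(silhouettes):
        # broadcast each 2-D silhouette along its viewing axis and carve
        vox &= np.expand_dims(sil, axis=axis)
    return vox

# Demo: carve a sphere from its three axis-aligned silhouettes
n = 32
g = np.arange(n) - n / 2 + 0.5
x, y, z = np.meshgrid(g, g, g, indexing="ij")
sphere = x**2 + y**2 + z**2 <= (n / 3) ** 2
carved = space_carve([sphere.any(axis=i) for i in range(3)])
print(int(sphere.sum()), int(carved.sum()))
```

The carved result is the visual hull: a tight superset of the true object that never removes a true object voxel, which is why more view angles (as in the four-camera system) shrink the volume error.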
Instrument Display Visual Angles for Conventional Aircraft and the MQ-9 Ground Control Station
NASA Technical Reports Server (NTRS)
Bendrick, Gregg A.; Kamine, Tovy Haber
2008-01-01
Aircraft instrument panels should be designed such that the primary displays are in the optimal viewing location to minimize pilot perception and response time. Human factors engineers define three zones (i.e., "cones") of visual location: 1) "Easy Eye Movement" (foveal vision); 2) "Maximum Eye Movement" (peripheral vision with saccades); and 3) "Head Movement" (head movement required). Instrument display visual angles were measured to determine how well conventional aircraft (T-34, T-38, F-15B, F-16XL, F/A-18A, U-2D, ER-2, King Air, G-III, B-52H, DC-10, B747-SCA) and the MQ-9 ground control station (GCS) complied with these standards, and how they compared with each other. Methods: Selected instrument parameters included attitude, pitch, bank, power, airspeed, altitude, vertical speed, heading, turn rate, slip/skid, AOA, flight path, latitude, longitude, course, bearing, range, and time. Vertical and horizontal visual angles for each component were measured from the pilot's eye position in each system. Results: The vertical visual angles of displays in conventional aircraft lay within the cone of "Easy Eye Movement" for all but three of the parameters measured, and almost all of the horizontal visual angles fell within this range. All conventional vertical and horizontal visual angles lay within the cone of "Maximum Eye Movement". However, most instrument vertical visual angles of the MQ-9 GCS lay outside the cone of "Easy Eye Movement", though all were within the cone of "Maximum Eye Movement". All the horizontal visual angles for the MQ-9 GCS were within the cone of "Easy Eye Movement". Discussion: Most instrument displays in conventional aircraft lay within the cone of "Easy Eye Movement", though mission-critical instruments sometimes displaced less important instruments outside this area. Many of the MQ-9 GCS displays lay outside this area. Specific training for MQ-9 pilots may be needed to avoid increased response time and potential error during flight.
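The measurement described (visual angle of a display element from the pilot's eye position, then classification into the three cones) reduces to simple trigonometry. The 15°/35° cutoffs below are illustrative assumptions, not the study's exact thresholds:

```python
import math

def visual_angles(dx_cm, dy_cm, eye_dist_cm):
    """Horizontal and vertical visual angles (deg) of a display element
    offset (dx, dy) from the design eye point at a viewing distance."""
    h = math.degrees(math.atan2(dx_cm, eye_dist_cm))
    v = math.degrees(math.atan2(dy_cm, eye_dist_cm))
    return h, v

def zone(angle_deg, easy=15.0, maximum=35.0):
    """Classify an angle into the three 'cones' named in the abstract.
    The 15/35 deg thresholds are assumed for illustration."""
    a = abs(angle_deg)
    if a <= easy:
        return "Easy Eye Movement"
    if a <= maximum:
        return "Maximum Eye Movement"
    return "Head Movement"

# A display 10 cm right and 20 cm below the eye reference, 70 cm away
h, v = visual_angles(10.0, -20.0, 70.0)
print(f"{h:.1f} deg ({zone(h)}), {v:.1f} deg ({zone(v)})")
```

In this example the horizontal offset stays in the easy-movement cone while the larger vertical offset falls into the maximum-eye-movement cone, the same pattern the study found for the MQ-9 GCS vertical angles.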
NASA Astrophysics Data System (ADS)
Ball, C. P.; Marks, A. A.; Green, P.; Mac Arthur, A.; Fox, N.; King, M. D.
2013-12-01
Surface albedo is the hemispherically and wavelength-integrated reflectance over the visible, near-infrared and shortwave-infrared regions of the solar spectrum. The albedo of Arctic snow can be in excess of 0.8, and it is a critical component in the global radiation budget because it determines the proportion of solar radiation absorbed and reflected over a large part of the Earth's surface. We present here our first results of the angularly resolved surface reflectance of Arctic snow at high solar zenith angles (~80°), suitable for the validation of satellite remote sensing products. The hemispherical directional reflectance factor (HDRF) of Arctic snow-covered tundra was measured using the GonioRAdiometric Spectrometer System (GRASS) during a three-week field campaign in Ny-Ålesund, Svalbard, in March/April 2013. The measurements provide one of the few existing HDRF datasets at high solar zenith angles for wind-blown Arctic snow-covered tundra (conditions typical of the Arctic region), and the first ground-based measure of HDRF at Ny-Ålesund. The HDRF was recorded under clear-sky conditions at 10° intervals in view zenith and 30° intervals in view azimuth, for several typical sites, over a wavelength range of 400-1500 nm at 1 nm resolution. Satellite sensors such as MODIS, AVHRR and VIIRS offer a method to monitor the surface albedo with high spatial and temporal resolution. However, snow reflectance is anisotropic and depends on the view and illumination angles and the wavelength of the incident light. Spaceborne sensors subtend a discrete angle to the target surface and measure radiance over a limited number of narrow spectral bands. Therefore, the derivation of the surface albedo requires accurate knowledge of the surface's bidirectional reflectance as a function of wavelength.
The ultimate accuracy to which satellite sensors are able to measure snow surface properties such as albedo is dependent on the accuracy of the BRDF model, which can only be assessed if hyperspectral ground-based data are available to validate the current modelling approaches. The results presented here extend the work of previous studies by recording the HDRF of Arctic snow-covered tundra at high solar zenith angles over several sites, demonstrating the strong forward-scattering nature of snow reflectance at high solar zenith angles while also showing a clear wavelength dependence in the shape of the HDRF and an increasing anisotropy with wavelength.
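The link between an HDRF sampled on the campaign's angular grid (10° in view zenith, 30° in view azimuth) and a hemispherically integrated reflectance can be sketched with a midpoint quadrature. This is a generic illustration, not the study's retrieval procedure:

```python
import math

def hemispherical_reflectance(hdrf, dzen_deg=10.0, dazi_deg=30.0):
    """Cosine-weighted hemispherical integration of an HDRF sampled on
    a regular view-zenith/view-azimuth grid.  hdrf(zen_deg, azi_deg)
    is any callable; midpoint quadrature is an illustrative choice."""
    dz, da = math.radians(dzen_deg), math.radians(dazi_deg)
    total = 0.0
    zen = dzen_deg / 2.0
    while zen < 90.0:
        azi = dazi_deg / 2.0
        t = math.radians(zen)
        while azi < 360.0:
            total += hdrf(zen, azi) * math.cos(t) * math.sin(t) * dz * da
            azi += dazi_deg
        zen += dzen_deg
    return total / math.pi

# Sanity check: a Lambertian surface with constant HDRF = 0.8
print(round(hemispherical_reflectance(lambda z, a: 0.8), 3))  # close to 0.8
```

For an anisotropic, forward-scattering snow HDRF the same quadrature would weight the forward-scattered directions, which is why accurate angular sampling matters for albedo retrieval.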
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...
7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
3. VAL CONTROL STATION, VIEW OF CONTROL PANELS SHOWING MAIN ...
3. VAL CONTROL STATION, VIEW OF CONTROL PANELS SHOWING MAIN PRESSURE GAUGES, LOOKING NORTH. - Variable Angle Launcher Complex, Control Station, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Measuring the Radius of the Earth from a Mountain Top Overlooking the Ocean
ERIC Educational Resources Information Center
Gangadharan, Dhevan
2009-01-01
A clear view of the ocean may be used to measure the radius of the Earth. To an observer looking out at the ocean, the horizon will always form some angle θ with the local horizontal plane. As the observer's elevation h increases, so does the angle θ. From measurements of the elevation h and the angle θ,…
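The geometry behind the measurement is a tangent line from the observer to the horizon: with cos θ = R/(R + h), the radius follows as R = h·cos θ/(1 − cos θ). A small consistency check:

```python
import math

def earth_radius(h_m, dip_deg):
    """Radius of the Earth from observer elevation h and horizon dip
    angle theta, using the tangent-line relation
    cos(theta) = R / (R + h)  =>  R = h*cos(theta) / (1 - cos(theta))."""
    c = math.cos(math.radians(dip_deg))
    return h_m * c / (1.0 - c)

# Round trip: for R ~ 6371 km, a 2000 m summit gives a dip of ~1.44 deg,
# and that dip recovers the radius
R = 6371e3
dip = math.degrees(math.acos(R / (R + 2000.0)))
print(f"dip = {dip:.2f} deg, recovered R = {earth_radius(2000.0, dip) / 1e3:.0f} km")
```

Note the strong sensitivity: since θ scales roughly as sqrt(2h/R), the dip is barely over a degree even from a 2 km mountain, so the angle must be measured carefully.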
Baffling system for the Wide Angle Camera (WAC) of ROSETTA mission
NASA Astrophysics Data System (ADS)
Brunello, Pierfrancesco; Peron, Fabio; Barbieri, Cesare; Fornasier, Sonia
2000-10-01
After the experience of the GIOTTO fly-by of comet Halley in 1986, the European Space Agency planned to improve the scientific knowledge of these astronomical objects by means of an even more ambitious rendezvous mission with another comet (P/Wirtanen). This mission, named ROSETTA, will run from 2003 to 2013, ending after the comet perihelion phase and also including fly-bys of two asteroids of the main belt (140 Siwa and 4979 Otawara). The scientific priority of the mission is the in situ investigation of the cometary nucleus, with the aim of better understanding the formation and composition of planetesimals and their evolution over the last 4.5 billion years. In this context, the authors were involved in the design of the baffling for the Wide Angle Camera (WAC) of the imaging system (OSIRIS) carried on board the spacecraft. The scientific requirements for the WAC are: a large field of view (FOV) of 12° × 12° with a resolution of 100 μrad per pixel, UV response, and a contrast ratio of 10^-4 in order to detect gaseous and dusty features close to the nucleus of the comet. To achieve these performances, a fairly novel class of optical solutions employing off-axis sections of concentric mirrors was explored. Regarding baffling, the peculiar demand was the rejection of stray light generated by the optics for sources within the FOV, since the optical entrance aperture is located at the level of the secondary mirror (instead of the primary, as usual). This paper describes the baffle design and analyzes its performance, calculated by numerical simulation with ray-tracing methods, at different angles of incidence of the light, for sources both outside and inside the field of view.
NASA Technical Reports Server (NTRS)
Hanson, Donald B.
2001-01-01
The problem of broadband noise generated by turbulence impinging on a downstream blade row is examined from a theoretical viewpoint. Equations are derived for sound power spectra in terms of three-dimensional wavenumber spectra of the turbulence. Particular attention is given to issues of turbulence inhomogeneity associated with the near field of the rotor and variations through boundary layers. Lean and sweep of the rotor or stator cascade are also handled rigorously, with a full derivation of the relevant geometry and definitions of lean and sweep angles. Use of the general theory is illustrated by two simple theoretical spectra for homogeneous turbulence. Limited comparisons are made with data from model fans designed by Pratt & Whitney, Allison, and Boeing. Parametric studies for stator noise are presented showing trends with Mach number, vane count, turbulence scale and intensity, lean, and sweep. Two conventions are presented to define lean and sweep. In the "cascade system", lean is a rotation of the airfoil out of its plane and sweep is a rotation of the airfoil in its plane. In the "duct system", lean is the leading edge angle viewing the fan from the front (along the fan axis) and sweep is the angle viewing the fan from the side (perpendicular to the axis). It is shown that the governing parameter is sweep in the plane of the airfoil (which reduces the chordwise component of Mach number); lean (out of the plane of the airfoil) has little effect. Rotor noise predictions are compared with duct turbulence/rotor interaction noise data from Boeing, and variations including blade tip sweep and turbulence axial and transverse scales are explored.
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Gregg, Watson W.
1992-01-01
Due to range safety considerations, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) ocean color instrument may be required to be launched into a near-noon descending node, as opposed to the ascending node used by the predecessor sensor, the Coastal Zone Color Scanner (CZCS). The relative importance of ascending versus descending near-noon orbits was assessed here to determine whether a descending node would meet the scientific requirements of SeaWiFS. Analyses focused on ground coverage, local times of coverage, solar and viewing geometries (zenith and azimuth angles), and sun glint. Differences were found in the areas covered by individual orbits, but these were not important when taken over a 16-day repeat time. Local time of coverage was also different: for ascending node orbits the Northern Hemisphere was observed in the morning and the Southern Hemisphere in the afternoon, while for descending node orbits the Northern Hemisphere was observed in the afternoon and the Southern in the morning. There were substantial differences in solar azimuth and spacecraft azimuth angles both at equinox and at the Northern Hemisphere summer solstice. Negligible differences in solar and spacecraft zenith angles, relative azimuth angles, and sun glint were obtained at the equinox. However, large differences were found in solar zenith angles, relative azimuths, and sun glint for the solstice. These differences appeared to compensate across the scan: an increase in sun glint in descending node over that in ascending node on the western part of the scan was compensated by a decrease on the eastern part of the scan. Thus, no advantage or disadvantage could be conferred upon either ascending or descending node for noon orbits. Analyses were also performed for ascending and descending node orbits that deviated from a noon equator crossing time.
For ascending node, afternoon orbits produced the lowest mean solar zenith angles in the Northern Hemisphere, and morning orbits produced the lowest angles for the Southern Hemisphere. For descending node, morning orbits produced the lowest mean solar zenith angles for the Northern Hemisphere; afternoon orbits produced the lowest angles for the Southern Hemisphere.
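The solar zenith angles compared between node choices follow from the standard spherical-astronomy relation cos(SZA) = sin φ sin δ + cos φ cos δ cos h (latitude φ, solar declination δ, hour angle h). A minimal sketch, with illustrative inputs:

```python
import math

def solar_zenith_deg(lat_deg, decl_deg, hour_angle_deg):
    """Solar zenith angle (deg) from latitude, solar declination, and
    hour angle, via the standard spherical-astronomy formula."""
    lat, dec, h = (math.radians(v) for v in (lat_deg, decl_deg, hour_angle_deg))
    cz = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(h)
    return math.degrees(math.acos(max(-1.0, min(1.0, cz))))

# Northern-summer solstice (decl = +23.44 deg): the same local time
# (hour angle 30 deg) gives very different zenith angles at 45 N vs 45 S
sza_n = solar_zenith_deg(45.0, 23.44, 30.0)
sza_s = solar_zenith_deg(-45.0, 23.44, 30.0)
print(f"45N: {sza_n:.1f} deg, 45S: {sza_s:.1f} deg")
```

This hemispheric asymmetry at solstice is the driver of the morning-versus-afternoon overpass differences the abstract analyzes for ascending and descending nodes.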
Pupil geometry and pupil re-imaging in telescope arrays
NASA Technical Reports Server (NTRS)
Traub, Wesley A.
1990-01-01
This paper considers the issues of lateral and longitudinal pupil geometry in ground-based telescope arrays, such as IOTA. In particular, it is considered whether or not pupil re-imaging is required before beam combination. By considering the paths of rays through the system, an expression is derived for the optical path errors in the combined wavefront as a function of array dimensions, telescope magnification factor, viewing angle, and field-of-view. By examining this expression for the two cases of pupil-plane and image-plane combination, operational limits can be found for any array. As a particular example, it is shown that for IOTA no pupil re-imaging optics will be needed.
3D medical thermography device
NASA Astrophysics Data System (ADS)
Moghadam, Peyman
2015-05-01
In this paper, a novel handheld 3D medical thermography system is introduced. The proposed system consists of a thermal-infrared camera, a color camera, and a depth camera rigidly attached in close proximity and mounted on an ergonomic handle. As a practitioner holding the device smoothly moves it around the human body parts, the proposed system generates and builds up a precise 3D thermogram model by incorporating information from each new measurement in real time. Because the data are acquired in motion, multiple points of view are obtained. When processed, these multiple points of view are adaptively combined by taking into account the reliability of each individual measurement, which can vary due to factors such as the angle of incidence, the distance between the device and the subject, and environmental sensor data or other factors influencing the confidence of the thermal-infrared data when captured. Finally, several case studies are presented to support the usability and performance of the proposed system.
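The adaptive combination of repeated readings by reliability can be sketched as a confidence-weighted average. The particular weighting scheme below (cosine of incidence angle, inverse-square distance) is an illustrative assumption, not the paper's model:

```python
import math

def fuse_temperatures(samples):
    """Confidence-weighted fusion of repeated thermal readings of one
    surface point.  Each sample is (temp_C, incidence_deg, distance_m);
    head-on, close-range readings are trusted more (assumed weighting)."""
    num = den = 0.0
    for temp_c, incidence_deg, distance_m in samples:
        w = math.cos(math.radians(incidence_deg)) / (1.0 + distance_m) ** 2
        num += w * temp_c
        den += w
    return num / den

# Three passes over the same point: two near-normal close views dominate
# the oblique, distant one
readings = [(36.5, 10.0, 0.5), (36.9, 60.0, 1.5), (36.4, 5.0, 0.4)]
print(round(fuse_temperatures(readings), 2))
```

The fused value stays close to the two high-confidence readings, showing how unreliable oblique measurements are down-weighted rather than discarded.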
High angle view of Apollo 14 space vehicle on way to Pad A
1970-11-09
S70-54127 (9 Nov. 1970) --- A high-angle view at Launch Complex 39, Kennedy Space Center (KSC), showing the Apollo 14 (Spacecraft 110/Lunar Module 8/Saturn 509) space vehicle on the way from the Vehicle Assembly Building (VAB) to Pad A. The Saturn V stack and its mobile launch tower sit atop a huge crawler-transporter. The Apollo 14 crewmen will be astronauts Alan B. Shepard Jr., commander; Stuart A. Roosa, command module pilot; and Edgar D. Mitchell, lunar module pilot.
High angle view of Apollo 14 space vehicle on way to Pad A
1970-11-09
S70-54119 (9 Nov. 1970) --- A high-angle view at Launch Complex 39, Kennedy Space Center (KSC), showing the Apollo 14 (Spacecraft 110/Lunar Module 8/Saturn 509) space vehicle on the way from the Vehicle Assembly Building (VAB) to Pad A. The Saturn V stack and its mobile launch tower sit atop a huge crawler-transporter. The Apollo 14 crewmen will be astronauts Alan B. Shepard Jr., commander; Stuart A. Roosa, command module pilot; and Edgar D. Mitchell, lunar module pilot.
Leaf bidirectional reflectance and transmittance in corn and soybean
NASA Technical Reports Server (NTRS)
Walter-Shea, E. A.; Norman, J. M.; Blad, B. L.
1989-01-01
Bidirectional optical properties of leaves must be adequately characterized to develop comprehensive and reliably predictive canopy radiative-transfer models. Directional reflectance and transmittance factors of individual corn and soybean leaves were measured at source incidence angles (SIAs) of 20, 45, and 70 deg and at numerous view angles in the visible and NIR. Bidirectional reflectance distributions changed with increasing SIA, with forward scattering most pronounced at 70 deg. Directional-hemispherical reflectance generally increased and transmittance decreased with increasing SIA. Directional-hemispherical reflectance factors were higher, and transmittances lower, than the nadir-viewed reflectance component.
2017-11-21
After more than 13 years at Saturn, and with its fate sealed, NASA's Cassini spacecraft bid farewell to the Saturnian system by firing the shutters of its wide-angle camera and capturing this last, full mosaic of Saturn and its rings two days before the spacecraft's dramatic plunge into the planet's atmosphere. During the observation, a total of 80 wide-angle images were acquired in just over two hours. This view is constructed from 42 of those wide-angle shots, taken using the red, green and blue spectral filters, combined and mosaicked together to create a natural-color view. Six of Saturn's moons -- Enceladus, Epimetheus, Janus, Mimas, Pandora and Prometheus -- make a faint appearance in this image. (Numerous stars are also visible in the background.) A second version of the mosaic is provided in which the planet and its rings have been brightened, with the fainter regions brightened by a greater amount. (The moons and stars have also been brightened by a factor of 15 in this version.) The ice-covered moon Enceladus -- home to a global subsurface ocean that erupts into space -- can be seen at the 1 o'clock position. Directly below Enceladus, just outside the F ring (the thin, farthest ring from the planet seen in this image) lies the small moon Epimetheus. Following the F ring clockwise from Epimetheus, the next moon seen is Janus. At about the 4:30 position and outward from the F ring is Mimas. Inward of Mimas and still at about the 4:30 position is the F-ring-disrupting moon, Pandora. Moving around to the 10 o'clock position, just inside of the F ring, is the moon Prometheus. This view looks toward the sunlit side of the rings from about 15 degrees above the ring plane. Cassini was approximately 698,000 miles (1.1 million kilometers) from Saturn, on its final approach to the planet, when the images in this mosaic were taken. Image scale on Saturn is about 42 miles (67 kilometers) per pixel.
The image scale on the moons varies from 37 to 50 miles (59 to 80 kilometers) per pixel. The phase angle (the Sun-planet-spacecraft angle) is 138 degrees. The Cassini spacecraft ended its mission on Sept. 15, 2017. https://photojournal.jpl.nasa.gov/catalog/PIA17218
A precise laboratory goniometer system to collect spectral BRDF data of materials
NASA Astrophysics Data System (ADS)
Jiao, Guangping; Jiao, Ziti; Wang, Jie; Zhang, Hu; Dong, Yadong
2014-11-01
This paper presents a precise laboratory goniometer system to quickly collect the bidirectional reflectance distribution function (BRDF) of typical materials such as soil, canopy, and artificial materials in the laboratory. The system consists of the goniometer, an SVC HR1024 spectroradiometer, and a xenon long-arc lamp as the light source. An innovative cantilever slab reduces the shadow cast by the goniometer in the principal plane. The geometric precision of the footprint centre is better than ±4 cm in most azimuth directions, and the angle-controlling accuracy is better than 0.5°. The light source shows good stability, with a 0.8% irradiance decrease over 3 hours, but its large areal heterogeneity increases the data-processing effort needed to capture an accurate BRDF. First measurements were taken from soil at a resolution of 15° and 30° in the zenith and azimuth directions respectively, with a maximum view angle of ±50°; additional observations were taken in the hot-spot direction. The system takes about 40 minutes to complete all measurements. A Spectralon panel is measured at the beginning and end of the whole period. A simple interactive interface on the computer automatically controls all operations of the goniometer and the data processing. Laboratory experiments on a soil layer and a grass lawn show that the goniometer can capture the multi-angle variation of the BRDF.
2. OBLIQUE VIEW OF WEST FRONT. The frames on an ...
2. OBLIQUE VIEW OF WEST FRONT. The frames on an angle originally held mirrors for viewing the tests from inside the building. Vertical frame originally held bullet glass. - Edwards Air Force Base, South Base Sled Track, Firing Control Blockhouse, South of Sled Track at east end, Lancaster, Los Angeles County, CA
Reflection and emission models for deserts derived from Nimbus-7 ERB scanner measurements
NASA Technical Reports Server (NTRS)
Staylor, W. F.; Suttles, J. T.
1986-01-01
Broadband shortwave and longwave radiance measurements obtained from the Nimbus-7 Earth Radiation Budget scanner were used to develop reflectance and emittance models for the Sahara-Arabian, Gibson, and Saudi Deserts. The models were established by fitting the satellite measurements to analytic functions. For the shortwave, the model function is based on an approximate solution to the radiative transfer equation. The bidirectional-reflectance function was obtained from a single-scattering approximation with a Rayleigh-like phase function. The directional-reflectance model followed from integration of the bidirectional model and is a function of the sum and product of cosine solar and viewing zenith angles, thus satisfying reciprocity between these angles. The emittance model was based on a simple power-law of cosine viewing zenith angle.
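The functional forms the abstract describes can be sketched directly: a directional reflectance in the sum and product of the zenith-angle cosines (symmetric under exchange, hence reciprocal) and an emittance that is a power law in the viewing-zenith cosine. The coefficients below are illustrative assumptions, not the fitted desert values:

```python
import math

def directional_reflectance(mu0, mu, a=0.25, b=0.10, c=0.05):
    """Reflectance as a function of the sum and product of the cosines
    of the solar (mu0) and viewing (mu) zenith angles.  Exchanging mu0
    and mu leaves the value unchanged, so reciprocity is satisfied."""
    return a + b * (mu0 + mu) + c * (mu0 * mu)

def emittance(mu, eps0=0.92, k=0.1):
    """Simple power law in the cosine of the viewing zenith angle,
    matching the form named in the abstract (coefficients assumed)."""
    return eps0 * mu ** k

mu0, mu = math.cos(math.radians(30.0)), math.cos(math.radians(60.0))
# Reciprocity: swapping solar and viewing zenith angles changes nothing
print(directional_reflectance(mu0, mu) == directional_reflectance(mu, mu0))
```

Building the model from (mu0 + mu) and (mu0 * mu) is what guarantees reciprocity by construction, which is the structural point the abstract makes.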
8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...
8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
18. DETAIL VIEW OF DEVICE ON OUTSIDE OF COFFEE HUSKER ...
18. DETAIL VIEW OF DEVICE ON OUTSIDE OF COFFEE HUSKER THAT ADJUSTED ANGLE OF HUSKER VAT WALLS - Hacienda Cafetalera Santa Clara, Coffee Mill, KM 19, PR Route 372, Hacienda La Juanita, Yauco Municipio, PR
2. VAL CONTROL STATION, VIEW OF INTERIOR SHOWING EXTERIOR DOOR, ...
2. VAL CONTROL STATION, VIEW OF INTERIOR SHOWING EXTERIOR DOOR, WINDOWS AND CONTROL PANELS, LOOKING SOUTHEAST. - Variable Angle Launcher Complex, Control Station, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Ground-based full-sky imaging polarimeter based on liquid crystal variable retarders.
Zhang, Ying; Zhao, Huijie; Song, Ping; Shi, Shaoguang; Xu, Wujian; Liang, Xiao
2014-04-07
A ground-based full-sky imaging polarimeter based on liquid crystal variable retarders (LCVRs) is proposed in this paper. The proposed method can be used to realize rapid detection of skylight polarization information over a hemispherical field of view in the visible band. The characteristics of the incidence angle of light on the LCVR are investigated based on electrically controlled birefringence, and the imaging polarimeter with a hemispherical field of view is then designed. Furthermore, a polarization calibration method with field-of-view multiplexing and piecewise linear fitting is proposed, based on the rotational symmetry of the polarimeter, and the polarization calibration is implemented over the hemispherical field of view. The imaging polarimeter is evaluated in an experiment detecting the skylight image. The consistency between the experimentally obtained distribution of polarization angle and that predicted by the Rayleigh scattering model is 90%, which confirms the effectiveness of the proposed imaging polarimeter.
NASA Astrophysics Data System (ADS)
Penning de Vries, Marloes; Beirle, Steffen; Sihler, Holger; Wagner, Thomas
2017-04-01
The UV Aerosol Index (UVAI) is a simple measure of aerosols from satellite that is particularly sensitive to elevated layers of absorbing particles. It has been determined from a range of instruments including TOMS, GOME-2, and OMI, for almost four decades and will be continued in the upcoming Sentinel missions S5-precursor, S4, and S5. Despite its apparent simplicity, the interpretation of UVAI is not straightforward, as it depends on aerosol abundance, absorption, and altitude in a non-linear way. In addition, UVAI depends on the geometry of the measurement (viewing angle, solar zenith and relative azimuth angles), particularly if viewing angles exceed 45 degrees, as is the case for OMI and TROPOMI (on S5-precursor). The dependence on scattering angle complicates the interpretation and further processing (e.g., averaging) of UVAI. In certain favorable cases, however, independent information on aerosol altitude and absorption may become available. We present a detailed study of the scattering-angle dependence using SCIATRAN radiative transfer calculations. The model results were compared to observations of an extensive Siberian smoke plume, of which parts reached 10-12 km altitude. Due to its large extent and the high latitude, OMI observed the complete plume in five consecutive orbits under a wide range of scattering angles. This allowed us to deduce aerosol characteristics (absorption and layer height) that were compared with collocated CALIOP lidar measurements.
A multi-directional backlight for a wide-angle, glasses-free three-dimensional display.
Fattal, David; Peng, Zhen; Tran, Tho; Vo, Sonny; Fiorentino, Marco; Brug, Jim; Beausoleil, Raymond G
2013-03-21
Multiview three-dimensional (3D) displays can project the correct perspectives of a 3D image in many spatial directions simultaneously. They provide a 3D stereoscopic experience to many viewers at the same time with full motion parallax and do not require special glasses or eye tracking. None of the leading multiview 3D solutions is particularly well suited to mobile devices (watches, mobile phones or tablets), which require the combination of a thin, portable form factor, a high spatial resolution and a wide full-parallax view zone (for short viewing distance from potentially steep angles). Here we introduce a multi-directional diffractive backlight technology that permits the rendering of high-resolution, full-parallax 3D images in a very wide view zone (up to 180 degrees in principle) at an observation distance of up to a metre. The key to our design is a guided-wave illumination technique based on light-emitting diodes that produces wide-angle multiview images in colour from a thin planar transparent lightguide. Pixels associated with different views or colours are spatially multiplexed and can be independently addressed and modulated at video rate using an external shutter plane. To illustrate the capabilities of this technology, we use simple ink masks or a high-resolution commercial liquid-crystal display unit to demonstrate passive and active (30 frames per second) modulation of a 64-view backlight, producing 3D images with a spatial resolution of 88 pixels per inch and full-motion parallax in an unprecedented view zone of 90 degrees. We also present several transparent hand-held prototypes showing animated sequences of up to six different 200-view images at a resolution of 127 pixels per inch.
GLRS-R 2-colour retroreflector target design and predicted performance
NASA Astrophysics Data System (ADS)
Lund, Glenn
The retroreflector ground target design for the GLRS-R spaceborne dual-wavelength laser ranging system is described. The passive design flows down from the requirements of high station autonomy, a high global field of view, little or no multiple-pulse returns, and adequate optical cross section for most ranging geometries. The solution makes use of five hollow cube-corner retroreflectors, of which one points to the zenith and the remaining four are inclined from the vertical at uniform azimuthal spacings. The need for large retroreflectors is expected to generate narrow diffraction lobes. A good compromise solution is found by spoiling just one of the retroreflector dihedral angles from 90 deg, thus generating two symmetrically oriented diffraction lobes in the return beam. The required spoil angles are found to have little dependence on ground target latitude. Various link budget analyses are presented, showing the influence of factors such as point-ahead optimization, turbulence, ranging angle, atmospheric visibility, and ground target thermal deformations.
Wang, Wei; Chen, Xiyuan
2018-01-01
In view of the fact that the accuracy of the third-degree Cubature Kalman Filter (CKF) used for initial alignment under large-misalignment-angle conditions is insufficient, an improved fifth-degree CKF algorithm is proposed in this paper. In order to make full use of the innovation in filtering, the innovation covariance matrix is calculated recursively from the innovation sequence with an exponent fading factor, and a new adaptive error covariance matrix scaling algorithm is proposed. The Singular Value Decomposition (SVD) method is used to improve the numerical stability of the fifth-degree CKF. In order to avoid the overshoot caused by excessive scaling of the error covariance matrix during the convergence stage, the scaling scheme is terminated when the gradient of the azimuth reaches its maximum. The experimental results show that the improved algorithm has better alignment accuracy with large misalignment angles than the traditional algorithm. PMID:29473912
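The recursive innovation-covariance estimate with an exponent fading factor can be sketched as a geometrically weighted average of outer products of the innovation. The particular recursion below is a common fading-memory form, assumed for illustration rather than taken verbatim from the paper:

```python
import numpy as np

def update_innovation_cov(C_prev, innovation, k, rho=0.95):
    """Recursive estimate of the innovation covariance with fading
    factor rho in (0, 1): recent innovations are weighted more than
    old ones (a common fading-memory recursion, assumed here)."""
    d = np.atleast_2d(np.asarray(innovation, float)).T  # column vector
    if k == 0:
        return d @ d.T
    gamma = (1.0 - rho) / (1.0 - rho ** (k + 1))
    return (1.0 - gamma) * C_prev + gamma * (d @ d.T)

# Feeding a constant innovation drives the estimate to d d^T
C = None
for k in range(200):
    C = update_innovation_cov(C, np.array([1.0, -2.0]), k)
print(np.round(C, 3))
```

In an adaptive filter this estimate is compared against the predicted innovation covariance to decide how strongly to scale the error covariance matrix, which is the role it plays in the algorithm described above.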
Flow Behavior in Side-View Plane of Pitching Delta Wing
NASA Astrophysics Data System (ADS)
Pektas, Mehmet Can; Tasci, Mehmet Oguz; Karasu, Ilyas; Sahin, Besir; Akilli, Huseyin
2018-06-01
In the present investigation, a delta wing with a 70° sweep angle Λ was oscillated about its midchord according to the equation α(t) = αm + α0 sin(ωet). This study focuses on understanding the effect of pitching and characterizing the interaction of vortex breakdown with the oscillating leading edges under different yaw angles β over a slender delta wing. The mean angle of attack αm was taken as 25°. The yaw angle β was varied in intervals of 4° over the range 0° ≤ β ≤ 16°. The delta wing was pitched sinusoidally with periods in the range 5 s ≤ Te ≤ 60 s, the reduced frequency was set as K = 0.16, 0.25, 0.49 and 1.96, and the amplitude of the pitching motion was α0 = ±5°. Formations and locations of vortex breakdown were investigated using the dye visualization technique in the side-view plane.
Multi-Angle Snowflake Camera Instrument Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuefer, Martin; Bailey, J.
2016-07-01
The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' field of view within a 10º angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
A method and data for video monitor sizing. [human CRT viewing requirements
NASA Technical Reports Server (NTRS)
Kirkpatrick, M., III; Shields, N. L., Jr.; Malone, T. B.; Guerin, E. G.
1976-01-01
The paper outlines an approach that combines analytical methods and empirical data to determine monitor size constraints based on the human operator's CRT viewing requirements, in a context where panel space and volume considerations for the Space Shuttle aft cabin constrain the size of the monitor to be used. Two cases are examined: remote scene imaging and alphanumeric character display. The central parameter used to constrain monitor size is the ratio M/L, where M is the monitor dimension and L the viewing distance. The study is restricted largely to 525-line video systems having an SNR of 32 dB and a bandwidth of 4.5 MHz. Degradation in these parameters would require changes in the empirically determined visual angle constants presented. The data and methods described are considered to apply to cases where operators are required to view, via TV, target objects which are well differentiated from the background and where the background is relatively sparse. It is also necessary to identify the critical target dimensions and cues.
Pair Production and Gamma-Ray Emission in the Outer Magnetospheres of Rapidly Spinning Young Pulsars
NASA Technical Reports Server (NTRS)
Ruderman, Malvin; Chen, Kaiyou
1997-01-01
Electron-positron pair production and acceleration in the outer magnetosphere may be crucial for a young, rapidly spinning canonical pulsar to be a strong Gamma-ray emitter. Collision between curvature-radiated GeV photons and soft X-ray photons seems to be the only efficient pair production mechanism. For Crab-like pulsars, the magnetic field near the light cylinder is so strong that the synchrotron radiation of secondary pairs will be in the needed X-ray range. However, for the majority of the known Gamma-ray pulsars, surface-emitted X-rays seem to work as the matches and fuel for a gamma-ray generation fireball in the outer magnetosphere. The needed X-rays could come from thermal emission of a cooling neutron star or could be the heat generated by bombardment of the polar cap by energetic particles generated in the outer magnetosphere. With the detection of more Gamma-ray pulsars, it is becoming evident that the neutron star's intrinsic geometry (the inclination angle between the rotation and magnetic axes) and observational geometry (the viewing angle with respect to the rotation axis) are crucial to understanding the variety of observational properties exhibited by these pulsars. Inclination angles for many known high-energy Gamma-ray pulsars appear to be large, and the distribution seems to be consistent with random orientation. However, all of them except Geminga are pre-selected from known radio pulsars. The viewing angles are thus limited to be around the respective inclination angles for beamed radio emission, which may induce a strong selection effect. The viewing angles as well as the inclination angles of PSR 1509-58 and PSR 0656+14 may be small, such that most of the high-energy Gamma-rays produced in the outer accelerators may not reach the observer's direction. The observed Gamma-rays below 5 MeV from these pulsars may be synchrotron radiation of secondary electron-positron pairs produced outside the accelerating regions.
Limited-angle tomography for analyzer-based phase-contrast X-ray imaging
Majidi, Keivan; Wernick, Miles N; Li, Jun; Muehleman, Carol; Brankov, Jovan G
2014-01-01
Multiple-Image Radiography (MIR) is an analyzer-based phase-contrast X-ray imaging (ABI) method, which is emerging as a potential alternative to conventional radiography. MIR simultaneously generates three planar parametric images containing information about scattering, refraction and attenuation properties of the object. The MIR planar images are linear tomographic projections of the corresponding object properties, which allows reconstruction of volumetric images using computed tomography (CT) methods. However, when acquiring a full range of linear projections around the tissue of interest is not feasible or the scanning time is limited, limited-angle tomography techniques can be used to reconstruct these volumetric images near the central plane, which is the plane that contains the pivot point of the tomographic movement. In this work, we use computer simulations to explore the applicability of limited-angle tomography to MIR. We also investigate the accuracy of reconstructions as a function of number of tomographic angles for a fixed total radiation exposure. We use this function to find an optimal range of angles over which data should be acquired for limited-angle tomography MIR (LAT-MIR). Next, we apply the LAT-MIR technique to experimentally acquired MIR projections obtained in a cadaveric human thumb study. We compare the reconstructed slices near the central plane to the same slices reconstructed by CT-MIR using the full angular view around the object. Finally, we perform a task-based evaluation of LAT-MIR performance for different numbers of angular views, and use template matching to detect cartilage in the refraction image near the central plane. We use the signal-to-noise ratio of this test as the detectability metric to investigate an optimum range of tomographic angles for detecting soft tissues in LAT-MIR. 
Both results show that there is an optimum range of angular view for data acquisition where LAT-MIR yields the best performance, comparable to CT-MIR only if one considers volumetric images near the central plane and not the whole volume. PMID:24898008
Limited-angle tomography for analyzer-based phase-contrast x-ray imaging
NASA Astrophysics Data System (ADS)
Majidi, Keivan; Wernick, Miles N.; Li, Jun; Muehleman, Carol; Brankov, Jovan G.
2014-07-01
Multiple-image radiography (MIR) is an analyzer-based phase-contrast x-ray imaging method, which is emerging as a potential alternative to conventional radiography. MIR simultaneously generates three planar parametric images containing information about scattering, refraction and attenuation properties of the object. The MIR planar images are linear tomographic projections of the corresponding object properties, which allows reconstruction of volumetric images using computed tomography (CT) methods. However, when acquiring a full range of linear projections around the tissue of interest is not feasible or the scanning time is limited, limited-angle tomography techniques can be used to reconstruct these volumetric images near the central plane, which is the plane that contains the pivot point of the tomographic movement. In this work, we use computer simulations to explore the applicability of limited-angle tomography to MIR. We also investigate the accuracy of reconstructions as a function of number of tomographic angles for a fixed total radiation exposure. We use this function to find an optimal range of angles over which data should be acquired for limited-angle tomography MIR (LAT-MIR). Next, we apply the LAT-MIR technique to experimentally acquired MIR projections obtained in a cadaveric human thumb study. We compare the reconstructed slices near the central plane to the same slices reconstructed by CT-MIR using the full angular view around the object. Finally, we perform a task-based evaluation of LAT-MIR performance for different numbers of angular views, and use template matching to detect cartilage in the refraction image near the central plane. We use the signal-to-noise ratio of this test as the detectability metric to investigate an optimum range of tomographic angles for detecting soft tissues in LAT-MIR. 
Both results show that there is an optimum range of angular view for data acquisition where LAT-MIR yields the best performance, comparable to CT-MIR only if one considers volumetric images near the central plane and not the whole volume.
Spectral bidirectional reflectance of Antarctic snow: Measurements and parameterization
NASA Astrophysics Data System (ADS)
Hudson, Stephen R.; Warren, Stephen G.; Brandt, Richard E.; Grenfell, Thomas C.; Six, Delphine
2006-09-01
The bidirectional reflectance distribution function (BRDF) of snow was measured from a 32-m tower at Dome C, at latitude 75°S on the East Antarctic Plateau. These measurements were made at 96 solar zenith angles between 51° and 87° and cover wavelengths 350-2400 nm, with 3- to 30-nm resolution, over the full range of viewing geometry. The BRDF at 900 nm had previously been measured at the South Pole; the Dome C measurement at that wavelength is similar. At both locations the natural roughness of the snow surface causes the anisotropy of the BRDF to be less than that of flat snow. The inherent BRDF of the snow is nearly constant in the high-albedo part of the spectrum (350-900 nm), but the angular distribution of reflected radiance becomes more isotropic at the shorter wavelengths because of atmospheric Rayleigh scattering. Parameterizations were developed for the anisotropic reflectance factor using a small number of empirical orthogonal functions. Because the reflectance is more anisotropic at wavelengths at which ice is more absorptive, albedo rather than wavelength is used as a predictor in the near infrared. The parameterizations cover nearly all viewing angles and are applicable to the high parts of the Antarctic Plateau that have small surface roughness and, at viewing zenith angles less than 55°, elsewhere on the plateau, where larger surface roughness affects the BRDF at larger viewing angles. The root-mean-squared error of the parameterized reflectances is between 2% and 4% at wavelengths less than 1400 nm and between 5% and 8% at longer wavelengths.
Smoke from Fires in Southern Mexico
NASA Technical Reports Server (NTRS)
2002-01-01
On May 2, 2002, numerous fires in southern Mexico sent smoke drifting northward over the Gulf of Mexico. These views from the Multi-angle Imaging SpectroRadiometer (MISR) illustrate the smoke extent over parts of the Gulf and the southern Mexican states of Tabasco, Campeche and Chiapas. At the same time, dozens of other fires were also burning in the Yucatan Peninsula and across Central America. A similar situation occurred in May and June of 1998, when Central American fires resulted in air quality warnings for several U.S. states. The image on the left is a natural color view acquired by MISR's vertical-viewing (nadir) camera. Smoke is visible, but sunglint in some ocean areas makes detection difficult. The middle image, on the other hand, is a natural color view acquired by MISR's 70-degree backward-viewing camera; its oblique view angle simultaneously suppresses sunglint and enhances the smoke. A map of aerosol optical depth, a measurement of the abundance of atmospheric particulates, is provided on the right. This quantity is retrieved using an automated computer algorithm that takes advantage of MISR's multi-angle capability. Areas where no retrieval occurred are shown in black. The images each represent an area of about 380 kilometers x 1550 kilometers and were captured during Terra orbit 12616. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
The ASTRO-H SXT Performance to the Large Off-Set Angles
NASA Technical Reports Server (NTRS)
Sato, Toshiki; Iizuka, Ryo; Mori, Hideyuki; Hayashi, Takayuki; Maeda, Yoshitomo; Ishida, Manabu; Kikuchi, Naomichi; Kurashima, Sho; Nakaniwa, Nozomi; Okajima, Takashi;
2016-01-01
The X-ray astronomy satellite ASTRO-H, which is the 6th Japanese X-ray astronomy satellite and was renamed Hitomi after launch, is designed to observe celestial X-ray objects in a wide energy band from a few hundred eV to 600 keV. The Soft X-ray Telescopes (SXTs) onboard ASTRO-H collect and image X-rays up to approximately 12 keV. Although the field of view of the SXT is approximately 15' (FWHM), due to the thin-foil-nested Wolter-I type optics adopted in the SXTs, X-rays from outside the field of view can reach the focal plane without experiencing a normal double reflection. This component is referred to as 'stray light'. From investigations of the stray light so far, 'secondary reflection' has been identified as its main component, consisting of X-rays reflected only by the secondary reflectors. In order to cut the secondary reflections, a 'pre-collimator' is mounted on top of the SXTs. However, the pre-collimator cannot cut all the stray light in some off-axis angle domains. In this study, we measure the brightness of the stray light of the SXTs at some representative off-axis angles using the ISAS X-ray beam line. ASTRO-H is equipped with two SXT modules: one for the Soft X-ray Spectrometer (SXS), an X-ray calorimeter, and the other for the Soft X-ray Imager (SXI), an X-ray CCD camera. These SXT modules are called SXT-S and SXT-I, respectively. Of the two detector systems, the SXI has a large field of view, a square 38' on a side. To cope with this, we have made a mosaic mapping of the stray light at a representative off-axis angle of 30' in the X-ray beam line at the Institute of Space and Astronautical Science. The effective area of the brightest secondary reflection is found to be of order 0.1% of the on-axis effective area at an energy of 1.49 keV. The other components are not as bright (less than 5 x 10(exp -4) times the on-axis effective area). 
On the other hand, we have found that the effective area of the stray light in the SXS field of view (approximately 3' x 3') at large off-axis angles (greater than 15') is approximately 10(exp -4) times the on-axis effective area (approximately 590 sq cm at 1.49 keV).
NASA Astrophysics Data System (ADS)
Cloutis, Edward A.; Pietrasz, Valerie B.; Kiddell, Cain; Izawa, Matthew R. M.; Vernazza, Pierre; Burbine, Thomas H.; DeMeo, Francesca; Tait, Kimberly T.; Bell, James F.; Mann, Paul; Applin, Daniel M.; Reddy, Vishnu
2018-05-01
Carbonaceous chondrites (CCs) are important materials for understanding the early evolution of the solar system and delivery of volatiles and organic material to the early Earth. Presumed CC-like asteroids are also the targets of two current sample return missions: OSIRIS-REx to asteroid Bennu and Hayabusa-2 to asteroid Ryugu, and the Dawn orbital mission at asteroid Ceres. To improve our ability to identify and characterize CM2 CC-type parent bodies, we have examined how factors such as particle size, particle packing, and viewing geometry affect reflectance spectra of the Murchison CM2 CC. The derived relationships have implications for disc-resolved examinations of dark asteroids and sampleability. It has been found that reflectance spectra of slabs are more blue-sloped (reflectance decreasing toward longer wavelengths as measured by the 1.8/0.6 μm reflectance ratio), and generally darker, than powdered sample spectra. Decreasing the maximum grain size of a powdered sample results in progressively brighter and more red-sloped spectra. Decreasing the average grain size of a powdered sample results in a decrease in diagnostic absorption band depths, and redder and brighter spectra. Decreasing porosity of powders and variations in surface texture result in spectral changes that may be different as a function of viewing geometry. Increasing thickness of loose dust on a denser powdered substrate leads to a decrease in absorption band depths. Changes in viewing geometry lead to different changes in spectral metrics depending on whether the spectra are acquired in backscatter or forward-scatter geometries. In backscattered geometry, increasing phase angle leads to an initial increase and then decrease in spectral slope, and a general decrease in visible region reflectance and absorption band depths, and frequent decreases in absorption band minima positions. 
In forward scattering geometry, increasing phase angle leads to small non-systematic changes in spectral slope, and general decreases in visible region reflectance, and absorption band depths. The highest albedos and larger band depths are generally seen in the lowest phase angle backscattering geometry spectra. The reddest spectra are generally seen in the lowest phase angle backscatter geometry spectra. For the same phase angle, spectra acquired in forward scatter geometry are generally redder and darker and have shallower absorption bands than those acquired in backscatter geometry. Overall, backscatter geometry-acquired spectra are flatter, brighter, and have deeper 0.7 μm region absorption band depths than forward scatter geometry-acquired spectra. It was also found that the 0.7, 0.9, and 1.1 μm absorption bands in Murchison spectra, which are attributable to various Fe electronic processes, are ubiquitous and can be used to recognize CM2 chondrites regardless of the physical properties of the meteorite and viewing geometry.
Park, Ju Yong; Hwang, Se Won; Hwang, Kun
2013-11-01
The aim of this study was to compare painted portraits of beautiful women, femmes fatales, and artists' mothers using anthropometry. Portraits of each theme were selected from modern novels, essays and picture books, and categorized. A total of 52 samples were collected, including 20 beautiful women, 20 femmes fatales, and 12 artists' mothers. In 5 persons, 17 anthropometric ratios including the alae-alae/zygion-zygion ratio were compared between 15-degree oblique view and anteroposterior view photographs, and were shown not to differ significantly. To identify oblique portraits of less than 15 degrees, we measured the exocanthion-stomion-exocanthion (ESE) angle in photographs of 5 volunteers. The mean ± SD of the ESE angle was 64.52 ± 4.87 degrees in the 15-degree angle view and 57.68 ± 54.09 in the 30-degree angle view. Thereafter, if the ESE angle was greater than 65 degrees, we considered the portrait to have less than a 15-degree angle and included it in the samples. The ratio did not differ significantly in 11 of the anthropometric proportions; the remaining 5 proportions differed significantly. Beautiful women had wider noses (85% of the endocanthion-endocanthion width) than the femme fatale group (77%). Lips in the beautiful woman group were fuller and thicker (36% of the lip width) than in the artists' mother group (27%). The femme fatale group was relatively similar to the beautiful women, for example in having full, thick lips. However, the femme fatale group had the attractive midface ratio (36% of the total face height) mentioned in the older literature, and their noses were narrower and sharper (77% of the endocanthion-endocanthion width) than those of the beautiful women (85%). 
The artists' mother group had a relatively narrower upper face (29% of the total face height) and thinner lips (27% of the lip width) compared with the other 2 groups (36%). Proportions from works of art are more ideal and attractive than clinically measured proportions. The ideal ratios measured from historical portraits might be useful in planning facial surgeries.
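The ESE-angle screening described above is simple planar geometry; a sketch (with made-up landmark coordinates) of the angle at the stomion subtended by the two exocanthia:

```python
import math

def ese_angle(exo_left, exo_right, stomion):
    """Angle in degrees at the stomion between the two exocanthion directions."""
    v1 = (exo_left[0] - stomion[0], exo_left[1] - stomion[1])
    v2 = (exo_right[0] - stomion[0], exo_right[1] - stomion[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Symmetric frontal landmarks give a wide angle; head rotation narrows it.
angle = ese_angle((-3.0, 4.0), (3.0, 4.0), (0.0, 0.0))
```

Under the study's screening rule, a portrait whose measured ESE angle exceeds 65 degrees would be treated as within about 15 degrees of a frontal view.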
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Atmospheric Science Data Center
2014-05-15
... the Multi-angle Imaging SpectroRadiometer (MISR). On the left, a natural-color view acquired by MISR's vertical-viewing (nadir) camera ... Gunnison River at the city of Grand Junction. The striking "L" shaped feature in the lower image center is a sandstone monocline known as ...
A cylindrical specimen holder for electron cryo-tomography
Palmer, Colin M.; Löwe, Jan
2014-01-01
The use of slab-like flat specimens for electron cryo-tomography restricts the range of viewing angles that can be used. This leads to the “missing wedge” problem, which causes artefacts and anisotropic resolution in reconstructed tomograms. Cylindrical specimens provide a way to eliminate the problem, since they allow imaging from a full range of viewing angles around the tilt axis. Such specimens have been used before for tomography of radiation-insensitive samples at room temperature, but never for frozen-hydrated specimens. Here, we demonstrate the use of thin-walled carbon tubes as specimen holders, allowing the preparation of cylindrical frozen-hydrated samples of ribosomes, liposomes and whole bacterial cells. Images acquired from these cylinders have equal quality at all viewing angles, and the accessible tilt range is restricted only by the physical limits of the microscope. Tomographic reconstructions of these specimens demonstrate that the effects of the missing wedge are substantially reduced, and could be completely eliminated if a full tilt range was used. The overall quality of these tomograms is still lower than that obtained by existing methods, but improvements are likely in future. PMID:24275523
Fougnie, B; Frouin, R; Lecomte, P; Deschamps, P Y
1999-06-20
Reflected skylight in above-water measurements of diffuse marine reflectance can be reduced substantially by viewing the surface through an analyzer transmitting the vertically polarized component of incident radiance. For maximum reduction of effects, radiometric measurements should be made at a viewing zenith angle of approximately 45 degrees (near the Brewster angle) and a relative azimuth angle between solar and viewing directions greater than 90 degrees (backscattering), preferably 135 degrees. In this case the residual reflected skylight in the polarized signal exhibits minimum sensitivity to the sea state and can be corrected to within a few 10(-4) in reflectance units. For most oceanic waters the resulting relative error on the diffuse marine reflectance in the blue and green is less than 1%. Since the water body polarizes incident skylight, the measured polarized reflectance differs from the total reflectance. The difference, however, is small for the considered geometry. Measurements made at the Scripps Institution of Oceanography pier in La Jolla, Calif., with a specifically designed scanning polarization radiometer, confirm the theoretical findings and demonstrate the usefulness of polarization radiometry for measuring diffuse marine reflectance.
He, Xing; Li, Hua; Shao, Yan; Shi, Bing
2015-01-01
The purpose of this study was to ascertain objective nasal measurements from the basal view that are predictive of nasal esthetics in individuals with secondary cleft nasal deformity. Thirty-three patients who had undergone unilateral cleft lip repair were retrospectively reviewed in this study. The degree of nasal deformity was subjectively ranked by seven surgeons using standardized basal-view measurements. Nine objective physical parameters, including angles and ratios, were measured. Correlations and regressions between these objective and subjective measurements were then analyzed. There was high concordance in the subjective measurements by different surgeons (Kendall's coefficient of concordance W = .825, P = .006). The strongest predictive factors for nasal esthetics were the nasal alar length ratio (r = .370, P = .034) and the degree of deviation of the columellar axis (r = .451, P = .008). The columellar angle had a more powerful effect in rating nasal esthetics. There was reliable concordance in the subjective ranking of nasal esthetics by surgeons. Measurement of the columellar angle may serve as an independent, objective predictor of the esthetics of the nose.
View of the launch of STS 51-A shuttle Discovery
NASA Technical Reports Server (NTRS)
1984-01-01
View across the water of the launch of STS 51-A shuttle Discovery. The orbiter is just clearing the launch pad (90032); closer view of the Shuttle Discovery just clearing the launch pad. Photo was taken from across the river, with trees and shrubs forming the bottom edge of the view (90033); Low angle view of the rapidly climbing Discovery, still attached to its two solid rocket boosters and an external fuel tank (90034).
A laser technique for characterizing the geometry of plant canopies
NASA Technical Reports Server (NTRS)
Vanderbilt, V. C.; Silva, L. F.; Bauer, M. E.
1977-01-01
The interception of solar power by the canopy is investigated as a function of solar zenith angle (time), component of the canopy, and depth into the canopy. The projected foliage area, cumulative leaf area, and view factors within the canopy are examined as a function of the same parameters. Two systems are proposed that are capable of describing the geometrical aspects of a vegetative canopy and of operation in an automatic mode. Either system would provide sufficient data to yield a numerical map of the foliage area in the canopy. Both systems would involve the collection of large data sets in a short time period using minimal manpower.
Development of 40-in hybrid hologram screen for auto-stereoscopic video display
NASA Astrophysics Data System (ADS)
Song, Hyun Ho; Nakashima, Y.; Momonoi, Y.; Honda, Toshio
2004-06-01
Auto-stereoscopic displays usually have two problems. The first is that a large image display is difficult, and the second is that the view zone (the zone in which both eyes must be placed for stereoscopic or 3-D image observation) is very narrow. We have been developing an auto-stereoscopic large video display system (over 100 inches diagonal) which a few people can view simultaneously [1,2]. Usually in displays that are over 100 inches diagonal, an optical video projection system is used. As one auto-stereoscopic display system, the hologram screen has been proposed [3-6]. However, if the hologram screen becomes too large, the view zone (corresponding to the reconstructed diffused object) suffers color dispersion and color aberration [7]. We therefore proposed attaching an additional Fresnel lens to the hologram screen. We call this screen a "hybrid hologram screen" (HHS for short). We made an HHS of 866 mm(H) × 433 mm(V) (about 40 inches diagonal) [8-11]. By using the lens in the reconstruction step, the angle between the object light and the reference light can be made small compared to the case without the lens, so the spread of the view zone caused by color dispersion and color aberration becomes small. In addition, the virtual image reconstructed from the hologram screen can be transformed into a real image (view zone), so it is not necessary to use a large lens or concave mirror when making a large hologram screen.
Position Estimation for Switched Reluctance Motor Based on the Single Threshold Angle
NASA Astrophysics Data System (ADS)
Zhang, Lei; Li, Pang; Yu, Yue
2017-05-01
This paper presents a position estimation model for a switched reluctance motor based on a single threshold angle. In view of the relationship between inductance and rotor position, the position is estimated by comparing the real-time dynamic flux linkage with the flux linkage at the threshold-angle position (a 7.5° threshold angle for a 12/8 SRM). The sensorless model is built in Matlab/Simulink, simulations are carried out under both steady-state and transient conditions, and the validity and feasibility of the method are verified.
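A minimal sketch of the flux-linkage comparison idea (illustrative parameters and a constant-excitation signal, not the paper's 12/8 SRM model): the phase flux linkage is integrated from terminal quantities as ψ = ∫(v − Ri) dt and compared against the stored flux linkage of the threshold-angle position to detect the instant the rotor crosses that angle.

```python
R = 0.5    # phase resistance (ohm), illustrative value
DT = 1e-5  # sampling period (s)

def estimate_crossing(v, i, psi_threshold):
    """Integrate psi = ∫(v - R*i) dt sample by sample and return the
    first index at which the flux linkage reaches the threshold-angle
    value, i.e. the estimated rotor-position crossing instant."""
    psi = 0.0
    for k, (vk, ik) in enumerate(zip(v, i)):
        psi += (vk - R * ik) * DT
        if psi >= psi_threshold:
            return k
    return None

# With constant excitation psi grows linearly, so the crossing sample
# is predictable: each step adds (10 - 0.5*2) * 1e-5 = 9e-5 Wb.
k = estimate_crossing([10.0] * 1000, [2.0] * 1000, psi_threshold=0.005)
```

In a real drive the threshold flux linkage depends on the phase current (via the magnetization curve at the threshold angle), so `psi_threshold` would be a lookup table in current rather than a constant.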
10. Elevation view of south side of FrankJensen Summer Home. ...
10. Elevation view of south side of Frank-Jensen Summer Home. Note that the steep angle of view gives an illusion of a flat roof. For a more accurate depiction of the roof line, see photos WA-207-4 and WA-207-8. - Frank-Jensen Summer Home, 17423 North Lake Shore Drive, Telma, Chelan County, WA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, M; Ellis, R; Brooks, N
A video camera system is described that measures the spatial distribution of visible line emission emitted from the main scrape-off layer (SOL) of plasmas in the DIII-D tokamak. A wide-angle lens installed on an equatorial port and an in-vessel mirror which intercepts part of the lens view provide simultaneous tangential views of the SOL on the low-field and high-field sides of the plasma's equatorial plane. Tomographic reconstruction techniques are used to calculate the 2-D poloidal profiles from the raw data, and 1-D poloidal profiles simulating chordal views of other optical diagnostics from the 2-D profiles. The 2-D profiles can be compared with SOL plasma simulations; the 1-D profiles with measurements from spectroscopic diagnostics. Sample results are presented which elucidate carbon transport in plasmas with toroidally uniform injection of methane and argon transport in disruption mitigation experiments with massive gas jet injection.
NASA Astrophysics Data System (ADS)
Poudyal, R.; Singh, M. K.; Gatebe, C. K.; Gautam, R.; Varnai, T.
2015-12-01
In this study, an empirical relationship between reflectances measured at different sun-satellite geometries is established using airborne Cloud Absorption Radiometer (CAR) reflectance measurements of smoke. It is observed that the reflectance of smoke aerosol at any viewing zenith angle can be computed as a linear combination of the reflectances at two viewing zenith angles, one less than 30° and the other greater than 60°. We found that the parameters of the linear combination follow a third-order polynomial function of the viewing geometry. Similar relationships were also established for different relative azimuth angles: reflectance at any azimuth angle can be written as a linear combination of measurements at two azimuth angles, one in the forward-scattering direction and the other in the backward-scattering direction, with both close to the principal plane. These relationships allowed us to create an Angular Distribution Model (ADM) for smoke, which can estimate reflectances in any direction based on measurements taken in four view directions. The model was tested by calculating the ADM parameters using CAR data from the SCAR-B campaign and applying these parameters to different smoke cases at three spectral channels (340 nm, 380 nm and 470 nm). We also tested our modelled smoke ADM formulas against the Absorbing Aerosol Index (AAI) computed directly from the CAR data at 340 nm and 380 nm, which is probably the first study to analyze the complete multi-angular distribution of AAI for smoke aerosols. The RMSE (and mean error) of predicted reflectance for the SCAR-B and ARCTAS smoke ADMs were found to be 0.002 (1.5%) and 0.047 (6%), respectively. The accuracy of the ADM formulation is also tested through radiative transfer simulations for a wide variety of situations (varying smoke loading, underlying surface types, etc.).
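The core relation — reflectance at an arbitrary viewing zenith angle expressed as a linear combination of reflectances at one small (<30°) and one large (>60°) angle — can be illustrated with a least-squares fit on synthetic data (the two-term angular model below is invented for illustration; it is not the CAR smoke BRDF):

```python
import numpy as np

rng = np.random.default_rng(0)
theta1, theta2, theta_t = np.radians([20.0, 65.0, 45.0])  # anchor and target angles

def reflectance(u, w, theta):
    """Synthetic two-parameter angular model: R = u + w*cos(theta)."""
    return u + w * np.cos(theta)

# Many "scenes" with different loadings, each observed at the two anchor angles.
u = rng.uniform(0.1, 0.5, 50)
w = rng.uniform(0.0, 0.2, 50)
X = np.column_stack([reflectance(u, w, theta1), reflectance(u, w, theta2)])
y = reflectance(u, w, theta_t)

coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)  # [a, b] of the linear combination
max_err = np.abs(X @ coeffs - y).max()
```

Because the synthetic model is exactly a two-parameter family, the fit is essentially perfect; in the paper's ADM the coefficients themselves vary with the target viewing angle as third-order polynomials, whereas this sketch fits them for a single target angle.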
2015-05-04
Saturn's surface is painted with swirls and shadows. Each swirl here is a weather system, reminding us of how dynamic Saturn's atmosphere is. Images taken in the near-infrared (like this one) permit us to peer through Saturn's methane haze layer to the clouds below. Scientists track the clouds and weather systems in the hopes of better understanding Saturn's complex atmosphere - and thus Earth's as well. This view looks toward the sunlit side of the rings from about 17 degrees above the ringplane. The image was taken with the Cassini spacecraft wide-angle camera on Feb. 8, 2015 using a spectral filter which preferentially admits wavelengths of near-infrared light centered at 752 nanometers. The view was obtained at a distance of approximately 794,000 miles (1.3 million kilometers) from Saturn. Image scale is 47 miles (76 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/pia18311
View of ASTRO-2 payload in cargo bay of STS-67 Endeavour
1995-03-17
STS067-713-072 (2-18 March 1995) --- This 70mm cargo bay scene, backdropped against a desert area of Namibia, typifies the view that daily greeted the Astro-2 crew members during their almost 17 days aboard the Space Shuttle Endeavour. Positioned on the Spacelab pallet amidst other hardware, the Astro-2 payload is in its operational mode. Visible here are the Instrument Pointing System (IPS), Hopkins Ultraviolet Telescope (HUT), Star Tracker (ST), Ultraviolet Imaging Telescope (UIT), Wisconsin Ultraviolet Photo-Polarimeter Experiment (WUPPE), and Integrated Radiator System (IRS). At this angle, the Optical Sensor Package (OPS) is not seen. The Igloo, which supports the package of experiments, is in center foreground. Two Get-Away Special (GAS) canisters are in lower left foreground. The Extended Duration Orbiter (EDO) pallet, located aft of the cargo bay, is obscured by the Astro-2 payload. The Endeavour was 190 nautical miles above Earth.
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...
2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA
Prediction of Viking lander camera image quality
NASA Technical Reports Server (NTRS)
Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.
1976-01-01
Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.
2017-09-12
NASA's Cassini spacecraft gazed toward the northern hemisphere of Saturn to spy subtle, multi-hued bands in the clouds there. This view looks toward the terminator -- the dividing line between night and day -- at lower left. The sun shines at low angles along this boundary, in places highlighting vertical structure in the clouds. Some vertical relief is apparent in this view, with higher clouds casting shadows over those at lower altitude. Images taken with the Cassini spacecraft narrow-angle camera using red, green and blue spectral filters were combined to create this natural-color view. The images were acquired on Aug. 31, 2017, at a distance of approximately 700,000 miles (1.1 million kilometers) from Saturn. Image scale is about 4 miles (6 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21888
Rotary acceleration of a subject inhibits choice reaction time to motion in peripheral vision
NASA Technical Reports Server (NTRS)
Borkenhagen, J. M.
1974-01-01
Twelve pilots were tested in a rotation device with visual simulation, alone and in combination with rotary stimulation, at variable levels of acceleration and variable viewing angles, in a study of the effect of the subject's rotary acceleration on choice reaction time to an accelerating target in peripheral vision. The pilots responded to the direction of the visual motion by moving a hand controller to the right or left. Visual-plus-rotary stimulation required a longer choice reaction time, which was inversely related to the level of acceleration and directly proportional to the viewing angle.
Effects of soil and canopy characteristics on microwave backscattering of vegetation
NASA Technical Reports Server (NTRS)
Daughtry, C. S. T.; Ranson, K. J.
1991-01-01
A frequency modulated continuous wave C-band (4.8 GHz) scatterometer was mounted on an aerial lift truck and backscatter coefficients of corn were acquired as functions of polarizations, view angles, and row directions. As phytomass and green leaf area index increased, the backscatter also increased. Near anthesis when the canopies were fully developed, the major scattering elements were located in the upper 1 m of the 2.8 m tall canopy and little backscatter was measured below that level. C-band backscatter data could provide information to monitor vegetation at large view zenith angles.
16. SOUTH TO VIEW OF CIRCA 1900 MICHIGAN MACHINERY MFG. ...
16. SOUTH TO VIEW OF CIRCA 1900 MICHIGAN MACHINERY MFG. CO. PUNCH PRESS WITH WOOD-BURNING HEATING STOVE LOCATED IN THE CENTER OF THE FACTORY BUILDING. BESIDE THE HEATING STOVE, POINTING TOWARD THE PUNCH PRESS, IS A JIG USED TO POSITION ANGLE STEEL COMPONENTS OF STEEL WINDMILL TOWER LEGS FOR PUNCHING BOLT HOLES. THE SUPPORT FOR THE BRICK FLUE OF THE HEATING STOVE IS CONSTRUCTED FROM SALVAGED GALVANIZED ANGLE STEEL OF THE TYPE USED IN FABRICATING WINDMILL TOWERS MANUFACTURED IN THE FACTORY. - Kregel Windmill Company Factory, 1416 Central Avenue, Nebraska City, Otoe County, NE
Impact Angle and Time Control Guidance Under Field-of-View Constraints and Maneuver Limits
NASA Astrophysics Data System (ADS)
Shim, Sang-Wook; Hong, Seong-Min; Moon, Gun-Hee; Tahk, Min-Jea
2018-04-01
This paper proposes a guidance law which considers the constraints of seeker field-of-view (FOV) as well as the requirements on impact angle and time. The proposed guidance law is designed for a constant speed missile against a stationary target. The guidance law consists of two terms of acceleration commands. The first one is to achieve zero-miss distance and the desired impact angle, while the second is to meet the desired impact time. To consider the limits of FOV and lateral maneuver capability, a varying-gain approach is applied on the second term. Reduction of realizable impact times due to these limits is then analyzed by finding the longest course among the feasible ones. The performance of the proposed guidance law is demonstrated by numerical simulation for various engagement conditions.
A Prototype Instrument for Adaptive SPECT Imaging
Freed, Melanie; Kupinski, Matthew A.; Furenlid, Lars R.; Barrett, Harrison H.
2015-01-01
We have designed and constructed a small-animal adaptive SPECT imaging system as a prototype for quantifying the potential benefit of adaptive SPECT imaging over the traditional fixed geometry approach. The optical design of the system is based on filling the detector with the object for each viewing angle, maximizing the sensitivity, and optimizing the resolution in the projection images. Additional feedback rules for determining the optimal geometry of the system can be easily added to the existing control software. Preliminary data have been taken of a phantom with a small, hot, offset lesion in a flat background in both adaptive and fixed geometry modes. Comparison of the predicted system behavior with the actual system behavior is presented along with recommendations for system improvements. PMID:26346820
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y; Yin, F; Ren, L
2014-06-15
Purpose: To develop a quasi-cine CBCT reconstruction technique that uses extremely-small angle (∼3°) projections to generate real-time high-quality lung CBCT images. Method: 4D-CBCT is obtained at the beginning and used as prior images. This study uses extremely-small angle (∼3°) on-board projections acquired at a single respiratory phase to reconstruct the CBCT image at this phase. An adaptive constrained free-form deformation (ACFD) method is developed to deform the prior 4D-CBCT volume at the same phase to reconstruct the new CBCT. Quasi-cine CBCT images are obtained by continuously reconstructing CBCT images at subsequent phases every 3° angle (∼0.5s). Note that the prior 4D-CBCT images are dynamically updated using the latest CBCT images. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of ACFD. A lung patient was simulated with a tumor baseline shift of 2mm along the superior-inferior (SI) direction after every respiratory cycle for 5 cycles. Limited-angle projections were simulated for each cycle. The 4D-CBCT reconstructed by these projections was compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and the ground-truth tumors to evaluate their geometric differences. The ACFD was also compared to a principal-component-analysis based motion-modeling (MM) method. Results: Using orthogonal-view 3° projections, the VPD/COMS values for tumor baseline shifts of 2mm, 4mm, 6mm, 8mm and 10mm were 11.0%/0.3mm, 25.3%/2.7mm, 22.4%/2.9mm, 49.5%/5.4mm, 77.2%/8.1mm for the MM method, and 2.9%/0.7mm, 3.9%/0.8mm, 6.2%/1mm, 7.9%/1.2mm, 10.1%/1.1mm for the ACFD method. Using orthogonal-view 0° projections (1 projection only), the ACFD method yielded VPD/COMS results of 5.0%/0.9mm, 10.5%/1.2mm, 15.1%/1.4mm, 20.9%/1.6mm and 24.8%/1.6mm. Using single-view instead of orthogonal-view projections yielded less accurate results for ACFD.
Conclusion: The ACFD method accurately reconstructs snapshot CBCT images using orthogonal-view 3° projections. It has great potential to provide real-time quasi-cine CBCT images for verification in lung radiation therapy. The research is supported by a grant from Varian Medical Systems.
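The VPD and COMS metrics used above can be sketched on binary tumor masks. The exact definitions below (symmetric-difference volume normalized by the truth volume, and the Euclidean center-of-mass distance) are assumptions, since the abstract does not spell them out:

```python
import numpy as np

def tumor_metrics(recon_mask, truth_mask, voxel_size_mm):
    """Geometric agreement between a reconstructed and a ground-truth tumor,
    both given as boolean 3-D masks on the same voxel grid. Assumed definitions:
      VPD  = (union volume - intersection volume) / truth volume * 100%
      COMS = Euclidean distance between the two centers of mass, in mm.
    """
    inter = np.logical_and(recon_mask, truth_mask).sum()
    union = np.logical_or(recon_mask, truth_mask).sum()
    vpd = (union - inter) / truth_mask.sum() * 100.0

    # Centers of mass in voxel coordinates, then scaled to millimeters.
    com_r = np.argwhere(recon_mask).mean(axis=0)
    com_t = np.argwhere(truth_mask).mean(axis=0)
    coms = np.linalg.norm((com_r - com_t) * np.asarray(voxel_size_mm))
    return vpd, coms
```

A perfect reconstruction gives VPD = 0% and COMS = 0 mm; both grow as the reconstructed tumor drifts from the ground truth.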
Ren, Huazhong; Yan, Guangjian; Liu, Rongyuan; Li, Zhao-Liang; Qin, Qiming; Nerry, Françoise; Liu, Qiang
2015-03-27
Multi-angular observation of land surface thermal radiation is considered to be a promising method of performing the angular normalization of land surface temperature (LST) retrieved from remote sensing data. This paper focuses on an investigation of the minimum requirements of viewing angles to perform such normalizations on LST. The normally kernel-driven bi-directional reflectance distribution function (BRDF) is first extended to the thermal infrared (TIR) domain as TIR-BRDF model, and its uncertainty is shown to be less than 0.3 K when used to fit the hemispheric directional thermal radiation. A local optimum three-angle combination is found and verified using the TIR-BRDF model based on two patterns: the single-point pattern and the linear-array pattern. The TIR-BRDF is applied to an airborne multi-angular dataset to retrieve LST at nadir (Te-nadir) from different viewing directions, and the results show that this model can obtain reliable Te-nadir from 3 to 4 directional observations with large angle intervals, thus corresponding to large temperature angular variations. The Te-nadir is generally larger than temperature of the slant direction, with a difference of approximately 0.5~2.0 K for vegetated pixels and up to several Kelvins for non-vegetated pixels. The findings of this paper will facilitate the future development of multi-angular thermal infrared sensors.
Head posture measurements among work vehicle drivers and implications for work and workplace design.
Eklund, J; Odenrick, P; Zettergren, S; Johansson, H
1994-04-01
An increased risk of musculoskeletal disorders, e.g. from the neck region, has been found among professional drivers of work vehicles. The purpose of this study was to identify causes of postural load and implications for vehicle design and work tasks. A second purpose was to develop the methods for measurement and analysis of head postures. Field measurements of head postures for drivers of fork lift trucks, forestry machines, and cranes were carried out. The equipment used was an electric goniometer measurement system, containing a mechanical transmission between the head and the upper trunk. Methods for data presentation and quantification were developed. The results showed that rotatable and movable driver cabins improved head postures and viewing angles substantially. Narrow window frame structures and large, optimally-placed windows were also advantageous. The steering wheel, controls, and a high backrest restricted shoulder rotation, which increased head rotation in unfavourable viewing angles. Improved workspace layouts and work organization factors such as job enlargement decreased the influence of strenuous postures. The results also showed that head postures should be analysed in two or three dimensions simultaneously, otherwise the postures taken will be underestimated in relation to the maximal voluntary movement.
2016-09-12
Saturn's shadow stretched beyond the edge of its rings for many years after Cassini first arrived at Saturn, casting an ever-lengthening shadow that reached its maximum extent at the planet's 2009 equinox. This image captured the moment in 2015 when the shrinking shadow just barely reached across the entire main ring system. The shadow will continue to shrink until the planet's northern summer solstice, at which point it will once again start lengthening across the rings, reaching across them in 2019. Like Earth, Saturn is tilted on its axis. And, just as on Earth, as the sun climbs higher in the sky, shadows get shorter. The projection of the planet's shadow onto the rings shrinks and grows over the course of its 29-year-long orbit, as the angle of the sun changes with respect to Saturn's equator. This view looks toward the sunlit side of the rings from about 11 degrees above the ring plane. The image was taken in visible light with the Cassini spacecraft wide-angle camera on Jan. 16, 2015. The view was obtained at a distance of approximately 1.6 million miles (2.5 million kilometers) from Saturn. Image scale is about 90 miles (150 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20498
Volga Delta and the Caspian Sea
NASA Technical Reports Server (NTRS)
2002-01-01
Russia's Volga River is the largest river system in Europe, draining over 1.3 million square kilometers of catchment area into the Caspian Sea. The brackish Caspian is Earth's largest landlocked water body, and its isolation from the world's oceans has enabled the preservation of several unique animal and plant species. The Volga provides most of the Caspian's fresh water and nutrients, and also discharges large amounts of sediment and industrial waste into the relatively shallow northern part of the sea. These images of the region were captured by the Multi-angle Imaging SpectroRadiometer on October 5, 2001, during Terra orbit 9567. Each image represents an area of approximately 275 kilometers x 376 kilometers. The left-hand image is from MISR's nadir (vertical-viewing) camera, and shows how light is reflected at red, green, and blue wavelengths. The right-hand image is a false color composite of red-band imagery from MISR's 60-degree backward, nadir, and 60-degree forward-viewing cameras, displayed as red, green, and blue, respectively. Here, color variations indicate how light is reflected at different angles of view. Water appears blue in the right-hand image, for example, because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. The rougher-textured vegetated wetlands near the coast exhibit preferential backscattering, and consequently appear reddish. A small cloud near the center of the delta separates into red, green, and blue components due to geometric parallax associated with its elevation above the surface. Other notable features within the images include several linear features located near the Volga Delta shoreline. These long, thin lines are artificially maintained shipping channels, dredged to depths of at least 2 meters.
The crescent-shaped Kulaly Island, also known as Seal Island, is visible near the right-hand edge of the images. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
Jung, Kyunghwa; Choi, Hyunseok; Hong, Hanpyo; Adikrishna, Arnold; Jeon, In-Ho; Hong, Jaesung
2017-02-01
A hands-free region-of-interest (ROI) selection interface is proposed for solo surgery using a wide-angle endoscope. A wide-angle endoscope provides images with a larger field of view than a conventional endoscope. With an appropriate selection interface for a ROI, surgeons can also obtain a detailed local view as if they moved a conventional endoscope to a specific position and direction. To manipulate the endoscope without releasing the surgical instrument in hand, a mini-camera is attached to the instrument, and the images taken by the attached camera are analyzed. When a surgeon moves the instrument, the instrument orientation is calculated by image processing. Surgeons can select the ROI with this instrument movement after switching from 'task mode' to 'selection mode.' The accelerated KAZE algorithm is used to track the features of the camera images once the instrument is moved. Both the wide-angle and detailed local views are displayed simultaneously, and a surgeon can move the local view area by moving the mini-camera attached to the surgical instrument. Local view selection for a solo surgery was performed without releasing the instrument. The accuracy of camera pose estimation was not significantly different between camera resolutions, but it was significantly different between background camera images with different numbers of features (P < 0.01). The success rate of ROI selection diminished as the number of separated regions increased. However, separated regions up to 12 with a region size of 160 × 160 pixels were selected with no failure. Surgical tasks on a phantom model and a cadaver were attempted to verify the feasibility in a clinical environment. Hands-free endoscope manipulation without releasing the instruments in hand was achieved. The proposed method requires only a small, low-cost camera and image processing. The technique enables surgeons to perform solo surgeries without a camera assistant.
Optical Polarization of Light from a Sorghum Canopy Measured Under Both a Clear and an Overcast Sky
NASA Technical Reports Server (NTRS)
Vanderbilt, Vern; Daughtry, Craig; Biehl, Larry; Dahlgren, Robert
2014-01-01
Introduction: We tested the hypothesis that the optical polarization of the light reflected by a sorghum canopy is due to a Fresnel-type redirection, by sorghum leaf surfaces, of light from an unpolarized light source, the sun or overcast sky, toward the measuring sensor. If it can be shown that the source of the polarization of the light scattered by the sorghum canopy is a first surface, Fresnel-type reflection, then removing this surface reflected light from measurements of canopy reflectance presumably would allow better insight into the biochemical processes such as photosynthesis and metabolism that occur in the interiors of sorghum canopy leaves. Methods: We constructed a tower 5.9m tall in the center of a homogenous sorghum field. We equipped two Barnes MMR radiometers with polarization analyzers on the number 1, 3 and 7 Landsat TM wavelength bands. Positioning the radiometers atop the tower, we collected radiance data in 44 view directions on two days, one day with an overcast sky and the other, clear and sunlit. From the radiance data we calculated the linear polarization of the reflected light for each radiometer wavelength channel and view direction. Results and Discussion: Our experimental results support our hypothesis, showing that the amplitude of the linearly polarized portion of the light reflected by the sorghum canopy varied dramatically with view azimuth direction under a point source, the sun, but the amplitude varied little with view azimuth direction under the hemispherical source, the overcast sky. Under the clear sky, the angle of polarization depended upon the angle of incidence of the sunlight on the leaf, while under the overcast sky the angle of polarization depended upon the zenith view angle. These results support a polarized radiation transport model of the canopy that is based upon a first surface, Fresnel reflection from leaves in the sorghum canopy.
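Recovering the degree and angle of linear polarization from analyzer measurements, as done in the sorghum study above, can be sketched with the standard three-angle scheme. The 0°/60°/120° analyzer orientations are an assumption for illustration; the abstract does not state which orientations the polarization analyzers used:

```python
import numpy as np

def stokes_from_three_angles(i0, i60, i120):
    """Recover intensity I, Stokes Q and U, the degree of linear
    polarization (DoLP), and the angle of polarization (AoP, degrees)
    from radiances measured through a linear polarization analyzer at
    0, 60 and 120 degrees. Assumes the ideal Malus relation
    i(theta) = 0.5 * (I + Q*cos(2*theta) + U*sin(2*theta)).
    """
    I = 2.0 * (i0 + i60 + i120) / 3.0
    Q = 2.0 * (2.0 * i0 - i60 - i120) / 3.0
    U = 2.0 * (i60 - i120) / np.sqrt(3.0)
    dolp = np.hypot(Q, U) / I
    aop = 0.5 * np.degrees(np.arctan2(U, Q))
    return I, Q, U, dolp, aop
```

Any three distinct analyzer orientations determine I, Q and U; equally spaced angles just make the algebra symmetric.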
126. AERIAL FORWARD VIEW OF ENCLOSED HURRICANE BOW WITH FLIGHT ...
126. AERIAL FORWARD VIEW OF ENCLOSED HURRICANE BOW WITH FLIGHT DECK GUN MOUNTS REMOVED AND ANGLED FLIGHT DECK. 1 OCTOBER 1956. (NATIONAL ARCHIVES NO. 80-G-1001445) - U.S.S. HORNET, Puget Sound Naval Shipyard, Sinclair Inlet, Bremerton, Kitsap County, WA
10. View northwest Typical panel detail (south chord) of variable ...
10. View northwest Typical panel detail (south chord) of variable section girder showing riveted connections, angle stiffeners for girder web, and nuts securing wind bracing rods. - Walpole-Westminster Bridge, Spanning Connecticut River between Walpole, NH & Westminster, VT, Walpole, Cheshire County, NH
Choi, Seung Yong; Lee, Youlim; Kim, Mirinae; Park, Young Hoon
2018-04-01
To investigate the outcomes of scleral buckling surgery performed under a slit-lamp illumination system (Visulux) with a contact wide-angle viewing lens (Mini Quad) in patients with rhegmatogenous retinal detachment (RRD) and to compare these outcomes with those of surgery performed under an indirect ophthalmoscope. By retrospective review of electronic medical records, patients with RRD who had undergone scleral buckling surgery were identified. Scleral buckling surgeries were performed with two illumination instruments, a slit-lamp (SL group) and an indirect ophthalmoscope (IO group). Subretinal fluid drainage, cryopexy, and intravitreal gas injection were performed optionally. At 6 months after surgery, anatomical and functional outcomes were evaluated and compared between the two groups. Operation time was also compared between the two groups. Of the 45 total patients (45 eyes), 28 were included in the SL group, and 17 were included in the IO group. In the SL and IO groups, the primary anatomical success rate was 89.3% and 88.2%, respectively (p = 0.92). The logarithm of the minimal angle of resolution change, which reflects improvement in best-corrected visual acuity after surgery, was -0.19 ± 0.38 in the SL group and -0.21 ± 0.63 in the IO group; this difference was not statistically significant (p = 0.91). The mean operation time was significantly shorter in the SL group (78.9 ± 11.8 minutes) than in the IO group (100.0 ± 13.9 minutes, p < 0.001), especially for patients who underwent additional procedures such as subretinal fluid drainage and cryopexy (81.4 ± 12.9 and 103.5 ± 12.3 minutes, respectively, p < 0.001). Scleral buckling surgery performed under a slit-lamp illumination system yielded a similar anatomical success rate and similar functional improvement in RRD compared with surgery performed under an indirect ophthalmoscope. The slit-lamp system could save time, especially in bullous RRD, which requires additional subretinal fluid drainage. 
© 2018 The Korean Ophthalmological Society.
NASA Technical Reports Server (NTRS)
Graff, Paige Valderrama; Baker, Marshalyn (Editor); Graff, Trevor (Editor); Lindgren, Charlie (Editor); Mailhot, Michele (Editor); McCollum, Tim (Editor); Runco, Susan (Editor); Stefanov, William (Editor); Willis, Kim (Editor)
2010-01-01
Scientists from the Image Science and Analysis Laboratory (ISAL) at NASA's Johnson Space Center (JSC) work with astronauts onboard the International Space Station (ISS) who take images of Earth. Astronaut photographs, sometimes referred to as Crew Earth Observations, are taken using hand-held digital cameras onboard the ISS. These digital images allow scientists to study our Earth from the unique perspective of space. Astronauts have taken images of Earth since the 1960s. There is a database of over 900,000 astronaut photographs available at http://eol.jsc.nasa.gov . Images are requested by ISAL scientists at JSC and astronauts in space personally frame and acquire them from the Destiny Laboratory or other windows in the ISS. By having astronauts take images, they can specifically frame them according to a given request and need. For example, they can choose to use different lenses to vary the amount of area (field of view) an image will cover. Images can be taken at different times of the day which allows different lighting conditions to bring out or highlight certain features. The viewing angle at which an image is acquired can also be varied to show the same area from different perspectives. Pointing the camera straight down gives you a nadir shot. Pointing the camera at an angle to get a view across an area would be considered an oblique shot. Being able to change these variables makes astronaut photographs a unique and useful data set. Astronaut photographs are taken from the ISS from altitudes of 300 - 400 km (185 to 250 miles). One of the current cameras being used, the Nikon D3X digital camera, can take images using a 50, 100, 250, 400 or 800mm lens. These different lenses allow for a wider or narrower field of view. The higher the focal length (800mm for example) the narrower the field of view (less area will be covered). Higher focal lengths also show greater detail of the area on the surface being imaged. 
There are four major systems or spheres of Earth. They are: Atmosphere, Biosphere, Hydrosphere, and Litho/Geosphere.
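The trade-off between focal length and field of view described in the Crew Earth Observations abstract follows from the pinhole-camera model. A minimal sketch, assuming a 36 mm full-frame sensor width for illustration (the abstract does not give the sensor dimensions):

```python
def ground_footprint_km(altitude_km, focal_length_mm, sensor_width_mm=36.0):
    """Approximate nadir ground coverage (width of the frame on the
    surface) for a camera photographing straight down, using the
    pinhole relation: coverage = altitude * sensor_width / focal_length.
    The 36 mm sensor width is an assumed value for illustration.
    """
    return altitude_km * sensor_width_mm / focal_length_mm

# Longer lenses give a narrower field of view, hence less ground coverage:
wide = ground_footprint_km(400, 50)     # 50 mm lens from 400 km: 288 km across
narrow = ground_footprint_km(400, 800)  # 800 mm lens from 400 km: 18 km across
```

This is why an 800 mm lens shows far greater surface detail than a 50 mm lens from the same ISS altitude: the same sensor pixels cover a much smaller ground area.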
SAR (Synthetic Aperture Radar). Earth observing system. Volume 2F: Instrument panel report
NASA Technical Reports Server (NTRS)
1987-01-01
The scientific and engineering requirements for the Earth Observing System (EOS) imaging radar are provided. The radar is based on Shuttle Imaging Radar-C (SIR-C), and would include three frequencies: 1.25 GHz, 5.3 GHz, and 9.6 GHz; selectable polarizations for both transmit and receive channels; and selectable incidence angles from 15 to 55 deg. There would be three main viewing modes: a local high-resolution mode with typically 25 m resolution and 50 km swath width; a regional mapping mode with 100 m resolution and up to 200 km swath width; and a global mapping mode with typically 500 m resolution and up to 700 km swath width. The last mode allows global coverage in three days. The EOS SAR will be the first orbital imaging radar to provide multifrequency, multipolarization, multiple incidence angle observations of the entire Earth. Combined with Canadian and Japanese satellites, continuous radar observation capability will be possible. Major applications in the areas of glaciology, hydrology, vegetation science, oceanography, geology, and data and information systems are described.
CCD Camera Lens Interface for Real-Time Theodolite Alignment
NASA Technical Reports Server (NTRS)
Wake, Shane; Scott, V. Stanley, III
2012-01-01
Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.
Experimental teaching and training system based on volume holographic storage
NASA Astrophysics Data System (ADS)
Jiang, Zhuqing; Wang, Zhe; Sun, Chan; Cui, Yutong; Wan, Yuhong; Zou, Rufei
2017-08-01
An experiment on volume holographic storage for teaching and for training the practical skills of senior students in Applied Physics is introduced. Through this experiment, students learn to use advanced optoelectronic devices and automatic control methods, and deepen their theoretical understanding of the optical information processing and photonics topics studied in their courses. In the experiment, multiplexed holographic recording and readout is based on the Bragg selectivity of a volume holographic grating, in which the Bragg diffraction angle depends on the grating-recording angle. By using different interference angles between the reference and object beams, holograms can be recorded into a photorefractive crystal, and the object images can then be read out from these holograms via angular addressing with the original reference beam. In this system, experimental data acquisition and control of the optoelectronic devices, such as shutter on-off switching, loading images into the SLM, and image acquisition by a CCD sensor, are automated with LabVIEW programming.
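The angular addressing above follows from the Bragg condition 2Λ sin θ = λ: the period Λ of the grating written by two interfering beams fixes the readout angle. A minimal sketch, where the wavelength and half-angle are assumed illustrative values (not from the experiment) and refraction at the crystal surface is ignored:

```python
import math

def grating_period(wavelength_nm, half_angle_deg):
    # Period of a grating written by two plane waves intersecting at a
    # full angle of 2*theta: Lambda = lambda / (2*sin(theta)).
    return wavelength_nm / (2.0 * math.sin(math.radians(half_angle_deg)))

def bragg_angle(wavelength_nm, grating_period_nm):
    # Readout angle satisfying the Bragg condition 2*Lambda*sin(theta) = lambda.
    return math.degrees(math.asin(wavelength_nm / (2.0 * grating_period_nm)))

lam = 532.0                                # nm, assumed recording wavelength
period = grating_period(lam, 15.0)         # beams at +/-15 deg about the normal
print(round(period, 1))                    # grating period in nm
print(round(bragg_angle(lam, period), 1))  # readout recovers the 15-deg angle
```

Changing the interference angle changes Λ, so each hologram gets its own Bragg angle, which is what makes angular multiplexing possible.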
Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; ...
2016-11-28
Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.
2016-11-21
Surface features are visible on Saturn's moon Prometheus in this view from NASA's Cassini spacecraft. Most of Cassini's images of Prometheus are too distant to resolve individual craters, making views like this a rare treat. Saturn's narrow F ring, which makes a diagonal line beginning at top center, appears bright and bold in some Cassini views, but not here. Since the sun is nearly behind Cassini in this image, most of the light hitting the F ring is being scattered away from the camera, making it appear dim. Light-scattering behavior like this is typical of rings comprised of small particles, such as the F ring. This view looks toward the unilluminated side of the rings from about 14 degrees below the ring plane. The image was taken in visible light with the Cassini spacecraft narrow-angle camera on Sept. 24, 2016. The view was acquired at a distance of approximately 226,000 miles (364,000 kilometers) from Prometheus and at a sun-Prometheus-spacecraft, or phase, angle of 51 degrees. Image scale is 1.2 miles (2 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20508
NASA Astrophysics Data System (ADS)
Markiet, Vincent; Perheentupa, Viljami; Mõttus, Matti; Hernández-Clemente, Rocío
2016-04-01
Imaging spectroscopy is a remote sensing technology which records continuous spectral data at a very high (better than 10 nm) resolution. Such spectral images can be used to monitor, for example, the photosynthetic activity of vegetation. Photosynthetic activity depends on varying light conditions and varies within the canopy. To measure this variation we need very high spatial resolution data, with resolution better than the dominating canopy element size (e.g., tree crown in a forest canopy). This is useful, e.g., for detecting photosynthetic downregulation and thus plant stress. Canopy illumination conditions are often quantified using the shadow fraction: the fraction of visible foliage which is not sunlit. Shadow fraction is known to depend on view angle (e.g., hot spot images have a very low shadow fraction). Hence, multiple observation angles can potentially increase the range of shadow fraction present in high spatial resolution imaging spectroscopy data. To investigate the potential of multi-angle imaging spectroscopy for studying canopy processes which vary with shadow fraction, we obtained a unique multiangular airborne imaging spectroscopy dataset for the Hyytiälä forest research station located in Finland (61° 50'N, 24° 17'E) in July 2015. The main tree species are Norway spruce (Picea abies L. Karst.), Scots pine (Pinus sylvestris L.) and birch (Betula pubescens Ehrh., Betula pendula Roth). We used an airborne hyperspectral sensor, AISA Eagle II (Specim, Spectral Imaging Ltd., Finland), mounted on a tilting platform. The tilting platform allowed us to measure at nadir and approximately 35 degrees off-nadir. The hyperspectral sensor has a 37.5-degree field of view (FOV), 0.6 m pixel size and 128 spectral bands with an average spectral bandwidth of 4.6 nm, and is sensitive in the 400-1000 nm spectral region.
The airborne data were radiometrically, atmospherically and geometrically processed using the Parge and Atcor software (ReSe Applications Schläpfer, Switzerland). However, even after meticulous geolocation, the canopy elements (needles) seen from the three view angles were different: at each overpass, different parts of the same crowns were observed. To overcome this, we used a 200 m x 200 m test site covered with pure pine stands. We assumed that the sunlit-canopy, shaded-canopy and understory spectral signatures are independent of viewing direction to the accuracy of a constant BRDF factor. Thus, we compared the spectral signatures for sunlit and shaded canopy and understory obtained for each view direction. We visually selected six hundred of the brightest and darkest canopy pixels. Next, we performed a minimum noise fraction (MNF) transformation, created a pixel purity index (PPI) and used Envi's n-D scatterplot to determine pure spectral signatures for the two classes. The pure endmembers for different view angles were compared to determine the BRDF factor and to analyze its spectral invariance. We demonstrate the compatibility of multi-angle data with high spatial resolution data. In principle, both carry similar information on structured (non-flat) targets such as a vegetation canopy. Nevertheless, multiple view angles helped us to extend the range of shadow fraction in the images. Also, correct separation of shaded crown and shaded understory pixels remains a challenge.
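Under the constant-BRDF-factor assumption described above, comparing endmembers across view angles reduces to a per-band ratio, and the flatness of that ratio across wavelength is the "spectral invariance" being tested. A toy numpy sketch with invented reflectance values (not the campaign data):

```python
import numpy as np

# Invented sunlit-canopy endmember spectra (one reflectance per band) for
# the nadir and 35-deg off-nadir views; real spectra come from the PPI step.
nadir     = np.array([0.040, 0.060, 0.030, 0.300, 0.320])
off_nadir = np.array([0.050, 0.075, 0.037, 0.372, 0.400])

brdf_factor = off_nadir / nadir                    # per-band BRDF factor
spread = brdf_factor.std() / brdf_factor.mean()    # relative spread across bands
print(brdf_factor.round(3))   # roughly flat across bands -> near-invariant
print(round(float(spread), 4))
```

A small relative spread supports treating the BRDF factor as a spectrally invariant constant; a large spread would contradict the assumption.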
NASA Astrophysics Data System (ADS)
Matar, C.; Auriol, F.; Nicolas, J. M.; Parol, F.; Riedi, J.; Djellali, M. S.; Cornet, C.; Waquet, F.; Catalfamo, M.; Delegove, C.; Loisil, R.
2017-12-01
The OSIRIS instrument largely inherits from the POLDER concept developed and operated between 1991 (first airborne prototype) and 2013 (end of the POLDER-3/PARASOL space-borne mission). It consists of two optical systems, one covering the visible to near-infrared range (440, 490, 670, 763, 765, 870, 910 and 940 nm) and a second for the shortwave infrared (940, 1020, 1240, 1360, 1620 and 2200 nm). Each optical system is composed of wide field-of-view optics (114° and 105°, respectively) associated with two rotating wheels carrying interferential (spectral) filters and analyzer (polarization) filters, respectively, and a 2D array of detectors. For each channel, radiance is measured once without an analyzer, followed by sequential measurements with the three analyzers shifted by an angle of 60° to reconstruct the total and polarized radiances. The complete acquisition sequence for all spectral channels lasts a couple of seconds, according to the chosen measurement protocol. Thanks to the large field of view of the optics, any target is seen under several viewing angles during the aircraft motion. In a first step we will present the new ground characterization of the instrument based on laboratory measurements (linearity, flat-field, absolute calibration, induced polarization, polarizer efficiency and position), the radiometric model and the Radiometric Inverted Model (RIM) used to develop the Level 1 processing chain that produces level 1 products (normalized radiances, polarized or not, with viewing geometries) from the instrument-generated level 0 files (digital counts) and attitude information from the inertial system. The stray light issues will be specifically discussed.
In a second step we will present in-flight radiometric and geometric methods applied to OSIRIS data in order to control and validate ground-based calibrated products: molecular scattering method and sun-glint cross-band method for radiometric calibration, glories, rainbows and sun-glint targets for geometric calibration control. Results from the CharMEX (June-July 2013) and Caliosiris (October 2014) OSIRIS campaigns will be presented. Finally, we will present the available products developed and produced by LOA/University of Lille/CNRS, as compared to the scheduled level 1B and 1C 3MI products.
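Three analyzer orientations shifted by 60° are exactly enough to recover the total radiance and the linear Stokes parameters: with an ideal analyzer at angle α, I_meas(α) = (I + Q cos 2α + U sin 2α)/2, and the 0°/60°/120° measurements invert in closed form. A sketch under that idealized-analyzer assumption (the real Level 1 chain uses the calibrated radiometric model, not these formulas):

```python
import math

def stokes_from_three_analyzers(i0, i60, i120):
    # Closed-form inversion of I_meas(a) = (I + Q*cos 2a + U*sin 2a) / 2
    # for analyzers at 0, 60 and 120 degrees (ideal polarizers assumed).
    I = (2.0 / 3.0) * (i0 + i60 + i120)
    Q = (2.0 / 3.0) * (2.0 * i0 - i60 - i120)
    U = 2.0 * (i60 - i120) / math.sqrt(3.0)
    return I, Q, U

def measured(I, Q, U, a_deg):
    # Forward model: intensity behind an ideal analyzer at angle a.
    a = math.radians(a_deg)
    return 0.5 * (I + Q * math.cos(2 * a) + U * math.sin(2 * a))

# Round-trip check with a known state: I=1.0, Q=0.3, U=-0.1
I, Q, U = stokes_from_three_analyzers(
    measured(1.0, 0.3, -0.1, 0),
    measured(1.0, 0.3, -0.1, 60),
    measured(1.0, 0.3, -0.1, 120))
print(round(I, 6), round(Q, 6), round(U, 6))  # → 1.0 0.3 -0.1
```

The extra analyzer-free measurement mentioned in the abstract provides redundancy for the total radiance rather than being required by the inversion.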
NASA Astrophysics Data System (ADS)
Sybilski, P.; Pawłaszek, R. K.; Sybilska, A.; Konacki, M.; Hełminiak, K. G.; Kozłowski, S. K.; Ratajczak, M.
2018-07-01
We have obtained high-resolution spectra of four eclipsing binary systems (FM Leo, NN Del, V963 Cen and AI Phe) with the view to gaining an insight into the relative orientations of their stellar spin axes and orbital axes. The so-called Rossiter-McLaughlin (RM) effect, i.e. the fact that the broadening and the amount of blue or redshift in the spectra during an eclipse depends on the tilt of the spin axis of the background star, has the potential of reconciling observations and theoretical models if such a tilt is found. We analyse the RM effect by disentangling the spectra, removing the front component and measuring the remaining, distorted lines with a broadening function (BF) obtained from single-value decomposition (SVD), weighting by the intensity centre of the BF in the eclipse. All but one of our objects show no significant misalignment, suggesting that aligned systems are dominant. We provide stellar as well as orbital parameters for our systems. With five measured spin-orbit angles, we increase significantly (from 9 to 14) the number of stars for which it has been measured. The spin-orbit angle β calculated for AI Phe's secondary component shows a misalignment of 87±17°. NN Del, with a large separation of components and a long dynamical time-scale for circularization and synchronization, is an example of a close to primordial spin-orbit angle measurement.
P6 Truss, port side of the Integrated Equipment Assembly (IEA)
2000-12-03
STS097-374-015 (5 December 2000) --- This high angle view shows astronaut Carlos I. Noriega, STS-97 mission specialist, traversing over Endeavour's cargo bay during the flight's first space walk on Dec. 5, 2000. Astronaut Joseph R. Tanner, mission specialist, was near the top of the P6 truss structure when he exposed the 35mm frame. The Canadian-built Remote Manipulator System (RMS) arm, instrumental in the current operations, can be seen at bottom right.
The Geolocation model for lunar-based Earth observation
NASA Astrophysics Data System (ADS)
Ding, Yixing; Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Lv, Mingyang
2016-07-01
In recent years, people have become increasingly aware that the Earth needs to be treated as an entirety, and consequently to be observed in a holistic, systematic and multi-scale view. However, the interaction mechanism between the Earth's inner layers and outer layers is still unclear. Therefore, we propose to observe the Earth's inner layers and outer layers simultaneously from the Moon, which may be helpful to studies in climatology, meteorology, seismology, etc. At present, the Moon has proved to be an irreplaceable platform for observing the Earth's outer layers. Meanwhile, some discussions have been made on lunar-based observation of the Earth's inner layers, but the geolocation model of lunar-based observation has not been specified yet. In this paper, we present a geolocation model based on transformation matrices. The model includes six coordinate systems: the telescope coordinate system, the lunar local coordinate system, the lunar-reference coordinate system, the selenocentric inertial coordinate system, the geocentric inertial coordinate system and the geo-reference coordinate system. The parameters, including the positions of the Sun, the Earth and the Moon, the libration and the attitude of the Earth, can be acquired from the ephemeris. Given an elevation angle and an azimuth angle of the lunar-based telescope, this model links each image pixel uniquely to a ground point.
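A transformation-matrix chain of this kind can be sketched as a product of rotation matrices carrying the telescope line-of-sight vector through successive frames. The angles below are placeholders (the real matrices come from the ephemeris, libration and Earth-attitude data), and the azimuth/elevation convention is an assumption:

```python
import numpy as np

def rot_z(a):
    # Elementary rotation about the z axis (angle in radians)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def pointing_vector(azimuth, elevation):
    # Unit line-of-sight in the telescope frame (assumed az-from-x,
    # elevation-from-horizontal convention).
    return np.array([np.cos(elevation) * np.cos(azimuth),
                     np.cos(elevation) * np.sin(azimuth),
                     np.sin(elevation)])

# Placeholder stand-ins for the six-frame chain; each matrix would be
# built from ephemeris parameters in the real model.
T_tel_to_lunar_local = rot_z(np.radians(10.0))
T_local_to_lunar_ref = rot_y(np.radians(5.0))
T_lunar_to_inertial  = rot_z(np.radians(30.0))
T_inertial_to_geo    = rot_y(np.radians(-23.4))

chain = T_inertial_to_geo @ T_lunar_to_inertial @ T_local_to_lunar_ref @ T_tel_to_lunar_local
v_geo = chain @ pointing_vector(np.radians(40.0), np.radians(20.0))
print(np.round(v_geo, 4))
print(round(float(np.linalg.norm(v_geo)), 6))  # → 1.0, rotations preserve length
```

Intersecting `v_geo` with the Earth ellipsoid would then give the ground point for that pixel, which is the step the geolocation model formalizes.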
Ash from Kilauea Eruption Viewed by NASA's MISR
2018-05-09
On May 3, 2018, a new eruption began at a fissure of the Kilauea volcano on the Island of Hawaii. Kilauea is the most active volcano in the world, having erupted almost continuously since 1983. Advancing lava and dangerous sulfur dioxide gas have forced thousands of residents in the neighborhood of Leilani Estates to evacuate. A number of homes have been destroyed, and no one can say how soon the eruption will abate and evacuees can return home. On May 6, 2018, at approximately 11 a.m. local time, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite captured this view of the island as it passed overhead. Much of the island was shrouded by clouds, including the fissure on its eastern point. However, an eruption plume is visible streaming southwest over the ocean. The MISR instrument is unique in that it has nine cameras that view Earth at different angles: one pointing downward, four at various angles in the forward direction, and four in the backward direction. This image shows the view from one of MISR's forward-pointing cameras (60 degrees), which shows the plume more distinctly than the near-vertical views. The information from the images acquired at different view angles is used to calculate the height of the plume, results of which are superimposed on the right-hand image. The top of the plume near the fissure is at approximately 6,500 feet (2,000 meters) altitude, and the height of the plume decreases as it travels south and west. These relatively low altitudes mean that the ash and sulfur dioxide remained near the ground, which can cause health issues for people on the island downwind of the eruption. The "Ocean View" air quality monitor operated by the Clean Air Branch of the State of Hawaii Department of Health recorded a concentration of 18 μg/m3 of airborne particles less than 2.5 micrometers in diameter at 11 a.m. local time. 
This amount corresponds to an air quality rating of "moderate" and supports the MISR results indicating that ash was most likely present at ground level on this side of the island. These data were acquired during Terra orbit 97780. An annotated version is available at https://photojournal.jpl.nasa.gov/catalog/PIA22451
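The plume-height retrieval from multiple view angles rests on stereo parallax: a feature at height h is displaced on the ground by h·tan θ for a camera at view zenith angle θ, so the displacement between two cameras gives the height. A simplified, wind-free sketch; the displacement value is invented, and MISR's operational retrieval additionally solves for wind:

```python
import math

def height_from_parallax(parallax_m, view1_deg, view2_deg):
    # Apparent ground displacement of the same feature between two cameras:
    # parallax = h * (tan(v1) - tan(v2)), solved for the height h.
    return parallax_m / (math.tan(math.radians(view1_deg))
                         - math.tan(math.radians(view2_deg)))

# Hypothetical 3,460 m displacement between the 60-deg camera and nadir
h = height_from_parallax(3460.0, 60.0, 0.0)
print(round(h))  # ≈ 2000 m, the order of the reported plume-top altitude
```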
A Description of a Family of Heron Quadrilaterals
ERIC Educational Resources Information Center
Sastry, K. R. S.
2005-01-01
Mathematical historians place Heron in the first century. Right-angled triangles with integer sides and area had been determined before Heron, but he discovered such a "non" right-angled triangle, viz 13, 14, 15; 84. In view of this, triangles with integer sides and area are named "Heron triangles." The Indian mathematician Brahmagupta, born in…
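Heron's 13, 14, 15 triangle can be checked directly with his own area formula: the semiperimeter is s = 21, so the area is √(21·8·7·6) = √7056 = 84, an integer. A quick check:

```python
import math

def heron_area(a, b, c):
    # Heron's formula: area = sqrt(s*(s-a)*(s-b)*(s-c)), s the semiperimeter
    s = (a + b + c) / 2
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

print(heron_area(13, 14, 15))  # → 84.0: integer sides and integer area
```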
NASA Technical Reports Server (NTRS)
2003-01-01
Dark smoke from oil fires extends for about 60 kilometers south of Iraq's capital city of Baghdad in these images acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 2, 2003. The thick, almost black smoke is apparent near image center and contains chemical and particulate components hazardous to human health and the environment. The top panel is from MISR's vertical-viewing (nadir) camera. Vegetated areas appear red here because this display is constructed using near-infrared, red and blue band data, displayed as red, green and blue, respectively, to produce a false-color image. The bottom panel is a combination of two camera views of the same area and is a 3-D stereo anaglyph in which red band nadir camera data are displayed as red, and red band data from the 60-degree backward-viewing camera are displayed as green and blue. Both panels are oriented with north to the left in order to facilitate stereo viewing. Viewing the 3-D anaglyph with red/blue glasses (with the red filter placed over the left eye and the blue filter over the right) makes it possible to see the rising smoke against the surface terrain. This technique helps to distinguish features in the atmosphere from those on the surface. In addition to the smoke, several high, thin cirrus clouds (barely visible in the nadir view) are readily observed using the stereo image. The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17489. The panels cover an area of about 187 kilometers x 123 kilometers, and use data from blocks 63 to 65 within World Reference System-2 path 168. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC.
The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.
2017-08-11
NASA's Cassini spacecraft looks toward the night side of Saturn's moon Titan in a view that highlights the extended, hazy nature of the moon's atmosphere. During its long mission at Saturn, Cassini has frequently observed Titan at viewing angles like this, where the atmosphere is backlit by the Sun, in order to make visible the structure of the hazes. Titan's high-altitude haze layer appears blue here, whereas the main atmospheric haze is orange. The difference in color could be due to particle sizes in the haze. The blue haze likely consists of smaller particles than the orange haze. Images taken using red, green and blue spectral filters were combined to create this natural-color view. The image was taken with the Cassini spacecraft narrow-angle camera on May 29, 2017. The view was acquired at a distance of approximately 1.2 million miles (2 million kilometers) from Titan. Image scale is 5 miles (9 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21625
Wei, Yingying; An, Qinglong; Cai, Xiaojiang; Chen, Ming; Ming, Weiwei
2015-10-02
The purpose of this article is to investigate the influence of fiber orientation on the fracture mechanism of carbon fibers, in both macroscopic and microscopic views, by using a single-point flying cutting method. Cutting tools of three different materials were used in this research, namely a PCD (polycrystalline diamond) tool, a CVD (chemical vapor deposition) diamond thin-film coated carbide tool and an uncoated carbide tool. The influence of fiber orientation on the cutting force and fracture topography was analyzed, and the conclusion was drawn that cutting forces are not affected by cutting speed but are significantly influenced by fiber orientation. Cutting forces presented smaller values at fiber orientations of 0/180° and 15/165° but the highest at 30/150°. The fracture mechanism of carbon fibers was studied under different cutting conditions, such as a 0° orientation angle, a 90° orientation angle, orientation angles along the fiber direction, and orientation angles inverse to the fiber direction. In addition, a prediction model for the cutting defects of carbon fiber reinforced plastic was established based on acoustic emission (AE) signals.
Detection of pointing errors with CMOS-based camera in intersatellite optical communications
NASA Astrophysics Data System (ADS)
Yu, Si-yuan; Ma, Jing; Tan, Li-ying
2005-01-01
For very high data rates, intersatellite optical communications hold a potential performance edge over microwave communications. The acquisition and tracking problem is critical because of the narrow transmit beam. In some systems a single array detector performs both the spatial acquisition and tracking functions to detect pointing errors, so both a wide field of view and a high update rate are required. Past systems tended to employ CCD-based cameras with complex readout arrangements, but the additional complexity reduces the applicability of the array-based tracking concept. With the development of CMOS arrays, CMOS-based cameras can realize the single-array-detector concept. The area-of-interest feature of a CMOS-based camera allows a PAT system to read out only a portion of the array, and the maximum allowed frame rate increases as the size of the area of interest decreases, under certain conditions. A commercially available CMOS camera with 105 fps @ 640×480 is employed in our PAT simulation system, in which only part of the pixels are actually used. Beam angles varying within the field of view can be detected after passing through a Cassegrain telescope and an optical focusing system. Spot pixel values (8 bits per pixel) read out from the CMOS sensor are transmitted to a DSP subsystem via an IEEE 1394 bus, and pointing errors are computed with the centroid equation. Tests showed that: (1) 500 fps @ 100×100 is available in acquisition when the field of view is 1 mrad; (2) 3k fps @ 10×10 is available in tracking when the field of view is 0.1 mrad.
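The centroid equation referred to above is the intensity-weighted mean position of the spot pixels; the pointing error is the offset of that centroid from the boresight pixel. A minimal sketch with a synthetic spot (not the system's actual readout code):

```python
import numpy as np

def spot_centroid(roi):
    # Intensity-weighted centroid (x, y) of a spot image: the standard
    # sub-pixel estimator for pointing error on an array detector.
    roi = np.asarray(roi, dtype=float)
    ys, xs = np.indices(roi.shape)
    total = roi.sum()
    return (xs * roi).sum() / total, (ys * roi).sum() / total

# Synthetic 5x5 area-of-interest: a bright pixel at (2, 2) plus a dimmer
# neighbor at (3, 2) pulls the centroid off the grid center.
roi = np.zeros((5, 5))
roi[2, 2] = 100.0
roi[2, 3] = 50.0
x, y = spot_centroid(roi)
print(round(x, 3), round(y, 3))  # → 2.333 2.0
```

Shrinking the area of interest around the last centroid is what lets the frame rate climb from the acquisition figure to the tracking figure quoted in the abstract.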
Spinning angle optical calibration apparatus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beer, S.K.; Pratt, H.R.
1991-02-26
This patent describes an optical calibration apparatus for calibrating and reproducing spinning angles in cross-polarization nuclear magnetic resonance spectroscopy. An illuminated magnifying apparatus enables optical setting and accurate reproduction of spinning magic angles in cross-polarization NMR experiments. A reference mark scribed on an edge of a spinning-angle test sample holder is illuminated by a light source and viewed through a magnifying scope. When the magic angle of a sample material used as a standard is attained by varying the angular position of the sample holder, the coordinate position of the reference mark relative to a graduation or graduations on a reticle in the magnifying scope is noted.
A wide-angle camera module for disposable endoscopy
NASA Astrophysics Data System (ADS)
Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee
2016-08-01
A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and an LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype disposable endoscope is implemented to perform pre-clinical animal testing, in which the esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.
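For an ideal rectilinear lens, the diagonal FOV follows from the sensor diagonal d and focal length f as 2·atan(d/2f). The numbers below are hypothetical, and a 150° wide-angle endoscope lens deviates strongly from the rectilinear model, so this only illustrates the geometry, not the module in the paper:

```python
import math

def diagonal_fov_deg(sensor_diag_mm, focal_length_mm):
    # Ideal rectilinear-lens relation: FOV = 2 * atan(d / (2 f)).
    return math.degrees(2.0 * math.atan(sensor_diag_mm / (2.0 * focal_length_mm)))

# Hypothetical combination: ~3 mm sensor diagonal, ~1.05 mm focal length
print(round(diagonal_fov_deg(3.0, 1.05), 1))  # ≈ 110°
```

Real wide-angle designs use controlled barrel distortion (e.g. f·θ mapping), which is why a lens specified at 150° AOV can pair with a measured 110° diagonal FOV on a given sensor.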
Maneuver Algorithm for Bearings-Only Target Tracking with Acceleration and Field of View Constraints
NASA Astrophysics Data System (ADS)
Roh, Heekun; Shim, Sang-Wook; Tahk, Min-Jea
2018-05-01
This paper proposes a maneuver algorithm for an agent performing target tracking with bearing-angle information only. The goal of the agent is to estimate the target position and velocity based only on the bearing-angle data. The methods of bearings-only target state estimation are outlined, and the nature of the bearings-only target tracking problem is then addressed. Based on the insight from the above-mentioned properties, a maneuver algorithm for the agent is suggested. The proposed algorithm is composed of a nonlinear hysteresis guidance law and estimation-accuracy assessment criteria based on the theory of the Cramér-Rao bound. The proposed guidance law generates a lateral acceleration command based on the current field-of-view angle. The accuracy criteria supply the expected estimation variance, which acts as a terminal criterion for the proposed algorithm. The algorithm is verified with a two-dimensional simulation.
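A classic property behind the "nature of the bearings-only problem" is that a non-maneuvering observer cannot observe range: scaling the entire relative geometry leaves every bearing unchanged, which is why the agent must maneuver at all. A small demonstration with invented constant-velocity trajectories:

```python
import math

def bearings(obs_pos, obs_vel, tgt_pos, tgt_vel, steps, dt=1.0):
    # Bearing history (rad) between two constant-velocity trajectories.
    out = []
    for k in range(steps):
        dx = (tgt_pos[0] + tgt_vel[0] * k * dt) - (obs_pos[0] + obs_vel[0] * k * dt)
        dy = (tgt_pos[1] + tgt_vel[1] * k * dt) - (obs_pos[1] + obs_vel[1] * k * dt)
        out.append(math.atan2(dy, dx))
    return out

b1 = bearings((0.0, 0.0), (1.0, 0.0), (10.0, 5.0), (-0.5, 0.2), 10)
# Double the relative position AND relative velocity: the target is twice
# as far and twice as fast, yet every bearing is identical, so range is
# unobservable until the observer maneuvers.
b2 = bearings((0.0, 0.0), (1.0, 0.0), (20.0, 10.0), (-2.0, 0.4), 10)
print(all(abs(a - b) < 1e-12 for a, b in zip(b1, b2)))  # → True
```

The guidance law in the paper generates exactly the lateral accelerations needed to break this scale ambiguity while keeping the target inside the field of view.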
Apparatus and method for high dose rate brachytherapy radiation treatment
Macey, Daniel J.; Majewski, Stanislaw; Weisenberger, Andrew G.; Smith, Mark Frederick; Kross, Brian James
2005-01-25
A method and apparatus for the in vivo location and tracking of a radioactive seed source during and after brachytherapy treatment. The method comprises obtaining multiple views of the seed source in a living organism using: 1) a single PSPMT detector that is exposed through a multiplicity of pinholes thereby obtaining a plurality of images from a single angle; 2) a single PSPMT detector that may obtain an image through a single pinhole or a plurality of pinholes from a plurality of angles through movement of the detector; or 3) a plurality of PSPMT detectors that obtain a plurality of views from different angles simultaneously or virtually simultaneously. The plurality of images obtained from these various techniques, through angular displacement of the various acquired images, provide the information required to generate the three dimensional images needed to define the location of the radioactive seed source within the body of the living organism.
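Locating the seed from views at different angles is, at its core, triangulation: each view constrains the source to a ray through a pinhole, and the rays' intersection gives the position. A 2-D least-squares toy version (the geometry is invented, not the patent's reconstruction method):

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    # Least-squares intersection of two rays p_i + t_i * d_i:
    # solve t1*d1 - t2*d2 = p2 - p1 for the parameters t1, t2.
    A = np.column_stack([d1, -d2])
    t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    return p1 + t[0] * d1

seed = np.array([3.0, 4.0])                      # unknown source position
p1, p2 = np.array([0.0, 0.0]), np.array([10.0, 0.0])  # two pinhole positions
d1 = (seed - p1) / np.linalg.norm(seed - p1)     # ray directions each view
d2 = (seed - p2) / np.linalg.norm(seed - p2)     # measures via its image
print(np.round(triangulate(p1, d1, p2, d2), 6))  # recovers the seed at (3, 4)
```

With three or more views the same least-squares formulation extends to 3-D and averages out measurement noise, which is the benefit of the multi-detector configuration described in the patent.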
Pwyll Impact Crater: Perspective View of Topographic Model
NASA Technical Reports Server (NTRS)
1998-01-01
This computer-generated perspective view of the Pwyll impact crater on Jupiter's moon Europa was created using images taken by NASA's Galileo spacecraft camera when the spacecraft flew past that moon on Feb. 20 and Dec. 16, 1997 during its 6th and 12th orbits of Jupiter. Images of the crater taken from different angles on the different orbits have been combined to generate a model of the topography of Pwyll and its surroundings. This simulated view is from the southwest at a 45 degree angle, with the vertical exaggerated four times the natural size. The colors represent different elevation levels with blue being the lowest and red the highest. Pwyll, about 26 kilometers (16 miles) across, is unusual among craters in the solar system, because its floor is at about the same elevation as the surrounding terrain. Moreover, its central peak, standing approximately 600 meters (almost 2,000 feet) above the floor, is much higher than its rim. This may indicate that the crater was modified shortly after its formation by the flow of underlying warm ice.
The Jet Propulsion Laboratory, Pasadena, CA manages the Galileo mission for NASA's Office of Space Science, Washington, DC. JPL is an operating division of California Institute of Technology (Caltech). This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://www.jpl.nasa.gov/galileo.
Construction of a three-dimensional interactive model of the skull base and cranial nerves.
Kakizawa, Yukinari; Hongo, Kazuhiro; Rhoton, Albert L
2007-05-01
The goal was to develop an interactive three-dimensional (3-D) computerized anatomic model of the skull base for teaching microneurosurgical anatomy and for operative planning. The 3-D model was constructed using commercially available software (Maya 6.0 Unlimited; Alias Systems Corp., Delaware, MD), a personal computer, four cranial specimens, and six dry bones. Photographs from at least two angles of the superior and lateral views were imported to the 3-D software. Many photographs were needed to produce the model in anatomically complex areas. Careful dissection was needed to expose important structures in the two views. Landmarks, including foramen, bone, and dura mater, were used as reference points. The 3-D model of the skull base and related structures was constructed using more than 300,000 remodeled polygons. The model can be viewed from any angle. It can be rotated 360 degrees in any plane using any structure as the focal point of rotation. The model can be reduced or enlarged using the zoom function. Variable transparencies could be assigned to any structures so that the structures at any level can be seen. Anatomic labels can be attached to the structures in the 3-D model for educational purposes. This computer-generated 3-D model can be observed and studied repeatedly without the time limitations and stresses imposed by surgery. This model may offer the potential to create interactive surgical exercises useful in evaluating multiple surgical routes to specific target areas in the skull base.
Nuclear medicine imaging system
Bennett, Gerald W.; Brill, A. Bertrand; Bizais, Yves J.; Rowe, R. Wanda; Zubal, I. George
1986-01-07
A nuclear medicine imaging system having two large field of view scintillation cameras mounted on a rotatable gantry and being movable diametrically toward or away from each other is disclosed. In addition, each camera may be rotated about an axis perpendicular to the diameter of the gantry. The movement of the cameras allows the system to be used for a variety of studies, including positron annihilation, and conventional single photon emission, as well as static orthogonal dual multi-pinhole tomography. In orthogonal dual multi-pinhole tomography, each camera is fitted with a seven pinhole collimator to provide seven views from slightly different perspectives. By using two cameras at an angle to each other, improved sensitivity and depth resolution is achieved. The computer system and interface acquires and stores a broad range of information in list mode, including patient physiological data, energy data over the full range detected by the cameras, and the camera position. The list mode acquisition permits the study of attenuation as a result of Compton scatter, as well as studies involving the isolation and correlation of energy with a range of physiological conditions.