Sample records for nadir vertical-viewing camera

  1. Off-Nadir Hyperspectral Sensing for Estimation of Vertical Profile of Leaf Chlorophyll Content within Wheat Canopies

    PubMed Central

    Huang, Wenjiang; Zhou, Xianfeng; Ye, Huichun; Dong, Yingying

    2017-01-01

    Monitoring the vertical profile of leaf chlorophyll (Chl) content within winter wheat canopies is of significant importance for revealing the real nutritional status of the crop. Information on the vertical profile of Chl content is not accessible to nadir-viewing remote or proximal sensing. Off-nadir or multi-angle sensing would provide effective means to detect leaf Chl content in different vertical layers. However, adequate information on the selection of sensitive spectral bands and spectral index formulas for vertical leaf Chl content estimation is not yet available. In this study, all possible two-band and three-band combinations over spectral bands in normalized difference vegetation index (NDVI)-, simple ratio (SR)- and chlorophyll index (CI)-like types of indices at different viewing angles were calculated and assessed for their capability of estimating leaf Chl for three vertical layers of wheat canopies. The vertical profiles of Chl showed top-down declining trends and the patterns of band combinations sensitive to leaf Chl content varied among different vertical layers. Results indicated that the combinations of green band (520 nm) with NIR bands were efficient in estimating upper leaf Chl content, whereas the red edge (695 nm) paired with NIR bands were dominant in quantifying leaf Chl in the lower layers. Correlations between published spectral indices and all NDVI-, SR- and CI-like types of indices and vertical distribution of Chl content showed that reflectance measured from 50°, 30° and 20° backscattering viewing angles were the most promising to obtain information on leaf Chl in the upper-, middle-, and bottom-layer, respectively. Three types of optimized spectral indices improved the accuracy for vertical leaf Chl content estimation. The optimized three-band CI-like index performed the best in the estimation of vertical distribution of leaf Chl content, with R² of 0.84–0.69, and RMSE of 5.37–5.56 µg/cm² from the top to the bottom layers.

  2. Off-Nadir Hyperspectral Sensing for Estimation of Vertical Profile of Leaf Chlorophyll Content within Wheat Canopies.

    PubMed

    Kong, Weiping; Huang, Wenjiang; Casa, Raffaele; Zhou, Xianfeng; Ye, Huichun; Dong, Yingying

    2017-11-23

    Monitoring the vertical profile of leaf chlorophyll (Chl) content within winter wheat canopies is of significant importance for revealing the real nutritional status of the crop. Information on the vertical profile of Chl content is not accessible to nadir-viewing remote or proximal sensing. Off-nadir or multi-angle sensing would provide effective means to detect leaf Chl content in different vertical layers. However, adequate information on the selection of sensitive spectral bands and spectral index formulas for vertical leaf Chl content estimation is not yet available. In this study, all possible two-band and three-band combinations over spectral bands in normalized difference vegetation index (NDVI)-, simple ratio (SR)- and chlorophyll index (CI)-like types of indices at different viewing angles were calculated and assessed for their capability of estimating leaf Chl for three vertical layers of wheat canopies. The vertical profiles of Chl showed top-down declining trends and the patterns of band combinations sensitive to leaf Chl content varied among different vertical layers. Results indicated that the combinations of green band (520 nm) with NIR bands were efficient in estimating upper leaf Chl content, whereas the red edge (695 nm) paired with NIR bands were dominant in quantifying leaf Chl in the lower layers. Correlations between published spectral indices and all NDVI-, SR- and CI-like types of indices and vertical distribution of Chl content showed that reflectance measured from 50°, 30° and 20° backscattering viewing angles were the most promising to obtain information on leaf Chl in the upper-, middle-, and bottom-layer, respectively. Three types of optimized spectral indices improved the accuracy for vertical leaf Chl content estimation. The optimized three-band CI-like index performed the best in the estimation of vertical distribution of leaf Chl content, with R² of 0.84–0.69, and RMSE of 5.37–5.56 µg/cm² from the top to the bottom layers.
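
    A minimal sketch of the band-combination screening described above, assuming the standard two-band index forms; the wavelengths (520 nm, 695 nm, and a generic NIR band at 800 nm) follow the abstract, but the authors' exact three-band formulations are not reproduced here.

      import numpy as np

      def ndvi_like(r_a, r_b):          # normalized difference form
          return (r_a - r_b) / (r_a + r_b)

      def sr_like(r_a, r_b):            # simple ratio form
          return r_a / r_b

      def ci_like(r_a, r_b):            # chlorophyll index form
          return r_a / r_b - 1.0

      # dummy reflectance spectrum sampled every 1 nm from 400 to 1000 nm
      wavelengths = np.arange(400, 1001)
      reflectance = np.random.rand(wavelengths.size)

      def band(wl_nm):
          return reflectance[np.searchsorted(wavelengths, wl_nm)]

      # combinations reported as informative in the abstract
      upper_layer = ci_like(band(800), band(520))   # green paired with NIR (upper layer)
      lower_layer = ci_like(band(800), band(695))   # red edge paired with NIR (lower layers)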

  3. Relative and Absolute Calibration of a Multihead Camera System with Oblique and Nadir Looking Cameras for a Uas

    NASA Astrophysics Data System (ADS)

    Niemeyer, F.; Schima, R.; Grenzdörffer, G.

    2013-08-01

    Numerous unmanned aerial systems (UAS) are currently flooding the market. UAVs are specially designed and used for the most diverse applications. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest because the legal restrictions are still manageable and the payload capacities are sufficient for many imaging sensors. A camera system with four oblique and one nadir-looking camera is currently under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. An MD4-1000 UAS from microdrones is used as the carrier system. Lightweight industrial cameras are used and controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all cameras together, has to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the "Australis" software and gives an overview of the results and experiences of the test flights.

  4. On the vertical resolution for near-nadir looking spaceborne rain radar

    NASA Astrophysics Data System (ADS)

    Kozu, Toshiaki

    A definition of radar resolution for an arbitrary direction is proposed and used to calculate the vertical resolution for a near-nadir-looking spaceborne rain radar. Based on the calculation result, a scanning strategy is proposed which efficiently distributes the measurement time among the angle bins and thus increases the number of independent samples compared with simple linear scanning.

  5. Pettit runs a drill while looking through a camera mounted on the Nadir window in the U.S. Lab

    NASA Image and Video Library

    2003-04-05

    ISS006-E-44305 (5 April 2003) --- Astronaut Donald R. Pettit, Expedition Six NASA ISS science officer, runs a drill while looking through a camera mounted on the nadir window in the Destiny laboratory on the International Space Station (ISS). The device is called a “barn door tracker”. The drill turns the screw, which moves the camera and its spotting scope.

  6. Large off-nadir scan angle of airborne LiDAR can severely affect the estimates of forest structure metrics

    NASA Astrophysics Data System (ADS)

    Liu, Jing; Skidmore, Andrew K.; Jones, Simon; Wang, Tiejun; Heurich, Marco; Zhu, Xi; Shi, Yifang

    2018-02-01

    Gap fraction (Pgap) and the vertical gap fraction profile (vertical Pgap profile) are important forest structural metrics. Accurate estimation of Pgap and the vertical Pgap profile is therefore critical for many ecological applications, including leaf area index (LAI) mapping, LAI profile estimation and wildlife habitat modelling. Although many studies estimated Pgap and the vertical Pgap profile from airborne LiDAR data, the scan angle was often overlooked and a nadir view assumed. However, the scan angle can be off-nadir and highly variable in the same flight strip or across different flight strips. In this research, the impact of off-nadir scan angle on Pgap and the vertical Pgap profile was evaluated for several forest types. Airborne LiDAR data from nadir (0°∼7°), small off-nadir (7°∼23°), and large off-nadir (23°∼38°) directions were used to calculate both Pgap and the vertical Pgap profile. Digital hemispherical photographs (DHP) acquired during fieldwork were used as references for validation. Our results show that angular Pgap from airborne LiDAR correlates well with angular Pgap from DHP (R² = 0.74, 0.87, and 0.67 for the nadir, small off-nadir and large off-nadir directions). However, the underestimation of Pgap from LiDAR amplifies at large off-nadir scan angles. By comparing Pgap and vertical Pgap profiles retrieved from different directions, it is shown that the scan angle impact on Pgap and the vertical Pgap profile differs amongst forest types. The difference is likely to be caused by different leaf angle distributions and canopy architecture in these forest types. Statistical results demonstrate that the scan angle impact is more severe for plots with discontinuous or sparse canopies. These include coniferous plots, and deciduous or mixed plots with between-crown gaps. In these discontinuous plots, Pgap and vertical Pgap profiles are maximum when observed from the nadir direction, and then rapidly decrease with increasing scan angle. The results of this research have many
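
    A minimal sketch of how angular gap fraction can be derived from classified LiDAR returns, assuming Pgap is the fraction of returns below a canopy height threshold within each scan-angle bin; the bin limits follow the abstract, while the threshold value and variable names are illustrative only.

      import numpy as np

      def gap_fraction_by_angle(scan_angle_deg, height_above_ground,
                                canopy_threshold=2.0,
                                bins=((0, 7), (7, 23), (23, 38))):
          """Pgap per scan-angle bin: share of returns below the canopy threshold."""
          angles = np.abs(np.asarray(scan_angle_deg))
          heights = np.asarray(height_above_ground)
          pgap = {}
          for lo, hi in bins:
              in_bin = (angles >= lo) & (angles < hi)
              n_total = np.count_nonzero(in_bin)
              n_gap = np.count_nonzero(in_bin & (heights < canopy_threshold))
              pgap[(lo, hi)] = n_gap / n_total if n_total else np.nan
          return pgap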

  7. Accuracy Potential and Applications of MIDAS Aerial Oblique Camera System

    NASA Astrophysics Data System (ADS)

    Madani, M.

    2012-07-01

    Airborne oblique cameras such as the Fairchild T-3A were initially used for military reconnaissance in the 1930s. A modern professional digital oblique camera such as MIDAS (Multi-camera Integrated Digital Acquisition System) is used to generate lifelike three-dimensional views for users for visualization, GIS applications, architectural modeling, city modeling, games, simulators, etc. Oblique imagery provides the best vantage for assessing and reviewing changes to the local government tax base and property valuation, and supports better and more timely decisions in the buying and selling of residential and commercial properties. Oblique imagery is also used for infrastructure monitoring, making sure of the safe operation of transportation, utilities, and facilities. Sanborn Mapping Company acquired one MIDAS from TrackAir in 2011. This system consists of four tilted (45 degrees) cameras and one vertical camera connected to a dedicated data acquisition computer system. The 5 digital cameras are based on the Canon EOS 1DS Mark3 with Zeiss lenses. The CCD size is 5,616 by 3,744 (21 MPixels) with a pixel size of 6.4 microns. Multiple flights using different camera configurations (nadir/oblique (28 mm/50 mm) and (50 mm/50 mm)) were flown over downtown Colorado Springs, Colorado. Boresight flights for the 28 mm nadir camera were flown at 600 m and 1,200 m, and for the 50 mm nadir camera at 750 m and 1,500 m. Cameras were calibrated by using a 3D cage and multiple convergent images utilizing the Australis model. In this paper, the MIDAS system is described; a number of real data sets collected during the aforementioned flights are presented together with their associated flight configurations; the data processing workflow, system calibration and quality control workflows are highlighted; and the achievable accuracy is presented in some detail. This study revealed that an expected accuracy of about 1 to 1.5 GSD (Ground Sample Distance) for planimetry and about 2 to 2.5 GSD for vertical can be achieved. Remaining systematic
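
    A worked example of the ground sample distance implied by the camera and flight parameters quoted above (6.4 micron pixels, 28 mm and 50 mm lenses, the listed flying heights), together with the reported accuracy range of 1-1.5 GSD in planimetry and 2-2.5 GSD in height; only the standard GSD formula is assumed.

      PIXEL_SIZE_M = 6.4e-6                 # 6.4 micron pixels (from the abstract)

      def gsd_m(flying_height_m, focal_length_m):
          """Ground sample distance = flying height * pixel size / focal length."""
          return flying_height_m * PIXEL_SIZE_M / focal_length_m

      for h, f in [(600, 0.028), (1200, 0.028), (750, 0.050), (1500, 0.050)]:
          g = gsd_m(h, f)
          print(f"h={h:5d} m, f={f * 1000:.0f} mm: GSD = {g * 100:.1f} cm, "
                f"planimetric ~{1.0 * g * 100:.0f}-{1.5 * g * 100:.0f} cm, "
                f"vertical ~{2.0 * g * 100:.0f}-{2.5 * g * 100:.0f} cm")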

  8. A Summer View of Russia's Lena Delta and Olenek

    NASA Technical Reports Server (NTRS)

    2004-01-01

    These views of the Russian Arctic were acquired by NASA's Multi-angle Imaging SpectroRadiometer (MISR) instrument on July 11, 2004, when the brief arctic summer had transformed the frozen tundra and the thousands of lakes, channels, and rivers of the Lena Delta into a fertile wetland, and when the usual blanket of thick snow had melted from the vast plains and taiga forests. This set of three images covers an area in the northern part of the Eastern Siberian Sakha Republic. The Olenek River wends northeast from the bottom of the images to the upper left, and the top portions of the images are dominated by the delta into which the mighty Lena River empties when it reaches the Laptev Sea. At left is a natural color image from MISR's nadir (vertical-viewing) camera, in which the rivers appear murky due to the presence of sediment, and photosynthetically-active vegetation appears green. The center image is also from MISR's nadir camera, but is a false color view in which the predominant red color is due to the brightness of vegetation at near-infrared wavelengths. The most photosynthetically active parts of this area are the Lena Delta, in the lower half of the image, and throughout the great stretch of land that curves across the Olenek River and extends northeast beyond the relatively barren ranges of the Volyoi mountains (the pale tan-colored area to the right of image center).

    The right-hand image is a multi-angle false-color view made from the red band data of the 60° backward, nadir, and 60° forward cameras, displayed as red, green and blue, respectively. Water appears blue in this image because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. Much of the landscape and many low clouds appear purple since these surfaces are both forward and backward scattering, and clouds that are further from the surface appear in a different spot for each view angle, creating a rainbow-like appearance. However, the vegetated

  9. Bird's-Eye View of Opportunity at 'Erebus' (Vertical)

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This view combines frames taken by the panoramic camera on NASA's Mars Exploration Rover Opportunity on the rover's 652nd through 663rd Martian days, or sols (Nov. 23 to Dec. 5, 2005), at the edge of 'Erebus Crater.' The mosaic is presented as a vertical projection. This type of projection provides a true-to-scale overhead view of the rover deck and nearby surrounding terrain. The view here shows outcrop rocks, sand dunes, and other features out to a distance of about 25 meters (82 feet) from the rover. Opportunity examined targets on the outcrop called 'Rimrock' in front of the rover, testing the mobility and operation of Opportunity's robotic arm. The view shows examples of the dunes and ripples that Opportunity has been crossing as the rover drives on the Meridiani plains.

    This view is a false-color composite of images taken through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters. This kind of false-color scheme emphasizes differences in composition among the different kinds of materials that the rover is exploring.

  10. MISR Views Northern Australia

    NASA Technical Reports Server (NTRS)

    2000-01-01

    MISR images of tropical northern Australia acquired on June 1, 2000 (Terra orbit 2413) during the long dry season. Left: color composite of vertical (nadir) camera blue, green, and red band data. Right: multi-angle composite of red band data only from the cameras viewing 60 degrees aft, 60 degrees forward, and nadir. Color and contrast have been enhanced to accentuate subtle details. In the left image, color variations indicate how different parts of the scene reflect light differently at blue, green, and red wavelengths; in the right image color variations show how these same scene elements reflect light differently at different angles of view. Water appears in blue shades in the right image, for example, because glitter makes the water look brighter at the aft camera's view angle. The prominent inland water body is Lake Argyle, the largest human-made lake in Australia, which supplies water for the Ord River Irrigation Area and the town of Kununurra (pop. 6500) just to the north. At the top is the southern edge of Joseph Bonaparte Gulf; the major inlet at the left is Cambridge Gulf, the location of the town of Wyndham (pop. 850), the port for this region. This area is sparsely populated, and is known for its remote, spectacular mountains and gorges. Visible along much of the coastline are intertidal mudflats of mangroves and low shrubs; to the south the terrain is covered by open woodland merging into open grassland in the lower half of the pictures.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  11. Characteristics of Deep Tropical and Subtropical Convection from Nadir-Viewing High-Altitude Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Heymsfield, Gerald M.; Tian, Lin; Heymsfield, Andrew J.; Li, Lihua; Guimond, Stephen

    2010-01-01

    This paper presents observations of deep convection characteristics in the tropics and subtropics that have been classified into four categories: tropical cyclone, oceanic, land, and sea breeze. Vertical velocities in the convection were derived from Doppler radar measurements collected during several NASA field experiments from the nadir-viewing high-altitude ER-2 Doppler radar (EDOP). Emphasis is placed on the vertical structure of the convection from the surface to cloud top (sometimes reaching 18-km altitude). This unique look at convection is not possible from other approaches such as ground-based or lower-altitude airborne scanning radars. The vertical motions from the radar measurements are derived using new relationships between radar reflectivity and hydrometeor fall speed. Various convective properties, such as the peak updraft and downdraft velocities and their corresponding altitude, heights of reflectivity levels, and widths of reflectivity cores, are estimated. The most significant findings are the following: 1) strong updrafts that mostly exceed 15 m/s, with a few exceeding 30 m/s, are found in all the deep convection cases, whether over land or ocean; 2) peak updrafts were almost always above the 10-km level and, in the case of tropical cyclones, were closer to the 12-km level; and 3) land-based and sea-breeze convection had higher reflectivities and wider convective cores than oceanic and tropical cyclone convection. In addition, the high-resolution EDOP data were used to examine the connection between reflectivity and vertical velocity, for which only weak linear relationships were found. The results are discussed in terms of dynamical and microphysical implications for numerical models and future remote sensors.
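
    A minimal sketch of the retrieval idea described above: vertical air motion is the measured (aircraft-motion-corrected) Doppler velocity with an estimated hydrometeor fall speed added back. The power-law coefficients and the sign convention (velocities positive upward) are placeholders, not the relationships actually derived for EDOP.

      import numpy as np

      def fall_speed_ms(reflectivity_dbz, a=2.6, b=0.107):
          """Illustrative power-law fall speed Vt = a * Z**b (Z in mm^6 m^-3)."""
          z_linear = 10.0 ** (np.asarray(reflectivity_dbz) / 10.0)
          return a * z_linear ** b

      def vertical_air_motion_ms(doppler_up_ms, reflectivity_dbz):
          """w = particle Doppler velocity (positive upward) + estimated fall speed."""
          return np.asarray(doppler_up_ms) + fall_speed_ms(reflectivity_dbz)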

  12. Intermediate view synthesis algorithm using mesh clustering for rectangular multiview camera system

    NASA Astrophysics Data System (ADS)

    Choi, Byeongho; Kim, Taewan; Oh, Kwan-Jung; Ho, Yo-Sung; Choi, Jong-Soo

    2010-02-01

    A multiview video-based three-dimensional (3-D) video system offers a realistic impression and free view navigation to the user. Efficient compression and intermediate view synthesis are key technologies, since 3-D video systems deal with multiple views. We propose an intermediate view synthesis using a rectangular multiview camera system that is suitable for realizing 3-D video systems. The rectangular multiview camera system not only offers free view navigation both horizontally and vertically, but also can employ three reference views, such as left, right, and bottom, for intermediate view synthesis. The proposed view synthesis method first represents each reference view as meshes and then finds the best disparity for each mesh element by stereo matching between reference views. Before stereo matching, we separate the virtual image to be synthesized into several regions to enhance the accuracy of the disparities. The mesh is classified into foreground and background groups by disparity values and then affine transformed. Through experiments, we confirm that the proposed method synthesizes high-quality images and is suitable for 3-D video systems.

  13. Sensitivity of MODIS 2.1 micron Channel for Off-Nadir View Angles for Use in Remote Sensing of Aerosol

    NASA Technical Reports Server (NTRS)

    Gatebe, C. K.; King, M. D.; Tsay, S.-C.; Ji, Q.

    2000-01-01

    Remote sensing of aerosol over land from MODIS will be based on dark targets using the mid-IR channels at 2.1 and 3.9 micron. This approach was developed by Kaufman et al. (1997), who suggested that the dark surface reflectance in the red channel (0.66 micron, ρ0.66) is half of that at 2.2 micron (ρ2.2), and the reflectance in the blue channel (0.49 micron, ρ0.49) is a quarter of that at 2.2 micron. Using this relationship, the surface reflectance in the visible channels can be predicted from ρ2.2 to within Δρ0.49 ≈ Δρ0.66 ≈ 0.006 for ρ2.2 ≤ 0.10. This was half the error obtained using the 3.75 micron channel and corresponds to an error in aerosol optical thickness of Δτ ≈ 0.06. These results, though applicable to several biomes (e.g. forests and brighter lower canopies), have only been tested at one view angle, the nadir (θ = 0°). Considering the importance of the results in remote sensing of aerosols over land surfaces from space, we are validating the relationships for off-nadir view angles using Cloud Absorption Radiometer (CAR) data. The CAR data are available for channels between 0.3 and 2.3 micron and for different surface types and conditions: forest, tundra, ocean, sea-ice, swamp, grassland and areas covered with smoke. In this study we analyzed data collected during the Smoke, Clouds, and Radiation - Brazil (SCAR-B) experiment to validate Kaufman et al.'s (1997) results for non-nadir view angles. We will show the correlation between ρ0.472, ρ0.675, and ρ2.2 for view angles between nadir (0°) and 55° off-nadir, and for different viewing directions in the backscatter and forward-scatter directions.
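
    A small sketch of the dark-target relationship quoted above (ρ0.49 ≈ ρ2.2/4 and ρ0.66 ≈ ρ2.2/2 for dark surfaces with ρ2.2 ≤ 0.10), with the stated uncertainty of about ±0.006 in the predicted reflectances; the function name is illustrative.

      def predict_visible_reflectance(rho_2_2):
          """Dark-target prediction of visible surface reflectance (Kaufman et al., 1997)."""
          if rho_2_2 > 0.10:
              raise ValueError("relationship quoted only for rho_2.2 <= 0.10")
          rho_0_49 = rho_2_2 / 4.0      # blue channel
          rho_0_66 = rho_2_2 / 2.0      # red channel
          return rho_0_49, rho_0_66

      # e.g. rho_2.2 = 0.08  ->  rho_0.49 = 0.02, rho_0.66 = 0.04
      # (each with a quoted uncertainty of roughly +/- 0.006)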

  14. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  16. Computerized data reduction techniques for nadir viewing remote sensors

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gormsen, Barbara B.

    1985-01-01

    Computer resources have been developed for the analysis and reduction of MAPS experimental data from the OSTA-1 payload. The MAPS Research Project is concerned with the measurement of the global distribution of mid-tropospheric carbon monoxide. The measurement technique for the MAPS instrument is based on a non-dispersive gas filter radiometer operating in the nadir-viewing mode. The MAPS experiment has two passive remote sensing instruments: the prototype instrument, which is used to measure tropospheric air pollution from aircraft platforms, and the third-generation (OSTA) instrument, which is used to measure carbon monoxide in the mid and upper troposphere from space platforms. Extensive effort was also expended in support of the MAPS/OSTA-3 shuttle flight. Specific capabilities and resources developed are discussed.

  17. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  18. Sensitivity of MODIS 2.1-(micrometers) Channel for Off-Nadir View Angles for Use in Remote Sensing of Aerosol

    NASA Technical Reports Server (NTRS)

    Gatebe, C. K.; King, M. D.; Tsay, S.-C.; Ji, Q.; Arnold, T.

    2000-01-01

    In this sensitivity study, we examined the ratio technique, the official method for remote sensing of aerosols over land from Moderate Resolution Imaging Spectroradiometer (MODIS) data, for view angles from nadir to 65° off-nadir using Cloud Absorption Radiometer (CAR) data collected during the Smoke, Clouds, and Radiation-Brazil (SCAR-B) experiment conducted in 1995. For the data analyzed and for the view angles tested, the results seem to suggest that the reflectances ρ0.47 and ρ0.67 are predictable from ρ2.1 using ρ0.47 = ρ2.1/6, which is a slight modification of the original relationship, and ρ0.67 = ρ2.1/2. These results hold for targets viewed from the backscatter direction, but not for the forward direction.

  19. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. Changing the Production Pipeline - Use of Oblique Aerial Cameras for Mapping Purposes

    NASA Astrophysics Data System (ADS)

    Moe, K.; Toschi, I.; Poli, D.; Lago, F.; Schreiner, C.; Legat, K.; Remondino, F.

    2016-06-01

    This paper discusses the potential of current photogrammetric multi-head oblique cameras, such as UltraCam Osprey, to improve the efficiency of standard photogrammetric methods for surveying applications like inventory surveys and topographic mapping for public administrations or private customers. In 2015, Terra Messflug (TM), a subsidiary of Vermessung AVT ZT GmbH (Imst, Austria), flew a number of urban areas in Austria, the Czech Republic and Hungary with an UltraCam Osprey Prime multi-head camera system from Vexcel Imaging. In collaboration with FBK Trento (Italy), the data acquired at Imst (a small town in Tyrol, Austria) were analysed and processed to extract precise 3D topographic information. The Imst block comprises 780 images and covers an area of approx. 4.5 km by 1.5 km. Ground truth data is provided in the form of 6 GCPs and several check points surveyed with RTK GNSS. In addition, 3D building data obtained by photogrammetric stereo plotting from a 5 cm nadir flight and a LiDAR point cloud with 10 to 20 measurements per m² are available as reference data or for comparison. The photogrammetric workflow, from flight planning to Dense Image Matching (DIM) and 3D building extraction, is described together with the achieved accuracy. For each step, the differences and innovation with respect to standard photogrammetric procedures based on nadir images are shown, including high overlaps, improved vertical accuracy, and visibility of areas masked in the standard vertical views. Finally, the advantages of using oblique images for inventory surveys are demonstrated.

  1. ENGINEERING TEST REACTOR (ETR) BUILDING, TRA642. CONTEXTUAL VIEW, CAMERA FACING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ENGINEERING TEST REACTOR (ETR) BUILDING, TRA-642. CONTEXTUAL VIEW, CAMERA FACING EAST. VERTICAL METAL SIDING. ROOF IS SLIGHTLY ELEVATED AT CENTER LINE FOR DRAINAGE. WEST SIDE OF ETR COMPRESSOR BUILDING, TRA-643, PROJECTS TOWARD LEFT AT FAR END OF ETR BUILDING. INL NEGATIVE NO. HD46-37-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. Use of Vertical Aerial Images for Semi-Oblique Mapping

    NASA Astrophysics Data System (ADS)

    Poli, D.; Moe, K.; Legat, K.; Toschi, I.; Lago, F.; Remondino, F.

    2017-05-01

    The paper proposes a methodology for the use of the oblique sections of images from large-format photogrammetric cameras, by exploiting the effect of the central perspective geometry in the lateral parts of the nadir images ("semi-oblique" images). The point of origin of the investigation was the execution of a photogrammetric flight over Norcia (Italy), which was seriously damaged by the earthquake of 30/10/2016. Contrary to the original plan of oblique acquisitions, the flight was executed on 15/11/2016 using an UltraCam Eagle camera with a focal length of 80 mm, combining two flight plans rotated by 90° ("crisscross" flight). The images (GSD 5 cm) were used to extract a 2.5D DSM, sampled to an XY-grid size of 2 GSD, a 3D point cloud with a mean spatial resolution of 1 GSD, and a 3D mesh model at a resolution of 10 cm of the historic centre of Norcia for a quantitative assessment of the damage. From the acquired nadir images, the "semi-oblique" images (forward, backward, left and right views) could be extracted and processed in a modified version of the GEOBLY software for measurement and restitution purposes. The potential of such semi-oblique image acquisitions from nadir-view cameras is shown and discussed in the following.
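
    A quick illustration of how oblique-looking view angles arise in the lateral parts of a nadir frame: a pixel at distance x from the principal point is viewed at θ = arctan(x / f). The 80 mm focal length is from the abstract; the sensor half-width used below is an assumption for illustration, since the UltraCam Eagle format is not stated here.

      import math

      FOCAL_LENGTH_MM = 80.0        # from the abstract
      HALF_WIDTH_MM = 52.0          # assumed sensor half-width (illustrative only)

      def off_nadir_angle_deg(x_mm, f_mm=FOCAL_LENGTH_MM):
          """Viewing angle of a pixel located x mm from the principal point."""
          return math.degrees(math.atan(x_mm / f_mm))

      print(off_nadir_angle_deg(HALF_WIDTH_MM))   # ~33 deg at the lateral image edge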

  3. NASA MISR Views Kruger National Park

    NASA Image and Video Library

    2010-10-06

    This nadir camera view was captured by NASA Terra spacecraft around Kruger National Park in NE South Africa. The bright white feature is the Palabora Copper Mine, and the water body near upper right is Lake Massingir in Mozambique.

  4. New Zealand

    Atmospheric Science Data Center

    2013-04-16

    ... image is a natural color view from the instrument's vertical-viewing (nadir) camera. It is presented at a resolution of 550 meters per ... is Queenstown, the principal resort town of the island. The remote and spectacular Fiordland National Park, which occupies the far ...

  5. TES/Aura L2 Atmospheric Temperatures Nadir V6 (TL2ATMTN)

    Atmospheric Science Data Center

    2018-01-18

    TES/Aura L2 Atmospheric Temperatures Nadir (TL2ATMTN) News:  TES News ... Level:  L2 Platform:  TES/Aura L2 Atmospheric Temperatures Spatial Coverage:  5.3 x 8.5 km nadir ... Contact User Services Parameters:  Atmospheric Temperature Temperature Precision Vertical Resolution ...

  6. Colorado

    Atmospheric Science Data Center

    2014-05-15

    ... the Multi-angle Imaging SpectroRadiometer (MISR). On the left, a natural-color view acquired by MISR's vertical-viewing (nadir) camera ... Gunnison River at the city of Grand Junction. The striking "L" shaped feature in the lower image center is a sandstone monocline known as ...

  7. TES/Aura L2 Atmospheric Temperatures Nadir V6 (TL2TNS)

    Atmospheric Science Data Center

    2018-01-22

    TES/Aura L2 Atmospheric Temperatures Nadir (TL2TNS) News:  TES News ... Level:  L2 Platform:  TES/Aura L2 Atmospheric Temperatures Spatial Coverage:  5.3 x 8.5 km nadir ... Contact ASDC User Services Parameters:  Atmospheric Temperature Temperature Precision Vertical Resolution ...

  8. Vertical view of Apollo 16 landing site located Descartes area lunar nearside

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A vertical view of the Apollo 16 landing site located in the Descartes area lunar nearside. The overlay indicates the location of the proposed touchdown point for the Apollo 16 Lunar Module. Descartes is located west of the Sea of Nectar and southwest of the Sea of Tranquility. This photograph was taken with a 500mm lens camera from lunar orbit by the Apollo 14 crew.

  9. The imaging system design of three-line LMCCD mapping camera

    NASA Astrophysics Data System (ADS)

    Zhou, Huai-de; Liu, Jin-Guo; Wu, Xing-Xing; Lv, Shi-Liang; Zhao, Ying; Yu, Da

    2011-08-01

    This paper first introduces the theory of the LMCCD (line-matrix CCD) mapping camera and the composition of its imaging system. It then describes several pivotal designs of the imaging system, such as the design of the focal plane module, the video signal processing, the controller of the imaging system, and the synchronous photography of the forward, nadir and backward cameras and of the line-matrix CCD of the nadir camera. Finally, the test results of the LMCCD mapping camera imaging system are presented. The results are as follows: the precision of synchronous photography between the forward, nadir and backward cameras is better than 4 ns, as is that of the line-matrix CCD of the nadir camera; the photography interval of the line-matrix CCD of the nadir camera satisfies the buffer requirements of the LMCCD focal plane module; the SNR tested in the laboratory is better than 95 for each CCD image under typical working conditions (solar incidence angle of 30°, surface reflectivity of 0.3); and the temperature of the focal plane module is kept below 30 °C over a working period of 15 minutes. These results satisfy the requirements for synchronous photography, temperature control of the focal plane module and SNR, which guarantees the precision needed for satellite photogrammetry.

  10. Ultraviolet Viewing with a Television Camera.

    ERIC Educational Resources Information Center

    Eisner, Thomas; And Others

    1988-01-01

    Reports on a portable video color camera that is fully suited for seeing ultraviolet images and offers some expanded viewing possibilities. Discusses the basic technique, specialized viewing, and the instructional value of this system of viewing reflectance patterns of flowers and insects that are invisible to the unaided eye. (CW)

  11. Two Perspectives on Forest Fire

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Multi-angle Imaging Spectroradiometer (MISR) images of smoke plumes from wildfires in western Montana acquired on August 14, 2000. A portion of Flathead Lake is visible at the top, and the Bitterroot Range traverses the images. The left view is from MISR's vertical-viewing (nadir) camera. The right view is from the camera that looks forward at a steep angle (60 degrees). The smoke location and extent are far more visible when seen at this highly oblique angle. However, vegetation is much darker in the forward view. A brown burn scar is located nearly in the exact center of the nadir image, while in the high-angle view it is shrouded in smoke. Also visible in the center and upper right of the images, and more obvious in the clearer nadir view, are checkerboard patterns on the surface associated with land ownership boundaries and logging. Compare these images with the high resolution infrared imagery captured nearby by Landsat 7 half an hour earlier. Images by NASA/GSFC/JPL, MISR Science Team.

  12. A direct-view customer-oriented digital holographic camera

    NASA Astrophysics Data System (ADS)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  13. Russia: Saratov

    Atmospheric Science Data Center

    2013-04-17

    ... to differences in brightness and texture between bare soil and vegetated land. The chestnut-colored soils in this region are brighter ... of vegetation relative to the nadir camera, which sees more soil. In spring, therefore, the scene is brightest in the vertical view and ...

  14. Namibia and Central Angola

    Atmospheric Science Data Center

    2013-04-15

    ... The images on the left are natural color (red, green, blue) images from MISR's vertical-viewing (nadir) camera. The images on the ... one of MISR's derived surface products. The radiance (light intensity) in each pixel of the so-called "top-of-atmosphere" images on ...

  15. 32. DETAIL VIEW OF CAMERA PIT SOUTH OF LAUNCH PAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. DETAIL VIEW OF CAMERA PIT SOUTH OF LAUNCH PAD WITH CAMERA AIMED AT LAUNCH DECK; VIEW TO NORTHEAST. - Cape Canaveral Air Station, Launch Complex 17, Facility 28402, East end of Lighthouse Road, Cape Canaveral, Brevard County, FL

  16. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  17. 4. VIEW OF VERTICAL BORING MACHINE. (Bullard) Vertical turning lathe ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. VIEW OF VERTICAL BORING MACHINE. (Bullard) Vertical turning lathe (VTL). Machining the fixture for GE Turboshroud. G.S. O'Brien, operator. - Juniata Shops, Machine Shop No. 1, East of Fourth Avenue at Third Street, Altoona, Blair County, PA

  18. Vertical view of Arab Republic of Egypt from ASTP mission

    NASA Image and Video Library

    1975-07-19

    AST-09-555 (19 July 1975) --- A vertical view of a portion of the Arab Republic of Egypt, as photographed from the Apollo spacecraft in Earth orbit during the joint U.S.-USSR Apollo-Soyuz Test Project mission. The Nile Delta is in the most northerly corner of the picture. The City of Cairo on the Nile River is in the center of the photograph. The Gulf of Suez is in the most easterly corner of the picture. El Faiyum is south-southwest of Cairo. This picture was taken at an altitude of 223 kilometers (138 statute miles), with a 70mm Hasselblad camera using medium-speed Ektachrome QX-807 type film.

  19. The Panoramic Camera (Pancam) Investigation on the NASA 2003 Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.

    2003-01-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover.

  20. Web Camera Use of Mothers and Fathers When Viewing Their Hospitalized Neonate.

    PubMed

    Rhoads, Sarah J; Green, Angela; Gauss, C Heath; Mitchell, Anita; Pate, Barbara

    2015-12-01

    Mothers and fathers of neonates hospitalized in a neonatal intensive care unit (NICU) differ in their experiences related to NICU visitation. To describe the frequency and length of maternal and paternal viewing of their hospitalized neonates via a Web camera. A total of 219 mothers and 101 fathers used the Web camera that allows 24/7 NICU viewing from September 1, 2010, to December 31, 2012, which included 40 mother and father dyads. We conducted a review of the Web camera's Web site log-on records in this nonexperimental, descriptive study. Mothers and fathers had a significant difference in the mean number of log-ons to the Web camera system (P = .0293). Fathers virtually visited the NICU less often than mothers, but there was not a statistical difference between mothers and fathers in terms of the mean total number of minutes viewing the neonate (P = .0834) or in the maximum number of minutes of viewing in 1 session (P = .6924). Patterns of visitations over time were not measured. Web camera technology could be a potential intervention to aid fathers in visiting their neonates. Both parents should be offered virtual visits using the Web camera and oriented regarding how to use the Web camera. These findings are important to consider when installing Web cameras in a NICU. Future research should continue to explore Web camera use in NICUs.

  1. Electronic Still Camera view of Aft end of Wide Field/Planetary Camera in HST

    NASA Image and Video Library

    1993-12-06

    S61-E-015 (6 Dec 1993) --- A close-up view of the aft part of the new Wide Field/Planetary Camera (WFPC-II) installed on the Hubble Space Telescope (HST). WFPC-II was photographed with the Electronic Still Camera (ESC) from inside Endeavour's cabin as astronauts F. Story Musgrave and Jeffrey A. Hoffman moved it from its stowage position onto the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  2. Touch And Go Camera System (TAGCAMS) for the OSIRIS-REx Asteroid Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Bos, B. J.; Ravine, M. A.; Caplinger, M.; Schaffner, J. A.; Ladewig, J. V.; Olds, R. D.; Norman, C. D.; Huish, D.; Hughes, M.; Anderson, S. K.; Lorenz, D. A.; May, A.; Jackman, C. D.; Nelson, D.; Moreau, M.; Kubitschek, D.; Getzandanner, K.; Gordon, K. E.; Eberhardt, A.; Lauretta, D. S.

    2018-02-01

    NASA's OSIRIS-REx asteroid sample return mission spacecraft includes the Touch And Go Camera System (TAGCAMS) three camera-head instrument. The purpose of TAGCAMS is to provide imagery during the mission to facilitate navigation to the target asteroid, confirm acquisition of the asteroid sample, and document asteroid sample stowage. The cameras were designed and constructed by Malin Space Science Systems (MSSS) based on requirements developed by Lockheed Martin and NASA. All three of the cameras are mounted to the spacecraft nadir deck and provide images in the visible part of the spectrum, 400-700 nm. Two of the TAGCAMS cameras, NavCam 1 and NavCam 2, serve as fully redundant navigation cameras to support optical navigation and natural feature tracking. Their boresights are aligned in the nadir direction with small angular offsets for operational convenience. The third TAGCAMS camera, StowCam, provides imagery to assist with and confirm proper stowage of the asteroid sample. Its boresight is pointed at the OSIRIS-REx sample return capsule located on the spacecraft deck. All three cameras have at their heart a 2592 × 1944 pixel complementary metal oxide semiconductor (CMOS) detector array that provides up to 12-bit pixel depth. All cameras also share the same lens design and a camera field of view of roughly 44° × 32° with a pixel scale of 0.28 mrad/pixel. The StowCam lens is focused to image features on the spacecraft deck, while both NavCam lens focus positions are optimized for imaging at infinity. A brief description of the TAGCAMS instrument and how it is used to support critical OSIRIS-REx operations is provided.
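
    A quick consistency check of the numbers quoted above (2592 × 1944 pixel detector, 0.28 mrad/pixel, roughly 44° × 32° field of view); this is plain arithmetic, with no assumption beyond converting milliradians to degrees.

      import math

      PIXEL_SCALE_MRAD = 0.28

      for n_pixels in (2592, 1944):
          fov_deg = math.degrees(n_pixels * PIXEL_SCALE_MRAD * 1e-3)
          print(f"{n_pixels} px x {PIXEL_SCALE_MRAD} mrad/px = {fov_deg:.1f} deg")
      # ~41.6 deg x 31.2 deg across the detector, consistent with the quoted ~44 deg x 32 deg
      # field of view (the quoted values presumably include optical margin/distortion).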

  3. High-Resolution Large Field-of-View FUV Compact Camera

    NASA Technical Reports Server (NTRS)

    Spann, James F.

    2006-01-01

    The need for a high-resolution camera with a large field of view, capable of imaging dim emissions in the far-ultraviolet, is driven by the widely varying intensities of FUV emissions and the spatial/temporal scales of phenomena of interest in the Earth's ionosphere. In this paper, the concept of a camera is presented that is designed to achieve these goals in a lightweight package with sufficient visible-light rejection to be useful for dayside and nightside emissions. The camera employs the concept of self-filtering to achieve good spectral resolution tuned to specific wavelengths. The large field of view is sufficient to image the Earth's disk at geosynchronous altitudes and is capable of a spatial resolution of >20 km. The optics and filters are emphasized.

  4. Sampling errors for a nadir viewing instrument on the International Space Station

    NASA Astrophysics Data System (ADS)

    Berger, H. I.; Pincus, R.; Evans, F.; Santek, D.; Ackerman, S.; Ackerman, S.

    2001-12-01

    In an effort to improve the observational characterization of ice clouds in the Earth's atmosphere, we are developing a sub-millimeter wavelength radiometer which we propose to fly on the International Space Station for two years. Our goal is to accurately measure the ice water path and mass-weighted particle size at the finest possible temporal and spatial resolution. The ISS orbit precesses, sampling through the diurnal cycle every 16 days, but technological constraints limit our instrument to a single pixel viewed near nadir. We discuss sampling errors associated with this instrument/platform configuration. We use as "truth" the ISCCP dataset of pixel-level cloud optical retrievals, which acts as a proxy for ice water path; this dataset is sampled according to the orbital characteristics of the space station, and the statistics computed from the sub-sampled population are compared with those from the full dataset. We explore the tradeoffs in average sampling error as a function of the averaging time and spatial scale, and explore the possibility of resolving the diurnal cycle.
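
    A minimal sketch of the sub-sampling experiment described above: treat a dataset with a known diurnal cycle as truth, sample it at a single near-nadir pixel whose local overpass time drifts through 24 hours every 16 days (as stated for the ISS orbit), and compare the sub-sampled statistics with the full ones. The synthetic cloud signal and noise level below are placeholders, not ISCCP data.

      import numpy as np

      def diurnal_signal(local_hour):
          """Synthetic diurnal cloud signal peaking in mid-afternoon (placeholder)."""
          return 0.3 + 0.1 * np.sin(2.0 * np.pi * (local_hour - 8.0) / 24.0)

      rng = np.random.default_rng(0)
      n_days = 64
      days = np.arange(n_days)

      # local solar time of the single near-nadir overpass drifts through 24 h every ~16 days
      overpass_hour = (days * 24.0 / 16.0) % 24.0
      samples = diurnal_signal(overpass_hour) + rng.normal(0.0, 0.02, n_days)

      truth_mean = diurnal_signal(np.linspace(0.0, 24.0, 24 * n_days)).mean()
      print(f"sampling error of the {n_days}-day mean: {samples.mean() - truth_mean:+.4f}")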

  5. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  6. The Orbiter camera payload system's large-format camera and attitude reference system

    NASA Technical Reports Server (NTRS)

    Schardt, B. B.; Mollberg, B. H.

    1985-01-01

    The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.

  7. Movable Cameras And Monitors For Viewing Telemanipulator

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Venema, Steven C.

    1993-01-01

    Three methods proposed to assist operator viewing telemanipulator on video monitor in control station when video image generated by movable video camera in remote workspace of telemanipulator. Monitors rotated or shifted and/or images in them transformed to adjust coordinate systems of scenes visible to operator according to motions of cameras and/or operator's preferences. Reduces operator's workload and probability of error by obviating need for mental transformations of coordinates during operation. Methods applied in outer space, undersea, in nuclear industry, in surgery, in entertainment, and in manufacturing.

  8. Opportunity's View After Drive on Sol 1806 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings just after driving 60.86 meters (200 feet) on the 1,806th Martian day, or sol, of Opportunity's surface mission (Feb. 21, 2009). North is at the center; south at both ends.

    Tracks from the drive extend northward across dark-toned sand ripples and light-toned patches of exposed bedrock in the Meridiani Planum region of Mars. For scale, the distance between the parallel wheel tracks is about 1 meter (about 40 inches).

    Engineers designed the Sol 1806 drive to be driven backwards as a strategy to redistribute lubricant in the rover's wheels. The right-front wheel had been showing signs of increased friction.

    The rover's position after the Sol 1806 drive was about 2 kilometers (1.2 miles) south-southwest of Victoria Crater. Cumulative odometry was 14.74 kilometers (9.16 miles) since landing in January 2004, including 2.96 kilometers (1.84 miles) since climbing out of Victoria Crater on the west side of the crater on Sol 1634 (August 28, 2008).

    This view is presented as a vertical projection with geometric seam correction.

  9. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  10. Time for a Change; Spirit's View on Sol 1843 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Spirit used its navigation camera to take the images that have been combined into this full-circle view of the rover's surroundings during the 1,843rd Martian day, or sol, of Spirit's surface mission (March 10, 2009). South is in the middle. North is at both ends.

    This view is presented as a vertical projection with geometric seam correction. North is at the top.

    The rover had driven 36 centimeters downhill earlier on Sol 1854, but had not been able to get free of ruts in soft material that had become an obstacle to getting around the northeastern corner of the low plateau called 'Home Plate.'

    The Sol 1854 drive, following two others in the preceding four sols that also achieved little progress in the soft ground, prompted the rover team to switch to a plan of getting around Home Plate counterclockwise, instead of clockwise. The drive direction in subsequent sols was westward past the northern edge of Home Plate.

  11. Calibration Procedures on Oblique Camera Setups

    NASA Astrophysics Data System (ADS)

    Kemper, G.; Melykuti, B.; Yu, C.

    2016-06-01

    Besides the creation of virtual animated 3D city models and analyses for homeland security and city planning, the accurate determination of geometric features from oblique imagery is an important task today. Due to the huge number of single images, the need to reduce the number of control points forces the use of direct referencing devices. This requires a precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and presents examples of the calibration flight together with the final 3D city model. In contrast to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the AT process as single pre-oriented data. This enables a better post-calibration in order to detect variations in the single camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and is equipped with a 50 mm lens, while the oblique ones capture images with 50 MPix using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS receiver. The sensor is stabilized by a gyro-mount which creates floating antenna-IMU lever arms; these had to be registered together with the raw GNSS-IMU data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and the registered GPS/IMU data. This specific mission was designed with two different altitudes and additional cross lines at each flying height. The five images from each exposure position have no overlap, but in the block there are many overlaps, resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfying number for the camera calibration. In a first step with the help of

  12. The Potential of Low-Cost Rpas for Multi-View Reconstruction of Sub-Vertical Rock Faces

    NASA Astrophysics Data System (ADS)

    Thoeni, K.; Guccione, D. E.; Santise, M.; Giacomini, A.; Roncella, R.; Forlani, G.

    2016-06-01

    The current work investigates the potential of two low-cost off-the-shelf quadcopters for multi-view reconstruction of sub-vertical rock faces. The two platforms used are a DJI Phantom 1 equipped with a Gopro Hero 3+ Black and a DJI Phantom 3 Professional with integrated camera. The study area is a small sub-vertical rock face. Several flights were performed with both cameras set in time-lapse mode. Hence, images were taken automatically but the flights were performed manually as the investigated rock face is very irregular which required manual adjustment of the yaw and roll for optimal coverage. The digital images were processed with commercial SfM software packages. Several processing settings were investigated in order to find out the one providing the most accurate 3D reconstruction of the rock face. To this aim, all 3D models produced with both platforms are compared to a point cloud obtained with a terrestrial laser scanner. Firstly, the difference between the use of coded ground control targets and the use of natural features was studied. Coded targets generally provide the best accuracy, but they need to be placed on the surface, which is not always possible, as sub-vertical rock faces are not easily accessible. Nevertheless, natural features can provide a good alternative if wisely chosen as shown in this work. Secondly, the influence of using fixed interior orientation parameters or self-calibration was investigated. The results show that, in the case of the used sensors and camera networks, self-calibration provides better results. To support such empirical finding, a numerical investigation using a Monte Carlo simulation was performed.
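
    One common way to make the comparison described above (SfM models against a terrestrial laser scan) is a nearest-neighbour, cloud-to-cloud distance computed with a KD-tree; a generic sketch, not the specific software or workflow used by the authors.

      import numpy as np
      from scipy.spatial import cKDTree

      def cloud_to_cloud_distances(sfm_xyz, tls_xyz):
          """Distance from each SfM point to its nearest TLS point (both N x 3 arrays)."""
          tree = cKDTree(tls_xyz)
          distances, _ = tree.query(sfm_xyz, k=1)
          return distances

      # typical summary statistics for such comparisons:
      # d = cloud_to_cloud_distances(sfm_xyz, tls_xyz)
      # print(d.mean(), d.std(), np.percentile(d, 95))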

  13. MISR Views a Fire-Scarred Landscape

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This MISR image pair shows 'before and after' views of the area around the Hanford Nuclear Reservation near Richland, Washington. On June 27, 2000, a fire in the dry sagebrush was sparked by an automobile crash. The flames were fanned by hot summer winds. By the day after the accident, about 100,000 acres had burned, and the fire's spread forced the closure of highways and loss of homes.

    These images, from Terra orbits 2176 and 3341, were obtained by MISR's vertical-viewing (nadir) camera. Compare the area just above and to the right of the line of cumulus clouds in the May 15 image with the same area imaged on August 3. The darkened burn scar measures approximately 35 kilometers across. The Columbia River is seen wending its way around the area, and the Snake River branches off to the right.

    According to Idaho's National Interagency Fire Center, the US has been experiencing the worst fire season since 1996.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  14. Note on the Effect of Horizontal Gradients for Nadir-Viewing Microwave and Infrared Sounders

    NASA Technical Reports Server (NTRS)

    Joiner, J.; Poli, P.

    2004-01-01

    Passive microwave and infrared nadir sounders such as the Advanced Microwave Sounding Unit A (AMSU-A) and the Atmospheric InfraRed Sounder (AIRS), both flying on NASA's EOS Aqua satellite, provide information about vertical temperature and humidity structure that is used in data assimilation systems for numerical weather prediction and climate applications. These instruments scan cross track so that at the satellite swath edges, the satellite zenith angles can reach approx. 60 deg. The emission path through the atmosphere as observed by the satellite is therefore slanted with respect to the satellite footprint's zenith. Although radiative transfer codes currently in use at operational centers use the appropriate satellite zenith angle to compute brightness temperature, the input atmospheric fields are those from the vertical profile above the center of the satellite footprint. If horizontal gradients are present in the atmospheric fields, the use of a vertical atmospheric profile may produce an error. This note attempts to quantify the effects of horizontal gradients on AIRS and AMSU-A channels by computing brightness temperatures with accurate slanted atmospheric profiles. We use slanted temperature, water vapor, and ozone fields from data assimilation systems. We compare the calculated slanted and vertical brightness temperatures with AIRS and AMSU-A observations. We show that the effects of horizontal gradients on these sounders are generally small and below instrument noise. However, there are cases where the effects are greater than the instrument noise and may produce erroneous increments in an assimilation system. The majority of the affected channels have weighting functions that peak in the upper troposphere (water vapor sensitive channels) and above (temperature sensitive channels) and are unlikely to significantly impact tropospheric numerical weather prediction. However, the errors could be significant for other applications such as stratospheric
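
    The geometric core of the slant-path argument is that, for a satellite zenith angle θ, the line of sight is displaced horizontally by roughly z·tan(θ) at height z, so the atmospheric state should be sampled along that displaced path instead of the vertical column above the footprint centre. A minimal sketch under flat-Earth assumptions, with an illustrative dummy field standing in for the assimilated temperature grid:

        import numpy as np

        def slant_profile(field, lon0, lat0, z_levels_km, zenith_deg, azimuth_deg):
            """Sample field(lon, lat, z) along the slanted line of sight.
            'field' is an assumed callable (e.g. a grid interpolator wrapper);
            Earth curvature and refraction are neglected in this sketch."""
            theta = np.radians(zenith_deg)
            phi = np.radians(azimuth_deg)
            km_per_deg = 111.0  # rough conversion, illustrative
            values = []
            for z in z_levels_km:
                horiz = z * np.tan(theta)                  # horizontal offset at height z
                dlat = horiz * np.cos(phi) / km_per_deg
                dlon = horiz * np.sin(phi) / (km_per_deg * np.cos(np.radians(lat0)))
                values.append(field(lon0 + dlon, lat0 + dlat, z))
            return np.array(values)

        # Dummy field for demonstration: temperature falling with height, with a
        # weak horizontal gradient in latitude.
        demo_field = lambda lon, lat, z: 288.0 - 6.5 * z + 0.5 * (lat - 45.0)
        print(slant_profile(demo_field, lon0=10.0, lat0=45.0,
                            z_levels_km=np.arange(0, 16, 2.0),
                            zenith_deg=60.0, azimuth_deg=90.0))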

  15. View of forest fires in South America

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This view, acquired with a Hasselblad camera equipped with a 250mm lens, shows only a small portion of forest fires that marked the Earth photography taken over Bolivia, Brazil, Paraguay, and Argentina during this mission. Numerous fires are visible in this late-dry-season scene of the areas between the Parana and Uruguay Rivers. Most of this burning is usually associated with agricultural preparations. The nadir point of the Space Shuttle at the time this photograph was taken (2018 GMT, September 16, 1993) was 28.5 degrees South, 60.0 degrees West. The view is to the west.

  16. STS-37 Breakfast / Ingress / Launch & ISO Camera Views

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The primary objective of the STS-37 mission was to deploy the Gamma Ray Observatory. The mission was launched at 9:22:44 am on April 5, 1991, onboard the space shuttle Atlantis. The mission was led by Commander Steven Nagel. The crew was Pilot Kenneth Cameron and Mission Specialists Jerry Ross, Jay Apt, and Linda Godwin. This videotape shows the crew having breakfast on the launch day, with the narrator introducing them. It then shows the crew's final preparations and the entry into the shuttle, while the narrator gives information about each of the crew members. The countdown and launch are shown, including the shuttle separation from the solid rocket boosters. The launch is then reshown from 17 different camera views, some of them in black and white.

  17. Conceptual design of a neutron camera for MAST Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiszflog, M., E-mail: matthias.weiszflog@physics.uu.se; Sangaroon, S.; Cecconello, M.

    2014-11-15

    This paper presents two different conceptual designs of neutron cameras for Mega Ampere Spherical Tokamak (MAST) Upgrade. The first one consists of two horizontal cameras, one equatorial and one vertically down-shifted by 65 cm. The second design, viewing the plasma in a poloidal section, also consists of two cameras, one radial and the other one with a diagonal view. Design parameters for the different cameras were selected on the basis of neutron transport calculations and on a set of target measurement requirements taking into account the predicted neutron emissivities in the different MAST Upgrade operating scenarios. Based on a comparison of the cameras' profile resolving power, the horizontal cameras are suggested as the best option.

  18. Fisheye camera around view monitoring system

    NASA Astrophysics Data System (ADS)

    Feng, Cong; Ma, Xinjun; Li, Yuanyuan; Wu, Chenchen

    2018-04-01

    The 360-degree around view monitoring system is a key technology of advanced driver assistance systems; it helps the driver cover blind areas and has high application value. In this paper, we study the transformation relationships between multiple coordinate systems in order to generate a panoramic image in a unified car coordinate system. Firstly, the panoramic image is divided into four regions. Using the parameters obtained by calibration, the fisheye image pixels corresponding to the four sub-regions are mapped into the constructed panoramic image. On the basis of the 2D around view monitoring system, a 3D version is realized by reconstructing the projection surface. We then compare the 2D and 3D around view schemes in the unified coordinate system; the 3D scheme overcomes shortcomings of the traditional 2D scheme, such as a small field of view and prominent deformation of ground objects. Finally, the images collected by the fisheye cameras installed around the car body can be stitched into a 360-degree panoramic image, which gives the system high application value.
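
    The pixel mapping from each fisheye camera into its quadrant of the panorama is commonly implemented as a per-pixel lookup table derived once from the calibration parameters. A minimal sketch of the lookup (warping) step only, for a single-channel image; the maps used here are placeholders for the ones the calibration would produce:

        import numpy as np

        def warp_quadrant(fisheye_img, map_x, map_y):
            """Bilinear lookup of source fisheye pixels for one panorama quadrant.
            map_x, map_y hold, for every output pixel, the source coordinates that
            calibration (intrinsics, extrinsics, ground-plane mapping) produced."""
            h, w = fisheye_img.shape
            x0 = np.clip(np.floor(map_x).astype(int), 0, w - 2)
            y0 = np.clip(np.floor(map_y).astype(int), 0, h - 2)
            fx, fy = map_x - x0, map_y - y0
            return (fisheye_img[y0, x0] * (1 - fx) * (1 - fy) +
                    fisheye_img[y0, x0 + 1] * fx * (1 - fy) +
                    fisheye_img[y0 + 1, x0] * (1 - fx) * fy +
                    fisheye_img[y0 + 1, x0 + 1] * fx * fy)

        # Synthetic grayscale fisheye frame and an illustrative map; the real maps
        # come from the calibration step described in the abstract. The panorama is
        # assembled by writing each warped quadrant (front, rear, left, right
        # camera) into its region of the output image.
        img = np.random.default_rng(0).uniform(0, 255, (480, 640))
        gy, gx = np.mgrid[0:240, 0:320].astype(float)
        quadrant = warp_quadrant(img, gx * 2.0, gy * 2.0)
        print(quadrant.shape)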

  19. Estimation of Nitrogen Vertical Distribution by Bi-Directional Canopy Reflectance in Winter Wheat

    PubMed Central

    Huang, Wenjiang; Yang, Qinying; Pu, Ruiliang; Yang, Shaoyuan

    2014-01-01

    Timely measurement of vertical foliage nitrogen distribution is critical for increasing crop yield and reducing environmental impact. In this study, a novel method with partial least square regression (PLSR) and vegetation indices was developed to determine optimal models for extracting vertical foliage nitrogen distribution of winter wheat by using bi-directional reflectance distribution function (BRDF) data. The BRDF data were collected from ground-based hyperspectral reflectance measurements recorded at the Xiaotangshan Precision Agriculture Experimental Base in 2003, 2004 and 2007. The view zenith angles (1) at nadir, 40° and 50°; (2) at nadir, 30° and 40°; and (3) at nadir, 20° and 30° were selected as optical view angles to estimate foliage nitrogen density (FND) at an upper, middle and bottom layer, respectively. For each layer, three optimal PLSR analysis models with FND as a dependent variable and two vegetation indices (nitrogen reflectance index (NRI), normalized pigment chlorophyll index (NPCI) or a combination of NRI and NPCI) at corresponding angles as explanatory variables were established. The experimental results from an independent model verification demonstrated that the PLSR analysis models with the combination of NRI and NPCI as the explanatory variables were the most accurate in estimating FND for each layer. The coefficients of determination (R2) of this model between upper layer-, middle layer- and bottom layer-derived and laboratory-measured foliage nitrogen density were 0.7335, 0.7336, 0.6746, respectively. PMID:25353983

  20. Estimation of nitrogen vertical distribution by bi-directional canopy reflectance in winter wheat.

    PubMed

    Huang, Wenjiang; Yang, Qinying; Pu, Ruiliang; Yang, Shaoyuan

    2014-10-28

    Timely measurement of vertical foliage nitrogen distribution is critical for increasing crop yield and reducing environmental impact. In this study, a novel method with partial least square regression (PLSR) and vegetation indices was developed to determine optimal models for extracting vertical foliage nitrogen distribution of winter wheat by using bi-directional reflectance distribution function (BRDF) data. The BRDF data were collected from ground-based hyperspectral reflectance measurements recorded at the Xiaotangshan Precision Agriculture Experimental Base in 2003, 2004 and 2007. The view zenith angles (1) at nadir, 40° and 50°; (2) at nadir, 30° and 40°; and (3) at nadir, 20° and 30° were selected as optical view angles to estimate foliage nitrogen density (FND) at an upper, middle and bottom layer, respectively. For each layer, three optimal PLSR analysis models with FND as a dependent variable and two vegetation indices (nitrogen reflectance index (NRI), normalized pigment chlorophyll index (NPCI) or a combination of NRI and NPCI) at corresponding angles as explanatory variables were established. The experimental results from an independent model verification demonstrated that the PLSR analysis models with the combination of NRI and NPCI as the explanatory variables were the most accurate in estimating FND for each layer. The coefficients of determination (R2) of this model between upper layer-, middle layer- and bottom layer-derived and laboratory-measured foliage nitrogen density were 0.7335, 0.7336, 0.6746, respectively.
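
    The modelling step can be reproduced in outline with an off-the-shelf PLSR implementation. A minimal sketch with synthetic data; the band positions assumed for NRI (570/670 nm) and NPCI (680/430 nm) are common literature definitions and not necessarily those used in the study:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def nri(r570, r670):   # nitrogen reflectance index (assumed definition)
            return (r570 - r670) / (r570 + r670)

        def npci(r680, r430):  # normalized pigment chlorophyll index (assumed definition)
            return (r680 - r430) / (r680 + r430)

        # Illustrative random values standing in for angular reflectance measurements
        # and laboratory-measured foliage nitrogen density (FND) of one layer.
        rng = np.random.default_rng(0)
        n = 60
        r430, r570, r670, r680 = (rng.uniform(0.03, 0.5, n) for _ in range(4))
        X = np.column_stack([nri(r570, r670), npci(r680, r430)])
        fnd = rng.uniform(0.5, 3.0, n)  # placeholder FND values

        model = PLSRegression(n_components=2).fit(X, fnd)
        print("R^2 on the training data:", model.score(X, fnd))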

  1. Interior view showing south entrance; camera facing south. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view showing south entrance; camera facing south. - Mare Island Naval Shipyard, Machine Shop, California Avenue, southwest corner of California Avenue & Thirteenth Street, Vallejo, Solano County, CA

  2. VIEW OF EAST ELEVATION; CAMERA FACING WEST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF EAST ELEVATION; CAMERA FACING WEST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  3. VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF SOUTH ELEVATION; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  4. VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF WEST ELEVATION: CAMERA FACING NORTHEAST - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  5. VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF NORTH ELEVATION; CAMERA FACING SOUTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  6. View of south elevation; camera facing northeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of south elevation; camera facing northeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  7. View of north elevation; camera facing southeast. Mare Island ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of north elevation; camera facing southeast. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  8. Oblique view of southeast corner; camera facing northwest. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Oblique view of southeast corner; camera facing northwest. - Mare Island Naval Shipyard, Defense Electronics Equipment Operating Center, I Street, terminus west of Cedar Avenue, Vallejo, Solano County, CA

  9. Contextual view of building 733; camera facing southeast. Mare ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733; camera facing southeast. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  10. MISR Stereo Imaging Distinguishes Smoke from Cloud

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These views of western Alaska were acquired by MISR on June 25, 2000 during Terra orbit 2775. The images cover an area of about 150 kilometers x 225 kilometers, and have been oriented with north to the left. The left image is from the vertical-viewing (nadir) camera, whereas the right image is a stereo 'anaglyph' that combines data from the forward-viewing 45-degree and 60-degree cameras. This image appears three-dimensional when viewed through red/blue glasses with the red filter over the left eye. It may help to darken the room lights when viewing the image on a computer screen.

    The Yukon River is seen wending its way from upper left to lower right. A forest fire in the Kaiyuh Mountains produced the long smoke plume that originates below and to the right of image center. In the nadir view, the high cirrus clouds at the top of the image and the smoke plume are similar in appearance, and the lack of vertical information makes them hard to differentiate. Viewing the righthand image with stereo glasses, on the other hand, demonstrates that the scene consists of several vertically-stratified layers, including the surface terrain, the smoke, some scattered cumulus clouds, and streaks of high, thin cirrus. This added dimensionality is one of the ways MISR data helps scientists identify and classify various components of terrestrial scenes.
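
    An anaglyph of the kind described is built by loading the first view into the red channel and the second into the green and blue (cyan) channels of an RGB composite, so that red/blue glasses send each eye a different viewing angle. A minimal sketch with synthetic stand-ins for the two co-registered views:

        import numpy as np
        import imageio.v2 as imageio  # any image I/O library would do

        # In practice, load the two co-registered views (e.g. the 45- and 60-degree
        # forward images); synthetic gradients are used here so the sketch runs.
        h, w = 256, 256
        left = np.tile(np.linspace(0, 255, w, dtype=np.uint8), (h, 1))
        right = np.roll(left, 8, axis=1)   # small horizontal shift mimics parallax

        anaglyph = np.zeros((h, w, 3), dtype=np.uint8)
        anaglyph[..., 0] = left    # red channel   <- left-eye view
        anaglyph[..., 1] = right   # green channel <- right-eye view (cyan half)
        anaglyph[..., 2] = right   # blue channel  <- right-eye view

        imageio.imwrite("anaglyph.png", anaglyph)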

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  11. Interior view of second floor lobby; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor lobby; camera facing south. - Mare Island Naval Shipyard, Hospital Headquarters, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  12. Interior view of second floor space; camera facing southwest. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor space; camera facing southwest. - Mare Island Naval Shipyard, Hospital Ward, Johnson Lane, west side at intersection of Johnson Lane & Cossey Street, Vallejo, Solano County, CA

  13. Opportunity View During Exploration in 'Duck Bay,' Sols 1506-1510 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings on the 1,506th through 1,510th Martian days, or sols, of Opportunity's mission on Mars (April 19-23, 2008). North is at the top.

    This view is presented as a vertical projection with geometric seam correction.

    The site is within an alcove called 'Duck Bay' in the western portion of Victoria Crater. Victoria Crater is about 800 meters (half a mile) wide. Opportunity had descended into the crater at the top of Duck Bay 7 months earlier. By the time the rover acquired this view, it had examined rock layers inside the rim.

    Opportunity was headed for a closer look at the base of a promontory called 'Cape Verde,' the cliff at about the 2-o'clock position of this image, before leaving Victoria. The face of Cape Verde is about 6 meters (20 feet) tall. Just clockwise from Cape Verde is the main bowl of Victoria Crater, with sand dunes at the bottom. A promontory called 'Cabo Frio,' at the southern side of Duck Bay, stands near the 6-o'clock position of the image.

  14. MISR Scans the Texas-Oklahoma Border

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These MISR images of Oklahoma and north Texas were acquired on March 12, 2000 during Terra orbit 1243. The three images on the left, from top to bottom, are from the 70-degree forward viewing camera, the vertical-viewing (nadir) camera, and the 70-degree aftward viewing camera. The higher brightness, bluer tinge, and reduced contrast of the oblique views result primarily from scattering of sunlight in the Earth's atmosphere, though some color and brightness variations are also due to differences in surface reflection at the different angles. The longer slant path through the atmosphere at the oblique angles also accentuates the appearance of thin, high-altitude cirrus clouds.

    On the right, two areas from the nadir camera image are shown in more detail, along with notations highlighting major geographic features. The south bank of the Red River marks the boundary between Texas and Oklahoma. Traversing brush-covered and grassy plains, rolling hills, and prairies, the Red River and the Canadian River are important resources for farming, ranching, public drinking water, hydroelectric power, and recreation. Both originate in New Mexico and flow eastward, their waters eventually discharging into the Mississippi River.

    A smoke plume to the north of the Ouachita Mountains and east of Lake Eufaula is visible in the detailed nadir imagery. The plume is also very obvious at the 70-degree forward view angle, to the right of center and about one-fourth of the way down from the top of the image.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  15. 1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW OF ARVFS BUNKER TAKEN FROM GROUND ELEVATION. CAMERA FACING NORTH. VIEW SHOWS PROFILE OF BUNKER IN RELATION TO NATURAL GROUND ELEVATION. TOP OF BUNKER HAS APPROXIMATELY THREE FEET OF EARTH COVER. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  16. Systems and methods for maintaining multiple objects within a camera field-of-view

    DOEpatents

    Gans, Nicholas R.; Dixon, Warren

    2016-03-15

    In one embodiment, a system and method for maintaining objects within a camera field of view include identifying constraints to be enforced, each constraint relating to an attribute of the viewed objects, identifying a priority rank for the constraints such that more important constraints have a higher priority than less important constraints, and determining the set of solutions that satisfy the constraints relative to the order of their priority rank such that solutions that satisfy lower ranking constraints are only considered viable if they also satisfy any higher ranking constraints, each solution providing an indication as to how to control the camera to maintain the objects within the camera field of view.
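
    One illustrative reading of this priority-ranked constraint handling, not the patented method itself, is a lexicographic filter that narrows the candidate set rank by rank and never lets a lower-priority constraint empty it:

        def filter_by_priority(candidate_solutions, constraints):
            """constraints: predicates ordered from highest to lowest priority.
            At each rank, keep the candidates satisfying that constraint unless
            none remain, in which case that lower-priority constraint is relaxed."""
            viable = list(candidate_solutions)
            for constraint in constraints:
                satisfying = [s for s in viable if constraint(s)]
                if satisfying:          # only narrow the set, never empty it
                    viable = satisfying
            return viable

        # Toy example: camera pan/zoom settings ranked by (1) all objects in view,
        # (2) minimum object size, (3) smallest zoom change (illustrative fields).
        solutions = [{"in_view": True, "min_size": 40, "zoom_change": 3},
                     {"in_view": True, "min_size": 25, "zoom_change": 1},
                     {"in_view": False, "min_size": 60, "zoom_change": 0}]
        ranked = [lambda s: s["in_view"],
                  lambda s: s["min_size"] >= 30,
                  lambda s: s["zoom_change"] <= 1]
        print(filter_by_priority(solutions, ranked))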

  17. Analysis of calibration accuracy of cameras with different target sizes for large field of view

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Chai, Zhiwen; Long, Changyu; Deng, Huaxia; Ma, Mengchao; Zhong, Xiang; Yu, Huan

    2018-03-01

    Visual measurement plays an increasingly important role in the fields of aerospace, ship and machinery manufacturing. Camera calibration for a large field of view is a critical part of visual measurement. A large-scale target is difficult to produce and its precision cannot be guaranteed, while a small target can be produced with high precision but yields only locally optimal solutions. Therefore, studying the most suitable ratio of the target size to the camera field of view is required to ensure the calibration precision for a wide field of view. In this paper, the cameras are calibrated by a series of checkerboard and round calibration targets of different dimensions. The ratios of the target size to the camera field of view are 9%, 18%, 27%, 36%, 45%, 54%, 63%, 72%, 81% and 90%. The target is placed at different positions in the camera field to obtain the camera parameters for each position. Then, the distribution curves of the mean reprojection error of the reconstructed feature points for the different ratios are analyzed. The experimental data demonstrate that as the ratio of the target size to the camera field of view increases, the calibration precision improves accordingly, and the mean reprojection error changes only slightly when the ratio is above 45%.

  18. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90∘ field of view in monochrome mode and 60∘ field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793000 NAC and 207000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.

  19. Interior view of second floor sleeping area; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of second floor sleeping area; camera facing south. - Mare Island Naval Shipyard, Marine Barracks, Cedar Avenue, west side between Twelfth & Fourteenth Streets, Vallejo, Solano County, CA

  20. Limb-Nadir Matching Using Non-Coincident NO2 Observations: Proof of Concept and the OMI-minus-OSIRIS Prototype Product

    NASA Technical Reports Server (NTRS)

    Adams, Cristen; Normand, Elise N.; Mclinden, Chris A.; Bourassa, Adam E.; Lloyd, Nicholas D.; Degenstein, Douglas A.; Krotkov, Nickolay A.; Rivas, Maria Belmonte; Boersma, K. Folkert; Eskes, Henk

    2016-01-01

    A variant of the limb-nadir matching technique for deriving tropospheric NO2 columns is presented in which the stratospheric component of the NO2 slant column density (SCD) measured by the Ozone Monitoring Instrument (OMI) is removed using non-coincident profiles from the Optical Spectrograph and InfraRed Imaging System (OSIRIS). In order to correct their mismatch in local time and the diurnal variation of stratospheric NO2, OSIRIS profiles, which were measured just after sunrise, were mapped to the local time of OMI observations using a photochemical box model. Following the profile time adjustment, OSIRIS NO2 stratospheric vertical column densities (VCDs) were calculated. For profiles that did not reach down to the tropopause, VCDs were adjusted using the photochemical model. Using air mass factors from the OMI Standard Product (SP), a new tropospheric NO2 VCD product - referred to as OMI-minus-OSIRIS (OmO) - was generated through limb-nadir matching. To accomplish this, the OMI total SCDs were scaled using correction factors derived from the next-generation SCDs that improve upon the spectral fitting used for the current operational products. One year, 2008, of OmO was generated for 60 deg S to 60 deg N and a cursory evaluation was performed. The OmO product was found to capture the main features of tropospheric NO2, including a background value of about 0.3 x 10(exp 15) molecules per sq cm over the tropical Pacific and values comparable to the OMI operational products over anthropogenic source areas. While additional study is required, these results suggest that a limb-nadir matching approach is feasible for the removal of stratospheric NO2 measured by a polar orbiter from a nadir-viewing instrument in a geostationary orbit such as Tropospheric Emissions: Monitoring of Pollution (TEMPO) or Sentinel-4.
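
    In outline, the limb-nadir matching arithmetic is a stratospheric subtraction: the limb-derived stratospheric vertical column is converted to a slant column with a stratospheric air mass factor, removed from the nadir total slant column, and the residual is converted to a tropospheric vertical column with a tropospheric air mass factor. A minimal sketch with illustrative numbers, not the actual OmO processing chain:

        import numpy as np

        def tropospheric_vcd(scd_total, vcd_strat, amf_strat, amf_trop):
            """Tropospheric NO2 VCD by limb-nadir matching (schematic)."""
            scd_strat = vcd_strat * amf_strat          # stratospheric slant column
            return (scd_total - scd_strat) / amf_trop  # residual -> tropospheric VCD

        # Illustrative values in molecules/cm^2; air mass factors are unitless.
        print("%.2e" % tropospheric_vcd(scd_total=9.0e15, vcd_strat=3.0e15,
                                        amf_strat=2.5, amf_trop=1.5))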

  1. View of camera station located northeast of Building 70022, facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of camera station located northeast of Building 70022, facing northwest - Naval Ordnance Test Station Inyokern, Randsburg Wash Facility Target Test Towers, Tower Road, China Lake, Kern County, CA

  2. Interior view of north wing, south wall offices; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of north wing, south wall offices; camera facing south. - Mare Island Naval Shipyard, Smithery, California Avenue, west side at California Avenue & Eighth Street, Vallejo, Solano County, CA

  3. View of main terrace with mature tree, camera facing southeast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of main terrace with mature tree, camera facing southeast - Naval Training Station, Senior Officers' Quarters District, Naval Station Treasure Island, Yerba Buena Island, San Francisco, San Francisco County, CA

  4. Contextual view of building 926 west elevation; camera facing east. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 926 west elevation; camera facing east. - Mare Island Naval Shipyard, Wilderman Hall, Johnson Lane, north side adjacent to (south of) Hospital Complex, Vallejo, Solano County, CA

  5. View of steel warehouses, building 710 north sidewalk; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of steel warehouses, building 710 north sidewalk; camera facing east. - Naval Supply Annex Stockton, Steel Warehouse Type, Between James & Humphreys Drives south of Embarcadero, Stockton, San Joaquin County, CA

  6. Interior view of hallway on second floor; camera facing south. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior view of hallway on second floor; camera facing south. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  7. Synergy Between Occultation, Limb and Nadir Satellite Data to Study Atmospheric Ozone, Aerosols and Radiation

    NASA Astrophysics Data System (ADS)

    Bhartia, P. K.; Loughman, R. P.; Ziemke, J. R.

    2017-12-01

    There is a widespread concern in the atmospheric chemistry community about the continuity of long-term datasets of ozone and related species needed to understand changes in Earth's atmospheric composition, particularly in the climate-sensitive upper tropospheric/lower stratospheric (UTLS) region. The MLS instrument on NASA's Aura satellite designed to make such measurements is now more than 13 years old. The Canadian ACE-FTS solar occultation instrument is even older, and ESA's MIPAS instrument ceased operation in 2012. There are currently no plans to replace these instruments. Yet, at the same time for some of the atmospheric composition products we are arguably entering a golden era in space-based measurements. New generation of nadir-viewing instruments operating in IR, VIS and UV wavelengths are already flying and soon there will be 3 UV/VIS instruments in geostationary orbits. The limb-viewing component of the OMPS instrument launched on the Suomi NPP satellite in 2011 is capable of measuring ozone and aerosols at 2 km vertical resolution down to about 12 km. NASA is building another copy of this instrument for launch on JPSS-2 in 2022 and there are plans to build more. The SAGE III instrument installed on the International Space Station earlier this year has restarted the venerable time series of ozone and aerosols that ended in 2005 with the demise of SAGE II. However, we argue that to make best use of these assets it is desirable to take advantage of the synergies between these instruments. Several multi-instrument tropospheric ozone products are already available. We expect continued efforts to improve these products by doing joint retrieval of limb, IR and UV nadir data. Another promising area is to combine solar occultation and limb-scattered data to produce aerosol extinction profiles at high spatial resolution, and to constrain aerosol size distribution parameters and refractive indices - an approach similar to the almucantar technique pioneered by the

  8. Earth's Radiation Belts: The View from Juno's Cameras

    NASA Astrophysics Data System (ADS)

    Becker, H. N.; Joergensen, J. L.; Hansen, C. J.; Caplinger, M. A.; Ravine, M. A.; Gladstone, R.; Versteeg, M. H.; Mauk, B.; Paranicas, C.; Haggerty, D. K.; Thorne, R. M.; Connerney, J. E.; Kang, S. S.

    2013-12-01

    Juno's cameras, particle instruments, and ultraviolet imaging spectrograph have been heavily shielded for operation within Jupiter's high radiation environment. However, varying quantities of >1-MeV electrons and >10-MeV protons will be energetic enough to penetrate instrument shielding and be detected as transient background signatures by the instruments. The differing shielding profiles of Juno's instruments lead to differing spectral sensitivities to penetrating electrons and protons within these regimes. This presentation will discuss radiation data collected by Juno in the Earth's magnetosphere during Juno's October 9, 2013 Earth flyby (559 km altitude at closest approach). The focus will be data from Juno's Stellar Reference Unit, Advanced Stellar Compass star cameras, and JunoCam imager acquired during coordinated proton measurements within the inner zone and during the spacecraft's inbound and outbound passages through the outer zone (L ~3-5). The background radiation signatures from these cameras will be correlated with dark count background data collected at these geometries by Juno's Ultraviolet Spectrograph (UVS) and Jupiter Energetic Particle Detector Instrument (JEDI). Further comparison will be made to Van Allen Probe data to calibrate Juno's camera results and contribute an additional view of the Earth's radiation environment during this unique event.

  9. Contextual view of building 733 along Cedar Avenue; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of building 733 along Cedar Avenue; camera facing southwest. - Mare Island Naval Shipyard, WAVES Officers Quarters, Cedar Avenue, west side between Tisdale Avenue & Eighth Street, Vallejo, Solano County, CA

  10. MTR STACK, TRA710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR STACK, TRA-710, CONTEXTUAL VIEW, CAMERA FACING SOUTH. PERIMETER SECURITY FENCE AND SECURITY LIGHTING IN VIEW AT LEFT. INL NEGATIVE NO. HD52-1-1. Mike Crane, Photographer, 5/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. View of Crew Commander Henry Hartsfield Jr. loading film into IMAX camera

    NASA Image and Video Library

    1984-09-08

    41D-11-004 (8 September 1984) --- View of Crew Commander Henry Hartsfield Jr. loading film into the IMAX camera during the 41-D mission. The camera is floating in front of the middeck lockers. Above it is a sticker of the University of Kansas mascot, the Jayhawk.

  12. INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF FIRST STORY SPACE SHOWING CONCRETE BEAMS; CAMERA FACING NORTH - Mare Island Naval Shipyard, Transportation Building & Gas Station, Third Street, south side between Walnut Avenue & Cedar Avenue, Vallejo, Solano County, CA

  13. Feasibility study of transmission of OTV camera control information in the video vertical blanking interval

    NASA Technical Reports Server (NTRS)

    White, Preston A., III

    1994-01-01

    The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

  14. Backing collisions: a study of drivers' eye and backing behaviour using combined rear-view camera and sensor systems.

    PubMed

    Hurwitz, David S; Pradhan, Anuj; Fisher, Donald L; Knodler, Michael A; Muttart, Jeffrey W; Menon, Rajiv; Meissner, Uwe

    2010-04-01

    Backing crash injuries can be severe; approximately 200 of the 2,500 reported injuries of this type per year to children under the age of 15 years result in death. Technology for assisting drivers when backing has limited success in preventing backing crashes. Two questions are addressed: Why is the reduction in backing crashes moderate when rear-view cameras are deployed? Could rear-view cameras augment sensor systems? 46 drivers (36 experimental, 10 control) completed 16 parking trials over 2 days (eight trials per day). Experimental participants were provided with a sensor camera system; controls were not. Three crash scenarios were introduced. Parking facility at UMass Amherst, USA. 46 drivers (33 men, 13 women), average age 29 years, who were Massachusetts residents licensed within the USA for an average of 9.3 years. Vehicles were equipped with a rear-view camera and sensor system-based parking aid. Subjects' eye fixations while driving and researchers' observation of collisions with objects during backing were measured. Only 20% of drivers looked at the rear-view camera before backing, and 88% of those did not crash. Of those who did not look at the rear-view camera before backing, 46% looked after the sensor warned the driver. This study indicates that drivers not only attend to an audible warning, but will look at a rear-view camera if available. Evidence suggests that when used appropriately, rear-view cameras can mitigate the occurrence of backing crashes, particularly when paired with an appropriate sensor system.

  15. PROCESS WATER BUILDING, TRA605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. CONTEXTUAL VIEW, CAMERA FACING SOUTHEAST. PROCESS WATER BUILDING AND ETR STACK ARE IN LEFT HALF OF VIEW. TRA-666 IS NEAR CENTER, ABUTTED BY SECURITY BUILDING; TRA-626, AT RIGHT EDGE OF VIEW BEHIND BUS. INL NEGATIVE NO. HD46-34-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. Optimal directional view angles for remote-sensing missions

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Holben, B. N.; Tucker, C. J.; Newcomb, W. W.

    1984-01-01

    The present investigation is concerned with the directional, off-nadir viewing of terrestrial scenes using remote-sensing systems from aircraft and satellite platforms, taking into account advantages of such an approach over strictly nadir viewing systems. Directional reflectance data collected for bare soil and several different vegetation canopies in NOAA-7 AVHRR bands 1 and 2 were analyzed. Optimum view angles were recommended for two strategies. The first strategy views the utility of off-nadir measurements as extending spatial and temporal coverage of the target area. The second strategy views the utility of off-nadir measurements as providing additional information about the physical characteristics of the target. Conclusions regarding the two strategies are discussed.

  17. 7. DETAIL VIEW OF FIGUEROA STREET VIADUCT. SAME CAMERA POSITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. DETAIL VIEW OF FIGUEROA STREET VIADUCT. SAME CAMERA POSITION AS CA-265-J-8. LOOKING 266°W. - Arroyo Seco Parkway, Figueroa Street Viaduct, Spanning Los Angeles River, Los Angeles, Los Angeles County, CA

  18. Opportunity's View After Long Drive on Sol 1770 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings just after driving 104 meters (341 feet) on the 1,770th Martian day, or sol, of Opportunity's surface mission (January 15, 2009).

    This view is presented as a vertical projection with geometric seam correction. North is at the top.

    Tracks from the drive extend northward across dark-toned sand ripples and light-toned patches of exposed bedrock in the Meridiani Planum region of Mars. For scale, the distance between the parallel wheel tracks is about 1 meter (about 40 inches).

    Prior to the Sol 1770 drive, Opportunity had driven less than a meter since Sol 1713 (November 17, 2008), while it used the tools on its robotic arm first to examine a meteorite called 'Santorini' during weeks of restricted communication while the sun was nearly in line between Mars and Earth, then to examine bedrock and soil targets near Santorini.

    The rover's position after the Sol 1770 drive was about 1.1 kilometer (two-thirds of a mile) south southwest of Victoria Crater. Cumulative odometry was 13.72 kilometers (8.53 miles) since landing in January 2004, including 1.94 kilometers (1.21 miles) since climbing out of Victoria Crater on the west side of the crater on Sol 1634 (August 28, 2008).

  19. 1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY. CAMERA FACING NORTHEAST. ON RIGHT OF VIEW IS PART OF EARTH/GRAVEL SHIELDING FOR BIN SET. AERIAL STRUCTURE MOUNTED ON POLES IS PNEUMATIC TRANSFER SYSTEM FOR DELIVERY OF SAMPLES BEING SENT FROM NEW WASTE CALCINING FACILITY TO THE CPP REMOTE ANALYTICAL LABORATORY. INEEL PROOF NUMBER HD-17-1. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  20. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  1. 2. View from same camera position facing 232 degrees southwest ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. View from same camera position facing 232 degrees southwest showing abandoned section of old grade - Oak Creek Administrative Center, One half mile east of Zion-Mount Carmel Highway at Oak Creek, Springdale, Washington County, UT

  2. [Survival is associated with time to reach PSA nadir (DAN) and the ratio DAN/nadir value after androgen deprivation for prostate cancer].

    PubMed

    Gagnat, A; Larré, S; Fromont, G; Pirès, C; Doré, B; Irani, J

    2011-05-01

    The objective of this study was to assess the prognostic value of the rate of PSA decrease in patients treated with androgen suppression (AS) for prostate cancer (PCa). We identified in our database patients with histologically documented PCa, treated with AS alone, and for whom vital status with a minimum follow-up of 6 months (except earlier death) was established. Patient and PCa characteristics, baseline PSA, PSA nadir, time to reach the PSA nadir (DAN) and the DAN/nadir ratio were analyzed in relation to progression-free, specific and overall survival. One hundred ninety-eight patients met the inclusion criteria and the median follow-up was 61.5 months (range 4.8 to 233). The median PSA at the start of AS was 37.1 ng/mL and the median PSA nadir was 0.48 ng/mL. The median time to progression was 23.6 months. The median specific and overall survivals were 94 and 78 months, respectively. In univariate analysis, predictors of progression-free survival were PSA before AS, PSA nadir, DAN, the DAN/nadir ratio, Gleason score, the percentage of positive prostate biopsy cores and bone scintigraphy status. Except for PSA before AS, which was no longer significant, predictors of specific and overall survival were similar, with the addition of the biochemical response (decrease of more than 50% of PSA) to a second hormonal manipulation during biological progression. In multivariate analysis, the PSA nadir and the DAN/nadir ratio remained significant predictors. These results confirmed, on the one hand, the predictive value of DAN for survival in patients under AS for PCa: reaching the PSA nadir faster was associated with shorter survival. On the other hand, they introduced the new concept of the DAN/nadir PSA ratio, which provides independent prognostic information. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  3. Backing collisions: a study of drivers’ eye and backing behaviour using combined rear-view camera and sensor systems

    PubMed Central

    Hurwitz, David S; Pradhan, Anuj; Fisher, Donald L; Knodler, Michael A; Muttart, Jeffrey W; Menon, Rajiv; Meissner, Uwe

    2012-01-01

    Context: Backing crash injuries can be severe; approximately 200 of the 2,500 reported injuries of this type per year to children under the age of 15 years result in death. Technology for assisting drivers when backing has limited success in preventing backing crashes. Objectives: Two questions are addressed: Why is the reduction in backing crashes moderate when rear-view cameras are deployed? Could rear-view cameras augment sensor systems? Design: 46 drivers (36 experimental, 10 control) completed 16 parking trials over 2 days (eight trials per day). Experimental participants were provided with a sensor camera system; controls were not. Three crash scenarios were introduced. Setting: Parking facility at UMass Amherst, USA. Subjects: 46 drivers (33 men, 13 women), average age 29 years, who were Massachusetts residents licensed within the USA for an average of 9.3 years. Interventions: Vehicles equipped with a rear-view camera and sensor system-based parking aid. Main Outcome Measures: Subjects' eye fixations while driving and researchers' observation of collisions with objects during backing. Results: Only 20% of drivers looked at the rear-view camera before backing, and 88% of those did not crash. Of those who did not look at the rear-view camera before backing, 46% looked after the sensor warned the driver. Conclusions: This study indicates that drivers not only attend to an audible warning, but will look at a rear-view camera if available. Evidence suggests that when used appropriately, rear-view cameras can mitigate the occurrence of backing crashes, particularly when paired with an appropriate sensor system. PMID:20363812

  4. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  5. 10. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA CARSTEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. Initial inflight calibration for Hayabusa2 optical navigation camera (ONC) for science observations of asteroid Ryugu

    NASA Astrophysics Data System (ADS)

    Suzuki, H.; Yamada, M.; Kouyama, T.; Tatsumi, E.; Kameda, S.; Honda, R.; Sawada, H.; Ogawa, N.; Morota, T.; Honda, C.; Sakatani, N.; Hayakawa, M.; Yokota, Y.; Yamamoto, Y.; Sugita, S.

    2018-01-01

    Hayabusa2, the first sample return mission to a C-type asteroid, was launched by the Japan Aerospace Exploration Agency (JAXA) on December 3, 2014 and will arrive at the asteroid in the middle of 2018 to collect samples from its surface, which may contain both hydrated minerals and organics. The optical navigation camera (ONC) system on board Hayabusa2 consists of three individual framing CCD cameras: ONC-T for a telescopic nadir view, ONC-W1 for a wide-angle nadir view, and ONC-W2 for a wide-angle slant view, which will be used to observe the surface of Ryugu. The cameras will be used to measure the global asteroid shape, local morphologies, and visible spectroscopic properties. Thus, image data obtained by the ONC will provide essential information to select landing (sampling) sites on the asteroid. This study reports the results of initial inflight calibration based on observations of the Earth, Mars, the Moon, and stars to verify and characterize the optical performance of the ONC, such as flat-field sensitivity, spectral sensitivity, point-spread function (PSF), distortion, and stray light of ONC-T, and distortion for ONC-W1 and W2. We found some potential problems that may influence our science observations. These include changes in flat-field sensitivity for all bands from those measured in the pre-flight calibration and the existence of stray light that arises under certain conditions of spacecraft attitude with respect to the sun. Countermeasures for these problems were evaluated using data obtained during the initial in-flight calibration. The results of our inflight calibration indicate that the error of spectroscopic measurements around 0.7 μm using the 0.55, 0.70, and 0.86 μm bands of ONC-T can be lower than 0.7% after these countermeasures and pixel binning. This result suggests that our ONC-T would be able to detect the typical strength (∼3%) of the serpentine absorption band often found on CM chondrites and low albedo asteroids with ≥ 4
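
    Changes in flat-field sensitivity enter through the usual radiometric correction, in which each raw frame is dark-subtracted and divided by a normalized flat field for the relevant band. A minimal, generic sketch of that step, not the ONC pipeline itself:

        import numpy as np

        def flat_field_correct(raw, dark, flat):
            """Basic radiometric correction: subtract the dark/bias frame and divide
            by the flat field normalized to unit mean, guarding against dead pixels."""
            flat_norm = flat / np.mean(flat)
            flat_norm = np.where(flat_norm > 0.1, flat_norm, 1.0)  # avoid divide-by-~0
            return (raw.astype(float) - dark) / flat_norm

        # Illustrative synthetic frames.
        rng = np.random.default_rng(1)
        raw = rng.normal(1000, 30, (64, 64))
        dark = np.full((64, 64), 100.0)
        flat = 1.0 + 0.05 * rng.standard_normal((64, 64))
        print(flat_field_correct(raw, dark, flat).mean())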

  7. 'Endurance' Untouched (vertical)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This navigation camera mosaic, created from images taken by NASA's Mars Exploration Rover Opportunity on sols 115 and 116 (May 21 and 22, 2004) provides a dramatic view of 'Endurance Crater.' The rover engineering team carefully plotted the safest path into the football field-sized crater, eventually easing the rover down the slopes around sol 130 (June 12, 2004). To the upper left of the crater sits the rover's protective heatshield, which sheltered Opportunity as it passed through the martian atmosphere. The 360-degree view is presented in a vertical projection, with geometric and radiometric seam correction.

  8. 3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. CONTEXTUAL VIEW OF WASTE CALCINING FACILITY, CAMERA FACING NORTHEAST. SHOWS RELATIONSHIP BETWEEN DECONTAMINATION ROOM, ADSORBER REMOVAL HATCHES (FLAT ON GRADE), AND BRIDGE CRANE. INEEL PROOF NUMBER HD-17-2. - Idaho National Engineering Laboratory, Old Waste Calcining Facility, Scoville, Butte County, ID

  9. ETR CRITICAL FACILITY, TRA654. CONTEXTUAL VIEW. CAMERA ON ROOF OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR CRITICAL FACILITY, TRA-654. CONTEXTUAL VIEW. CAMERA ON ROOF OF MTR BUILDING AND FACING SOUTH. ETR AND ITS COOLANT BUILDING AT UPPER PART OF VIEW. ETR COOLING TOWER NEAR TOP EDGE OF VIEW. EXCAVATION AT CENTER IS FOR ETR CF. CENTER OF WHICH WILL CONTAIN POOL FOR REACTOR. NOTE CHOPPER TUBE PROCEEDING FROM MTR IN LOWER LEFT OF VIEW, DIAGONAL TOWARD LEFT. INL NEGATIVE NO. 56-4227. Jack L. Anderson, Photographer, 12/18/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of imaging also facades in built up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi platform photogrammetry the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by program system BLUH. With 4.1 million image points in 314 images respectively 3.9 million image points in 248 images a dense matching was provided by Pix4Dmapper. With up to 19 respectively 29 images per object point the images are well connected, nevertheless the high number of images per object point are concentrated to the block centres while the inclined images outside the block centre are satisfying but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters or in other words, additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5μm even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects of the image corners are limited but still available. Remarkable angular affine systematic image errors can be seen especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but it can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With exception of the angular affinity the systematic image errors for corresponding

  11. Stereo matching and view interpolation based on image domain triangulation.

    PubMed

    Fickel, Guilherme Pinto; Jung, Claudio R; Malzbender, Tom; Samadani, Ramin; Culbertson, Bruce

    2013-09-01

    This paper presents a new approach for stereo matching and view interpolation problems based on triangular tessellations suitable for a linear array of rectified cameras. The domain of the reference image is initially partitioned into triangular regions using edge and scale information, aiming to place vertices along image edges and increase the number of triangles in textured regions. A region-based matching algorithm is then used to find an initial disparity for each triangle, and a refinement stage is applied to change the disparity at the vertices of the triangles, generating a piecewise linear disparity map. A simple post-processing procedure is applied to connect triangles with similar disparities generating a full 3D mesh related to each camera (view), which are used to generate new synthesized views along the linear camera array. With the proposed framework, view interpolation reduces to the trivial task of rendering polygonal meshes, which can be done very fast, particularly when GPUs are employed. Furthermore, the generated views are hole-free, unlike most point-based view interpolation schemes that require some kind of post-processing procedures to fill holes.
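
    The piecewise-linear disparity map described here amounts to linear interpolation of per-vertex disparities over a triangulation of the image domain. A minimal sketch using SciPy, with synthetic vertices and disparities standing in for the edge/scale-driven vertex placement and the region-based matching:

        import numpy as np
        from scipy.spatial import Delaunay
        from scipy.interpolate import LinearNDInterpolator

        # Synthetic stand-ins: vertices scattered over the image domain and a
        # disparity value per vertex (normally produced by region-based matching).
        rng = np.random.default_rng(0)
        h, w = 120, 160
        vertices = np.column_stack([rng.uniform(0, w, 300), rng.uniform(0, h, 300)])
        vertex_disp = 10 + 0.05 * vertices[:, 0] + rng.normal(0, 0.5, 300)

        tri = Delaunay(vertices)                          # image-domain triangulation
        interp = LinearNDInterpolator(tri, vertex_disp)   # piecewise-linear over triangles

        xx, yy = np.meshgrid(np.arange(w), np.arange(h))
        disparity_map = interp(xx, yy)                    # NaN outside the convex hull
        print(np.nanmin(disparity_map), np.nanmax(disparity_map))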

  12. A&M. Guard house (TAN638), contextual view. Built in 1968. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. Guard house (TAN-638), contextual view. Built in 1968. Camera faces south. Guard house controlled access to radioactive waste storage tanks beyond and to left of view. Date: February 4, 2003. INEEL negative no. HD-33-4-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  13. A multi-camera system for real-time pose estimation

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin

    2007-04-01

    This paper presents a multi-camera system that performs face detection and pose estimation in real-time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates about the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
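
    A simplified geometric illustration (not the authors' exact equations) of how yaw can be recovered when the head is modeled as a sphere rotating about the vertical axis: under a weak-perspective projection, the projected horizontal offsets of the two eyes combine so that their sum and difference isolate sin(yaw) and cos(yaw). The head radius and eye azimuth below are assumed values.

        import numpy as np

        # Assumed geometry: eyes sit at azimuth +/- phi on a head sphere of radius R (mm);
        # the camera is distant enough for a weak-perspective (orthographic) projection.
        R, phi = 90.0, np.deg2rad(22.0)
        true_yaw = np.deg2rad(15.0)

        # Projected horizontal eye coordinates relative to the head centre
        x_left = R * np.sin(true_yaw - phi)
        x_right = R * np.sin(true_yaw + phi)

        # Under this model:  x_left + x_right = 2 R sin(yaw) cos(phi)
        #                    x_right - x_left = 2 R cos(yaw) sin(phi)
        # so yaw follows from their ratio and the (known) eye azimuth phi.
        est_yaw = np.arctan((x_left + x_right) / (x_right - x_left) * np.tan(phi))
        print("estimated yaw: %.1f deg" % np.rad2deg(est_yaw))   # ~15 deg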

  14. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.

    PubMed

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-06-06

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors needed to guarantee full-view coverage of a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons defined by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward a deployment pattern algorithm (DPA) for the deterministic implementation. To reduce the redundancy of random deployment, we come up with a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions.
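
    A minimal sketch of the deterministic-deployment idea: tile the ROI with a hexagonal (triangular) lattice of candidate camera positions whose spacing plays the role of the virtual grid length. The ROI size and spacing below are placeholders, and this generates only the lattice, not the full DPA with its optimal grid length.

        import numpy as np

        def hex_lattice(width, height, spacing):
            """Candidate camera positions on a hexagonal (triangular) lattice covering a rectangle."""
            positions = []
            row_step = spacing * np.sqrt(3.0) / 2.0
            j, y = 0, 0.0
            while y <= height:
                x = (spacing / 2.0) if (j % 2) else 0.0   # offset every other row
                while x <= width:
                    positions.append((x, y))
                    x += spacing
                j += 1
                y = j * row_step
            return np.array(positions)

        # Placeholder ROI (100 m x 60 m) and grid length (assumed, not the paper's optimum)
        pts = hex_lattice(100.0, 60.0, spacing=12.0)
        print("%d candidate positions generated" % len(pts))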

  15. [Study of hyperspectral polarized reflectance of vegetation canopy at nadir viewing direction].

    PubMed

    Lŭ, Yun-Feng

    2013-04-01

    In the present study, corn canopy was the object of investigation. First, the polarization of the corn canopy was analyzed on the basis of the polarization reflection mechanism; then, the polarization of the canopy was measured at nadir at different growth stages before heading. The results confirmed the theoretical derivation that light reflected from the corn canopy is polarized, and showed that polarized light accounts for up to 10% of the total reflection. This indicates that polarization measurements provide auxiliary information for remote sensing, and also that surface polarization effects should be considered when polarization information is used to retrieve atmospheric parameters.
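
    A minimal sketch of how a degree of linear polarization (DoLP) can be derived from canopy radiance measured through a linear polarizer at three angles (0°, 60°, 120°); the radiance values are placeholders and this generic three-angle Stokes formulation is not specific to the instrument used in the study.

        import numpy as np

        # Placeholder canopy radiances measured through a linear polarizer at 0, 60 and 120 degrees
        I0, I60, I120 = 105.0, 98.0, 96.0

        # Three-angle Stokes retrieval (linear polarization only):
        # I_alpha = 0.5 * (I + Q*cos(2*alpha) + U*sin(2*alpha))
        I = 2.0 / 3.0 * (I0 + I60 + I120)
        Q = 2.0 / 3.0 * (2.0 * I0 - I60 - I120)
        U = 2.0 / np.sqrt(3.0) * (I60 - I120)

        dolp = np.sqrt(Q**2 + U**2) / I
        print("degree of linear polarization: %.1f %%" % (100.0 * dolp))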

  16. Using Nadir and Directional Emissivity as a Probe of Particle Microphysical Properties

    NASA Astrophysics Data System (ADS)

    Pitman, Karly M.; Wolff, Michael J.; Bandfield, Joshua L.; Clayton, Geoffrey C.

    Real surfaces are not expected to be diffuse emitters, thus observed emissivity values of surface dust deposits are a function of viewing geometry. Attempts to model infrared emission spectral profiles of surface dust deposits at nadir have not yet matured to match the sophistication of astrophysical dust radiative transfer codes. In the absence of strong thermal gradients, directional emissivity may be obtained theoretically via a combination of reciprocity and Kirchhoff's Law. Owing to a lack of laboratory data on directional emissivity for comparison, theorists have not explored the potential utility of directional emissivity as a direct probe of surface dust microphysical properties. Motivated by future analyses of MGS/TES emission phase function (EPF) sequences and the upcoming Mars Exploration Rover mini-TES dataset, we explore the effects of dust particle size and composition on observed radiances at nadir and off-nadir geometries in the TES spectral regime using a combination of multiple scattering radiative transfer and Mie scattering algorithms. Comparisons of these simulated spectra to laboratory spectra of standard mineral assemblages will also be made. This work is supported through NASA grant NAGS-9820 (MJW) and LSU Board of Regents (KMP).
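
    Stated in standard radiometric notation (not quoted from the abstract), the reciprocity/Kirchhoff route gives the directional spectral emissivity from the directional-hemispherical reflectance, which is itself an integral of the BRDF over the outgoing hemisphere:

        \varepsilon(\theta,\lambda) \;=\; 1 - r_{dh}(\theta,\lambda),
        \qquad
        r_{dh}(\theta,\lambda) \;=\; \int_{2\pi} f_r(\theta,\theta',\phi';\lambda)\,\cos\theta'\,\mathrm{d}\omega'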

  17. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  18. HOT CELL BUILDING, TRA632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    HOT CELL BUILDING, TRA-632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA FACING EASTERLY. HOT CELL BUILDING IS AT CENTER LEFT OF VIEW; THE LOW-BAY PROJECTION WITH LADDER IS THE TEST TRAIN ASSEMBLY FACILITY, ADDED IN 1968. MTR BUILDING IS IN LEFT OF VIEW. HIGH-BAY BUILDING AT RIGHT IS THE ENGINEERING TEST REACTOR BUILDING, TRA-642. INL NEGATIVE NO. HD46-32-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  19. Mars Orbiter Camera Views the 'Face on Mars' - Calibrated, contrast enhanced, filtered,

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long. Image processing has been applied to the images in order to improve the visibility of features. This processing included the following steps:

    The image was processed to remove the sensitivity differences between adjacent picture elements (calibrated). This removes the vertical streaking.

    The contrast and brightness of the image was adjusted, and 'filters' were applied to enhance detail at several scales.

    The image was then geometrically warped to meet the computed position information for a mercator-type map. This corrected for the left-right flip, and the non-vertical viewing angle (about 45° from vertical), but also introduced some vertical 'elongation' of the image for the same reason Greenland looks larger than Africa on a mercator map of the Earth.

    A section of the image, containing the 'Face

  20. Mars Orbiter Camera Views the 'Face on Mars' - Calibrated, contrast enhanced, filtered

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long. Image processing has been applied to the images in order to improve the visibility of features. This processing included the following steps:

    The image was processed to remove the sensitivity differences between adjacent picture elements (calibrated). This removes the vertical streaking.

    The contrast and brightness of the image was adjusted, and 'filters' were applied to enhance detail at several scales.

    The image was then geometrically warped to meet the computed position information for a mercator-type map. This corrected for the left-right flip, and the non-vertical viewing angle (about 45° from vertical), but also introduced some vertical 'elongation' of the image for the same reason Greenland looks larger than Africa on a mercator map of the Earth.

    A section of the image, containing the 'Face

  1. Aerial multi-camera systems: Accuracy and block triangulation issues

    NASA Astrophysics Data System (ADS)

    Rupnik, Ewelina; Nex, Francesco; Toschi, Isabella; Remondino, Fabio

    2015-03-01

    Oblique photography has reached its maturity and has now been adopted for several applications. The number and variety of multi-camera oblique platforms available on the market is continuously growing. So far, few attempts have been made to study the influence of the additional cameras on the behaviour of the image block and comprehensive revisions to existing flight patterns are yet to be formulated. This paper looks into the precision and accuracy of 3D points triangulated from diverse multi-camera oblique platforms. Its coverage is divided into simulated and real case studies. Within the simulations, different imaging platform parameters and flight patterns are varied, reflecting both current market offerings and common flight practices. Attention is paid to the aspect of completeness in terms of dense matching algorithms and 3D city modelling - the most promising application of such systems. The experimental part demonstrates the behaviour of two oblique imaging platforms in real-world conditions. A number of Ground Control Point (GCP) configurations are adopted in order to point out the sensitivity of tested imaging networks and arising block deformations. To stress the contribution of slanted views, all scenarios are compared against a scenario in which exclusively nadir images are used for evaluation.

  2. REACTOR SERVICE BUILDING, TRA635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REACTOR SERVICE BUILDING, TRA-635, CONTEXTUAL VIEW DURING CONSTRUCTION. CAMERA IS ATOP MTR BUILDING AND LOOKING SOUTHERLY. FOUNDATION AND DRAINS ARE UNDER CONSTRUCTION. THE BUILDING WILL BUTT AGAINST CHARGING FACE OF PLUG STORAGE BUILDING. HOT CELL BUILDING, TRA-632, IS UNDER CONSTRUCTION AT TOP CENTER OF VIEW. INL NEGATIVE NO. 8518. Unknown Photographer, 8/25/1953 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  3. Mission Report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)

    NASA Technical Reports Server (NTRS)

    Mollberg, Bernard H.; Schardt, Bruton B.

    1988-01-01

    The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

  4. PBF Cooling Tower contextual view. Camera facing southwest. West wing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Cooling Tower contextual view. Camera facing southwest. West wing and north facade (rear) of Reactor Building (PER-620) is at left; Cooling Tower to right. Photographer: Kirsh. Date: November 2, 1970. INEEL negative no. 70-4913 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  5. Oil Fire Plumes Over Baghdad

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Dark smoke from oil fires extends for about 60 kilometers south of Iraq's capital city of Baghdad in these images acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 2, 2003. The thick, almost black smoke is apparent near image center and contains chemical and particulate components hazardous to human health and the environment.

    The top panel is from MISR's vertical-viewing (nadir) camera. Vegetated areas appear red here because this display is constructed using near-infrared, red and blue band data, displayed as red, green and blue, respectively, to produce a false-color image. The bottom panel is a combination of two camera views of the same area and is a 3-D stereo anaglyph in which red band nadir camera data are displayed as red, and red band data from the 60-degree backward-viewing camera are displayed as green and blue. Both panels are oriented with north to the left in order to facilitate stereo viewing. Viewing the 3-D anaglyph with red/blue glasses (with the red filter placed over the left eye and the blue filter over the right) makes it possible to see the rising smoke against the surface terrain. This technique helps to distinguish features in the atmosphere from those on the surface. In addition to the smoke, several high, thin cirrus clouds (barely visible in the nadir view) are readily observed using the stereo image.
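
    A minimal sketch of how such a red/cyan anaglyph can be assembled from two co-registered single-band images, with the nadir red band driving the red channel and the 60-degree backward-view red band driving green and blue. The arrays here are synthetic placeholders standing in for the real MISR bands.

        import numpy as np
        from PIL import Image

        # Placeholder inputs: co-registered 8-bit red-band images from the nadir camera and
        # the 60-degree backward-viewing camera (synthetic arrays; replace with real data).
        nadir = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
        backward = np.random.randint(0, 256, (512, 512), dtype=np.uint8)

        # Red channel from the nadir view; green and blue channels from the backward view.
        anaglyph = np.dstack([nadir, backward, backward])
        Image.fromarray(anaglyph).save("anaglyph_red_cyan.png")
        print("anaglyph shape:", anaglyph.shape)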

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude. These data products were generated from a portion of the imagery acquired during Terra orbit 17489. The panels cover an area of about 187 kilometers x 123 kilometers, and use data from blocks 63 to 65 within World Reference System-2 path 168.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory,Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight

  6. 26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. VIEW OF METAL SHED OVER SHIELDING TANK WITH CAMERA FACING SOUTHWEST. SHOWS OPEN SIDE OF SHED ROOF, HERCULON SHEET, AND HAND-OPERATED CRANE. TAKEN IN 1983. INEL PHOTO NUMBER 83-476-2-9, TAKEN IN 1983. PHOTOGRAPHER NOT NAMED. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  7. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks

    PubMed Central

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-01-01

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors needed to guarantee full-view coverage of a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons defined by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward a deployment pattern algorithm (DPA) for the deterministic implementation. To reduce the redundancy of random deployment, we come up with a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions. PMID:28587304

  8. IET. Aerial view of project, 95 percent complete. Camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of project, 95 percent complete. Camera facing east. Left to right: stack, duct, mobile test cell building (TAN-624), four-rail track, dolly. Retaining wall between mobile test building and shielded control building (TAN-620) just beyond. North of control building are tank building (TAN-627) and fuel-transfer pump building (TAN-625). Guard house at upper right along exclusion fence. Construction vehicles and temporary warehouse in view near guard house. Date: June 6, 1955. INEEL negative no. 55-1462 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. The lenses were not mounted on the sensors during radiometric response calibration, to eliminate the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, the non-linearity response characteristics, the sensitivity, and the dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used when blending images, so that the panoramas reflect the actual scene luminance more faithfully. This compensates for the limitation of stitching methods that achieve realism only through smoothing. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens. The dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
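
    A minimal sketch of the correction chain described above, applied before blending: invert a (calibrated) radiometric response to recover luminance and divide out a vignetting gain map. The linear response model, vignetting model and frame are illustrative placeholders, not the calibration results of this camera.

        import numpy as np

        def to_luminance(dn, gain=0.85, offset=3.0):
            """Invert an assumed linear radiometric response DN = gain * L + offset."""
            return (dn.astype(np.float64) - offset) / gain

        def devignette(lum, strength=0.35):
            """Divide by an assumed radially symmetric vignetting gain map."""
            h, w = lum.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(xx - w / 2.0, yy - h / 2.0) / np.hypot(w / 2.0, h / 2.0)
            return lum / (1.0 - strength * r**2)

        # Placeholder 8-bit frame from one of the seven sensor modules
        frame = np.random.randint(0, 256, size=(480, 640)).astype(np.uint8)
        corrected = devignette(to_luminance(frame))
        print("corrected luminance range: %.1f .. %.1f" % (corrected.min(), corrected.max()))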

  10. A study on obstacle detection method of the frontal view using a camera on highway

    NASA Astrophysics Data System (ADS)

    Nguyen, Van-Quang; Park, Jeonghyeon; Seo, Changjun; Kim, Heungseob; Boo, Kwangsuck

    2018-03-01

    In this work, we introduce an approach to detecting vehicles for a driver assistance or warning system. A driver assistance system must detect both lanes (the left- and right-side lanes) and discover vehicles ahead of the test vehicle. Therefore, in this study, we use a camera installed on the windscreen of the test vehicle. Images from the camera are used to detect three lanes and to detect multiple vehicles. For lane detection, line detection and vanishing point estimation are used. For vehicle detection, we combine horizontal and vertical edge detection: horizontal edges are used to detect vehicle candidates, and vertical edge detection is then used to verify the candidates. The proposed algorithm works with a 480 × 640 image frame resolution. The system was tested on a highway in Korea.
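
    A minimal OpenCV sketch of the two ingredients mentioned above: line detection for lane candidates and separate horizontal/vertical edge maps for generating and verifying vehicle candidates. The synthetic frame, thresholds and kernel sizes are placeholders, not the paper's tuned values, and the candidate/verification step is simplified to a per-pixel combination.

        import cv2
        import numpy as np

        # Synthetic stand-in for a 640x480 windscreen-camera frame (replace with a real image)
        frame = np.full((480, 640), 90, np.uint8)
        cv2.line(frame, (120, 479), (300, 250), 255, 3)       # fake left lane marking
        cv2.line(frame, (520, 479), (340, 250), 255, 3)       # fake right lane marking
        cv2.rectangle(frame, (280, 200), (360, 260), 30, -1)  # fake vehicle ahead

        # Lane candidates: Canny edges followed by a probabilistic Hough transform.
        edges = cv2.Canny(frame, 80, 160)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180.0, threshold=40,
                                minLineLength=40, maxLineGap=10)

        # Vehicle candidates: strong horizontal edges first, vertical edges to verify them.
        horiz = cv2.convertScaleAbs(cv2.Sobel(frame, cv2.CV_32F, 0, 1, ksize=3))
        vert = cv2.convertScaleAbs(cv2.Sobel(frame, cv2.CV_32F, 1, 0, ksize=3))
        verified = (horiz > 120) & (vert > 120)

        print("lane segments: %d, verified edge pixels: %d"
              % (0 if lines is None else len(lines), int(verified.sum())))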

  11. 19. VIEW SOUTHWEST OF INTERMEDIATE VERTICAL PENNSYLVANIA PETIT TRUSS WITH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. VIEW SOUTHWEST OF INTERMEDIATE VERTICAL PENNSYLVANIA PETIT TRUSS WITH CASTLE ROCK IN BACKGROUND. JUNCTION OF INTERMEDIATE VERTICAL AND TOP CHORD WITH STABILIZING LATERAL STRUT ABOVE AND SWAY STRUT BELOW. ORIGINAL PAIRED DIAGONAL EYE BARS LATER REINFORCED WITH TIE ROD - New River Bridge, Spanning New River at State Route 623, Pembroke, Giles County, VA

  12. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  13. 9. GENERAL INTERIOR VIEW OF THE VERTICAL FURNACE BUILDING (PART ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. GENERAL INTERIOR VIEW OF THE VERTICAL FURNACE BUILDING (PART OF MACHINE SHOP No. 2). TWO FURNACES, WITH THEIR SUPPORT FRAMEWORK, ARE VISIBLE TO THE RIGHT. THE TALL STRUCTURE IN THE CENTER TOWARD THE BACKGROUND IS THE VERTICAL QUENCH TOWER. - U.S. Steel Homestead Works, Machine Shop No. 2, Along Monongahela River, Homestead, Allegheny County, PA

  14. Lymphocyte Nadir and Esophageal Cancer Survival Outcomes After Chemoradiation Therapy.

    PubMed

    Davuluri, Rajayogesh; Jiang, Wen; Fang, Penny; Xu, Cai; Komaki, Ritsuko; Gomez, Daniel R; Welsh, James; Cox, James D; Crane, Christopher H; Hsu, Charles C; Lin, Steven H

    2017-09-01

    Host immunity may affect the outcome in patients with esophageal cancer. We sought to identify factors that influenced absolute lymphocyte count (ALC) nadir during chemoradiation therapy (CRT) for esophageal cancer (EC) and looked for clinically relevant associations with survival. 504 patients with stage I-III EC (2007-2013) treated with neoadjuvant or definitive CRT with weekly ALC determinations made during treatment were analyzed. Grade of lymphopenia from ALC nadir during CRT was based on Common Terminology Criteria for Adverse Events version 4.0. Associations of ALC nadir with survival were examined using multivariate Cox proportional hazards analysis (MVA) and competing risks regression analysis. The median follow-up time was 36 months. The incidences of grade 1, 2, 3, and 4 ALC nadir during CRT were 2%, 12%, 59%, and 27%, respectively. The impact was lymphocyte-specific because this was not seen for monocyte or neutrophil count. On MVA, grade 4 ALC nadir (G4 nadir) was significantly associated with worse overall and disease-specific survival outcomes. Predictors of G4 nadir included distal tumor location, definitive CRT, taxane/5-fluorouracil chemotherapy, and photon-based radiation type (vs proton-based). Radiation type strongly influenced the mean body dose exposure, which was a strong predictor for G4 nadir (odds ratio 1.22 per Gray, P<.001). G4 nadir during CRT for EC was associated with poor outcomes, suggesting a role of host immunity in disease control. This observation provides a rationale to prospectively test chemotherapeutic and radiation treatment strategies that may have a lower impact on host immunity. Copyright © 2017 Elsevier Inc. All rights reserved.
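
    As a hedged illustration of how an odds ratio per Gray might be estimated for the grade 4 nadir endpoint, the sketch below fits a logistic regression to synthetic mean-body-dose data with statsmodels; it is generic code on made-up data, not the study's multivariate Cox or competing-risks analysis.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)

        # Synthetic data: mean body dose (Gy) and a binary grade-4 lymphopenia indicator
        dose = rng.uniform(5.0, 35.0, size=500)
        logit_p = -3.0 + 0.2 * dose                     # assumed underlying relationship
        g4 = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

        X = sm.add_constant(dose)
        fit = sm.Logit(g4, X).fit(disp=False)
        print("odds ratio per Gy: %.2f" % np.exp(fit.params[1]))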

  15. LOFT. Interior view of entry (TAN624) rollup door. Camera is ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry (TAN-624) rollup door. Camera is inside entry building facing south. Rollup door was a modification of the original ANP door arrangement. Date: March 2004. INEEL negative no. HD-39-5-2 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  16. Comparison of Sentinel-2A and Landsat-8 Nadir BRDF Adjusted Reflectance (NBAR) over Southern Africa

    NASA Astrophysics Data System (ADS)

    Li, J.; Roy, D. P.; Zhang, H.

    2016-12-01

    The Landsat satellites have been providing moderate resolution imagery of the Earth's surface for over 40 years with continuity provided by the Landsat 8 and planned Landsat 9 missions. The European Space Agency Sentinel-2 satellite was successfully launched into a polar sun-synchronous orbit in 2015 and carries the Multi Spectral Instrument (MSI) that has Landsat-like bands and acquisition coverage. These new sensors acquire images at view angles ± 7.5° (Landsat) and ± 10.3° (Sentinel-2) from nadir that result in small directional effects in the surface reflectance. When data from adjoining paths, or from long time series are used, a model of the surface anisotropy is required to adjust observations to a uniform nadir view (primarily for visual consistency, vegetation monitoring, or detection of subtle surface changes). Recently a generalized approach was published that provides consistent Landsat view angle corrections to provide nadir BRDF-adjusted reflectance (NBAR). Because the BRDF shapes of different terrestrial surfaces are sufficiently similar over the narrow 15° Landsat field of view, a fixed global set of MODIS BRDF spectral model parameters was shown to be adequate for Landsat NBAR derivation with little sensitivity to the land cover type, condition, or surface disturbance. This poster demonstrates the application of this methodology to Sentinel-2 data over a west-east transect across southern Africa. The reflectance differences between adjacent overlapping paths in the forward and backward scatter directions are quantified for both before and after BRDF correction. Sentinel-2 and Landsat-8 reflectance and NBAR inter-comparison results considering different stages of cloud and saturation filtering, and filtering to reduce surface state differences caused by acquisition time differences, demonstrate the utility of the approach. The relevance and limitations of the corrections for providing consistent moderate resolution reflectance are discussed.
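
    A schematic sketch of a c-factor style of view-angle adjustment such as the one referred to above: the observed reflectance is scaled by the ratio of a kernel-model BRDF evaluated at nadir to the same model evaluated at the actual view geometry, using a fixed set of spectral BRDF parameters. The `k_geo` stub, the parameter values and the observation below are illustrative assumptions, so this only shows the structure of the adjustment, not a validated implementation.

        import numpy as np

        def k_vol(sza, vza, raa):
            """RossThick volumetric kernel (angles in radians)."""
            phase = np.arccos(np.cos(sza) * np.cos(vza) +
                              np.sin(sza) * np.sin(vza) * np.cos(raa))
            return ((np.pi / 2.0 - phase) * np.cos(phase) + np.sin(phase)) / \
                   (np.cos(sza) + np.cos(vza)) - np.pi / 4.0

        def k_geo(sza, vza, raa):
            """Simplified stand-in for the LiSparse geometric kernel (toy expression only)."""
            return -0.2 * np.cos(vza) * np.cos(raa)

        def nbar(rho_obs, f_iso, f_vol, f_geo, sza, vza, raa):
            """c-factor adjustment: scale the observed reflectance to a nadir view."""
            modelled_nadir = f_iso + f_vol * k_vol(sza, 0.0, 0.0) + f_geo * k_geo(sza, 0.0, 0.0)
            modelled_obs = f_iso + f_vol * k_vol(sza, vza, raa) + f_geo * k_geo(sza, vza, raa)
            return rho_obs * modelled_nadir / modelled_obs

        # Placeholder red-band observation and fixed BRDF parameters (not the MODIS values)
        print("NBAR: %.4f" % nbar(rho_obs=0.12, f_iso=0.10, f_vol=0.05, f_geo=0.02,
                                  sza=np.deg2rad(35.0), vza=np.deg2rad(8.0),
                                  raa=np.deg2rad(120.0)))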

  17. 1. Context view includes Building 59 (second from left). Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Context view includes Building 59 (second from left). Camera is pointed ENE along Farragut Avenue. Buildings on left side of street are, from left: Building 856, Building 59 and Building 107. On right side of street they are, from right: Building 38, Building 452 and Building 460. - Puget Sound Naval Shipyard, Pattern Shop, Farragut Avenue, Bremerton, Kitsap County, WA

  18. Detection Method of Lightning and TLEs by JEM-GLIMS Nadir Observation

    NASA Astrophysics Data System (ADS)

    Adachi, T.; Sato, M.; Ushio, T.; Yamazaki, A.; Suzuki, M.; Masayuki, K.; Takahashi, Y.; Inan, U.; Linscott, I.; Hobara, Y.

    2013-12-01

    A scientific payload named JEM-GLIMS aboard the International Space Station (ISS) is aimed at observing lightning and Transient Luminous Events (TLEs) globally. Keeping its field of view toward the nadir direction, GLIMS clarifies the horizontal structures of lightning and TLEs, a crucial issue for understanding the electrodynamic coupling between the troposphere and the ionosphere. A difficulty, however, is that careful analyses are necessary to separate the emissions of lightning and TLEs, which spatially overlap along the lines of sight in nadir observation. In this study, we analyze the multi-wavelength optical data obtained by GLIMS to identify lightning and TLEs. The main data analyzed are those of the imager (LSI) and the spectrophotometer (PH). The LSI consists of two cameras, equipped with a broadband red filter and a narrowband 762-nm filter respectively, and obtains imagery at a spatial resolution of 400 m/pixel on the ground surface. The PH detects time-resolved emission intensity at a sampling rate of 20 kHz with six photometer channels measuring at 150-280, 337, 762, 600-900, 316 and 392 nm, respectively. During the period between November 2012 and June 2013, GLIMS observed 815 lightning and/or TLE events, and in 494 of them both LSI and PH data showed clear signals above the noise level. As a first step, we carried out a case study using an event observed at 09:50:47 UT on 29 January 2013 which did not cause strong saturation of the LSI and PH data. The estimated peak irradiance was 1.38 × 10^-3 W/m^2 at 600-900 nm, which places it among the top 10% brightest lightning events observed by the FORTE satellite in the past. This finding suggests that GLIMS selectively observes the most optically powerful events. The peak irradiance was also estimated for the other PH channels. At all visible channels other than the far ultraviolet (FUV) channel, the peak irradiance was estimated to be in good agreement with the atmospheric transmittance curve calculated between 10

  19. VERTICAL DETAIL OBLIQUE VIEW OF NORTHEAST SIDE OF HYDROELECTRIC POWER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VERTICAL DETAIL OBLIQUE VIEW OF NORTHEAST SIDE OF HYDROELECTRIC POWER HOUSE WITH OLD BYPASS IN FOREGROUND, SHOWING GLASS BLOCKS PROVIDING LIGHT TO BASEMENT OF HYDROELECTRIC POWER HOUSE, VIEW TOWARDS WEST SOUTHWEST - St. Lucie Canal, Lock No. 1, Hydroelectric Power House, St. Lucie, Cross State Canal, Okeechobee Intracoastal Waterway, Stuart, Martin County, FL

  20. Effect of viewing distance on the generation of vertical eye movements during locomotion

    NASA Technical Reports Server (NTRS)

    Moore, S. T.; Hirasaki, E.; Cohen, B.; Raphan, T.

    1999-01-01

    Vertical head and eye coordination was studied as a function of viewing distance during locomotion. Vertical head translation and pitch movements were measured using a video motion analysis system (Optotrak 3020). Vertical eye movements were recorded using a video-based pupil tracker (Iscan). Subjects (five) walked on a linear treadmill at a speed of 1.67 m/s (6 km/h) while viewing a target screen placed at distances ranging from 0.25 to 2.0 m at 0.25-m intervals. The predominant frequency of vertical head movement was 2 Hz. In accordance with previous studies, there was a small head pitch rotation, which was compensatory for vertical head translation. The magnitude of the vertical head movements and the phase relationship between head translation and pitch were little affected by viewing distance, and tended to orient the naso-occipital axis of the head at a point approximately 1 m in front of the subject (the head fixation distance or HFD). In contrast, eye velocity was significantly affected by viewing distance. When viewing a far (2-m) target, vertical eye velocity was 180 degrees out of phase with head pitch velocity, with a gain of 0.8. This indicated that the angular vestibulo-ocular reflex (aVOR) was generating the eye movement response. The major finding was that, at a close viewing distance (0.25 m), eye velocity was in phase with head pitch and compensatory for vertical head translation, suggesting that activation of the linear vestibulo-ocular reflex (lVOR) was contributing to the eye movement response. There was also a threefold increase in the magnitude of eye velocity when viewing near targets, which was consistent with the goal of maintaining gaze on target. The required vertical lVOR sensitivity to cancel an unmodified aVOR response and generate the observed eye velocity magnitude for near targets was almost 3 times that previously measured. Supplementary experiments were performed utilizing body-fixed active head pitch rotations at 1 and 2 Hz

  1. PBF Reactor Building (PER620). Aerial view of early construction. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Aerial view of early construction. Camera facing northwest. Excavation and concrete placement in two basements are underway. Note exposed lava rock. Photographer: Farmer. Date: March 22, 1965. INEEL negative no. 65-2219 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  2. Opportunity's Surroundings on Sol 1818 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings during the 1,818th Martian day, or sol, of Opportunity's surface mission (March 5, 2009). South is at the center; north at both ends.

    This view is presented as a vertical projection with geometric seam correction. North is at the top.

    The rover had driven 80.3 meters (263 feet) southward earlier on that sol. Tracks from the drive recede northward in this view.

    The terrain in this portion of Mars' Meridiani Planum region includes dark-toned sand ripples and lighter-toned bedrock.

  3. Newborn Plasma Glucose Concentration Nadirs by Gestational-Age Group.

    PubMed

    Kaiser, Jeffrey R; Bai, Shasha; Rozance, Paul J

    2018-01-01

    The glucose concentrations and times to nadir for newborns of all gestational ages when intrapartum glucose-containing solutions are not routinely provided are unknown. The objective was to characterize and compare patterns of initial glucose concentration nadirs by gestational-age group. In a cross-sectional cohort study, 1,366 newborns born in 1998 at the University of Arkansas for Medical Sciences who were appropriate for gestational age, nonasphyxiated, nonpolycythemic, and not infants of diabetic mothers were included. Initial plasma glucose concentrations, before intravenous fluids or feedings, were plotted against time after birth for 4 gestational-age groups (full term [FT], ≥37-42 weeks; late preterm [LPT], ≥34 and < 37 weeks; preterm [PT], ≥28 and < 34 weeks; and extremely low gestational age newborns [ELGAN], ≥23 and < 28 weeks of gestation). ELGAN had the earliest nadir at 61 ± 4 min, followed by PT newborns (71 ± 2 min), and then LPT and FT newborns at 92-93 min. The time to nadir for ELGAN and PT newborns was significantly earlier than for FT newborns. Glucose nadir concentrations for ELGAN, PT, and LPT newborns were significantly lower than for FT newborns. LPT newborns' pattern of glucose paralleled that of FT newborns, with values approximately 5-6 mg/dL lower during the first 3 h. Plasma glucose nadirs occurred at different times among gestational-age groups during the early postnatal period as follows: ELGAN < PT < LPT ≈ FT. In order to potentially prevent low glucose concentrations at the time of the nadir, exogenous glucose should be provided to all newborns as soon as possible after birth. © 2018 S. Karger AG, Basel.

  4. LOFT complex in 1975 awaits renewed mission. Aerial view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT complex in 1975 awaits renewed mission. Aerial view. Camera facing southwesterly. Left to right: stack, entry building (TAN-624), door shroud, duct shroud and filter hatches, dome (painted white), pre-amp building, equipment and piping building, shielded control room (TAN-630), airplane hangar (TAN-629). Date: 1975. INEEL negative no. 75-3690 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  5. Radiation physics and modelling for off-nadir satellite-sensing of non-Lambertian surfaces

    NASA Technical Reports Server (NTRS)

    Gerstl, S. A.; Simmer, C.

    1986-01-01

    The primary objective of this paper is to provide a deeper understanding of the physics of satellite remote-sensing when off-nadir observations are considered. Emphasis is placed on the analysis and modeling of atmospheric effects and the radiative transfer of non-Lambertian surface reflectance characteristics from ground-level to satellite locations. The relative importance of spectral, spatial, angular, and temporal reflectance characteristics for satellite-sensed identification of vegetation types in the visible and near-infrared wavelength regions is evaluated. The highest identification value is attributed to angular reflectance signatures. Using radiative transfer calculations to evaluate the atmospheric effects on angular reflectance distributions of vegetation surfaces, atmosphere-invariant angular reflectance features such as the 'hot spot' and the 'persistent valley' are identified. A new atmospheric correction formalism for complete angular reflectance distributions is described. A sample calculation demonstrates that a highly non-Lambertian measured surface reflectance distribution can be retrieved from simulated satellite data in the visible and near infrared to within about 20 percent accuracy for almost all view directions up to 60 deg off-nadir. Thus the high value of angular surface reflectance characteristics (the 'angular signature') for satellite-sensed feature identification is confirmed, which provides a scientific basis for future off-nadir satellite observations.

  6. True 3-D View of 'Columbia Hills' from an Angle

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This mosaic of images from NASA's Mars Exploration Rover Spirit shows a panorama of the 'Columbia Hills' without any adjustment for rover tilt. When viewed through 3-D glasses, depth is much more dramatic and easier to see, compared with a tilt-adjusted version. This is because stereo views are created by producing two images, one corresponding to the view from the panoramic camera's left-eye camera, the other corresponding to the view from the panoramic camera's right-eye camera. The brain processes the visual input more accurately when the two images do not have any vertical offset. In this view, the vertical alignment is nearly perfect, but the horizon appears to curve because of the rover's tilt (because the rover was parked on a steep slope, it was tilted approximately 22 degrees to the west-northwest). Spirit took the images for this 360-degree panorama while en route to higher ground in the 'Columbia Hills.'

    The highest point visible in the hills is 'Husband Hill,' named for space shuttle Columbia Commander Rick Husband. To the right are the rover's tracks through the soil, where it stopped to perform maintenance on its right front wheel in July. In the distance, below the hills, is the floor of Gusev Crater, where Spirit landed Jan. 3, 2004, before traveling more than 3 kilometers (1.8 miles) to reach this point. This vista comprises 188 images taken by Spirit's panoramic camera from its 213th day, or sol, on Mars to its 223rd sol (Aug. 9 to 19, 2004). Team members at NASA's Jet Propulsion Laboratory and Cornell University spent several weeks processing images and producing geometric maps to stitch all the images together in this mosaic. The 360-degree view is presented in a cylindrical-perspective map projection with geometric seam correction.

  7. Adjustable control station with movable monitors and cameras for viewing systems in robotics and teleoperations

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor)

    1994-01-01

    Real-time video presentations are provided in the field of operator-supervised automation and teleoperation, particularly in control stations having movable cameras for optimal viewing of a region of interest in robotics and teleoperations for performing different types of tasks. Movable monitors to match the corresponding camera orientations (pan, tilt, and roll) are provided in order to match the coordinate systems of all the monitors to the operator internal coordinate system. Automated control of the arrangement of cameras and monitors, and of the configuration of system parameters, is provided for optimal viewing and performance of each type of task for each operator since operators have different individual characteristics. The optimal viewing arrangement and system parameter configuration is determined and stored for each operator in performing each of many types of tasks in order to aid the automation of setting up optimal arrangements and configurations for successive tasks in real time. Factors in determining what is optimal include the operator's ability to use hand-controllers for each type of task. Robot joint locations, forces and torques are used, as well as the operator's identity, to identify the current type of task being performed in order to call up a stored optimal viewing arrangement and system parameter configuration.

  8. IET. Aerial view of snaptran destructive experiment in 1964. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    IET. Aerial view of snaptran destructive experiment in 1964. Camera facing north. Test cell building (TAN-624) is positioned away from coupling station. Weather tower in right foreground. Divided duct just beyond coupling station. Air intake structure on south side of shielded control room. Experiment is on dolly at coupling station. Date: 1964. INEEL negative no. 64-1736 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  9. Mars Orbiter Camera Views the 'Face on Mars' - Comparison with Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The image was processed to remove the sensitivity differences between adjacent picture elements (calibrated). This removes the vertical streaking.

    The contrast and brightness of the image was adjusted, and 'filters' were applied to enhance detail at several scales.

    The image was then geometrically warped to meet the computed position information for a mercator-type map. This corrected for the left-right flip, and the non-vertical viewing angle (about 45° from vertical), but also introduced some vertical 'elongation' of the image for the same reason Greenland looks larger than Africa on a mercator map of the Earth.

    A section of the image, containing the 'Face' and a couple of nearby impact craters and hills, was 'cut' out of the full image and reproduced separately.

    See PIA01440-1442 for additional processing steps. Also see PIA01236 for the raw image.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  10. Noise Reduction in Brainwaves by Using Both EEG Signals and Frontal Viewing Camera Images

    PubMed Central

    Bang, Jae Won; Choi, Jong-Suk; Park, Kang Ryoung

    2013-01-01

    Electroencephalogram (EEG)-based brain-computer interfaces (BCIs) have been used in various applications, including human–computer interfaces, diagnosis of brain diseases, and measurement of cognitive status. However, EEG signals can be contaminated with noise caused by the user's head movements. Therefore, we propose a new method that combines an EEG acquisition device and a frontal viewing camera to isolate and exclude the sections of EEG data containing this noise. This method is novel in the following three ways. First, we compare the accuracies of detecting head movements based on the features of EEG signals in the frequency and time domains and on the motion features of images captured by the frontal viewing camera. Second, the features of EEG signals in the frequency domain and the motion features captured by the frontal viewing camera are selected as optimal ones. The dimension reduction of the features and feature selection are performed using linear discriminant analysis. Third, the combined features are used as inputs to support vector machine (SVM), which improves the accuracy in detecting head movements. The experimental results show that the proposed method can detect head movements with an average error rate of approximately 3.22%, which is smaller than that of other methods. PMID:23669713
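
    A minimal scikit-learn sketch of the classification chain described above on synthetic features: EEG frequency-domain features and frontal-camera motion features are concatenated, reduced with linear discriminant analysis, and classified with an SVM. The feature dimensions, class structure and data are placeholders, not the paper's recordings.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        n = 400
        eeg_freq_feats = rng.normal(size=(n, 20))   # placeholder EEG frequency-domain features
        camera_motion = rng.normal(size=(n, 4))     # placeholder frontal-camera motion features
        labels = rng.integers(0, 2, size=n)         # 1 = head movement, 0 = still (synthetic)

        # Shift the synthetic "movement" class so the example produces a meaningful score
        eeg_freq_feats[labels == 1, :3] += 1.5
        camera_motion[labels == 1, :2] += 1.0

        X = np.hstack([eeg_freq_feats, camera_motion])   # combined feature vector
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)

        clf = make_pipeline(LinearDiscriminantAnalysis(n_components=1), SVC(kernel="rbf"))
        clf.fit(X_tr, y_tr)
        print("head-movement detection accuracy: %.2f" % clf.score(X_te, y_te))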

  11. Power Burst Facility (PBF), PER620, contextual and oblique view. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Power Burst Facility (PBF), PER-620, contextual and oblique view. Camera facing northwest. South and east facade. The 1980 west-wing expansion is left of center bay. Concrete structure at right is PER-730. Date: March 2004. INEEL negative no. HD-41-2-3 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  12. VIEW OF PROCESS DEVELOPMENT PILE (PDP) TANK TOP, WITH VERTICAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF PROCESS DEVELOPMENT PILE (PDP) TANK TOP, WITH VERTICAL ELEMENTS IN BACKGROUND, LEVEL 0’, LOOKING NORTHWEST - Physics Assembly Laboratory, Area A/M, Savannah River Site, Aiken, Aiken County, SC

  13. A small field of view camera for hybrid gamma and optical imaging

    NASA Astrophysics Data System (ADS)

    Lees, J. E.; Bugby, S. L.; Bhatia, B. S.; Jambi, L. K.; Alqahtani, M. S.; McKnight, W. R.; Ng, A. H.; Perkins, A. C.

    2014-12-01

    The development of compact low profile gamma-ray detectors has allowed the production of small field of view, hand held imaging devices for use at the patient bedside and in operating theatres. The combination of an optical and a gamma camera, in a co-aligned configuration, offers high spatial resolution multi-modal imaging giving a superimposed scintigraphic and optical image. This innovative introduction of hybrid imaging offers new possibilities for assisting surgeons in localising the site of uptake in procedures such as sentinel node detection. Recent improvements to the camera system along with results of phantom and clinical imaging are reported.

  14. Southern Quebec in Late Winter

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These images of Canada's Quebec province were acquired by the Multi-angle Imaging SpectroRadiometer on March 4, 2001. The region's forests are a mixture of coniferous and hardwood trees, and 'sugar-shack' festivities are held at this time of year to celebrate the beginning of maple syrup production. The large river visible in the images is the northeast-flowing St. Lawrence. The city of Montreal is located near the lower left corner, and Quebec City, at the upper right, is near the mouth of the partially ice-covered St. Lawrence Seaway.

    Both spectral and angular information are retrieved for every scene observed by MISR. The left-hand image was acquired by the instrument's vertical-viewing (nadir) camera, and is a false-color spectral composite from the near-infrared, red, and blue bands. The right-hand image is a false-color angular composite using red band data from the 60-degree backward-viewing, nadir, and 60-degree forward-viewing cameras. In each case, the individual channels of data are displayed as red, green, and blue, respectively.

    Much of the ground remains covered or partially covered with snow. Vegetation appears red in the left-hand image because of its high near-infrared brightness. In the multi-angle composite, vegetated areas appear in shades of green because they are brighter at nadir, possibly as a result of an underlying blanket of snow which is more visible from this direction. Enhanced forward scatter from the smooth water surface results in bluer hues, whereas urban areas look somewhat orange, possibly due to the effect of vertical structures which preferentially backscatter sunlight.

    The data were acquired during Terra orbit 6441, and cover an area measuring 275 kilometers x 310 kilometers.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the

  15. Spectral and spatial variability of undisturbed and disturbed grass under different view and illumination directions

    NASA Astrophysics Data System (ADS)

    Borel-Donohue, Christoph C.; Shivers, Sarah Wells; Conover, Damon

    2017-05-01

    It is well known that disturbed grass-covered surfaces show variability with view and illumination conditions. A good example is a grass field in a soccer stadium that shows stripes indicating in which direction the grass was mowed. These spatial variations are due to a complex interplay of the spectral characteristics of grass blades, their density, their length and their orientations. Viewing a grass surface from nadir or from near-horizontal directions results in observing different components. Views from a vertical direction show more variation due to reflections from the randomly oriented grass blades and their shadows. Views from near horizontal show a mixture of light reflected and transmitted by the grass blades. An experiment was performed on a mowed grass surface on which paths of simulated heavy foot traffic had been laid down in different directions. High-spatial-resolution hyperspectral data cubes were taken by an imaging spectrometer covering the visible through near-infrared range over a period of several hours. Ground-truth grass reflectance spectra of undisturbed and disturbed areas were obtained with a hand-held spectrometer. Close-range images of selected areas were taken with a hand-held camera and then used to reconstruct the 3D geometry of the grass using structure-from-motion algorithms. Computer graphics rendering, using raytracing of reconstructed and procedurally created grass surfaces, was used to compute BRDF models. In this paper, we discuss differences between the observed and simulated spectral and spatial variability. Based on the measurements and/or simulations, we derive simple spectral index methods to detect spatial disturbances and apply scattering models.
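
    A minimal sketch of the "simple spectral index" idea: a normalized difference between two bands is computed per pixel of a hyperspectral cube and thresholded to flag disturbed grass. The band wavelengths, the threshold and the synthetic cube are placeholders, not the indices derived in the paper.

        import numpy as np

        # Placeholder hyperspectral cube: rows x cols x bands, with a wavelength axis
        wavelengths = np.linspace(400.0, 1000.0, 121)       # nm
        cube = np.random.rand(50, 50, wavelengths.size)     # synthetic reflectance cube

        def band(cube, wavelengths, wl):
            """Pick the band closest to the requested wavelength."""
            return cube[:, :, int(np.argmin(np.abs(wavelengths - wl)))]

        # Normalized-difference index between an assumed NIR and red band
        nir, red = band(cube, wavelengths, 800.0), band(cube, wavelengths, 670.0)
        ndi = (nir - red) / (nir + red + 1e-9)

        disturbed = ndi < 0.0            # placeholder threshold for disturbed pixels
        print("flagged pixels: %d" % int(disturbed.sum()))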

  16. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles)wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  17. 1. GENERAL VIEW. OVERHANG, PAINTED RED, HAS VERTICAL SIDING AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. GENERAL VIEW. OVERHANG, PAINTED RED, HAS VERTICAL SIDING AND FADED PAINTINGS OF FARM ANIMALS: COW, DONKEYS AND HORSE. - De Turck House, Barn, State Route 662 vicinity, Oley Township, Oley, Berks County, PA

  18. Opportunity's Surroundings on Sol 1798 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this 180-degree view of the rover's surroundings during the 1,798th Martian day, or sol, of Opportunity's surface mission (Feb. 13, 2009). North is on top.

    This view is presented as a vertical projection with geometric seam correction.

    The rover had driven 111 meters (364 feet) southward on the preceding sol. Tracks from that drive recede northward in this view. For scale, the distance between the parallel wheel tracks is about 1 meter (about 40 inches).

    The terrain in this portion of Mars' Meridiani Planum region includes dark-toned sand ripples and lighter-toned bedrock.

  19. General view in the Vertical Processing Area of the Space ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    General view in the Vertical Processing Area of the Space Shuttle Main Engine (SSME) Processing Facility at Kennedy Space Center. This view shows an SSME Rotating Sling in the foreground right and SSME 2056 in the foreground, with SSMEs 2050, 2062 and 2054 in succession towards the background. - Space Transportation System, Space Shuttle Main Engine, Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  20. Analysis of Camera Arrays Applicable to the Internet of Things.

    PubMed

    Yang, Jiachen; Xu, Ru; Lv, Zhihan; Song, Houbing

    2016-03-22

    The Internet of Things is built on various sensors and networks. Sensors for stereo capture are essential for acquiring information and have been applied in different fields. In this paper, we focus on camera modeling and analysis, which is important for stereo display and comfortable viewing. We model two kinds of camera arrays, a parallel and a converged one, and analyze the difference between them in vertical and horizontal parallax. Although different kinds of camera arrays are used in various applications and analyzed in the literature, few studies compare them directly. We therefore present a detailed analysis of their performance over different shooting distances. From this analysis, we find that the threshold shooting distance for converged cameras is 7 m. In addition, we design a camera array that can be used either as a parallel or as a converged camera array, and take images and videos with it to verify the threshold.
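
    As a rough geometric illustration of the parallel-versus-converged comparison (not the authors' model), the sketch below contrasts image-plane disparity for the two rig types under a small-angle approximation; the focal length, baseline and distances are placeholder values.

```python
import numpy as np

def disparity_parallel(f_mm, baseline_m, z_m):
    """Image-plane disparity (mm) for a parallel stereo rig: f * b / Z."""
    return f_mm * baseline_m / z_m

def disparity_converged(f_mm, baseline_m, z_m, conv_dist_m):
    """Small-angle approximation for a toed-in rig converged at conv_dist_m:
    objects at the convergence distance have (approximately) zero disparity."""
    return f_mm * baseline_m * (1.0 / z_m - 1.0 / conv_dist_m)

# Illustrative scan over shooting distances (all values are assumptions):
for z in (2.0, 5.0, 7.0, 10.0, 20.0):
    print(z,
          round(disparity_parallel(35.0, 0.065, z), 3),
          round(disparity_converged(35.0, 0.065, z, 7.0), 3))
```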

  1. 15. Detail view of connection between vertical posts and deck ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Detail view of connection between vertical posts and deck beam. Jack Boucher, photographer, 1983 - Neshanic Station Lenticular Truss Bridge, State Route 567, spanning South Branch of Raritan River, Neshanic Station, Somerset County, NJ

  2. LOFT. Interior view of entry to reactor building, TAN650. Camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT. Interior view of entry to reactor building, TAN-650. Camera is inside entry (TAN-624) and facing north. At far end of domed chamber are penetrations in wall for electrical and other connections. Reactor and other equipment has been removed. Date: March 2004. INEEL negative no. HD-39-5-1 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. Nadir creatinine in posterior urethral valves: How high is low enough?

    PubMed

    Coleman, R; King, T; Nicoara, C-D; Bader, M; McCarthy, L; Chandran, H; Parashar, K

    2015-12-01

    Large retrospective studies of people with posterior urethral valves (PUV) have reported chronic renal insufficiency (CRI) in up to one third of the participants and end-stage renal failure in up to one quarter of them. Nadir creatinine (the lowest creatinine during the first year following diagnosis) is the recognised prognostic indicator for renal outcome in PUV, the most commonly used cut-off being 1 mg/dl (88.4 µmol/l). To conduct a statistical analysis of nadir creatinine in PUV patients in order to identify the optimal cut-off level as a prognostic indicator for CRI. Patients treated by endoscopic valve ablation at the present institution between 1993 and 2004 were reviewed. Chronic renal insufficiency was defined as CKD2 or higher. Statistical methods included receiver operating characteristic (ROC) curve analysis, Fisher exact test and diagnostic utility tests. Statistical significance was defined as P < 0.05. Nadir creatinine was identified in 96 patients. The median follow-up was 9.4 (IQR 7.0, 13.4) years. A total of 29 (30.2%) patients developed CRI, with nine (9.4%) reaching end-stage renal failure. On ROC analysis, nadir creatinine was highly prognostic for future CRI, with an area under the curve of 0.887 (P < 0.001). Renal insufficiency occurred in all 10 (100%) patients with nadir creatinine >88.4 µmol/l compared with 19 of 86 (22.2%) patients with lower nadir creatinine (P < 0.001). As a test for future CRI, a nadir creatinine cut-off of 88.4 µmol/l gave a specificity of 100%, but poor sensitivity of 34.5%. Lowering the cut-off to 75 µmol/l resulted in improvement in all diagnostic utility tests (Table). All 14 (100%) patients with nadir creatinine >75 µmol/l developed CRI, compared with 15 of 82 (18.3%) patients with lower nadir creatinine (P < 0.001). Sensitivity only approached 95% at 35 µmol/l, at which level specificity was low (Table). Two out of 36 (5.6%) patients with nadir creatinine <35 µmol/l developed CRI. Multivariate analysis

  4. Impact of Footprint Diameter and Off-Nadir Pointing on the Precision of Canopy Height Estimates from Spaceborne Lidar

    NASA Technical Reports Server (NTRS)

    Pang, Yong; Lefskky, Michael; Sun, Guoqing; Ranson, Jon

    2011-01-01

    A spaceborne lidar mission could serve multiple scientific purposes including remote sensing of ecosystem structure, carbon storage, terrestrial topography and ice sheet monitoring. The measurement requirements of these different goals will require compromises in sensor design. Footprint diameters that would be larger than optimal for vegetation studies have been proposed. Some spaceborne lidar mission designs include the possibility that a lidar sensor would share a platform with another sensor, which might require off-nadir pointing at angles of up to 16°. To resolve multiple mission goals and sensor requirements, detailed knowledge of the sensitivity of sensor performance to these aspects of mission design is required. This research used a radiative transfer model to investigate the sensitivity of forest height estimates to footprint diameter, off-nadir pointing and their interaction over a range of forest canopy properties. An individual-based forest model was used to simulate stands of mixed conifer forest in the Tahoe National Forest (Northern California, USA) and stands of deciduous forests in the Bartlett Experimental Forest (New Hampshire, USA). Waveforms were simulated for stands generated by a forest succession model using footprint diameters of 20 m to 70 m. Off-nadir angles of 0° to 16° were considered for a 25 m footprint diameter. Footprint diameters in the range of 25 m to 30 m were optimal for estimates of maximum forest height (R² of 0.95 and RMSE of 3 m). As expected, the contribution of vegetation height to the vertical extent of the waveform decreased with larger footprints, while the contribution of terrain slope increased. Precision of estimates decreased with an increasing off-nadir pointing angle, but off-nadir pointing had less impact on height estimates in deciduous forests than in coniferous forests. When pointing off-nadir, the decrease in precision was dependent on local incidence angle (the angle between the off-nadir

  5. 14. Detail view of connection between vertical post and bottom ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. Detail view of connection between vertical post and bottom chord member. Jack Boucher, photographer, 1977 - Neshanic Station Lenticular Truss Bridge, State Route 567, spanning South Branch of Raritan River, Neshanic Station, Somerset County, NJ

  6. Emission computerized axial tomography from multiple gamma-camera views using frequency filtering.

    PubMed

    Pelletier, J L; Milan, C; Touzery, C; Coitoux, P; Gailliard, P; Budinger, T F

    1980-01-01

    Emission computerized axial tomography is achievable in any nuclear medicine department from multiple gamma camera views. Data are collected by rotating the patient in front of the camera. A simple fast algorithm is implemented, known as the convolution technique: first the projection data are Fourier transformed and then an original filter designed for optimizing resolution and noise suppression is applied; finally the inverse transform of the latter operation is back-projected. This program, which can also take into account the attenuation for single photon events, was executed with good results on phantoms and patients. We think that it can be easily implemented for specific diagnostic problems.
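
    The "convolution technique" described above is essentially filtered back-projection. A minimal sketch, assuming a simple ramp filter, a parallel-beam sinogram of shape (n_angles, n_detectors) and no attenuation correction, is shown below; it illustrates the filter-then-back-project idea, not the authors' optimized filter.

```python
import numpy as np

def fbp_reconstruct(sinogram, angles_deg):
    """Minimal filtered back-projection sketch (ramp filter, no attenuation
    correction). sinogram: array of shape (n_angles, n_detectors)."""
    n_angles, n_det = sinogram.shape

    # Filter each projection in the detector-frequency domain (ramp filter)
    freqs = np.fft.fftfreq(n_det)
    ramp = np.abs(freqs)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    # Back-project each filtered view across the image grid
    grid = np.arange(n_det) - n_det / 2.0
    xx, yy = np.meshgrid(grid, grid)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate of every pixel for this view angle
        t = xx * np.cos(theta) + yy * np.sin(theta) + n_det / 2.0
        recon += np.interp(t.ravel(), np.arange(n_det), proj,
                           left=0.0, right=0.0).reshape(n_det, n_det)
    return recon * np.pi / (2 * n_angles)
```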

  7. MISR Global Images See the Light of Day

    NASA Technical Reports Server (NTRS)

    2002-01-01

    As of July 31, 2002, global multi-angle, multi-spectral radiance products are available from the MISR instrument aboard the Terra satellite. Measuring the radiative properties of different types of surfaces, clouds and atmospheric particulates is an important step toward understanding the Earth's climate system. These images are among the first planet-wide summary views to be publicly released from the Multi-angle Imaging SpectroRadiometer experiment. Data for these images were collected during the month of March 2002, and each pixel represents monthly-averaged daylight radiances from an area measuring 1/2 degree in latitude by 1/2 degree in longitude.

    The top panel is from MISR's nadir (vertical-viewing) camera and combines data from the red, green and blue spectral bands to create a natural color image. The central view combines near-infrared, red, and green spectral data to create a false-color rendition that enhances highly vegetated terrain. It takes 9 days for MISR to view the entire globe, and only areas within 8 degrees of latitude of the north and south poles are not observed due to the Terra orbit inclination. Because a single pole-to-pole swath of MISR data is just 400 kilometers wide, multiple swaths must be mosaiced to create these global views. Discontinuities appear in some cloud patterns as a consequence of changes in cloud cover from one day to another.

    The lower panel is a composite in which red, green, and blue radiances from MISR's 70-degree forward-viewing camera are displayed in the northern hemisphere, and radiances from the 70-degree backward-viewing camera are displayed in the southern hemisphere. At the March equinox (spring in the northern hemisphere, autumn in the southern hemisphere), the Sun is near the equator. Therefore, both oblique angles are observing the Earth in 'forward scattering', particularly at high latitudes. Forward scattering occurs when you (or MISR) observe an object with the Sun at a point in the sky that is

  8. Inversion of the conical Radon transform with vertices on a surface of revolution arising in an application of a Compton camera

    NASA Astrophysics Data System (ADS)

    Moon, Sunghwan

    2017-06-01

    A Compton camera has been introduced for use in single photon emission computed tomography to improve the low efficiency of a conventional gamma camera. In general, a Compton camera brings about the conical Radon transform. Here we consider a conical Radon transform with the vertices on a rotation symmetric set with respect to a coordinate axis. We show that this conical Radon transform can be decomposed into two transforms: the spherical sectional transform and the weighted fan beam transform. After finding inversion formulas for these two transforms, we provide an inversion formula for the conical Radon transform.

  9. A comparison of multi-view 3D reconstruction of a rock wall using several cameras and a laser scanner

    NASA Astrophysics Data System (ADS)

    Thoeni, K.; Giacomini, A.; Murtagh, R.; Kniest, E.

    2014-06-01

    This work presents a comparative study between multi-view 3D reconstruction using various digital cameras and a terrestrial laser scanner (TLS). Five different digital cameras were used in order to estimate the limits related to the camera type and to establish the minimum camera requirements to obtain comparable results to the ones of the TLS. The cameras used for this study range from commercial grade to professional grade and included a GoPro Hero 1080 (5 Mp), iPhone 4S (8 Mp), Panasonic Lumix LX5 (9.5 Mp), Panasonic Lumix ZS20 (14.1 Mp) and Canon EOS 7D (18 Mp). The TLS used for this work was a FARO Focus 3D laser scanner with a range accuracy of ±2 mm. The study area is a small rock wall of about 6 m height and 20 m length. The wall is partly smooth with some evident geological features, such as non-persistent joints and sharp edges. Eight control points were placed on the wall and their coordinates were measured by using a total station. These coordinates were then used to georeference all models. A similar number of images was acquired from distances of approximately 5 to 10 m, depending on the field of view of each camera. The commercial software package PhotoScan was used to process the images, georeference and scale the models, and to generate the dense point clouds. Finally, the open-source package CloudCompare was used to assess the accuracy of the multi-view results. Each point cloud obtained from a specific camera was compared to the point cloud obtained with the TLS. The latter is taken as ground truth. The result is a coloured point cloud for each camera showing the deviation in relation to the TLS data. The main goal of this study is to quantify the quality of the multi-view 3D reconstruction results obtained with various cameras as objectively as possible and to evaluate its applicability to geotechnical problems.
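
    A simplified stand-in for the CloudCompare cloud-to-cloud comparison described above is sketched below: it reports nearest-neighbour distances from a camera-derived point cloud to the TLS reference cloud. Variable names and the summary statistics are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distance(eval_points, reference_points):
    """Nearest-neighbour distance (per point) from an evaluated, camera-derived
    cloud (N x 3 array) to the reference TLS cloud (M x 3 array)."""
    tree = cKDTree(reference_points)
    distances, _ = tree.query(eval_points, k=1)
    return distances

# Hypothetical usage with georeferenced clouds loaded elsewhere:
# d = cloud_to_cloud_distance(camera_xyz, tls_xyz)
# print(d.mean(), d.std(), np.percentile(d, 95))
```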

  10. ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO TOP: MTR, MTR SERVICE BUILDING, ETR CRITICAL FACILITY, ETR CONTROL BUILDING (ATTACHED TO ETR), ETR BUILDING (HIGH-BAY), COMPRESSOR BUILDING (ATTACHED AT LEFT OF ETR), HEAT EXCHANGER BUILDING (JUST BEYOND COMPRESSOR BUILDING), COOLING TOWER PUMP HOUSE, COOLING TOWER. OTHER BUILDINGS ARE CONTRACTORS' CONSTRUCTION BUILDINGS. INL NEGATIVE NO. 56-4105. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. [Estimating Winter Wheat Nitrogen Vertical Distribution Based on Bidirectional Canopy Reflected Spectrum].

    PubMed

    Yang, Shao-yuan; Huang, Wen-jiang; Liang, Dong; Uang, Lin-sheng; Yang, Gui-jun; Zhang, Gui-jan; Cai, Shu-Hong

    2015-07-01

    The vertical distribution of crop nitrogen increases with plant height; timely and non-destructive measurement of this vertical distribution is critical for crop production and quality, for improving fertilizer use efficiency, and for reducing environmental impact. The objective of this study was to explore a method for estimating the vertical distribution of winter wheat nitrogen from bidirectional reflectance distribution function (BRDF) data using the partial least squares (PLS) algorithm. Canopy reflectance at nadir, ±50° and ±60°; at nadir, ±30° and ±40°; and at nadir, ±20° and ±30° was selected to estimate foliage nitrogen density (FND) in the upper, middle and bottom layers, respectively. Three PLS models were established with FND as the dependent variable and vegetation indices at the corresponding angles as the explanatory variables. The influence of soil reflectance and non-photosynthetic canopy material was minimized by modifying seven vegetation indices with the ratio R700/R670. The estimation accuracy was significantly improved in the upper, middle and bottom layers in the modeling experiment. Independent model verification selected the best three vegetation indices for further research. The results showed that the modified green normalized difference vegetation index (GNDVI) performed better than the other vegetation indices at each layer, indicating that the modified GNDVI could be used to estimate the vertical distribution of winter wheat nitrogen.
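
    The abstract does not spell out how the R700/R670 ratio is combined with the indices; the sketch below assumes a simple product with GNDVI purely to illustrate the idea, with band reflectances passed as plain numbers or NumPy arrays.

```python
def gndvi(r_nir, r_green):
    """Green normalized difference vegetation index."""
    return (r_nir - r_green) / (r_nir + r_green)

def modified_gndvi(r_nir, r_green, r700, r670):
    """Illustrative soil/background adjustment with the R700/R670 ratio; the
    product form is an assumption, not the paper's stated formulation."""
    return gndvi(r_nir, r_green) * (r700 / r670)
```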

  12. Close-up view of RCA color television camera mounted on the LRV

    NASA Image and Video Library

    1972-04-23

    AS16-117-18754 (23 April 1972) --- A view of the smooth terrain in the general area of the North Ray Crater geological site, photographed by the Apollo 16 crew from the Lunar Roving Vehicle (LRV) shortly after leaving the immediate area of the geology site. The RCA color television camera is mounted on the front of the LRV and can be seen in the foreground, along with a small part of the high gain antenna, upper left. The tracks were made on the earlier trip to the North Ray Crater site. Astronaut Charles M. Duke Jr., lunar module pilot, exposed this view with his 70mm Hasselblad camera. Astronaut John W. Young, commander, said that this area was much smoother than the region around South Ray Crater. While astronauts Young and Duke descended in the Apollo 16 Lunar Module (LM) "Orion" to explore the Descartes highlands landing site on the moon, astronaut Thomas K. Mattingly II, command module pilot, remained with the Command and Service Modules (CSM) "Casper" in lunar orbit.

  13. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
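
    The Z-distance step can be illustrated with the standard pinhole relation Z = f·D/d; the mean iris diameter and the focal length below are assumed values, not the paper's calibration.

```python
def estimate_z_distance_mm(iris_diameter_px, focal_length_px,
                           mean_iris_diameter_mm=11.7):
    """Pinhole-model estimate of the eye-to-camera distance from the iris size
    in the WFOV image: Z = f * D / d. The 11.7 mm mean iris diameter is a
    commonly cited anthropometric value, used here as an assumption."""
    return focal_length_px * mean_iris_diameter_mm / iris_diameter_px

# Example: a 160-px iris with a 2800-px focal length gives
# 2800 * 11.7 / 160 ≈ 205 mm (all numbers illustrative).
```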

  14. View From Camera Not Used During Curiosity's First Six Months on Mars

    NASA Image and Video Library

    2017-12-08

    This view of Curiosity's left-front and left-center wheels and of marks made by wheels on the ground in the "Yellowknife Bay" area comes from one of six cameras used on Mars for the first time more than six months after the rover landed. The left Navigation Camera (Navcam) linked to Curiosity's B-side computer took this image during the 223rd Martian day, or sol, of Curiosity's work on Mars (March 22, 2013). The wheels are 20 inches (50 centimeters) in diameter. Curiosity carries a pair of main computers, redundant to each other, in order to have a backup available if one fails. Each of the computers, A-side and B-side, also has other redundant subsystems linked to just that computer. Curiosity operated on its A-side from before the August 2012 landing until Feb. 28, when engineers commanded a switch to the B-side in response to a memory glitch on the A-side. One set of activities after switching to the B-side computer has been to check the six engineering cameras that are hard-linked to that computer. The rover's science instruments, including five science cameras, can each be operated by either the A-side or B-side computer, whichever is active. However, each of Curiosity's 12 engineering cameras is linked to just one of the computers. The engineering cameras are the Navigation Camera (Navcam), the Front Hazard-Avoidance Camera (Front Hazcam) and Rear Hazard-Avoidance Camera (Rear Hazcam). Each of those three named cameras has four cameras as part of it: two stereo pairs of cameras, with one pair linked to each computer. Only the pairs linked to the active computer can be used, and the A-side computer was active from before landing, in August, until Feb. 28. All six of the B-side engineering cameras have been used during March 2013 and checked out OK.

  15. Phoenix Lander on Mars with Surrounding Terrain, Vertical Projection

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This view is a vertical projection that combines more than 500 exposures taken by the Surface Stereo Imager camera on NASA's Mars Phoenix Lander and projects them as if looking down from above.

    The black circle on the spacecraft is where the camera itself is mounted on the lander, out of view in images taken by the camera. North is toward the top of the image. The height of the lander's meteorology mast, extending toward the southwest, appears exaggerated because that mast is taller than the camera mast.

    This view in approximately true color covers an area about 30 meters by 30 meters (about 100 feet by 100 feet). The landing site is at 68.22 degrees north latitude, 234.25 degrees east longitude on Mars.

    The ground surface around the lander has polygonal patterning similar to patterns in permafrost areas on Earth.

    This view comprises more than 100 different Surface Stereo Imager pointings, with images taken through three different filters at each pointing. The images were taken throughout the period from the 13th Martian day, or sol, after landing to the 47th sol (June 5 through July 12, 2008). The lander's Robotic Arm is cut off in this mosaic view because component images were taken when the arm was out of the frame.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  16. Low earth orbiting Nadir Etalon Sounding Spectrometer instrument concept for temperature, moisture and trace species, LeoNESS

    NASA Technical Reports Server (NTRS)

    Kumer, J. B.; Sterritt, L. W.; Roche, A. E.; Rosenberg, W. J.; Morrow, H. E.; Shenk, W. E.; Susskind, J.

    1992-01-01

    A concept for a low earth orbiting nadir etalon spectrometer sounder (LeoNESS) is described which can achieve retrieval of temperature, H2O, surface, boundary conditions, cloudiness, and trace species with an accuracy that meets or exceeds the AIRS specifications. Options employing 65-K and 30-K detectors are examined; the former may be implemented via passive radiative cooling. The concept, which is derived from the Cryogenic Limb Array Etalon Spectrometer, has the potential for improving the horizontal and vertical resolution.

  17. Optimal viewing position in vertically and horizontally presented Japanese words.

    PubMed

    Kajii, N; Osaka, N

    2000-11-01

    In the present study, the optimal viewing position (OVP) phenomenon in Japanese Hiragana was investigated, with special reference to a comparison between the vertical and the horizontal meridians in the visual field. In the first experiment, word recognition scores were determined while the eyes were fixating predetermined locations in vertically and horizontally displayed words. Similar to what has been reported for Roman scripts, OVP curves, which were asymmetric with respect to the beginning of words, were observed in both conditions. However, this asymmetry was less pronounced for vertically than for horizontally displayed words. In the second experiment, the visibility of individual characters within strings was examined for the vertical and horizontal meridians. As for Roman characters, letter identification scores were better in the right than in the left visual field. However, identification scores did not differ between the upper and the lower sides of fixation along the vertical meridian. The results showed that the model proposed by Nazir, O'Regan, and Jacobs (1991) cannot entirely account for the OVP phenomenon. A model in which visual and lexical factors are combined is proposed instead.

  18. The PRo3D View Planner - interactive simulation of Mars rover camera views to optimise capturing parameters

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Ortner, Thomas; Hesina, Gerd; Barnes, Robert; Gupta, Sanjeev; Paar, Gerhard

    2017-04-01

    High resolution Digital Terrain Models (DTM) and Digital Outcrop Models (DOM) are highly useful for geological analysis and mission planning in planetary rover missions. PRo3D, developed as part of the EU-FP7 PRoViDE project, is a 3D viewer in which orbital DTMs and DOMs derived from rover stereo imagery can be rendered in a virtual environment for exploration and analysis. It allows fluent navigation over planetary surface models and provides a variety of measurement and annotation tools to complete an extensive geological interpretation. A key aspect of the image collection during planetary rover missions is determining the optimal viewing positions of rover instruments from different positions ('wide baseline stereo'). For the collection of high quality panoramas and stereo imagery the visibility of regions of interest from those positions, and the amount of common features shared by each stereo-pair, or image bundle is crucial. The creation of a highly accurate and reliable 3D surface, in the form of an Ordered Point Cloud (OPC), of the planetary surface, with a low rate of error and a minimum of artefacts, is greatly enhanced by using images that share a high amount of features and a sufficient overlap for wide baseline stereo or target selection. To support users in the selection of adequate viewpoints an interactive View Planner was integrated into PRo3D. The users choose from a set of different rovers and their respective instruments. PRo3D supports for instance the PanCam instrument of ESA's ExoMars 2020 rover mission or the Mastcam-Z camera of NASA's Mars2020 mission. The View Planner uses a DTM obtained from orbiter imagery, which can also be complemented with rover-derived DOMs as the mission progresses. The selected rover is placed onto a position on the terrain - interactively or using the current rover pose as known from the mission. The rover's base polygon and its local coordinate axes, and the chosen instrument's up- and forward vectors are

  19. 24. DETAIL VIEW OF COLUMN #072 DEVIATING FROM VERTICAL IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. DETAIL VIEW OF COLUMN #072 DEVIATING FROM VERTICAL IN ROW OF INTACT COLUMNS, LOOKING NORTHEAST TO SOUTHWEST. (NOTE BOLTED BLOCK SCABBED TO COLUMN AS JOIST/TRUSS SUPPORT) - Oakland Army Base, Transit Shed, East of Dunkirk Street & South of Burma Road, Oakland, Alameda County, CA

  20. Effects of camera location on the reconstruction of 3D flare trajectory with two cameras

    NASA Astrophysics Data System (ADS)

    Özsaraç, Seçkin; Yeşilkaya, Muhammed

    2015-05-01

    Flares are used as valuable electronic warfare assets for the battle against infrared-guided missiles. The trajectory of the flare is one of the most important factors that determine the effectiveness of the countermeasure. Reconstruction of the three-dimensional (3D) position of a point seen by multiple cameras is a common problem. Camera placement, camera calibration, corresponding-pixel determination between the images of different cameras, and the triangulation algorithm all affect the performance of 3D position estimation. In this paper, we specifically investigate the effects of camera placement on the flare trajectory estimation performance by simulations. First, the 3D trajectories of a flare and of the aircraft that dispenses it are generated with simple motion models. Then, we place two virtual ideal pinhole camera models at different locations. Assuming the cameras are tracking the aircraft perfectly, the view vectors of the cameras are computed. Afterwards, using the view vector of each camera and the 3D position of the flare, image-plane coordinates of the flare in both cameras are computed using the field-of-view (FOV) values. To increase the fidelity of the simulation, we use two sources of error. One models the uncertainties in the determination of the camera view vectors, i.e., the orientations of the cameras are measured with noise. The second noise source models the imperfections in determining the flare's corresponding pixels between the two cameras. Finally, the 3D position of the flare is estimated by triangulation using the corresponding pixel indices, the view vectors and the FOV of the cameras. All the processes mentioned so far are repeated for different relative camera placements so that the optimum estimation error performance is found for the given aircraft and flare trajectories.
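
    Triangulation from two tracked view vectors can be sketched with the generic midpoint method below; this is a standard formulation, not necessarily the exact algorithm used in the paper.

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation of two (noisy) viewing rays.
    c1, c2: camera centres (3-vectors); d1, d2: view vectors toward the target.
    Returns the point midway between the closest points on the two rays."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimizing |c1 + t1*d1 - (c2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    p1 = c1 + t1 * d1
    p2 = c2 + t2 * d2
    return 0.5 * (p1 + p2)
```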

  1. NADIR: A Flexible Archiving System Current Development

    NASA Astrophysics Data System (ADS)

    Knapic, C.; De Marco, M.; Smareglia, R.; Molinaro, M.

    2014-05-01

    The New Archiving Distributed InfrastructuRe (NADIR) is under development at the Italian center for Astronomical Archives (IA2) to improve the performance of the current archival software tools at the data center. Traditional archiving software usually offers simple and robust solutions for data archiving and distribution but is awkward to adapt and reuse in projects with different purposes. Evolution of the data in terms of data model, format, publication policy, versioning, and metadata content is the main threat to reuse. NADIR, built on stable and mature framework features, addresses these challenging issues. Its main characteristics are a configuration database; a multi-threading, multi-language environment (C++, Java, Python); special features that guarantee high scalability, modularity, robustness and error tracking; and tools to monitor with confidence the status of each project at each archiving site. In this contribution, the development of the core components is presented, with comments on performance and on innovative features (multicast and publisher-subscriber paradigms). NADIR is planned to be kept as simple as possible, with default configurations for every project, first of all for LBT and other IA2 projects.

  2. Detail view of the vertical stabilizer of the Orbiter Discovery ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of the vertical stabilizer of the Orbiter Discovery as it sits at Launch Complex 39 A at Kennedy Space Center being prepared for its launch. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  3. Evaluation of a stereoscopic camera-based three-dimensional viewing workstation for ophthalmic surgery.

    PubMed

    Bhadri, Prashant R; Rowley, Adrian P; Khurana, Rahul N; Deboer, Charles M; Kerns, Ralph M; Chong, Lawrence P; Humayun, Mark S

    2007-05-01

    To evaluate the effectiveness of a prototype stereoscopic camera-based viewing system (Digital Microsurgical Workstation, three-dimensional (3D) Vision Systems, Irvine, California, USA) for anterior and posterior segment ophthalmic surgery. Institution-based prospective study. Anterior and posterior segment surgeons performed designated standardized tasks on porcine eyes after training on prosthetic plastic eyes. Both anterior and posterior segment surgeons were able to complete tasks requiring minimal or moderate stereoscopic viewing. The results indicate that the system provides improved ergonomics. Improvements in key viewing performance areas would further enhance the value over a conventional operating microscope. The performance of the prototype system is not on par with the planned commercial system. With continued development of this technology, the three-dimensional system may become a novel viewing system in ophthalmic surgery with improved ergonomics relative to traditional microscopic viewing.

  4. Unmanned aerial system nadir reflectance and MODIS nadir BRDF-adjusted surface reflectances intercompared over Greenland

    NASA Astrophysics Data System (ADS)

    Faulkner Burkhart, John; Kylling, Arve; Schaaf, Crystal B.; Wang, Zhuosen; Bogren, Wiley; Storvold, Rune; Solbø, Stian; Pedersen, Christina A.; Gerland, Sebastian

    2017-07-01

    Albedo is a fundamental parameter in earth sciences, and many analyses utilize the Moderate Resolution Imaging Spectroradiometer (MODIS) bidirectional reflectance distribution function (BRDF)/albedo (MCD43) algorithms. While derivative albedo products have been evaluated over Greenland, we present a novel, direct comparison with nadir surface reflectance collected from an unmanned aerial system (UAS). The UAS was flown from Summit, Greenland, on 210 km transects coincident with the MODIS sensor overpass on board the Aqua and Terra satellites on 5 and 6 August 2010. Clear-sky acquisitions were available from the overpasses within 2 h of the UAS flights. The UAS was equipped with upward- and downward-looking spectrometers (300-920 nm) with a spectral resolution of 10 nm, allowing for direct integration into the MODIS bands 1, 3, and 4. The data provide a unique opportunity to directly compare UAS nadir reflectance with the MODIS nadir BRDF-adjusted surface reflectance (NBAR) products. The data show UAS measurements are slightly higher than the MODIS NBARs for all bands but agree within their stated uncertainties. Differences in variability are observed as expected due to different footprints of the platforms. The UAS data demonstrate potentially large sub-pixel variability of MODIS reflectance products and the potential to explore this variability using the UAS as a platform. It is also found that, even at the low elevations flown typically by a UAS, reflectance measurements may be influenced by haze if present at and/or below the flight altitude of the UAS. This impact could explain some differences between data from the two platforms and should be considered in any use of airborne platforms.
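
    The "direct integration into the MODIS bands 1, 3, and 4" can be approximated as a band average over nominal band ranges; the boxcar response below is an assumption standing in for the true relative spectral response functions.

```python
import numpy as np

# Nominal MODIS land-band ranges in nm (band: (low, high)); a boxcar response
# is used here instead of the true relative spectral response functions.
MODIS_BANDS_NM = {1: (620, 670), 3: (459, 479), 4: (545, 565)}

def band_average_reflectance(wavelengths_nm, reflectance, band):
    """Average the 10 nm resolution UAS spectrum over a nominal MODIS band."""
    lo, hi = MODIS_BANDS_NM[band]
    sel = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
    return float(np.mean(reflectance[sel]))

# Example with the sampling grid described above (300-920 nm, 10 nm step):
# wl = np.arange(300, 921, 10)
# r_band1 = band_average_reflectance(wl, measured_spectrum, 1)
```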

  5. Comparison of a single-view and a double-view aerosol optical depth retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Henderson, Bradley G.; Chylek, Petr

    2003-11-01

    We compare the results of a single-view and a double-view aerosol optical depth (AOD) retrieval algorithm applied to image pairs acquired over NASA Stennis Space Center, Mississippi. The image data were acquired by the Department of Energy's (DOE) Multispectral Thermal Imager (MTI), a pushbroom satellite imager with 15 bands from the visible to the thermal infrared. MTI has the ability to acquire imagery in pairs in which the first image is a near-nadir view and the second image is off-nadir with a zenith angle of approximately 60°. A total of 15 image pairs were used in the analysis. For a given image pair, AOD retrieval is performed twice---once using a single-view algorithm applied to the near-nadir image, then again using a double-view algorithm. Errors for both retrievals are computed by comparing the results to AERONET AOD measurements obtained at the same time and place. The single-view algorithm showed an RMS error about the mean of 0.076 in AOD units, whereas the double-view algorithm showed a modest improvement with an RMS error of 0.06. The single-view errors show a positive bias which is presumed to be a result of the empirical relationship used to determine ground reflectance in the visible. A plot of AOD error of the double-view algorithm versus time shows a noticeable trend which is interpreted to be a calibration drift. When this trend is removed, the RMS error of the double-view algorithm drops to 0.030. The single-view algorithm qualitatively appears to perform better during the spring and summer whereas the double-view algorithm seems to be less sensitive to season.

  6. The SALSA Project - High-End Aerial 3d Camera

    NASA Astrophysics Data System (ADS)

    Rüther-Kindel, W.; Brauchle, J.

    2013-08-01

    The ATISS measurement drone, developed at the University of Applied Sciences Wildau, is an electrically powered motor glider with a maximum take-off weight of 25 kg, including a payload capacity of 10 kg. Two 2.5 kW engines enable ultra-short take-off procedures, and the motor glider design results in a 1 h endurance. The concept of ATISS is based on the idea of strictly separating aircraft and payload functions, which makes ATISS a very flexible research platform for miscellaneous payloads. ATISS is equipped with an autopilot for autonomous flight patterns but remains under permanent pilot control from the ground. On the basis of ATISS the project SALSA was undertaken. The aim was to integrate a system for digital terrain modelling. Instead of a laser scanner, a new design concept was chosen based on two synchronized high-resolution digital cameras, one in a fixed nadir orientation and the other in an oblique orientation. Thus, images of every object on the ground are taken from different view angles. This new measurement camera system, MACS-TumbleCam, was developed at the German Aerospace Center DLR Berlin-Adlershof especially for the ATISS payload concept. A special advantage in comparison to laser scanning is that, instead of a cloud of points, a surface including texture is generated, and a high-end inertial orientation system can be omitted. The first test flights show a ground resolution of 2 cm and a height resolution of 3 cm, which underlines the extraordinary capabilities of ATISS and the MACS measurement camera system.

  7. 7. Close view of the lower portion of vertical sign ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Close view of the lower portion of vertical sign with the letters "A-G-O" after removal from theatre (Note: the steel I-beam was inserted and sheet metal side panels taken off to facilitate removal from theatre - Chicago Theater, 175 North State Street, Chicago, Cook County, IL

  8. VIEW OF PDP TANK TOP AT LEVEL 0’, WITH VERTICAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF PDP TANK TOP AT LEVEL 0’, WITH VERTICAL ELEMENTS IN BACKGROUND AND PART OF SHEAVE RACK ABOVE THE TANK, LOOKING NORTH - Physics Assembly Laboratory, Area A/M, Savannah River Site, Aiken, Aiken County, SC

  9. Geocam Space: Enhancing Handheld Digital Camera Imagery from the International Space Station for Research and Applications

    NASA Technical Reports Server (NTRS)

    Stefanov, William L.; Lee, Yeon Jin; Dille, Michael

    2016-01-01

    Handheld astronaut photography of the Earth has been collected from the International Space Station (ISS) since 2000, making it the most temporally extensive remotely sensed dataset from this unique Low Earth orbital platform. Exclusive use of digital handheld cameras to perform Earth observations from the ISS began in 2004. Nadir-viewing imagery is constrained by the inclined equatorial orbit of the ISS to between 51.6 degrees North and South latitude; however, numerous oblique images of land surfaces above these latitudes are included in the dataset. While unmodified commercial off-the-shelf digital cameras provide only visible-wavelength, three-band spectral information of limited quality, current cameras used with long (400+ mm) lenses can obtain high-quality spatial information approaching 2 meters/ground pixel resolution. The dataset is freely available online at the Gateway to Astronaut Photography of Earth site (http://eol.jsc.nasa.gov), and now comprises over 2 million images. Despite this extensive image catalog, use of the data for scientific research, disaster response, commercial applications and visualizations is minimal in comparison to other data collected from free-flying satellite platforms such as Landsat, Worldview, etc. This is due primarily to the lack of fully-georeferenced data products - while current digital cameras typically have integrated GPS, this does not function in the Low Earth Orbit environment. The Earth Science and Remote Sensing (ESRS) Unit at NASA Johnson Space Center provides training in Earth Science topics to ISS crews, performs daily operations and Earth observation target delivery to crews through the Crew Earth Observations (CEO) Facility on board ISS, and also catalogs digital handheld imagery acquired from orbit by manually adding descriptive metadata and determining an image geographic centerpoint using visual feature matching with other georeferenced data, e.g. Landsat, Google Earth, etc. The lack of full geolocation
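
    The centerpoint determination by visual feature matching could, in principle, be automated along the lines of the OpenCV sketch below; the reference image, its pixel-to-geographic transform, and all parameter values are assumptions for illustration, not the ESRS Unit's actual workflow.

```python
import cv2
import numpy as np

def locate_centerpoint(query_path, reference_path, ref_pixel_to_lonlat):
    """Match an astronaut photo against a georeferenced reference image and map
    the photo's centre into geographic coordinates. ref_pixel_to_lonlat is a
    caller-supplied function (e.g. built from a GeoTIFF affine transform) and
    is an assumption of this sketch."""
    q = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    r = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(4000)
    kq, dq = orb.detectAndCompute(q, None)
    kr, dr = orb.detectAndCompute(r, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(dq, dr)
    matches = sorted(matches, key=lambda m: m.distance)[:200]
    src = np.float32([kq[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kr[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    centre = np.float32([[[q.shape[1] / 2.0, q.shape[0] / 2.0]]])
    cx, cy = cv2.perspectiveTransform(centre, H)[0, 0]
    return ref_pixel_to_lonlat(cx, cy)
```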

  10. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

    This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  11. VIEW OF PDP TANK TOP, LEVEL 0’, WITH VERTICAL ELEMENTS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF PDP TANK TOP, LEVEL 0’, WITH VERTICAL ELEMENTS IN BACKGROUND, LTR TANK TOP ON LEFT, AND SHEAVE RACK ELEMENTS AT TOP, LOOKING NORTH - Physics Assembly Laboratory, Area A/M, Savannah River Site, Aiken, Aiken County, SC

  12. Space Station Camera Captures New Views of Hurricane Harvey

    NASA Image and Video Library

    2017-08-24

    The National Hurricane Center (NHC) upgraded the remnants of tropical storm Harvey to a tropical depression on August 23, 2017 at 11 a.m. EDT (1500 UTC). Harvey became better organized and was revived after moving from Mexico's Yucatan Peninsula into the Bay of Campeche. The warm waters of the Gulf of Mexico and favorable vertical wind shear promoted the regeneration of the tropical cyclone. This video includes views from The International Space Station recorded on August 24, 2017 at 6:15 p.m. Eastern Time.

  13. Use of a microscope-mounted wide-angle point of view camera to record optimal hand position in ocular surgery.

    PubMed

    Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K

    2014-07-01

    We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space.

  14. Distributed processing method for arbitrary view generation in camera sensor network

    NASA Astrophysics Data System (ADS)

    Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki

    2003-05-01

    A camera sensor network is a recently emerging type of network in which each sensor node can capture video signals, process them, and communicate them with other nodes. The processing task in this network is to generate an arbitrary view, which can be requested by a central node or a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up processing, we distribute the processing tasks among the nodes. In this method, each sensor node executes part of the interpolation algorithm to generate the interpolated image, with local communication between nodes. The processing task in the camera sensor network is ray-space interpolation, an object-independent method based on MSE minimization using adaptive filtering. Two methods were proposed for distributing the processing tasks, Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP), which share image data locally. Comparison of the proposed methods with the Centralized Processing (CP) method shows that PIS-DP has the highest processing speed after FIS-DP, and CP has the lowest processing speed. The communication rates of CP and PIS-DP are almost the same and better than that of FIS-DP. PIS-DP is therefore recommended because of its better overall performance than CP and FIS-DP.

  15. LabVIEW application for motion tracking using USB camera

    NASA Astrophysics Data System (ADS)

    Rob, R.; Tirian, G. O.; Panoiu, M.

    2017-05-01

    The technical state of the contact line and of the additional equipment in electric rail transport is very important for planning the repair and maintenance of the contact line. During operation, the pantograph's motion must stay within standard limits. This paper proposes a LabVIEW application that can track the motion of a laboratory pantograph in real time and acquire the tracking images. A USB webcam connected to a computer acquires the desired images. The laboratory pantograph contains an automatic system that simulates the real motion. The tracking parameters are the horizontal motion (zigzag) and the vertical motion, which can be studied in separate diagrams. The LabVIEW application requires the appropriate vision-development toolkits. The paper therefore describes the subroutines programmed specifically for real-time image acquisition and for data processing.
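
    The application itself is written in LabVIEW; as a language-neutral illustration of the same idea, the Python/OpenCV sketch below follows a bright marker and logs its horizontal (zigzag) and vertical position per frame. The bright-marker assumption and the threshold value are placeholders.

```python
import cv2

def track_marker(video_source=0):
    """Track a bright marker frame by frame and return its (x, y) trajectory,
    i.e. the horizontal and vertical positions over time."""
    cap = cv2.VideoCapture(video_source)
    trajectory = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            trajectory.append((cx, cy))  # horizontal and vertical position
    cap.release()
    return trajectory
```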

  16. Summer Harvest in Saratov, Russia

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Russia's Saratov Oblast (province) is located in the southeastern portion of the East-European plain, in the Lower Volga River Valley. Southern Russia produces roughly 40 percent of the country's total agricultural output, and Saratov Oblast is the largest producer of grain in the Volga region. Vegetation changes in the province's agricultural lands between spring and summer are apparent in these images acquired on May 31 and July 18, 2002 (upper and lower image panels, respectively) by the Multi-angle Imaging SpectroRadiometer (MISR).

    The left-hand panels are natural color views acquired by MISR's vertical-viewing (nadir) camera. Less vegetation and more earth tones (indicative of bare soils) are apparent in the summer image (lower left). Farmers in the region utilize staggered sowing to help stabilize yields, and a number of different stages of crop maturity can be observed. The main crop is spring wheat, cultivated under non-irrigated conditions. A short growing season and relatively low and variable rainfall are the major limitations to production. Saratov city is apparent as the light gray pixels on the left (west) bank of the Volga River. Riparian vegetation along the Volga exhibits dark green hues, with some new growth appearing in summer.

    The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree backward, nadir and 60-degree forward-viewing cameras displayed as red, green and blue respectively. In these images, color variations serve as a proxy for changes in angular reflectance, and the spring and summer views were processed identically to preserve relative variations in brightness between the two dates. Urban areas and vegetation along the Volga banks look similar in the two seasonal multi-angle composites. The agricultural areas, on the other hand, look strikingly different. This can be attributed to differences in brightness and texture between bare soil and vegetated land. The chestnut-colored soils in

  17. Towards new constraints on the impacts of fires on air quality and the nitrogen cycle: Extending the nadir satellite record of peroxyacetyl nitrate (PAN) with CrIS

    NASA Astrophysics Data System (ADS)

    Fischer, E. V.; Payne, V.; Kulawik, S. S.; Fu, D.

    2017-12-01

    Peroxyacetyl nitrate (PAN) plays a critical role both in atmospheric chemistry and in the redistribution of nitrogen in the troposphere. As a thermally unstable reservoir for nitrogen oxide radicals (NOx), PAN allows NOx to be transported large distances from the original source, thereby extending the range of air quality impacts from fires. Satellite measurements of PAN from the nadir-viewing Aura Tropospheric Emission Spectrometer (TES) have shown large enhancements in PAN associated with fires, and have recently been used to shed new light on the role of fires, PAN precursor emissions and dynamics in the global distribution of PAN and the long-range transport of ozone. TES PAN retrievals have also been used to explore interannual variability in PAN mixing ratios in the Western US. The Cross-track Infrared Sounder (CrIS) on S-NPP and the upcoming JPSS series provides a means to continue the satellite record of PAN from the nadir view, with increased spatial coverage. Retrievals of PAN from TES have so far relied on the PAN absorption feature centered at 1150 cm⁻¹, a spectral region not covered by CrIS. Our team has recently developed an approach that allows the use of another PAN spectral feature, centered at 790 cm⁻¹, in a spectral region that is covered by CrIS. Here, we apply this approach to CrIS spectra and compare the characteristics of the CrIS PAN retrievals, including vertical sensitivity and uncertainty estimates, with those of the TES PAN product. The CrIS PAN measurements can offer improved spatial coverage, extend the existing satellite PAN record and provide new opportunities for validation of satellite PAN retrievals.

  18. Ross Sea

    Atmospheric Science Data Center

    2013-04-16

    Icebergs in the Ross Sea. Two ... (MISR) nadir camera view of the Ross Ice Shelf and Ross Sea in Antarctica. The image was acquired on December 10, 2000 during Terra ...

  19. 34. VERTICAL AND TORSIONAL MOTION VIEWED FROM EAST TOWER, 7 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    34. VERTICAL AND TORSIONAL MOTION VIEWED FROM EAST TOWER, 7 NOVEMBER 1940, FROM 16MM FILM SHOT BY PROFESSOR F.B. FARQUHARSON, UNIVERSITY OF WASHINGTON. (LABORATORY STUDIES ON THE TACOMA NARROWS BRIDGE, AT UNIVERSITY OF WASHINGTON (SEATTLE: UNIVERSITY OF WASHINGTON, DEPARTMENT OF CIVIL ENGINEERING, 1941)) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  20. Accuracy Analysis for Automatic Orientation of a Tumbling Oblique Viewing Sensor System

    NASA Astrophysics Data System (ADS)

    Stebner, K.; Wieden, A.

    2014-03-01

    Dynamic camera systems with moving parts are difficult to handle in the photogrammetric workflow, because it is not ensured that the dynamics are constant over the recording period. Even minimal changes in the camera's orientation greatly influence the projection of oblique images. In this publication these effects - originating from the kinematic chain of a dynamic camera system - are analysed and validated. A member of the Modular Airborne Camera System family - MACS-TumbleCam - consisting of a vertical-viewing camera and a tumbling oblique camera was used for this investigation. The focus is on dynamic geometric modeling and the stability of the kinematic chain. To validate the experimental findings, the determined parameters are applied to the exterior orientation of an actual aerial image acquisition campaign using MACS-TumbleCam. The quality of the parameters is sufficient for direct georeferencing of oblique image data from the orientation information of a synchronously captured vertical image dataset. Relative accuracy for the oblique data set ranges from 1.5 pixels when using all images of the image block to 0.3 pixels when using only adjacent images.

  1. Opportunity's Surroundings After Sol 1820 Drive (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings during the 1,820th to 1,822nd Martian days, or sols, of Opportunity's surface mission (March 7 to 9, 2009).

    This view is presented as a vertical projection with geometric seam correction. North is at the top.

    The rover had driven 20.6 meters toward the northwest on Sol 1820 before beginning to take the frames in this view. Tracks from that drive recede southwestward. For scale, the distance between the parallel wheel tracks is about 1 meter (about 40 inches).

    The terrain in this portion of Mars' Meridiani Planum region includes dark-toned sand ripples and small exposures of lighter-toned bedrock.

  2. Opportunity's Surroundings on Sol 1687 (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this 360-degree view of the rover's surroundings on the 1,687th Martian day, or sol, of its surface mission (Oct. 22, 2008).

    Opportunity had driven 133 meters (436 feet) that sol, crossing sand ripples up to about 10 centimeters (4 inches) tall. The tracks visible in the foreground are in the east-northeast direction.

    Opportunity's position on Sol 1687 was about 300 meters southwest of Victoria Crater. The rover was beginning a long trek toward a much larger crater, Endeavour, about 12 kilometers (7 miles) to the southeast.

    This view is presented as a vertical projection with geometric seam correction.

  3. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

    Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be reviewed by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying videos from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of a video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view in a multi-camera system. The proposed method has been tested in a given scenario and has demonstrated its effectiveness with respect to other methods and to manually generated ground truth. The effectiveness has been evaluated in terms of the number of correct best views generated by the method with respect to the camera views manually generated by a human operator.
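    A minimal sketch of the selection idea (not the authors' attention model): score each camera's current frame with a saliency-like measure that combines a bottom-up cue (here, simple frame differencing as a stand-in for motion) and a per-camera top-down weight, then display the highest-scoring view.

```python
import numpy as np

# Illustrative best-view selection for a multi-camera system.
def select_best_view(frames, prev_frames, top_down_weights):
    scores = []
    for cur, prev, w in zip(frames, prev_frames, top_down_weights):
        bottom_up = np.mean(np.abs(cur.astype(float) - prev.astype(float)))  # crude motion cue
        scores.append(w * bottom_up)
    return int(np.argmax(scores)), scores

rng = np.random.default_rng(0)
prev = [rng.integers(0, 255, (120, 160)) for _ in range(4)]
cur = [p.copy() for p in prev]
cur[2] = np.clip(cur[2] + rng.integers(0, 60, cur[2].shape), 0, 255)  # activity on camera 2
best, scores = select_best_view(cur, prev, top_down_weights=[1.0, 1.0, 1.0, 1.0])
print("best view:", best)
```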

  4. Making limb and nadir measurements comparable: A common volume study of PMC brightness observed by Odin OSIRIS and AIM CIPS

    NASA Astrophysics Data System (ADS)

    Benze, Susanne; Gumbel, Jörg; Randall, Cora E.; Karlsson, Bodil; Hultgren, Kristoffer; Lumpe, Jerry D.; Baumgarten, Gerd

    2018-01-01

    Combining limb and nadir satellite observations of Polar Mesospheric Clouds (PMCs) has long been recognized as problematic due to differences in observation geometry, scattering conditions, and retrieval approaches. This study offers a method of comparing PMC brightness observations from the nadir-viewing Aeronomy of Ice in the Mesosphere (AIM) Cloud Imaging and Particle Size (CIPS) instrument and the limb-viewing Odin Optical Spectrograph and InfraRed Imaging System (OSIRIS). OSIRIS and CIPS measurements are made comparable by defining a common volume for overlapping OSIRIS and CIPS observations for two northern hemisphere (NH) PMC seasons: NH08 and NH09. We define a scattering intensity quantity that is suitable for either nadir or limb observations and for different scattering conditions. A known CIPS bias is applied, differences in instrument sensitivity are analyzed and taken into account, and effects of cloud inhomogeneity and common volume definition on the comparison are discussed. Not accounting for instrument sensitivity differences or inhomogeneities in the PMC field, the mean relative difference in cloud brightness (CIPS - OSIRIS) is -102 ± 55%. The differences are largest for coincidences with very inhomogeneous clouds that are dominated by pixels that CIPS reports as non-cloud points. Removing these coincidences, the mean relative difference in cloud brightness reduces to -6 ± 14%. The correlation coefficient between the CIPS and OSIRIS measurements of PMC brightness variations in space and time is remarkably high, at 0.94. Overall, the comparison shows excellent agreement despite different retrieval approaches and observation geometries.
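    The comparison statistics quoted above (mean relative difference and a correlation coefficient) can be illustrated with a short sketch. The normalization used here, the pair mean, is an assumption for illustration only; the study's exact definition may differ, and the values below are hypothetical.

```python
import numpy as np

# Illustrative comparison of common-volume cloud brightness pairs.
def compare_brightness(cips, osiris):
    cips, osiris = np.asarray(cips, float), np.asarray(osiris, float)
    rel_diff = 100.0 * (cips - osiris) / (0.5 * (cips + osiris))  # percent, pair-mean normalized
    r = np.corrcoef(cips, osiris)[0, 1]
    return rel_diff.mean(), rel_diff.std(), r

cips_vals = [12.0, 8.5, 20.1, 15.3, 9.9]      # hypothetical brightness values
osiris_vals = [13.1, 9.0, 21.5, 16.0, 10.4]
print(compare_brightness(cips_vals, osiris_vals))
```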

  5. PBF Reactor Building (PER620) under construction. Aerial view with camera ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620) under construction. Aerial view with camera facing northeast. Steel framework is exposed for west wing and high bay. Concrete block siding on east wing. Railroad crane set up on west side. Note trenches proceeding from front of building. Left trench is for secondary coolant and will lead to Cooling Tower. Shorter trench will contain cables leading to control area. Photographer: Larry Page. Date: March 22, 1967. INEEL negative no. 67-5025 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  6. ADM. Aerial view of administration area. Camera facing westerly. From ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ADM. Aerial view of administration area. Camera facing westerly. From left to right in foreground: Substation (TAN-605), Warehouse (TAN-628), Gate House (TAN-601), Administration Building (TAN-602). Left to right middle ground: Service Building (TAN-603), Warehouse (later known as Maintenance Shop or Craft Shop, TAN-604), Water Well Pump Houses, Fuel Tanks and Fuel Pump Houses, and Water Storage Tanks. Change House (TAN-606) on near side of berm. Large building beyond berm is A&M Building, TAN-607. Railroad tracks beyond lead from (unseen) turntable to the IET. Date: June 6, 1955. INEEL negative no. 13201 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  7. Tropospheric Ozone Near-Nadir-Viewing IR Spectral Sensitivity and Ozone Measurements from NAST-I

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L.; Larar, Allen M.

    2001-01-01

    Infrared ozone spectra from near nadir observations have provided atmospheric ozone information from the sensor to the Earth's surface. Simulations of the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I) from the NASA ER-2 aircraft (approximately 20 km altitude) with a spectral resolution of 0.25 cm-1 were used for sensitivity analysis. The spectral sensitivity of ozone retrievals to uncertainties in atmospheric temperature and water vapor is assessed in order to understand the relationship between the IR emissions and the atmospheric state. In addition, the sensitivity of ozone spectral radiance to ozone layer densities, together with the radiance weighting functions, reveals the limit of the ozone profile retrieval accuracy achievable from NAST-I measurements. Statistical retrievals of ozone with temperature and moisture retrievals from NAST-I spectra have been investigated and the preliminary results from NAST-I field campaigns are presented.

  8. Wetlands of the Gulf Coast

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This set of images from the Multi-angle Imaging SpectroRadiometer highlights coastal areas of four states along the Gulf of Mexico: Louisiana, Mississippi, Alabama and part of the Florida panhandle. The images were acquired on October 15, 2001 (Terra orbit 9718) and represent an area of 345 kilometers x 315 kilometers.

    The two smaller images on the right are (top) a natural color view comprised of red, green, and blue band data from MISR's nadir (vertical-viewing) camera, and (bottom) a false-color view comprised of near-infrared, red, and blue band data from the same camera. The predominantly red color of the false-color image is due to the presence of vegetation, which is bright at near-infrared wavelengths. Cities appear as grey patches, with New Orleans visible at the southern edge of Lake Pontchartrain, along the left-hand side of the images. The Lake Pontchartrain Bridge runs approximately north-south across the middle of the lake. The distinctive shape of the Mississippi River Delta can be seen to the southeast of New Orleans. Other coastal cities are visible east of the Mississippi, including Biloxi, Mobile and Pensacola.

    The large image is similar to the true-color nadir view, except that red band data from the 60-degree backward-looking camera has been substituted into the red channel; the blue and green data from the nadir camera have been preserved. In this visualization, green hues appear somewhat subdued, and a number of areas with a reddish color are present, particularly near the mouths of the Mississippi, Pascagoula, Mobile-Tensaw, and Escambia Rivers. Here, the red color is highlighting differences in surface texture. This combination of angular and spectral information differentiates areas with aquatic vegetation associated with poorly drained bottom lands, marshes, and/or estuaries from the surrounding surface vegetation. These wetland regions are not as well differentiated in the conventional nadir views.

    Variations in ocean color
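    The band-substitution visualization described above, in which the red channel is taken from an oblique camera while green and blue come from the nadir camera, can be sketched as a simple array operation. Array names here are assumptions standing in for co-registered MISR band images.

```python
import numpy as np

# Minimal sketch of an angular false-color composite: red from an oblique
# (60-degree backward-looking) camera, green and blue from the nadir camera.
def angular_composite(red_oblique, green_nadir, blue_nadir):
    stack = np.dstack([red_oblique, green_nadir, blue_nadir]).astype(float)
    stack /= stack.max()            # crude normalization for display only
    return stack

h, w = 4, 4                         # tiny synthetic stand-in images
red_60deg_back = np.full((h, w), 0.35)
green_nadir = np.full((h, w), 0.30)
blue_nadir = np.full((h, w), 0.25)
composite = angular_composite(red_60deg_back, green_nadir, blue_nadir)
print(composite.shape)
```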

  9. Detail view of the vertical stabilizer of the Orbiter Discovery ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of the vertical stabilizer of the Orbiter Discovery showing the thermal protection system components, with the white Advanced Flexible Reusable Surface Insulation (AFRSI) blanket and the black High-temperature Reusable Surface Insulation (HRSI) tiles along the outer edges. The marks seen on the HRSI tiles are injection point marks and holes for the application of waterproofing material. This view also provides a good detailed view of the two-piece rudder, which is used to control the yaw position of the orbiter on approach and landing in Earth's atmosphere; upon landing, the two-piece rudder splays open to both sides of the stabilizer to act as an air brake to help slow the craft to a stop. This view was taken from a service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  10. Characterization of Global Near-Nadir Backscatter for Remote Sensing Radar Design

    NASA Technical Reports Server (NTRS)

    Spencer, Michael W.; Long, David G.

    2000-01-01

    In order to evaluate side-lobe contamination from the near-nadir region for Ku-Band radars, a statistical characterization of global near-nadir backscatter is constructed. This characterization is performed for a variety of surface types using data from TRMM, Seasat, and Topex. An assessment of the relative calibration accuracy of these sensors is also presented.

  12. Design and Construction of an X-ray Lightning Camera

    NASA Astrophysics Data System (ADS)

    Schaal, M.; Dwyer, J. R.; Rassoul, H. K.; Uman, M. A.; Jordan, D. M.; Hill, J. D.

    2010-12-01

    A pinhole-type camera was designed and built for the purpose of producing high-speed images of the x-ray emissions from rocket-and-wire-triggered lightning. The camera consists of 30 7.62-cm diameter NaI(Tl) scintillation detectors, each sampling at 10 million frames per second. The steel structure of the camera is encased in 1.27-cm thick lead, which blocks x-rays that are less than 400 keV, except through a 7.62-cm diameter “pinhole” aperture located at the front of the camera. The lead and steel structure is covered in 0.16-cm thick aluminum to block RF noise, water and light. All together, the camera weighs about 550-kg and is approximately 1.2-m x 0.6-m x 0.6-m. The image plane, which is adjustable, was placed 32-cm behind the pinhole aperture, giving a field of view of about ±38° in both the vertical and horizontal directions. The elevation of the camera is adjustable between 0 and 50° from horizontal and the camera may be pointed in any azimuthal direction. In its current configuration, the camera’s angular resolution is about 14°. During the summer of 2010, the x-ray camera was located 44-m from the rocket-launch tower at the UF/Florida Tech International Center for Lightning Research and Testing (ICLRT) at Camp Blanding, FL and several rocket-triggered lightning flashes were observed. In this presentation, I will discuss the design, construction and operation of this x-ray camera.
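    The stated geometry can be checked with a few lines of arithmetic (illustrative only): the angular resolution is roughly the angle one 7.62-cm detector subtends at the pinhole, and the quoted field of view of about ±38° implies an image-plane half-extent consistent with the ~0.6 m enclosure.

```python
import math

# Quick consistency check of the stated pinhole-camera geometry.
d_pinhole_to_plane_cm = 32.0
det_diameter_cm = 7.62

# Angular resolution ~ angle subtended by one detector at the pinhole
ang_res_deg = 2.0 * math.degrees(math.atan(0.5 * det_diameter_cm / d_pinhole_to_plane_cm))

# The quoted +/-38 deg field of view implies an image-plane half-extent of
half_extent_cm = d_pinhole_to_plane_cm * math.tan(math.radians(38.0))

print(f"angular resolution ~ {ang_res_deg:.1f} deg")    # ~13.6 deg, consistent with ~14 deg
print(f"implied half-extent ~ {half_extent_cm:.0f} cm") # ~25 cm, fits the ~0.6 m enclosure
```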

  13. MISR Sees the Sierra Nevadas in Stereo

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These MISR images of the Sierra Nevada mountains near the California-Nevada border were acquired on August 12, 2000 during Terra orbit 3472. On the left is an image from the vertical-viewing (nadir) camera. On the right is a stereo 'anaglyph' created using the nadir and 45.6-degree forward-viewing cameras, providing a three-dimensional view of the scene when viewed with red/blue glasses. The red filter should be placed over your left eye. To facilitate the stereo viewing, the images have been oriented with north toward the left.

    Some prominent features are Mono Lake, in the center of the images; Walker Lake, to its left; and Lake Tahoe, near the lower left. This view of the Sierra Nevadas includes Yosemite, Kings Canyon, and Sequoia National Parks. Mount Whitney, the highest peak in the contiguous 48 states (elev. 14,495 feet), is visible near the righthand edge. Above it (to the east), the Owens Valley shows up prominently between the Sierra Nevada and Inyo ranges.

    Precipitation falling as rain or snow on the Sierras feeds numerous rivers flowing southwestward into the San Joaquin Valley. The abundant fields of this productive agricultural area can be seen along the lower right; a large number of reservoirs that supply water for crop irrigation are apparent in the western foothills of the Sierras. Urban areas in the valley appear as gray patches; among the California cities that are visible are Fresno, Merced, and Modesto.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  14. Memoris, A Wide Angle Camera For Bepicolombo

    NASA Astrophysics Data System (ADS)

    Cremonese, G.; Memoris Team

    In order to respond to the Announcement of Opportunity of ESA for the BepiColombo payload, we are working on a wide angle camera concept named MEMORIS (MErcury MOderate Resolution Imaging System). MEMORIS will perform stereoscopic imaging of the whole Mercury surface using two different channels at +/- 20 degrees from the nadir point. It will achieve a spatial resolution of 50 m per pixel at 400 km from the surface (peri-Herm), corresponding to a vertical resolution of about 75 m with the stereo performance. The scientific objectives to be addressed by MEMORIS may be identified as follows: estimate of surface age based on crater counting; crater morphology and degradation; stratigraphic sequence of geological units; identification of volcanic features and related deposits; origin of plain units from morphological observations; distribution and type of the tectonic structures; determination of relative age among the structures based on cross-cutting relationships; 3D tectonics; global mineralogical mapping of main geological units; and identification of weathering products. The last two items will come from the multispectral capabilities of the camera, utilizing 8 to 12 (TBD) broad band filters. MEMORIS will be equipped with a further channel devoted to observations of the tenuous exosphere. It will look at the limb on a given arc of the BepiColombo orbit; in so doing it will observe the exosphere above a surface latitude range of 25-75 degrees in the northern hemisphere. The exosphere images will be obtained above the surface just observed by the other two channels, trying to find possible relationships, as ground-based observations suggest. The exospheric channel will have four narrow-band filters centered on the sodium and potassium emissions and the adjacent continua.
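    The quoted vertical resolution can be roughly checked with a simplified symmetric-convergence stereo model (this is an order-of-magnitude illustration, not the MEMORIS error budget): a matching error of about one ground pixel maps to a height error of roughly the ground pixel divided by twice the tangent of the convergence half-angle.

```python
import math

# Rough stereo height-resolution estimate for two views at +/- alpha from nadir.
ground_pixel_m = 50.0
alpha_deg = 20.0
dh = ground_pixel_m / (2.0 * math.tan(math.radians(alpha_deg)))
print(f"dh ~ {dh:.0f} m")   # ~69 m, the same order as the ~75 m quoted above
```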

  15. Snowstorm Along the China-Mongolia-Russia Borders

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Heavy snowfall on March 12, 2004, across north China's Inner Mongolia Autonomous Region, Mongolia and Russia, caused train and highway traffic to stop for several days along the Russia-China border. This pair of images from the Multi-angle Imaging SpectroRadiometer (MISR) highlights the snow and surface properties across the region on March 13. The left-hand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The right-hand image is a multi-angle false-color view made from the red band data of the 46-degree aftward camera, the nadir camera, and the 46-degree forward camera.

    About midway between the frozen expanse of China's Hulun Nur Lake (along the right-hand edge of the images) and Russia's Torey Lakes (above image center) is a dark linear feature that corresponds with the China-Mongolia border. In the upper portion of the images, many small plumes of black smoke rise from coal and wood fires and blow toward the southeast over the frozen lakes and snow-covered grasslands. Along the upper left-hand portion of the images, in Russia's Yablonovyy mountain range and the Onon River Valley, the terrain becomes more hilly and forested. In the nadir image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the multi-angle composite, open-canopy forested areas are indicated by green hues. Since this is a multi-angle composite, the green color arises not from the color of the leaves but from the architecture of the surface cover. The green areas appear brighter at the nadir angle than at the oblique angles because more of the snow-covered surface in the gaps between the trees is visible. Color variations in the multi-angle composite also indicate angular reflectance properties for areas covered by snow and ice. The light blue color of the frozen lakes is due to the increased forward scattering of smooth ice, and light orange colors indicate

  16. Color and 3D views of the Sierra Nevada mountains

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A stereo 'anaglyph' created using the nadir and 45.6-degree forward-viewing cameras provides a three-dimensional view of the scene when viewed with red/blue glasses. The red filter should be placed over your left eye. To facilitate the stereo viewing, the images have been oriented with north toward the left. Some prominent features are Mono Lake, in the center of the image; Walker Lake, to its left; and Lake Tahoe, near the lower left. This view of the Sierra Nevadas includes Yosemite, Kings Canyon, and Sequoia National Parks. Mount Whitney, the highest peak in the contiguous 48 states (elev. 14,495 feet), is visible near the righthand edge. Above it (to the east), the Owens Valley shows up prominently between the Sierra Nevada and Inyo ranges. Precipitation falling as rain or snow on the Sierras feeds numerous rivers flowing southwestward into the San Joaquin Valley. The abundant fields of this productive agricultural area can be seen along the lower right; a large number of reservoirs that supply water for crop irrigation are apparent in the western foothills of the Sierras. Urban areas in the valley appear as gray patches; among the California cities that are visible are Fresno, Merced, and Modesto.

  17. Vertical viewing angle enhancement for the 360  degree integral-floating display using an anamorphic optic system.

    PubMed

    Erdenebat, Munkh-Uchral; Kwon, Ki-Chul; Yoo, Kwan-Hee; Baasantseren, Ganbat; Park, Jae-Hyeung; Kim, Eun-Soo; Kim, Nam

    2014-04-15

    We propose a 360 degree integral-floating display with an enhanced vertical viewing angle. The system projects two-dimensional elemental image arrays via a high-speed digital micromirror device projector and reconstructs them into 3D perspectives with a lens array. Double floating lenses relay the initial 3D perspectives to the center of a vertically curved convex mirror. The anamorphic optic system tailors the initial 3D perspectives horizontally and disperses the light rays more widely in the vertical direction. With the proposed method, the entire 3D image provides both monocular and binocular depth cues, a full-parallax demonstration with high angular ray density, and an enhanced vertical viewing angle.

  18. Photometric Characteristics of Sprites and Elves Derived from JEM-GLIMS Nadir Observations (Invited)

    NASA Astrophysics Data System (ADS)

    Sato, M.; Takahashi, Y.; Adachi, T.; Kobayashi, N.; Mihara, M.; Ushio, T.; Morimoto, T.; Suzuki, M.; Yamazaki, A.; Inan, U.; Linscott, I.

    2013-12-01

    The main goal of the JEM-GLIMS mission is to identify the horizontal structures of Transient Luminous Events (TLEs) and the spatiotemporal relationship between TLEs and their parent lightning discharges based on the nadir observations from the International Space Station (ISS). For this purpose JEM-GLIMS is equipped with two sets of optical instruments (LSI: CMOS camera, and PH: spectrophotometers) and two sets of radio wave receivers (VLFR: VLF receiver, and VITF: VHF interferometer). As all these instruments are installed at the bottom plane of the bus module facing the Earth, JEM-GLIMS can carry out the nadir observations continuously. JEM-GLIMS was launched by HTV3 and was successfully installed at the exposed facility of the Japanese Experiment Module (JEM) on August 9, 2012. After the initial checkout operations, JEM-GLIMS finally started continuous observations on November 20, 2012. In the period from November 20, 2012 to June 30, 2013, a total of 1,597 transient optical events related to lightning flashes and/or TLE emissions were detected by the optical instruments. In 578 of these events, both LSI and PH detected clear transient optical signals well above the noise level. In order to derive sprite events from the detected transient optical events, we analyzed PH light-curve data first and estimated the peak irradiance related to the transient optical flashes. Then, we compared these intensities with the atmospheric transmittance. Finally, LSI image data are examined to clarify the morphological properties of the optical emission. We analyzed a transient optical event detected at 00:56:29.198 UT on December 15, 2012. The peak intensities of PH channels are estimated to be 1.4E-2 W/m2 (150-280 nm), 2.3E-4 W/m2 (316 nm), 5.9E-4 W/m2 (337 nm), 4.0E-4 W/m2 (392 nm), 4.2E-4 W/m2 (762 nm), and 6.3E-2 W/m2 (600-900 nm), respectively. It is found that all these intensities are significantly stronger than the lightning emission affected by the atmospheric transmittance. This fact

  19. 30. VERTICAL AERIAL VIEW OF THE MOUTH OF THE FEDERAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. VERTICAL AERIAL VIEW OF THE MOUTH OF THE FEDERAL CHANNEL, SCALE 1:14,400. TO THE SOUTH OF THE CHANNEL ARE THE RUNWAYS OF THE FORMER ALAMEDA NAVAL AIR STATION; TO THE NORTH ARE THE BERTHS AND BUILDINGS OF THE FORMER NAVAL SUPPLY CENTER, OAKLAND. Date and time of photography: "12-9-98 10:51." - Oakland Harbor Training Walls, Mouth of Federal Channel to Inner Harbor, Oakland, Alameda County, CA

  20. Detail view of the lower portion of the vertical stabilizer ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of the lower portion of the vertical stabilizer of the Orbiter Discovery. The section below the rudder, often referred to as the "stinger", is used to house the orbiter drag chute assembly. The system consisted of a mortar deployed pilot chute, the main drag chute, a controller assembly and an attach/jettison mechanism. This system was a modification to the original design of the Orbiter Discovery to safely reduce the roll to stop distance without adversely affecting the vehicle handling qualities. This view was taken from a service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  1. 33. VERTICAL AND TORSIONAL OSCILLATIONS, 3/4 VIEW, 7 NOVEMBER 1940, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. VERTICAL AND TORSIONAL OSCILLATIONS, 3/4 VIEW, 7 NOVEMBER 1940, FROM 16MM FILM SHOT BY PROFESSOR F.B. FARQUHARSON, UNIVERSITY OF WASHINGTON. (LABORATORY STUDIES ON THE TACOMA NARROWS BRIDGE, AT UNIVERSITY OF WASHINGTON (SEATTLE: UNIVERSITY OF WASHINGTON, DEPARTMENT OF CIVIL ENGINEERING, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  2. Global calibration of multi-cameras with non-overlapping fields of view based on photogrammetry and reconfigurable target

    NASA Astrophysics Data System (ADS)

    Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling

    2018-06-01

    Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because these systems have larger fields of view (FOV) than a single camera. Multiple cameras may have no or narrow overlapping FOVs in many applications, which pose a huge challenge to global calibration. This paper presents a global calibration method for multi-cameras without overlapping FOVs based on photogrammetry technology and a reconfigurable target. Firstly, two planar targets are fixed together and made into a long target according to the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of target feature points in the two cameras’ coordinate systems are calculated at the same time and optimized by the Levenberg–Marquardt algorithm to find the optimal solution of the transformation matrix between the two cameras. Finally, all the camera coordinate systems are converted to the reference coordinate system in order to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
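    The optimization step can be sketched compactly. The example below is simplified relative to the paper: it minimizes 3D point residuals rather than image reprojection errors, and all names and values are illustrative. It solves for the rigid transform between two camera frames with a Levenberg-Marquardt refinement, the same class of optimization described above.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Simplified sketch: estimate the rigid transform camera2 -> camera1 so that
# P1 ~ R @ P2 + t, refined with Levenberg-Marquardt.
def fit_relative_pose(P1, P2):
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        return ((R @ P2.T).T + t - P1).ravel()
    sol = least_squares(residuals, x0=np.zeros(6), method="lm")
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]

# Synthetic check: a known transform is recovered from slightly noisy target points.
rng = np.random.default_rng(1)
P2 = rng.uniform(-1.0, 1.0, (20, 3))
R_true = Rotation.from_euler("xyz", [5, -3, 10], degrees=True).as_matrix()
t_true = np.array([2.0, 0.1, -0.3])
P1 = (R_true @ P2.T).T + t_true + rng.normal(0, 1e-4, P2.shape)
R_est, t_est = fit_relative_pose(P1, P2)
print(np.round(t_est, 3))
```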

  3. Radiometric stability of the Multi-angle Imaging SpectroRadiometer (MISR) following 15 years on-orbit

    NASA Astrophysics Data System (ADS)

    Bruegge, Carol J.; Val, Sebastian; Diner, David J.; Jovanovic, Veljko; Gray, Ellyn; Di Girolamo, Larry; Zhao, Guangyu

    2014-09-01

    The Multi-angle Imaging SpectroRadiometer (MISR) has successfully operated on the EOS/Terra spacecraft since 1999. It consists of nine cameras pointing from nadir to 70.5° view angle with four spectral channels per camera. Specifications call for a radiometric uncertainty of 3% absolute and 1% relative to the other cameras. To accomplish this, MISR utilizes an on-board calibrator (OBC) to measure camera response changes. Once every two months the two Spectralon panels are deployed to direct sunlight into the cameras. Six photodiode sets measure the illumination levels, which are compared to MISR raw digital numbers, thus determining the radiometric gain coefficients used in Level 1 data processing. Although panel stability is not required, there has been little detectable change in panel reflectance, attributed to careful preflight handling techniques. The cameras themselves have degraded in radiometric response by 10% since launch, but calibration updates using the detector-based scheme have compensated for these drifts and allowed the radiance products to meet accuracy requirements. Validation using Sahara desert observations shows that there has been a drift of ~1% in the reported nadir-view radiance over a decade, common to all spectral bands.
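    A minimal sketch of the detector-based idea (not the MISR Level 1 code, and with hypothetical numbers): the radiometric gain is the slope of raw digital numbers against the radiance measured by the on-board calibrator photodiodes, and tracking that slope over successive calibrations quantifies the response drift that is compensated in processing.

```python
import numpy as np

# Gain coefficient as the slope of DN versus calibrator-measured radiance.
def radiometric_gain(dn, radiance):
    gain, offset = np.polyfit(radiance, dn, 1)   # DN = gain * L + offset
    return gain, offset

radiance = np.array([0.0, 50.0, 100.0, 200.0, 400.0])   # hypothetical calibration levels
dn_launch = np.array([20.0, 270.0, 520.0, 1020.0, 2020.0])
dn_later = dn_launch * 0.9 + 2.0                         # ~10% response degradation
g0, _ = radiometric_gain(dn_launch, radiance)
g1, _ = radiometric_gain(dn_later, radiance)
print(f"response change: {100 * (g1 / g0 - 1):.1f}%")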

  4. Tinder Fire in Arizona Viewed by NASA's MISR

    NASA Image and Video Library

    2018-05-02

    On April 27, 2018, the Tinder Fire ignited in eastern Arizona near the Blue Ridge Reservoir, about 50 miles (80 kilometers) southeast of Flagstaff and 20 miles (32 kilometers) northeast of Payson. During the first 24 hours it remained relatively small at 500 acres (202 hectares), but on April 29, during red flag wind conditions, it exploded to 8,600 acres (3,480 hectares). Residents of rural communities in the area were forced to evacuate and an unknown number of structures were burned. As of April 30, the Tinder Fire had burned a total of 11,400 acres (4,613 hectares). On April 30 at 11:15 a.m. local time, the Multi-angle Imaging SpectroRadiometer (MISR) captured imagery of the Tinder Fire as it passed overhead on NASA's Terra satellite. The MISR instrument has nine cameras that view Earth at different angles. This image shows the view from MISR's nadir (downward-pointing) camera. The angular information from MISR's images is used to calculate the height of the smoke plume, results of which are superimposed on the right-hand image. This shows that the plume top near the active fire was at approximately 13,000 feet altitude (4,000 meters). In general, higher-altitude plumes transport smoke greater distances from the source, impacting communities downwind. A stereo anaglyph providing a three-dimensional view of the plume is also shown. Red-blue glasses with the red lens placed over your left eye are required to observe the 3D effect. These data were acquired during Terra orbit 97691. An annotated figure and anaglyph are available at https://photojournal.jpl.nasa.gov/catalog/PIA00698

  5. Driving techniques for high frame rate CCD camera

    NASA Astrophysics Data System (ADS)

    Guo, Weiqiang; Jin, Longxu; Xiong, Jingwu

    2008-03-01

    This paper describes a high-frame-rate CCD camera capable of operating at 100 frames/s. The camera utilizes the Kodak KAI-0340, an interline transfer CCD with 640 (vertical) × 480 (horizontal) pixels. Two output ports are used to read out the CCD data at pixel rates approaching 30 MHz. Because of the reduced effective opacity of its vertical charge transfer registers, an interline transfer CCD can produce undesired image artifacts, such as random white spots and smear generated in the registers. To increase the frame rate, a speed-up structure is incorporated inside the KAI-0340, which makes it vulnerable to a vertical stripe effect. The phenomena mentioned above may severely impair the image quality. To solve these problems, several electronic methods of eliminating these artifacts are adopted. A special clocking mode dumps the unwanted charge quickly, and the fast readout of the images, cleared of smear, follows immediately. An amplifier is used to sense and correct the delay mismatch between the dual-phase vertical clock pulses so that the transition edges become nearly coincident and the vertical stripes disappear. Results obtained with the CCD camera are shown.

  6. Bidirectional measurements of surface reflectance for view angle corrections of oblique imagery

    NASA Technical Reports Server (NTRS)

    Jackson, R. D.; Teillet, P. M.; Slater, P. N.; Fedosejevs, G.; Jasinski, Michael F.

    1990-01-01

    An apparatus for acquiring bidirectional reflectance-factor data was constructed and used over four surface types. Data sets were obtained over a headed wheat canopy, bare soil having several different roughness conditions, playa (dry lake bed), and gypsum sand. Results are presented in terms of relative bidirectional reflectance factors (BRFs) as a function of view angle at a number of solar zenith angles, nadir BRFs as a function of solar zenith angles, and, for wheat, vegetation indices as related to view and solar zenith angles. The wheat canopy exhibited the largest BRF changes with view angle. BRFs for the red and the NIR bands measured over wheat did not have the same relationship with view angle. NIR/Red ratios calculated from nadir BRFs changed by nearly a factor of 2 when the solar zenith angle changed from 20 to 50 degs. BRF versus view angle relationships were similar for soils having smooth and intermediate rough surfaces but were considerably different for the roughest surface. Nadir BRF versus solar-zenith angle relationships were distinctly different for the three soil roughness levels. Of the various surfaces, BRFs for gypsum sand changed the least with view angle (10 percent at 30 degs).
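    The two vegetation indices referred to above can be written out directly from the bidirectional reflectance factors. The sketch below uses hypothetical BRF values purely to illustrate the formulas.

```python
import numpy as np

# NIR/Red ratio and normalized difference computed from BRFs (illustrative values).
def nir_red_ratio(nir, red):
    return nir / red

def normalized_difference(nir, red):
    return (nir - red) / (nir + red)

red_brf = np.array([0.06, 0.05])    # e.g., nadir BRFs at two solar zenith angles
nir_brf = np.array([0.42, 0.55])
print(nir_red_ratio(nir_brf, red_brf))          # the ratio can change strongly with sun angle
print(normalized_difference(nir_brf, red_brf))  # the normalized difference is more stable
```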

  7. Oblique Aerial Photography Tool for Building Inspection and Damage Assessment

    NASA Astrophysics Data System (ADS)

    Murtiyoso, A.; Remondino, F.; Rupnik, E.; Nex, F.; Grussenmeyer, P.

    2014-11-01

    Aerial photography has a long history of being employed for mapping purposes due to some of its main advantages, including large-area imaging from above and minimization of field work. In recent years multi-camera aerial systems have become a practical sensor technology across a growing geospatial market, complementary to the traditional vertical views. Multi-camera aerial systems capture not only the conventional nadir views, but also tilted images at the same time. In this paper, a particular use of such imagery in the field of building inspection as well as disaster assessment is addressed. The main idea is to inspect a building from the four cardinal directions by using monoplotting functionalities. The developed application allows the user to measure building heights and distances and to digitize man-made structures, creating 3D surfaces and building models. The realized GUI is capable of identifying a building from several oblique points of view, as well as calculating the approximate height of buildings and ground distances and performing basic vectorization. The geometric accuracy of the results remains a function of several parameters, namely image resolution, quality of available parameters (DEM, calibration and orientation values), user expertise and measuring capability.

  8. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
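    The underlying relation for the parallel-plane case is the pinhole similar-triangles formula: distance = focal length × object size / image size. The sketch below is a minimal illustration with hypothetical camera parameters, not the paper's full method (which also handles tilted object planes).

```python
# Object distance from the pinhole model when the object plane is parallel to the image plane.
def object_distance_m(focal_length_mm, object_height_m, image_height_px, pixel_pitch_mm):
    image_height_mm = image_height_px * pixel_pitch_mm
    return (focal_length_mm * 1e-3) * object_height_m / (image_height_mm * 1e-3)

# Hypothetical numbers: 50 mm lens, 1.8 m tall object spanning 600 px at 5 um pixel pitch
print(f"{object_distance_m(50.0, 1.8, 600, 0.005):.1f} m")   # -> 30.0 m
```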

  9. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

    A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
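    The hand-off step can be sketched in a few lines: map the target's pixel coordinates in the wide-field camera to azimuth/elevation through a pre-measured calibration, then use that angle pair as the initial pointing command for the gimbal. The affine calibration model, names, and numbers below are assumptions for illustration, not the described apparatus.

```python
import numpy as np

# Pixel-to-angle mapping via a simple (assumed) affine calibration, then gimbal pointing.
def pixel_to_az_el(px, py, cal):
    """cal = [az0, el0, daz_dx, daz_dy, del_dx, del_dy] in degrees and degrees/pixel."""
    az = cal[0] + cal[2] * px + cal[3] * py
    el = cal[1] + cal[4] * px + cal[5] * py
    return az, el

def point_gimbal(az, el):
    # Placeholder for the actual gimbal command interface
    print(f"slewing gimbal to az={az:.2f} deg, el={el:.2f} deg")

cal = np.array([0.0, 0.0, 0.09, 0.0, 0.0, 0.09])   # hypothetical ~0.09 deg/pixel
target_px, target_py = 812.0, 455.0                 # centroid of the luminous target
point_gimbal(*pixel_to_az_el(target_px, target_py, cal))
```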

  10. Burn Scar Near the Hanford Nuclear Reservation

    NASA Technical Reports Server (NTRS)

    2002-01-01

    This Multi-angle Imaging Spectroradiometer (MISR) image pair shows 'before and after' views of the area around the Hanford Nuclear Reservation near Richland, Washington. On June 27, 2000, a fire in the dry sagebrush was sparked by an automobile crash. The flames were fanned by hot summer winds. By the day after the accident, about 100,000 acres had burned, and the fire's spread forced the closure of highways and loss of homes. These images were obtained by MISR's vertical-viewing (nadir) camera. Compare the area just above and to the right of the line of cumulus clouds in the May 15 image with the same area imaged on August 3. The darkened burn scar measures approximately 35 kilometers across. The Columbia River is seen wending its way around Hanford. Image courtesy NASA/GSFC/JPL, MISR Science Team

  11. Adaptive strategies of remote systems operators exposed to perturbed camera-viewing conditions

    NASA Technical Reports Server (NTRS)

    Stuart, Mark A.; Manahan, Meera K.; Bierschwale, John M.; Sampaio, Carlos E.; Legendre, A. J.

    1991-01-01

    This report describes a preliminary investigation of the use of perturbed visual feedback during the performance of simulated space-based remote manipulation tasks. The primary objective of this NASA evaluation was to determine to what extent operators exhibit adaptive strategies which allow them to perform these specific types of remote manipulation tasks more efficiently while exposed to perturbed visual feedback. A secondary objective of this evaluation was to establish a set of preliminary guidelines for enhancing remote manipulation performance and reducing the adverse effects. These objectives were accomplished by studying the remote manipulator performance of test subjects exposed to various perturbed camera-viewing conditions while performing a simulated space-based remote manipulation task. Statistical analysis of performance and subjective data revealed that remote manipulation performance was adversely affected by the use of perturbed visual feedback and performance tended to improve with successive trials in most perturbed viewing conditions.

  12. Analytic evaluation of the weighting functions for remote sensing of blackbody planetary atmospheres : the case of limb viewing geometry

    NASA Technical Reports Server (NTRS)

    Ustinov, Eugene A.

    2006-01-01

    In a recent publication (Ustinov, 2002), we proposed an analytic approach to the evaluation of radiative and geophysical weighting functions for remote sensing of a blackbody planetary atmosphere, based on a general linearization approach applied to the case of nadir viewing geometry. In this presentation, the general linearization approach is applied to the limb viewing geometry. Expressions similar to those obtained in (Ustinov, 2002) are derived for weighting functions with respect to the distance along the line of sight. These expressions are then converted to expressions for weighting functions with respect to the vertical coordinate in the atmosphere. Finally, the numerical representation of the weighting functions, in the form of matrices of partial derivatives of grid limb radiances with respect to the grid values of atmospheric parameters, is used for convolution with the finite field of view of the instrument.

  13. Vertical view Apollo 16 Descartes landing sites as photographed by Apollo 14

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An almost vertical view of the Apollo 16 Descartes landing sites as photographed from the Apollo 14 spacecraft. Overlays are provided to point out extravehicular activity (EVA) and Lunar Roving Vehicle (LRV) traverse routes and the nicknames of features. The Roman numerals indicate the EVA numbers and the Arabic numbers point out stations or traverse stops.

  14. 32. VERTICAL OSCILLATIONS, 3/4 VIEW, 7 NOVEMBER 1940, FROM 16MM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. VERTICAL OSCILLATIONS, 3/4 VIEW, 7 NOVEMBER 1940, FROM 16MM FILM SHOT BY PROFESSOR F.B. FARQUHARSON, UNIVERSITY OF WASHINGTON. (LABORATORY STUDIES ON THE TACOMA NARROWS BRIDGE, AT UNIVERSITY OF WASHINGTON (SEATTLE: UNIVERSITY OF WASHINGTON, DEPARTMENT OF CIVIL ENGINEERING, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  15. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  16. Researches on hazard avoidance cameras calibration of Lunar Rover

    NASA Astrophysics Data System (ADS)

    Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong

    2017-11-01

    China's Lunar Lander and Rover will be launched in 2013. They will accomplish the mission objectives of lunar soft landing and patrol exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance. Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion, and the acquired images are quite warped, so conventional camera calibration algorithms no longer work well. A photogrammetric calibration method for the geometric model of this type of fish-eye optics is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the accurate calibration model is formulated with radial symmetric distortion and decentering distortion as well as parameters to model affinity and shear, based on the fisheye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
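    A minimal sketch of an f-theta (equidistant) fisheye projection with a simple symmetric radial-distortion polynomial is given below. The coefficients, pixel pitch, and polynomial form are illustrative assumptions, not the calibrated Hazcam model: the image radius grows roughly linearly with the off-axis angle, r = f·theta, plus odd polynomial corrections.

```python
import numpy as np

# Illustrative f-theta fisheye projection with symmetric radial distortion.
def ftheta_project(X_cam, f_mm, k1=0.0, k2=0.0, pixel_pitch_mm=0.01, cx=512, cy=512):
    X_cam = np.asarray(X_cam, float)
    theta = np.arctan2(np.hypot(X_cam[0], X_cam[1]), X_cam[2])   # angle off the optical axis
    r = f_mm * (theta + k1 * theta**3 + k2 * theta**5)           # distorted image radius (mm)
    phi = np.arctan2(X_cam[1], X_cam[0])
    u = cx + (r * np.cos(phi)) / pixel_pitch_mm
    v = cy + (r * np.sin(phi)) / pixel_pitch_mm
    return u, v

# A point 60 degrees off-axis still lands on the detector with an f-theta lens.
print(ftheta_project([np.tan(np.radians(60.0)), 0.0, 1.0], f_mm=4.0, k1=-0.01))
```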

  17. Potential of remote sensing of cirrus optical thickness by airborne spectral radiance measurements at different sideward viewing angles

    NASA Astrophysics Data System (ADS)

    Wolf, Kevin; Ehrlich, André; Hüneke, Tilman; Pfeilsticker, Klaus; Werner, Frank; Wirth, Martin; Wendisch, Manfred

    2017-03-01

    Spectral radiance measurements collected in nadir and sideward viewing directions by two airborne passive solar remote sensing instruments, the Spectral Modular Airborne Radiation measurement sysTem (SMART) and the Differential Optical Absorption Spectrometer (mini-DOAS), are used to compare the remote sensing results of cirrus optical thickness τ. The comparison is based on a sensitivity study using radiative transfer simulations (RTS) and on data obtained during three airborne field campaigns: the North Atlantic Rainfall VALidation (NARVAL) mission, the Mid-Latitude Cirrus Experiment (ML-CIRRUS) and the Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems (ACRIDICON) campaign. Radiative transfer simulations are used to quantify the sensitivity of measured upward radiance I with respect to τ, ice crystal effective radius reff, viewing angle of the sensor θV, spectral surface albedo α, and ice crystal shape. From the calculations it is concluded that sideward viewing measurements are generally better suited than radiance data from the nadir direction to retrieve τ of optically thin cirrus, especially at wavelengths larger than λ = 900 nm. Using sideward instead of nadir-directed spectral radiance measurements significantly improves the sensitivity and accuracy in retrieving τ, in particular for optically thin cirrus of τ ≤ 2. The comparison of retrievals of τ based on nadir and sideward viewing radiance measurements from SMART, mini-DOAS and independent estimates of τ from an additional active remote sensing instrument, the Water Vapor Lidar Experiment in Space (WALES), shows general agreement within the range of measurement uncertainties. For the selected example a mean τ of 0.54 ± 0.2 is derived from SMART, and 0.49 ± 0.2 by mini-DOAS nadir channels, while WALES obtained a mean value of τ = 0.32 ± 0.02 at 532 nm wavelength. The mean of τ derived from the sideward viewing mini

  18. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  19. Wind-Sculpted Vicinity After Opportunity's Sol 1797 Drive (Vertical)

    NASA Technical Reports Server (NTRS)

    2009-01-01

    NASA's Mars Exploration Rover Opportunity used its navigation camera to take the images combined into this full-circle view of the rover's surroundings just after driving 111 meters (364 feet) on the 1,797th Martian day, or sol, of Opportunity's surface mission (Feb. 12, 2009). North is at the center; south at both ends.

    Tracks from the drive recede northward across dark-toned sand ripples in the Meridiani Planum region of Mars. Patches of lighter-toned bedrock are visible on the left and right sides of the image. For scale, the distance between the parallel wheel tracks is about 1 meter (about 40 inches).

    This view is presented as a vertical projection with geometric seam correction.

  20. 360 deg Camera Head for Unmanned Sea Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Townsend, Julie A.; Kulczycki, Eric A.; Willson, Reginald G.; Huntsberger, Terrance L.; Garrett, Michael S.; Trebi-Ollennu, Ashitey; Bergh, Charles F.

    2012-01-01

    The 360 camera head consists of a set of six color cameras arranged in a circular pattern such that their overlapping fields of view give a full 360 view of the immediate surroundings. The cameras are enclosed in a watertight container along with support electronics and a power distribution system. Each camera views the world through a watertight porthole. To prevent overheating or condensation in extreme weather conditions, the watertight container is also equipped with an electrical cooling unit and a pair of internal fans for circulation.

  1. Analysis of Crosstalk in 3D Circularly Polarized LCDs Depending on the Vertical Viewing Location.

    PubMed

    Zeng, Menglin; Nguyen, Truong Q

    2016-03-01

    Crosstalk in circularly polarized (CP) liquid crystal display (LCD) with polarized glasses (passive 3D glasses) is mainly caused by two factors: 1) the polarizing system including wave retarders and 2) the vertical misalignment (VM) of light between the LC module and the patterned retarder. We show that the latter, which is highly dependent on the vertical viewing location, is a much more significant factor of crosstalk in CP LCD than the former. There are three contributions in this paper. Initially, a display model for CP LCD, which accurately characterizes VM, is proposed, together with a novel display calibration method for the VM characterization that only requires pictures of the screen taken at four viewing locations. In addition, we prove that the VM-based crosstalk cannot be efficiently reduced by either preprocessing the input images or optimizing the polarizing system. Furthermore, we derive the analytic solution for the viewing zone, where the entire screen does not have the VM-based crosstalk.

  2. After Conquering 'Husband Hill,' Spirit Moves On (Vertical)

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The first explorer ever to scale a summit on another planet, NASA's Mars Exploration Rover Spirit has begun a long trek downward from the top of 'Husband Hill' to new destinations. As shown in this 180-degree panorama from east of the summit, Spirit's earlier tracks are no longer visible. They are off to the west (to the left in this view). Spirit's next destination is 'Haskin Ridge,' straight ahead along the edge of the steep cliff on the right side of this panorama.

    The scene is a mosaic of images that Spirit took with the navigation camera on the rover's 635th Martian day, or sol, (Oct. 16, 2005) of exploration of Gusev Crater on Mars. This view is presented in a vertical projection with geometric seam correction.

  3. Constraining the physical properties of Titan's empty lake basins using nadir and off-nadir Cassini RADAR backscatter

    NASA Astrophysics Data System (ADS)

    Michaelides, R. J.; Hayes, A. G.; Mastrogiuseppe, M.; Zebker, H. A.; Farr, T. G.; Malaska, M. J.; Poggiali, V.; Mullen, J. P.

    2016-05-01

    We use repeat synthetic aperture radar (SAR) observations and complementary altimetry passes acquired by the Cassini spacecraft to study the scattering properties of Titan's empty lake basins. The best-fit coefficients from fitting SAR data to a quasi-specular plus diffuse backscatter model suggest that the bright basin floors have a higher dielectric constant than the surrounding terrain, but similar facet-scale rms surface slopes. Waveform analysis of altimetry returns reveals that nadir backscatter returns from basin floors are greater than nadir backscatter returns from basin surroundings and have narrower pulse widths. This suggests that floor deposits are structurally distinct from their surroundings, consistent with the interpretation that some of these basins may be filled with evaporitic and/or sedimentary deposits. Basin floor deposits also express a larger diffuse component to their backscatter, which is likely due to variations in subsurface structure or an increase in roughness at the wavelength scale (Hayes, A.G. et al. [2008]. Geophys. Res. Lett. 35, 9). We generate a high-resolution altimetry radargram of the T30 altimetry pass over an empty lake basin, with which we place geometric constraints on the basin's slopes, rim heights, and depth. Finally, the importance of these backscatter observations and geometric measurements for basin formation mechanisms is briefly discussed.
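    A generic quasi-specular-plus-diffuse fit can be sketched as below. The functional form (a narrow Gaussian peak near nadir plus a diffuse cosine-law term), the parameters, and the synthetic data are illustrative assumptions; the study's exact model may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative quasi-specular + diffuse backscatter model fitted to sigma0 vs. incidence angle.
def sigma0_model(theta_deg, A, s_deg, B, n):
    theta = np.radians(theta_deg)
    return A * np.exp(-theta_deg**2 / (2.0 * s_deg**2)) + B * np.cos(theta)**n

# Synthetic "observations" standing in for SAR/altimetry backscatter samples
theta_obs = np.linspace(0.0, 40.0, 30)
sigma_obs = sigma0_model(theta_obs, 5.0, 6.0, 0.4, 1.5)
sigma_obs += np.random.default_rng(2).normal(0, 0.02, sigma_obs.size)

popt, _ = curve_fit(sigma0_model, theta_obs, sigma_obs, p0=[1.0, 5.0, 0.1, 1.0])
print(np.round(popt, 2))   # recovered [A, s, B, n]
```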

  4. Principal axis-based correspondence between multiple cameras for people tracking.

    PubMed

    Hu, Weiming; Hu, Min; Zhou, Xue; Tan, Tieniu; Lou, Jianguang; Maybank, Steve

    2006-04-01

    Visual surveillance using multiple cameras has attracted increasing interest in recent years. Correspondence between multiple cameras is one of the most important and basic problems which visual surveillance using multiple cameras brings. In this paper, we propose a simple and robust method, based on principal axes of people, to match people across multiple cameras. The correspondence likelihood reflecting the similarity of pairs of principal axes of people is constructed according to the relationship between "ground-points" of people detected in each camera view and the intersections of principal axes detected in different camera views and transformed to the same view. Our method has the following desirable properties: 1) Camera calibration is not needed. 2) Accurate motion detection and segmentation are less critical due to the robustness of the principal axis-based feature to noise. 3) Based on the fused data derived from correspondence results, positions of people in each camera view can be accurately located even when the people are partially occluded in all views. The experimental results on several real video sequences from outdoor environments have demonstrated the effectiveness, efficiency, and robustness of our method.

  5. Variation in spectral response of soybeans with respect to illumination, view, and canopy geometry

    NASA Technical Reports Server (NTRS)

    Ranson, K. J.; Biehl, L. L.; Bauer, M. E.

    1984-01-01

    Comparisons of the spectral response for incomplete (well-defined row structure) and complete (overlapping row structure) canopies of soybeans indicated a greater dependence on Sun and view geometry for the incomplete canopies. Red and near-IR reflectance for the incomplete canopy decreased as solar zenith angle increased for a nadir view angle until the soil between the plant rows was completely shaded. Thereafter for increasing solar zenith angle, the red reflectance leveled off and the near-IR reflectance increased. A 'hot spot' effect was evident for the red and near-IR reflectance factors. The 'hot spot' effect was more pronounced for the red band based on relative reflectance value changes. The ratios of off-nadir to nadir acquired data reveal that off-nadir red band reflectance factors more closely approximated straightdown measurements for time periods away from solar noon. Normalized difference generally approximated straightdown measurements during the middle portion of the day.

  6. MACS-Himalaya: A photogrammetric aerial oblique camera system designed for highly accurate 3D-reconstruction and monitoring in steep terrain and under extreme illumination conditions

    NASA Astrophysics Data System (ADS)

    Brauchle, Joerg; Berger, Ralf; Hein, Daniel; Bucher, Tilman

    2017-04-01

    The DLR Institute of Optical Sensor Systems has developed the MACS-Himalaya, a custom built Modular Aerial Camera System specifically designed for the extreme geometric (steep slopes) and radiometric (high contrast) conditions of high mountain areas. It has an overall field of view of 116° across-track consisting of a nadir and two oblique looking RGB camera heads and a fourth nadir looking near-infrared camera. This design provides the capability to fly along narrow valleys and simultaneously cover ground and steep valley flank topography with similar ground resolution. To compensate for extreme contrasts between fresh snow and dark shadows in high altitudes a High Dynamic Range (HDR) mode was implemented, which typically takes a sequence of 3 images with graded integration times, each covering 12 bit radiometric depth, resulting in a total dynamic range of 15-16 bit. This enables dense image matching and interpretation for sunlit snow and glaciers as well as for dark shaded rock faces in the same scene. Small and lightweight industrial grade camera heads are used and operated at a rate of 3.3 frames per second with 3-step HDR, which is sufficient to achieve a longitudinal overlap of approximately 90% per exposure time at 1,000 m above ground at a velocity of 180 km/h. Direct georeferencing and multitemporal monitoring without the need of ground control points is possible due to the use of a high end GPS/INS system, a stable calibrated inner geometry of the camera heads and a fully photogrammetric workflow at DLR. In 2014 a survey was performed on the Nepalese side of the Himalayas. The remote sensing system was carried in a wingpod by a Stemme S10 motor glider. Amongst other targets, the Seti Valley, Kali-Gandaki Valley and the Mt. Everest/Khumbu Region were imaged at altitudes up to 9,200 m. Products such as dense point clouds, DSMs and true orthomosaics with a ground pixel resolution of up to 15 cm were produced in regions and outcrops normally inaccessible to
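    The HDR bracketing described above can be illustrated with a generic merge of graded-integration-time exposures (this is not necessarily the MACS processing chain, and the numbers are hypothetical): each frame is scaled by its integration time and only unsaturated pixels contribute to the combined radiance estimate.

```python
import numpy as np

# Generic merge of a 3-step exposure bracket into one high-dynamic-range frame.
def merge_hdr(images, integration_times, saturation=4095):
    images = [np.asarray(im, float) for im in images]
    acc = np.zeros_like(images[0])
    weight = np.zeros_like(images[0])
    for im, t in zip(images, integration_times):
        valid = im < saturation                 # ignore clipped pixels
        acc += np.where(valid, im / t, 0.0)     # radiance-proportional estimate
        weight += valid
    return acc / np.maximum(weight, 1)

# Hypothetical bracket: 1x, 4x and 16x integration times of the same scene
rng = np.random.default_rng(3)
scene = rng.uniform(0, 3000, (100, 100))        # "true" radiance, arbitrary units
bracket = [np.clip(scene * t, 0, 4095) for t in (1.0, 4.0, 16.0)]
hdr = merge_hdr(bracket, (1.0, 4.0, 16.0))
print(hdr.min(), hdr.max())
```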

  7. In this medium close-up view, captured by an Electronic Still Camera (ESC), the Spartan 207

    NASA Technical Reports Server (NTRS)

    1996-01-01

    STS-77 ESC VIEW --- In this medium close-up view, captured by an Electronic Still Camera (ESC), the Spartan 207 free-flyer is held in the grasp of the Space Shuttle Endeavour's Remote Manipulator System (RMS) following its re-capture on May 21, 1996. The six-member crew has spent a portion of the early stages of the mission in various activities involving the Spartan 207 and the related Inflatable Antenna Experiment (IAE). The Spartan project is managed by NASA's Goddard Space Flight Center (GSFC) for NASA's Office of Space Science, Washington, D.C. GMT: 09:38:05.

  8. MATERIALS TESTING REACTOR (MTR) BUILDING, TRA603. CONTEXTUAL VIEW OF MTR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MATERIALS TESTING REACTOR (MTR) BUILDING, TRA-603. CONTEXTUAL VIEW OF MTR BUILDING SHOWING NORTH SIDES OF THE HIGH-BAY REACTOR BUILDING, ITS SECOND/THIRD FLOOR BALCONY LEVEL, AND THE ATTACHED ONE-STORY OFFICE/LABORATORY BUILDING, TRA-604. CAMERA FACING SOUTHEAST. VERTICAL CONCRETE-SHROUDED BEAMS SUPPORT PRECAST CONCRETE PANELS. CONCRETE PROJECTION FORMED AS A BUNKER AT LEFT OF VIEW IS TRA-657, PLUG STORAGE BUILDING. INL NEGATIVE NO. HD46-42-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  9. Geometrical distortion calibration of the stereo camera for the BepiColombo mission to Mercury

    NASA Astrophysics Data System (ADS)

    Simioni, Emanuele; Da Deppo, Vania; Re, Cristina; Naletto, Giampiero; Martellato, Elena; Borrelli, Donato; Dami, Michele; Aroldi, Gianluca; Ficai Veltroni, Iacopo; Cremonese, Gabriele

    2016-07-01

    The ESA-JAXA mission BepiColombo, which will be launched in 2018, is devoted to the observation of Mercury, the innermost planet of the Solar System. SIMBIOSYS is its remote sensing suite, consisting of three instruments: the High Resolution Imaging Channel (HRIC), the Visible and Infrared Hyperspectral Imager (VIHI), and the Stereo Imaging Channel (STC). The latter will provide the global three-dimensional reconstruction of the Mercury surface, and it represents the first push-frame stereo camera on board a space satellite. Based on a new telescope design, STC combines the advantages of a compact single-detector camera with the convenience of a double-direction acquisition system; this solution minimizes mass and volume while performing push-frame imaging acquisition. The shared camera sensor is divided into six portions: four are covered with suitable filters; the other two, one looking forward and one backward with respect to the nadir direction, are covered with a panchromatic filter, supplying stereo image pairs of the planet surface. The main STC scientific requirements are to reconstruct the Mercury surface in 3D with a vertical accuracy better than 80 m and to perform global imaging with a grid size of 65 m along-track at the periherm. The scope of this work is to present the on-ground geometric calibration pipeline for this original instrument. The selected STC off-axis configuration required the development of a new distortion map model. Additional considerations are connected to the detector, a Si-PIN hybrid CMOS, which is characterized by a high fixed-pattern noise. This had a great impact on the pre-calibration phases, requiring an uncommon approach to the definition of the spot centroids in the distortion calibration process. This work presents the results obtained during the calibration of STC concerning the distortion analysis at three different temperatures. These results are then used to define the corresponding distortion model of the camera.

  10. Exploring point-cloud features from partial body views for gender classification

    NASA Astrophysics Data System (ADS)

    Fouts, Aaron; McCoppin, Ryan; Rizki, Mateen; Tamburino, Louis; Mendoza-Schrock, Olga

    2012-06-01

    In this paper we extend a previous exploration of histogram features extracted from 3D point cloud images of human subjects for gender discrimination. Feature extraction used a collection of concentric cylinders to define volumes for counting 3D points. The histogram features are characterized by a rotational axis and a selected set of volumes derived from the concentric cylinders. The point cloud images are drawn from the CAESAR anthropometric database provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International. This database contains approximately 4400 high-resolution LIDAR whole-body scans of carefully posed human subjects. Success in our previous investigation was based on extracting features from full body coverage, which required integration of multiple camera images. With full body coverage, the central vertical body axis and orientation are readily obtainable; however, this is not the case with a one-camera view providing less than one half body coverage. Assuming that the subjects are upright, we need to determine or estimate the position of the vertical axis and the orientation of the body about this axis relative to the camera. In past experiments the vertical axis was located through the center of mass of torso points projected on the ground plane, and the body orientation was derived using principal component analysis. In a natural extension of our previous work to partial body views, the absence of rotational invariance about the cylindrical axis greatly increases the difficulty of gender classification. Even the problem of estimating the axis is no longer simple. We describe some simple feasibility experiments that use partial image histograms. Here, the cylindrical axis is assumed to be known. We also discuss experiments with full body images that explore the sensitivity of classification accuracy relative to displacements of the cylindrical axis. Our initial results provide the basis for further
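    A minimal sketch of the concentric-cylinder counting idea, assuming (as in the feasibility experiments described) that the vertical axis is known; the shell radii and the random stand-in for a partial body scan are illustrative only:

        import numpy as np

        def cylinder_histogram(points, axis_xy, radii):
            """Normalised counts of 3D points in concentric cylindrical shells around a known vertical axis.

            points:  (N, 3) array of x, y, z coordinates
            axis_xy: (x, y) location of the vertical axis
            radii:   increasing shell boundaries, e.g. 0.05 ... 0.50 m
            """
            r = np.linalg.norm(points[:, :2] - np.asarray(axis_xy, float), axis=1)
            counts, _ = np.histogram(r, bins=[0.0, *radii])
            return counts / len(points)

        rng = np.random.default_rng(0)
        cloud = rng.normal(scale=[0.15, 0.15, 0.4], size=(5000, 3))   # stand-in point cloud
        print(cylinder_histogram(cloud, axis_xy=(0.0, 0.0), radii=np.linspace(0.05, 0.5, 10)))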

  11. IMAX camera in payload bay

    NASA Image and Video Library

    1995-12-20

    STS074-361-035 (12-20 Nov 1995) --- This medium close-up view centers on the IMAX Cargo Bay Camera (ICBC) and its associated IMAX Camera Container Equipment (ICCE) at its position in the cargo bay of the Earth-orbiting Space Shuttle Atlantis. With its own 'space suit,' or protective covering, to protect it from the rigors of space, this version of the IMAX was able to record scenes not accessible with the in-cabin cameras. For docking and undocking activities involving Russia's Mir Space Station and the Space Shuttle Atlantis, the camera joined a variety of in-cabin camera hardware in recording the historical events. IMAX's secondary objectives were to film Earth views. The IMAX project is a collaboration between NASA, the Smithsonian Institution's National Air and Space Museum (NASM), IMAX Systems Corporation, and the Lockheed Corporation to document significant space activities and promote NASA's educational goals using the IMAX film medium.

  12. Sun-view angle effects on reflectance factors of corn canopies

    NASA Technical Reports Server (NTRS)

    Ranson, K. J.; Daughtry, C. S. T.; Biehl, L. L.; Bauer, M. E.

    1985-01-01

    The effects of sun and view angles on reflectance factors of corn (Zea mays L.) canopies ranging from the six-leaf stage to harvest maturity were studied at the Purdue University Agronomy Farm with a multiband radiometer. The two methods of acquiring spectral data, the truck system and the tower system, are described. The analysis of the spectral data is presented in three parts: solar angle effects on reflectance factors viewed at nadir; view angle effects on reflectance factors at a fixed sun angle; and the combined effects of sun and view angles on reflectance factors. The analysis revealed that for nadir-viewed reflectance factors there is a strong solar angle dependence in all spectral bands for canopies with low leaf area index. Reflectance factors observed at a fixed sun angle from different view azimuth angles showed that the position of the sensor relative to the sun is important in determining angular reflectance characteristics. For both sun and view angles, reflectance factors are maximized when the sensor view direction is towards the sun.

  13. NPP VIIRS and Aqua MODIS RSB Comparison Using Observations from Simultaneous Nadir Overpasses (SNO)

    NASA Technical Reports Server (NTRS)

    Xiong, X.; Wu, A.

    2012-01-01

    The Suomi NPP (National Polar-orbiting Partnership) satellite (http://npp.gsfc.nasa.gov/viirs.html) began collecting daily global data following its successful launch on October 28, 2011. The Visible Infrared Imaging Radiometer Suite (VIIRS) is a key NPP sensor. Similar in design to the OLS, SeaWiFS and MODIS instruments, VIIRS has on-board calibration components including a solar diffuser (SD) and a solar diffuser stability monitor (SDSM) for the reflective solar bands (RSB), a V-groove blackbody for the thermal emissive bands (TEB), and a space view (SV) port for background subtraction. Immediately after the opening of the VIIRS nadir door on November 21, 2011, anomalously large degradation in the SD response was identified in the near-IR wavelength region, which was unexpected, as decreases in SD reflectance usually occur gradually in the blue (0.4 µm) wavelength region based on past experience. In this study, we use the well-calibrated Aqua MODIS as a reference to track and evaluate VIIRS RSB stability and performance. Reflectances observed by both sensors from simultaneous nadir overpasses (SNO) are used to determine VIIRS-to-MODIS reflectance ratios for their spectrally matching bands. Results of this study provide an immediate post-launch assessment, an independent validation of the anomalous degradation observed in SD measurements at near-IR wavelengths, and an initial analysis of calibration stability and consistency.
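    A minimal sketch of the band-ratio comparison described above, with made-up reflectances standing in for collocated SNO pixels; the band pairing named in the comment is only one example of a spectrally matching pair:

        import numpy as np

        def sno_ratio(viirs_refl, modis_refl):
            """VIIRS-to-MODIS reflectance ratio for one pair of spectrally matching bands,
            computed over pixels collocated at a simultaneous nadir overpass (SNO)."""
            return np.mean(viirs_refl) / np.mean(modis_refl)

        # illustrative collocated samples for one band pair (e.g. VIIRS M7 vs. MODIS band 2)
        print(f"VIIRS/MODIS ratio: {sno_ratio([0.31, 0.29, 0.33, 0.30], [0.30, 0.28, 0.32, 0.29]):.3f}")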

  14. Integrating Chlorophyll fapar and Nadir Photochemical Reflectance Index from EO-1/Hyperion to Predict Cornfield Daily Gross Primary Production

    NASA Technical Reports Server (NTRS)

    Zhang, Qingyuan; Middleton, Elizabeth M.; Cheng, Yen-Ben; Huemmrich, K. Fred; Cook, Bruce D.; Corp, Lawrence A.; Kustas, William P.; Russ, Andrew L.; Prueger, John H.; Yao, Tian

    2016-01-01

    The concept of light use efficiency (Epsilon) and the fraction of photosynthetically active radiation (PAR) absorbed for vegetation photosynthesis (PSN), i.e., fAPAR(sub PSN), have been widely utilized to estimate vegetation gross primary productivity (GPP). It has been demonstrated that the photochemical reflectance index (PRI) is empirically related to Epsilon. An experimental US Department of Agriculture (USDA) cornfield in Maryland was selected as our study field. We explored the potential of integrating fAPAR(sub chl) (defined as the fraction of PAR absorbed by chlorophyll) and nadir PRI (PRI(sub nadir)) to predict cornfield daily GPP. We acquired nadir or near-nadir EO-1/Hyperion satellite images that covered the cornfield and took nadir in-situ field spectral measurements. Those data were used to derive PRI(sub nadir) and fAPAR(sub chl). The fAPAR(sub chl) is retrieved with the advanced radiative transfer model PROSAIL2 and the Metropolis approach, a type of Markov Chain Monte Carlo (MCMC) estimation procedure. We define chlorophyll light use efficiency Epsilon(sub chl) as the ratio of vegetation GPP, as measured by eddy covariance techniques, to the PAR absorbed by chlorophyll (Epsilon(sub chl) = GPP/APAR(sub chl)). Daily Epsilon(sub chl) retrieved from the EO-1 Hyperion images was regressed against PRI(sub nadir) with a linear equation (Epsilon(sub chl) = Alpha × PRI(sub nadir) + Beta). The satellite Epsilon(sub chl)-PRI(sub nadir) linear relationship for the cornfield was implemented to develop an integrated daily GPP model [GPP = (Alpha × PRI(sub nadir) + Beta) × fAPAR(sub chl) × PAR], which was evaluated with fAPAR(sub chl) and PRI(sub nadir) retrieved from field measurements. Daily GPP estimated with this fAPAR(sub chl)-PRI(sub nadir) integration model was strongly correlated with the observed in-situ tower daily GPP (R(sup 2) = 0.93), with a root mean square error (RMSE) of 1.71 g C mol(sup -1) PPFD and a coefficient of variation (CV) of 16
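    Written out plainly, the integrated model above is GPP = (Alpha × PRI(sub nadir) + Beta) × fAPAR(sub chl) × PAR. A minimal sketch; the regression coefficients and inputs are placeholders, not the values fitted in the study:

        def daily_gpp(pri_nadir, fapar_chl, par, alpha, beta):
            """Integrated daily GPP model: chlorophyll light use efficiency from nadir PRI,
            multiplied by the PAR actually absorbed by chlorophyll."""
            epsilon_chl = alpha * pri_nadir + beta   # chlorophyll light use efficiency
            apar_chl = fapar_chl * par               # PAR absorbed by chlorophyll
            return epsilon_chl * apar_chl

        # illustrative call with placeholder coefficients and inputs
        print(daily_gpp(pri_nadir=-0.02, fapar_chl=0.6, par=45.0, alpha=-30.0, beta=0.5))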

  15. Detail view of the vertical stabilizer of the Orbiter Discovery ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of the vertical stabilizer of the Orbiter Discovery looking at the two-piece rudder which is used to control the yaw position of orbiter on approach and landing in earth's atmosphere and upon landing the two-piece rudder splays open to both sides of the stabilizer to act as an air brake to help slow the craft to a stop. Note the thermal protection system components with the white Advanced Flexible Reusable Surface Insulation Blanket and the black High-temperature Reusable Surface Insulation tiles along the outer edges (HRSI tiles). The marks seen on the HRSI tiles are injection point marks and holes for the application of waterproofing material. This view was taken from a service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX

  16. The Twin Peaks in 3-D, as Viewed by the Mars Pathfinder IMP Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Twin Peaks are modest-size hills to the southwest of the Mars Pathfinder landing site. They were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The peaks are approximately 30-35 meters (~100 feet) tall. North Twin is approximately 860 meters (2800 feet) from the lander, and South Twin is about a kilometer away (3300 feet). The scene includes bouldery ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of the South Twin Peak. The large rock at the right edge of the scene is nicknamed 'Hippo'. This rock is about a meter (3 feet) across and 25 meters (80 feet) distant.

    This view of the Twin Peaks was produced by combining 4 individual 'Superpan' scenes from the left and right eyes of the IMP camera to cover both peaks. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    The anaglyph view of the Twin Peaks was produced by combining the left and right eye mosaics (above) by assigning the left eye view to the red color plane and the right eye view to the green and blue color planes (cyan), to produce a stereo anaglyph mosaic. This mosaic can be viewed in 3-D on your computer monitor or in color print form by wearing red-blue 3-D glasses.
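    The red-cyan combination described above is straightforward to reproduce; a minimal sketch assuming two co-registered 8-bit RGB mosaics:

        import numpy as np

        def make_anaglyph(left_rgb, right_rgb):
            """Left-eye image feeds the red plane; right-eye image feeds green and blue (cyan)."""
            anaglyph = np.empty_like(left_rgb)
            anaglyph[..., 0] = left_rgb[..., 0]
            anaglyph[..., 1] = right_rgb[..., 1]
            anaglyph[..., 2] = right_rgb[..., 2]
            return anaglyph

        # stand-in arrays in place of the actual left- and right-eye mosaics
        rng = np.random.default_rng(0)
        left = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)
        right = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)
        stereo = make_anaglyph(left, right)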

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology (Caltech). The IMP was developed by the University of Arizona Lunar and Planetary

  17. 2016 Summer Olympic Games Site

    Atmospheric Science Data Center

    2016-12-30

    Site of the 2016 Summer Olympic Games viewed by NASA's MISR ... 2, 2016, just prior to the opening of the Summer Olympic Games. On the left is an image from MISR's nadir (downward-looking) camera; the ...

  18. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  19. Stereoscopic camera and viewing systems with undistorted depth presentation and reduced or eliminated erroneous acceleration and deceleration perceptions, or with perceptions produced or enhanced for special effects

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor)

    1991-01-01

    Methods for providing stereoscopic image presentation and stereoscopic configurations using stereoscopic viewing systems having converged or parallel cameras may be set up to reduce or eliminate erroneously perceived accelerations and decelerations by proper selection of parameters, such as an image magnification factor, q, and intercamera distance, 2w. For converged cameras, q is selected so that Ve - qwl = 0, where V is the camera distance, e is half the interocular distance of an observer, w is half the intercamera distance, and l is the actual distance from the first nodal point of each camera to the convergence point; for parallel cameras, q is selected to be equal to e/w. While converged cameras cannot be set up to provide fully undistorted three-dimensional views, they can be set up to provide a linear relationship between real and apparent depth and thus minimize erroneously perceived accelerations and decelerations for three sagittal planes, x = -w, x = 0, and x = +w, which are indicated to the observer. Parallel cameras can be set up to provide fully undistorted three-dimensional views by controlling the location of the observer and by magnification and shifting of left and right images. In addition, the teachings of this disclosure can be used to provide methods of stereoscopic image presentation and stereoscopic camera configurations that produce a nonlinear relation between perceived and real depth, and erroneously produce or enhance perceived accelerations and decelerations in order to provide special effects for entertainment, training, or educational purposes.
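    Taking the stated conditions at face value (Ve - qwl = 0 for converged cameras, q = e/w for parallel cameras), the magnification factor can be computed directly; the numeric values below are illustrative only:

        def q_converged(V, e, w, l):
            """Magnification factor q satisfying V*e - q*w*l = 0 (symbols as defined in the abstract above)."""
            return V * e / (w * l)

        def q_parallel(e, w):
            """Magnification factor q = e/w for parallel cameras."""
            return e / w

        # illustrative values in metres
        print(q_converged(V=2.0, e=0.032, w=0.05, l=1.5))   # ~0.85
        print(q_parallel(e=0.032, w=0.05))                  # 0.64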

  20. View of portion of Mediterranean Coast of Turkey and Syria

    NASA Image and Video Library

    1975-07-20

    AST-16-1268 (20 July 1975) --- A near vertical view of a portion of the Mediterranean coast of Turkey and Syria, as photographed from the Apollo spacecraft in Earth orbit during the joint U.S-USSR Apollo-Soyuz Test Project mission. This view covers the Levant Coast north of Beirut, showing the cities of Aleppo, Hamah, Homs and Latakia. The Levantine rift bends to the northeast. This picture was taken with a 70mm Hasselblad camera using high-definition aerial Ektachrome SO-242 type film. The altitude of the spacecraft was 225 kilometers (140 statute miles) when this photograph was taken.

  1. The Brazilian wide field imaging camera (WFI) for the China/Brazil earth resources satellite: CBERS 3 and 4

    NASA Astrophysics Data System (ADS)

    Scaduto, L. C. N.; Carvalho, E. G.; Modugno, R. G.; Cartolano, R.; Evangelista, S. H.; Segoria, D.; Santos, A. G.; Stefani, M. A.; Castro Neto, J. C.

    2017-11-01

    The purpose of this paper is to present the optical system developed for the Wide Field Imaging Camera (WFI) that will be integrated into the CBERS 3 and 4 satellites (China-Brazil Earth Resources Satellite). This camera will be used for remote sensing of the Earth and is designed to operate at an altitude of 778 km. The optical system is designed for four spectral bands covering the range of wavelengths from blue to near infrared, and its field of view is +/-28.63°, which covers 866 km on the ground with a resolution of 64 m at nadir. WFI has been developed by a consortium formed by Opto Electrônica S. A. and Equatorial Sistemas. In particular, we present the optical analysis based on the Modulation Transfer Function (MTF) obtained during the Engineering Model (EM) phase and the optical tests performed to verify the requirements. Measurements of the optical system MTF have been performed using an interferometer at a wavelength of 632.8 nm, and global MTF tests (including the CCD and the signal processing electronics) have been performed using a collimator with a slit target. The results obtained show that the performance of the optical system meets the requirements of the project.

  2. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system is located on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security camera systems. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of the event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view can present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly being generated by the cameras. This approach offers greater flexibility than conventional systems, without compromising performance through using many small, low-cost cameras with overlapping fields of view. This means significant increased viewing without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, due to the sharing of a single cable for power and data, the installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/ vehicular monitoring systems are also potential applications.
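    The abstract does not give the correlation math, but one simple way such a system could report an event in Cartesian coordinates is to intersect bearing rays from two cameras with known positions; the following is a hypothetical illustration, not the system's actual algorithm:

        import numpy as np

        def locate_event(cam_a, bearing_a_deg, cam_b, bearing_b_deg):
            """Intersect ground-plane bearing rays from two cameras at known (x, y) positions."""
            p_a, p_b = np.asarray(cam_a, float), np.asarray(cam_b, float)
            d_a = np.array([np.cos(np.radians(bearing_a_deg)), np.sin(np.radians(bearing_a_deg))])
            d_b = np.array([np.cos(np.radians(bearing_b_deg)), np.sin(np.radians(bearing_b_deg))])
            A = np.column_stack([d_a, -d_b])
            if abs(np.linalg.det(A)) < 1e-9:
                return None                      # rays are parallel; no unique intersection
            t, _ = np.linalg.solve(A, p_b - p_a)
            return p_a + t * d_a

        print(locate_event((0.0, 0.0), 45.0, (10.0, 0.0), 135.0))   # -> approximately (5, 5)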

  3. NAOMI instrument: a product line of compact and versatile cameras designed for high resolution missions in Earth observation

    NASA Astrophysics Data System (ADS)

    Luquet, Ph.; Chikouche, A.; Benbouzid, A. B.; Arnoux, J. J.; Chinal, E.; Massol, C.; Rouchit, P.; De Zotti, S.

    2017-11-01

    EADS Astrium is currently developing a new product line of compact and versatile instruments for high-resolution Earth observation missions. The first version has been developed in the framework of the ALSAT-2 contract awarded to EADS Astrium by the Algerian Space Agency (ASAL). The Silicon Carbide Korsch-type telescope, coupled with a multi-line detector array, offers a 2.5 m GSD in the PAN band at nadir from a 680 km altitude (10 m GSD in the four multispectral bands) with a 17.5 km swath width. This compact camera - 340 (W) x 460 (L) x 510 (H) mm3, 13 kg - is flown on a Myriade-type small platform. The electronics unit accommodates the video, housekeeping and thermal control functions as well as a 64 Gbit mass memory. Two satellites are being developed; the first is planned for launch in mid-2009. Several other versions of the instrument have already been defined, with enhanced resolution and/or a larger field of view.

  4. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  5. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  6. Surgical Videos with Synchronised Vertical 2-Split Screens Recording the Surgeons' Hand Movement.

    PubMed

    Kaneko, Hiroki; Ra, Eimei; Kawano, Kenichi; Yasukawa, Tsutomu; Takayama, Kei; Iwase, Takeshi; Terasaki, Hiroko

    2015-01-01

    To improve the state-of-the-art teaching system by creating surgical videos with synchronised vertical 2-split screens. An ultra-compact, wide-angle point-of-view camcorder (HX-A1, Panasonic) was mounted on the surgical microscope focusing mostly on the surgeons' hand movements. In combination with the regular surgical videos obtained from the CCD camera in the surgical microscope, synchronised vertical 2-split-screen surgical videos were generated with the video-editing software. Using synchronised vertical 2-split-screen videos, residents of the ophthalmology department could watch and learn how assistant surgeons controlled the eyeball, while the main surgeons performed scleral buckling surgery. In vitrectomy, the synchronised vertical 2-split-screen videos showed the surgeons' hands holding the instruments and moving roughly and boldly, in contrast to the very delicate movements of the vitrectomy instruments inside the eye. Synchronised vertical 2-split-screen surgical videos are beneficial for the education of young surgical trainees when learning surgical skills including the surgeons' hand movements. © 2015 S. Karger AG, Basel.

  7. Object tracking using multiple camera video streams

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system involves overcoming effects of occlusions that could result in an object in partial or full view in one camera, when the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases robustness of the tracking process. Motion tracking is achieved by determining anomalies caused by the objects' movement across frames in time in each and the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

  8. Multi-Angle Snowflake Camera Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stuefer, Martin; Bailey, J.

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' field of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
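    The fall-speed derivation mentioned above reduces to distance over time between successive trigger events; a minimal sketch, with an assumed vertical spacing between the trigger planes (the true spacing is instrument-specific):

        def fall_speed(trigger_separation_m, t_upper_s, t_lower_s):
            """Fall speed from the time difference between the upper and lower IR trigger events."""
            return trigger_separation_m / (t_lower_s - t_upper_s)

        # illustrative numbers: 32 mm assumed spacing, 16 ms between triggers -> 2.0 m/s
        print(fall_speed(trigger_separation_m=0.032, t_upper_s=0.000, t_lower_s=0.016))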

  9. TES/Aura L2 Ozone (O3) Nadir V6 (TL2O3NS)

    Atmospheric Science Data Center

    2018-01-22

    TES/Aura L2 Ozone (O3) Nadir (TL2O3NS). Project: TES. Discipline: Tropospheric Composition. Version: V6. Level: L2. Platform: TES/Aura L2 Ozone. Spatial coverage: 5.3 x 8.5 km at nadir.

  10. TES/Aura L2 Ozone (O3) Nadir V6 (TL2O3N)

    Atmospheric Science Data Center

    2018-01-18

    TES/Aura L2 Ozone (O3) Nadir (TL2O3N). Project: TES. Discipline: Tropospheric Composition. Version: V6. Level: L2. Platform: TES/Aura L2 Ozone. Spatial coverage: 5.3 x 8.5 km at nadir.

  11. Height and Motion of the Chikurachki Eruption Plume

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The height and motion of the ash and gas plume from the April 22, 2003, eruption of the Chikurachki volcano is portrayed in these views from the Multi-angle Imaging SpectroRadiometer (MISR). Situated within the northern portion of the volcanically active Kuril Island group, the Chikurachki volcano is an active stratovolcano on Russia's Paramushir Island (just south of the Kamchatka Peninsula).

    In the upper panel of the still image pair, this scene is displayed as a natural-color view from MISR's vertical-viewing (nadir) camera. The white and brownish-grey plume streaks several hundred kilometers from the eastern edge of Paramushir Island toward the southeast. The darker areas of the plume typically indicate volcanic ash, while the white portions of the plume indicate entrained water droplets and ice. According to the Kamchatkan Volcanic Eruptions Response Team (KVERT), the temperature of the plume near the volcano on April 22 was -12° C.

    The lower panel shows heights derived from automated stereoscopic processing of MISR's multi-angle imagery, in which the plume is determined to reach heights of about 2.5 kilometers above sea level. Heights for clouds above and below the eruption plume were also retrieved, including the high-altitude cirrus clouds in the lower left (orange pixels). The distinctive patterns of these features provide sufficient spatial contrast for MISR's stereo height retrieval to perform automated feature matching between the images acquired at different view angles. Places where clouds or other factors precluded a height retrieval are shown in dark gray.

    The multi-angle 'fly-over' animation (below) allows the motion of the plume and of the surrounding clouds to be directly observed. The frames of the animation consist of data acquired by the 70-degree, 60-degree, 46-degree and 26-degree forward-viewing cameras in sequence, followed by the images from the nadir camera and each of the four backward-viewing cameras, ending with the view

  12. Barrier Coverage for 3D Camera Sensor Networks

    PubMed Central

    Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-01-01

    Barrier coverage, an important research area with respect to camera sensor networks, consists of a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage such as local face-view barrier coverage and full-view barrier coverage typically assume that each intruder is considered as a point. However, the crucial feature (e.g., size) of the intruder should be taken into account in the real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder’s face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage with more practical considerations is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks. PMID:28771167

  13. Barrier Coverage for 3D Camera Sensor Networks.

    PubMed

    Si, Pengju; Wu, Chengdong; Zhang, Yunzhou; Jia, Zixi; Ji, Peng; Chu, Hao

    2017-08-03

    Barrier coverage, an important research area with respect to camera sensor networks, consists of a number of camera sensors to detect intruders that pass through the barrier area. Existing works on barrier coverage such as local face-view barrier coverage and full-view barrier coverage typically assume that each intruder is considered as a point. However, the crucial feature (e.g., size) of the intruder should be taken into account in the real-world applications. In this paper, we propose a realistic resolution criterion based on a three-dimensional (3D) sensing model of a camera sensor for capturing the intruder's face. Based on the new resolution criterion, we study the barrier coverage of a feasible deployment strategy in camera sensor networks. Performance results demonstrate that our barrier coverage with more practical considerations is capable of providing a desirable surveillance level. Moreover, compared with local face-view barrier coverage and full-view barrier coverage, our barrier coverage is more reasonable and closer to reality. To the best of our knowledge, our work is the first to propose barrier coverage for 3D camera sensor networks.

  14. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  15. Correlation of pretreatment clinical parameters and PSA nadir after high-intensity focused ultrasound (HIFU) for localised prostate cancer.

    PubMed

    Ganzer, Roman; Bründl, Johannes; Koch, Daniel; Wieland, Wolf F; Burger, Maximilian; Blana, Andreas

    2015-01-01

    To determine which pretreatment clinical parameters are predictive of a low prostate-specific antigen (PSA) nadir following high-intensity focused ultrasound (HIFU) treatment. Retrospective study of patients with clinically localised prostate cancer undergoing HIFU at a single centre between December 1997 and September 2009. Whole-gland treatment was applied. Patients were also included if they had previously undergone transurethral resection of the prostate (TURP); TURP was also conducted simultaneously with HIFU. Biochemical failure was based on the Phoenix definition (PSA nadir + 2 ng/ml). Univariate and multivariate analyses of pretreatment clinical parameters were conducted to assess the factors predictive of a PSA nadir ≤0.2 and >0.2 ng/ml. Mean (SD) follow-up was 6.2 (2.8) years; median (range) was 6.3 (1.1-12.2) years. The Kaplan-Meier estimate of the biochemical disease-free survival rate at 8 years was 83% and 48% for patients achieving a PSA nadir of ≤0.2 and >0.2 ng/ml, respectively. Prostate volume and an incidental finding of cancer were significant predictors of a low PSA nadir (≤0.2 ng/ml). Prostate volume and an incidental finding of cancer could be predictors of the oncologic success of HIFU based on the post-treatment PSA nadir.
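    The Phoenix criterion used above is a simple threshold on the post-treatment PSA series; a minimal sketch that, for simplicity, treats the lowest value in the series as the nadir:

        def biochemical_failure(psa_series_ng_ml):
            """Phoenix definition: failure once any PSA value reaches nadir + 2 ng/ml."""
            nadir = min(psa_series_ng_ml)
            return any(psa >= nadir + 2.0 for psa in psa_series_ng_ml)

        print(biochemical_failure([1.8, 0.4, 0.2, 0.9, 2.5]))   # True: 2.5 >= 0.2 + 2
        print(biochemical_failure([1.5, 0.3, 0.2, 0.6, 0.8]))   # False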

  16. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite that will provide opportunities for observation of transient phenomena falling within the fixed viewing circle of the spacecraft is discussed. Possible applications of frame cameras for SEOS are evaluated. The computed lens characteristics for each camera are listed.

  17. TES/Aura L2 Ammonia (NH3) Nadir V6 (TL2NH3N)

    Atmospheric Science Data Center

    2018-01-18

    TES/Aura L2 Ammonia (NH3) Nadir (TL2NH3N). Level: L2. Instrument: TES/Aura L2 Ammonia. Spatial coverage: 5.3 x 8.5 km at nadir. Parameters: Ammonia. Legacy: retired data product.

  18. TES/Aura L2 Ammonia (NH3) Nadir V6 (TL2NH3NS)

    Atmospheric Science Data Center

    2018-01-22

    TES/Aura L2 Ammonia (NH3) Nadir (TL2NH3NS). Level: L2. Platform: TES/Aura L2 Ammonia. Spatial coverage: 5.3 x 8.5 km at nadir. Parameters: Ammonia. Legacy: retired data product.

  19. A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography

    NASA Technical Reports Server (NTRS)

    Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

    1992-01-01

    A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

  20. Fast-camera imaging on the W7-X stellarator

    NASA Astrophysics Data System (ADS)

    Ballinger, S. B.; Terry, J. L.; Baek, S. G.; Tang, K.; Grulke, O.

    2017-10-01

    Fast cameras recording in the visible range have been used to study filamentary ('blob') edge turbulence in tokamak plasmas, revealing that emissive filaments aligned with the magnetic field can propagate perpendicular to it at speeds on the order of 1 km/s in the SOL or private flux region. The motion of these filaments has been studied in several tokamaks, including MAST, NSTX, and Alcator C-Mod. Filaments were also observed in the W7-X stellarator using fast cameras during its initial run campaign. For W7-X's upcoming 2017-18 run campaign, we have installed a Phantom V710 fast camera with a view of the machine cross section and part of a divertor module in order to continue studying edge and divertor filaments. The view is coupled to the camera via a coherent fiber bundle. The Phantom camera is able to record at up to 400,000 frames per second and has a spatial resolution of roughly 2 cm in the view. A beam-splitter is used to share the view with a slower machine-protection camera. Stepping-motor actuators tilt the beam-splitter about two orthogonal axes, making it possible to frame user-defined sub-regions anywhere within the view. The diagnostic has been prepared to be remotely controlled via MDSplus. The MIT portion of this work is supported by US DOE award DE-SC0014251.

  1. China Dust

    Atmospheric Science Data Center

    2013-04-16

    Multi-angle Imaging SpectroRadiometer (MISR) nadir-camera images of eastern China compare a somewhat hazy summer view from July 9, 2000 (left) with a ... arid and sparsely vegetated surfaces of Mongolia and western China pick up large quantities of yellow dust. Airborne dust clouds from the ...

  2. New Archiving Distributed InfrastructuRe (NADIR): Status and Evolution

    NASA Astrophysics Data System (ADS)

    De Marco, M.; Knapic, C.; Smareglia, R.

    2015-09-01

    The New Archiving Distributed InfrastructuRe (NADIR) has been developed at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) as an evolution of the previous archiving and distribution system used on several telescopes (LBT, TNG, Asiago, etc.), in order to improve performance, efficiency and reliability. At present, NADIR is running at the LBT telescope and on Vespa (the Italian telescope network for outreach) Ramella et al. (2014), and it will be used for the TNG, Asiago and IRA (Istituto Radio Astronomia) archives of the Medicina, Noto and SRT radio telescopes Zanichelli et al. (2014) once the data models for radio data are ready. This paper discusses the progress status, the architectural choices and the solutions adopted during the development and commissioning phases of the project. Special attention is given to the LBT case, due to some critical aspects of the data flow and of the policy and standards compliance adopted by the LBT organization.

  3. Vertical view Apollo 16 Descartes landing sites as photographed by Apollo 14

    NASA Image and Video Library

    1971-01-12

    S72-00147 (January 1972) --- An almost vertical view of the Apollo 16 Descartes landing area, as photographed from the Apollo 14 spacecraft. Overlays are provided to point out extravehicular activity (EVA) Lunar Roving Vehicle (LRV) traverse routes and the nicknames of features. Hold picture with South Ray Crater in lower left corner. North will then be at the top. The Roman numerals indicate EVA numbers and the Arabic numbers point out stations or traverse stops.

  4. Saskatchewan and Manitoba

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Surface brightness contrasts accentuated by a thin layer of snow enable a network of rivers, roads, and farmland boundaries to stand out clearly in these MISR images of southeastern Saskatchewan and southwestern Manitoba. The lefthand image is a multi-spectral false-color view made from the near-infrared, red, and green bands of MISR's vertical-viewing (nadir) camera. The righthand image is a multi-angle false-color view made from the red band data of the 60-degree aftward camera, the nadir camera, and the 60-degree forward camera. In each image, the selected channels are displayed as red, green, and blue, respectively. The data were acquired April 17, 2001 during Terra orbit 7083, and cover an area measuring about 285 kilometers x 400 kilometers. North is at the top.

    The junction of the Assiniboine and Qu'Apelle Rivers in the bottom part of the images is just east of the Saskatchewan-Manitoba border. During the growing season, the rich, fertile soils in this area support numerous fields of wheat, canola, barley, flaxseed, and rye. Beef cattle are raised in fenced pastures. To the north, the terrain becomes more rocky and forested. Many frozen lakes are visible as white patches in the top right. The narrow linear, north-south trending patterns about a third of the way down from the upper right corner are snow-filled depressions alternating with vegetated ridges, most probably carved by glacial flow.

    In the lefthand image, vegetation appears in shades of red, owing to its high near-infrared reflectivity. In the righthand image, several forested regions are clearly visible in green hues. Since this is a multi-angle composite, the green arises not from the color of the leaves but from the architecture of the surface cover. Progressing southeastward along the Manitoba Escarpment, the forested areas include the Pasquia Hills, the Porcupine Hills, Duck Mountain Provincial Park, and Riding Mountain National Park. The forests are brighter in the nadir than at the

  5. Protective laser beam viewing device

    DOEpatents

    Neil, George R.; Jordan, Kevin Carl

    2012-12-18

    A protective laser beam viewing system or device including a camera selectively sensitive to laser light wavelengths and a viewing screen receiving images from the laser sensitive camera. According to a preferred embodiment of the invention, the camera is worn on the head of the user or incorporated into a goggle-type viewing display so that it is always aimed at the area of viewing interest to the user and the viewing screen is incorporated into a video display worn as goggles over the eyes of the user.

  6. Optimising Camera Traps for Monitoring Small Mammals

    PubMed Central

    Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats ( Mustela erminea ), feral cats (Felis catus) and hedgehogs ( Erinaceus europaeus ). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  7. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three different brightness simultaneously viewed targets, that is not possible by the CMOS sensor, is achieved by the CAOS-CMOS camera demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.

  8. Calibrating nadir striped artifacts in a multibeam backscatter image using the equal mean-variance fitting model

    NASA Astrophysics Data System (ADS)

    Yang, Fanlin; Zhao, Chunxia; Zhang, Kai; Feng, Chengkai; Ma, Yue

    2017-07-01

    Acoustic seafloor classification with multibeam backscatter measurements is an attractive approach for mapping seafloor properties over a large area. However, artifacts in the multibeam backscatter measurements prevent accurate characterization of the seafloor. In particular, the backscatter level is extremely strong and highly variable in the near-nadir region due to the specular echo phenomenon. Consequently, striped artifacts emerge in the backscatter image, which can degrade the classification accuracy. This study focuses on the striped artifacts in multibeam backscatter images. To this end, a calibration algorithm based on equal mean-variance fitting is developed. By fitting the local shape of the angular response curve, the striped artifacts are compressed and shifted according to the relations between the mean and variance in the near-nadir and off-nadir regions. The algorithm uses the measured data of the near-nadir region and retains the basic shape of the response curve. The experimental results verify the high performance of the proposed method.
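    A minimal sketch of the equal mean-variance idea as described above: near-nadir backscatter samples are shifted and rescaled so that their mean and variance match those of the adjacent off-nadir region. This illustrates the stated principle only, not the authors' exact angular-response fitting procedure:

        import numpy as np

        def calibrate_near_nadir(near_nadir_db, off_nadir_db):
            """Compress and move near-nadir backscatter so its mean and variance
            equal those of the off-nadir reference region (values in dB)."""
            near = np.asarray(near_nadir_db, float)
            off = np.asarray(off_nadir_db, float)
            scale = off.std() / near.std() if near.std() > 0 else 1.0
            return (near - near.mean()) * scale + off.mean()

        # illustrative values: strong, variable specular returns near nadir vs. off-nadir levels
        print(calibrate_near_nadir([-8.0, -5.0, -12.0, -6.0], [-22.0, -24.0, -23.0, -21.5]))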

  9. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have realized highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences which limited their usefulness in applications such as Thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperatures ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the Germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  10. Volga Delta and the Caspian Sea

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Russia's Volga River is the largest river system in Europe, draining over 1.3 million square kilometers of catchment area into the Caspian Sea. The brackish Caspian is Earth's largest landlocked water body, and its isolation from the world's oceans has enabled the preservation of several unique animal and plant species. The Volga provides most of the Caspian's fresh water and nutrients, and also discharges large amounts of sediment and industrial waste into the relatively shallow northern part of the sea. These images of the region were captured by the Multi-angle Imaging SpectroRadiometer on October 5, 2001, during Terra orbit 9567. Each image represents an area of approximately 275 kilometers x 376 kilometers.

    The left-hand image is from MISR's nadir (vertical-viewing) camera, and shows how light is reflected at red, green, and blue wavelengths. The right-hand image is a false color composite of red-band imagery from MISR's 60-degree backward, nadir, and 60-degree forward-viewing cameras, displayed as red, green, and blue, respectively. Here, color variations indicate how light is reflected at different angles of view. Water appears blue in the right-hand image, for example, because sun glitter makes smooth, wet surfaces look brighter at the forward camera's view angle. The rougher-textured vegetated wetlands near the coast exhibit preferential backscattering, and consequently appear reddish. A small cloud near the center of the delta separates into red, green, and blue components due to geometric parallax associated with its elevation above the surface.

    Other notable features within the images include several linear features located near the Volga Delta shoreline. These long, thin lines are artificially maintained shipping channels, dredged to depths of at least 2 meters. The crescent-shaped Kulaly Island, also known as Seal Island, is visible near the right-hand edge of the images.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory

  11. Northern California and San Francisco Bay

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The left image of this pair was acquired by MISR's nadir camera on August 17, 2000 during Terra orbit 3545. Toward the top, and nestled between the Coast Range and the Sierra Nevadas, are the green fields of the Sacramento Valley. The city of Sacramento is the grayish area near the right-hand side of the image. Further south, San Francisco and other cities of the Bay Area are visible.

    On the right is a zoomed-in view of the area outlined by the yellow polygon. It highlights the southern end of San Francisco Bay, and was acquired by MISR's airborne counterpart, AirMISR, during an engineering check-out flight on August 25, 1997. AirMISR flies aboard a NASA ER-2 high-altitude aircraft and contains a single camera that rotates to different view angles. When this image was acquired, the AirMISR camera was pointed 70 degrees forward of the vertical. Colorful tidal flats are visible in both the AirMISR and MISR imagery.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

    For more information: http://www-misr.jpl.nasa.gov

  12. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  13. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single-lens, reflex-viewing design with a 15-perforation-per-frame horizontal pull-across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

  14. Calibration Target for Curiosity Arm Camera

    NASA Image and Video Library

    2012-09-10

    This view of the calibration target for the MAHLI camera aboard NASA's Mars rover Curiosity combines two images taken by that camera on Sept. 9, 2012. Part of Curiosity's left-front and center wheels and a patch of Martian ground are also visible.

  15. A wide-angle camera module for disposable endoscopy

    NASA Astrophysics Data System (ADS)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

    A wide-angle miniaturized camera module for disposable endoscopy is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype of a disposable endoscope is implemented to perform pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.

  16. Maryland: La Plata

    Atmospheric Science Data Center

    2014-05-15

    Tornado Cuts Through La Plata, Maryland. A category F4 tornado tore through La Plata, Maryland on April 28, 2002, killing 5 and ... illustrates the strip of flattened vegetation left by the tornado. The lower image was acquired by MISR's nadir (vertical-viewing) ...

  17. Stereo Cameras for Clouds (STEREOCAM) Instrument Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romps, David; Oktem, Rusen

    2017-10-31

    The three pairs of stereo camera setups aim to provide synchronized and stereo-calibrated time series of images that can be used for 3D cloud mask reconstruction. Each camera pair is positioned at approximately 120 degrees from the other pairs, with a 17°-19° pitch angle from the ground, and at 5-6 km distance from the U.S. Department of Energy (DOE) Central Facility at the Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) observatory, to cover the region from northeast, northwest, and southern views. Images from both cameras of the same stereo setup can be paired together to obtain a 3D reconstruction by triangulation. 3D reconstructions from the ring of three stereo pairs can be combined to generate a 3D mask from surrounding views. This handbook delivers all stereo reconstruction parameters of the cameras necessary to make 3D reconstructions from the stereo camera images.

  18. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in a gradient from 0 N to 500 N, which simulated the double-feet standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. The digital 8-bit images were processed with Image J, digital image processing software that can be freely downloaded from the National Institutes of Health. The procedure includes recognition of the digital markers, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The results of the digital image measurement were as follows: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels and the relative error was about 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system
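    The sub-pixel marker localization described above is, at its core, an intensity-weighted center-of-mass computation over a segmented marker region. Below is a minimal sketch of that idea in Python/NumPy; the function names, the segmentation mask, and the 0.68 mm default pixel size are illustrative assumptions, not the authors' code.

```python
import numpy as np

def weighted_centroid(gray, mask):
    """Sub-pixel marker center as the gray-value-weighted mean of masked pixels.

    gray : 2D array of pixel intensities (marker assumed bright on a dark
           background, or inverted beforehand).
    mask : boolean 2D array selecting the pixels belonging to one marker.
    """
    ys, xs = np.nonzero(mask)
    w = gray[ys, xs].astype(float)
    cx = np.sum(xs * w) / np.sum(w)   # weighted mean column (x)
    cy = np.sum(ys * w) / np.sum(w)   # weighted mean row (y)
    return cx, cy

def displacement_mm(c_before, c_after, pixel_size_mm=0.68):
    """In-plane marker displacement in mm between two frames."""
    dx = (c_after[0] - c_before[0]) * pixel_size_mm
    dy = (c_after[1] - c_before[1]) * pixel_size_mm
    return np.hypot(dx, dy)
```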

  19. Homography-based multiple-camera person-tracking

    NASA Astrophysics Data System (ADS)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of
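    The plane-induced homography at the heart of this approach can be estimated from the accumulated foot-point pairs with the standard direct linear transform (DLT). The sketch below is illustrative only (no point normalization or RANSAC, hypothetical function names) and is not the paper's implementation.

```python
import numpy as np

def estimate_homography(pts_a, pts_b):
    """DLT estimate of H such that x_b ~ H @ x_a (homogeneous), from >= 4 point pairs."""
    A = []
    for (xa, ya), (xb, yb) in zip(pts_a, pts_b):
        A.append([-xa, -ya, -1, 0, 0, 0, xb * xa, xb * ya, xb])
        A.append([0, 0, 0, -xa, -ya, -1, yb * xa, yb * ya, yb])
    _, _, vt = np.linalg.svd(np.asarray(A, float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, pt):
    """Map a ground-plane point from camera A's image into camera B's image."""
    x = H @ np.array([pt[0], pt[1], 1.0])
    return x[0] / x[2], x[1] / x[2]
```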

  20. TES/Aura L2 Ammonia (NH3) Lite Nadir V6 (TL2NH3LN)

    Atmospheric Science Data Center

    2017-07-20

    TES/Aura L2 Ammonia (NH3) Lite Nadir (TL2NH3LN). Level: L2. Instrument: TES/Aura. Spatial Coverage: 5.3 km nadir. Parameters: Ammonia. Data access: OPeNDAP, Earthdata Search.

  1. TES/Aura L2 Carbon Dioxide (CO2) Nadir V6 (TL2CO2N)

    Atmospheric Science Data Center

    2018-01-18

    TES/Aura L2 Carbon Dioxide (CO2) Nadir (TL2CO2N). Level: L2. Platform: TES/Aura. Spatial Coverage: 5.2 x 8.5 km nadir. Parameters: Carbon Dioxide. Legacy: retired data product.

  2. TES/Aura L2 Carbon Dioxide (CO2) Nadir V6 (TL2CO2NS)

    Atmospheric Science Data Center

    2018-01-22

    TES/Aura L2 Carbon Dioxide (CO2) Nadir (TL2CO2NS). Level: L2. Platform: TES/Aura. Spatial Coverage: 5.3 x 8.5 km nadir. Parameters: Carbon Dioxide. Legacy: retired data product.

  3. Helms at photo quality window in Destiny Laboratory module

    NASA Image and Video Library

    2001-03-31

    ISS002-E-5489 (31 March 2001) --- Astronaut Susan J. Helms, Expedition Two flight engineer, views the topography of a point on Earth from the nadir window in the U.S. Laboratory / Destiny module of the International Space Station (ISS). The image was recorded with a digital still camera.

  4. The Feasibility of Tropospheric and Total Ozone Determination Using a Fabry-perot Interferometer as a Satellite-based Nadir-viewing Atmospheric Sensor. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Larar, Allen Maurice

    1993-01-01

    Monitoring of the global distribution of tropospheric ozone (O3) is desirable for enhanced scientific understanding as well as to potentially lessen the ill-health impacts associated with exposure to elevated concentrations in the lower atmosphere. Such a capability can be achieved using a satellite-based device making high spectral resolution measurements with high signal-to-noise ratios; this would enable observation in the pressure-broadened wings of strong O3 lines while minimizing the impact of undesirable signal contributions associated with, for example, the terrestrial surface, interfering species, and clouds. The Fabry-Perot Interferometer (FPI) provides high spectral resolution and high throughput capabilities that are essential for this measurement task. Through proper selection of channel spectral regions, the FPI optimized for tropospheric O3 measurements can simultaneously observe a stratospheric component and thus the total O3 column abundance. Decreasing stratospheric O3 concentrations may lead to an increase in biologically harmful solar ultraviolet radiation reaching the earth's surface, which is detrimental to health. In this research, a conceptual instrument design to achieve the desired measurement has been formulated. This involves a double-etalon fixed-gap series configuration FPI along with an ultra-narrow bandpass filter to achieve single-order operation with an overall spectral resolution of approximately 0.068 cm⁻¹. A spectral region about 1 cm⁻¹ wide centered at 1054.73 cm⁻¹ within the strong 9.6 micron ozone infrared band is sampled with 24 spectral channels. Other design characteristics include operation from a nadir-viewing satellite configuration utilizing a 9 inch (diameter) telescope and achieving horizontal spatial resolution with a 50 km nadir footprint. A retrieval technique has been implemented and is demonstrated for a tropical atmosphere possessing enhanced tropospheric ozone amounts. An error analysis

  5. The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team

    2002-12-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° × 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. An onboard calibration target and fiducial marks provide

  6. The influence of polarization on box air mass factors for UV/vis nadir satellite observations

    NASA Astrophysics Data System (ADS)

    Hilboll, Andreas; Richter, Andreas; Rozanov, Vladimir V.; Burrows, John P.

    2015-04-01

    Tropospheric abundances of pollutant trace gases such as NO2 are often derived by applying the differential optical absorption spectroscopy (DOAS) method to space-borne measurements of back-scattered and reflected solar radiation. The resulting quantity, the slant column density (SCD), subsequently has to be converted to more easily interpretable vertical column densities (VCDs) by means of the so-called box air mass factor (BAMF). The BAMF describes the ratio of SCD to VCD within one atmospheric layer and is calculated by a radiative transfer model. Current operational and scientific data products of satellite-derived trace gas VCDs do not include the effect of polarization in their radiative transfer models. However, the various scattering processes in the atmosphere do lead to a distinctive polarization pattern of the observed Earthshine spectra. This study investigates the influence of these polarization patterns on box air mass factors for satellite nadir DOAS measurements of NO2 in the UV/vis wavelength region. NO2 BAMFs have been simulated for a multitude of viewing geometries, surface albedos, and surface altitudes, using the radiative transfer model SCIATRAN. The results show a potentially large influence of polarization on the BAMF, which can reach 10% and more close to the surface. A simple correction for this effect seems not to be feasible, as it strongly depends on the specific measurement scenario and can lead to both high and low biases of the resulting NO2 VCD. We therefore conclude that all data products of NO2 VCDs derived from space-borne DOAS measurements should include polarization effects in their radiative transfer model calculations, or at least include the errors introduced by using linear models in their uncertainty estimates.
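    For orientation, the SCD-to-VCD conversion described here amounts to profile-weighting the per-layer BAMFs: AMF = Σ_i BAMF_i · x_i / Σ_i x_i, with x_i the a priori partial column in layer i, and VCD = SCD / AMF. A minimal sketch with made-up layer values (purely illustrative, not from this study):

```python
import numpy as np

def scd_to_vcd(scd, bamf, partial_columns):
    """Convert a slant column density to a vertical column density.

    bamf            : per-layer box air mass factors from a radiative transfer model
    partial_columns : a priori partial vertical columns of the trace gas per layer
    """
    amf = np.sum(bamf * partial_columns) / np.sum(partial_columns)
    return scd / amf

# Illustrative (made-up) three-layer NO2 profile and BAMFs:
bamf = np.array([1.2, 2.0, 3.5])         # near-surface layers typically have smaller BAMFs
partial = np.array([6e15, 3e15, 1e15])   # molecules / cm^2 per layer
print(scd_to_vcd(2.5e16, bamf, partial)) # VCD in molecules / cm^2
```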

  7. Field-of-View Guiding Camera on the HISAKI (SPRINT-A) Satellite

    NASA Astrophysics Data System (ADS)

    Yamazaki, A.; Tsuchiya, F.; Sakanoi, T.; Uemizu, K.; Yoshioka, K.; Murakami, G.; Kagitani, M.; Kasaba, Y.; Yoshikawa, I.; Terada, N.; Kimura, T.; Sakai, S.; Nakaya, K.; Fukuda, S.; Sawai, S.

    2014-11-01

    The HISAKI (SPRINT-A) satellite is an Earth-orbiting Extreme UltraViolet (EUV) spectroscopic mission launched on 14 September 2013 by the Epsilon-1 launch vehicle. The Extreme Ultraviolet Spectroscope (EXCEED) onboard the satellite will investigate plasma dynamics in Jupiter's inner magnetosphere and atmospheric escape from Venus and Mars. EUV spectroscopy is useful for measuring electron density, temperature, and ion composition in a plasma environment. EXCEED also has the advantage of measuring the spatial distribution of plasmas around the planets. To measure the radial plasma distribution in the Jovian inner magnetosphere, and to measure plasma emissions from the ionosphere, exosphere and tail separately (for Venus and Mars), the pointing accuracy of the spectroscope should be smaller than the spatial structures of interest (20 arc-seconds). For satellites in low Earth orbit (LEO), pointing displacement is generally caused by a change of alignment between the satellite bus module and the telescope due to the changing thermal inputs from the Sun and Earth. The HISAKI satellite is designed to compensate for this displacement by tracking the target using a Field-Of-View (FOV) guiding camera. Initial checkout of the attitude control for the EXCEED observation shows that pointing accuracy was kept within 2 arc-seconds in the "track mode" used for Jupiter observations. For observations of Mercury, Venus, Mars, and Saturn, the entire disk will be guided inside the slit to observe plasma around the planets. Since the FOV camera does not capture the disk in this case, the satellite uses a star tracker (STT) to hold the attitude ("hold mode"). Pointing accuracy during this mode has been 20-25 arc-seconds. It has been confirmed that the attitude control works as designed.

  8. A multirotor platform for mapping and inspecting sub-vertical rock faces

    NASA Astrophysics Data System (ADS)

    Thoeni, Klaus; Renton, Christopher; Giacomini, Anna

    2016-04-01

    Only in recent years has UAS technology become accessible to everyone and, hence, it is rapidly becoming a valuable tool for researchers and scientists (Westoby et al., 2012; Nex and Remondino, 2014). Electric multicopters (i.e., multirotor helicopters) are one of the most exciting developments of the last couple of years. Only the development and implementation of advanced flight controllers made the use of multicopters possible: being aerodynamically unstable UAS, they absolutely require a flight controller for stable flight. Several open-source and commercial flight controllers are now available, which makes it possible to build custom UAS. The current work presents a custom-built hexacopter (i.e., a multicopter with six rotors) which was specifically developed for 3D mapping and inspection of sub-vertical rock faces. The main sensor installed on the platform is a Canon 100D DSLR camera. The camera is attached to a two-axis gimbal. The roll angle is automatically controlled to keep the camera level during the flight, whereas the user controls the tilt angle. The two forward-facing arms of the hexacopter have been raised, i.e., they are located higher than the other four propellers (Mantis arms). This provides a clear field of view when looking forward and even makes it possible to look slightly upward without having the propellers in the field of view. A DJI A2 flight controller is installed on the platform, and an additional FPV camera can be switched on if pictures are taken in manual mode. So far, all flights have been performed in manual mode. The fact that the platform generally flies very close to very irregular sub-vertical rock faces makes autonomous flights in GPS mode almost impossible. In addition, GPS reception is often very poor around sub-vertical rock faces. One main issue when flying in manual mode is keeping the hexacopter at a constant distance from the surface. As the rock surface gets higher and higher it becomes more and more

  9. Incremental Multi-view 3D Reconstruction Starting from Two Images Taken by a Stereo Pair of Cameras

    NASA Astrophysics Data System (ADS)

    El hazzat, Soulaiman; Saaidi, Abderrahim; Karam, Antoine; Satori, Khalid

    2015-03-01

    In this paper, we present a new method for multi-view 3D reconstruction based on the use of a binocular stereo vision system consisting of two unattached cameras to initialize the reconstruction process. Afterwards, the second camera of the stereo vision system (characterized by varying parameters) moves to capture more images at different times, which are used to obtain an almost complete 3D reconstruction. The first two projection matrices are estimated by using a 3D pattern with known properties. After that, 3D scene points are recovered by triangulation of the matched interest points between these two images. The proposed approach is incremental. At each insertion of a new image, the camera projection matrix is estimated using the 3D information already calculated, and new 3D points are recovered by triangulation from the result of the matching of interest points between the inserted image and the previous image. For the refinement of the new projection matrix and the new 3D points, a local bundle adjustment is performed. At first, all projection matrices are estimated, the matches between consecutive images are detected, and a Euclidean sparse 3D reconstruction is obtained. Then, to increase the number of matches and obtain a denser reconstruction, the match propagation algorithm, which is more suitable for this kind of camera movement, was applied to the pairs of consecutive images. The experimental results show the power and robustness of the proposed approach.
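    The triangulation step that recovers 3D points from two projection matrices and matched image points can be written as a small linear (DLT) solve. The following NumPy sketch shows the standard homogeneous formulation; it illustrates the general technique and is not the authors' code.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two 3x4 projection matrices.

    x1, x2 : (u, v) pixel coordinates of the same scene point in images 1 and 2.
    Returns the 3D point in the world frame of the projection matrices.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]   # dehomogenize
```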

  10. Image quality evaluation of color displays using a Foveon color camera

    NASA Astrophysics Data System (ADS)

    Roehrig, Hans; Dallas, William J.; Fan, Jiahua; Krupinski, Elizabeth A.; Redford, Gary R.; Yoneda, Takahiro

    2007-03-01

    This paper presents preliminary data on the use of a color camera for Quality Control (QC) and Quality Analysis (QA) of a color LCD in comparison with a monochrome LCD. The color camera is a CMOS camera with a pixel size of 9 µm and a pixel matrix of 2268 × 1512 × 3. The camera uses a sensor that has co-located pixels for all three primary colors. The imaging geometry used was mostly 12 × 12 camera pixels per display pixel, even though it appears that an imaging geometry of 17.6 might provide more accurate results. The color camera is used as an imaging colorimeter, where each camera pixel is calibrated to serve as a colorimeter. This capability permits the camera to determine the chromaticity of the color LCD at different sections of the display. After the color calibration with a CS-200 colorimeter, the color coordinates of the display's primaries determined from the camera's luminance response are very close to those found from the CS-200. Only the color coordinates of the display's white point were in error. The Modulation Transfer Function (MTF) as well as the noise in terms of the Noise Power Spectrum (NPS) of both LCDs were evaluated. The horizontal MTFs of both displays have a larger negative slope than the vertical MTFs, indicating that the horizontal MTFs are poorer than the vertical MTFs. However, the modulation at the Nyquist frequency seems lower for the color LCD than for the monochrome LCD. These results contradict simulations regarding MTFs in the vertical direction. The spatial noise of the color display in both directions is larger than that of the monochrome display. Attempts were also made to separate the total noise into spatial and temporal noise by subtracting images taken at exactly the same exposure. Temporal noise seems to be significantly lower than spatial noise.
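    The spatial/temporal noise separation mentioned at the end rests on subtracting two captures taken at identical exposure: fixed-pattern (spatial) noise cancels in the difference, which then contains only temporal noise with doubled variance, and its power spectrum gives a temporal NPS. A schematic sketch with a simplified normalization (illustrative only, not the authors' exact procedure):

```python
import numpy as np

def temporal_noise_image(img_a, img_b):
    """Difference of two same-exposure captures isolates temporal noise.

    Dividing by sqrt(2) compensates for the variance doubling of the subtraction.
    """
    return (img_a.astype(float) - img_b.astype(float)) / np.sqrt(2.0)

def nps_2d(noise_img, pixel_pitch_mm):
    """2D noise power spectrum of a zero-mean noise image (simplified normalization)."""
    ny, nx = noise_img.shape
    f = np.fft.fftshift(np.fft.fft2(noise_img - noise_img.mean()))
    return (np.abs(f) ** 2) * (pixel_pitch_mm ** 2) / (nx * ny)
```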

  11. The concurrent validity and reliability of a low-cost, high-speed camera-based method for measuring the flight time of vertical jumps.

    PubMed

    Balsalobre-Fernández, Carlos; Tejero-González, Carlos M; del Campo-Vecino, Juan; Bavaresco, Nicolás

    2014-02-01

    Flight time is the most accurate and frequently used variable when assessing the height of vertical jumps. The purpose of this study was to analyze the validity and reliability of an alternative method (i.e., the HSC-Kinovea method) for measuring the flight time and height of vertical jumping using a low-cost high-speed Casio Exilim FH-25 camera (HSC). To this end, 25 subjects performed a total of 125 vertical jumps on an infrared (IR) platform while simultaneously being recorded with a HSC at 240 fps. Subsequently, 2 observers with no experience in video analysis analyzed the 125 videos independently using the open-license Kinovea 0.8.15 software. The flight times obtained were then converted into vertical jump heights, and the intraclass correlation coefficient (ICC), Bland-Altman plot, and Pearson correlation coefficient were calculated for those variables. The results showed a perfect correlation agreement (ICC = 1, p < 0.0001) between both observers' measurements of flight time and jump height and a highly reliable agreement (ICC = 0.997, p < 0.0001) between the observers' measurements of flight time and jump height using the HSC-Kinovea method and those obtained using the IR system, thus explaining 99.5% (p < 0.0001) of the differences (shared variance) obtained using the IR platform. As a result, besides requiring no previous experience in the use of this technology, the HSC-Kinovea method can be considered to provide similarly valid and reliable measurements of flight time and vertical jump height as more expensive equipment (i.e., IR). As such, coaches from many sports could use the HSC-Kinovea method to measure the flight time and height of their athlete's vertical jumps.
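    The flight-time-to-height conversion underlying this kind of measurement is the standard projectile relation h = g·t²/8 (the center of mass rises for half the flight time). A minimal sketch, assuming a 240 fps recording as in the abstract; the function names are illustrative:

```python
G = 9.81  # m/s^2

def flight_time_from_frames(n_frames, fps=240):
    """Flight time (s) from the number of video frames between take-off and landing."""
    return n_frames / fps

def jump_height_from_flight_time(flight_time_s):
    """Vertical jump height (m) from flight time (s): h = g * t^2 / 8."""
    return G * flight_time_s ** 2 / 8.0

# Example: 120 frames of flight at 240 fps -> 0.5 s flight time -> ~0.31 m jump height
print(jump_height_from_flight_time(flight_time_from_frames(120)))
```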

  12. Performance evaluation of low-cost airglow cameras for mesospheric gravity wave measurements

    NASA Astrophysics Data System (ADS)

    Suzuki, S.; Shiokawa, K.

    2016-12-01

    Atmospheric gravity waves significantly contribute to the wind/thermal balances in the mesosphere and lower thermosphere (MLT) through their vertical transport of horizontal momentum. It has been reported that the gravity wave momentum flux is preferentially associated with the scale of the waves; the momentum fluxes of waves with a horizontal scale of 10-100 km are particularly significant. Airglow imaging is a useful technique for observing the two-dimensional structure of small-scale (<100 km) gravity waves in the MLT region and has been used to investigate the global behaviour of the waves. Recent studies with simultaneous/multiple airglow cameras have derived the spatial extent of the MLT waves. Such network imaging observations are advantageous for an ever better understanding of coupling between the lower and upper atmosphere via gravity waves. In this study, we developed new low-cost airglow cameras to enlarge the airglow imaging network. Each of the cameras has a fish-eye lens with a 185-degree field of view and is equipped with a CCD video camera (WATEC WAT-910HX); the camera is small (W35.5 x H36.0 x D63.5 mm), much less expensive than the airglow cameras used for the existing ground-based network (the Optical Mesosphere Thermosphere Imagers (OMTI) operated by the Solar-Terrestrial Environment Laboratory, Nagoya University), and has a CCD sensor with 768 x 494 pixels that is sensitive enough to detect perturbations of the mesospheric OH airglow emission. In this presentation, we will report results of the performance evaluation of this camera made at Shigaraki (35 deg N, 136 deg E), Japan, which is one of the OMTI stations. By summing 15 images (i.e., a 1-minute composite of the images) we recognised clear gravity wave patterns in the images, with quality comparable to the OMTI images. Outreach and educational activities based on this research will also be reported.
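    The 1-minute compositing mentioned near the end is simple summation of co-registered frames to raise the signal-to-noise ratio of the faint OH emission. A schematic sketch (array-level only; frame acquisition and registration are assumed to happen elsewhere):

```python
import numpy as np

def composite(frames):
    """Sum a list of co-registered airglow frames (2D arrays) into one composite image.

    For uncorrelated noise, summing N frames improves the signal-to-noise ratio by
    roughly sqrt(N), at the cost of temporal resolution (here 15 frames per composite).
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.sum(axis=0)
```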

  13. a Uav-Based Low-Cost Stereo Camera System for Archaeological Surveys - Experiences from Doliche (turkey)

    NASA Astrophysics Data System (ADS)

    Haubeck, K.; Prinz, T.

    2013-08-01

    The use of Unmanned Aerial Vehicles (UAVs) for surveying archaeological sites is becoming more and more common due to their advantages in rapidity of data acquisition, cost-efficiency and flexibility. One possible usage is the documentation and visualization of historic geo-structures and -objects using UAV-attached digital small-frame cameras. These monoscopic cameras offer the possibility to obtain close-range aerial photographs, but - when an accurate nadir-waypoint flight is not possible due to choppy or windy weather conditions - two single aerial images do not always provide the overlap required for 3D photogrammetric purposes. In this paper, we present an attempt to replace the monoscopic camera with a calibrated low-cost stereo camera that takes two pictures from slightly different angles at the same time. Our results show that such a geometrically predefined stereo image pair can be used for photogrammetric purposes, e.g. the creation of digital terrain models (DTMs) and orthophotos or the 3D extraction of single geo-objects. However, because of the limited geometric photobase of the applied stereo camera and the resulting base-to-height ratio, the accuracy of the DTM depends directly on the UAV flight altitude.
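    The dependence on the base-to-height ratio noted in the last sentence follows from the standard normal-case photogrammetric error relation σ_Z ≈ (Z² / (B·f)) · σ_px, where Z is the distance to the surface, B the stereo base, f the focal length in pixels, and σ_px the image-matching precision. A quick illustrative calculation under assumed values (not figures from the paper):

```python
def depth_error(Z_m, base_m, focal_px, sigma_px=0.5):
    """Approximate 1-sigma depth (height) error for a normal-case stereo pair."""
    return (Z_m ** 2) / (base_m * focal_px) * sigma_px

# Assumed example: 20 m flying height, 0.2 m stereo base, 3000 px focal length
print(depth_error(20.0, 0.2, 3000.0))   # ~0.33 m height error per matched point
```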

  14. Indirect Vision Driving with Fixed Flat Panel Displays for Near Unity, Wide, and Extended Fields of Camera View

    DTIC Science & Technology

    2001-06-01

    The choice of camera FOV may depend on the task being performed. The driver may prefer a unity view for driving along a known route to...increase his or her perception of potential road hazards. On the other hand, the driver may prefer a compressed image at road turns for route selection...with a supervisory evaluation of the road ahead and the impact on the driving schema. Included in this

  15. Value Added: the Case for Point-of-View Camera use in Orthopedic Surgical Education.

    PubMed

    Karam, Matthew D; Thomas, Geb W; Taylor, Leah; Liu, Xiaoxing; Anthony, Chris A; Anderson, Donald D

    2016-01-01

    Orthopedic surgical education is evolving as educators search for new ways to enhance surgical skills training. Orthopedic educators should seek new methods and technologies to augment and add value to real-time orthopedic surgical experience. This paper describes a protocol whereby we have started to capture and evaluate specific orthopedic milestone procedures with a GoPro® point-of-view video camera and a dedicated video reviewing website as a way of supplementing the current paradigm in surgical skills training. We report our experience regarding the details and feasibility of this protocol. Upon identification of a patient undergoing surgical fixation of a hip or ankle fracture, an orthopedic resident places a GoPro® point-of-view camera on his or her forehead. All fluoroscopic images acquired during the case are saved and later incorporated into a video on the reviewing website. Surgical videos are uploaded to a secure server and are accessible for later review and assessment via a custom-built website. An electronic survey of resident participants was performed utilizing Qualtrics software. Results are reported using descriptive statistics. A total of 51 surgical videos involving 23 different residents have been captured to date. This includes 20 intertrochanteric hip fracture cases and 31 ankle fracture cases. The average duration of each surgical video was 1 hour and 16 minutes (range 40 minutes to 2 hours and 19 minutes). Of 24 orthopedic resident surgeons surveyed, 88% thought capturing a video portfolio of orthopedic milestones would benefit their education. There is a growing demand in orthopedic surgical education to extract more value from each surgical experience. While further work in development and refinement of such assessments is necessary, we feel that intraoperative video, particularly when captured and presented in a non-threatening, user friendly manner, can add significant value to the present and future paradigm of orthopedic surgical

  16. Measurements of the vertical profile of water vapor abundance in the Martian atmosphere from Mars Observer

    NASA Technical Reports Server (NTRS)

    Schofield, J. T.; Mccleese, Daniel J.

    1988-01-01

    An analysis is presented of the Pressure Modulator Infrared Radiometer (PMIRR) capabilities along with how the vertical profiles of water vapor will be obtained. The PMIRR will employ filter and pressure modulation radiometry using nine spectral channels, in both limb-scanning and nadir-sounding modes, to obtain daily, global maps of temperature, dust extinction, condensate extinction, and water vapor mixing ratio profiles as a function of pressure at half a scale height, or 5 km, vertical resolution. Surface thermal properties will also be mapped, and the polar radiative balance will be monitored.

  17. Earth observations taken from OV-105 during the STS-99 mission

    NASA Image and Video Library

    2000-02-17

    S99-E-5555 (17 February 2000) --- As photographed from the Space Shuttle Endeavour, this oblique electronic still image of Earth's horizon reveals a great deal of cloud cover. In the case of the electronic still camera (ESC), as well as film-bearing instruments, clouds naturally obscure views of recognizable land masses. Much of Earth is heavily cloud covered during the current mission, and meteorologists and oceanographers are interested in studying that aspect. However, the Shuttle Radar Topography Mission's other sensing equipment, the X-SAR and C-band antennae, are able to penetrate cloud cover and record important topographic data for mapmakers and scientists of other disciplines. In addition to the sensing equipment mentioned above, this mission is supporting the EarthKAM project, which utilizes the services of another electronic still camera mounted in Endeavour's windows. Unlike this oblique view, EarthKAM records strictly vertical or nadir imagery of points all over the world. Students across the United States and in France, Germany and Japan are taking photos throughout the STS-99 mission. And they are using these new photos, plus all the images already available in the EarthKAM system, to enhance their classroom learning in Earth and space science, social studies, geography, mathematics and more.

  18. Indoor calibration for stereoscopic camera STC: a new method

    NASA Astrophysics Data System (ADS)

    Simioni, E.; Re, C.; Da Deppo, V.; Naletto, G.; Borrelli, D.; Dami, M.; Ficai Veltroni, I.; Cremonese, G.

    2017-11-01

    In the framework of the ESA-JAXA BepiColombo mission to Mercury, the global mapping of the planet will be performed by the on-board Stereo Camera (STC), part of the SIMBIO-SYS suite [1]. In this paper we propose a new technique for the validation of the 3D reconstruction of a planetary surface from images acquired with a stereo camera. STC will provide a three-dimensional reconstruction of Mercury's surface. The generation of a DTM of the observed features is based on the processing of the acquired images and on the knowledge of the intrinsic and extrinsic parameters of the optical system. The new stereo concept developed for STC needs a pre-flight verification of its actual capability to obtain elevation information from stereo pairs: for this, a stereo validation setup providing an indoor reproduction of the flight observing conditions of the instrument gives much greater confidence in the developed instrument design. STC is the first stereo satellite camera with two optical channels converging on a unique sensor. Its optical model is based on a brand new concept to minimize mass and volume and to allow push-frame imaging. This model made it necessary to define a new calibration pipeline to test the reconstruction method in a controlled environment. An ad-hoc indoor setup has been realized for validating the instrument designed to operate in deep space, i.e. in-flight STC will have to deal with sources/targets essentially placed at infinity. This auxiliary indoor setup permits, on one side, rescaling the stereo reconstruction problem from the in-flight operative distance of 400 km to almost 1 meter in the lab; on the other side, it allows replicating different viewing angles for the considered targets. Neglecting Mercury's curvature for the sake of simplicity, the STC observing geometry of the same portion of the planet surface at periherm corresponds to a rotation of the spacecraft (SC) around the observed target by twice the 20° separation of each channel with respect to nadir

  19. Indoor Calibration for Stereoscopic Camera STC, A New Method

    NASA Astrophysics Data System (ADS)

    Simioni, E.; Re, C.; Da Deppo, V.; Naletto, G.; Borrelli, D.; Dami, M.; Ficai Veltroni, I.; Cremonese, G.

    2014-10-01

    In the framework of the ESA-JAXA BepiColombo mission to Mercury, the global mapping of the planet will be performed by the on-board Stereo Camera (STC), part of the SIMBIO-SYS suite [1]. In this paper we propose a new technique for the validation of the 3D reconstruction of a planetary surface from images acquired with a stereo camera. STC will provide a three-dimensional reconstruction of Mercury's surface. The generation of a DTM of the observed features is based on the processing of the acquired images and on the knowledge of the intrinsic and extrinsic parameters of the optical system. The new stereo concept developed for STC needs a pre-flight verification of its actual capability to obtain elevation information from stereo pairs: for this, a stereo validation setup providing an indoor reproduction of the flight observing conditions of the instrument gives much greater confidence in the developed instrument design. STC is the first stereo satellite camera with two optical channels converging on a unique sensor. Its optical model is based on a brand new concept to minimize mass and volume and to allow push-frame imaging. This model made it necessary to define a new calibration pipeline to test the reconstruction method in a controlled environment. An ad-hoc indoor setup has been realized for validating the instrument designed to operate in deep space, i.e. in-flight STC will have to deal with sources/targets essentially placed at infinity. This auxiliary indoor setup permits, on one side, rescaling the stereo reconstruction problem from the in-flight operative distance of 400 km to almost 1 meter in the lab; on the other side, it allows replicating different viewing angles for the considered targets. Neglecting Mercury's curvature for the sake of simplicity, the STC observing geometry of the same portion of the planet surface at periherm corresponds to a rotation of the spacecraft (SC) around the observed target by twice the 20° separation of each channel with respect to nadir

  20. Multi-layer Clouds Over the South Indian Ocean

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The complex structure and beauty of polar clouds are highlighted by these images acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on April 23, 2003. These clouds occur at multiple altitudes and exhibit a noticeable cyclonic circulation over the Southern Indian Ocean, to the north of Enderbyland, East Antarctica.

    The image at left was created by overlying a natural-color view from MISR's downward-pointing (nadir) camera with a color-coded stereo height field. MISR retrieves heights by a pattern recognition algorithm that utilizes multiple view angles to derive cloud height and motion. The opacity of the height field was then reduced until the field appears as a translucent wash over the natural-color image. The resulting purple, cyan and green hues of this aesthetic display indicate low, medium or high altitudes, respectively, with heights ranging from less than 2 kilometers (purple) to about 8 kilometers (green). In the lower right corner, the edge of the Antarctic coastline and some sea ice can be seen through some thin, high cirrus clouds.

    The right-hand panel is a natural-color image from MISR's 70-degree backward viewing camera. This camera looks backwards along the path of Terra's flight, and in the southern hemisphere the Sun is in front of this camera. This perspective causes the cloud-tops to be brightly outlined by the sun behind them, and enhances the shadows cast by clouds with significant vertical structure. An oblique observation angle also enhances the reflection of light by atmospheric particles, and accentuates the appearance of polar clouds. The dark ocean and sea ice that were apparent through the cirrus clouds at the bottom right corner of the nadir image are overwhelmed by the brightness of these clouds at the oblique view.

    The Multi-angle Imaging SpectroRadiometer observes the daylit Earth continuously from pole to pole, and every 9 days views the entire globe between 82 degrees north and 82 degrees south latitude

  1. Smoke from Fires in Southern Mexico

    NASA Technical Reports Server (NTRS)

    2002-01-01

    On May 2, 2002, numerous fires in southern Mexico sent smoke drifting northward over the Gulf of Mexico. These views from the Multi-angle Imaging SpectroRadiometer illustrate the smoke extent over parts of the Gulf and the southern Mexican states of Tabasco, Campeche and Chiapas. At the same time, dozens of other fires were also burning in the Yucatan Peninsula and across Central America. A similar situation occurred in May and June of 1998, when Central American fires resulted in air quality warnings for several U.S. States.

    The image on the left is a natural color view acquired by MISR's vertical-viewing (nadir) camera. Smoke is visible, but sunglint in some ocean areas makes detection difficult. The middle image, on the other hand, is a natural color view acquired by MISR's 70-degree backward-viewing camera; its oblique view angle simultaneously suppresses sunglint and enhances the smoke. A map of aerosol optical depth, a measurement of the abundance of atmospheric particulates, is provided on the right. This quantity is retrieved using an automated computer algorithm that takes advantage of MISR's multi-angle capability. Areas where no retrieval occurred are shown in black.

    The images each represent an area of about 380 kilometers x 1550 kilometers and were captured during Terra orbit 12616.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  2. Surgeon point-of-view recording: Using a high-definition head-mounted video camera in the operating room.

    PubMed

    Nair, Akshay Gopinathan; Kamal, Saurabh; Dave, Tarjani Vivek; Mishra, Kapil; Reddy, Harsha S; Della Rocca, David; Della Rocca, Robert C; Andron, Aleza; Jain, Vandana

    2015-10-01

    To study the utility of a commercially available small, portable ultra-high definition (HD) camera (GoPro Hero 4) for intraoperative recording. A head mount was used to fix the camera on the operating surgeon's head. Due care was taken to protect the patient's identity. The recorded video was subsequently edited and used as a teaching tool. This retrospective, noncomparative study was conducted at three tertiary eye care centers. The surgeries recorded were ptosis correction, ectropion correction, dacryocystorhinostomy, angular dermoid excision, enucleation, blepharoplasty and lid tear repair surgery (one each). The recorded videos were reviewed, edited, and checked for clarity, resolution, and reproducibility. The recorded videos were found to be high quality, which allowed for zooming and visualization of the surgical anatomy clearly. Minimal distortion is a drawback that can be effectively addressed during postproduction. The camera, owing to its lightweight and small size, can be mounted on the surgeon's head, thus offering a unique surgeon point-of-view. In our experience, the results were of good quality and reproducible. A head-mounted ultra-HD video recording system is a cheap, high quality, and unobtrusive technique to record surgery and can be a useful teaching tool in external facial and ophthalmic plastic surgery.

  3. Surgeon point-of-view recording: Using a high-definition head-mounted video camera in the operating room

    PubMed Central

    Nair, Akshay Gopinathan; Kamal, Saurabh; Dave, Tarjani Vivek; Mishra, Kapil; Reddy, Harsha S; Rocca, David Della; Rocca, Robert C Della; Andron, Aleza; Jain, Vandana

    2015-01-01

    Objective: To study the utility of a commercially available small, portable ultra-high definition (HD) camera (GoPro Hero 4) for intraoperative recording. Methods: A head mount was used to fix the camera on the operating surgeon's head. Due care was taken to protect the patient's identity. The recorded video was subsequently edited and used as a teaching tool. This retrospective, noncomparative study was conducted at three tertiary eye care centers. The surgeries recorded were ptosis correction, ectropion correction, dacryocystorhinostomy, angular dermoid excision, enucleation, blepharoplasty and lid tear repair surgery (one each). The recorded videos were reviewed, edited, and checked for clarity, resolution, and reproducibility. Results: The recorded videos were found to be high quality, which allowed for zooming and visualization of the surgical anatomy clearly. Minimal distortion is a drawback that can be effectively addressed during postproduction. The camera, owing to its lightweight and small size, can be mounted on the surgeon's head, thus offering a unique surgeon point-of-view. In our experience, the results were of good quality and reproducible. Conclusions: A head-mounted ultra-HD video recording system is a cheap, high quality, and unobtrusive technique to record surgery and can be a useful teaching tool in external facial and ophthalmic plastic surgery. PMID:26655001

  4. Automatic Camera Calibration Using Multiple Sets of Pairwise Correspondences.

    PubMed

    Vasconcelos, Francisco; Barreto, Joao P; Boyer, Edmond

    2018-04-01

    We propose a new method to add an uncalibrated node into a network of calibrated cameras using only pairwise point correspondences. While previous methods perform this task using triple correspondences, these are often difficult to establish when there is limited overlap between different views. In such challenging cases we must rely on pairwise correspondences and our solution becomes more advantageous. Our method includes an 11-point minimal solution for the intrinsic and extrinsic calibration of a camera from pairwise correspondences with two other calibrated cameras, and a new inlier selection framework that extends the traditional RANSAC family of algorithms to sampling across multiple datasets. Our method is validated on different application scenarios where a lack of triple correspondences might occur: addition of a new node to a camera network; calibration and motion estimation of a moving camera inside a camera network; and addition of views with limited overlap to a Structure-from-Motion model.

  5. PBF Reactor Building (PER620). Camera faces southeast. Concrete placement will ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera faces southeast. Concrete placement will leave opening for neutron camera to be installed later. Note vertical piping within rebar. Photographer: John Capek. Date: July 6, 1967. INEEL negative no. 67-3514 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  6. An Unusual View: MISR sees the Moon

    NASA Image and Video Library

    2017-08-17

    The job of the Multiangle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite is to view Earth. For more than 17 years, its nine cameras have stared downward 24 hours a day, faithfully collecting images used to study Earth's surface and atmosphere. On August 5, however, MISR captured some very unusual data as the Terra satellite performed a backflip in space. This maneuver was performed to allow MISR and the other instruments on Terra to catch a glimpse of the Moon, something that has been done only once before, in 2003. Why task an elderly satellite with such a radical maneuver? Since we can be confident that the Moon's brightness has remained very constant over the mission, MISR's images of the Moon can be used as a check of the instrument's calibration, allowing an independent verification of the procedures used to correct the images for any changes the cameras have experienced over their many years in space. If changes in the cameras' responses to light aren't properly accounted for, the images captured by MISR would make it appear as if Earth were growing darker or lighter, which would throw off scientists' efforts to characterize air pollution, cloud cover and Earth's climate. Because of this, the MISR team uses several methods to calibrate the data, all of which involve imaging something with a known (or independently measured) brightness and correcting the images to match that brightness. Every month, MISR views two panels of a special material called Spectralon, which reflects sunlight in a very particular way, onboard the instrument. Periodically, this calibration is checked by a field team who measures the brightness of a flat, uniformly colored surface on Earth, usually a dry desert lakebed, as MISR flies overhead. The lunar maneuver offers a third opportunity to check the brightness calibration of MISR's images. While viewing Earth, MISR's cameras are fixed at nine different angles, with one (called An) pointed straight down, four

  7. HST Solar Arrays photographed by Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This medium close-up view of one of two original Solar Arrays (SA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. This view shows the cell side of the minus V-2 panel. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

  8. A feasibility study on the measurement of tree trunks in forests using multi-scale vertical images

    NASA Astrophysics Data System (ADS)

    Berveglieri, A.; Oliveira, R. O.; Tommaselli, A. M. G.

    2014-06-01

    The Diameter at Breast Height (DBH) is an important variable in many forest studies, e.g., environmental monitoring, tree growth, wood volume, and biomass estimation. This paper presents a preliminary technique for the measurement of tree trunks using terrestrial images collected with a panoramic camera in nadir view. A multi-scale model is generated with these images. Homologous points on the trunk surface are measured over the images and their ground coordinates are determined by intersection of rays. The resulting XY coordinates of each trunk, defining an arc shape, can be used as observations in a circle fitting by least squares. Then, the DBH of each trunk is calculated from the estimated radius. Experiments were performed in two urban forest areas to assess the approach. In comparison with direct measurements on the trunks taken with a measuring tape, the discrepancies presented a Root Mean Square Error (RMSE) of 1.8 cm with a standard deviation of 0.7 cm. These results demonstrate compatibility with manual measurements and confirm the feasibility of the proposed technique.
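    The least-squares circle fit mentioned here can be done with the simple algebraic (Kåsa) formulation, which solves a linear system for the circle center and radius from the XY trunk points. An illustrative NumPy sketch (not the authors' code); the helper names are hypothetical:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares circle fit (Kasa method).

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for (a, b, c),
    giving center (a, b) and radius sqrt(c + a^2 + b^2).
    """
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    return (a, b), radius

def dbh_cm(x, y):
    """Diameter at breast height (cm) from trunk-surface XY coordinates in metres."""
    _, r = fit_circle(np.asarray(x, float), np.asarray(y, float))
    return 2.0 * r * 100.0
```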

  9. MISR CMVs and Multiangular Views of Tropical Cyclone Inner-Core Dynamics

    NASA Technical Reports Server (NTRS)

    Wu, Dong L.; Diner, David J.; Garay, Michael J; Jovanovic, Veljko M.; Lee, Jae N.; Moroney, Catherine M.; Mueller, Kevin J.; Nelson, David L.

    2010-01-01

    Multi-camera stereo imaging of cloud features from the MISR (Multiangle Imaging SpectroRadiometer) instrument on NASA's Terra satellite provides accurate and precise measurements of cloud top heights (CTH) and cloud motion vector (CMV) winds. MISR observes each cloudy scene from nine viewing angles (nadir, ±26°, ±46°, ±60°, ±70°) with approximately 275-m pixel resolution. This paper provides an update on MISR CMV and CTH algorithm improvements, and explores a high-resolution retrieval of tangential winds inside the eyewall of tropical cyclones (TC). The MISR CMV and CTH retrievals from the updated algorithm are significantly improved in terms of spatial coverage and systematic errors. A new product, the 1.1-km cross-track wind, provides high accuracy and precision in measuring convective outflows. Preliminary results obtained from the 1.1-km tangential wind retrieval inside the TC eyewall show that the inner-core rotation is often faster near the eyewall, and this faster rotation appears to be related linearly to cyclone intensity.

  10. A remote camera at Launch Pad 39B, at the Kennedy Space Center (KSC), recorded this profile view of

    NASA Technical Reports Server (NTRS)

    1996-01-01

    STS-75 LAUNCH VIEW --- A remote camera at Launch Pad 39B, at the Kennedy Space Center (KSC), recorded this profile view of the Space Shuttle Columbia as it cleared the tower to begin the mission. The liftoff occurred on schedule at 3:18:00 p.m. (EST), February 22, 1996. Onboard Columbia for the scheduled two-week mission were astronauts Andrew M. Allen, commander; Scott J. Horowitz, pilot; Franklin R. Chang-Diaz, payload commander; and astronauts Maurizio Cheli, Jeffrey A. Hoffman and Claude Nicollier, along with payload specialist Umberto Guidoni. Cheli and Nicollier represent the European Space Agency (ESA), while Guidoni represents the Italian Space Agency (ASI).

  11. NASA MISR Tracks Growth of Rift in the Larsen C Ice Shelf

    NASA Image and Video Library

    2017-04-11

    A rift in Antarctica's Larsen C ice shelf has grown to 110 miles (175 km) long, making it inevitable that an iceberg larger than Rhode Island will soon calve from the ice shelf. Larsen C is the fourth largest ice shelf in Antarctica, with an area of almost 20,000 square miles (50,000 square kilometers). The calving event will remove approximately 10 percent of the ice shelf's mass, according to the Project for Impact of Melt on Ice Shelf Dynamics and Stability (MIDAS), a UK-based team studying the ice shelf. Only 12 miles (20 km) of ice now separates the end of the rift from the ocean. The rift has grown at least 30 miles (50 km) in length since August, but appears to be slowing recently as Antarctica returns to polar winter. Project MIDAS reports that the calving event might destabilize the ice shelf, which could result in a collapse similar to what occurred to the Larsen B ice shelf in 2002. The Multi-angle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite captured views of Larsen C on August 22, 2016, when the rift was 80 miles (130 km) in length; December 8, 2016, when the rift was approximately 90 miles (145 km) long; and April 6, 2017. The MISR instrument has nine cameras, which view the Earth at different angles. The overview image, from December 8, shows the entire Antarctic Peninsula -- home to Larsen A, B, and C ice shelves -- in natural color (similar to how it would appear to the human eye) from MISR's vertical-viewing camera. Combining information from several MISR cameras pointed at different angles gives information about the texture of the ice. The accompanying GIF depicts the inset area shown on the larger image and displays data from all three dates in false color. These multiangular views -- composited from MISR's 46-degree backward-pointing camera, the nadir (vertical-viewing) camera, and the 46-degree forward-pointing camera -- represent variations in ice texture as changes in color, such that areas of rough ice appear

  12. Morning view, contextual view showing the road and gate to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing the road and gate to be widened; view taken from the statue area with the camera facing north. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  13. Modified Hitschfeld-Bordan Equations for Attenuation-Corrected Radar Rain Reflectivity: Application to Nonuniform Beamfilling at Off-Nadir Incidence

    NASA Technical Reports Server (NTRS)

    Meneghini, Robert; Liao, Liang

    2013-01-01

    As shown by Takahashi et al., multiple path attenuation estimates over the field of view of an airborne or spaceborne weather radar are feasible for off-nadir incidence angles. This follows from the fact that the surface reference technique, which provides path attenuation estimates, can be applied to each radar range gate that intersects the surface. This study builds on this result by showing that three of the modified Hitschfeld-Bordan estimates for the attenuation-corrected radar reflectivity factor can be generalized to the case where multiple path attenuation estimates are available, thereby providing a correction to the effects of nonuniform beamfilling. A simple simulation is presented showing some strengths and weaknesses of the approach.
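
    For context, one common statement of the classical Hitschfeld-Bordan solution (assuming a power-law relation k = αZ^β between specific attenuation in dB/km and reflectivity) expresses the attenuation-corrected reflectivity in terms of the measured reflectivity Z_m; the modified forms discussed in the paper additionally constrain this solution with the surface-reference path attenuation estimates, which is not shown here:

        Z(r) = \frac{Z_m(r)}{\left[\, 1 - 0.2\,\ln(10)\,\beta \int_0^{r} \alpha(s)\, Z_m^{\beta}(s)\, ds \,\right]^{1/\beta}}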

  14. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ≈4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.
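
    As a rough worked example, the quoted angular resolutions translate to the following spatial scales at a few representative ranges (Python; the ranges are illustrative only):

        # Spatial scale implied by the quoted angular resolutions at a few ranges.
        for name, mrad_per_pixel in [("Navcam", 0.82), ("Hazcam", 2.1)]:
            for range_m in (2.0, 10.0, 100.0):
                cm_per_pixel = mrad_per_pixel * 1e-3 * range_m * 100.0
                print(f"{name}: {cm_per_pixel:.2f} cm/pixel at {range_m:.0f} m")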

  15. Decline of vertical gaze and convergence with aging.

    PubMed

    Oguro, Hiroaki; Okada, Kazunori; Suyama, Nobuo; Yamashita, Kazuya; Yamaguchi, Shuhei; Kobayashi, Shotai

    2004-01-01

    Disturbance of vertical eye movement and ocular convergence is often observed in elderly people, but little is known about its frequency. The purpose of this study was to investigate age-associated changes in vertical eye movement and convergence in healthy elderly people, using a digital video camera system. We analyzed vertical eye movements and convergence in 113 neurologically normal elderly subjects (mean age 70 years) in comparison with 20 healthy young controls (mean age 32 years). The range of vertical eye movement was analyzed quantitatively and convergence was analyzed qualitatively. In the elderly subjects, the angle of vertical gaze decreased with advancing age and it was significantly smaller than that of the younger subjects. The mean angle of upward gaze was significantly smaller than that of downward gaze for both young and elderly subjects. Upward gaze impairment became apparent in subjects in their 70s, and downward gaze impairment in subjects in their 60s. Disturbance in convergence also increased with advancing age, and was found in 40.7% of the elderly subjects. These findings indicate that the mechanisms of age-related change are different for upward and downward vertical gaze. Digital video camera monitoring was useful for assessing and monitoring eye movements. Copyright 2004 S. Karger AG, Basel

  16. Versatile microsecond movie camera

    NASA Astrophysics Data System (ADS)

    Dreyfus, R. W.

    1980-03-01

    A laboratory-type movie camera is described which satisfies many requirements in the range 1 microsec to 1 sec. The camera consists of a He-Ne laser and compatible state-of-the-art components; the primary components are an acoustooptic modulator, an electromechanical beam deflector, and a video tape system. The present camera is distinct in its operation in that submicrosecond laser flashes freeze the image motion while still allowing the simplicity of electromechanical image deflection in the millisecond range. The gating and pulse delay circuits of an oscilloscope synchronize the modulator and scanner relative to the subject being photographed. The optical table construction and electronic control enhance the camera's versatility and adaptability. The instant replay video tape recording allows for easy synchronization and immediate viewing of the results. Economy is achieved by using off-the-shelf components, optical table construction, and short assembly time.

  17. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared camera and CCD camera dual-band imaging systems are widely used in many types of equipment and applications. If such a system is tested with a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position as the environmental temperature changes, which also improves the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, and it is expected to do well in the market.
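
    A minimal sketch of the multiple-frame averaging step mentioned above (Python, with hypothetical array names): for independent, zero-mean noise, averaging N frames reduces the noise standard deviation by roughly 1/sqrt(N).

        import numpy as np

        def average_frames(frames):
            """Average a stack of frames with shape (N, H, W) to suppress random noise."""
            return np.mean(np.asarray(frames, dtype=np.float64), axis=0)

        # quick self-check with synthetic noisy frames: std drops from ~1.0 to ~0.25
        frames = np.random.normal(0.0, 1.0, size=(16, 64, 64))
        print(np.std(frames[0]), np.std(average_frames(frames)))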

  18. Hubble Space Telescope photographed by Electronic Still Camera

    NASA Image and Video Library

    1993-12-04

    S61-E-008 (4 Dec 1993) --- This view of the Earth-orbiting Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and down linked to ground controllers soon afterward. This view was taken during rendezvous operations. Endeavour's crew captured the HST on December 4, 1993 in order to service the telescope. Over a period of five days, four of the crew members will work in alternating pairs outside Endeavour's shirt sleeve environment. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  19. Supertyphoon Nepartak Barreling Toward Taiwan Viewed by NASA MISR

    NASA Image and Video Library

    2016-07-08

    Typhoon Nepartak, the first large typhoon in the northwest Pacific this season, is currently taking aim at the east coast of Taiwan. Over the past few days, Nepartak has rapidly gained strength, growing from a tropical storm to the equivalent of a Category 5 hurricane with sustained wind speeds of more than 160 miles (258 kilometers) per hour. Taiwan's Central Weather Bureau has issued a torrential rain warning, bracing for likely flooding as 5 to 15 inches (13 to 38 centimeters) of rain are expected to fall over Taiwan during the storm's passage. Waves of up to 40 feet (12 meters) are predicted on the coast as the typhoon approaches, and air and train travel have been severely impacted. The typhoon is currently moving at about 10 miles per hour (16 kilometers per hour) to the west-northwest, and is predicted to pass over Taiwan within the next day and then hit the coast of mainland China. Central and eastern China are poorly situated to absorb the rainfall from Nepartak after suffering the effects of severe monsoon flooding, which has killed at least 140 people in the past week. The Multi-angle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite captured this view of Typhoon Nepartak on July 7, 2016, at 10:30 a.m. local time (2:30 a.m. UTC). On the left is an image from the nadir (vertical-pointing) camera, which shows the central portion of Nepartak and the storm's eye. The image is about 235 miles (378 kilometers) across. The Philippine island of Luzon, about 250 miles (400 kilometers) south of Taiwan, is visible to the southwest of the eye. The image shows that Nepartak's center is extremely compact, rather than broken up into spiral bands as is more typical of typhoons. This means that the storm may retain more of its strength as it passes over land. MISR uses nine cameras to capture images of the typhoon from different angles. This provides a stereographic view, which can be used to determine the height of the storm's cloud tops. These

  20. Semi-autonomous wheelchair system using stereoscopic cameras.

    PubMed

    Nguyen, Jordan S; Nguyen, Thanh H; Nguyen, Hung T

    2009-01-01

    This paper is concerned with the design and development of a semi-autonomous wheelchair system using stereoscopic cameras to assist hands-free control technologies for severely disabled people. The stereoscopic cameras capture an image from both the left and right cameras, which are then processed with a Sum of Absolute Differences (SAD) correlation algorithm to establish correspondence between image features in the different views of the scene. This is used to produce a stereo disparity image containing information about the depth of objects away from the camera in the image. A geometric projection algorithm is then used to generate a 3-Dimensional (3D) point map, placing pixels of the disparity image in 3D space. This is then converted to a 2-Dimensional (2D) depth map allowing objects in the scene to be viewed and a safe travel path for the wheelchair to be planned and followed based on the user's commands. This assistive technology utilising stereoscopic cameras has the purpose of automated obstacle detection, path planning and following, and collision avoidance during navigation. Experimental results obtained in an indoor environment displayed the effectiveness of this assistive technology.
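
    A minimal sketch of SAD block matching of the kind named above (Python/NumPy; the window size and disparity search range are illustrative, and the wheelchair system's actual implementation may differ):

        import numpy as np

        def sad_disparity(left, right, window=5, max_disp=32):
            """Brute-force SAD block matching on rectified grayscale images (H, W)."""
            h, w = left.shape
            half = window // 2
            disp = np.zeros((h, w), dtype=np.int32)
            for y in range(half, h - half):
                for x in range(half, w - half):
                    patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
                    best_cost, best_d = np.inf, 0
                    for d in range(0, min(max_disp, x - half) + 1):
                        cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1].astype(np.int32)
                        cost = np.abs(patch - cand).sum()
                        if cost < best_cost:
                            best_cost, best_d = cost, d
                    disp[y, x] = best_d
            return disp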

  1. Harbour surveillance with cameras calibrated with AIS data

    NASA Astrophysics Data System (ADS)

    Palmieri, F. A. N.; Castaldo, F.; Marino, G.

    The inexpensive availability of surveillance cameras, easily connected in network configurations, suggests the deployment of this additional sensor modality in port surveillance. Vessels appearing within the cameras' fields of view can be recognized and localized, providing to fusion centers information that can be added to data coming from Radar, Lidar, AIS, etc. Camera systems used as localizers, however, must be properly calibrated in changing scenarios where there is often limited choice of the positions at which they are deployed. Automatic Identification System (AIS) data, which include position, course, and vessel identity, and which are freely available through inexpensive receivers for some of the vessels appearing within the field of view, provide the opportunity to achieve proper camera calibration to be used for the localization of vessels not equipped with AIS transponders. In this paper we assume a pinhole model for the camera geometry and propose computing the perspective matrices from AIS positional data. Images obtained from the calibrated cameras are then matched, and pixel association is used to localize other vessels. We report preliminary experimental results of calibration and localization using two cameras deployed on the Gulf of Naples coastline. The two cameras overlook a section of the harbour and record short video sequences that are synchronized offline with AIS positional information of easily identified passenger ships. Other small vessels, not equipped with AIS transponders, are localized using the camera matrices and pixel matching. Localization accuracy is experimentally evaluated as a function of target distance from the sensors.
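
    One standard way to compute such perspective (projection) matrices from AIS-derived ground truth is the direct linear transform over at least six world-to-image correspondences; the sketch below (Python/NumPy) is illustrative and not necessarily the authors' exact procedure, and it assumes the AIS positions have already been converted to a local Cartesian frame.

        import numpy as np

        def dlt_projection_matrix(world_xyz, pixels_uv):
            """Estimate a 3x4 projection matrix P such that [u, v, 1] ~ P @ [X, Y, Z, 1].

            world_xyz : (N, 3) AIS-derived positions, N >= 6 and non-degenerate
            pixels_uv : (N, 2) matching image coordinates of the same vessels
            """
            rows = []
            for (X, Y, Z), (u, v) in zip(world_xyz, pixels_uv):
                rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
                rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
            _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=np.float64))
            # the solution is the right singular vector of the smallest singular value
            return Vt[-1].reshape(3, 4)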

  2. Highest-Resolution View of 'Face on Mars'

    NASA Technical Reports Server (NTRS)

    2001-01-01

    A key aspect of the Mars Global Surveyor (MGS) Extended Mission is the opportunity to turn the spacecraft and point the Mars Orbiter Camera (MOC) at specific features of interest. A chance to point the spacecraft comes about ten times a week. Throughout the Primary Mission (March 1999 - January 2001), nearly all MGS operations were conducted with the spacecraft pointing 'nadir'--that is, straight down. In this orientation, opportunities to hit a specific small feature of interest were in some cases rare, and in other cases non-existent. In April 1998, nearly a year before MGS reached its Primary Mission mapping orbit, several tests of the spacecraft's ability to be pointed at specific features were conducted with great success (e.g., Mars Pathfinder landing site, Viking 1 site, and Cydonia landforms). When the Mars Polar Lander was lost in December 1999, this capability was again employed to search for the missing lander. Following the lander search activities, a plan to conduct similar off-nadir observations during the MGS Extended Mission was put into place. The Extended Mission began February 1, 2001. On April 8, 2001, the first opportunity since April 1998 arose to turn the spacecraft and point the MOC at the popular 'Face on Mars' feature.

    Viking orbiter images acquired in 1976 showed that one of thousands of buttes, mesas, ridges, and knobs in the transition zone between the cratered uplands of western Arabia Terra and the low, northern plains of Mars looked somewhat like a human face. The feature was subsequently popularized as a potential 'alien artifact' in books, tabloids, radio talk shows, television, and even a major motion picture. Given the popularity of this landform, a new high-resolution view was targeted by pointing the spacecraft off-nadir on April 8, 2001. On that date at 20:54 UTC (8:54 p.m., Greenwich time zone), the MGS was rolled 24.8° to the left so that it was looking at the 'face' 165 km to the side from a distance of about 450 km

  3. Localization and Mapping Using a Non-Central Catadioptric Camera System

    NASA Astrophysics Data System (ADS)

    Khurana, M.; Armenakis, C.

    2018-05-01

    This work details the development of an indoor navigation and mapping system using a non-central catadioptric omnidirectional camera and its implementation for mobile applications. Omnidirectional catadioptric cameras find their use in navigation and mapping of robotic platforms, owing to their wide field of view. Having a wider field of view, or rather a potential 360° field of view, allows the system to "see and move" more freely in the navigation space. A catadioptric camera system is a low-cost system which consists of a mirror and a camera; any perspective camera can be used. A platform was constructed to combine the mirror and a camera into a catadioptric system. A calibration method was developed to obtain the relative position and orientation between the two components so that they can be treated as one monolithic system. The mathematical model for localizing the system was derived from conditions based on the reflective properties of the mirror. The obtained platform positions were then used to map the environment using epipolar geometry. Experiments were performed to test the mathematical models and the achieved localization and mapping accuracies of the system. An iterative process of positioning and mapping was applied to determine object coordinates of an indoor environment while navigating the mobile platform. Camera localization and the 3D coordinates of object points achieved decimetre-level accuracies.

  4. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  5. Postlaunch Performance of the Suomi National Polar-Orbiting Partnership Ozone Mapping and Profiler Suite (OMPS) Nadir Sensors

    NASA Technical Reports Server (NTRS)

    Seftor, C. J.; Jaross, G.; Kowitt, M.; Haken, M.; Li, J.; Flynn, L. E.

    2014-01-01

    The prelaunch specifications for nadir sensors of the Ozone Mapping and Profiler Suite (OMPS) were designed to ensure that measurements from them could be used to retrieve total column ozone and nadir ozone profile information both for operational use and for use in long-term ozone data records. In this paper, we will show results from our extensive analysis of the performance of the nadir mapper (NM) and nadir profiler (NP) sensors during the first year and a half of OMPS nadir operations. In most cases, we determined that both sensors meet or exceed their prelaunch specifications. Normalized radiance (radiance divided by irradiance) measurements have been determined to be well within their 2% specification for both sensors. In the case of stray light, the NM sensor is within its 2% specification for all but the shortest wavelengths, while the NP sensor is within its 2% specification for all but the longest wavelengths. Artifacts that negatively impacted the sensor calibration due to diffuser features were reduced to less than 1% through changes made in the solar calibration sequence. Preliminary analysis of the disagreement between measurements made by the NM and NP sensors in the region where their wavelengths overlap indicates that it is due to shifts in the shared dichroic filter after launch and that it can be corrected. In general, our analysis indicates that both the NM and NP sensors are performing well, that they are stable, and that any deviations from nominal performance can be well characterized and corrected.

  6. The Camera Is Not a Methodology: Towards a Framework for Understanding Young Children's Use of Video Cameras

    ERIC Educational Resources Information Center

    Bird, Jo; Colliver, Yeshe; Edwards, Susan

    2014-01-01

    Participatory research methods argue that young children should be enabled to contribute their perspectives on research seeking to understand their worldviews. Visual research methods, including the use of still and video cameras with young children have been viewed as particularly suited to this aim because cameras have been considered easy and…

  7. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-03-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor (the size of the standard 35mm frame) with the means to select left and right image information. Even with the added stereoscopic capability the appearance of existing camera bodies will be unaltered.

  8. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-07-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  9. Clinical assessment of oral mucositis and candidiasis compare to chemotherapic nadir in transplanted patients.

    PubMed

    Patussi, Cleverson; Sassi, Laurindo Moacir; Munhoz, Eduardo Ciliao; Zanicotti, Roberta Targa Stramandinoli; Schussel, Juliana Lucena

    2014-01-01

    Oral mucositis is a chief complication in patients undergoing hematopoietic stem cell transplantation (HSCT). It is considered a toxic inflammatory reaction that interferes with the patient's recuperation and quality of life. Oral candidiasis is a common fungal infection observed in dental practice, particularly in immunocompromised patients. The aim of this study was to evaluate the presence of oral mucositis and oral candidiasis in patients who underwent HSCT and their correlation with the chemotherapeutic nadir (the lowest point reached by blood cell counts after chemotherapy). We evaluated patients with different diagnoses who underwent HSCT at the Hospital Erasto Gaertner. No chemotherapeutic nadir curves could be associated with mucositis, and patients had different presentations of mucositis. No patient developed oral candidiasis during hospitalization. Together with cell counts, we collected demographic data including age, oral hygiene, habits harmful to health, and the use of oral prostheses. It was observed that patients who smoked cigarettes before hospitalization showed less mucositis, resulting in no feeding problems or other comorbid conditions due to the effect of mucositis. However, the nadir of the chemotherapy curve, in isolation, is not a predictive tool for the appearance (or absence) of oral mucositis.

  10. Depth Perception In Remote Stereoscopic Viewing Systems

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Von Sydow, Marika

    1989-01-01

    Report describes theoretical and experimental studies of perception of depth by human operators through stereoscopic video systems. Purpose of such studies is to optimize dual-camera configurations used to view workspaces of remote manipulators at distances of 1 to 3 m from cameras. According to analysis, static stereoscopic depth distortion is decreased, without decreasing stereoscopic depth resolution, by increasing camera-to-object and intercamera distances and camera focal length. Analysis further predicts that dynamic stereoscopic depth distortion is reduced by rotating cameras around the center of a circle passing through the point of convergence of the viewing axes and the first nodal points of the two camera lenses.

  11. Multi-band infrared camera systems

    NASA Astrophysics Data System (ADS)

    Davis, Tim; Lang, Frank; Sinneger, Joe; Stabile, Paul; Tower, John

    1994-12-01

    The program resulted in an IR camera system that utilizes a unique MOS addressable focal plane array (FPA) with full TV resolution, electronic control capability, and windowing capability. Two systems were delivered, each with two different camera heads: a Stirling-cooled 3-5 micron band head and a liquid nitrogen-cooled, filter-wheel-based, 1.5-5 micron band head. Signal processing features include averaging up to 16 frames, flexible compensation modes, gain and offset control, and real-time dither. The primary digital interface is a Hewlett-Packard standard GPIB (IEEE-488) port that is used to upload and download data. The FPA employs an X-Y addressed PtSi photodiode array, CMOS horizontal and vertical scan registers, horizontal signal line (HSL) buffers followed by a high-gain preamplifier and a depletion NMOS output amplifier. The 640 x 480 MOS X-Y addressed FPA has a high degree of flexibility in operational modes. By changing the digital data pattern applied to the vertical scan register, the FPA can be operated in either an interlaced or noninterlaced format. The thermal sensitivity performance of the second system's Stirling-cooled head was the best of the systems produced.

  12. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks.

    PubMed

    Su, Po-Chang; Shen, Ju; Xu, Wanxin; Cheung, Sen-Ching S; Luo, Ying

    2018-01-15

    From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds.
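
    As background for the rigid-transformation baseline mentioned above, a closed-form least-squares estimate of the rotation and translation aligning corresponding 3D points (the Kabsch/Procrustes solution) can be sketched as follows (Python/NumPy; this is a generic textbook routine, not the authors' full calibration pipeline):

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares rigid transform (R, t) with dst ≈ R @ src + t.

            src, dst : (N, 3) corresponding 3D points from two RGB-D views.
            """
            src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
            H = (src - src_c).T @ (dst - dst_c)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:        # guard against an improper (reflected) rotation
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = dst_c - R @ src_c
            return R, t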

  13. Near vertical view of Lubbock area in west Texas as seen from Apollo 9

    NASA Image and Video Library

    1969-03-11

    AS09-23-3561 (3-13 March 1969) --- Near vertical view of the Lubbock area in west Texas as photographed from the Apollo 9 spacecraft during its Earth-orbital mission. Conspicuous patterns of farmland surround the city and extend eastward (up) to the Caprock Escarpment. The Double Mountain Fork of the Brazos River drains east (toward upper center); Levelland is at lower center; Brownfield at lower right. The sharp edge of a cloud disk cuts across the upper right corner.

  14. HIGH SPEED KERR CELL FRAMING CAMERA

    DOEpatents

    Goss, W.C.; Gilley, L.F.

    1964-01-01

    The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 × 10⁻⁸ seconds during any such time interval of an occurring event. The invention utilizes particularly an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length in whole multiples of the first channel optical path length, into which optical paths the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)

  15. An electronic pan/tilt/zoom camera system

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steve; Martin, H. Lee

    1991-01-01

    A camera system for omnidirectional image viewing applications that provides pan, tilt, zoom, and rotational orientation within a hemispherical field of view (FOV) using no moving parts was developed. The imaging device is based on the principle that the distorted image from a fisheye lens, which produces a circular image of an entire hemispherical FOV, can be mathematically corrected using high-speed electronic circuitry. An incoming fisheye image from any image acquisition source is captured in the memory of the device, a transformation is performed for the viewing region of interest and viewing direction, and a corrected image is output as a video image signal for viewing, recording, or analysis. As a result, this device can accomplish the functions of pan, tilt, rotation, and zoom throughout a hemispherical FOV without the need for any mechanical mechanisms. A programmable transformation processor provides flexible control over viewing situations. Multiple images, each with different image magnification and pan, tilt, and rotation parameters, can be obtained from a single camera. The image transformation device can provide corrected images at frame rates compatible with RS-170 standard video equipment.
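
    A toy example of the kind of per-pixel mapping such a device performs, assuming an ideal equidistant fisheye model (r = f·θ); the real system implements the transform in dedicated hardware and its lens model may differ:

        import numpy as np

        def fisheye_lookup(direction, f_pix, cx, cy):
            """Map a unit viewing direction (x, y, z), with z along the optical axis,
            to fisheye pixel coordinates under an equidistant model r = f * theta."""
            x, y, z = direction
            theta = np.arccos(np.clip(z, -1.0, 1.0))   # angle off the optical axis
            phi = np.arctan2(y, x)                     # azimuth in the image plane
            r = f_pix * theta
            return cx + r * np.cos(phi), cy + r * np.sin(phi)

        # A corrected perspective view is built by evaluating this lookup for every
        # output pixel's viewing direction and interpolating the fisheye image there.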

  16. Value Added: the Case for Point-of-View Camera use in Orthopedic Surgical Education

    PubMed Central

    Thomas, Geb W.; Taylor, Leah; Liu, Xiaoxing; Anthony, Chris A.; Anderson, Donald D.

    2016-01-01

    Background: Orthopedic surgical education is evolving as educators search for new ways to enhance surgical skills training. Orthopedic educators should seek new methods and technologies to augment and add value to real-time orthopedic surgical experience. This paper describes a protocol whereby we have started to capture and evaluate specific orthopedic milestone procedures with a GoPro® point-of-view video camera and a dedicated video reviewing website as a way of supplementing the current paradigm in surgical skills training. We report our experience regarding the details and feasibility of this protocol. Methods: Upon identification of a patient undergoing surgical fixation of a hip or ankle fracture, an orthopedic resident places a GoPro® point-of-view camera on his or her forehead. All fluoroscopic images acquired during the case are saved and later incorporated into a video on the reviewing website. Surgical videos are uploaded to a secure server and are accessible for later review and assessment via a custom-built website. An electronic survey of resident participants was performed utilizing Qualtrics software. Results are reported using descriptive statistics. Results: A total of 51 surgical videos involving 23 different residents have been captured to date. This includes 20 intertrochanteric hip fracture cases and 31 ankle fracture cases. The average duration of each surgical video was 1 hour and 16 minutes (range 40 minutes to 2 hours and 19 minutes). Of 24 orthopedic resident surgeons surveyed, 88% thought capturing a video portfolio of orthopedic milestones would benefit their education. Conclusions: There is a growing demand in orthopedic surgical education to extract more value from each surgical experience. While further work in development and refinement of such assessments is necessary, we feel that intraoperative video, particularly when captured and presented in a non-threatening, user friendly manner, can add significant value to the

  17. Press Release Image - STS-1 - Earth View

    NASA Image and Video Library

    1981-04-12

    S81-30396 (12-14 April 1981) --- A vertical view of Eleuthera Island in the Bahamas and part of the great Bahama Bank, as photographed with a 70mm handheld camera from the space shuttle Columbia in Earth orbit. The light blue of the Bahama Bank contrasts sharply with the darker blue of the deep ocean waters. Astronauts John W. Young, commander, and Robert L. Crippen, pilot, took a series of Earth photos from inside the flight deck of the Columbia, which has windows on its top side, convenient for shooting photographs as the spacecraft flew 'upside down' above Earth. The mission frame ID number is STS001-12-322. Photo credit: NASA

  18. Morning view, brick post detail; view also shows dimensional wall construction ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, brick post detail; view also shows dimensional wall-construction detail. North wall, with the camera facing northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  19. Lights, Camera, AG-Tion: Promoting Agricultural and Environmental Education on Camera

    ERIC Educational Resources Information Center

    Fuhrman, Nicholas E.

    2016-01-01

    Viewing of online videos and television segments has become a popular and efficient way for Extension audiences to acquire information. This article describes a unique approach to teaching on camera that may help Extension educators communicate their messages with comfort and personality. The S.A.L.A.D. approach emphasizes using relevant teaching…

  20. First-day newborn weight loss predicts in-hospital weight nadir for breastfeeding infants.

    PubMed

    Flaherman, Valerie J; Bokser, Seth; Newman, Thomas B

    2010-08-01

    Exclusive breastfeeding reduces infant infectious disease. Losing ≥10% of birth weight may lead to formula use. The predictive value of first-day weight loss for subsequent weight loss has not been studied. The objective of the present study was to evaluate the relationship between weight loss at <24 hours and subsequent in-hospital weight loss ≥10%. For 1,049 infants, we extracted gestational age, gender, delivery method, feeding type, and weights from medical records. Weight nadir was defined as the lowest weight recorded during the birth hospitalization. We used multivariate logistic regression to assess the effect of first-day weight loss on subsequent in-hospital weight loss. Mean in-hospital weight nadir was 6.0 ± 2.6%, and mean age at in-hospital weight nadir was 38.7 ± 18.5 hours. While in the hospital, 6.4% of infants lost ≥10% of birth weight. Infants losing ≥4.5% of birth weight at <24 hours had a greater risk of eventual in-hospital weight loss ≥10% (adjusted odds ratio 3.57 [1.75, 7.28]). In this cohort, 798 (76.1%) infants did not have documented weight gain while in the hospital. Early weight loss predicts higher risk of ≥10% in-hospital weight loss. Infants with high first-day weight loss could be targeted for further research into improved interventions to promote breastfeeding.

  1. The PLATO camera

    NASA Astrophysics Data System (ADS)

    Laubier, D.; Bodin, P.; Pasquier, H.; Fredon, S.; Levacher, P.; Vola, P.; Buey, T.; Bernardi, P.

    2017-11-01

    PLATO (PLAnetary Transits and Oscillation of stars) is a candidate for the M3 Medium-size mission of the ESA Cosmic Vision programme (2015-2025 period). It is aimed at the detection of Earth-size and Earth-mass planets in the habitable zone of bright stars and at their characterisation using the transit method and the asteroseismology of their host stars. That means observing more than 100 000 stars brighter than magnitude 11, and more than 1 000 000 brighter than magnitude 13, with a long continuous observing time for 20 % of them (2 to 3 years). This yields a need for an unusually long-term signal stability. For the brighter stars, the noise requirement is less than 34 ppm·hr^(-1/2), from a frequency of 40 mHz down to 20 μHz, including all sources of noise such as the motion of the star images on the detectors and frequency beatings. Those extremely tight requirements result in a payload consisting of 32 synchronised, high-aperture, wide-field-of-view cameras thermally regulated down to -80°C, whose data are combined to increase the signal-to-noise performance. They are split into 4 different subsets pointing in 4 directions to widen the total field of view; stars in the centre of that field of view are observed by all 32 cameras. Two extra cameras are used with color filters and provide pointing measurements to the spacecraft Attitude and Orbit Control System (AOCS) loop. The satellite orbits the Sun at the L2 Lagrange point. This paper presents the optical, electronic and electrical, thermal and mechanical designs devised to achieve those requirements, and the results from breadboards developed for the optics, the focal plane, the power supply and video electronics.

  2. Limb-Nadir Matching for Tropospheric NO2: A New Algorithm in the SCIAMACHY Operational Level 2 Processor

    NASA Astrophysics Data System (ADS)

    Meringer, Markus; Gretschany, Sergei; Lichtenberg, Gunter; Hilboll, Andreas; Richter, Andreas; Burrows, John P.

    2015-11-01

    SCIAMACHY (SCanning Imaging Absorption spectroMeter for Atmospheric ChartographY) aboard ESA's environmental satellite ENVISAT observed the Earth's atmosphere in limb, nadir, and solar/lunar occultation geometries covering the UV-Visible to NIR spectral range. Limb and nadir geometries were the main operation modes for the retrieval of scientific data. The new version 6 of ESA's level 2 processor now provides for the first time an operational algorithm to combine measurements of these two geometries in order to generate new products. As a first instance the retrieval of tropospheric NO2 has been implemented based on IUP-Bremen's reference algorithm. We will detail the single processing steps performed by the operational limb-nadir matching algorithm and report the results of comparisons with the scientific tropospheric NO2 products of IUP and the Tropospheric Emission Monitoring Internet Service (TEMIS).
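
    In general terms, limb-nadir matching for tropospheric NO2 subtracts a limb-derived stratospheric column from the nadir total slant column and converts the residual with a tropospheric air-mass factor. A schematic form is given below; the exact operational formulation in the version 6 processor may differ:

        \mathrm{VCD}_{\mathrm{trop}} = \frac{\mathrm{SCD}_{\mathrm{nadir}} - \mathrm{AMF}_{\mathrm{strat}}\cdot \mathrm{VCD}_{\mathrm{strat}}^{\mathrm{limb}}}{\mathrm{AMF}_{\mathrm{trop}}}

    where SCD denotes a slant column density, VCD a vertical column density, and AMF an air-mass factor.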

  3. Fluctuations of Lake Eyre, South Australia

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Lake Eyre is a large salt lake situated between two deserts in one of Australia's driest regions. However, this low-lying lake attracts run-off from one of the largest inland drainage systems in the world. The drainage basin is very responsive to rainfall variations, and changes dramatically with Australia's inter-annual weather fluctuations. When Lake Eyre fills, as it did in 1989, it is temporarily Australia's largest lake, and becomes dense with birds, frogs and colorful plant life. The Lake responds to extended dry periods (often associated with El Nino events) by drying completely.

    These four images from the Multi-angle Imaging SpectroRadiometer contrast the lake area at the start of the austral summers of 2000 and 2002. The top two panels portray the region as it appeared on December 9, 2000. Heavy rains in the first part of 2000 caused both the north and south sections of the lake to fill partially and the northern part of the lake still contained significant standing water by the time these data were acquired. The bottom panels were captured on November 29, 2002. Rainfall during 2002 was significantly below average ( http://www.bom.gov.au/ ), although showers occurring in the week before the image was acquired helped alleviate this condition slightly.

    The left-hand panels portray the area as it appeared to MISR's vertical-viewing (nadir) camera, and are false-color views comprised of data from the near-infrared, green and blue channels. Here, wet and/or moist surfaces appear blue-green, since water selectively absorbs longer wavelengths such as near-infrared. The right-hand panels are multi-angle composites created with red band data from MISR's 60-degree forward, nadir and 60-degree backward-viewing cameras, displayed as red, green and blue, respectively. In these multi-angle composites, color variations serve as a proxy for changes in angular reflectance, and indicate textural properties of the surface related to roughness and/or moisture

  4. Development of a camera casing suited for cryogenic and vacuum applications

    NASA Astrophysics Data System (ADS)

    Delaquis, S. C.; Gornea, R.; Janos, S.; Lüthi, M.; von Rohr, Ch Rudolf; Schenk, M.; Vuilleumier, J.-L.

    2013-12-01

    We report on the design, construction, and operation of a PID temperature-controlled and vacuum-tight camera casing. The camera casing contains a commercial digital camera and a lighting system. The design of the camera casing and its components is discussed in detail. Pictures taken by this cryo-camera while immersed in argon vapour and liquid nitrogen are presented. The cryo-camera can provide a live view inside cryogenic set-ups and allows video to be recorded.

  5. America National Parks Viewed in 3D by NASA MISR Anaglyph 4

    NASA Image and Video Library

    2016-08-25

    Just in time for the U.S. National Park Service's Centennial celebration on Aug. 25, NASA's Multiangle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite is releasing four new anaglyphs that showcase 33 of our nation's national parks, monuments, historical sites and recreation areas in glorious 3D. Shown in the annotated image are Sequoia National Park, Kings Canyon National Park, Manzanar National Historic Site, Devils Postpile National Monument, Yosemite National Park, and parts of Death Valley National Park. MISR views Earth with nine cameras pointed at different angles, giving it the unique capability to produce anaglyphs, stereoscopic images that allow the viewer to experience the landscape in three dimensions. The anaglyphs were made by combining data from MISR's vertical-viewing and 46-degree forward-pointing camera. You will need red-blue glasses in order to experience the 3D effect; ensure you place the red lens over your left eye. The images have been rotated so that north is to the left in order to enable 3D viewing because the Terra satellite flies from north to south. All of the images are 235 miles (378 kilometers) from west to east. These data were acquired July 7, 2016, Orbit 88051. http://photojournal.jpl.nasa.gov/catalog/PIA20892
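
    A minimal sketch of how a red-blue anaglyph like these can be assembled from two co-registered views of the same scene (Python/NumPy; the registration and rotation steps are omitted, and this is not MISR's production processing):

        import numpy as np

        def make_anaglyph(left_rgb, right_rgb):
            """Red channel from the left-eye view, green and blue from the right-eye view.

            left_rgb, right_rgb : co-registered (H, W, 3) arrays of the same scene seen
            from two angles (e.g. the 46-degree forward and nadir cameras).
            """
            out = np.empty_like(left_rgb)
            out[..., 0] = left_rgb[..., 0]      # red channel from the left-eye image
            out[..., 1:] = right_rgb[..., 1:]   # green and blue from the right-eye image
            return out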

  6. America National Parks Viewed in 3D by NASA MISR Anaglyph 2

    NASA Image and Video Library

    2016-08-25

    Just in time for the U.S. National Park Service's Centennial celebration on Aug. 25, NASA's Multiangle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite is releasing four new anaglyphs that showcase 33 of our nation's national parks, monuments, historical sites and recreation areas in glorious 3D. Shown in the annotated image are Grand Teton National Park, John D. Rockefeller Memorial Parkway, Yellowstone National Park, and parts of Craters of the Moon National Monument. MISR views Earth with nine cameras pointed at different angles, giving it the unique capability to produce anaglyphs, stereoscopic images that allow the viewer to experience the landscape in three dimensions. The anaglyphs were made by combining data from MISR's vertical-viewing and 46-degree forward-pointing camera. You will need red-blue glasses in order to experience the 3D effect; ensure you place the red lens over your left eye. The images have been rotated so that north is to the left in order to enable 3D viewing because the Terra satellite flies from north to south. All of the images are 235 miles (378 kilometers) from west to east. These data were acquired June 25, 2016, Orbit 87876. http://photojournal.jpl.nasa.gov/catalog/PIA20890

  7. 640x480 PtSi Stirling-cooled camera system

    NASA Astrophysics Data System (ADS)

    Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.

    1992-09-01

    A Stirling-cooled 3-5 micron camera system has been developed. The camera employs a monolithic 640 x 480 PtSi-MOS focal plane array. The camera system achieves an NEDT of 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mrad the vertical and horizontal Minimum Resolvable Temperature (MRT) are approximately 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height independent of background level and position within the frame.

  8. Cameras on the moon with Apollos 15 and 16.

    NASA Technical Reports Server (NTRS)

    Page, T.

    1972-01-01

    Description of the cameras used for photography and television by Apollo 15 and 16 missions, covering a hand-held Hasselblad camera for black and white panoramic views at locations visited by the astronauts, a special stereoscopic camera designed by astronomer Tom Gold, a 16-mm movie camera used on the Apollo 15 and 16 Rovers, and several TV cameras. Details are given on the far-UV camera/spectrograph of the Apollo 16 mission. An electronographic camera converts UV light to electrons which are ejected by a KBr layer at the focus of an f/1 Schmidt camera and darken photographic films much more efficiently than far-UV. The astronomical activity of the Apollo 16 astronauts on the moon, using this equipment, is discussed.

  9. Using SOURCES to Examine the Nadir of Race Relations (1890-1920)

    ERIC Educational Resources Information Center

    LaVallee, Carol; Waring, Scott M.

    2015-01-01

    The "nadir of race relations" is a term used by historians to describe the time period after Reconstruction, 1890-1920. During this time, African Americans were free; some argue, however, that it was a worse time than when these individuals were enslaved (Brundage 1990; Woodward 2002). There is a debate whether this time period…

  10. Moscow, Russia

    NASA Image and Video Library

    1994-09-30

    STS068-236-027 (30 September-11 October 1994) --- The STS-68 crewmembers used a 70mm camera to photograph this early morning nadir view of wheel-shaped Moscow. The Star City, Russia, facility north of the city is among the details seen in the view, photographed from 115 nautical miles above Earth. Six NASA astronauts spent a week and a half aboard the Space Shuttle Endeavour in support of the Space Radar Laboratory 2 (SRL-2) mission.

  11. A panoramic coded aperture gamma camera for radioactive hotspots localization

    NASA Astrophysics Data System (ADS)

    Paradiso, V.; Amgarou, K.; Blanc De Lanaute, N.; Schoepff, V.; Amoyal, G.; Mahe, C.; Beltramello, O.; Liénard, E.

    2017-11-01

    A known disadvantage of the coded aperture imaging approach is its limited field of view (FOV), which often proves insufficient when analysing complex dismantling scenes such as post-accidental scenarios, where multiple measurements are needed to fully characterize the scene. In order to overcome this limitation, a panoramic coded aperture γ-camera prototype has been developed. The system is based on a 1 mm thick CdTe detector directly bump-bonded to a Timepix readout chip, developed by the Medipix2 collaboration (256 × 256 pixels, 55 μm pitch, 14.08 × 14.08 mm² sensitive area). A MURA-pattern coded aperture is used, allowing for background subtraction without the use of heavy shielding. This system is then combined with a USB color camera. The output of each measurement is a semi-spherical image covering a FOV of 360 degrees horizontally and 80 degrees vertically, rendered in spherical coordinates (θ, φ). The geometrical shapes of the radiation-emitting objects are preserved by first registering and stitching the optical images captured by the prototype, and then applying the same transformations to their corresponding radiation images. Panoramic gamma images generated using the technique proposed in this paper are described and discussed, along with the main experimental results obtained in laboratory campaigns.
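
    Coded-aperture images are commonly reconstructed by correlating the recorded shadowgram with a decoding array derived from the mask pattern; a minimal sketch using periodic cross-correlation is given below (Python/NumPy, assuming mask and detector arrays of matching size; this is not the prototype's actual processing chain):

        import numpy as np

        def decode_coded_aperture(detector_img, mask):
            """Reconstruct a coded-aperture image by periodic cross-correlation.

            detector_img : (N, N) recorded shadowgram
            mask         : (N, N) aperture pattern, 1 = open, 0 = opaque
            G is a balanced (+1/-1) decoding array built from the mask.
            """
            G = np.where(mask > 0, 1.0, -1.0)
            F = np.fft.fft2(detector_img)
            return np.real(np.fft.ifft2(F * np.conj(np.fft.fft2(G))))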

  12. ATTICA family of thermal cameras in submarine applications

    NASA Astrophysics Data System (ADS)

    Kuerbitz, Gunther; Fritze, Joerg; Hoefft, Jens-Rainer; Ruf, Berthold

    2001-10-01

    Optronics Mast Systems (US: Photonics Mast Systems) are electro-optical devices which enable a submarine crew to observe the scenery above water while submerged. Unlike classical submarine periscopes they are non-hull-penetrating and therefore have no direct viewing capability. Typically they have electro-optical cameras both for the visual and for an IR spectral band, with panoramic view and a stabilized line of sight. They can optionally be equipped with laser range-finders, antennas, etc. The brand name ATTICA (Advanced Two-dimensional Thermal Imager with CMOS-Array) characterizes a family of thermal cameras using focal-plane-array (FPA) detectors which can be tailored to a variety of requirements. The modular design of the ATTICA components allows the use of various detectors (InSb, CMT 3...5 μm, CMT 7...11 μm) for specific applications. By means of a microscanner, ATTICA cameras achieve full standard TV resolution using detectors with only 288 × 384 (US: 240 × 320) detector elements. A typical requirement for Optronics-Mast Systems is a Quick-Look-Around capability. For FPA cameras this implies the need for a 'descan' module, which can be incorporated in the ATTICA cameras without complications.

  13. Continuous monitoring of Hawaiian volcanoes with thermal cameras

    USGS Publications Warehouse

    Patrick, Matthew R.; Orr, Tim R.; Antolik, Loren; Lee, Robert Lopaka; Kamibayashi, Kevan P.

    2014-01-01

    Continuously operating thermal cameras are becoming more common around the world for volcano monitoring, and offer distinct advantages over conventional visual webcams for observing volcanic activity. Thermal cameras can sometimes “see” through volcanic fume that obscures views to visual webcams and the naked eye, and often provide a much clearer view of the extent of high temperature areas and activity levels. We describe a thermal camera network recently installed by the Hawaiian Volcano Observatory to monitor Kīlauea’s summit and east rift zone eruptions (at Halema‘uma‘u and Pu‘u ‘Ō‘ō craters, respectively) and to keep watch on Mauna Loa’s summit caldera. The cameras are long-wave, temperature-calibrated models protected in custom enclosures, and often positioned on crater rims close to active vents. Images are transmitted back to the observatory in real-time, and numerous Matlab scripts manage the data and provide automated analyses and alarms. The cameras have greatly improved HVO’s observations of surface eruptive activity, which includes highly dynamic lava lake activity at Halema‘uma‘u, major disruptions to Pu‘u ‘Ō‘ō crater and several fissure eruptions.

  14. Determination of Optimum Viewing Angles for the Angular Normalization of Land Surface Temperature over Vegetated Surface

    PubMed Central

    Ren, Huazhong; Yan, Guangjian; Liu, Rongyuan; Li, Zhao-Liang; Qin, Qiming; Nerry, Françoise; Liu, Qiang

    2015-01-01

    Multi-angular observation of land surface thermal radiation is considered to be a promising method of performing the angular normalization of land surface temperature (LST) retrieved from remote sensing data. This paper focuses on an investigation of the minimum requirements of viewing angles to perform such normalizations on LST. The commonly used kernel-driven bi-directional reflectance distribution function (BRDF) model is first extended to the thermal infrared (TIR) domain as a TIR-BRDF model, and its uncertainty is shown to be less than 0.3 K when used to fit the hemispheric directional thermal radiation. A local optimum three-angle combination is found and verified using the TIR-BRDF model based on two patterns: the single-point pattern and the linear-array pattern. The TIR-BRDF is applied to an airborne multi-angular dataset to retrieve LST at nadir (Te-nadir) from different viewing directions, and the results show that this model can obtain reliable Te-nadir from 3 to 4 directional observations with large angle intervals, thus corresponding to large temperature angular variations. The Te-nadir is generally larger than the temperature of the slant direction, with a difference of approximately 0.5-2.0 K for vegetated pixels and up to several kelvins for non-vegetated pixels. The findings of this paper will facilitate the future development of multi-angular thermal infrared sensors. PMID:25825975
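
    The kernel-driven form implies that directional brightness temperature can be modelled as a linear combination of an isotropic term and a small number of angular kernels, so the coefficients can be fitted by ordinary least squares from a few viewing directions and the model evaluated at nadir. The sketch below (Python/NumPy) illustrates that idea with placeholder kernels, not the specific kernels used in the paper:

        import numpy as np

        def fit_nadir_temperature(temps, kernel_values):
            """Fit T(theta) ~ f_iso + f_1*K1(theta) + f_2*K2(theta) and return the
            nadir-equivalent temperature.

            temps         : (N,) directional temperatures from N viewing angles
            kernel_values : (N, 2) kernel columns K1, K2 evaluated at those angles;
                            assumed to vanish at nadir, so f_iso is the nadir value
            """
            A = np.column_stack([np.ones_like(temps), kernel_values])
            coeffs, *_ = np.linalg.lstsq(A, temps, rcond=None)
            return coeffs[0]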

  15. Determination of optimum viewing angles for the angular normalization of land surface temperature over vegetated surface.

    PubMed

    Ren, Huazhong; Yan, Guangjian; Liu, Rongyuan; Li, Zhao-Liang; Qin, Qiming; Nerry, Françoise; Liu, Qiang

    2015-03-27

    Multi-angular observation of land surface thermal radiation is considered to be a promising method of performing the angular normalization of land surface temperature (LST) retrieved from remote sensing data. This paper focuses on an investigation of the minimum requirements of viewing angles to perform such normalizations on LST. The commonly used kernel-driven bi-directional reflectance distribution function (BRDF) model is first extended to the thermal infrared (TIR) domain as a TIR-BRDF model, and its uncertainty is shown to be less than 0.3 K when used to fit the hemispheric directional thermal radiation. A local optimum three-angle combination is found and verified using the TIR-BRDF model based on two patterns: the single-point pattern and the linear-array pattern. The TIR-BRDF is applied to an airborne multi-angular dataset to retrieve LST at nadir (Te-nadir) from different viewing directions, and the results show that this model can obtain reliable Te-nadir from 3 to 4 directional observations with large angle intervals, thus corresponding to large temperature angular variations. The Te-nadir is generally larger than the temperature of the slant direction, with a difference of approximately 0.5-2.0 K for vegetated pixels and up to several kelvins for non-vegetated pixels. The findings of this paper will facilitate the future development of multi-angular thermal infrared sensors.

  16. Digital Camera Control for Faster Inspection

    NASA Technical Reports Server (NTRS)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low-resolution images to quickly spot problem areas and can then cause a rapid transition to high-resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  17. A Fast Visible Camera Divertor-Imaging Diagnostic on DIII-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roquemore, A; Maingi, R; Lasnier, C

    2007-06-19

    In recent campaigns, the Photron Ultima SE fast framing camera has proven to be a powerful diagnostic when applied to imaging divertor phenomena on the National Spherical Torus Experiment (NSTX). Active areas of NSTX divertor research addressed with the fast camera include identification of types of Edge Localized Modes (ELMs)[1], dust migration, impurity behavior, and a number of phenomena related to turbulence. To compare such edge and divertor phenomena in low and high aspect ratio plasmas, a multi-institutional collaboration was developed for fast visible imaging on NSTX and DIII-D. More specifically, the collaboration was proposed to compare the NSTX small type V ELM regime [2] and the residual ELMs observed during Type I ELM suppression with external magnetic perturbations on DIII-D [3]. As part of the collaboration effort, the Photron camera was recently installed on DIII-D with a tangential view similar to the view implemented on NSTX, enabling a direct comparison between the two machines. The rapid implementation was facilitated by utilization of the existing optics that coupled the visible spectral output from the divertor vacuum ultraviolet UVTV system, which has a view similar to the view developed for the divertor tangential TV camera [4]. A remote-controlled filter wheel was implemented, as was the radiation shield required for the DIII-D installation. The installation and initial operation of the camera are described in this paper, and the first images from the DIII-D divertor are presented.

  18. High-Resolution Mars Camera Test Image of Moon Infrared

    NASA Image and Video Library

    2005-09-13

    This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The image was taken by the High Resolution Imaging Science Experiment camera on Sept. 8, 2005.

  19. Morning view, contextual view showing unpaved corridor down the westernmost ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing unpaved corridor down the westernmost lane where the wall section (E) will be removed; camera facing north-northwest. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  20. A Fast and Robust Extrinsic Calibration for RGB-D Camera Networks †

    PubMed Central

    Shen, Ju; Xu, Wanxin; Luo, Ying

    2018-01-01

    From object tracking to 3D reconstruction, RGB-Depth (RGB-D) camera networks play an increasingly important role in many vision and graphics applications. Practical applications often use sparsely-placed cameras to maximize visibility, while using as few cameras as possible to minimize cost. In general, it is challenging to calibrate sparse camera networks due to the lack of shared scene features across different camera views. In this paper, we propose a novel algorithm that can accurately and rapidly calibrate the geometric relationships across an arbitrary number of RGB-D cameras on a network. Our work has a number of novel features. First, to cope with the wide separation between different cameras, we establish view correspondences by using a spherical calibration object. We show that this approach outperforms other techniques based on planar calibration objects. Second, instead of modeling camera extrinsic calibration using rigid transformation, which is optimal only for pinhole cameras, we systematically test different view transformation functions including rigid transformation, polynomial transformation and manifold regression to determine the most robust mapping that generalizes well to unseen data. Third, we reformulate the celebrated bundle adjustment procedure to minimize the global 3D reprojection error so as to fine-tune the initial estimates. Finally, our scalable client-server architecture is computationally efficient: the calibration of a five-camera system, including data capture, can be done in minutes using only commodity PCs. Our proposed framework is compared with other state-of-the-art systems using both quantitative measurements and visual alignment results of the merged point clouds. PMID:29342968
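
    One building block of an extrinsic calibration like the one described above is recovering the rigid transformation between two depth cameras from corresponding 3-D points (for example, sphere centres seen by both views). The sketch below shows only that step, via the standard Kabsch/Procrustes SVD solution; the point coordinates are hypothetical, and the paper's full pipeline (alternative mapping functions, bundle adjustment) is not reproduced here.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping points P to Q (both Nx3),
    via the Kabsch/Procrustes SVD solution."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Hypothetical sphere-centre positions seen by two depth cameras (metres).
P = np.array([[0.1, 0.2, 1.5], [0.4, -0.1, 1.8], [-0.3, 0.3, 2.1], [0.0, 0.0, 1.2]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
Q = P @ true_R.T + np.array([0.5, -0.2, 0.1])

R, t = rigid_transform(P, Q)
print("rotation:\n", R, "\ntranslation:", t)
```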

  1. TES/MLS Aura L2 Carbon Monoxide (CO) Nadir (TML2CO)

    Atmospheric Science Data Center

    2018-05-06

    TES/MLS Aura L2 Carbon Monoxide (CO) Nadir (TML2CO) Atmospheric ... profile estimates and associated errors derived using TES & MLS spectral radiance measurements taken at nearest time and locations. ... a priori constraint vectors.

  2. TES/MLS Aura L2 Carbon Monoxide (CO) Nadir (TML2CO)

    Atmospheric Science Data Center

    2018-05-07

    TES/MLS Aura L2 Carbon Monoxide (CO) Nadir (TML2CO) ... profile estimates and associated errors derived using TES & MLS spectral radiance measurements taken at nearest time and locations. ... a priori constraint vectors.

  3. Sand Dunes of Nili Patera in 3-D

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The most exciting new aspect of the Mars Global Surveyor (MGS) Extended Mission is the opportunity to turn the spacecraft and point the Mars Orbiter Camera (MOC) at specific features of interest. Opportunities to point the spacecraft come about ten times a week. Throughout the Primary Mission (March 1999 - January 2001), nearly all MGS operations were conducted with the spacecraft pointing 'nadir'--that is, straight down. A search for the missing Mars Polar Lander in late 1999 and early 2000 demonstrated that pointing the spacecraft could allow opportunities for MOC to see things that simply had not entered its field of view during typical nadir-looking operations, and to target areas previously seen in a nadir view so that stereo ('3-D') pictures could be derived.

    One of the very first places photographed by the MOC at the start of the Mapping Mission in March 1999 was a field of dunes located in Nili Patera, a volcanic depression in central Syrtis Major. A portion of this dune field was shown in a media release on March 11, 1999, 'Sand Dunes of Nili Patera, Syrtis Major'. Subsequently, the image was archived with the NASA Planetary Data System, as shown in the Malin Space Science Systems MOC Gallery. On April 24, 2001, an opportunity arose in which the MGS could be pointed off-nadir to take a new picture of the same dune field. By combining the nadir view from March 1999 and the off-nadir view from April 2001, a stereoscopic image was created. The anaglyph shown here must be viewed with red (left-eye) and blue (right-eye) '3-D' glasses. The dunes and the local topography of the volcanic crater's floor stand out in sharp relief. The images, taken more than one Mars year apart, show no change in the shape or location of the dunes--that is, they do not seem to have moved at all since March 1999.
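
    The anaglyph described above is, at its core, a channel merge of two co-registered images: the earlier nadir view feeds the red (left-eye) channel and the later off-nadir view feeds the green and blue (cyan, right-eye) channels. A minimal sketch follows; the file names are hypothetical, the images are assumed to be grayscale and already co-registered, and the imageio package is assumed to be installed.

```python
import numpy as np
import imageio.v2 as imageio   # assumes imageio is installed

# Hypothetical, already co-registered grayscale images of the same dune field.
left  = imageio.imread("nili_patera_nadir_1999.png").astype(np.uint8)    # left eye (red)
right = imageio.imread("nili_patera_offnadir_2001.png").astype(np.uint8) # right eye (cyan)

# Classic red/cyan anaglyph: red channel from the left image,
# green and blue channels from the right image.
anaglyph = np.dstack([left, right, right])
imageio.imwrite("nili_patera_anaglyph.png", anaglyph)
```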

  4. Optical design of portable nonmydriatic fundus camera

    NASA Astrophysics Data System (ADS)

    Chen, Weilin; Chang, Jun; Lv, Fengxian; He, Yifan; Liu, Xin; Wang, Dajiang

    2016-03-01

    Fundus cameras are widely used in the screening and diagnosis of retinal disease and are simple, widely used medical instruments. Early fundus cameras dilated the pupil with a mydriatic to increase the amount of incoming light, which left patients with vertigo and blurred vision; nonmydriatic designs are therefore the trend in fundus cameras. Desktop fundus cameras are not easy to carry and are only suitable for use in the hospital, whereas a portable nonmydriatic retinal camera is convenient for patient self-examination or for medical staff visiting a patient at home. This paper presents a portable nonmydriatic fundus camera with a field of view (FOV) of 40°. Two kinds of light source are used: 590 nm light for imaging, and 808 nm light for observing the fundus at high resolving power. Ring lights and a hollow mirror are employed to suppress stray light from the center of the cornea. The camera is focused by repositioning the CCD along the optical axis, covering a diopter range of -20 m⁻¹ to +20 m⁻¹.
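
    A rough feel for how the quoted ±20 m⁻¹ diopter range translates into CCD travel can be had from a single thin-lens model, as sketched below. The focal length and the assumption that light leaving the eye carries a vergence equal to the patient's refractive error are illustrative simplifications, not values or a model taken from the paper.

```python
# Thin-lens sketch of how the quoted dioptre range maps to CCD travel.
# Assumptions (not from the paper): the imaging path is modelled as a single
# thin lens of focal length f, and light leaving the eye carries a vergence
# equal to the patient's refractive error D (in dioptres, m^-1).
f = 0.020                          # hypothetical effective focal length, metres

for D in (-20.0, 0.0, +20.0):
    v = 1.0 / (1.0 / f + D)        # image distance behind the lens
    shift_mm = (v - f) * 1000.0    # CCD displacement from the emmetropic position
    print(f"D = {D:+5.1f} m^-1 -> CCD position {v*1000:6.2f} mm, shift {shift_mm:+6.2f} mm")
```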

  5. First-Day Newborn Weight Loss Predicts In-Hospital Weight Nadir for Breastfeeding Infants

    PubMed Central

    Bokser, Seth; Newman, Thomas B.

    2010-01-01

    Abstract Background Exclusive breastfeeding reduces infant infectious disease. Losing ≥10% birth weight may lead to formula use. The predictive value of first-day weight loss for subsequent weight loss has not been studied. The objective of the present study was to evaluate the relationship between weight loss at <24 hours and subsequent in-hospital weight loss ≥10%. Methods For 1,049 infants, we extracted gestational age, gender, delivery method, feeding type, and weights from medical records. Weight nadir was defined as the lowest weight recorded during birth hospitalization. We used multivariate logistic regression to assess the effect of first-day weight loss on subsequent in-hospital weight loss. Results Mean in-hospital weight nadir was 6.0 ± 2.6%, and mean age at in-hospital weight nadir was 38.7 ± 18.5 hours. While in the hospital 6.4% of infants lost ≥10% of birth weight. Infants losing ≥4.5% birth weight at <24 hours had greater risk of eventual in-hospital weight loss ≥10% (adjusted odds ratio 3.57 [1.75, 7.28]). In this cohort, 798 (76.1%) infants did not have documented weight gain while in the hospital. Conclusions Early weight loss predicts higher risk of ≥10% in-hospital weight loss. Infants with high first-day weight loss could be targeted for further research into improved interventions to promote breastfeeding. PMID:20113202
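
    The adjusted odds ratio quoted above is the kind of quantity read off a fitted multivariate logistic regression. The sketch below shows the mechanics on synthetic stand-in data (not the study's records): a binary early-loss indicator and one covariate predict the ≥10% nadir outcome, and exponentiated coefficients give adjusted odds ratios. The effect sizes in the data generator are arbitrary, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression  # assumes scikit-learn is available

rng = np.random.default_rng(0)
n = 1000
# Synthetic stand-ins for the study's predictors (NOT the study data).
early_loss_ge_4_5 = rng.integers(0, 2, n)          # lost >=4.5% birth weight in first 24 h
cesarean          = rng.integers(0, 2, n)          # delivery method covariate
logit = -3.0 + 1.3 * early_loss_ge_4_5 + 0.5 * cesarean
nadir_ge_10 = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([early_loss_ge_4_5, cesarean])
model = LogisticRegression().fit(X, nadir_ge_10)
print("adjusted odds ratios:", np.exp(model.coef_[0]))
```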

  6. The Ames Vertical Gun Range

    NASA Technical Reports Server (NTRS)

    Karcz, J. S.; Bowling, D.; Cornelison, C.; Parrish, A.; Perez, A.; Raiche, G.; Wiens, J.-P.

    2016-01-01

    The Ames Vertical Gun Range (AVGR) is a national facility for conducting laboratory-scale investigations of high-speed impact processes. It provides a set of light-gas, powder, and compressed gas guns capable of accelerating projectiles to speeds up to 7 km s⁻¹. The AVGR has a unique capability to vary the angle between the projectile-launch and gravity vectors between 0 and 90 deg. The target resides in a large chamber (diameter approximately 2.5 m) that can be held at vacuum or filled with an experiment-specific atmosphere. The chamber provides a number of viewing ports and feed-throughs for data, power, and fluids. Impacts are observed via high-speed digital cameras along with investigation-specific instrumentation, such as spectrometers. Use of the range is available via grant proposals through any Planetary Science Research Program element of the NASA Research Opportunities in Space and Earth Sciences (ROSES) calls. Exploratory experiments (one to two days) are additionally possible in order to develop a new proposal.

  7. An enhanced multi-view vertical line locus matching algorithm of object space ground primitives based on positioning consistency for aerial and space images

    NASA Astrophysics Data System (ADS)

    Zhang, Ka; Sheng, Yehua; Wang, Meizhen; Fu, Suxia

    2018-05-01

    The traditional multi-view vertical line locus (TMVLL) matching method is an object-space-based method that is commonly used to directly acquire spatial 3D coordinates of ground objects in photogrammetry. However, the TMVLL method can only obtain one elevation and lacks an accurate means of validating the matching results. In this paper, we propose an enhanced multi-view vertical line locus (EMVLL) matching algorithm based on positioning consistency for aerial or space images. The algorithm involves three components: confirming candidate pixels of the ground primitive in the base image, multi-view image matching based on the object space constraints for all candidate pixels, and validating the consistency of the object space coordinates with the multi-view matching result. The proposed algorithm was tested using actual aerial images and space images. Experimental results show that the EMVLL method successfully solves the problems associated with the TMVLL method, and has greater reliability, accuracy and computing efficiency.
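
    The vertical line locus idea can be summarized as: sample candidate elevations along the vertical through a ground position, project each candidate 3-D point into every image, and keep the elevation at which the views agree best. The sketch below is a heavily simplified, generic VLL search with user-supplied projection functions and a grey-level-variance score; it is not the EMVLL algorithm's candidate-pixel confirmation or positioning-consistency validation.

```python
import numpy as np

def vll_match(x, y, z_candidates, images, projections, win=5):
    """For each candidate elevation z on the vertical line through (x, y),
    project the 3-D point into every image and score the agreement of the
    local grey-level patches (lower spread across views = better match)."""
    half = win // 2
    best_z, best_score = None, np.inf
    for z in z_candidates:
        patches = []
        for img, project in zip(images, projections):
            c, r = project(np.array([x, y, z]))      # image column/row of the 3-D point
            c, r = int(round(c)), int(round(r))
            if half <= r < img.shape[0] - half and half <= c < img.shape[1] - half:
                patches.append(img[r - half:r + half + 1, c - half:c + half + 1].astype(float))
        if len(patches) < 2:
            continue
        score = np.mean(np.var(np.stack(patches), axis=0))  # cross-view grey-level variance
        if score < best_score:
            best_z, best_score = z, score
    return best_z, best_score

# Usage: best_z, score = vll_match(x, y, np.arange(z_min, z_max, dz), images, projections)
# where each entry of `projections` maps an object-space point to (column, row) in one image.
```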

  8. A filter spectrometer concept for facsimile cameras

    NASA Technical Reports Server (NTRS)

    Jobson, D. J.; Kelly, W. L., IV; Wall, S. D.

    1974-01-01

    A concept which utilizes interference filters and photodetector arrays to integrate spectrometry with the basic imagery function of a facsimile camera is described and analyzed. The analysis considers spectral resolution, instantaneous field of view, spectral range, and signal-to-noise ratio. Specific performance predictions for the Martian environment, the Viking facsimile camera design parameters, and a signal-to-noise ratio for each spectral band equal to or greater than 256 indicate the feasibility of obtaining a spectral resolution of 0.01 micrometers with an instantaneous field of view of about 0.1 deg in the 0.425 micrometers to 1.025 micrometers range using silicon photodetectors. A spectral resolution of 0.05 micrometers with an instantaneous field of view of about 0.6 deg in the 1.0 to 2.7 micrometers range using lead sulfide photodetectors is also feasible.

  9. Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras

    NASA Astrophysics Data System (ADS)

    Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro

    2018-03-01

    Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.

  10. HST Solar Arrays photographed by Electronic Still Camera

    NASA Image and Video Library

    1993-12-04

    S61-E-002 (4 Dec 1993) --- This view, backdropped against the blackness of space, shows one of two original Solar Arrays (SA) on the Hubble Space Telescope (HST). The scene was photographed from inside Endeavour's cabin with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. This view features the minus V-2 panel. Endeavour's crew captured the HST on December 4, 1993 in order to service the telescope over a period of five days. Four of the crew members will work in alternating pairs outside Endeavour's shirt-sleeve environment to service the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  11. HST Solar Arrays photographed by Electronic Still Camera

    NASA Image and Video Library

    1993-12-04

    S61-E-003 (4 Dec 1993) --- This medium close-up view of one of two original Solar Arrays (SA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. This view shows the cell side of the minus V-2 panel. Endeavour's crew captured the HST on December 4, 1993 in order to service the telescope over a period of five days. Four of the crew members will work in alternating pairs outside Endeavour's shirt-sleeve environment to service the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  12. THE DARK ENERGY CAMERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flaugher, B.; Diehl, H. T.; Alvarez, O.

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  13. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15 μm x 15 μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
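
    The quoted pixel pitch and plate scale fix the effective focal length of the telescope-plus-corrector, and together with the CCD format they also fix each detector's footprint on the sky. The short check below uses only the numbers quoted in the record; the focal length is inferred from them, not stated in the record itself.

```python
pixel_pitch_m = 15e-6        # 15 micrometre pixels (from the record)
plate_scale_arcsec = 0.263   # arcsec per pixel (from the record)

# Plate scale [arcsec/pixel] = 206265 * pixel_pitch / focal_length,
# so the implied effective focal length is:
focal_length_m = 206265.0 * pixel_pitch_m / plate_scale_arcsec
print(f"implied effective focal length: {focal_length_m:.2f} m")   # ~11.8 m

# A 2k x 4k CCD then covers roughly this field on the sky:
ccd_arcmin = 2048 * plate_scale_arcsec / 60.0, 4096 * plate_scale_arcsec / 60.0
print(f"single CCD footprint: {ccd_arcmin[0]:.1f}' x {ccd_arcmin[1]:.1f}'")
```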

  14. HST Solar Arrays photographed by Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This close-up view of one of two Solar Arrays (SA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

  15. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
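
    The ray-tracing FOV simulator mentioned above ultimately reduces to repeated application of Snell's law at the housing interfaces. The sketch below shows the vector form of a single refraction for an air-to-water boundary with standard refractive indices; a real simulator would also have to model the port glass and its geometry, which are omitted here.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    into the incident medium), indices n1 -> n2. Returns None on total
    internal reflection. Vector form of Snell's law."""
    d, n = d / np.linalg.norm(d), n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None                       # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

# Ray leaving the camera at 40 degrees from the port normal, air -> water.
theta = np.radians(40.0)
d_air = np.array([np.sin(theta), 0.0, np.cos(theta)])
d_water = refract(d_air, np.array([0.0, 0.0, -1.0]), 1.00, 1.33)
print("in-water angle:", np.degrees(np.arccos(d_water[2])))   # ~28.9 degrees
```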

  16. Morning view, contextual view showing the role of the brick ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view showing the role of the brick walls along the boundary of the cemetery; interior view taken from midway down the paved west road with the camera facing west to capture the morning light on the west wall. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  17. Geometric rectification of camera-captured document images.

    PubMed

    Liang, Jian; DeMenthon, Daniel; Doermann, David

    2008-04-01

    Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.
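
    For the special case of a flat page with known corner positions, rectification reduces to a single planar homography, which the sketch below computes with OpenCV. The corner coordinates and file names are hypothetical; the paper's framework goes further, recovering 3-D shape from texture flow so that curved pages can also be flattened, which this sketch does not attempt.

```python
import cv2
import numpy as np

img = cv2.imread("captured_page.jpg")               # hypothetical input photo

# Hypothetical page-corner coordinates detected in the photo (pixels),
# ordered top-left, top-right, bottom-right, bottom-left.
src = np.float32([[112, 84], [1480, 150], [1522, 1990], [60, 1905]])
w, h = 1240, 1754                                    # target frontal view (A4-like aspect)
dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

H = cv2.getPerspectiveTransform(src, dst)            # planar homography
flat = cv2.warpPerspective(img, H, (w, h))
cv2.imwrite("rectified_page.png", flat)
```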

  18. Science, conservation, and camera traps

    USGS Publications Warehouse

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  19. Test Image of Earth Rocks by Mars Camera Stereo

    NASA Image and Video Library

    2010-11-16

    This stereo view of terrestrial rocks combines two images taken by a testing twin of the Mars Hand Lens Imager (MAHLI) camera on NASA's Mars Science Laboratory. 3-D glasses are necessary to view this image.

  20. Fuzzy logic control for camera tracking system

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Fritz, R. H.; Giarratano, J.; Jani, Yashvant

    1992-01-01

    A concept utilizing fuzzy theory has been developed for a camera tracking system to provide support for proximity operations and traffic management around the Space Station Freedom. Fuzzy sets and fuzzy logic based reasoning are used in a control system which utilizes images from a camera and generates required pan and tilt commands to track and maintain a moving target in the camera's field of view. This control system can be implemented on a fuzzy chip to provide an intelligent sensor for autonomous operations. Capabilities of the control system can be expanded to include approach, handover to other sensors, caution and warning messages.
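
    A minimal version of such a fuzzy tracking controller can be written with triangular membership functions over the target's horizontal offset and a weighted-average (centroid) defuzzification that yields a pan-rate command, as sketched below. The membership shapes, rule outputs, and units are invented for illustration and are not the flight system's rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_pan_rate(offset):
    """offset: horizontal target offset from image centre, normalised to [-1, 1].
    Returns a pan-rate command in deg/s (illustrative rule base)."""
    # Fuzzify the error into three sets: Left, Centred, Right.
    mu = {"left": tri(offset, -1.5, -1.0, 0.0),
          "centred": tri(offset, -0.5, 0.0, 0.5),
          "right": tri(offset, 0.0, 1.0, 1.5)}
    # Each rule drives a singleton pan-rate output (deg/s).
    out = {"left": -5.0, "centred": 0.0, "right": 5.0}
    # Centroid (weighted-average) defuzzification.
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values()) + 1e-9
    return num / den

for off in (-0.8, -0.2, 0.0, 0.4):
    print(f"offset {off:+.1f} -> pan rate {fuzzy_pan_rate(off):+.2f} deg/s")
```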

  1. Morning view, contextual view of the exterior west side of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Morning view, contextual view of the exterior west side of the north wall along the unpaved road; camera facing west, positioned in road approximately 8 posts west of the gate. - Beaufort National Cemetery, Wall, 1601 Boundary Street, Beaufort, Beaufort County, SC

  2. The mosaics of Mars: As seen by the Viking Lander cameras

    NASA Technical Reports Server (NTRS)

    Levinthal, E. C.; Jones, K. L.

    1980-01-01

    The mosaics and derivative products produced from many individual high resolution images acquired by the Viking Lander Camera Systems are described: A morning and afternoon mosaic for both cameras at the Lander 1 Chryse Planitia site, and a morning, noon, and afternoon camera pair at Utopia Planitia, the Lander 11 site. The derived products include special geometric projections of the mosaic data sets, polar stereographic (donut), stereoscopic, and orthographic. Contour maps and vertical profiles of the topography were overlaid on the mosaics from which they were derived. Sets of stereo pairs were extracted and enlarged from stereoscopic projections of the mosaics.

  3. Validity of two alternative systems for measuring vertical jump height.

    PubMed

    Leard, John S; Cirillo, Melissa A; Katsnelson, Eugene; Kimiatek, Deena A; Miller, Tim W; Trebincevic, Kenan; Garbalosa, Juan C

    2007-11-01

    Vertical jump height is frequently used by coaches, health care professionals, and strength and conditioning professionals to objectively measure function. The purpose of this study is to determine the concurrent validity of the jump and reach method (Vertec) and the contact mat method (Just Jump) in assessing vertical jump height when compared with the criterion reference 3-camera motion analysis system. Thirty-nine college students, 25 females and 14 males between the ages of 18 and 25 (mean age 20.65 years), were instructed to perform the countermovement jump. Reflective markers were placed at the base of the individual's sacrum for the 3-camera motion analysis system to measure vertical jump height. The subject was then instructed to stand on the Just Jump mat beneath the Vertec and perform the jump. Measurements were recorded from each of the 3 systems simultaneously for each jump. The Pearson r statistic between the video and the jump and reach (Vertec) was 0.906. The Pearson r between the video and contact mat (Just Jump) was 0.967. Both correlations were significant at the 0.01 level. Analysis of variance showed a significant difference among the 3 means F(2,235) = 5.51, p < 0.05. The post hoc analysis showed a significant difference between the criterion reference (M = 0.4369 m) and the Vertec (M = 0.3937 m, p = 0.005) but not between the criterion reference and the Just Jump system (M = 0.4420 m, p = 0.972). The Just Jump method of measuring vertical jump height is a valid measure when compared with the 3-camera system. The Vertec was found to have a high correlation with the criterion reference, but the mean differed significantly. This study indicates that a higher degree of confidence is warranted when comparing Just Jump results with a 3-camera system study.
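
    The concurrent-validity comparison above boils down to a correlation and a mean-difference test between each device and the criterion. The sketch below runs that computation on synthetic numbers (not the study data) and uses a paired t-test in place of the study's ANOVA with post hoc comparisons; SciPy is assumed to be available.

```python
import numpy as np
from scipy import stats   # assumes SciPy is available

rng = np.random.default_rng(1)
# Synthetic jump heights in metres (NOT the study data): criterion 3-camera
# system vs. an alternative device that reads slightly low.
criterion = rng.normal(0.44, 0.06, 39)
alternative = criterion - 0.043 + rng.normal(0.0, 0.02, 39)

r, p = stats.pearsonr(criterion, alternative)
t, p_diff = stats.ttest_rel(criterion, alternative)
print(f"Pearson r = {r:.3f} (p = {p:.3g}); mean difference = "
      f"{np.mean(criterion - alternative):.3f} m (paired t-test p = {p_diff:.3g})")
```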

  4. View of Gulf coast area of Louisiana from Skylab space station

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A vertical view of the Gulf coast area of Louisiana (29.0N, 92.0W) as seen from the Skylab space station in Earth orbit. A Skylab 4 crewman used a hand-held 70mm Hasselblad camera to take this picture. This view extends from White Lake and Pecan Island (bottom border) eastward to the Mississippi River delta (top left). Atchafalaya Bay (red) is in the center. The Bayou Teche area is included in this view. A prominent feature of this photograph is two large white smoke plumes extending from Louisiana south into the Gulf of Mexico. The larger smoke plume originates on the southern shore of Vermillion Bay. The other plume extends from the southern shore of Marsh Island. The pronounced narrow width and length of the plumes indicate that a strong offshore wind is present. Approximately 100 miles of the plumes are visible in this photograph, but they probably extend well into the Gulf of Mexico.

  5. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; hide

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, and the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  6. Association of Hematological Nadirs and Survival in a Nonhuman Primate Model of Hematopoietic Syndrome of Acute Radiation Syndrome.

    PubMed

    Gluzman-Poltorak, Zoya; Vainstein, Vladimir; Basile, Lena A

    2015-08-01

    Recombinant human interleukin-12 (rHuIL-12) mitigates the hematopoietic subsyndrome of acute radiation syndrome (HSARS) after total body irradiation (TBI) in a nonhuman primate (NHP) model of HSARS. The mechanism for this effect appears to involve multiple effects of rHuIL-12 on hematopoiesis. We conducted a meta-analysis to examine hematological nadirs and survival across our three completed NHP studies. Animals were irradiated (700 cGy) and treated with a single subcutaneous injection of vehicle (n = 64) or rHuIL-12 (50-500 ng/kg; n = 108) 24-25 h after irradiation, or with daily subcutaneous injections of granulocyte-colony stimulating factor (G-CSF; 10 μg/kg/day) for 18 days starting 24-25 h after exposure (n = 26). Blood samples were obtained at various time points up to day 60 after TBI. Lymphocytes, neutrophils and platelets were significantly lower in nonsurvivors than in survivors in the overall sample and in each treatment group (P < 0.001 for each comparison, Wilcoxon rank-sum test). Lymphocyte nadir was the strongest and most consistent predictor of death by Spearman's rank correlation. Receiver operating characteristic (ROC) curve analysis of death and threshold hematologic nadir values (where nadir values less than or equal to the threshold are predictive of death) showed that a threshold of 0.08 × 10(9)/L for lymphocytes had the largest positive predictive value of death (97.2% and 92.5% for the control and rHuIL-12 groups, respectively) and high sensitivity (76.1% and 62.7%, respectively), consistent with data from human radiation victims. The current findings suggest that enhanced early bone marrow regeneration resulting in increases in nadir values for all major blood cell types may be the main mechanism of action by which rHuIL-12 mitigates the lethality of HSARS.
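
    The threshold analysis described above computes, for each candidate nadir cutoff, the positive predictive value and sensitivity of "nadir at or below cutoff" as a predictor of death. The sketch below shows that computation on synthetic nadir/outcome pairs; the numbers are illustrative only and are not the nonhuman primate data.

```python
import numpy as np

def ppv_sensitivity(nadirs, died, threshold):
    """Treat nadir <= threshold as 'predicted death'; return (PPV, sensitivity)."""
    predicted = nadirs <= threshold
    tp = np.sum(predicted & died)
    ppv = tp / max(np.sum(predicted), 1)
    sens = tp / max(np.sum(died), 1)
    return ppv, sens

rng = np.random.default_rng(2)
# Synthetic lymphocyte nadirs (x1e9/L) and outcomes (True = died); illustrative only.
died = rng.random(200) < 0.4
nadirs = np.where(died, rng.gamma(2.0, 0.03, 200), rng.gamma(4.0, 0.05, 200))

for thr in (0.05, 0.08, 0.12):
    ppv, sens = ppv_sensitivity(nadirs, died, thr)
    print(f"threshold {thr:.2f}: PPV = {ppv:.2f}, sensitivity = {sens:.2f}")
```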

  7. The Influence of Flight Planning and Camera Orientation in UAVs Photogrammetry. a Test in the Area of Rocca San Silvestro (li), Tuscany

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Lingua, A.; Maschio, P.; Teppati Losè, L.

    2017-02-01

    The purpose of this paper is to discuss how much flight planning and camera orientation settings can affect a UAV photogrammetric survey. The test site chosen for these evaluations was the Rocca of San Silvestro, a medieval monumental castle near Livorno, Tuscany (Italy). During the fieldwork, different sets of data were acquired using different camera orientation parameters and flight plan setups. Acquisitions with both nadir and oblique camera orientations were performed, as well as flights with different flight-line directions (related to the shape of the surveyed object). The different datasets were then processed in several blocks using Pix4D software and the results of the processing were analysed and compared. Our aim was to evaluate how much the parameters described above affect the generation of the survey's final products; the product chosen for this evaluation was the point cloud.

  8. Simultaneous overpass off nadir (SOON): a method for unified calibration/validation across IEOS and GEOSS system of systems

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip; Bergen, Bill; Huang, Allen; Kratz, Gene; Puschell, Jeff; Schueler, Carl; Walker, Joe

    2006-08-01

    The US operates a diverse, evolving constellation of research and operational environmental satellites, principally in polar and geosynchronous orbits. Our current and enhanced future domestic remote sensing capability is complemented by the significant capabilities of our current and potential future international partners. In this analysis, we define "success" through the data customers' eyes: participating in the sufficient and continuously improving satisfaction of their mission responsibilities. Successfully fusing observations from multiple simultaneous platforms and sensors into a common, self-consistent operational environment requires a unified calibration and validation approach. Here, we develop a concept for an integrating framework for absolute accuracy; long-term stability; self-consistency among sensors, platforms, techniques, and observing systems; and validation and characterization of performance. Across all systems, this is a non-trivial problem. Simultaneous Nadir Overpasses, or SNOs, provide a proven intercomparison technique: simultaneous, collocated, co-angular measurements. Many systems have off-nadir elements, or effects, that must be calibrated. For these systems, the nadir technique constrains the process. We define the term "SOON," for simultaneous overpass off nadir. We present a target architecture and sensitivity analysis for the affordable, sustainable implementation of a global SOON calibration/validation network that can deliver the much-needed comprehensive, common, self-consistent operational picture in near-real time, at an affordable cost.

  9. Immersive viewing engine

    NASA Astrophysics Data System (ADS)

    Schonlau, William J.

    2006-05-01

    An immersive viewing engine providing basic telepresence functionality for a variety of application types is presented. Augmented reality, teleoperation and virtual reality applications all benefit from the use of head mounted display devices that present imagery appropriate to the user's head orientation at full frame rates. Our primary application is the viewing of remote environments, as with a camera-equipped teleoperated vehicle. The conventional approach where imagery from a narrow field camera onboard the vehicle is presented to the user on a small rectangular screen is contrasted with an immersive viewing system where a cylindrical or spherical format image is received from a panoramic camera on the vehicle, resampled in response to sensed user head orientation and presented via wide field eyewear display, approaching 180 degrees of horizontal field. Of primary interest is the user's enhanced ability to perceive and understand image content, even when image resolution parameters are poor, due to the innate visual integration and 3-D model generation capabilities of the human visual system. A mathematical model for tracking user head position and resampling the panoramic image to attain distortion free viewing of the region appropriate to the user's current head pose is presented and consideration is given to providing the user with stereo viewing generated from depth map information derived using stereo from motion algorithms.
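
    The core of such a viewer is a per-pixel lookup: every pixel of the desired perspective view defines a ray, the ray is rotated by the sensed head yaw and pitch, and the equirectangular panorama is sampled at the corresponding longitude/latitude. The sketch below implements that lookup with nearest-neighbour sampling and no roll; the rotation conventions, field of view, and output size are illustrative choices, not the paper's mathematical model.

```python
import numpy as np

def render_view(pano, yaw, pitch, fov_deg=90.0, out_w=640, out_h=480):
    """Resample an equirectangular panorama (H x W [x C]) into a perspective
    view for the given head yaw/pitch (radians). Nearest-neighbour sampling."""
    H, W = pano.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2.0)
    u, v = np.meshgrid(np.arange(out_w) - out_w / 2.0, np.arange(out_h) - out_h / 2.0)
    # Camera-frame ray for every output pixel.
    x, y, z = u, v, np.full_like(u, f)
    # Apply pitch (about the x-axis) then yaw (about the vertical axis).
    y2, z2 = y * np.cos(pitch) - z * np.sin(pitch), y * np.sin(pitch) + z * np.cos(pitch)
    x3, z3 = x * np.cos(yaw) + z2 * np.sin(yaw), -x * np.sin(yaw) + z2 * np.cos(yaw)
    lon = np.arctan2(x3, z3)                            # [-pi, pi]
    lat = np.arctan2(y2, np.hypot(x3, z3))              # [-pi/2, pi/2]
    cols = ((lon / (2 * np.pi) + 0.5) * (W - 1)).astype(int)
    rows = ((lat / np.pi + 0.5) * (H - 1)).astype(int)
    return pano[rows, cols]

# Usage with a hypothetical panorama array and head orientation from a tracker:
# view = render_view(pano_image, yaw=np.radians(30), pitch=np.radians(-10))
```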

  10. Novel computer-based endoscopic camera

    NASA Astrophysics Data System (ADS)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and the patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to an external host medium via network. The patient data included with every image describes essential information about the patient and procedure. The operator can assign custom data descriptors and can search for stored images/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20–45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be displayed over the full area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  11. Nadir Ozone Profile Retrieval from SCIAMACHY: application to the Antarctic Ozone Hole

    NASA Astrophysics Data System (ADS)

    Shah, Sweta; Piet, Stammes; Tuinder, Olaf N. E.; de Laat, Jos

    2017-04-01

    We present new nadir ozone profile retrievals using SCIAMACHY UV reflectance spectra for the mission period of the Envisat satellite. We have used the most recent Level-1 data version (v8, with degradation correction included) in the UV range (265-330 nm) and the OPERA optimal estimation algorithm (van Peet et al., AMT, 2014) developed at KNMI. We first show the comparison of the retrieved satellite profiles to co-located ozone sonde profiles in order to evaluate the accuracy of the retrieved ozone profile dataset. Based on these results, we have further processed the SCIAMACHY nadir dataset, specifically all southern hemisphere pixels south of 45 degrees latitude for the months of August-November for the complete years 2003-2011. We show the monthly mean profiles, time series of daily averages and minima of the retrieved stratospheric columns, and finally the ozone profile trend over the years 2003-2011. We also show the comparison of our results with the literature and hence the consistency of this new SCIAMACHY dataset.
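
    Retrievals of this kind are built on the linear optimal-estimation (maximum a posteriori) update of the Rodgers formalism, which combines the measurement, its error covariance, and the a priori profile. The sketch below shows that generic update on a tiny placeholder problem; the Jacobian, covariances, and units are invented for illustration and do not represent OPERA's forward model or settings.

```python
import numpy as np

def optimal_estimation_step(y, K, x_a, S_a, S_e):
    """One linear optimal-estimation update:
    x_hat = x_a + (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 (y - K x_a)."""
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ S_e_inv @ K + np.linalg.inv(S_a))   # posterior covariance
    x_hat = x_a + S_hat @ K.T @ S_e_inv @ (y - K @ x_a)
    A = S_hat @ K.T @ S_e_inv @ K                                    # averaging kernel
    return x_hat, S_hat, A

# Tiny placeholder problem: 3-layer ozone state, 4 spectral measurements.
rng = np.random.default_rng(3)
K = rng.normal(size=(4, 3))           # hypothetical Jacobian (radiance vs. layer ozone)
x_true = np.array([6.0, 4.0, 2.0])    # arbitrary "true" layer amounts (DU-like units)
x_a = np.array([5.0, 5.0, 5.0])       # a priori profile
S_a = np.diag([4.0, 4.0, 4.0])
S_e = 0.01 * np.eye(4)
y = K @ x_true + rng.normal(scale=0.1, size=4)

x_hat, S_hat, A = optimal_estimation_step(y, K, x_a, S_a, S_e)
print("retrieved profile:", x_hat)
```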

  12. View of 'Cape Verde' from 'Cape St. Mary' in Mid-Afternoon

    NASA Technical Reports Server (NTRS)

    2006-01-01

    As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during mid-afternoon lighting conditions.

    The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact.

    The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.

  13. View of 'Cape Verde' from 'Cape St. Mary' in Late Morning

    NASA Technical Reports Server (NTRS)

    2006-01-01

    As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape Verde' from the vantage point of 'Cape St. Mary,' the next promontory clockwise around the crater's deeply scalloped rim. This view of Cape Verde combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic. The exposures were taken during late-morning lighting conditions.

    The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact.

    The images combined into this mosaic were taken during the 1,006th Martian day, or sol, of Opportunity's Mars-surface mission (Nov. 22, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.

  14. The NIKA2 Large Field-of-View Millimeter Continuum Camera for the 30-M IRAM Telescope

    NASA Astrophysics Data System (ADS)

    Monfardini, Alessandro

    2018-01-01

    We have constructed and deployed a multi-thousand-pixel dual-band (150 and 260 GHz, respectively 2 mm and 1.15 mm wavelengths) camera to image an instantaneous field of view of 6.5 arcmin, configurable to map the linear polarization at 260 GHz. We provide a detailed description of this instrument, named NIKA2 (New IRAM KID Arrays 2), focusing in particular on the cryogenics, the optics, the focal plane arrays based on Kinetic Inductance Detectors (KID), and the readout electronics. We present the performance measured on the sky during the commissioning runs that took place between October 2015 and April 2017 at the 30-meter IRAM (Institute of Millimetric Radio Astronomy) telescope at Pico Veleta, together with preliminary science-grade results.

  15. Supertyphoon Nepartak Barreling Towards Taiwan

    Atmospheric Science Data Center

    2016-12-30

    ... on the coast as the typhoon approaches, and air and train travel have been severely impacted. The typhoon is currently moving at about 10 ... view of Typhoon Nepartak on 7 July 2016 at 10:30 AM local time (2:30 AM UTC). On the left is an image from the nadir (vertical pointing) ...

  16. Single lens 3D-camera with extended depth-of-field

    NASA Astrophysics Data System (ADS)

    Perwaß, Christian; Wietzke, Lennart

    2012-03-01

    Placing a micro lens array in front of an image sensor transforms a normal camera into a single lens 3D camera, which also allows the user to change the focus and the point of view after a picture has been taken. While the concept of such plenoptic cameras has been known since 1908, only recently have the increased computing power of low-cost hardware and the advances in micro lens array production made the application of plenoptic cameras feasible. This text presents a detailed analysis of plenoptic cameras as well as introducing a new type of plenoptic camera with an extended depth of field and a maximal effective resolution of up to a quarter of the sensor resolution.

  17. Development of infrared scene projectors for testing fire-fighter cameras

    NASA Astrophysics Data System (ADS)

    Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.

    2008-04-01

    We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having a resolution of 800 x 600 with mirrors on a 17 micrometer pitch, aluminum-coated mirrors, and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 Kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 W CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has an advantage of being able to more easily fill the wide field of view of the fire-fighter cameras, which typically is about 50 degrees. Direct projection more efficiently utilizes the available light, which will become important in emerging multispectral and hyperspectral applications.

  18. Camera/Photometer Results

    NASA Astrophysics Data System (ADS)

    Clifton, K. S.; Owens, J. K.

    1983-04-01

    Efforts continue regarding the analysis of particulate contamination recorded by the Camera/Photometers on STS-2. These systems were constructed by Epsilon Laboratories, Inc. and consisted of two 16-mm photographic cameras, using Kodak Double X film, Type 7222, to make stereoscopic observations of contaminant particles and background. Each was housed within a pressurized canister and operated automatically throughout the mission, making simultaneous exposures on a continuous basis every 150 sec. The cameras were equipped with 18-mm f/0.9 lenses and subtended overlapping 20° fields-of-view. An integrating photometer was used to inhibit the exposure sequences during periods of excessive illumination and to terminate the exposures at preset light levels. During the exposures, a camera shutter operated in a chopping mode in order to isolate the movement of particles for velocity determinations. Calculations based on the preflight film calibration indicate that particles as small as 25 μm can be detected under ideal observing conditions. Current emphasis is placed on the digitization of the photographic data frames and the determination of particle distances, sizes, and velocities. It has been concluded that background brightness measurements cannot be established with any reliability on the STS-2 mission, due to the preponderance of Earth-directed attitudes and the incidence of light reflected from nearby surfaces.

  19. A Comparative Study of Microscopic Images Captured by a Box Type Digital Camera Versus a Standard Microscopic Photography Camera Unit

    PubMed Central

    Desai, Nandini J.; Gupta, B. D.; Patel, Pratik Narendrabhai

    2014-01-01

    Introduction: Obtaining images of slides viewed by a microscope can be invaluable for both diagnosis and teaching. They can be transferred among technologically-advanced hospitals for further consultation and evaluation. But a standard microscopic photography camera unit (MPCU) (MIPS-Microscopic Image Projection System) is costly and not available in resource-poor settings. The aim of our endeavour was to find a comparable and cheaper alternative method for photomicrography. Materials and Methods: We used a NIKON Coolpix S6150 camera (a box-type digital camera) with an Olympus CH20i microscope and a fluorescent microscope for the purpose of this study. Results: We obtained comparable results when capturing images of light microscopy, but the results were not as satisfactory for fluorescent microscopy. Conclusion: A box-type digital camera is a comparable, less expensive and convenient alternative to a microscopic photography camera unit. PMID:25478350

  20. Optomechanical Design of Ten Modular Cameras for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Ford, Virginia G.; Karlmann, Paul; Hagerott, Ed; Scherr, Larry

    2003-01-01

    This viewgraph presentation reviews the design and fabrication of the modular cameras for the Mars Exploration Rovers. The 2003 mission was to include two landers and two rovers, each carrying 10 cameras. Views of the camera design, the lens design, the lens interface with the detector assembly, the detector assembly, and the electronics assembly are shown.

  1. Game theoretic approach for cooperative feature extraction in camera networks

    NASA Astrophysics Data System (ADS)

    Redondi, Alessandro E. C.; Baroffio, Luca; Cesana, Matteo; Tagliasacchi, Marco

    2016-07-01

    Visual sensor networks (VSNs) consist of several camera nodes with wireless communication capabilities that can perform visual analysis tasks such as object identification, recognition, and tracking. Often, VSN deployments result in many camera nodes with overlapping fields of view. In the past, such redundancy has been exploited in two different ways: (1) to improve the accuracy/quality of the visual analysis task by exploiting multiview information or (2) to reduce the energy consumed for performing the visual task, by applying temporal scheduling techniques among the cameras. We propose a game theoretic framework based on the Nash bargaining solution to bridge the gap between the two aforementioned approaches. The key tenet of the proposed framework is for cameras to reduce the consumed energy in the analysis process by exploiting the redundancy in the reciprocal fields of view. Experimental results in both simulated and real-life scenarios confirm that the proposed scheme is able to increase the network lifetime, with a negligible loss in terms of visual analysis accuracy.
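
    The Nash bargaining solution referred to above picks the cooperative operating point that maximizes the product of the players' utility gains over their disagreement point. The toy sketch below applies that rule to two cameras splitting the processing of an overlapping region; the quadratic utility model and the zero disagreement point are invented for illustration and are not the paper's formulation.

```python
import numpy as np

# Fraction x of the overlapping field of view processed by camera 1 (camera 2 does 1-x).
x = np.linspace(0.0, 1.0, 1001)

# Invented utilities: energy saved relative to each camera processing the whole overlap.
# Disagreement point d_i = 0 (no cooperation, no saving).
u1 = 1.0 - x**2            # camera 1 saves more the less it processes (convex cost)
u2 = 1.0 - (1.0 - x)**2    # likewise for camera 2

nash_product = u1 * u2
i_star = np.argmax(nash_product)
print(f"Nash bargaining split: camera 1 processes {x[i_star]:.2f} of the overlap, "
      f"utilities ({u1[i_star]:.2f}, {u2[i_star]:.2f})")
```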

  2. HST Solar Arrays photographed by Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This view, backdropped against the blackness of space, shows one of two original Solar Arrays (SA) on the Hubble Space Telescope (HST). The scene was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

  3. Camera calibration for multidirectional flame chemiluminescence tomography

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Zhang, Weiguang; Zhang, Yuhong; Yu, Xun

    2017-04-01

    Flame chemiluminescence tomography (FCT), which combines computerized tomography theory and multidirectional chemiluminescence emission measurements, can realize instantaneous three-dimensional (3-D) diagnostics for flames with high spatial and temporal resolutions. One critical step of FCT is to record the projections by multiple cameras from different view angles. For high accuracy reconstructions, it requires that extrinsic parameters (the positions and orientations) and intrinsic parameters (especially the image distances) of the cameras be accurately calibrated first. Taking the focus effect of the camera into account, a modified camera calibration method was presented for FCT, and a 3-D calibration pattern was designed to solve for the parameters. The precision of the method was evaluated by reprojecting feature points to the cameras using the calibration results. The maximum root-mean-square error is 1.42 pixels for the feature points' positions and 0.0064 mm for the image distance. An FCT system with 12 cameras was calibrated by the proposed method and the 3-D CH* intensity of a propane flame was measured. The results showed that the FCT system provides reasonable reconstruction accuracy using the cameras' calibration results.

  4. Opportunity Surroundings on 3,000th Sol, Vertical Projection

    NASA Image and Video Library

    2012-09-07

    This 360-degree vertical projection, assembled from images taken by the navigation camera on NASA's Mars Exploration Rover Opportunity, shows the terrain surrounding the position where the rover spent its 3,000th Martian day.

  5. Routine 18F-2-deoxy-2-fluoro-D-glucose (18F-FDG) myocardial tomography using a normal large field of view gamma-camera.

    PubMed

    Höflin, F; Ledermann, H; Noelpp, U; Weinreich, R; Rösler, H

    1989-12-01

    There is a recent need to study glucose metabolism of the heart in ischemic, as well as in "hibernating or stunned" myocardium, and compare it with that in perfusion studies. In non-positron emission tomography centers, positron imaging is possible with a standard Anger-type camera if proper collimation and adequate shielding of the camera crystal can be achieved. For the study with fast-decaying isotopes, seven-pinhole tomography (7PHT), a limited-angle method designed for transaxial tomography of the left ventricle using a nonrotating camera, is well suited, because projections are acquired simultaneously. Individual adjustment (patient supine) of the camera's view axis (CAx) with the left ventricular axis (LVAx) gives excellent results: sensitivity for CHD 82%, specificity 72% in a prospective 201TI study (48 patients, x-ray coronarography as reference). Good alignment of CAx with LVAx is also achieved with the patient prone in LAO in a hammock above the camera surface. In this setting additional lead shielding of the camera is possible using a table reinforced with 5 cm of lead with a central hole for the 7PH-collimator, which has a special lead inlay. This allows utilization of the 511 KeV emitter 18F-FDG, which with a half-life of 109 minutes, can be transported a reasonable distance from the production site. System sensitivity and resolution for 18F was found comparable to 201Tl, 99mTc, and 123I using a phantom. First clinical examinations after 201Tl stress/redistribution studies showed increased 18F-FDG uptake in ischemic heart segments, as well as in "hibernating" nonperfused or "stunned" myocardium.

  6. Low-cost mobile phone microscopy with a reversed mobile phone camera lens.

    PubMed

    Switz, Neil A; D'Ambrosio, Michael V; Fletcher, Daniel A

    2014-01-01

    The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples.

  7. Low-Cost Mobile Phone Microscopy with a Reversed Mobile Phone Camera Lens

    PubMed Central

    Fletcher, Daniel A.

    2014-01-01

    The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples. PMID:24854188

  8. View of 'Cape St. Mary' from 'Cape Verde'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    As part of its investigation of 'Victoria Crater,' NASA's Mars Exploration Rover Opportunity examined a promontory called 'Cape St. Mary' from the vantage point of 'Cape Verde,' the next promontory counterclockwise around the crater's deeply scalloped rim. This view of Cape St. Mary combines several exposures taken by the rover's panoramic camera into an approximately true-color mosaic.

    The upper portion of the crater wall contains a jumble of material tossed outward by the impact that excavated the crater. This vertical cross-section through the blanket of ejected material surrounding the crater was exposed by erosion that expanded the crater outward from its original diameter, according to scientists' interpretation of the observations. Below the jumbled material in the upper part of the wall are layers that survive relatively intact from before the crater-causing impact. Near the base of the Cape St. Mary cliff are layers with a pattern called 'crossbedding,' intersecting with each other at angles, rather than parallel to each other. Large-scale crossbedding can result from material being deposited as wind-blown dunes.

    The images combined into this mosaic were taken during the 970th Martian day, or sol, of Opportunity's Mars-surface mission (Oct. 16, 2006). The panoramic camera took them through the camera's 750-nanometer, 530-nanometer and 430-nanometer filters.

  9. ACT-Vision: active collaborative tracking for multiple PTZ cameras

    NASA Astrophysics Data System (ADS)

    Broaddus, Christopher; Germano, Thomas; Vandervalk, Nicholas; Divakaran, Ajay; Wu, Shunguang; Sawhney, Harpreet

    2009-04-01

    We describe a novel scalable approach for the management of a large number of Pan-Tilt-Zoom (PTZ) cameras deployed outdoors for persistent tracking of humans and vehicles, without resorting to the large fields of view of associated static cameras. Our system, Active Collaborative Tracking - Vision (ACT-Vision), is essentially a real-time operating system that can control hundreds of PTZ cameras to ensure uninterrupted tracking of target objects while maintaining image quality and coverage of all targets using a minimal number of sensors. The system ensures the visibility of targets between PTZ cameras by using criteria such as distance from sensor and occlusion.

  10. In-flight Video Captured by External Tank Camera System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40% field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank opposite the orbiter side, there were 2 blade S-Band antennas about 2 1/2 inches long that transmitted a 10 watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighs about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

  11. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition

    PubMed Central

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    Purpose: The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Methods: Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Results: Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Conclusion: Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching. PMID:29283133

  12. Use and validation of mirrorless digital single light reflex camera for recording of vitreoretinal surgeries in high definition.

    PubMed

    Khanduja, Sumeet; Sampangi, Raju; Hemlatha, B C; Singh, Satvir; Lall, Ashish

    2018-01-01

    The purpose of this study is to describe the use of a commercial digital single-lens reflex (DSLR) camera for vitreoretinal surgery recording and compare it to a standard 3-chip charge-coupled device (CCD) camera. Simultaneous recording was done using a Sony A7s2 camera and a Sony high-definition 3-chip camera attached to each side of the microscope. The videos recorded from both camera systems were edited and sequences of similar time frames were selected. The three sequences selected for evaluation were (a) anterior segment surgery, (b) surgery under a direct viewing system, and (c) surgery under an indirect wide-angle viewing system. The videos of each sequence were evaluated and rated on a scale of 0-10 for color, contrast, and overall quality. Most results were rated either 8/10 or 9/10 for both cameras. A noninferiority analysis comparing mean scores of the DSLR camera versus the CCD camera was performed and P values were obtained. The mean scores of the two cameras were comparable on all parameters assessed in the different videos except for color and contrast in the posterior pole view and color in the wide-angle view, which were rated significantly higher (better) for the DSLR camera. Commercial DSLRs are an affordable low-cost alternative for vitreoretinal surgery recording and may be used for documentation and teaching.

  13. The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover

    NASA Astrophysics Data System (ADS)

    Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.

    The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with 65° field-of-view (1.1 mrad/pixel) and high resolution (85 µrad/pixel) monoscopic "zoom" images with 5° field-of-view. The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission as well as providing multispectral geological imaging, colour and stereo panoramic images, solar images for water vapour abundance and dust optical depth measurements and to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally the High Resolution Camera (HRC) can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.

  14. Camera Trajectory from Wide Baseline Images

    NASA Astrophysics Data System (ADS)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using a state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the

  15. Extracting accurate and precise topography from LROC narrow angle camera stereo observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Burns, K. N.; Seymour, P.; Speyerer, E. J.; Deran, A.; Boyd, A. K.; Howington-Kraus, E.; Rosiek, M. R.; Archinal, B. A.; Robinson, M. S.

    2017-02-01

    The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that each provide 0.5 to 2.0 m scale images of the lunar surface. Although not designed as a stereo system, LROC can acquire NAC stereo observations over two or more orbits using at least one off-nadir slew. Digital terrain models (DTMs) are generated from sets of stereo images and registered to profiles from the Lunar Orbiter Laser Altimeter (LOLA) to improve absolute accuracy. With current processing methods, DTMs have absolute accuracies better than the uncertainties of the LOLA profiles and relative vertical and horizontal precisions less than the pixel scale of the DTMs (2-5 m). We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. For a baseline of 15 m the highland mean slope parameters are: median = 9.1°, mean = 11.0°, standard deviation = 7.0°. For the mare the mean slope parameters are: median = 3.5°, mean = 4.9°, standard deviation = 4.5°. The slope values for the highland terrain are steeper than previously reported, likely due to a bias in targeting of the NAC DTMs toward higher relief features in the highland terrain. Overlapping DTMs of single stereo sets were also combined to form larger area DTM mosaics that enable detailed characterization of large geomorphic features. From one DTM mosaic we mapped a large viscous flow related to the Orientale basin ejecta and estimated its thickness and volume to exceed 300 m and 500 km3, respectively. Despite its ∼3.8 billion year age the flow still exhibits unconfined margin slopes above 30°, in some cases exceeding the angle of repose, consistent with deposition of material rich in impact melt. We show that the NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. At this date about 2% of the lunar surface is imaged in high-resolution stereo, and continued acquisition of stereo observations will serve to strengthen our
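
    As a hedged illustration of how baseline-dependent slope statistics such as those quoted above can be derived from a gridded DTM, the sketch below differences elevations separated by the chosen baseline along rows and columns; the array layout and helper name are assumptions, not the LROC processing pipeline.

      # Minimal sketch: slope statistics from a DTM at a fixed baseline.
      # Assumptions: dtm is a 2-D numpy array of elevations (meters) on a regular grid with
      # spacing gsd_m, and the baseline is close to an integer multiple of the grid spacing.
      import numpy as np

      def slope_stats(dtm, gsd_m=2.0, baseline_m=15.0):
          step = max(1, int(round(baseline_m / gsd_m)))   # pixels spanned by the baseline
          dist = step * gsd_m                             # actual horizontal distance used
          dz_x = dtm[:, step:] - dtm[:, :-step]           # elevation differences along rows
          dz_y = dtm[step:, :] - dtm[:-step, :]           # elevation differences along columns
          dz = np.abs(np.concatenate([dz_x.ravel(), dz_y.ravel()]))
          slopes = np.degrees(np.arctan(dz / dist))       # slope angles in degrees
          return np.median(slopes), slopes.mean(), slopes.std()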

  16. [Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].

    PubMed

    Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei

    2016-02-01

    We previously reported that an infrared camera enables observation of iris morphology in Peters' anomaly through edematous corneas. The aim here was to observe the iris morphology in bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women, mean age ± SD 72.7 ± 13.0 years) were enrolled in this study. The iris morphology was observed using the visible light mode and near-infrared light mode of an infrared camera (MeibomPen). The detectability of pupil shapes, iris patterns and presence of iridectomy was evaluated. Infrared mode observation enabled us to detect the pupil shape in 11 out of 11 cases, iris patterns in 3 out of 11 cases, and presence of iridectomy in 9 out of 11 cases, whereas visible light mode observation could not detect any iris morphological changes. Applying infrared optics was valuable for observation of the iris morphology through stromal edematous corneas.

  17. Real-time vehicle matching for multi-camera tunnel surveillance

    NASA Astrophysics Data System (ADS)

    Jelača, Vedran; Niño Castañeda, Jorge Oswaldo; Frías-Velázquez, Andrés; Pižurica, Aleksandra; Philips, Wilfried

    2011-03-01

    Tracking multiple vehicles with multiple cameras is a challenging problem of great importance in tunnel surveillance. One of the main challenges is accurate vehicle matching across cameras with non-overlapping fields of view. Since systems dedicated to this task can contain hundreds of cameras, each observing dozens of vehicles, computational efficiency is essential for real-time performance. In this paper, we propose a low-complexity, yet highly accurate method for vehicle matching using vehicle signatures composed of Radon-transform-like projection profiles of the vehicle image. The proposed signatures can be calculated by a simple scan-line algorithm by the camera software itself and transmitted to the central server or to the other cameras in a smart camera environment. The amount of data is drastically reduced compared to the whole image, which relaxes the data link capacity requirements. Experiments on real vehicle images, extracted from video sequences recorded in a tunnel by two distant security cameras, validate our approach.
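
    A rough sketch of the projection-profile idea described in this record is given below: each vehicle crop is reduced to row-wise and column-wise intensity profiles, and two signatures are compared by correlating the normalized profiles. The exact signature definition and matching score used in the paper may differ; the helper names are illustrative only.

      # Hedged sketch: projection-profile signatures and a simple correlation-based match score.
      # Assumes grayscale vehicle crops (2-D numpy arrays) resampled to a common size.
      import numpy as np

      def signature(img):
          h = img.sum(axis=1).astype(float)               # row-wise projection profile
          v = img.sum(axis=0).astype(float)               # column-wise projection profile
          return (h - h.mean()) / h.std(), (v - v.mean()) / v.std()

      def match_score(sig_a, sig_b):
          # mean Pearson-like correlation of the z-scored row and column profiles
          return np.mean([np.dot(a, b) / len(a) for a, b in zip(sig_a, sig_b)])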

  18. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

    Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field-programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.

  19. An evaluation of video cameras for collecting observational data on sanctuary-housed chimpanzees (Pan troglodytes).

    PubMed

    Hansen, Bethany K; Fultz, Amy L; Hopper, Lydia M; Ross, Stephen R

    2018-05-01

    Video cameras are increasingly being used to monitor captive animals in zoo, laboratory, and agricultural settings. This technology may also be useful in sanctuaries with large and/or complex enclosures. However, the cost of camera equipment and a lack of formal evaluations regarding the use of cameras in sanctuary settings make it challenging for facilities to decide whether and how to implement this technology. To address this, we evaluated the feasibility of using a video camera system to monitor chimpanzees at Chimp Haven. We viewed a group of resident chimpanzees in a large forested enclosure and compared observations collected in person and with remote video cameras. We found that via camera, the observer viewed fewer chimpanzees in some outdoor locations (GLMM post hoc test: est. = 1.4503, SE = 0.1457, Z = 9.951, p < 0.001) and identified a lower proportion of chimpanzees (GLMM post hoc test: est. = -2.17914, SE = 0.08490, Z = -25.666, p < 0.001) compared to in-person observations. However, the observer could view the 2 ha enclosure 15 times faster by camera compared to in person. In addition to these results, we provide recommendations to animal facilities considering the installation of a video camera system. Despite some limitations of remote monitoring, we posit that there are substantial benefits of using camera systems in sanctuaries to facilitate animal care and observational research. © 2018 Wiley Periodicals, Inc.

  20. Volunteers Help Decide Where to Point Mars Camera

    NASA Image and Video Library

    2015-07-22

    This series of images from NASA's Mars Reconnaissance Orbiter successively zooms into "spider" features -- or channels carved in the surface in radial patterns -- in the south polar region of Mars. In a new citizen-science project, volunteers will identify features like these using wide-scale images from the orbiter. Their input will then help mission planners decide where to point the orbiter's high-resolution camera for more detailed views of interesting terrain. Volunteers will start with images from the orbiter's Context Camera (CTX), which provides wide views of the Red Planet. The first two images in this series are from CTX; the top right image zooms into a portion of the image at left. The top right image highlights the geological spider features, which are carved into the terrain in the Martian spring when dry ice turns to gas. By identifying unusual features like these, volunteers will help the mission team choose targets for the orbiter's High Resolution Imaging Science Experiment (HiRISE) camera, which can reveal more detail than any other camera ever put into orbit around Mars. The final image in this series (bottom right) shows a HiRISE close-up of one of the spider features. http://photojournal.jpl.nasa.gov/catalog/PIA19823

  1. Multi-Angle Snowflake Camera Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shkurko, Konstantin; Garrett, T.; Gaustad, K

    The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of depth of field for its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data is sent via FireWire 800 line, and fallspeed (and camera control) is sent via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: images rely on OpenCV image processing library and derived aggregated statistics rely on some clever averaging. See Sections 4.1 and 4.2 for more details on what variables are computed.
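
    The fallspeed calculation described above reduces to the traverse time across the two trigger arrays; a trivial sketch follows (hypothetical helper, not the VAP code), using the 32 mm vertical separation stated in the record.

      # Fallspeed from the trigger timestamps of the upper and lower near-infrared arrays.
      ARRAY_SEPARATION_M = 0.032                          # vertical spacing of the trigger arrays

      def fallspeed_m_per_s(t_upper_s, t_lower_s):
          dt = t_lower_s - t_upper_s                      # time to fall from upper to lower array
          return ARRAY_SEPARATION_M / dt if dt > 0 else float("nan")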

  2. Viewing Vertical Objects with an Overhead Projector.

    ERIC Educational Resources Information Center

    Wild, R. L.

    1988-01-01

    Discusses the use of an overhead projector for the deflection of a vertical image to a screen. Describes three demonstrations: magnetizing of a steel ball bearing and paper clip; convection currents of a hot liquid within a cold liquid; and oscillation of concentrated salt solution into fresh water. (YP)

  3. Brief Report: Using a Point-of-View Camera to Measure Eye Gaze in Young Children with Autism Spectrum Disorder during Naturalistic Social Interactions--A Pilot Study

    ERIC Educational Resources Information Center

    Edmunds, Sarah R.; Rozga, Agata; Li, Yin; Karp, Elizabeth A.; Ibanez, Lisa V.; Rehg, James M.; Stone, Wendy L.

    2017-01-01

    Children with autism spectrum disorder (ASD) show reduced gaze to social partners. Eye contact during live interactions is often measured using stationary cameras that capture various views of the child, but determining a child's precise gaze target within another's face is nearly impossible. This study compared eye gaze coding derived from…

  4. Hematologic Nadirs During Chemoradiation for Anal Cancer: Temporal Characterization and Dosimetric Predictors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Andrew Y.; Golden, Daniel W.; Bazan, Jose G.

    Purpose: Pelvic bone marrow (BM) constraints may offer a means to reduce the toxicity commonly associated with chemoradiation for anal cancer. We conducted a bi-institutional analysis of dose-volume metrics in a time-sensitive fashion to devise practical metrics to minimize hematologic toxicity. Methods and Materials: Fifty-six anal cancer patients from 2 institutions received definitive radiation therapy (median primary dose of 54 Gy) using intensity modulated radiation therapy (IMRT, n=49) or 3-dimensional (3D) conformal therapy (n=7) with concurrent 5-fluorouracil (5-FU) and mitomycin C. Weekly blood counts were retrospectively plotted to characterize the time course of cytopenias. Dose-volume parameters were correlated with blood counts at a standardized time point to identify predictors of initial blood count nadirs. Results: Leukocytes, neutrophils, and platelets reached a nadir at week 3 of treatment. Smaller volumes of the pelvic BM correlated most strongly with lower week 3 blood counts, more so than age, sex, body mass index (BMI), or dose metrics. Patients who had ≥750 cc of pelvic BM spared from doses of ≥30 Gy had 0% grade 3+ leukopenia or neutropenia at week 3. Higher V40 Gy to the lower pelvic BM (LP V40) also correlated with cytopenia. Patients with an LP V40 >23% had higher rates of grade 3+ leukopenia (29% vs 4%, P=.02), grade 3+ neutropenia (33% vs 8%, P=.04), and grade 2+ thrombocytopenia (32% vs 7%, P=.04) at week 3. On multivariate analysis, pelvic BM volume and LP V40 remained associated with leukocyte count, and all marrow subsite volumes remained associated with neutrophil counts at week 3 (P<.1). Conclusions: Larger pelvic BM volumes correlate with less severe leukocyte and neutrophil nadirs, suggesting that larger total “marrow reserve” can mitigate cytopenias. Sparing a critical marrow reserve and limiting the V40 Gy to the lower pelvis may reduce the risk of hematologic toxicity.

  5. Contextual view of Warner's Ranch. Third of three sequential views ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Warner's Ranch. Third of three sequential views (from west to east) of the buildings in relation to the surrounding geography. Note approximate location of Overland Trail crossing left to right. Camera facing northeast - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  6. View of a dust storm taken from Atlantis during STS-106

    NASA Image and Video Library

    2000-09-11

    STS106-718-056 (11 September 2000) --- One of the STS-106 crew members on board the Space Shuttle Atlantis used a handheld 70mm camera to photograph this image of Afghanistan dust/front winds in the upper Amu Darya Valley. The strong winds along the northern border of Afghanistan lofted thick, light brown dust into the air (top half of the view). In this desert environment land surfaces are not protected by vegetation from the effect of blowing wind. The central Asian deserts experience the greatest number of dust storm days on the planet each year. The sharp dust front shows that the dust has not traveled far, but has been raised from the surfaces in the view. Dust is entrained in the atmosphere by horizontal winds but also by vertical movements. Here the vertical component is indicated by the fact that the higher points along the dust front are each topped by a small cumulus cloud, which appear as a line of small white puffballs. Cumulus clouds indicate upward motion and here the air which has entrained the dust is lifting the air above to the level of condensation at each point where a small cloud has formed.

  7. Design of microcontroller based system for automation of streak camera.

    PubMed

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  8. Design of microcontroller based system for automation of streak camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  9. Light field rendering with omni-directional camera

    NASA Astrophysics Data System (ADS)

    Todoroki, Hiroshi; Saito, Hideo

    2003-06-01

    This paper presents an approach to capture the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by building a light field with an omni-directional camera, which can capture wide surroundings. The omni-directional camera used in this technique is a special camera with a hyperbolic mirror in the upper part of the camera, so that we can capture luminosity in the environment over 360 degrees of the surroundings in one image. We apply the light field method, which is one technique of Image-Based Rendering (IBR), for generating the arbitrary viewpoint images. The light field is a kind of database that records the luminosity information in the object space. We employ the omni-directional camera for constructing the light field, so that we can collect many view-direction images in the light field. Thus our method allows the user to explore a wide scene and achieve a realistic representation of the virtual environment. For demonstrating the proposed method, we capture an image sequence in our lab's interior environment with an omni-directional camera, and successfully generate arbitrary viewpoint images for a virtual tour of the environment.

  10. External Mask Based Depth and Light Field Camera

    DTIC Science & Technology

    2013-12-08

    The camera design builds on previous light field cameras; a good overview of the sampling of the plenoptic function can be found in the survey work by Wetzstein et al. High spatial resolution depth and light fields are a rich source of information about the plenoptic function. Cited references include Pelican Imaging (http://www.pelicanimaging.com/) and E. Adelson and J. Wang, "Single lens stereo with a plenoptic camera," Pattern Analysis and Machine Intelligence.

  11. Fly-through viewpoint video system for multi-view soccer movie using viewpoint interpolation

    NASA Astrophysics Data System (ADS)

    Inamoto, Naho; Saito, Hideo

    2003-06-01

    This paper presents a novel method for virtual view generation that allows viewers to fly through a real soccer scene. A soccer match is captured by multiple cameras at a stadium and images of arbitrary viewpoints are synthesized by view interpolation of two real camera images near the given viewpoint. In the proposed method, the cameras do not need to be strongly calibrated; epipolar geometry between the cameras is sufficient for the view interpolation. Therefore, the method can easily be applied to a dynamic event even in a large space, because the effort for camera calibration can be reduced. A soccer scene is classified into several regions and virtual view images are generated based on the epipolar geometry in each region. Superimposition of the images completes virtual views for the whole soccer scene. An application for fly-through observation of a soccer match is introduced, along with the view-synthesis algorithm and experimental results.

  12. Synchronizing A Stroboscope With A Video Camera

    NASA Technical Reports Server (NTRS)

    Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Dismond, Harriet R.

    1993-01-01

    Circuit synchronizes flash of light from stroboscope with frame and field periods of video camera. Sync stripper sends vertical-synchronization signal to delay generator, which generates trigger signal. Flashlamp power supply accepts delayed trigger signal and sends pulse of power to flash lamp. Designed for use in making short-exposure images that "freeze" flow in wind tunnel. Also used for making longer-exposure images obtained by use of continuous intense illumination.

  13. Clinical outcomes and nadir prostate-specific antigen (PSA) according to initial PSA levels in primary androgen deprivation therapy for metastatic prostate cancer.

    PubMed

    Kitagawa, Yasuhide; Ueno, Satoru; Izumi, Kouji; Kadono, Yoshifumi; Mizokami, Atsushi; Hinotsu, Shiro; Akaza, Hideyuki; Namiki, Mikio

    2016-03-01

    To investigate the clinical outcomes of metastatic prostate cancer patients and the relationship between nadir prostate-specific antigen (PSA) levels and different types of primary androgen deprivation therapy (PADT). This study utilized data from the Japan Study Group of Prostate Cancer registry, which is a large, multicenter, population-based database. A total of 2982 patients treated with PADT were enrolled. Kaplan-Meier analysis was used to compare progression-free survival (PFS) and overall survival (OS) in patients treated using combined androgen blockade (CAB) and non-CAB therapies. The relationships between nadir PSA levels and PADT type according to initial serum PSA levels were also investigated. Among the 2982 enrolled patients, 2101 (70.5 %) were treated with CAB. Although CAB-treated patients had worse clinical characteristics, their probability of PFS and OS was higher compared with those treated with a non-CAB therapy. These results were due to a survival benefit with CAB in patients with an initial PSA level of 500-1000 ng/mL. Nadir PSA levels were significantly lower in CAB patients than in non-CAB patients with comparable initial serum PSA levels. A small survival benefit for CAB in metastatic prostate cancer was demonstrated in a Japanese large-scale prospective cohort study. The clinical significance of nadir PSA levels following PADT was evident, but the predictive impact of PSA nadir on OS was different between CAB and non-CAB therapy.

  14. America National Parks Viewed in 3D by NASA MISR Anaglyph 3

    NASA Image and Video Library

    2016-08-25

    Just in time for the U.S. National Park Service's Centennial celebration on Aug. 25, NASA's Multiangle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite is releasing four new anaglyphs that showcase 33 of our nation's national parks, monuments, historical sites and recreation areas in glorious 3D. Shown in the annotated image are Lewis and Clark National Historic Park, Mt. Rainier National Park, Olympic National Park, Ebey's Landing National Historical Reserve, San Juan Island National Historic Park, North Cascades National Park, Lake Chelan National Recreation Area, and Ross Lake National Recreation Area (also Mt. St. Helens National Volcanic Monument, administered by the U.S. Forest Service) MISR views Earth with nine cameras pointed at different angles, giving it the unique capability to produce anaglyphs, stereoscopic images that allow the viewer to experience the landscape in three dimensions. The anaglyphs were made by combining data from MISR's vertical-viewing and 46-degree forward-pointing camera. You will need red-blue glasses in order to experience the 3D effect; ensure you place the red lens over your left eye. The images have been rotated so that north is to the left in order to enable 3D viewing because the Terra satellite flies from north to south. All of the images are 235 miles (378 kilometers) from west to east. These data were acquired May 12, 2012, Orbit 65960. http://photojournal.jpl.nasa.gov/catalog/PIA20891
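
    For readers curious how such red-blue composites are typically assembled, the sketch below shows a generic red-cyan anaglyph composition from two co-registered camera views; it is a simplified illustration, not the MISR production algorithm, and assumes 8-bit RGB arrays of identical shape.

      # Generic red-cyan anaglyph: red channel from the left-eye view, green and blue from the
      # right-eye view (so red-blue glasses with the red lens over the left eye fuse the pair).
      import numpy as np

      def make_anaglyph(left_rgb, right_rgb):
          out = np.empty_like(left_rgb)
          out[..., 0] = left_rgb[..., 0]                  # red channel <- left-eye image
          out[..., 1:] = right_rgb[..., 1:]               # green, blue channels <- right-eye image
          return out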

  15. America National Parks Viewed in 3D by NASA MISR Anaglyph 1

    NASA Image and Video Library

    2016-08-25

    Just in time for the U.S. National Park Service's Centennial celebration on Aug. 25, NASA's Multiangle Imaging SpectroRadiometer (MISR) instrument aboard NASA's Terra satellite is releasing four new anaglyphs that showcase 33 of our nation's national parks, monuments, historical sites and recreation areas in glorious 3D. Shown in the annotated image are Walnut Canyon National Monument, Sunset Crater Volcano National Monument, Wupatki National Monument, Grand Canyon National Park, Pipe Spring National Monument, Zion National Park, Cedar Breaks National Monument, Bryce Canyon National Park, Capitol Reef National Park, Navajo National Monument, Glen Canyon National Recreation Area, Natural Bridges National Monument, Canyonlands National Park, and Arches National Park. MISR views Earth with nine cameras pointed at different angles, giving it the unique capability to produce anaglyphs, stereoscopic images that allow the viewer to experience the landscape in three dimensions. The anaglyphs were made by combining data from MISR's vertical-viewing and 46-degree forward-pointing camera. You will need red-blue glasses in order to experience the 3D effect; ensure you place the red lens over your left eye. The images have been rotated so that north is to the left in order to enable 3D viewing because the Terra satellite flies from north to south. All of the images are 235 miles (378 kilometers) from west to east. These data were acquired June 18, 2016, Orbit 87774. http://photojournal.jpl.nasa.gov/catalog/PIA20889

  16. First results from the TOPSAT camera

    NASA Astrophysics Data System (ADS)

    Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve

    2017-11-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.

  17. Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board

    PubMed Central

    Park, Yoonsu; Yun, Seokmin; Won, Chee Sun; Cho, Kyungeun; Um, Kyhyun; Sim, Sungdae

    2014-01-01

    Calibration between color camera and 3D Light Detection And Ranging (LIDAR) equipment is an essential process for data fusion. The goal of this paper is to improve the calibration accuracy between a camera and a 3D LIDAR. In particular, we are interested in calibrating a low resolution 3D LIDAR with a relatively small number of vertical sensors. Our goal is achieved by employing a new methodology for the calibration board, which exploits 2D-3D correspondences. The 3D corresponding points are estimated from the scanned laser points on the polygonal planar board with adjacent sides. Since the lengths of adjacent sides are known, we can estimate the vertices of the board as a meeting point of two projected sides of the polygonal board. The estimated vertices from the range data and those detected from the color image serve as the corresponding points for the calibration. Experiments using a low-resolution LIDAR with 32 sensors show robust results. PMID:24643005
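
    The vertex-estimation step described above can be pictured as fitting a line to the scanned points along each adjacent side of the board and intersecting the two lines; a simplified two-dimensional sketch (in the plane of the board) is given below. It is not the authors' implementation, and the helper names are illustrative.

      # Hedged sketch: estimate a board vertex as the intersection of two fitted lines.
      import numpy as np

      def fit_line(points):
          # total-least-squares line fit: returns a point on the line and a unit direction
          c = points.mean(axis=0)
          _, _, vt = np.linalg.svd(points - c)
          return c, vt[0]

      def intersect(p1, d1, p2, d2):
          # solve p1 + s*d1 = p2 + t*d2 for s, then return the intersection point
          A = np.column_stack([d1, -d2])
          s, _ = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
          return p1 + s * d1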

  18. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  19. Generating Stereoscopic Television Images With One Camera

    NASA Technical Reports Server (NTRS)

    Coan, Paul P.

    1996-01-01

    Straightforward technique for generating stereoscopic television images involves use of single television camera translated laterally between left- and right-eye positions. Camera acquires one of images (left- or right-eye image), and video signal from image delayed while camera translated to position where it acquires other image. Length of delay chosen so both images displayed simultaneously or as nearly simultaneously as necessary to obtain stereoscopic effect. Technique amenable to zooming in on small areas within broad scenes. Potential applications include three-dimensional viewing of geological features and meteorological events from spacecraft and aircraft, inspection of workpieces moving along conveyor belts, and aiding ground and water search-and-rescue operations. Also used to generate and display imagery for public education and general information, and possibly for medical purposes.

  20. The Clouds of Isidore

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These views of Hurricane Isidore were acquired by the Multi-angle Imaging SpectroRadiometer (MISR) on September 20, 2002. After bringing large-scale flooding to western Cuba, Isidore was upgraded (on September 21) from a tropical storm to a category 3 hurricane. Sweeping westward to Mexico's Yucatan Peninsula, the hurricane caused major destruction and left hundreds of thousands of people homeless. Although weakened after passing over the Yucatan landmass, Isidore regained strength as it moved northward over the Gulf of Mexico.

    At left is a colorful visualization of cloud extent that superimposes MISR's radiometric camera-by-camera cloud mask (RCCM) over natural-color radiance imagery, both derived from data acquired with the instrument's vertical-viewing (nadir) camera. Using brightness and statistical metrics, the RCCM is one of several techniques MISR uses to determine whether an area is clear or cloudy. In this rendition, the RCCM has been color-coded, and purple = cloudy with high confidence, blue = cloudy with low confidence, green = clear with low confidence, and red = clear with high confidence.

    In addition to providing information on meteorological events, MISR's data products are designed to help improve our understanding of the influences of clouds on climate. Cloud heights and albedos are among the variables that govern these influences. (Albedo is the amount of sunlight reflected back to space divided by the amount of incident sunlight.) The center panel is the cloud-top height field retrieved using automated stereoscopic processing of data from multiple MISR cameras. Areas where heights could not be retrieved are shown in dark gray. In some areas, such as the southern portion of the image, the stereo retrieval was able to detect thin, high clouds that were not picked up by the RCCM's nadir view. Retrieved local albedo values for Isidore are shown at right. Generation of the albedo product is dependent upon observed cloud radiances as a function

  1. 2D Measurements of the Balmer Series in Proto-MPEX using a Fast Visible Camera Setup

    NASA Astrophysics Data System (ADS)

    Lindquist, Elizabeth G.; Biewer, Theodore M.; Ray, Holly B.

    2017-10-01

    The Prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device with densities up to 10²⁰ m⁻³ and temperatures up to 20 eV. Broadband spectral measurements show the visible emission spectra are solely due to the Balmer lines of deuterium. Monochromatic and RGB color Sanstreak SC1 Edgertronic fast visible cameras capture high speed video of plasmas in Proto-MPEX. The color camera is equipped with a long pass 450 nm filter and an internal Bayer filter to view the Dα line at 656 nm on the red channel and the Dβ line at 486 nm on the blue channel. The monochromatic camera has a 434 nm narrow bandpass filter to view the Dγ intensity. In the setup, a 50/50 beam splitter is used so both cameras image the same region of the plasma discharge. Camera images were aligned to each other by viewing a grid, ensuring 1-pixel registration between the two cameras. A uniform-intensity calibrated white light source was used to perform a pixel-to-pixel relative and an absolute intensity calibration for both cameras. Python scripts combined the dual-camera data, rendering the Dα, Dβ, and Dγ intensity ratios. Observations from Proto-MPEX discharges will be presented. This work was supported by the U.S. D.O.E. contract DE-AC05-00OR22725.
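
    As a hedged sketch of the last processing step, the snippet below forms the Dα/Dγ and Dβ/Dγ ratio maps from registered, intensity-calibrated frames, masking pixels with negligible Dγ signal; the variable names and masking threshold are assumptions, not taken from the Proto-MPEX scripts.

      # Line-ratio maps from co-registered, calibrated camera frames (2-D numpy arrays).
      import numpy as np

      def balmer_ratios(d_alpha, d_beta, d_gamma, floor=1e-3):
          safe = np.where(d_gamma > floor, d_gamma, np.nan)   # mask pixels with little D-gamma signal
          return d_alpha / safe, d_beta / safe                # D-alpha/D-gamma and D-beta/D-gamma maps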

  2. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Astrophysics Data System (ADS)

    Bolas, Mark T.; Fisher, Scott S.

    1990-09-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.

  3. Light-Directed Ranging System Implementing Single Camera System for Telerobotics Applications

    NASA Technical Reports Server (NTRS)

    Wells, Dennis L. (Inventor); Li, Larry C. (Inventor); Cox, Brian J. (Inventor)

    1997-01-01

    A laser-directed ranging system has utility for use in various fields, such as telerobotics applications and other applications involving physically handicapped individuals. The ranging system includes a single video camera and a directional light source such as a laser mounted on a camera platform, and a remotely positioned operator. In one embodiment, the position of the camera platform is controlled by three servo motors to orient the roll axis, pitch axis and yaw axis of the video cameras, based upon an operator input such as head motion. The laser is offset vertically and horizontally from the camera, and the laser/camera platform is directed by the user to point the laser and the camera toward a target device. The image produced by the video camera is processed to eliminate all background images except for the spot created by the laser. This processing is performed by creating a digital image of the target prior to illumination by the laser, and then eliminating common pixels from the subsequent digital image which includes the laser spot. A reference point is defined at a point in the video frame, which may be located outside of the image area of the camera. The disparity between the digital image of the laser spot and the reference point is calculated for use in a ranging analysis to determine range to the target.
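
    The spot-isolation and ranging steps lend themselves to a compact illustration. The sketch below differences a frame taken before laser illumination against one taken with the laser on, takes the centroid of the brightened pixels, and converts the pixel offset from a reference point into a range under a simplified geometry in which the beam is parallel to the optical axis; this is an assumption-laden sketch, not the patented method.

      # Hedged sketch: laser-spot extraction by frame differencing and range from pixel disparity.
      import numpy as np

      def laser_spot_centroid(frame_before, frame_with_laser, threshold=30):
          diff = frame_with_laser.astype(int) - frame_before.astype(int)
          ys, xs = np.nonzero(diff > threshold)           # pixels brightened by the laser spot
          return (xs.mean(), ys.mean()) if xs.size else None

      def range_from_disparity(spot_px, reference_px, focal_px, baseline_m):
          # spot_px and reference_px are coordinates along the axis of the laser/camera offset
          disparity = abs(spot_px - reference_px)
          return focal_px * baseline_m / disparity if disparity > 0 else float("inf")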

  4. Improved pointing information for SCIAMACHY from in-flight measurements of the viewing directions towards sun and moon

    NASA Astrophysics Data System (ADS)

    Bramstedt, Klaus; Stone, Thomas C.; Gottwald, Manfred; Noël, Stefan; Bovensmann, Heinrich; Burrows, John P.

    2017-07-01

    The SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) on Envisat (2002-2012) performed nadir, limb, solar/lunar occultation and various monitoring measurements. The pointing information of the instrument is determined by the attitude information of the Envisat platform with its star trackers together with the encoder readouts of both the azimuth and the elevation scanner of SCIAMACHY. In this work, we present additional sources of attitude information from the SCIAMACHY measurements themselves. The basic principle is the same as used by the star tracker: we measure the viewing direction towards celestial objects, i.e. the sun and moon, to detect possible mispointings. In sun over limb port observations, we utilise the vertical scans over the solar disk. In the horizontal direction, SCIAMACHY's sun follower device (SFD) is used to adjust the viewing direction. Moon over limb port measurements use the SFD adjustment for both the vertical and the horizontal directions. The viewing direction is steered towards the intensity centroid of the illuminated part of the lunar disk. We use reference images from the USGS Robotic Lunar Observatory (ROLO) to take into account the inhomogeneous surface and the variations by lunar libration and phase to parameterise the location of the intensity centroid from the observation geometry. Solar observations through SCIAMACHY's so-called sub-solar port (with a viewing direction close to zenith) also use the SFD in the vertical direction. In the horizontal direction the geometry of the port defines the viewing direction. Using these three types of measurements, we fit improved mispointing parameters by minimising the pointing offsets in elevation and azimuth. The geolocation of all retrieved products will benefit from this; the tangent heights are especially improved. The altitudes assigned to SCIAMACHY's solar occultation measurements are changed in the range of -130 to -330 m, the lunar occultation

  5. Intelligent person identification system using stereo camera-based height and stride estimation

    NASA Astrophysics Data System (ADS)

    Ko, Jung-Hwan; Jang, Jae-Hun; Kim, Eun-Soo

    2005-05-01

    In this paper, a stereo camera-based intelligent person identification system is suggested. In the proposed method, the face area of the moving target person is extracted from the left image of the input stereo image pair by using a threshold value in the YCbCr color model, and by carrying out correlation between this segmented face area and the right input image, the location coordinates of the target face can be acquired; these values are then used to control the pan/tilt system through the modified PID-based recursive controller. Also, by using the geometric parameters between the target face and the stereo camera system, the vertical distance between the target and the stereo camera system can be calculated through a triangulation method. Using this calculated vertical distance and the angles of the pan and tilt, the target's real position in world space can be acquired, and from it the target's height and stride values can finally be extracted. Some experiments with video images of 16 moving persons show that a person could be identified with these extracted height and stride parameters.
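
    A minimal sketch of the triangulation idea follows, assuming a rectified stereo pair with known focal length (in pixels) and baseline, and a simple pan/tilt pointing model; the angle conventions are illustrative and not necessarily those used in the paper.

      # Distance from stereo disparity, and a Cartesian position from range plus pan/tilt angles.
      import numpy as np

      def depth_from_disparity(focal_px, baseline_m, disparity_px):
          return focal_px * baseline_m / disparity_px     # range from the camera to the target face

      def world_position(range_m, pan_rad, tilt_rad):
          x = range_m * np.cos(tilt_rad) * np.sin(pan_rad)
          y = range_m * np.cos(tilt_rad) * np.cos(pan_rad)
          z = range_m * np.sin(tilt_rad)
          return x, y, z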

  6. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    NASA Astrophysics Data System (ADS)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo-problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes: the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model based methods, which define the relation between the virtual depth, estimated from the light-field image, and the metric object distance. These two methods are compared to a well known curve fitting approach. Both model based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized totally focused and thus finding stereo correspondences is enhanced.
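
    The "Kalman-like fashion" of updating a per-pixel depth hypothesis can be summarized as an inverse-variance weighted average; a short sketch is given below (in the spirit of the description above, not the authors' code).

      # Fuse two depth estimates, each with an associated variance, into a single estimate.
      def fuse_depth(d1, var1, d2, var2):
          w1, w2 = 1.0 / var1, 1.0 / var2                 # weights from the estimate variances
          d = (w1 * d1 + w2 * d2) / (w1 + w2)             # fused virtual depth
          var = 1.0 / (w1 + w2)                           # fused variance, never larger than either input
          return d, var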

  7. Intelligent viewing control for robotic and automation systems

    NASA Astrophysics Data System (ADS)

    Schenker, Paul S.; Peters, Stephen F.; Paljug, Eric D.; Kim, Won S.

    1994-10-01

    We present a new system for supervisory automated control of multiple remote cameras. Our primary purpose in developing this system has been to provide capability for knowledge-based, `hands-off' viewing during execution of teleoperation/telerobotic tasks. The reported technology has broader applicability to remote surveillance, telescience observation, automated manufacturing workcells, etc. We refer to this new capability as `Intelligent Viewing Control (IVC),' distinguishing it from simple programmed camera motion control. In the IVC system, camera viewing assignment, sequencing, positioning, panning, and parameter adjustment (zoom, focus, aperture, etc.) are invoked and interactively executed in real time by a knowledge-based controller, drawing on a priori known task models and constraints, including operator preferences. This multi-camera control is integrated with a real-time, high-fidelity 3D graphics simulation, which is correctly calibrated in perspective to the actual cameras and their platform kinematics (translation/pan-tilt). This merged graphics-with-video design allows the system user to preview and modify the planned (`choreographed') viewing sequences. Further, during actual task execution, the system operator has available both the resulting optimized video sequence and supplementary graphics views from arbitrary perspectives. IVC, including operator-interactive designation of robot task actions, is presented to the user as a well-integrated, single-screen video-graphic user interface allowing easy access to all relevant telerobot communication/command/control resources. We describe and show pictorial results of a preliminary IVC system implementation for telerobotic servicing of a satellite.

  8. Polar Views of Planet Earth.

    ERIC Educational Resources Information Center

    Brochu, Michel

    1983-01-01

    In August, 1981, National Aeronautics and Space Administration launched Dynamics Explorer 1 into polar orbit equipped with three cameras built to view the Northern Lights. The cameras can photograph aurora borealis' faint light without being blinded by the earth's bright dayside. Photographs taken by the satellite are provided. (JN)

  9. Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.

    PubMed

    Song, Kai-Tai; Tai, Jen-Chao

    2006-10-01

    Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.
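
    The core of such an auto-calibration is the vanishing point of the lane direction, obtained by intersecting the detected parallel lane markings; under a zero-roll assumption with the principal point at the image centre, tilt and pan then follow from textbook relations. The sketch below uses those generic relations with an assumed, known focal length; it is not the paper's exact derivation, which also recovers the focal length from the lane width.

    ```python
    import numpy as np

    def vanishing_point(line1, line2):
        """Intersect two imaged lane markings, each given as ((x1, y1), (x2, y2))."""
        def homog_line(p, q):
            return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])
        vp = np.cross(homog_line(*line1), homog_line(*line2))
        return vp[:2] / vp[2]

    def tilt_and_pan(vp, principal_point, focal_px):
        """Tilt/pan (deg) from the road-direction vanishing point, assuming zero
        roll and a known focal length (common textbook relations)."""
        u, v = vp - np.asarray(principal_point)   # image v grows downward
        tilt = np.arctan2(-v, focal_px)           # horizon sits f*tan(tilt) above centre
        pan = np.arctan2(u * np.cos(tilt), focal_px)
        return np.degrees(tilt), np.degrees(pan)

    vp = vanishing_point(((100, 900), (520, 400)), ((1400, 900), (760, 400)))
    print(tilt_and_pan(vp, principal_point=(960, 540), focal_px=1200))
    ```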

  10. Camera Control and Geo-Registration for Video Sensor Networks

    NASA Astrophysics Data System (ADS)

    Davis, James W.

    With the use of large video networks, there is a need to coordinate and interpret the video imagery for decision support systems with the goal of reducing the cognitive and perceptual overload of human operators. We present computer vision strategies that enable efficient control and management of cameras to effectively monitor wide-coverage areas, and examine the framework within an actual multi-camera outdoor urban video surveillance network. First, we construct a robust and precise camera control model for commercial pan-tilt-zoom (PTZ) video cameras. In addition to providing a complete functional control mapping for PTZ repositioning, the model can be used to generate wide-view spherical panoramic viewspaces for the cameras. Using the individual camera control models, we next individually map the spherical panoramic viewspace of each camera to a large aerial orthophotograph of the scene. The result provides a unified geo-referenced map representation to permit automatic (and manual) video control and exploitation of cameras in a coordinated manner. The combined framework provides new capabilities for video sensor networks that are of significance and benefit to the broad surveillance/security community.
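
    A pan/tilt control model of this kind ultimately maps pan/tilt commands to directions on a unit viewing sphere (and back), which is the coordinate frame a spherical panoramic viewspace and its registration to an orthophoto can be built on. A minimal, hedged sketch with an assumed angle convention, not the paper's actual model:

    ```python
    import numpy as np

    def ptz_to_direction(pan_deg, tilt_deg):
        """Unit viewing direction for a pan/tilt pair (pan about the vertical
        axis, tilt above/below the horizon). Illustrative convention only."""
        p, t = np.radians([pan_deg, tilt_deg])
        return np.array([np.cos(t) * np.sin(p), np.cos(t) * np.cos(p), np.sin(t)])

    def direction_to_ptz(d):
        """Inverse mapping, e.g. to drive the camera toward a map target."""
        x, y, z = d / np.linalg.norm(d)
        return np.degrees(np.arctan2(x, y)), np.degrees(np.arcsin(z))

    print(direction_to_ptz(ptz_to_direction(30.0, -10.0)))  # -> (30.0, -10.0)
    ```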

  11. Eruption of Kliuchevskoi volcano

    NASA Image and Video Library

    1994-10-05

    STS068-155-094 (30 September-11 October 1994) --- (Kliuchevskoi Volcano) The crewmembers used a Linhof large-format Earth observation camera to photograph this nadir view of the week-old eruption on the Kamchatka Peninsula. The eruption and the follow-up environmental activity were photographed from 115 nautical miles above Earth. Six NASA astronauts spent a week and a half aboard the Space Shuttle Endeavour in support of the Space Radar Laboratory 2 (SRL-2) mission.

  12. PSA nadir as a predictive factor for biochemical disease-free survival and overall survival following whole-gland salvage HIFU following radiotherapy failure.

    PubMed

    Shah, T T; Peters, M; Kanthabalan, A; McCartan, N; Fatola, Y; van der Voort van Zyp, J; van Vulpen, M; Freeman, A; Moore, C M; Arya, M; Emberton, M; Ahmed, H U

    2016-09-01

    Treatment options for radio-recurrent prostate cancer are either androgen-deprivation therapy or salvage prostatectomy. Whole-gland high-intensity focussed ultrasound (HIFU) might have a role in this setting. An independent HIFU registry collated consecutive cases of HIFU. Between 2005 and 2012, we identified 50 men who underwent whole-gland HIFU following histological confirmation of localised disease after prior external beam radiotherapy. No upper threshold was applied for risk category, PSA or Gleason grade either at presentation or at the time of failure. Progression was defined as a composite of biochemical failure (Phoenix criteria (PSA>nadir+2 ng ml(-1))), start of systemic therapies or metastases. Median age (interquartile range (IQR)), pretreatment PSA (IQR) and Gleason score (range) were 68 years (64-72), 5.9 ng ml(-1) (2.2-11.3) and 7 (6-9), respectively. Median follow-up was 64 months (49-84). In all, 24/50 (48%) avoided androgen-deprivation therapies. A total of 28/50 (56%) achieved a PSA nadir <0.5 ng ml(-1), 15/50 (30%) had a nadir ⩾0.5 ng ml(-1) and 7/50 (14%) did not nadir (PSA non-responders). Actuarial 1-, 3- and 5-year progression-free survival (PFS) was 72, 40 and 31%, respectively. Actuarial 1-, 3- and 5-year overall survival (OS) was 100, 94 and 87%, respectively. When comparing patients with PSA nadir <0.5 ng ml(-1), nadir ⩾0.5 and non-responders, a statistically significant difference in PFS was seen (P<0.0001). Three-year PFS in each group was 57, 20 and 0%, respectively. Five-year OS was 96, 100 and 38%, respectively. Early in the learning curve, between 2005 and 2007, 3/50 (6%) developed a fistula. Intervention for bladder outlet obstruction was needed in 27/50 (54%). Patient-reported outcome measure questionnaires showed incontinence (any pad use) as 8/26 (31%). In our series of high-risk patients, in whom 30-50% may have micro-metastases, disease control rates were promising in PSA

  13. Whose point-of-view is it anyway?

    NASA Astrophysics Data System (ADS)

    Garvey, Gregory P.

    2011-03-01

    Shared virtual worlds such as Second Life privilege a single point-of-view, namely that of the user. When logged into Second Life a user sees the virtual world from a default viewpoint, which is from slightly above and behind the user's avatar (the user's alter ego 'in-world'). This point-of-view is as if the user were viewing his or her avatar using a camera floating a few feet behind it. In fact it is possible to set the view as if you were seeing the world through the eyes of your avatar, or you can even move the camera completely independently of your avatar. A change in point-of-view means more than just a different camera position. The practice of using multiple avatars requires a transformation of identity and personality. When a user 'enacts' the identity of a particular avatar, their 'real' personality is masked by the assumed personality. The technology of virtual worlds permits a change of point-of-view and also facilitates a change in identity. Does this cause any psychological distress? Or is the ability to be someone else and see a world (a game, a virtual world) through a different set of eyes somehow liberating and even beneficial?

  14. A time-resolved image sensor for tubeless streak cameras

    NASA Astrophysics Data System (ADS)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, the device requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator in combination with an in-pixel logic allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11 µm CMOS image sensor technology. The image array has 30 (vertical) x 128 (memory length) pixels with a pixel pitch of 22.4 µm.

  15. 3D Surface Reconstruction and Automatic Camera Calibration

    NASA Technical Reports Server (NTRS)

    Jalobeanu, Andre

    2004-01-01

    This viewgraph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.

  16. User-assisted visual search and tracking across distributed multi-camera networks

    NASA Astrophysics Data System (ADS)

    Raja, Yogesh; Gong, Shaogang; Xiang, Tao

    2011-11-01

    Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.

  17. MISR Images Northeastern Botswana

    NASA Technical Reports Server (NTRS)

    2000-01-01

    MISR images of the Ntwetwe and Sua Pans in northeastern Botswana, acquired on August 18, 2000 (Terra orbit 3553). The left image is a color view from the vertical-viewing (nadir) camera. On the right is a composite of red band imagery in which the 45-degree aft camera data are displayed in blue, 45-degree forward as green, and vertical as red. This combination causes wet areas to appear blue because of the glint-like reflection from water and damp surfaces. Clouds are visible in the upper left corner and right center of each image. The clouds look peculiar in the multi-angle view because geometric parallax resulting from their elevation above the surface causes a misregistration of the individual images making up the composite. This stereoscopic effect provides a way of distinguishing clouds from bright surfaces.

    The images are approximately 250 kilometers across. Ntwetwe and Sua pans are closed interior basins that catch rainwater and surface runoff during the wet season. Seasonal lakes form that may reach several meters in depth. During the dry season the collected waters rapidly evaporate leaving behind dissolved salts that coat the surface and turn it bright ('sua' means salt). The mining town of Sowa is located where the Sua Spit (a finger of grassland extending into the pan) attaches to the shore. Sowa represents headquarters for a JPL contingent carrying out MISR field experiments using the evaporite surface and the grasslands as targets and for Botswana scientists studying migration of groundwaters beneath the pans and surrounding areas. These efforts support the Southern Africa Regional Science Initiative (SAFARI-2000), which is now underway.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

    For more information: http://www-misr.jpl.nasa.gov

  18. Lensless imaging for wide field of view

    NASA Astrophysics Data System (ADS)

    Nagahara, Hajime; Yagi, Yasushi

    2015-02-01

    It is desirable to engineer a small camera with a wide field of view (FOV) because of current developments in the field of wearable cameras and computing products, such as action cameras and Google Glass. However, typical approaches for achieving wide FOV, such as attaching a fisheye lens and convex mirrors, require a trade-off between optics size and the FOV. We propose camera optics that achieve a wide FOV, and are at the same time small and lightweight. The proposed optics are a completely lensless and catoptric design. They contain four mirrors, two for wide viewing, and two for focusing the image on the camera sensor. The proposed optics are simple and can be simply miniaturized, since we use only mirrors for the proposed optics and the optics are not susceptible to chromatic aberration. We have implemented the prototype optics of our lensless concept. We have attached the optics to commercial charge-coupled device/complementary metal oxide semiconductor cameras and conducted experiments to evaluate the feasibility of our proposed optics.

  19. The effect of radiometer placement and view on inferred directional and hemispheric radiometric temperatures of an urban canopy

    NASA Astrophysics Data System (ADS)

    Adderley, C.; Christen, A.; Voogt, J. A.

    2015-07-01

    Any radiometer at a fixed location has a biased view when observing a convoluted, three-dimensional surface such as an urban canopy. The goal of this contribution is to determine the bias of various sensor views observing a simple urban residential neighbourhood (nadir, oblique, hemispherical) over a 24 hour cycle under clear weather conditions. The error in measuring a longwave radiation flux density (L) and/or inferring surface temperatures (T0) is quantified for different times over a diurnal cycle. Panoramic time-sequential thermography (PTST) data were recorded by a thermal camera on a hydraulic mast above a residential canyon in Vancouver, BC. The data set resolved sub-facet temperature variability of all representative urban facets in a 360° swath repetitively over a 24-hour cycle. This data set is used along with computer graphics and vision techniques to project measured fields of L for a given time and pixel onto texture sheets of a three-dimensional urban surface model at a resolution of centimetres. The resulting data set attributes L of each pixel on the texture sheets to different urban facets and associates facet location, azimuth, slope, material, and sky view factor. The texture sheets of L are used to calculate the complete surface temperature (T0,C) and to simulate the radiation in the field of view (FOV) of narrow and hemispheric radiometers observing the same urban surface (in absence of emissivity and atmospheric effects). The simulated directional (T0,d) and hemispheric (T0,h) radiometric temperatures inferred from various biased views are compared to T0,C. For a range of simulated off-nadir (φ) and azimuth (Ω) angles, T0,d(φ,Ω) and T0,C differ between -2.6 and +2.9 K over the course of the day. The effects of effective anisotropy are highest in the daytime, particularly around sunrise and sunset when different views can lead to differences in T0,d(φ,Ω) that are as high as 3.5 K. For a sensor with a narrow FOV in the nadir of the
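
    Since emissivity and atmospheric effects are excluded in the simulation described above, converting a longwave flux density L into a radiometric temperature is just the inverse Stefan-Boltzmann law, and a complete surface temperature can be formed by area-weighting the facet pixels before inverting. A minimal sketch with illustrative numbers; the exact weighting convention is an assumption here, not taken from the paper.

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def brightness_temperature(L):
        """Radiometric temperature for emitted longwave flux density L (W m^-2),
        assuming unit emissivity and no atmospheric effects."""
        return (L / SIGMA) ** 0.25

    def complete_surface_temperature(L_pixels, areas):
        """Area-weighted 'complete' surface temperature over all facet pixels."""
        L_mean = sum(L * a for L, a in zip(L_pixels, areas)) / sum(areas)
        return brightness_temperature(L_mean)

    # A sunlit roof pixel, a shaded wall pixel and a road pixel (illustrative values)
    print(round(brightness_temperature(450.0), 1))                          # ~298.5 K
    print(round(complete_surface_temperature([450.0, 380.0, 500.0],
                                              [1.0, 2.0, 1.5]), 1))
    ```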

  20. The effect of radiometer placement and view on inferred directional and hemispheric radiometric temperatures of a urban canopy

    NASA Astrophysics Data System (ADS)

    Adderley, C.; Christen, A.; Voogt, J. A.

    2015-02-01

    Any radiometer at a fixed location has a biased view when observing a convoluted, three dimensional surface such as an urban canopy. The goal of this contribution is to determine the bias of various sensor views observing a simple urban residential neighbourhood (nadir, oblique, hemispherical) over a 24 h cycle under clear weather conditions. The error in measuring longwave radiance (L) and/or inferring surface temperatures (T0) is quantified for different times over a diurnal cycle. Panoramic time-sequential thermography (PTST) data were recorded by a thermal camera on a hydraulic mast above a residential canyon in Vancouver, BC. The dataset resolved sub-facet temperature variability of all representative urban facets in a 360° swath repetitively over a 24 h cycle. This dataset is used along with computer graphics and vision techniques to project measured fields of L for a given time and pixel onto texture sheets of a three-dimensional urban surface model at a resolution of centimetres. The resulting dataset attributes L of each pixel on the texture sheets to different urban facets and associates facet location, azimuth, slope, material, and sky view factor. The texture sheets of L are used to calculate the complete surface temperature (T0,C) and to simulate the instantaneous field of view (IFOV) of narrow and hemispheric radiometers observing the same urban surface (in absence of emissivity and atmospheric effects). The simulated directional (T0,d) and hemispheric (T0,h) radiometric temperatures inferred from various biased views are compared to T0,C. For a range of simulated off-nadir (ϕ) and azimuth (Ω) angles, T0,d (ϕ, Ω) and T0,C differ between -2.7 and +2.9 K over the course of the day. The effects of effective anisotropy are highest in the daytime, particularly around sunrise and sunset when different views can lead to differences in T0,d (ϕ, Ω) that are as high as 3.5 K. For a sensor with a narrow IFOV in the nadir of the urban

  1. Occlusion handling framework for tracking in smart camera networks by per-target assistance task assignment

    NASA Astrophysics Data System (ADS)

    Bo, Nyan Bo; Deboeverie, Francis; Veelaert, Peter; Philips, Wilfried

    2017-09-01

    Occlusion is one of the most difficult challenges in the area of visual tracking. We propose an occlusion handling framework to improve the performance of local tracking in a smart camera view in a multicamera network. We formulate an extensible energy function to quantify the quality of a camera's observation of a particular target by taking into account both person-person and object-person occlusion. Using this energy function, a smart camera assesses the quality of observations over all targets being tracked. When it cannot adequately observe a target, a smart camera estimates the quality of observation of the target from the viewpoints of other assisting cameras. If a camera with a better observation of the target is found, the tracking task for the target is carried out with the assistance of that camera. In our framework, only the positions of persons being tracked are exchanged between smart cameras, so the communication bandwidth requirement is very low. Performance evaluation of our method on challenging video sequences with frequent and severe occlusions shows that the accuracy of a baseline tracker is considerably improved. We also report a performance comparison with state-of-the-art trackers, which our method outperforms.
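
    A hedged sketch of the kind of per-target observation-quality score described above: each camera penalises overlap of the target's bounding box with other persons and with static objects, and the target's tracking task is handed to the camera with the lowest energy. The weights, box format and helper names are illustrative assumptions, not the paper's formulation.

    ```python
    def overlap(target, occluder):
        """Fraction of the target box covered by the occluder box (x1, y1, x2, y2)."""
        tx1, ty1, tx2, ty2 = target
        ox1, oy1, ox2, oy2 = occluder
        iw = max(0.0, min(tx2, ox2) - max(tx1, ox1))
        ih = max(0.0, min(ty2, oy2) - max(ty1, oy1))
        area = (tx2 - tx1) * (ty2 - ty1)
        return iw * ih / area if area > 0 else 0.0

    def observation_energy(target, other_persons, objects, w_person=1.0, w_object=0.5):
        """Lower energy = better (less occluded) view of the target in this camera."""
        return (w_person * sum(overlap(target, p) for p in other_persons)
                + w_object * sum(overlap(target, o) for o in objects))

    # Camera A sees the target half-covered by another person; camera B sees it clear.
    e_a = observation_energy((100, 50, 160, 230), [(130, 60, 200, 240)], [])
    e_b = observation_energy((400, 80, 450, 260), [], [])
    print(round(e_a, 2), round(e_b, 2), "assist from B" if e_b < e_a else "keep A")
    ```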

  2. TES/Aura L2 Carbon Dioxide (CO2) Nadir V7 (TL2CO2N)

    Atmospheric Science Data Center

    2018-01-18

    TES/Aura L2 Carbon Dioxide (CO2) Nadir (TL2CO2N). Parameters: Earth Science > Atmosphere > Atmospheric Chemistry > Carbon and Hydrocarbon Compounds.

  3. Contextual view of Warner's Ranch. Second of three sequential views ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Warner's Ranch. Second of three sequential views (from west to east) of the buildings in relation to the surrounding geography. Ranch house and trading post/barn on left. Note approximate location of Overland Trail crossing left to right. Camera facing north. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  4. Contextual view of Warner's Ranch. First of three sequential views ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Contextual view of Warner's Ranch. First of three sequential views (from west to east) of the buildings in relation to the surrounding geography. Ranch House on right. Note approximate locations of Overland Trail on right and San Diego cutoff branching off to left. Camera facing northwest. - Warner Ranch, Ranch House, San Felipe Road (State Highway S2), Warner Springs, San Diego County, CA

  5. ETR, TRA642, CAMERA IS BELOW, BUT NEAR THE CEILING OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642, CAMERA IS BELOW, BUT NEAR THE CEILING OF THE GROUND FLOOR, AND LOOKS DOWN TOWARD THE CONSOLE FLOOR. CAMERA FACES WESTERLY. THE REACTOR PIT IS IN THE CENTER OF THE VIEW. BEYOND IT TO THE LEFT IS THE SOUTH SIDE OF THE WORKING CANAL. IN THE FOREGROUND ON THE RIGHT IS THE SHIELDING FOR THE PROCESS WATER TUNNEL AND PIPING. SPIRAL STAIRCASE AT LEFT OF VIEW. INL NEGATIVE NO. 56-2237. Jack L. Anderson, Photographer, 7/6/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Fixed Nadir Focus Concentrated Solar Power Applying Reflective Array Tracking Method

    NASA Astrophysics Data System (ADS)

    Setiawan, B.; Damayanti, A. M.; Murdani, A.; Habibi, I. I. A.; Wakidah, R. N.

    2018-04-01

    The Sun is one of the most promising renewable energy sources to be utilized; one of its applications is solar thermal concentration, CSP (Concentrated Solar Power). In conventional CSP energy conversion, the concentrator itself is moved to track the sunlight and keep it at the focus point. This approach consumes considerable energy, because the concentrator unit has considerable weight, and a large CSP system makes the moving unit wider and heavier still. The added weight and width increase the torque needed to drive the concentrator and to hold it against wind gusts. One way to reduce energy consumption is to direct the sunlight with a reflective array to nadir, through a CSP system with a reflective Fresnel lens concentrator. The focus lies below, in the nadir direction, and the concentrator remains in a fixed position even as the Sun's elevation angle changes from morning to afternoon. The energy is thus concentrated maximally, because the concentrator is protected from wind gusts, and damage or changes to the focusing structure are avoided. The study and simulation of the reflective array (a mechanical method) show the required movement of the reflector angles. The distance between reflectors and their angles are controlled by mechatronics. The simulation, using a 1 m2 Fresnel lens, gives a solar energy efficiency of 60.88%, under the restriction that the intensity of sunlight in the tropics is 1 kW at peak, from 6 AM until 6 PM.
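
    The fixed-nadir geometry described above is the standard heliostat condition: each flat reflector's normal must bisect the direction toward the Sun and the fixed direction toward the receiver (here, straight down through the Fresnel concentrator), so the required reflector angle follows directly from the solar elevation. A 2-D vertical-plane sketch with assumed conventions, not the paper's control scheme:

    ```python
    import numpy as np

    def mirror_normal(sun_elevation_deg):
        """Unit normal of a flat reflector that redirects sunlight to nadir
        (toward a receiver straight below), 2-D vertical-plane simplification."""
        e = np.radians(sun_elevation_deg)
        toward_sun = np.array([np.cos(e), np.sin(e)])    # (horizontal, vertical)
        toward_receiver = np.array([0.0, -1.0])          # straight down (nadir)
        n = toward_sun + toward_receiver                 # heliostat law: normal bisects the two
        return n / np.linalg.norm(n)

    for elev in (15, 45, 75):                            # morning to near noon
        n = mirror_normal(elev)
        angle = float(np.degrees(np.arctan2(n[1], n[0])))
        print(elev, "deg sun elevation -> normal at", round(angle, 1), "deg above horizontal")
    ```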

  7. Geometric Calibration and Validation of Ultracam Aerial Sensors

    NASA Astrophysics Data System (ADS)

    Gruber, Michael; Schachinger, Bernhard; Muick, Marc; Neuner, Christian; Tschemmernegg, Helfried

    2016-03-01

    We present details of the calibration and validation procedure of UltraCam aerial camera systems. Results from the laboratory calibration and from validation flights are presented for both the large-format nadir cameras and the oblique cameras. Thus in this contribution we show results from the UltraCam Eagle and the UltraCam Falcon, both nadir mapping cameras, and the UltraCam Osprey, our oblique camera system. This sensor offers a mapping-grade nadir component together with four oblique camera heads. The geometric processing after the flight mission is covered by the UltraMap software product, so we present details of that workflow as well. The first part consists of the initial post-processing, which combines image information with camera parameters derived from the laboratory calibration. The second part, the traditional automated aerial triangulation (AAT), is the step from single images to blocks and enables an additional optimization process. We also present some special features of our software, which are designed to better support the operator in analyzing large blocks of aerial images and judging the quality of the photogrammetric set-up.

  8. Gate simulation of Compton Ar-Xe gamma-camera for radionuclide imaging in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Dubov, L. Yu; Belyaev, V. N.; Berdnikova, A. K.; Bolozdynia, A. I.; Akmalova, Yu A.; Shtotsky, Yu V.

    2017-01-01

    Computer simulations of a cylindrical Compton Ar-Xe gamma camera are described in the current report. The detection efficiency of a cylindrical Ar-Xe Compton camera with an internal diameter of 40 cm is estimated as 1-3%, which is 10-100 times higher than that of a collimated Anger camera. It is shown that the cylindrical Compton camera can image a Tc-99m radiotracer distribution with a uniform spatial resolution of 20 mm through the whole field of view.
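
    Compton camera reconstruction rests on standard Compton kinematics: the scattering angle, and hence the cone on which the source must lie, follows from the energies deposited in the scatter and absorption interactions. This is a generic physics sketch, not code from the cited GATE simulation.

    ```python
    import math

    M_E_C2 = 510.999  # electron rest energy, keV

    def compton_cone_angle(e_deposit_keV, e_total_keV):
        """Scattering angle (deg) of the cone on which the source must lie.
        e_deposit: energy left in the scatterer; e_total: full photon energy
        (e.g. 140.5 keV for Tc-99m). Returns None if kinematically forbidden."""
        e_scattered = e_total_keV - e_deposit_keV
        cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered - 1.0 / e_total_keV)
        if not -1.0 <= cos_theta <= 1.0:
            return None
        return math.degrees(math.acos(cos_theta))

    print(round(compton_cone_angle(30.0, 140.5), 1))   # a typical Tc-99m scatter event
    ```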

  9. Evaluation of camera-based systems to reduce transit bus side collisions : phase II.

    DOT National Transportation Integrated Search

    2012-12-01

    The sideview camera system has been shown to eliminate blind zones by providing a view to the driver in real time. In order to provide the best integration of these systems, an integrated camera-mirror system (hybrid system) was developed and tes...

  10. Improving Photometric Calibration of Meteor Video Camera Systems

    NASA Technical Reports Server (NTRS)

    Ehlert, Steven; Kingery, Aaron; Suggs, Robert

    2016-01-01

    We present the results of new calibration tests performed by the NASA Meteoroid Environment Office (MEO) designed to help quantify and minimize systematic uncertainties in meteor photometry from video camera observations. These systematic uncertainties can be categorized by two main sources: an imperfect understanding of the linearity correction for the MEO's Watec 902H2 Ultimate video cameras and uncertainties in meteor magnitudes arising from transformations between the Watec camera's Sony EX-View HAD bandpass and the bandpasses used to determine reference star magnitudes. To address the first point, we have measured the linearity response of the MEO's standard meteor video cameras using two independent laboratory tests on eight cameras. Our empirically determined linearity correction is critical for performing accurate photometry at low camera intensity levels. With regards to the second point, we have calculated synthetic magnitudes in the EX bandpass for reference stars. These synthetic magnitudes enable direct calculations of the meteor's photometric flux within the camera bandpass without requiring any assumptions of its spectral energy distribution. Systematic uncertainties in the synthetic magnitudes of individual reference stars are estimated at 0.20 mag, and are limited by the available spectral information in the reference catalogs. These two improvements allow for zero-points accurate to 0.05-0.10 mag in both filtered and unfiltered camera observations with no evidence for lingering systematics.
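
    A hedged sketch of how such a zero-point is applied once synthetic reference-star magnitudes in the camera bandpass are in hand: each calibration star yields one estimate ZP = m_syn + 2.5 log10(counts), the field zero-point is a robust average, and meteor magnitudes follow from the same relation. The numbers and the use of a median are illustrative, not the MEO pipeline.

    ```python
    import math
    import statistics

    def zero_point(instrumental_counts, synthetic_mags):
        """Per-star zero-points ZP = m_syn + 2.5*log10(counts) and their median.
        Counts are background-subtracted, linearity-corrected integrated signal."""
        zps = [m + 2.5 * math.log10(c) for c, m in zip(instrumental_counts, synthetic_mags)]
        return statistics.median(zps), statistics.pstdev(zps)

    def meteor_magnitude(counts, zp):
        """Apparent magnitude of a meteor in the camera bandpass."""
        return zp - 2.5 * math.log10(counts)

    zp, scatter = zero_point([52000, 8100, 23000], [6.1, 8.2, 7.0])
    print(round(zp, 2), round(scatter, 2), round(meteor_magnitude(150000, zp), 2))
    ```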

  11. Camera for Quasars in the Early Universe (CQUEAN)

    NASA Astrophysics Data System (ADS)

    Kim, Eunbin; Park, W.; Lim, J.; Jeong, H.; Kim, J.; Oh, H.; Pak, S.; Im, M.; Kuehne, J.

    2010-05-01

    The early universe at z > 7 is where the first stars, galaxies, and quasars formed, starting the re-ionization of the universe. The discovery and study of quasars in the early universe allow us to witness the beginning of the history of astronomical objects. In order to perform a medium-deep, medium-wide imaging survey of quasars, we are developing an optical CCD camera, CQUEAN (Camera for QUasars in EArly uNiverse), which uses a 1024*1024 pixel deep-depletion CCD. It has enhanced QE compared with a conventional CCD in the wavelength band around 1 µm, and thus it will be an efficient tool for observing quasars at z > 7. It will be attached to the 2.1 m telescope at McDonald Observatory, USA. A focal reducer is designed to secure a larger field of view at the Cassegrain focus of the 2.1 m telescope. For long stable exposures, an auto-guiding system will be implemented using another CCD camera viewing an off-axis field. All these instruments will be controlled by software written in Python on a Linux platform. CQUEAN is expected to see first light during the summer of 2010.
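
    The purpose of the focal reducer is easiest to see from the small-angle field-of-view relation FOV = N·p/f: shortening the effective focal length widens the detector's footprint on the sky. A sketch with purely illustrative numbers; the detector and telescope parameters below are assumptions, not CQUEAN's actual specifications.

    ```python
    import math

    def field_of_view_arcmin(n_pixels, pixel_size_m, focal_length_m):
        """Detector field of view, FOV = N * pixel / f (small-angle), in arcmin."""
        return math.degrees(n_pixels * pixel_size_m / focal_length_m) * 60.0

    # Illustrative only: a 1024-px detector with 13-um pixels behind an assumed
    # native focal length, with and without a 0.5x focal reducer.
    f_native, reduction = 16.8, 0.5
    print(round(field_of_view_arcmin(1024, 13e-6, f_native), 1))              # without reducer
    print(round(field_of_view_arcmin(1024, 13e-6, f_native * reduction), 1))  # with reducer
    ```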

  12. Data-Acquisition Software for PSP/TSP Wind-Tunnel Cameras

    NASA Technical Reports Server (NTRS)

    Amer, Tahani R.; Goad, William K.

    2005-01-01

    Wing-Viewer is a computer program for acquisition and reduction of image data acquired by any of five different scientific-grade commercial electronic cameras used at Langley Research Center to observe wind-tunnel models coated with pressure- or temperature-sensitive paints (PSP/TSP). Wing-Viewer provides full automation of camera operation and acquisition of image data, and has limited data-preprocessing capability for quick viewing of the results of PSP/TSP test images. Wing-Viewer satisfies a requirement for a standard interface between all the cameras and a single personal computer. Written using Microsoft Visual C++ and the Microsoft Foundation Class Library as a framework, Wing-Viewer has the ability to communicate with the C/C++ software libraries that run on the controller circuit cards of all five cameras.

  13. Validation of GOME (ERS-2) NO2 vertical column data with ground-based measurements at Issyk-Kul (Kyrgyzstan)

    NASA Astrophysics Data System (ADS)

    Ionov, D.; Sinyakov, V.; Semenov, V.

    Starting from 1995 the global monitoring of atmospheric nitrogen dioxide has been carried out by the measurements of the nadir-viewing GOME spectrometer aboard the ERS-2 satellite. Continuous validation of these data by means of comparisons with well-controlled ground-based measurements is important to ensure the quality of GOME data products and improve related retrieval algorithms. At the station of Issyk-Kul (Kyrgyzstan) ground-based spectroscopic observations of the NO2 vertical column have been made since 1983. The station is located on the northern shore of Issyk-Kul lake, 1650 meters above sea level (42.6 N, 77.0 E). The site is equipped with a grating spectrometer for twilight measurements of zenith-scattered solar radiation in the visible range, and applies the DOAS technique to retrieve the NO2 vertical column. It is included in the list of NDSC stations as a complementary one. The present study is focused on validation of GOME NO2 vertical column data, based on an 8-year comparison with correlative ground-based measurements at the Issyk-Kul station in 1996-2003. Within the investigation, the agreement of both individual and monthly averaged GOME measurements with corresponding twilight ground-based observations is examined. Such agreement is analyzed with respect to different conditions (season, sun elevation), temporal/spatial criteria choice (actual overpass location, correction for diurnal variation) and data processing (GDP version 2.7, 3.0). In addition, NO2 vertical columns were integrated from simultaneous stratospheric profile measurements by the NASA HALOE and SAGE-II/III satellite instruments and introduced to explain the differences with ground-based observations. In particular cases, NO2 vertical profiles retrieved from the twilight ground-based measurements at Issyk-Kul were also included in the comparison. Overall, summertime GOME NO2 vertical columns were found to be systematically lower than the ground-based data. This work was supported by International Association

  14. PBF Reactor Building (PER620). Camera facing south end of high ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Reactor Building (PER-620). Camera facing south end of high bay. Vertical-lift door is being installed. Later, pneumatic seals will be installed around door. Photographer: Kirsh. Date: September 31, 1968. INEEL negative no. 68-3176 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  15. View of southeastern New York State

    NASA Image and Video Library

    1973-08-15

    SL3-87-299 (July-September 1973) --- A vertical view of southeastern New York State is seen in this Skylab 3 Earth Resources Experiments Package S190-B (five-inch Earth terrain camera) infrared photograph taken from the Skylab space station in Earth orbit. An 18-inch, 450mm lens and type 2443 infrared Ektachrome film was used. This picture covers the northern part of New Jersey, a part of northwestern Pennsylvania, and the western tip of Connecticut. The body of water is Long Island Sound. The wide Hudson River flows southward across a corner of the photograph. The New York City metropolitan area occupies part of the picture. Federal agencies participating with NASA on the EREP project are the Departments of Agriculture, Commerce, Interior, the Environmental Protection Agency and the Corps of Engineers. All EREP photography is available to the public through the Department of Interior's Earth Resources Observations Systems Data Center, Sioux Falls, South Dakota, 57198. Photo credit: NASA

  16. Optical Design of the LSST Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Seppala, L; Gilmore, K

    2008-07-16

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a camera system that includes a set of broad-band filters and refractive corrector lenses to produce a flat focal plane with a field of view of 9.6 square degrees. Optical design of the camera lenses and filters is integrated with optical design of telescope mirrors to optimize performance, resulting in excellent image quality over the entire field from ultra-violet to near infra-red wavelengths. The LSST camera optics design consists of three refractive lenses with clear aperture diameters of 1.55 m, 1.10 m and 0.69 m and six interchangeable, broad-band filters with clear aperture diameters of 0.75 m. We describe the methodology for fabricating, coating, mounting and testing these lenses and filters, and we present the results of detailed tolerance analyses, demonstrating that the camera optics will perform to the specifications required to meet their performance goals.

  17. Hubble Space Telescope photographed by Electronic Still Camera

    NASA Image and Video Library

    1993-12-04

    S61-E-001 (4 Dec 1993) --- This medium close-up view of the top portion of the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and down linked to ground controllers soon afterward. Endeavour's crew captured the HST on December 4, 1993 in order to service the telescope over a period of five days. Four of the crew members will work in alternating pairs outside Endeavour's shirt sleeve environment to service the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  18. HST Solar Arrays photographed by Electronic Still Camera

    NASA Image and Video Library

    1993-12-07

    S61-E-020 (7 Dec 1993) --- This close-up view of one of two Solar Arrays (SA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and down linked to ground controllers soon afterward. Endeavour's crew captured the HST on December 4, 1993, in order to service the telescope over a period of five days. Four of the crew members will work in alternating pairs outside Endeavour's shirt sleeve environment to service the giant telescope. Electronic still photography is a relatively new technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality. The electronic still camera has flown as an experiment on several other shuttle missions.

  19. Extratropical Cyclone in the Southern Ocean

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These images from the Multi-angle Imaging SpectroRadiometer (MISR) portray an occluded extratropical cyclone situated in the Southern Ocean, about 650 kilometers south of the Eyre Peninsula, South Australia. The left-hand image, a true-color view from MISR's nadir (vertical-viewing) camera, shows clouds just south of the Yorke Peninsula and the Murray-Darling river basin in Australia. Retrieved cloud-tracked wind velocities are indicated by the superimposed arrows. The image on the right displays cloud-top heights. Areas where cloud heights could not be retrieved are shown in black. Both the wind vectors and the cloud heights were derived using data from multiple MISR cameras within automated computer processing algorithms. The stereoscopic algorithms used to generate these results are still being refined, and future versions of these products may show modest changes. Extratropical cyclones are the dominant weather system at midlatitudes, and the term is used generically for regional low-pressure systems in the mid- to high-latitudes. In the southern hemisphere, cyclonic rotation is clockwise. These storms obtain their energy from temperature differences between air masses on either side of warm and cold fronts, and their characteristic pattern is of warm and cold fronts radiating out from a migrating low pressure center which forms, deepens, and dissipates as the fronts fold and collapse on each other. The center of this cyclone has started to decay, with the band of cloud to the south most likely representing the main front that was originally connected with the cyclonic circulation. These views were acquired on October 11, 2001, and the large view represents an area of about 380 kilometers x 1900 kilometers. Image courtesy NASA/GSFC/LaRC/JPL, MISR Team.
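
    The stereoscopic parallax mentioned above is also what the automated cloud-top height retrieval exploits: a cloud at height h is displaced along-track between the nadir and an oblique camera by roughly h·tan(view angle), so matching a feature between the two views and dividing the ground-projected displacement by tan(θ) yields its height. A minimal sketch that ignores wind advection; the numbers are illustrative, not from this scene.

    ```python
    import math

    def cloud_top_height(parallax_m, view_angle_deg):
        """Stereo height from nadir/oblique along-track parallax, wind ignored."""
        return parallax_m / math.tan(math.radians(view_angle_deg))

    # A feature displaced by ~8 km between the nadir and 45-degree aft images
    print(round(cloud_top_height(8000.0, 45.0)))   # -> 8000 m
    ```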

  20. Validation of the iPhone app using the force platform to estimate vertical jump height.

    PubMed

    Carlos-Vivas, Jorge; Martin-Martinez, Juan P; Hernandez-Mocholi, Miguel A; Perez-Gomez, Jorge

    2018-03-01

    Vertical jump performance has been evaluated with several devices: force platforms, contact mats, Vertec, accelerometers, infrared cameras and high-velocity cameras; however, the force platform is considered the gold standard for measuring vertical jump height. The purpose of this study was to validate an iPhone app called My Jump, which measures vertical jump height, by comparing it with methods that use the force platform to estimate vertical jump height, namely, vertical velocity at take-off and time in the air. A total of 40 sport sciences students (age 21.4±1.9 years) completed five countermovement jumps (CMJs) over a force platform. Thus, 200 CMJ heights were evaluated from the vertical velocity at take-off and the time in the air using the force platform, and from the time in the air with the My Jump mobile application. The heights obtained were compared using the intraclass correlation coefficient (ICC). Correlation between the app and the force platform using the time in the air was perfect (ICC=1.000, P<0.001). Correlation between the app and the force platform using the vertical velocity at take-off was also very high (ICC=0.996, P<0.001), with an error margin of 0.78%. Therefore, these results showed that the application, My Jump, is an appropriate method to evaluate vertical jump performance; however, the vertical jump height is slightly overestimated compared with that of the force platform.
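
    Both the app and the force platform's time-in-air method reduce to the same projectile kinematics: the centre of mass is airborne for time t and rises for t/2, so h = g·t²/8; the take-off-velocity method is the equivalent h = v²/(2g). A minimal worked sketch with illustrative numbers.

    ```python
    G = 9.81  # gravitational acceleration, m/s^2

    def jump_height_from_flight_time(t_air_s):
        """Vertical jump height from time in the air: h = g * t^2 / 8."""
        return G * t_air_s ** 2 / 8.0

    def jump_height_from_takeoff_velocity(v_takeoff_ms):
        """Equivalent estimate from vertical take-off velocity: h = v^2 / (2g)."""
        return v_takeoff_ms ** 2 / (2.0 * G)

    print(round(jump_height_from_flight_time(0.55), 3))       # ~0.371 m
    print(round(jump_height_from_takeoff_velocity(2.70), 3))  # ~0.372 m
    ```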