Note: This page contains sample records for the topic camera lroc images from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: August 15, 2014.
1

NASA's Lunar Reconnaissance Orbiter Cameras (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) mission is scheduled to launch in the fall of 2008 as part of NASA's Robotic Lunar Exploration Program and is the first spacecraft to be built as part of NASA's Vision for Space Exploration. The orbiter will be equipped with seven scientific instrument packages, one of which is LROC. The Lunar Reconnaissance Orbiter Camera (LROC) has been designed to address two of LRO's primary measurement objectives: landing site certification and monitoring of polar illumination. In order to examine potential landing sites, high-resolution images (0.5 m/pixel) will be used to assess meter-scale features near the poles and other regions on the lunar surface. The LROC will also acquire 100 m/pixel images of the polar regions of the Moon during each orbit for a year to identify areas of permanent shadow and permanent or near-permanent illumination. In addition to these two main objectives, the LROC team also plans to conduct meter-scale monitoring of polar regions under varying illumination angles, acquire overlapping observations to enable derivation of meter-scale topography, acquire global multispectral imaging to map ilmenite and other minerals, derive a global morphology base map, characterize regolith properties, and determine current impact hazards by re-imaging areas covered by Apollo images to search for newly formed impact craters. The LROC is a modified version of the Mars Reconnaissance Orbiter's Context Camera and Mars Color Imager. The LROC will be made up of four optical elements: two identical narrow-angle telescopes …

Robinson, M.; McEwen, A.; Eliason, E.; Jolliff, B.; Hiesinger, H.; Malin, M.; Thomas, P.; Turtle, E.; Brylow, S.

2

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M. S.; Brylow, S. M.; Tschimmel, M.; Humm, D.; Lawrence, S. J.; Thomas, P. C.; Denevi, B. W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M. A.; Caplinger, M. A.; Ghaemi, F. T.; Schaffner, J. A.; Malin, M. C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T. N.; Eliason, E. M.; McEwen, A. S.; Turtle, E.; Jolliff, B. L.; Hiesinger, H.

2010-01-01

3

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that …

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

4

Polar illumination conditions analysed with LROC images  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NAC) provide synoptic and high-resolution imaging of the polar regions, respectively. Image sequences collected over time provide the means to create movie sequences and illumination maps, verify illumination models, and identify small, nearly continuously illuminated peaks to aid future polar exploration. With the LROC dataset, we have identified regions that are illuminated for over 90% of the year and small illuminated peaks that are tens of meters wide in regions predicted to be in shadow.

Speyerer, E.; Robinson, M.

2011-10-01

5

Initial Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Camera (LROC) Stereo Imagery  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO), launched June 18, 2009, carries the Lunar Reconnaissance Orbiter Camera (LROC) as one of seven remote sensing instruments on board. The camera system is equipped with a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NAC) for systematic lunar surface mapping and detailed site characterization for potential landing site selection and resource identification. The LROC WAC is a pushframe camera with five 14-line by 704-sample framelets for visible light bands and two 16-line by 512-sample (summed 4x to 4 by 128) UV bands. The WAC can also acquire monochrome images with a 14-line by 1024-sample format. At the nominal 50-km orbit the visible bands ground scale is 75-m/pixel and the UV 383-m/pixel. Overlapping WAC images from adjacent orbits can be used to map topography at a scale of a few hundred meters. The two panchromatic NAC cameras are pushbroom imaging sensors each with a Cassegrain telescope of a 700-mm focal length. The two NAC cameras are aligned with a small overlap in the cross-track direction so that they cover a 5-km swath with a combined field-of-view (FOV) of 5.6°. At an altitude of 50-km, the NAC can provide panchromatic images from its 5,000-pixel linear CCD at a ground scale of 0.5-m/pixel. Calibration of the cameras was performed by using precision collimator measurements to determine the camera principal points and radial lens distortion. The orientation of the two NAC cameras is estimated by a boresight calibration using double and triple overlapping NAC images of the lunar surface. The resulting calibration results are incorporated into a photogrammetric bundle adjustment (BA), which models the LROC camera imaging geometry, in order to refine the exterior orientation (EO) parameters initially retrieved from the SPICE kernels. Consequently, the improved EO parameters can significantly enhance the quality of topographic products derived from LROC NAC imagery. In addition, an analysis of the spacecraft jitter effect is performed by measuring lunar surface features in the NAC CCD overlapping strip in the image space and object space. Topographic and cartographic data processing results and products derived from LROC NAC and WAC stereo imagery using different software systems from several participating institutions of the LROC team will be presented, including results of calibration, bundle adjustment, jitter analysis, DEM, orthophoto, and cartographic maps.

Li, R.; Oberst, J.; McEwen, A. S.; Archinal, B. A.; Beyer, R. A.; Thomas, P. C.; Chen, Y.; Hwangbo, J.; Lawver, J. D.; Scholten, F.; Mattson, S. S.; Howington-Kraus, A. E.; Robinson, M. S.

2009-12-01

6

Preliminary Mapping of Permanently Shadowed and Sunlit Regions Using the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The spin axis of the Moon is tilted by only 1.5° (compared with the Earth's 23.5°), leaving some areas near the poles in permanent shadow while other nearby regions remain sunlit for a majority of the year. Theory, radar data, neutron measurements, and Lunar CRater Observation and Sensing Satellite (LCROSS) observations suggest that volatiles may be present in the cold traps created inside these permanently shadowed regions, while areas of near-permanent illumination are prime locations for future lunar outposts due to benign thermal conditions and near-constant solar power. The Lunar Reconnaissance Orbiter (LRO) has two imaging systems that provide medium and high resolution views of the poles. During almost every orbit the LROC Wide Angle Camera (WAC) acquires images at 100 m/pixel of the polar region (80° to 90° north and south latitude). In addition, the LROC Narrow Angle Camera (NAC) targets selected regions of interest at 0.7 to 1.5 m/pixel [Robinson et al., 2010]. During the first 11 months of the nominal mission, LROC acquired almost 6,000 WAC images and over 7,300 NAC images of the polar region (i.e., within 2° of the poles). By analyzing this time series of WAC and NAC images, regions of permanent shadow and permanent or near-permanent illumination can be quantified. The LROC Team is producing several reduced data products that graphically illustrate the illumination conditions of the polar regions. Illumination movie sequences are being produced that show how the lighting conditions change over a calendar year. Each frame of the movie sequence is a polar stereographic projected WAC image showing the lighting conditions at that moment. With the WAC's wide field of view (~100 km at an altitude of 50 km), each frame has repeat coverage between 88° and 90° at each pole. The same WAC images are also being used to develop multi-temporal illumination maps that show the percentage of time each 100 m × 100 m area is illuminated. These maps are derived by stacking all the WAC frames, selecting a threshold to determine if the surface is illuminated, and summing the resulting binary images. In addition, mosaics of NAC images are being produced for regions of interest at a scale of 0.7 to 1.5 m/pixel. The mosaics produced so far have revealed small illuminated surfaces on the tens of meters scale that were previously thought to be shadowed during that time. The LROC dataset of the polar regions complements previous illumination analyses of Clementine images [Bussey et al., 1999], Kaguya topography [Bussey et al., 2010], and the current efforts underway by the Lunar Orbiter Laser Altimeter (LOLA) Team [Mazarico et al., 2010], and provides an important new dataset for science and exploration. References: Bussey et al. (1999), Illumination conditions at the lunar south pole, Geophysical Research Letters, 26(9), 1187-1190. Bussey et al. (2010), Illumination conditions of the south pole of the Moon derived from Kaguya topography, Icarus, 208, 558-564. Mazarico et al. (2010), Illumination of the lunar poles from the Lunar Orbiter Laser Altimeter (LOLA) Topography Data, paper presented at 41st LPSC, Houston, TX. Robinson et al. (2010), Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview, Space Sci Rev, 150, 81-124.
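The multi-temporal illumination-map construction described above (stack co-registered frames, threshold each into a binary illuminated/shadowed mask, sum the masks) lends itself to a very compact sketch. The following is a minimal illustration assuming a stack of already co-registered, map-projected polar frames and an arbitrary brightness threshold; it is not the LROC team's production code.

```python
# Hypothetical sketch of a multi-temporal illumination map: stack co-registered
# polar frames, threshold each into a binary lit/shadowed mask, and average the
# stack to get the fraction of time each pixel is illuminated.
# The array names and threshold value are illustrative assumptions.
import numpy as np

def illumination_map(frames, threshold):
    """frames: (n_frames, rows, cols) array of co-registered, map-projected
    radiance (or DN) images; threshold: value above which a pixel counts as lit.
    Returns the fraction of frames in which each pixel is illuminated."""
    binary = frames > threshold          # True where the surface is sunlit
    return binary.mean(axis=0)           # fraction illuminated, 0.0 to 1.0

# Example with synthetic data standing in for a year of WAC polar coverage
rng = np.random.default_rng(0)
stack = rng.random((120, 512, 512))      # 120 fake frames
pct_lit = illumination_map(stack, threshold=0.5) * 100.0
print(pct_lit.shape, pct_lit.max())
```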

Speyerer, E.; Koeber, S.; Robinson, M. S.

2010-12-01

7

LROC NAC Stereo Anaglyphs  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) acquires high resolution (50 to 200 cm pixel scale) images of the Moon. In operation since June 2009, LROC NAC acquires geometric stereo pairs by rolling off-nadir on subsequent orbits. A new automated processing system currently in development will produce anaglyphs from most of the NAC geometric stereo pairs. An anaglyph is an image formed by placing one image from the stereo pair in the red channel, and the other image from the stereo pair in the green and blue channels, so that together with red-blue or red-cyan glasses, the 3D information in the pair can be readily viewed. These new image products will make qualitative interpretation of the lunar surface in 3D more accessible, without the need for intensive computational resources or special equipment. The LROC NAC is composed of two separate pushbroom CCD cameras (NAC L and R) aligned to increase the full swath width to 5 km from an altitude of 50 km. Development of the anaglyph processing system incorporates stereo viewing geometry, proper alignment of the NAC L and R frames, and optimal contrast normalization of the stereo pair to minimize extreme brightness differences, which can make stereo viewing difficult in an anaglyph. The LROC NAC anaglyph pipeline is based on a similar automated system developed for the HiRISE camera, on the Mars Reconnaissance Orbiter. Improved knowledge of camera pointing and spacecraft position allows for the automatic registration of the L and R frames by map projecting them to a polar stereographic projection. One half of the stereo pair must then be registered to the other so there is no offset in the vertical (y) direction. Stereo viewing depends on parallax only in the horizontal (x) direction. High resolution LROC NAC anaglyphs will be made available to the lunar science community and to the public on the LROC web site (http://lroc.sese.asu.edu).
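The red/cyan anaglyph construction described above (left image in the red channel, right image in green and blue) can be sketched in a few lines. This is a minimal illustration assuming two already co-registered grayscale frames; the percentile contrast stretch is a stand-in for the pipeline's contrast normalization, not its actual algorithm.

```python
# Minimal anaglyph sketch: one image of a registered stereo pair fills the red
# channel, the other fills green and blue. Input names and the stretch are
# illustrative assumptions.
import numpy as np

def make_anaglyph(left, right):
    """left, right: co-registered 2-D grayscale arrays (same shape), already
    map-projected and aligned so parallax remains only in the x direction."""
    def stretch(img):
        lo, hi = np.percentile(img, (1, 99))   # clip extremes to balance brightness
        return np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    l, r = stretch(left), stretch(right)
    return np.dstack([l, r, r])                # R from left image, G and B from right

left = np.random.rand(256, 256)
right = np.random.rand(256, 256)
rgb = make_anaglyph(left, right)
print(rgb.shape)  # (256, 256, 3)
```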

Mattson, S.; McEwen, A. S.; Speyerer, E.; Robinson, M. S.

2012-12-01

8

Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller-scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degrees) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degrees. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degrees. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ~100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 terabytes (TB) of raw data.
Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than for any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.

Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

9

Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses, repeat global coverage will be obtained as lighting conditions change, providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale, providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews, a collection of stereo pairs is being acquired that enables 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies, including, for example, lunar pyroclastics, domes, and tectonics, are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase.
[1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165;

Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

10

Photometric parameter maps of the Moon derived from LROC WAC images  

NASA Astrophysics Data System (ADS)

Spatially resolved photometric parameter maps were computed from 21 months of Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) images. Due to its 60° field-of-view (FOV), the WAC achieves nearly global coverage of the Moon each month with more than 50% overlap from orbit-to-orbit. From the repeat observations at various viewing and illumination geometries, we calculated Hapke bidirectional reflectance model parameters [1] for 1° x 1° "tiles" from 70°N to 70°S and 0°E to 360°E. About 66,000 WAC images acquired from February 2010 to October 2011 were converted from DN to radiance factor (I/F) through radiometric calibration, partitioned into gridded tiles, and stacked in a time series (tile-by-tile method [2]). Lighting geometries (phase, incidence, emission) were computed using the WAC digital terrain model (100 m/pixel) [3]. The Hapke parameters were obtained by model fitting against I/F within each tile. Among the 9 parameters of the Hapke model, we calculated 3 free parameters (w, b, and hs) by setting constant values for 4 parameters (Bco=0, hc=1, θ̄, φ=0) and interpolating 2 parameters (c, Bso). In this simplification, we ignored the Coherent Backscatter Opposition Effect (CBOE) to avoid competing CBOE and Shadow Hiding Opposition Effect (SHOE) terms. We also assumed that surface regolith porosity is uniform across the Moon. The roughness parameter (θ̄) was set to an averaged value from the equator (±3°N). The Henyey-Greenstein double lobe function (H-G2) parameter (c) was given by the 'hockey stick' relation [4] (negative correlation) between b and c based on laboratory measurements. The amplitude of SHOE (Bso) was given by the correlation between w and Bso at the equator (±3°N). Single scattering albedo (w) is strongly correlated with the photometrically normalized I/F, as expected. The parameter c shows an inverse trend relative to b due to the 'hockey stick' relation. The parameter c is typically low for the maria (0.08±0.06) relative to the highlands (0.47±0.16). Since c controls the fraction of backward/forward scattering in H-G2, lower c for the maria indicates more forward scattering relative to the highlands. This trend is opposite to what was expected because darker particles are usually more backscattering. However, the lower albedo of the maria is due to the higher abundance of ilmenite, which is an opaque mineral that scatters all of the light by specular reflection from its surface. If their surface facets are relatively smooth, the ilmenite particles will be forward scattering. Other factors (e.g. grain shape, grain size, porosity, maturity) besides the mineralogy might also be affecting c. The angular width of SHOE (hs) typically shows lower values (0.047±0.02) for the maria relative to the highlands (0.074±0.025). An increase in hs for the maria theoretically suggests lower porosity or a narrower grain size distribution [1], but the link between actual materials and hs is not well constrained. Further experiments using both laboratory and spacecraft observations will help to unravel the photometric properties of the surface materials of the Moon. [1] Hapke, B.: Cambridge Univ. Press, 2012. [2] Sato, H. et al.: 42nd LPSC, abstract #1974, 2011. [3] Scholten, F. et al.: JGR, 117, E00H17, 2012. [4] Hapke, B.: Icarus, 221(2), 1079-1083, 2012.
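The tile-by-tile fitting workflow can be illustrated with a short sketch. The reflectance function below is a deliberately simplified stand-in, not the full Hapke (2012) model used by the authors; the free parameters, starting values, and bounds are assumptions for illustration only.

```python
# Illustrative per-tile fit: repeat observations of I/F at varying geometry are
# fit by nonlinear least squares for three free parameters (here w, b, hs).
# `simplified_model` is a placeholder, NOT the actual Hapke bidirectional
# reflectance function.
import numpy as np
from scipy.optimize import least_squares

def simplified_model(params, phase, incidence, emission):
    w, b, hs = params
    mu0, mu = np.cos(np.radians(incidence)), np.cos(np.radians(emission))
    opposition = 1.0 + 1.0 / (1.0 + np.tan(np.radians(phase) / 2.0) / hs)  # SHOE-like surge
    phase_fn = 1.0 + b * np.cos(np.radians(phase))                         # crude scattering lobe
    return (w / 4.0) * (mu0 / (mu0 + mu)) * phase_fn * opposition

def fit_tile(i_over_f, phase, incidence, emission):
    """i_over_f and the geometry arrays hold all observations of one tile."""
    resid = lambda p: simplified_model(p, phase, incidence, emission) - i_over_f
    x0 = np.array([0.3, 0.3, 0.06])                    # starting guess: w, b, hs
    sol = least_squares(resid, x0, bounds=([0, 0, 1e-3], [1, 1, 1]))
    return sol.x

# Fake observations for one tile, generated from the same placeholder model
g = np.random.uniform(10, 90, 200)                     # phase angles (deg)
inc = np.random.uniform(20, 70, 200)
emi = np.random.uniform(0, 30, 200)
obs = simplified_model([0.4, 0.25, 0.05], g, inc, emi) + np.random.normal(0, 0.002, 200)
print(fit_tile(obs, g, inc, emi))
```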

Sato, H.; Robinson, M. S.; Hapke, B. W.; Denevi, B. W.; Boyd, A. K.

2013-12-01

11

Depths, Diameters, and Profiles of Small Lunar Craters From LROC NAC Stereo Images  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images (pixel scale ~0.5 m) provide new 3-D views of small craters (40 m < D < 200 m). We extracted topographic profiles from 85 of these craters in mare and highland terrains between 18.1-19.1°N and 5.2-5.4°E to investigate relationships among crater shape, age, and target. Obvious secondary craters (e.g., clustered) and moderately- to heavily-degraded craters were excluded. The freshest craters included in the study have crisp rims, bright ejecta, and no superposed craters. The depth, diameter, and profiles of each crater were determined from a NAC-derived DTM (M119808916/M119815703) tied to LOLA topography with better than 1 m vertical resolution (see [1]). Depth/diameter ratios for the selected craters are generally between 0.12 and 0.2. Crater profiles were classified into one of 3 categories: V-shaped, U-shaped, or intermediate (craters on steep slopes were excluded). Craters were then morphologically classified according to [2], where crater shape is determined by changes in material strength between subsurface layers, resulting in bowl-shaped, flat-bottomed, concentric, or central-mound crater forms. In this study, craters with U-shaped profiles tend to be small (<60 m) and flat-bottomed, while V-shaped craters have steep slopes (~20°), little to no floor, and a range of diameters. Both fresh and relatively degraded craters display the full range of profile shapes (from U to V and all stages in between). We found it difficult to differentiate U-shaped craters from V-shaped craters without the DTM, and we saw no clear correlation between morphologic and profile classification. Further study is still needed to increase our crater statistics and expand on the relatively small population of craters included here. For the craters in this study, we found that block abundances correlate with relative crater degradation state as defined by [3], where abundant blocks signal fresher craters; however, block abundances do not correlate with U- or V-shaped profiles. The craters examined here show that profile shape cannot be used to determine the relative age or degradation state as might be inferred from [4, for example]. The observed variability in crater profiles may be explained by local variations in regolith thickness [e.g., 2, 5], impactor velocity, and/or possibly bolide density. Ongoing efforts will quantify the possible effects of solitary secondary craters and investigate whether or not depth/diameter ratios and crater profiles vary between different regions of the Moon (thick vs thin regolith, highlands vs mare, and old vs young mare). References: [1] Tran T. et al. (2010) LPSC XLI, Abstract 2515. [2] Quaide W. L. and V. R. Oberbeck (1968) JGR, 73: 5247-5270. [3] Basilevsky A. T. (1976) Proc LPSC 7th, p. 1005-1020. [4] Soderblom L. A. and L. A. Lebofsky (1972) JGR, 77: 279-296. [5] Wilcox B. B. et al. (2005) Met. Planet. Sci., 40: 695-710.
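The basic depth/diameter measurement described above can be sketched with a gridded DTM and a rim-to-rim profile. The array, pixel scale, and rim-picking rule below are illustrative assumptions, not the authors' procedure.

```python
# A minimal sketch: extract an elevation profile across a small crater from a
# (hypothetical) NAC-derived DTM and compute depth and depth/diameter ratio.
import numpy as np

def crater_profile(dtm, row, col_start, col_end, pixel_scale_m=2.0):
    """Extract an east-west elevation profile through a crater on a given row."""
    profile = dtm[row, col_start:col_end].astype(float)
    x = np.arange(profile.size) * pixel_scale_m
    return x, profile

def depth_and_ratio(x, profile):
    rim_elev = max(profile[0], profile[-1])     # crude rim estimate from profile ends
    floor_elev = profile.min()
    depth = rim_elev - floor_elev
    diameter = x[-1] - x[0]                     # assumes the profile spans rim to rim
    return depth, depth / diameter

dtm = np.random.rand(1000, 1000) * 5.0          # synthetic stand-in for a DTM (meters)
x, prof = crater_profile(dtm, row=500, col_start=400, col_end=450)
d, ratio = depth_and_ratio(x, prof)
print(f"depth = {d:.1f} m, d/D = {ratio:.2f}")
```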

Stopar, J. D.; Robinson, M.; Barnouin, O. S.; Tran, T.

2010-12-01

12

LROC assessment of non-linear filtering methods in Ga-67 SPECT imaging  

NASA Astrophysics Data System (ADS)

In emission tomography, iterative reconstruction is usually followed by a linear smoothing filter to make such images more appropriate for visual inspection and diagnosis by a physician. This will result in a global blurring of the images, smoothing across edges and possibly discarding valuable image information for detection tasks. The purpose of this study is to investigate what possible advantages a non-linear, edge-preserving postfilter could have for lesion detection in Ga-67 SPECT imaging. Image quality can be defined based on the task that has to be performed on the image. This study used LROC observer studies based on a dataset created by CPU-intensive GATE Monte Carlo simulations of a voxelized digital phantom. The filters considered in this study were a linear Gaussian filter, a bilateral filter, the Perona-Malik anisotropic diffusion filter, and the Catté filtering scheme. The 3D MCAT software phantom was used to simulate the distribution of Ga-67 citrate in the abdomen. Tumor-present cases had a 1-cm diameter tumor randomly placed near the edges of the anatomical boundaries of the kidneys, bone, liver, and spleen. Our data set was generated out of a single noisy background simulation using the bootstrap method, to significantly reduce the simulation time and to allow for a larger observer data set. Lesions were simulated separately and added to the background afterwards. These were then reconstructed with an iterative approach, using a sufficiently large number of MLEM iterations to establish convergence. The output of a numerical observer was used in a simplex optimization method to estimate an optimal set of parameters for each postfilter. No significant improvement was found for using edge-preserving filtering techniques over standard linear Gaussian filtering.
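Of the filters compared above, the Perona-Malik scheme is the simplest to sketch. The following is a generic 2-D implementation with illustrative parameter values, not the configuration used in this study.

```python
# Generic 2-D Perona-Malik anisotropic diffusion: conduction is suppressed
# across strong gradients, so edges are preserved while flat regions smooth.
# kappa, iterations, and step size are illustrative; boundaries wrap (np.roll)
# for brevity.
import numpy as np

def perona_malik(img, n_iter=20, kappa=30.0, step=0.2):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite differences toward the four neighbors
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # conduction coefficients: near 1 in flat regions, small across edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += step * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

noisy = np.random.poisson(50, (128, 128)).astype(float)   # synthetic noisy slice
smoothed = perona_malik(noisy)
print(smoothed.shape)
```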

De Clercq, Stijn; Staelens, Steven; De Beenhouwer, Jan; D'Asseler, Yves; Lemahieu, Ignace

2006-03-01

13

Nonparametric ROC and LROC analysis  

SciTech Connect

In this paper we review several results of the nonparametric receiver operating characteristic (ROC) analysis and present an extension to the nonparametric localization ROC (LROC) analysis. Equations for the estimation of the area under the characteristic curve and for the variance calculations are derived. Expressions for the choice of the optimal ratio between the number of signal-absent and signal-present image samples are also presented. The results can be applied with both continuous and discrete scoring scales. The simulation studies carried out validate the theoretical derivations and show that the LROC analysis is considerably more sensitive than the ROC analysis.
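The flavor of a nonparametric LROC area estimate can be conveyed with a short sketch: a Wilcoxon-style pairwise comparison in which a signal-present case contributes only if its lesion was also correctly localized. This is a generic formulation for illustration, not the specific estimator or variance expressions derived in the paper.

```python
# Generic nonparametric area-under-the-LROC-curve estimate (illustrative only):
# average, over all (signal-present, signal-absent) pairs, of an indicator that
# the present case scored higher AND was correctly localized.
import numpy as np

def lroc_area(present_scores, present_correct_loc, absent_scores):
    """present_scores: ratings for signal-present images
    present_correct_loc: booleans, True if the lesion was localized correctly
    absent_scores: ratings for signal-absent images"""
    p = np.asarray(present_scores, float)[:, None]
    a = np.asarray(absent_scores, float)[None, :]
    wins = (p > a) + 0.5 * (p == a)                          # Wilcoxon comparison matrix
    wins *= np.asarray(present_correct_loc, bool)[:, None]   # require correct localization
    return wins.mean()

area = lroc_area([2.1, 3.4, 1.9, 4.0], [True, True, False, True], [1.5, 2.0, 2.7])
print(round(area, 3))
```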

Popescu, Lucretiu M. [University of Pennsylvania, Department of Radiology, 423 Guardian Drive, 4th floor Blockley Hall, Philadelphia, PA 19104-6021 (United States)

2007-05-15

14

Mapping the Apollo 17 landing site area based on Lunar Reconnaissance Orbiter Camera images and Apollo surface photography  

NASA Astrophysics Data System (ADS)

Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
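The bearing-based position solve described above can be viewed as a small nonlinear least-squares problem: angular directions to features identified in both the panorama and the NAC orthoimage are fit for the camera position and the panorama's azimuth offset. The coordinates, noise level, and angle convention below are illustrative assumptions, not the authors' adjustment code.

```python
# Illustrative 2-D position fix from measured bearings to known features.
import numpy as np
from scipy.optimize import least_squares

def residuals(params, feature_xy, measured_az):
    x, y, az0 = params                              # camera position and panorama azimuth offset
    dx, dy = feature_xy[:, 0] - x, feature_xy[:, 1] - y
    predicted = np.degrees(np.arctan2(dx, dy))      # azimuth convention: angle from +y axis
    diff = (predicted - (measured_az + az0) + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    return diff

features = np.array([[50.0, 120.0], [-80.0, 60.0], [20.0, -90.0], [110.0, -30.0]])  # meters
true_cam = np.array([5.0, -3.0])
true_az = np.degrees(np.arctan2(features[:, 0] - true_cam[0], features[:, 1] - true_cam[1]))
measured = true_az - 12.0 + np.random.normal(0, 0.2, 4)   # panorama zero offset by 12 degrees

sol = least_squares(residuals, x0=[0.0, 0.0, 0.0], args=(features, measured))
print(sol.x)   # recovered x, y, and azimuth offset (approximately 5, -3, 12)
```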

Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.

2012-05-01

15

Spacecraft camera image registration  

NASA Technical Reports Server (NTRS)

A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

1987-01-01

16

Combined collimator/reconstruction optimization for myocardial perfusion SPECT imaging using polar map-based LROC numerical observer  

NASA Astrophysics Data System (ADS)

Polar maps have been used to assist clinicians diagnose coronary artery disease (CAD) in single photon emission computed tomography (SPECT) myocardial perfusion imaging. Herein, we investigate the optimization of collimator design for perfusion defect detection in SPECT imaging when reconstruction includes modeling of the collimator. The optimization employs an LROC clinical model observer (CMO), which emulates the clinical task of polar map detection of CAD. By utilizing a CMO, which better mimics the clinical perfusion-defect detection task than previous SKE-based observers, our objective is to optimize collimator design for SPECT myocardial perfusion imaging when reconstruction includes compensation for collimator spatial resolution. Comparison of lesion detection accuracy will then be employed to determine if a lower spatial resolution, hence higher sensitivity, collimator design than currently recommended could be utilized to reduce the radiation dose to the patient, the imaging time, or a combination of both. As the first step in this investigation, we report herein on the optimization of the three-dimensional (3D) post-reconstruction Gaussian filtering of, and the number of iterations used to reconstruct, the SPECT slices of projections acquired by a low-energy general-purpose (LEGP) collimator. The optimization was in terms of detection accuracy as determined by our CMO and four human observers. Both the human observers and all four CMO variants agreed that the optimal post-filtering was with a sigma of the Gaussian in the range of 0.75 to 1.0 pixels. In terms of the number of iterations, the human observers showed a preference for 5 iterations; however, only one of the variants of the CMO agreed with this selection. The others showed a preference for 15 iterations. We shall thus proceed to optimize the reconstruction parameters for even higher sensitivity collimators using this CMO, and then do the final comparison between collimators using their individually optimized parameters with human observers and three times as many test images to reduce the statistical variation seen in our present results.

Konate, Souleymane; Pretorius, P. Hendrik; Gifford, Howard C.; O'Connor, J. Michael; Konik, Arda; Shazeeb, Mohammed Salman; King, Michael A.

2012-02-01

17

Mapping the Apollo 17 Astronauts' Positions Based on LROC Data and Apollo Surface Photography  

NASA Astrophysics Data System (ADS)

The positions from where the Apollo 17 astronauts recorded panoramic image series, e.g. at the so-called "traverse stations", were precisely determined using ortho-images (0.5 m/pxl) as well as Digital Terrain Models (DTM) (1.5 m/pxl and 100 m/pxl) derived from Lunar Reconnaissance Orbiter Camera (LROC) data. Features imaged in the Apollo panoramas were identified in LROC ortho-images. Least-squares techniques were applied to angles measured in the panoramas to determine the astronaut's position to within the ortho-image pixel. The result of our investigation of Traverse Station 1 in the north-west of Steno Crater is presented.

Haase, I.; Oberst, J.; Scholten, F.; Gläser, P.; Wählisch, M.; Robinson, M. S.

2011-10-01

18

Multiframing image converter camera  

NASA Astrophysics Data System (ADS)

In this paper we describe a multiframe camera based on the PV-001 image tube. This tube is well suited to short frame exposures due to the availability of a high-frequency shutter system with an image-shift compensation facility, as well as a pair of high-frequency deflection systems. The developed camera provides a recording speed of 164 million frames/s with a minimum exposure time of about 3 ns. The number of frames can be set between 6 and 16. Dynamic spatial resolution is not less than 5 line pairs/mm.

Frontov, H. N.; Serdyuchenko, Yury N.

1991-04-01

19

Stardust Imaging Camera  

NASA Astrophysics Data System (ADS)

The Stardust spacecraft is carrying an Imaging Camera, primarily for 81P/Wild 2 approach navigation, but the camera also will provide valuable data for P/Wild 2 nucleus characterization and for the study of its activity. During an engineering readiness test, the camera already has provided useful scientific data about an asteroid, 5535 Annefrank. The camera utilizes spare parts from Cassini and Voyager as well as some inherited designs to provide an excellent, low-cost imaging system with a 1024 × 1024 pixel CCD detector and an f/3.5 lens with a 202 mm focal length. These yield a 3.5° × 3.5° field of view and, potentially, 12 arc second resolution. Spacecraft roll combined with a movable mirror permits safe autonomous body tracking. This capability was proven on Annefrank and will be used to acquire the desired data on P/Wild 2, with the spacecraft protected by a dust shield (a Whipple bumper) during that flyby. Details of the instrumental design and problems associated with its use make up the bulk of the paper.

Newburn, Ray L.; Bhaskaran, Shyam; Duxbury, Thomas C.; Fraschetti, George; Radey, Tom; Schwochert, Mark

2003-10-01

20

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, En > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.

Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

2008-01-01

21

Neutron imaging camera  

NASA Astrophysics Data System (ADS)

The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, ~0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, En > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC from laboratory tests is presented.

Hunter, S. D.; de Nolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

2008-05-01

22

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

2010-01-01

23

Computer-Assisted Detection of Collapse Pits in LROC NAC Images  

NASA Astrophysics Data System (ADS)

Pits in mare basalts and impact melt deposits provide unique environments for human shelters and preservation of geologic information. Due to their steep walls, pits are most distinguishable when the Sun is high (pit walls are casting shadows and impact crater walls are not). Because of the large number of NAC images acquired every day (>350), each typically with 5000 samples and 52,224 lines, it is not feasible to carefully search each image manually, so we developed a shadow detection algorithm (Pitscan) which analyzes an image in thirty seconds. It locates blocks of pixels that are below a digital number (DN) cutoff value, indicating that the block of pixels is "in shadow", and then runs a DN profile in the direction of solar lighting, comparing average DN values of the up-Sun and down-Sun sides. If the up-Sun average DN is higher than the down-Sun average, the shadow is assumed to be from a positive relief feature, and ignored. Otherwise, Pitscan saves a 200 x 200 pixel sub-image for later manual review. The algorithm currently generates ~150 false positives for each successful pit identification. This number would be unacceptable for an algorithm designed to catalog a common feature, but since the logic is merely intended to assist humans in locating an unusual type of feature, the false alarm rate is acceptable, and the current version allows a human to effectively check 10,000 NAC images for pits (over 2500 gigapixels) per hour. The false negative rate is not yet known; however, Pitscan detected every pit in a test on a small subset of the images known to contain pits. Pitscan is only effective when the Sun is within 50° of the zenith. When the Sun is closer to the horizon, crater walls often cast shadows, resulting in unacceptable numbers of false positives. Due to the Sun angle limit, only regions within 50° latitude of the equator are searchable. To date, 25.42% of the Moon has been imaged within this constraint. Early versions of Pitscan found more than 150 small (average diameter 15 m) pits in impact melt deposits of Copernican craters [1]. More recently, improvements to the algorithm revealed two new large mare pits, similar to the three pits discovered in Kaguya images [2]. One is in Schlüter crater, a mare-filled crater near Orientale basin, with a 20 x 40 m opening, approximately 60 m deep. The second new pit is in Lacus Mortis (44.96°N, 25.61°E), in a tectonically complex region west of Burg crater. This pit is the largest mare pit found to date, with an opening approximately 100 x 150 m, and a floor more than 90 m below the surrounding terrain. Most interesting from an exploration point of view is the fact that the east wall appears to have collapsed, leaving a relatively smooth ~22° slope from the surrounding mare down to the pit floor. Computer-assisted feature detection is an effective method of locating rare features in the extremely large high-resolution NAC dataset. Pitscan enabled the discovery of unknown collapse pits both in the mare and highlands. These pits are an important resource for future surface exploration, both by providing access to pristine cross-sections of the near-surface and by providing radiation and micrometeorite shielding for human outposts. [1] Wagner, R.V. et al. (2012), LPSC XLIII, #2266 [2] Haruyama, J. et al. (2010), LPSC XLI, #1285
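The up-Sun/down-Sun shadow test described above can be sketched in a few lines. This is a simplified illustration, not the actual Pitscan code; the block size, DN cutoff, and margin width are assumed values.

```python
# Simplified shadow screening: flag pixel blocks darker than a DN cutoff, then
# compare mean DN on the up-Sun and down-Sun sides. Shadows that are brighter
# on the up-Sun side are treated as cast by positive relief and ignored; the
# rest are kept as candidate pits for manual review.
import numpy as np

def candidate_pits(image, sun_from_left, dn_cutoff=20, block=50, margin=30):
    """image: 2-D DN array; sun_from_left: True if solar azimuth is toward +x."""
    hits = []
    for r in range(0, image.shape[0] - block, block):
        for c in range(margin, image.shape[1] - block - margin, block):
            patch = image[r:r + block, c:c + block]
            if patch.mean() >= dn_cutoff:
                continue                                   # block is not in shadow
            left = image[r:r + block, c - margin:c].mean()
            right = image[r:r + block, c + block:c + block + margin].mean()
            up_sun, down_sun = (left, right) if sun_from_left else (right, left)
            if up_sun < down_sun:                          # not a positive-relief shadow
                hits.append((r, c))                        # save for manual review
    return hits

img = np.full((600, 600), 100.0)
img[200:250, 330:380] = 5.0          # shadowed block (candidate pit interior)
img[200:250, 380:410] = 180.0        # brighter down-Sun side, e.g. a sunlit far wall
print(candidate_pits(img, sun_from_left=True))   # expect [(200, 330)]
```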

Wagner, R. V.; Robinson, M. S.

2012-12-01

24

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution. 3-D tracking of charged particles. The incident direction of fast neutrons, E(sub N) > 0.5 MeV. arc reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

25

Camera motion and mobile imaging  

NASA Astrophysics Data System (ADS)

Due to the demanding size and cost constraints of camera phones, the mobile imaging industry needs to address several key challenges in order to achieve the quality of a digital still camera. Minimizing camera-motion introduced image blur is one of them. Film photographers have long used a rule-of-thumb that a hand held 35mm format film camera should have an exposure in seconds that is not longer than the inverse of the focal length in millimeters. Due to the lack of scientific studies on camera-motion, it is still an open question how to generalize this rule-of-thumb to digital still cameras as well as camera phones. In this paper, we first propose a generalized rule-of-thumb with the original rule-of-thumb as a special case when camera-motion can be approximated by a linear motion at 1.667 °/sec. We then use a gyroscope-based system to measure camera-motion patterns for two camera phones (one held with one hand and the other held in two hands) and one digital still camera. The results show that effective camera-motion function can be approximated very well by a linear function for exposure durations less than 100ms. While the effective camera-motion speed for camera phones (5.95 °/sec and 4.39 °/sec respectively) is significantly higher than that of digital still cameras (2.18 °/sec), it was found that holding a camera phone with two hands while taking pictures does reduce the amount of camera motion. It was also found that camera-motion not only varies significantly across subjects but also across captures for the same subject. Since camera phones have significantly higher motion and longer exposure durations than 35mm format film cameras and most digital still cameras, it is expected that many of the pictures taken by camera phones today will not meet the sharpness criteria used in 35mm film print. The mobile imaging industry is aggressively pursuing a smaller and smaller pixel size in order to meet the digital still camera's performance in terms of total pixels while retaining the small size needed for the mobile industry. This makes it increasingly more important to address the camera-motion challenge associated with smaller pixel size.
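The generalized rule of thumb can be expressed as a simple blur budget: for linear camera motion, image blur in pixels is roughly angular rate times exposure times focal length divided by pixel pitch, so solving for exposure gives a per-camera shutter limit. The sketch below uses the motion rates reported above, but the blur budget, focal lengths, and pixel pitches are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of a shutter-speed limit from a linear camera-motion model.
import math

def max_exposure_s(focal_length_mm, pixel_pitch_um, motion_deg_per_s, blur_budget_px=1.0):
    """Longest exposure keeping motion blur within blur_budget_px pixels."""
    blur_rate_px_per_s = (math.radians(motion_deg_per_s) * focal_length_mm) / (pixel_pitch_um * 1e-3)
    return blur_budget_px / blur_rate_px_per_s

# Camera phone held in one hand (about 6 deg/s per the study), assumed 4 mm lens, 1.4 um pixels:
print(f"{max_exposure_s(4.0, 1.4, 5.95) * 1000:.1f} ms")
# Digital still camera (about 2.2 deg/s), assumed 50 mm lens, 6 um pixels:
print(f"{max_exposure_s(50.0, 6.0, 2.18) * 1000:.1f} ms")
```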

Xiao, Feng; Pincenti, John; John, George; Johnson, Kevin

2007-03-01

26

Sweeping Image Mirror Camera.  

National Technical Information Service (NTIS)

The camera is for photographing high speed events, and is especially useful for the scientific study of nuclear phenomena. The camera housing encloses a system of two curved mirrors for reflecting light. This light emanates or is reflected from an object ...

T. Anderson

1965-01-01

27

Full Stokes polarization imaging camera  

NASA Astrophysics Data System (ADS)

Objective and background: We present a new version of Bossa Nova Technologies' passive polarization imaging camera. The previous version performed live measurement of the linear Stokes parameters (S0, S1, S2) and their derivatives. The new version presented in this paper performs live measurement of the full Stokes parameters, i.e. including the fourth parameter S3, related to the amount of circular polarization. Dedicated software was developed to provide live images of any Stokes-related parameters such as the Degree Of Linear Polarization (DOLP), the Degree Of Circular Polarization (DOCP), and the Angle Of Polarization (AOP). Results: We first give a brief description of the camera and its technology. It is a Division Of Time polarimeter using a custom ferroelectric liquid crystal cell. A description of the method used to calculate the Data Reduction Matrix (DRM) [5,9] linking intensity measurements and the Stokes parameters is given. The calibration was developed in order to maximize the condition number of the DRM. It also allows very efficient post-processing of the acquired images. A complete evaluation of the precision of standard polarization parameters is described. We further present the standard features of the dedicated software that was developed to operate the camera. It provides live images of the Stokes vector components and the usual associated parameters. Finally, some tests already conducted are presented, including indoor laboratory and outdoor measurements. This new camera will be a useful tool for many applications such as biomedical imaging, remote sensing, metrology, material studies, and others.

Vedel, M.; Breugnot, S.; Lechocinski, N.

2011-09-01

28

Digital Elevation Models and Derived Products from Lroc Nac Stereo Observations  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to acquire stereo observations with the Narrow Angle Camera (NAC) to enable production of high resolution digital elevation models (DEMs). This work describes the processes and techniques used in reducing the NAC stereo observations to DEMs through a combination of the USGS Integrated Software for Imagers and Spectrometers (ISIS) and SOCET SET® from BAE Systems by a team at Arizona State University (ASU). LROC Science Operations Center personnel have thus far reduced more than 130 stereo observations to DEMs for 11 Constellation Program (CxP) sites and 53 other regions of scientific interest. The NAC DEM spatial sampling is typically 2 meters, and the vertical precision is 1-2 meters. Such high resolution provides the three-dimensional view of the lunar surface required for site selection, hazard avoidance and planning traverses that minimize resource consumption. In addition to exploration analysis, geologists can measure parameters such as elevation, slope, and volume to place constraints on composition and geologic history. The NAC DEMs are released and archived through NASA's Planetary Data System.

Burns, K. N.; Speyerer, E. J.; Robinson, M. S.; Tran, T.; Rosiek, M. R.; Archinal, B. A.; Howington-Kraus, E.; the LROC Science Team

2012-08-01

29

Multispectral image dissector camera system  

NASA Technical Reports Server (NTRS)

A sensor system which provides registered high-resolution multispectral images from a single sensor with no mechanical moving parts is reported, and the operation of an image dissector camera (IDC) is described. An earth scene 100 nautical miles wide is imaged through a single lens onto a photocathode surface containing three spectral filters, thereby producing three separate spectral signatures on the photocathode surface. An electron image is formed, accelerated, focused, and electromagnetically deflected across an image plane which contains three sampling apertures, behind which are located three electron multipliers. The IDC system uses electromagnetic deflection for cross-track scanning and spacecraft orbit motion for along-track scanning, thus eliminating the need for a mechanical scanning mirror.

Johnson, B. L.

1972-01-01

30

Size Frequency Distributions of Blocks on Lunar Volcanic Landforms: Results from LROC  

NASA Astrophysics Data System (ADS)

We present the results from a comprehensive effort to obtain detailed size-frequency distributions for blocks associated with volcanic landforms by digitizing blocks from LROC NAC images and discuss implications for understanding the geology of the Marius Hills region.

Lawrence, S. J.; Denevi, B. W.; Hawke, B. R.; Giguere, T. A.; Robinson, M. S.; Stopar, J. D.; Tran, T.; Jolliff, B. L.; Bussey, D. B. J.; Stelling, R.

2011-03-01

31

Size Frequency Distributions of Blocks on Lunar Volcanic Landforms: Results from LROC  

Microsoft Academic Search

We present the results from a comprehensive effort to obtain detailed size-frequency distributions for blocks associated with volcanic landforms by digitizing blocks from LROC NAC images and discuss implications for understanding the geology of the Marius Hills region.

S. J. Lawrence; B. W. Denevi; B. R. Hawke; T. A. Giguere; M. S. Robinson; J. D. Stopar; T. Tran; B. L. Jolliff; D. B. J. Bussey; R. Stelling

2011-01-01

32

Intelligent thermal imaging camera with network interface  

NASA Astrophysics Data System (ADS)

In recent years, a significant increase in the usage of thermal imaging cameras can be observed in both the public and commercial sectors, due to the lower cost and expanding availability of uncooled microbolometer infrared radiation detectors. Devices present on the market vary in their parameters and output interfaces. However, all these thermographic cameras are only a source of an image, which is then analyzed in an external image processing unit. There is no possibility to run the user's dedicated image processing algorithms on the thermal imaging camera itself. This paper presents a concept of realization, architecture and hardware implementation of an "Intelligent thermal imaging camera with network interface" utilizing modern technologies, standards and approaches in one single device.

Sielewicz, Krzysztof M.; Kasprowicz, Grzegorz; Poźniak, Krzysztof T.; Romaniuk, R. S.

2011-06-01

33

Image dissector camera system study  

NASA Technical Reports Server (NTRS)

Various aspects of a rendezvous and docking system using an image dissector detector as compared to a GaAs detector were discussed. Investigation into a gimbled scanning system is also covered and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.

Howell, L.

1984-01-01

34

Image formation in fundus cameras.  

PubMed

Imaging in a fundus camera depends more on the design of the system than on correction of the first fundus image as formed by the ophthalmoscopic lens. We show here that the designer may use the free parameters of the ophthalmoscopic lens (contact or noncontact) to correct the latter for observation and illumination of the fundus. In both contact and noncontact systems the fundus is illuminated by forming a ring of light on the patient's cornea around a central area (the corneal window) reserved for observation. On the first surface of the crystalline lens, the light also forms a ring which must accommodate the total entrance pupil (TEP) of the observation system in its middle and which is limited on the outside by the patient's iris. The restrictions that result from this situation define the entrance pupil of the bundle of rays that image the marginal point of the retina. The limits of this bundle are imposed by the choice of the angular field of view and by the size of the patient's pupil. PMID:447462

Pomerantzeff, O; Webb, R H; Delori, F C

1979-06-01

35

Exploring the Moon with the Lunar Reconnaissance Orbiter Camera  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) consists of three imaging systems: a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). Since entering lunar orbit in June of 2009, LROC has collected over 700,000 images. A subset of WAC images was reduced into a global morphologic basemap, a near-global digital elevation model, and multitemporal movie sequences that characterize illumination conditions of the polar regions. In addition, NAC observations were reduced to meter scale maps and digital elevation models of select regions of interest. These Reduced Data Record (RDR) products were publicly released through NASA's Planetary Data System to aid scientists and engineers in planning future lunar missions and addressing key science questions.

Robinson, M. S.; Speyerer, E. J.; Boyd, A.; Waller, D.; Wagner, R. V.; Burns, K. N.

2012-08-01

36

Morphology and Composition of Localized Lunar Dark Mantle Deposits With LROC Data  

NASA Astrophysics Data System (ADS)

Clementine color (ultraviolet, visible or UVVIS) and Lunar Reconnaissance Orbiter (LRO) Wide Angle (WAC) and Narrow Angle (NAC) camera data provide the means to investigate localized lunar dark-mantle deposits (DMDs) of potential pyroclastic origin. Our goals are to (1) examine the morphology and physical characteristics of these deposits with LROC WAC and NAC data; (2) extend methods used in earlier studies of lunar DMDs with Clementine spectral reflectance (CSR) data; (3) use LRO WAC multispectral data to complement and extend the CSR data for compositional analyses; and (4) apply these results to identify the likely mode of emplacement and study the diversity of compositions among these deposits. Pyroclastic deposits have been recognized all across the Moon, identified by their low albedo, smooth texture, and mantling relationship to underlying features. Gaddis et al. (2003) presented a compositional analysis of 75 potential lunar pyroclastic deposits (LPDs) based on CSR measurements. New LRO camera (LROC) data permit more extensive analyses of such deposits than previously possible. Our study began with six sites on the southeastern limb of the Moon that contain nine of the cataloged 75 potential pyroclastic deposits: Humboldt (4 deposits), Petavius, Barnard, Abel B, Abel C, and Titius. Our analysis found that some of the DMDs exhibit qualities characteristic of fluid emplacement, such as flat surfaces, sharp margins, embaying relationships, and flow textures. We conclude that the localized DMDs are a complex class of features, many of which may have formed by a combination of effusive and pyroclastic emplacement mechanisms. We have extended this analysis to include additional localized DMDs from the catalog of 75 potential pyroclastic deposits. We have examined high resolution (up to 0.5 m/p) NAC images as they become available to assess the mode of emplacement of the deposits, locate potential volcanic vents, and assess physical characteristics of the DMDs such as thickness, roughness, and rock abundance. Within and around each DMD, the Clementine UVVIS multispectral mosaic (100 m/p, 5 bands at 415, 750, 900, 950, and 1000 nm) and LROC WAC multispectral image cubes (75 to 400 m/p, 7 bands at 320, 360, 415, 565, 605, 645, and 690 nm) have been used to extract spectral reflectance data. Spectral ratio plots were prepared to compare deposits and draw conclusions regarding compositional differences, such as mafic mineral or titanium content and distribution, both within and between DMDs. The result of the study will be an improved classification of these deposits in terms of emplacement mechanisms and composition, including identifying compositional affinities among DMDs and between DMDs and other volcanic deposits.
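
As a rough illustration of the spectral-ratio comparison described above, the sketch below computes two commonly used lunar band ratios from hypothetical Clementine UVVIS reflectance samples; it is not the authors' processing chain, and the numbers are invented for illustration.

```python
import numpy as np

# Minimal sketch of a spectral-ratio comparison of the kind described above,
# assuming co-registered Clementine UVVIS band mosaics have already been
# sampled over a dark mantle deposit. The reflectance arrays are hypothetical.
r415 = np.array([0.061, 0.058, 0.064, 0.060])   # 415 nm samples over a DMD
r750 = np.array([0.102, 0.097, 0.108, 0.101])   # 750 nm samples
r950 = np.array([0.098, 0.094, 0.104, 0.097])   # 950 nm samples

# Commonly used lunar spectral parameters: the 415/750 (UV/VIS) ratio is
# sensitive to titanium content, and 950/750 tracks mafic 1-micron band depth.
uv_vis = r415 / r750
mafic = r950 / r750
print(f"mean 415/750 = {uv_vis.mean():.3f}, mean 950/750 = {mafic.mean():.3f}")
```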

Gustafson, O.; Bell, J. F.; Gaddis, L. R.; Hawke, B. R.; Robinson, M. S.; LROC Science Team

2010-12-01

37

On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.  

NASA Astrophysics Data System (ADS)

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points such as the example for Mars [2] and, more recently, from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard the NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5m footprint and ~25m between the 5 points in a star-shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5m footprint. A set of 20 LROC-NA DTMs were examined. Initially the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and this is used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at the sub-footprint scale. In fact, a perceptible correlation between LOLA and LROC-DTMs appears only at baselines of 40-60m for surface roughness and 20m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
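
The roughness definition quoted above (simple RMS after slope correction inside each LOLA footprint) can be illustrated with a short sketch. The elevation patch below is a hypothetical stand-in for a 5 m footprint cut from a 1 m LROC-NA DTM; this is not the authors' pipeline.

```python
import numpy as np

# Minimal sketch: RMS surface roughness after slope correction inside one
# (hypothetical) 5 m footprint cut from a 1 m DTM. `patch` is elevation in m.
patch = np.array([[10.2, 10.3, 10.5, 10.6, 10.8],
                  [10.1, 10.3, 10.4, 10.7, 10.9],
                  [10.0, 10.2, 10.5, 10.6, 10.8],
                  [ 9.9, 10.1, 10.4, 10.5, 10.7],
                  [ 9.8, 10.0, 10.3, 10.5, 10.6]])

ny, nx = patch.shape
x, y = np.meshgrid(np.arange(nx), np.arange(ny))

# Fit a best-fit plane z = a*x + b*y + c (the "slope correction").
A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
residuals = patch.ravel() - A @ coef

# RMS of the detrended elevations = within-footprint roughness.
rms_roughness = np.sqrt(np.mean(residuals**2))
print(f"RMS roughness ~ {rms_roughness*100:.1f} cm")
```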

Muller, Jan-Peter; Poole, William

2013-04-01

38

Matching image color from different cameras  

NASA Astrophysics Data System (ADS)

Can images from professional digital SLR cameras be made equivalent in color using simple colorimetric characterization? Two cameras were characterized, these characterizations were implemented on a variety of images, and the results were evaluated both colorimetrically and psychophysically. A Nikon D2x and a Canon 5D were used. The colorimetric analyses indicated that accurate reproductions were obtained. The median CIELAB color differences between the measured ColorChecker SG and the reproduced image were 4.0 and 6.1 for the Canon (chart and spectral respectively) and 5.9 and 6.9 for the Nikon. The median differences between cameras were 2.8 and 3.4 for the chart and spectral characterizations, near the expected threshold for reliable image difference perception. Eight scenes were evaluated psychophysically in three forced-choice experiments in which a reference image from one of the cameras was shown to observers in comparison with a pair of images, one from each camera. The three experiments were (1) a comparison of the two cameras with the chart-based characterizations, (2) a comparison with the spectral characterizations, and (3) a comparison of chart vs. spectral characterization within and across cameras. The results for the three experiments are 64%, 64%, and 55% correct respectively. Careful and simple colorimetric characterization of digital SLR cameras can result in visually equivalent color reproduction.
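
The colorimetric evaluation described above boils down to computing CIELAB color differences patch by patch and summarizing them with the median. A minimal sketch follows, using the simple CIE76 Delta E (Euclidean distance in CIELAB); the Lab values are hypothetical placeholders, not the paper's measurements.

```python
import numpy as np

# Minimal sketch (not the authors' code): median CIELAB color difference
# between measured chart patches and their camera reproductions. The Lab
# arrays are hypothetical placeholders for ColorChecker SG measurements.
lab_measured   = np.array([[52.0, 38.1, 25.4],
                           [61.3, -5.2, 44.0],
                           [28.9, 14.6, -50.1]])
lab_reproduced = np.array([[50.5, 40.0, 23.8],
                           [63.0, -4.0, 46.5],
                           [30.2, 12.9, -47.7]])

# CIE76 Delta E*ab: Euclidean distance in CIELAB space, patch by patch.
delta_e = np.linalg.norm(lab_measured - lab_reproduced, axis=1)
print(f"median Delta E*ab = {np.median(delta_e):.1f}")
```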

Fairchild, Mark D.; Wyble, David R.; Johnson, Garrett M.

2008-01-01

39

Coherent infrared imaging camera (CIRIC)  

SciTech Connect

New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

1995-07-01

40

15 CFR 743.3 - Thermal imaging camera reporting.  

Code of Federal Regulations, 2010 CFR

...2010-01-01 false Thermal imaging camera reporting. 743.3 Section... BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE ...743.3 Thermal imaging camera reporting. (a) General... Exports of thermal imaging cameras must be reported to BIS...

2010-01-01

41

15 CFR 743.3 - Thermal imaging camera reporting.  

Code of Federal Regulations, 2013 CFR

...2013-01-01 false Thermal imaging camera reporting. 743.3 Section... BUREAU OF INDUSTRY AND SECURITY, DEPARTMENT OF COMMERCE ...743.3 Thermal imaging camera reporting. (a) General... Exports of thermal imaging cameras must be reported to BIS...

2013-01-01

42

The European Photon Imaging Camera on XMM-Newton: The MOS cameras  

Microsoft Academic Search

The EPIC focal plane imaging spectrometers on XMM-Newton use CCDs to record the images and spectra of celestial X-ray sources focused by the three X-ray mirrors. There is one camera at the focus of each mirror; two of the cameras contain seven MOS CCDs, while the third uses twelve PN CCDs, defining a circular field of view of 30' diameter

M. J. L. Turner; A. Abbey; M. Arnaud; M. Balasini; M. Barbera; E. Belsole; P. J. Bennie; J. P. Bernard; G. F. Bignami; M. Boer; U. Briel; I. Butler; C. Cara; C. Chabaud; R. Cole; A. Collura; M. Conte; A. Cros; M. Denby; P. Dhez; G. Di Coco; J. Dowson; P. Ferrando; S. Ghizzardi; F. Gianotti; C. V. Goodall; L. Gretton; R. G. Griffiths; O. Hainaut; J. F. Hochedez; A. D. Holland; E. Jourdain; E. Kendziorra; A. Lagostina; R. Laine; N. La Palombara; M. Lortholary; D. Lumb; P. Marty; S. Molendi; C. Pigot; E. Poindron; K. A. Pounds; J. N. Reeves; C. Reppin; R. Rothenflug; P. Salvetat; J. L. Sauvageot; D. Schmitt; S. Sembay; A. D. T. Short; J. Spragg; J. Stephen; L. Strüder; A. Tiengo; M. Trifoglio; J. Trümper; S. Vercellone; L. Vigroux; G. Villa; M. J. Ward; S. Whitehead; E. Zonca

2001-01-01

43

Generating Stereoscopic Television Images With One Camera  

NASA Technical Reports Server (NTRS)

Straightforward technique for generating stereoscopic television images involves use of single television camera translated laterally between left- and right-eye positions. Camera acquires one of images (left- or right-eye image), and video signal from image delayed while camera translated to position where it acquires other image. Length of delay chosen so both images displayed simultaneously or as nearly simultaneously as necessary to obtain stereoscopic effect. Technique amenable to zooming in on small areas within broad scenes. Potential applications include three-dimensional viewing of geological features and meteorological events from spacecraft and aircraft, inspection of workpieces moving along conveyor belts, and aiding ground and water search-and-rescue operations. Also used to generate and display imagery for public education and general information, and possibly for medical purposes.

Coan, Paul P.

1996-01-01

44

Camera lens adapter magnifies image  

NASA Technical Reports Server (NTRS)

Polaroid Land camera with an illuminated 7-power magnifier adapted to the lens, photographs weld flaws. The flaws are located by inspection with a 10-power magnifying glass and then photographed with this device, thus providing immediate pictorial data for use in remedial procedures.

Moffitt, F. L.

1967-01-01

45

Development of gamma ray imaging cameras  

SciTech Connect

In January 1990, the Department of Energy initiated this project with the objective of developing the technology for general purpose, portable gamma ray imaging cameras useful to the nuclear industry. The ultimate goal of this R&D initiative is to develop the analog to the color television camera, where the camera would respond to gamma rays instead of visible photons. The two-dimensional real-time image would be displayed and would indicate the geometric location of the radiation relative to the camera's orientation, while the brightness and "color" would indicate the intensity and energy of the radiation (and hence identify the emitting isotope). There is a strong motivation for developing such a device for applications within the nuclear industry, for both high- and low-level waste repositories, for environmental restoration problems, and for space and fusion applications. At present, there are no general purpose radiation cameras capable of producing spectral images for such practical applications. At the time of this writing, work on this project has been underway for almost 18 months. Substantial progress has been made in the project's two primary areas: mechanically-collimated camera (MCC) and electronically-collimated camera (ECC) designs. We present developments covering the mechanically-collimated design, and then discuss the efforts on the electronically-collimated camera. The renewal proposal addresses the continuing R&D efforts for the third year. 8 refs.

Wehe, D.K.; Knoll, G.F.

1992-05-28

46

Occluded object imaging via optimal camera selection  

NASA Astrophysics Data System (ADS)

High performance occluded object imaging in cluttered scenes is a significant challenge for many computer vision applications. Recently, camera array synthetic aperture imaging has proved to be an effective way of seeing objects through occlusion. However, the imaging quality of the occluded object is often significantly decreased by the shadows of the foreground occluder. Although some works have been presented to label the foreground occluder via object segmentation or 3D reconstruction, these methods fail in the case of complicated occluders and severe occlusion. In this paper, we present a novel optimal camera selection algorithm to solve the above problem. The main characteristics of this algorithm include: (1) Instead of synthetic aperture imaging, we formulate the occluded object imaging problem as an optimal camera selection and mosaicking problem. To the best of our knowledge, our proposed method is the first one for occluded object mosaicking. (2) A greedy optimization framework is presented to propagate the visibility information among various depth focus planes. (3) A multiple label energy minimization formulation is designed in each plane to select the optimal camera. The energy is estimated in the synthetic aperture image volume and integrates the multi-view intensity consistency, previous visibility property, and camera view smoothness, and is minimized via graph cuts. We compare our method with state-of-the-art synthetic aperture imaging algorithms, and extensive experimental results with qualitative and quantitative analysis demonstrate the effectiveness and superiority of our approach.
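
The full method combines visibility propagation with a graph-cut energy minimization, which is beyond a short example. The sketch below illustrates only a greatly simplified data term: each pixel picks the camera whose aligned intensity best matches the median across views, and the mosaic is assembled from those per-pixel labels. The array shapes and values are hypothetical.

```python
import numpy as np

# Greatly simplified sketch of the data term only: for each pixel, choose the
# camera whose warped intensity best agrees with the median across cameras.
# The paper's method adds visibility propagation and a smoothness term
# minimized with graph cuts, both omitted here. `warped` is a hypothetical
# stack of views already aligned to one focus plane.
rng = np.random.default_rng(0)
n_cams, h, w = 8, 64, 64
warped = rng.normal(0.5, 0.1, size=(n_cams, h, w))  # stand-in aligned views

median_view = np.median(warped, axis=0)
consistency = np.abs(warped - median_view)     # lower = more consistent
best_cam = np.argmin(consistency, axis=0)      # per-pixel camera label

# Mosaic the occlusion-free image from the selected cameras.
mosaic = np.take_along_axis(warped, best_cam[None, ...], axis=0)[0]
print(mosaic.shape, best_cam.min(), best_cam.max())
```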

Yang, Tao; Zhang, Yanning; Tong, Xiaomin; Ma, Wenguang; Yu, Rui

2013-12-01

47

Images obtained with a compact gamma camera  

NASA Astrophysics Data System (ADS)

A design for a compact gamma camera based on the use of a position-sensitive photomultiplier is presented. Tests have been carried out on a prototype detector system, having a sensitive area of 25 cm^2, using both a simple pinhole aperture and a parallel collimator. Images of a thyroid phantom are presented, and after processing to reduce the artefacts introduced by the use of a pinhole aperture, the quality is compared with that obtained using a standard Anger camera.

Bird, A. J.; Ramsden, D.

1990-12-01

48

Night solid state imaging camera (RPV)  

Microsoft Academic Search

This report describes the results of a program to provide the first phase of development of a day/night imager suitable for application in a Remotely Piloted Vehicle (RPV) being developed by the U.S. Army. Major objectives defined for the first phase development have been achieved, specifically: (1) A television camera which demonstrates feasibility of the Night Solid State Imager approach

K. A. Hoagland

1977-01-01

49

Lroc Observations of Permanently Shadowed Regions: Seeing into the Dark  

NASA Astrophysics Data System (ADS)

Permanently shadowed regions (PSRs) near the lunar poles that receive secondary illumination from nearby Sun-facing slopes were imaged by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC). Typically, secondary lighting is optimal in polar areas around the respective solstices and when the LRO orbit is nearly coincident with the sub-solar point (low spacecraft beta angles). NAC PSR images provide the means to search for evidence of surface frosts and unusual morphologies from ice rich regolith, and aid in planning potential landing sites for future in-situ exploration. Secondary illumination imaging in PSRs requires NAC integration times typically more than ten times greater than nominal imaging. The increased exposure time results in downtrack smear that decreases the spatial resolution of the NAC PSR images. Most long exposure NAC images of PSRs were acquired with exposure times of 24.2-ms (1-m by 40-m pixels, sampled to 20-m) and 12-ms (1-m by 20-m, sampled to 10-m). The initial campaign to acquire long exposure NAC images of PSRs in the north pole region ran from February 2013 to April 2013. Relative to the south polar region, PSRs near the north pole are generally smaller (D<24-km) and located in simple craters. Long exposure NAC images of PSRs in simple craters are often well illuminated by secondary light reflected from Sun-facing crater slopes during the northern summer solstice, allowing many PSRs to be imaged with the shorter exposure time of 12-ms (resampled to 10-m). With the exception of some craters in Peary crater, most northern PSRs with diameters >6-km were successfully imaged (e.g., Whipple, Hermite A, and Rozhestvenskiy U). The third PSR south polar campaign began in April 2013 and will continue until October 2013. The third campaign will expand previous NAC coverage of PSRs and follow up on discoveries with new images of higher signal to noise ratio (SNR), higher resolution, and varying secondary illumination conditions. Utilizing previous campaign images and the Sun's position, an individualized targeting approach for each crater drives this campaign. Secondary lighting within the PSRs, though somewhat diffuse, is at low incidence angles and, coupled with nadir NAC imaging, results in large phase angles. Such conditions tend to reduce albedo contrasts, complicating identification of patchy frost or ice deposits. Within the long exposure PSR images, a few small craters (D<200-m) with highly reflective ejecta blankets have been identified and interpreted as small fresh impact craters. Sylvester N and Main L are Copernican-age craters with PSRs; NAC images reveal debris flows, boulders, and morphologically fresh interior walls indicative of their young age. The identification of albedo anomalies associated with these fresh craters and debris flows indicates that strong albedo contrasts (~2x) associated with small fresh impact craters can be distinguished in PSRs. Lunar highland material has an albedo of ~0.2, while pure water frost has an albedo of ~0.9. If features in PSRs have an albedo similar to lunar highlands, significant surface frost deposits could result in detectable reflective anomalies in the NAC images. However, no reflective anomalies attributable to frost have thus far been identified in PSRs.

Koeber, S. D.; Robinson, M. S.

2013-12-01

50

Classroom multispectral imaging using inexpensive digital cameras.  

NASA Astrophysics Data System (ADS)

The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

Fortes, A. D.

2007-12-01

51

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

52

Improvement of passive THz camera images  

NASA Astrophysics Data System (ADS)

Terahertz technology is one of the emerging technologies that have the potential to change our lives. There are many attractive applications in fields such as security, astronomy, biology, and medicine. Until recent years, terahertz (THz) waves were an undiscovered, or more importantly, an unexploited region of the electromagnetic spectrum. The reason for this was the difficulty of generating and detecting THz waves. Recent advances in hardware technology have started to open up the field to new applications such as THz imaging. THz waves can penetrate through various materials. However, automated processing of THz images can be challenging. The THz frequency band is especially suited for clothes penetration because this radiation does not have any harmful ionizing effects and is thus safe for human beings. Strong technology development in this band has produced a few interesting devices. Even though the development of THz cameras is an emerging topic, commercially available passive cameras still offer images of poor quality, mainly because of their low resolution and low detector sensitivity. Therefore, THz image processing is a very challenging and urgent topic. Digital THz image processing is a promising and cost-effective way to address demanding security and defense applications. In this article we demonstrate the results of image quality enhancement and image fusion applied to images captured by a commercially available passive THz camera by means of various combined methods. Our research is focused on the detection of dangerous objects - guns, knives, and bombs hidden under some popular types of clothing.

Kowalski, Marcin; Piszczek, Marek; Palka, Norbert; Szustakowski, Mieczyslaw

2012-10-01

53

Imaging characteristics of photogrammetric camera systems  

USGS Publications Warehouse

In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were examined, yielding procedures for analyzing image quality and predicting and comparing performance capabilities. © 1973.

Welch, R.; Halliday, J.

1973-01-01

54

A pyroelectric thermal imaging camera tube  

Microsoft Academic Search

This paper proposes the use of a thin slice of a pyroelectric single crystal as the target of an electron beam scanned camera tube. This target would be sensitive to thermal images generated by black body radiation. It is shown that 10^4 resolvable picture points per cm^2 of a TGS target at a frame rate not lower than 10 Hz and a

MICHAEL F. TOMPSETT

1971-01-01

55

Investigation of Layered Lunar Mare Lava flows through LROC Imagery and Terrestrial Analogs  

NASA Astrophysics Data System (ADS)

High resolution images of the lunar surface have revealed layered deposits in the walls of impact craters and pit craters in the lunar maria, which are interpreted to be sequences of stacked lava flows. The goal of our research is to establish quantitative constraints and uncertainties on the thicknesses of individual flow units comprising the layered outcrops, in order to model the cooling history of lunar lava flows. The underlying motivation for this project is to identify locations hosting intercalated units of lava flows and paleoregoliths, which may preserve snapshots of the ancient solar wind and other extra-lunar particles, thereby providing potential sampling localities for future missions to the lunar surface. Our approach involves mapping layered outcrops using high-resolution imagery acquired by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC), with constraints on flow unit dimensions provided by Lunar Orbiter Laser Altimeter (LOLA) data. We have measured thicknesses of ~ 2 to > 20 m. However, there is considerable uncertainty in the definition of contacts between adjacent units, primarily because talus commonly obscures contacts and/or prevents lateral tracing of the flow units. In addition, flows may have thicknesses or geomorphological complexity at scales approaching the limit of resolution of the data, which hampers distinguishing one unit from another. To address these issues, we have undertaken a terrestrial analog study using WorldView-2 satellite imagery of layered lava sequences on Oahu, Hawaii. These data have a resolution of 0.5 m, comparable to LROC NAC images. The layered lava sequences are first analyzed in ArcGIS to obtain an initial estimate of the number and thicknesses of flow units identified in the images. We next visit the outcrops in the field to perform detailed measurements of the individual units. We have discovered that the number of flow units identified in the remote sensing data is smaller than the number identified in the field analysis, because the resolution of the data precludes identification of subtle flow contacts and the identified 'units' are in fact multiple compounded units. Other factors such as vegetation and shadows may alter the view in the satellite imagery. This means that clarity in the lunar study may also be affected by factors such as lighting angle and the amount of debris overlying the lava sequence. The compilation of field and remote sensing measurements allows us to determine the uncertainty on unit thicknesses, which can be modeled to establish the uncertainty on the calculated depths of penetration of the resulting heat pulse into the underlying regolith. This in turn provides insight into the survivability of extra-lunar particles in paleoregolith layers sandwiched between lava flows.

Needham, H.; Rumpf, M.; Sarah, F.

2013-12-01

56

Imaging spectrometer/camera having convex grating  

NASA Technical Reports Server (NTRS)

An imaging spectrometer has fore-optics coupled to a spectral resolving system with an entrance slit extending in a first direction at an imaging location of the fore-optics for receiving the image, a convex diffraction grating for separating the image into a plurality of spectra of predetermined wavelength ranges; a spectrometer array for detecting the spectra; and at least one concave spherical mirror concentric with the diffraction grating for relaying the image from the entrance slit to the diffraction grating and from the diffraction grating to the spectrometer array. In one embodiment, the spectrometer is configured in a lateral mode in which the entrance slit and the spectrometer array are displaced laterally on opposite sides of the diffraction grating in a second direction substantially perpendicular to the first direction. In another embodiment, the spectrometer is combined with a polychromatic imaging camera array disposed adjacent said entrance slit for recording said image.

Reininger, Francis M. (Inventor)

2000-01-01

57

New insight into lunar impact melt mobility from the LRO camera  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) is systematically imaging impact melt deposits in and around lunar craters at meter and sub-meter scales. These images reveal that lunar impact melts, although morphologically similar to terrestrial lava flows of similar size, exhibit distinctive features (e.g., erosional channels). Although generated in a single rapid event, the post-impact mobility and morphology of lunar impact

V. J. Bray; L. L. Tornabene; L. P. Keszthelyi; A. S. McEwen; B. R. Hawke; T. A. Giguere; S. A. Kattenhorn; W. B. Garry; B. Rizk; C. M. Caudill; L. R. Gaddis; C. H. van der Bogert

2010-01-01

58

X-ray imaging using digital cameras  

NASA Astrophysics Data System (ADS)

The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.
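
For readers unfamiliar with the figure of merit reported above, detective quantum efficiency compares the squared output signal-to-noise ratio with the Poisson-limited input SNR. The sketch below illustrates only the definition with hypothetical numbers; it is not the authors' measurement procedure.

```python
import numpy as np

# Illustrative sketch of the definition of detective quantum efficiency,
# DQE = SNR_out^2 / SNR_in^2, not the authors' measurement procedure.
# All numbers below are hypothetical.
n_photons = 1.0e5                      # X-ray quanta incident per region
snr_in = np.sqrt(n_photons)            # ideal (Poisson-limited) input SNR

signal_out, noise_out = 820.0, 20.0    # assumed mean and std of the output
snr_out = signal_out / noise_out

dqe = (snr_out / snr_in) ** 2
print(f"DQE ~ {dqe:.3f}")
```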

Winch, Nicola M.; Edgar, Andrew

2012-02-01

59

Cervical SPECT Camera for Parathyroid Imaging  

SciTech Connect

Primary hyperparathyroidism characterized by one or more enlarged parathyroid glands has become one of the most common endocrine diseases in the world affecting about 1 per 1000 in the United States. Standard treatment is highly invasive exploratory neck surgery called "Parathyroidectomy". The surgery has a notable mortality rate because of the close proximity to vital structures. The move to minimally invasive parathyroidectomy is hampered by the lack of high resolution pre-surgical imaging techniques that can accurately localize the parathyroid with respect to surrounding structures. We propose to develop a dedicated ultra-high resolution (~ 1 mm) and high sensitivity (10x conventional camera) cervical scintigraphic imaging device. It will be based on a multiple pinhole-camera SPECT system comprising a novel solid state CZT detector that offers the required performance. The overall system will be configured to fit around the neck and comfortably image a patient.

None

2012-08-31

60

The Large Binocular Camera Image Simulator  

NASA Astrophysics Data System (ADS)

The Large Binocular Camera Image Simulator (LBCImSim) is a package for generating artificial images in the typical FITS format. It operates on real or artificial images, simulating the expected performance of real instruments by including several observing conditions (filters, air mass, flat field, and exposure time) and creating images with the LBC instrumental artifacts (optical deformations, noise, and CCD architectures). This simulator can also be used to produce artificial images for other existing and future telescopes, since its structure is very flexible. The primary aim of LBCImSim is to support the development of a pipeline and data analysis procedure that is able to handle wide-field imaging and the fast reduction of huge amounts of photometric data. The software consists of three stand-alone programs written in C that use IRAF and run on Linux OS. In this paper, we first describe the software, its performance in several tests carried out prior to public release, and some examples for users. In particular, we compare the Hubble Deep Field-South (HDF-S) as seen by FORS1 with a simulated image and find that the agreement is good. We then use this software to predict the expected performance of the LBC instrument by means of realistic simulations of deep-field observations with the Large Binocular Telescope.
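
To give a flavor of the kind of artificial-image generation such a simulator performs, the sketch below builds a toy frame with a flat-field map, sky background, exposure scaling, shot noise, and read noise. It is only a schematic stand-in for LBCImSim (which is written in C and models the LBC optics and CCD architecture in detail), and all numbers and shapes are hypothetical.

```python
import numpy as np

# Rough sketch of artificial-image generation with a few of the observing
# conditions mentioned above (flat field, exposure time, noise). All values
# are hypothetical and do not reproduce LBCImSim's detailed instrument model.
rng = np.random.default_rng(6)
ny, nx = 512, 512
exposure_s = 300.0

sky_rate = 2.0                                       # e-/pixel/s, stand-in sky level
flat = 1.0 + 0.05 * rng.standard_normal((ny, nx))    # pixel-to-pixel response map
scene = np.zeros((ny, nx))
scene[256, 256] = 5000.0                             # one point source, e-/s

expected = (scene + sky_rate) * exposure_s * flat
image = rng.poisson(expected).astype(float)          # photon (shot) noise
image += rng.normal(0.0, 10.0, size=(ny, nx))        # read noise in electrons
print(image.mean(), image.std())
```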

Grazian, A.; Fontana, A.; De Santis, C.; Gallozzi, S.; Giallongo, E.; Di Pangrazio, F.

2004-08-01

61

Performance of the Neutron Imaging Camera (NIC)  

NASA Astrophysics Data System (ADS)

The Neutron Imaging Camera (NIC) is based on the Three-Dimensional Track Imager (3-DTI) technology developed at NASA/GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time projection chamber, provides accurate, ~0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, En > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from ^3He(n,p)^3H interactions in the 3-DTI volume. We will discuss the NIC performance, including the angular and energy response, derived from a 2-3 MeV neutron accelerator beam.

Son, Seunghee; Hunter, Stanley; Link, Jason; de Nolfo, Georgia; Guardala, Noel

2009-05-01

62

LCD display screen performance testing for handheld thermal imaging cameras  

Microsoft Academic Search

Handheld thermal imaging cameras are an important tool for the first responder community. As their use becomes more prevalent, it will become important for a set of standard test metrics to be available to characterize the performance of these cameras. A major factor in the performance of the imagers is the quality of the image on a display screen. An

Joshua B. Dinaburg; Francine Amon; Anthony Hamins; Paul Boynton

2006-01-01

63

A lens and aperture camera model for synthetic image generation  

Microsoft Academic Search

This paper extends the traditional pin-hole camera projection geometry, used in computer graphics, to a more realistic camera model which approximates the effects of a lens and an aperture function of an actual camera. This model allows the generation of synthetic images which have a depth of field, can be focused on an arbitrary plane, and also permits selective modeling

Michael Potmesil; Indranil Chakravarty

1981-01-01

64

Camera system for multispectral imaging of documents  

NASA Astrophysics Data System (ADS)

A spectral imaging system comprising a 39-Mpixel monochrome camera, LED-based narrowband illumination, and acquisition/control software has been designed for investigations of cultural heritage objects. Notable attributes of this system, referred to as EurekaVision, include: streamlined workflow, flexibility, provision of well-structured data and metadata for downstream processing, and illumination that is safer for the artifacts. The system design builds upon experience gained while imaging the Archimedes Palimpsest and has been used in studies of a number of important objects in the LOC collection. This paper describes practical issues that were considered by EurekaVision to address key research questions for the study of fragile and unique cultural objects over a range of spectral bands. The system is intended to capture important digital records for access by researchers, professionals, and the public. The system was first used for spectral imaging of the 1507 world map by Martin Waldseemueller, the first printed map to reference "America." It was also used to image sections of the Carta Marina 1516 map by the same cartographer for comparative purposes. An updated version of the system is now being utilized by the Preservation Research and Testing Division of the Library of Congress.

Christens-Barry, William A.; Boydston, Kenneth; France, Fenella G.; Knox, Keith T.; Easton, Roger L., Jr.; Toth, Michael B.

2009-02-01

65

New insight into lunar impact melt mobility from the LRO camera  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) is systematically imaging impact melt deposits in and around lunar craters at meter and sub-meter scales. These images reveal that lunar impact melts, although morphologically similar to terrestrial lava flows of similar size, exhibit distinctive features (e.g., erosional channels). Although generated in a single rapid event, the post-impact mobility and morphology of lunar impact melts is surprisingly complex. We present evidence for multi-stage influx of impact melt into flow lobes and crater floor ponds. Our volume and cooling time estimates for the post-emplacement melt movements noted in LROC images suggest that new flows can emerge from melt ponds an extended time period after the impact event.

Bray, Veronica J.; Tornabene, Livio L.; Keszthelyi, Laszlo P.; McEwen, Alfred S.; Hawke, B. Ray; Giguere, Thomas A.; Kattenhorn, Simon A.; Garry, William B.; Rizk, Bashar; Caudill, C.M.; Gaddis, Lisa R.; van der Bogert, Carolyn H.

2010-01-01

66

Factors affecting the accuracy of thermal imaging cameras in volcanology  

Microsoft Academic Search

Volcano observatories and researchers are recognizing the potential usefulness of thermal imaging cameras both before and during volcanic eruptions. Obvious applications include measurements of the surface temperatures of active lava domes and lava flows to determine the location of the most active parts of these potentially hazardous features. If appropriate precautions are taken, the new generation of thermal imaging cameras

M. Ball; H. Pinkerton

2006-01-01

67

Removal of adherent waterdrops from images acquired with stereo camera  

Microsoft Academic Search

In this paper, we propose a new method that can remove view-disturbing noises from stereo images. One of the thorny problems in outdoor surveillance by a camera is that adherent noises such as waterdrops on the protective glass surface of the lens disturb the view from the camera. Therefore, we propose a method for removing adherent noises from stereo images taken with

Atsushi Yamashita; Yuu Tanaka; Toru Kaneko

2005-01-01

68

The European Photon Imaging Camera on XMM-Newton: The pn-CCD camera  

Microsoft Academic Search

The European Photon Imaging Camera (EPIC) consortium has provided the focal plane instruments for the three X-ray mirror systems on XMM-Newton. Two cameras with a reflecting grating spectrometer in the optical path are equipped with MOS type CCDs as focal plane detectors (Turner \\\\cite{mturner}), the telescope with the full photon flux operates the novel pn-CCD as an imaging X-ray spectrometer.

L. Strüder; U. Briel; K. Dennerl; R. Hartmann; E. Kendziorra; N. Meidinger; E. Pfeffermann; C. Reppin; B. Aschenbach; W. Bornemann; H. Bräuninger; W. Burkert; M. Elender; M. Freyberg; F. Haberl; G. Hartner; F. Heuschmann; H. Hippmann; E. Kastelic; S. Kemmer; G. Kettenring; W. Kink; N. Krause; S. Müller; A. Oppitz; W. Pietsch; M. Popp; P. Predehl; A. Read; K. H. Stephan; D. Stötter; J. Trümper; P. Holl; J. Kemmer; H. Soltau; R. Stötter; U. Weber; U. Weichert; C. von Zanthier; D. Carathanassis; G. Lutz; R. H. Richter; P. Solc; H. Böttcher; M. Kuster; R. Staubert; A. Abbey; A. Holland; M. Turner; M. Balasini; G. F. Bignami; N. La Palombara; G. Villa; W. Buttler; F. Gianini; R. Lainé; D. Lumb; P. Dhez

2001-01-01

69

Measurement of the nonuniformity of first responder thermal imaging cameras  

NASA Astrophysics Data System (ADS)

Police, firefighters, and emergency medical personnel are examples of first responders that are utilizing thermal imaging cameras in a very practical way every day. However, few performance metrics have been developed to assist first responders in evaluating the performance of thermal imaging technology. This paper describes one possible metric for evaluating the nonuniformity of thermal imaging cameras. Several commercially available uncooled focal plane array cameras were examined. Because of proprietary issues, each camera was considered a 'black box'. In these experiments, an extended area blackbody (18 cm square) was placed very close to the objective lens of the thermal imaging camera. The resultant video output from the camera was digitized at a resolution of 640x480 pixels and a grayscale depth of 10 bits. The nonuniformity was calculated using the standard deviation of the digitized image pixel intensities divided by the mean of those pixel intensities. This procedure was repeated for each camera at several blackbody temperatures in the range from 30 °C to 260 °C. It has been observed that the nonuniformity initially increases with temperature, then asymptotically approaches a maximum value. Nonuniformity also enters the calculation of the Spatial Frequency Response by providing a noise floor. The testing procedures described herein are being developed as part of a suite of tests to be incorporated into a performance standard covering thermal imaging cameras for first responders.
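
The nonuniformity metric described above is straightforward to reproduce. The sketch below applies the same definition (standard deviation of the digitized pixel intensities divided by their mean) to a hypothetical 10-bit frame; it is an illustration, not the authors' test code.

```python
import numpy as np

# Minimal sketch of the nonuniformity metric described above: standard
# deviation of the digitized pixel intensities divided by their mean.
# `frame` is a hypothetical stand-in for one digitized 640x480, 10-bit frame
# captured while the camera views the extended-area blackbody.
rng = np.random.default_rng(2)
frame = rng.normal(512.0, 8.0, size=(480, 640)).clip(0, 1023)

nonuniformity = frame.std() / frame.mean()
print(f"nonuniformity = {nonuniformity:.4f}")
```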

Lock, Andrew; Amon, Francine

2008-05-01

70

Z/I Imaging digital aerial camera system  

NASA Astrophysics Data System (ADS)

With the availability of digital cameras, it is possible to completely close the digital chain from image recording to plotting. The key decision regarding the camera design in this case is whether a CCD sensor should be used on a line or a matrix basis. In view of the high geometric accuracy requirements in photogrammetry, Z/I Imaging focused its development on a digital camera based on a matrix sensor. An essential aspect of this decision was not only the aerial camera system; the entire photogrammetric process up to the finished photo or mapping product was also taken into account. The approach chosen will also maintain the usual central perspective for the new digital images. In addition, Z/I Imaging will make use of the knowledge gathered during the development of the Digital Camera System to continue its success in the field of reconnaissance.

Spiller, Rudolf H.; Hinz, Alexander

2000-11-01

71

Camera calibration for natural image studies and vision research  

PubMed Central

A dominant theme in vision research is that important characteristics of the visual pathway evolved to be effective in processing natural scenes. Given this perspective, one can learn about the nature of visual processing from a quantitative analysis of natural images. Such analysis can benefit from the camera as a measuring device. As such, the camera should not introduce arbitrary artifacts into the image formation process. This paper describes how to correct a number of unnecessary artifacts associated with obtaining natural scene statistics with a digital camera. For artifacts that are inherently part of image formation, and where elimination is not possible or appropriate, we describe methods for transformation and quantification.

Brady, Mark; Legge, Gordon E.

2014-01-01

72

Controlling Camera and Lights for Intelligent Image Acquisition and Merging  

Microsoft Academic Search

Docking craft in space and guiding mining machines are areas that often use remote video cameras equipped with one or more controllable light sources. In these applications, the problem of parameter selection arises: how to choose the best parameters for the camera and lights? Another problem is that a single image often cannot capture the whole scene properly

Olena Borzenko; Yves Lespérance; Michael R. M. Jenkin

2005-01-01

73

Motion deblurring of infrared images from a microbolometer camera  

Microsoft Academic Search

The infrared image of a microbolometer camera can show significant blurring effects if the object is moving. The blurring mechanism of a microbolometer is different to that encountered with classical CCD and CMOS cameras. The electrical signal in the pixel of a microbolometer detector decays exponentially with a time constant of 10–15ms; therefore, the moving object is mapped to more

Beata Oswald-Tranta; Mario Sorger; Paul O’Leary

2010-01-01

74

Thermal analysis of the ultraviolet imager camera and electronics  

NASA Technical Reports Server (NTRS)

The Ultraviolet Imaging experiment has undergone design changes that necessitate updating the reduced thermal models (RTM's) for both the Camera and Electronics. In addition, there are several mission scenarios that need to be evaluated in terms of thermal response of the instruments. The impact of these design changes and mission scenarios on the thermal performance of the Camera and Electronics assemblies is discussed.

Dirks, Gregory J.

1991-01-01

75

Imaging Emission Spectra with Handheld and Cellphone Cameras  

ERIC Educational Resources Information Center

As point-and-shoot digital camera technology advances it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon…

Sitar, David

2012-01-01

76

Extraction of cloud statistics from whole sky imaging cameras  

SciTech Connect

Computer codes have been developed to extract basic cloud statistics from whole sky imaging (WSI) cameras. This report documents, on an algorithmic level, the steps and processes underlying these codes. Appendices comment on code details and on how to adapt to future changes in either the source camera or the host computer.
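
The report does not spell out its algorithms in this abstract, so the sketch below shows one common whole-sky-imager statistic, a red-to-blue ratio threshold used to estimate cloud fraction, purely as an illustration of the kind of statistic such codes extract. The image, threshold, and result are hypothetical.

```python
import numpy as np

# Hedged sketch of one common whole-sky-imager statistic (not necessarily
# the algorithm in the report): classify cloudy pixels with a red/blue
# ratio threshold and report the cloud fraction. `rgb` is a hypothetical
# stand-in for a calibrated whole-sky image.
rng = np.random.default_rng(3)
rgb = rng.uniform(0.0, 1.0, size=(480, 640, 3))

red, blue = rgb[..., 0], rgb[..., 2]
ratio = red / np.clip(blue, 1e-6, None)
cloudy = ratio > 0.9          # illustrative threshold; tuned in practice

cloud_fraction = cloudy.mean()
print(f"cloud fraction = {cloud_fraction:.2f}")
```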

Kegelmeyer, W.P. Jr.

1994-03-01

77

Recording Images Using a Simple Pinhole Camera  

NSDL National Science Digital Library

In this lesson, students develop and expand their observational skills and technological understanding by building and operating a pinhole camera. The interdisciplinary connections are in the realm of application in this motivating activity. The lesson pr

Eichinger, John

2009-05-30

78

Imaging Emission Spectra with Handheld and Cellphone Cameras  

NASA Astrophysics Data System (ADS)

As point-and-shoot digital camera technology advances it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon point-and-shoot auto focusing camera and two different cellphone cameras: one at 6.1 MP and the other at 5.1 MP.

Sitar, David

2012-12-01

79

Compact Gamma Camera System for Breast Cancer Imaging.  

National Technical Information Service (NTIS)

The goal of this project is the development of a compact solid-state gamma camera specifically designed to image metabolically active tumors in the breast and axillary nodes with the highest possible detection efficiency and spatial resolution. The compac...

S. E. Derenzo

1999-01-01

80

Compact Gamma Camera System for Breast Cancer Imaging.  

National Technical Information Service (NTIS)

In this project we have developed a compact solid-state gamma camera specifically designed to image metabolically active tumors in the breast and axillary nodes with the highest possible detection efficiency and spatial resolution. We have assembled final...

S. E. Derenzo

2001-01-01

81

A time-resolved image sensor for tubeless streak cameras  

NASA Astrophysics Data System (ADS)

This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tubeless streak cameras. Although the conventional streak camera has high time resolution, the device requires a high voltage and a bulky system due to its vacuum tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator in combination with an in-pixel logic allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11 um CMOS image sensor technology. The image array has 30 (vertical) x 128 (memory length) pixels with a pixel pitch of 22.4 um.

Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

2014-03-01

82

Imaging Colorimetry Using a Digital Camera  

Microsoft Academic Search

In this work, we investigate the use of a digital camera for colorimetry. Our system consists of a measurement device and a corresponding calibration mapping. The goal is to design a system that will accurately assess the color of a sample. We develop two colorimetry systems by applying model-based and regression-based techniques. For both systems, the measurement device is formed

Wencheng Wu; Jan P. Allebach; Mostafa Analoui

1999-01-01

83

[New medical imaging based on electron tracking Compton camera (ETCC)].  

PubMed

We have developed an Electron-Tracking Compton Camera (ETCC) for medical imaging owing to its wide energy dynamic range (200-1,500 keV) and wide field of view (FOV, 3 sr). This camera has the potential to support the development of new reagents. We have carried out several imaging reagent studies as examples: (1) simultaneous 18F-FDG and 131I-MIBG imaging for double clinical tracer imaging, and (2) imaging of some minerals (Mn-54, Zn-65, Fe-59) in mice and plants. In addition, the ETCC has the potential for real-time monitoring of the Bragg peak location by imaging prompt gamma rays during beam therapy. We carried out a water phantom experiment using a 140 MeV proton beam and obtained images of both 511 keV and high energy gamma rays (800-2,000 keV); a better correlation of the latter image with the Bragg peak was observed. Another potential of the ETCC is to reconstruct a 3D image using only a one-head camera, without rotating either the target or the camera. Good 3D images of a thyroid gland phantom and of a mouse with a tumor were observed. In order to advance these features to practical use, we are improving all components and will then construct a multi-head ETCC system. PMID:24592680

Tanimori, Toru; Kubo, Hidetoshi; Kabuki, Shigeto; Kimura, Hiroyuki

2012-01-01

84

The WEBERSAT camera - An inexpensive earth imaging system  

NASA Astrophysics Data System (ADS)

WEBERSAT is a 27 pound LEO satellite launched in 1990 into a 500 mile polar orbit. One of its payloads is a low cost CCD color camera system developed by engineering students at Weber State University. The camera is a modified Canon CI-10 with a 25 mm lens, automatic iris, and 780 x 490 pixel resolution. The iris range control potentiometer was made programmable; a 10.7 MHz digitization clock, fixed focus support, and solid tantalum capacitors were added. Camera output signals, composite video, red, green, blue, and the digitization clock are fed to a flash digitizer, where they are processed for storage in RAM. Camera control commands are stored and executed via the onboard computer. The CCD camera has successfully imaged meteorological features of the earth, land masses, and a number of astronomical objects.

Jackson, Stephen; Raetzke, Jeffrey

85

Fluorescence lifetime imaging microscopy using a streak camera  

NASA Astrophysics Data System (ADS)

We present the development of a fluorescence lifetime imaging microscopy system using a streak camera (SC-FLIM), which uses ultrafast infrared laser for multiphoton excitation and a streak camera for lifetime measurement. A pair of galvo mirrors are employed to accomplish quick time-resolved scanning on a line and 2D fluorescence lifetime imaging. The SC-FLIM system was calibrated using an F-P etalon and several standard fluorescent dyes, and was also used to perform fluorescence lifetime imaging of fluorescent microspheres and a prepared plant stem slide.

Liu, Lixin; Li, Yahui; Sun, Luogeng; Li, Heng; Peng, Xiao; Qu, Junle

2014-02-01

86

Acquisition and evaluation of radiography images by digital camera.  

PubMed

To determine applicability of low-cost digital imaging for different radiographic modalities used in consultations from remote areas of the Ecuadorian rainforest with limited resources, both medical and financial. Low-cost digital imaging, consisting of hand-held digital cameras, was used for image capture at a remote location. Diagnostic radiographic images were captured in Ecuador by digital camera and transmitted to a password-protected File Transfer Protocol (FTP) server at VCU Medical Center in Richmond, Virginia, using standard Internet connectivity with standard security. After capture and subsequent transfer of images via low-bandwidth Internet connections, attending radiologists in the United States compared diagnoses to those from Ecuador to evaluate quality of image transfer. Corroborative diagnoses were obtained with the digital camera images for greater than 90% of the plain film and computed tomography studies. Ultrasound (U/S) studies demonstrated only 56% corroboration. Images of radiographs captured utilizing commercially available digital cameras can provide quality sufficient for expert consultation for many plain film studies for remote, underserved areas without access to advanced modalities. PMID:15857253

Cone, Stephen W; Carucci, Laura R; Yu, Jinxing; Rafiq, Azhar; Doarn, Charles R; Merrell, Ronald C

2005-04-01

87

ProxiScan?: A Novel Camera for Imaging Prostate Cancer  

ScienceCinema

ProxiScan is a compact gamma camera suited for high-resolution imaging of prostate cancer. Developed by Brookhaven National Laboratory and Hybridyne Imaging Technologies, Inc., ProxiScan won a 2009 R&D 100 Award, sponsored by R&D Magazine to recognize t

Ralph James

2010-01-08

88

ProxiScan™: A Novel Camera for Imaging Prostate Cancer  

SciTech Connect

ProxiScan is a compact gamma camera suited for high-resolution imaging of prostate cancer. Developed by Brookhaven National Laboratory and Hybridyne Imaging Technologies, Inc., ProxiScan won a 2009 R&D 100 Award, sponsored by R&D Magazine to recognize t

Ralph James

2009-10-27

89

Hierarchical Color Correction for Camera Cell Phone Images  

Microsoft Academic Search

In this paper, we propose a hierarchical color correction algorithm for enhancing the color of digital images obtained from low-quality digital image capture devices such as cell phone cameras. The proposed method is based on a multi-layer hierarchical stochastic framework whose parameters are learned in an offline training procedure using the well-known expectation maximization (EM) algorithm. This hierarchical

Hasib Siddiqui; Charles A. Bouman

2008-01-01

90

Restoration of TDI camera images with motion distortion and blur  

NASA Astrophysics Data System (ADS)

Platform movement during the exposure of an imaging system severely degrades image quality. In the case of a time delay and integration (TDI) camera, abnormal movements cause not only image blur but also distortion, because the image point spread function (PSF) is space-variant. In this paper, we present a motion degradation model for TDI images and provide a method to restore such degraded images. While a TDI camera is imaging, it outputs the image row by row (or line by line) along the scanning axis, and our method processes the data in the same order. We first calculate the space-invariant PSF of each row using the movement information of the TDI camera. Then we substitute the pixels of the row and those of its neighboring rows, together with the PSF, into the standard Richardson-Lucy algorithm. Deconvolution then yields the restored pixels of the row. The same operations are executed for all rows of the degraded TDI image. Finally, a restored image is reconstructed from the restored rows. Both simulated and experimental results demonstrate the effectiveness of our method.
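The sketch below illustrates the row-wise Richardson-Lucy restoration idea in Python. It is a hedged simplification, not the authors' exact pipeline: it assumes a per-row motion PSF has already been estimated from the platform movement data (the `row_psfs` input is hypothetical), and it deconvolves a small block of neighboring rows around each row.

```python
# Minimal sketch: row-wise Richardson-Lucy deconvolution of a TDI image,
# assuming one 2-D motion PSF per row is already available.
import numpy as np
from skimage.restoration import richardson_lucy

def restore_tdi(image, row_psfs, window=5, iterations=30):
    """image: 2-D float array; row_psfs: list of 2-D PSFs, one per row."""
    restored = np.zeros_like(image)
    half = window // 2
    for r in range(image.shape[0]):
        lo, hi = max(0, r - half), min(image.shape[0], r + half + 1)
        block = image[lo:hi]                               # row plus neighbors
        deconv = richardson_lucy(block, row_psfs[r], num_iter=iterations)
        restored[r] = deconv[r - lo]                       # keep the center row
    return restored
```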

Wu, Jiagu; Zheng, Zhenzhen; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

2010-11-01

91

Laser speckle imaging using a consumer-grade color camera  

PubMed Central

Laser speckle imaging (LSI) is a noninvasive optical imaging technique able to provide wide-field two-dimensional maps of moving particles. Raw laser speckle images are typically taken with a scientific-grade monochrome camera. We demonstrate that a digital single-lens reflex (dSLR) camera with a Bayer filter is able to provide similar sensitivity despite taking information only from a specific pixel color. Here we demonstrate the effect of changing three primary dSLR exposure settings (i.e., aperture, exposure time/shutter speed, and gain/sensitivity (ISO)) on speckle contrast. In addition, we present data from an in vivo reactive hyperemia experiment that demonstrates the qualitative similarity in blood-flow dynamics visualized with a color dSLR and a scientific-grade monochrome camera.
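For reference, the core quantity behind LSI is the local speckle contrast, the ratio of standard deviation to mean intensity in a small sliding window. The following hedged Python sketch computes it from a single raw speckle image (for a dSLR, for example the green-channel pixels); the window size is an assumption and the paper's exact processing is not reproduced.

```python
# Hedged sketch: local speckle contrast K = sigma / mean over a sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    raw = raw.astype(float)
    mean = uniform_filter(raw, window)
    mean_sq = uniform_filter(raw**2, window)
    std = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    return std / (mean + 1e-12)      # lower K generally indicates faster flow
```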

Yang, Owen; Choi, Bernard

2013-01-01

92

Hadamard camera for 3D imaging  

NASA Astrophysics Data System (ADS)

This paper describes in detail the work carried out to fuse a commercial micromirror sampling element with time-of-flight (TOF) acquisition methods and known Hadamard multiplexing techniques to implement fast, SNR-optimized 3D image capture. The theoretical basics of the TOF and Hadamard techniques are presented and complemented by a theoretical explanation of their use for 3D volumetric image generation. Finally, measurement results of scene image acquisition are demonstrated and discussed, and expanded by considerations of possible applications in THz imaging and of the next research steps.
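To make the multiplexing idea concrete, here is a hedged Python sketch of Hadamard multiplexed acquisition and demultiplexing: each measurement is the inner product of the scene with one row of a Hadamard matrix (as a micromirror element could realize), and the scene is recovered by the inverse transform. The TOF gating and noise model of the paper are not reproduced.

```python
# Illustrative Hadamard multiplexing / demultiplexing for one image line.
import numpy as np
from scipy.linalg import hadamard

n = 64                                   # number of scene samples (power of 2)
H = hadamard(n).astype(float)

scene = np.random.rand(n)                # stand-in for one line of the scene
measurements = H @ scene                 # multiplexed detector readings
recovered = (H.T @ measurements) / n     # H is orthogonal up to scale: H^T H = n I
assert np.allclose(recovered, scene)
```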

Romasew, Eugen; Barenz, Joachim; Tholl, Hans Dieter

2007-11-01

93

Wide-Angle, Reflective Strip-Imaging Camera  

NASA Technical Reports Server (NTRS)

Proposed camera images thin, striplike portion of field of view of 180 degrees wide. Hemispherical concave reflector forms image onto optical fibers, which transfers it to strip of photodetectors or spectrograph. Advantages include little geometric distortion, achromatism, and ease of athermalization. Uses include surveillance of clouds, coarse mapping of terrain, measurements of bidirectional reflectance distribution functions of aerosols, imaging spectrometry, oceanography, and exploration of planets.

Vaughan, Arthur H.

1992-01-01

94

Radiometric cloud imaging with an uncooled microbolometer thermal infrared camera  

Microsoft Academic Search

An uncooled microbolometer-array thermal infrared camera has been incorporated into a remote sensing system for radiometric sky imaging. The radiometric calibration is validated and improved through direct comparison with spectrally integrated data from the Atmospheric Emitted Radiance Interferometer (AERI). With the improved calibration, the Infrared Cloud Imager (ICI) system routinely obtains sky images with radiometric uncertainty less than 0.5 W/(m^2 sr)

Joseph A. Shaw; Paul W. Nugent; Nathan J. Pust; Brentha Thurairajah; Kohei Mizutani

2005-01-01

95

High-Resolution Mars Camera Test Image of Moon (Infrared)  

NASA Technical Reports Server (NTRS)

This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test.

The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.

2005-01-01

96

Image-shearing camera for direct measurement of surface strains.  

PubMed

This paper describes a new image-shearing camera which focuses two laterally sheared images at the film plane. With coherent illumination, this camera becomes a shearing interferometer, which directly measures the derivatives of the surface displacements. This strain-measuring tool enjoys several advantages over conventional holographic and speckle interferometry, namely: (1) better fringe quality (than speckle interferometry); (2) no requirement for special vibration isolation; (3) a very simple optical setup; (4) direct determination of strains; and (5) an extended controllable range of sensitivity. PMID:20208872

Hung, Y Y; Liang, C Y

1979-04-01

97

Digital camera with apparatus for authentication of images produced from an image file  

NASA Technical Reports Server (NTRS)

A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even one bit change in the image hash will cause the image hash to be totally different from the secure hash.
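The sign-then-verify scheme described in this patent abstract corresponds to standard public-key digital signatures over an image file's hash. The following hedged Python sketch shows the idea with the widely used `cryptography` library; the RSA-PSS/SHA-256 choices and the placeholder image bytes are assumptions for illustration, not details of the patented camera.

```python
# Hedged sketch: sign an image file's hash with a private key and verify it
# later with the matching public key. Verification fails if the file changes.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

image_bytes = b"raw image file contents"   # in practice: the stored image file

pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(image_bytes, pss, hashes.SHA256())

# Raises cryptography.exceptions.InvalidSignature if the image was altered.
public_key.verify(signature, image_bytes, pss, hashes.SHA256())
```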

Friedman, Gary L. (inventor)

1993-01-01

98

Dual camera system for acquisition of high resolution images  

NASA Astrophysics Data System (ADS)

Video surveillance is ubiquitous in modern society, but surveillance cameras are severely limited in utility by their low resolution. With this in mind, we have developed a system that can autonomously take high resolution still frame images of moving objects. In order to do this, we combine a low resolution video camera and a high resolution still frame camera mounted on a pan/tilt mount. In order to determine what should be photographed (objects of interest), we employ a hierarchical method which first separates foreground from background using a temporal-based median filtering technique. We then use a feed-forward neural network classifier on the foreground regions to determine whether the regions contain the objects of interest. This is done over several frames, and a motion vector is deduced for the object. The pan/tilt mount then focuses the high resolution camera on the next predicted location of the object, and an image is acquired. All components are controlled through a single MATLAB graphical user interface (GUI). The final system we present is able to detect multiple moving objects simultaneously, track them, and acquire high resolution images of them. Results demonstrate tracking and imaging performance for varying numbers of objects moving at different speeds.
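As a minimal illustration of the temporal-median foreground step mentioned above, the hedged Python sketch below thresholds the difference between the newest frame and the median of recent frames; the threshold value is illustrative and the neural-network classification stage is not reproduced.

```python
# Hedged sketch: temporal-median background model and foreground mask.
import numpy as np

def foreground_mask(frames, current, threshold=25):
    """frames: (N, H, W) recent grayscale frames; current: (H, W) newest frame."""
    background = np.median(frames, axis=0)                  # temporal median model
    return np.abs(current.astype(int) - background) > threshold
```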

Papon, Jeremie A.; Broussard, Randy P.; Ives, Robert W.

2007-02-01

99

Mining Images in Publicly-Available Cameras for Homeland Security  

Microsoft Academic Search

A dramatic increase or decrease in the number of people appearing at a location can be an indicator that something has happened that may be of interest to law-enforcement, public health, or security. This work demonstrates how low quality camera images can be used to automatically alert when an unusual number of people are absent or present at a location.

Latanya Sweeney; Ralph Gross

100

Compact Gamma Camera System for Breast Cancer Imaging.  

National Technical Information Service (NTIS)

The goal of this project is the development of a compact solid-state gamma camera specifically designed to image metabolically active tumors in the breast and axillary nodes with the highest possible detection efficiency and spatial resolution. We have de...

S. Derenzo

2000-01-01

101

Meaningful performance evaluation conditions for fire service thermal imaging cameras  

Microsoft Academic Search

Thermal imaging cameras (TIC) are rapidly becoming integral equipment for the fire service for use in structure fires and other emergencies. The Building and Fire Research Laboratory (BFRL) at the National Institute of Standards and Technology has conducted research to establish test conditions that best represent the environment in which TIC are used. Firefighters may use TIC for field operations

Francine Amon; Anthony Hamins; Nelson Bryner; Justin Rowe

2008-01-01

102

Uncooled detector, optics, and camera development for THz imaging  

Microsoft Academic Search

A prototype THz imaging system based on modified uncooled microbolometer detector arrays, INO MIMICII camera electronics, and a custom f\\/1 THz optics has been assembled. A variety of new detector layouts and architectures have been designed; the detector THz absorption was optimized via several methods including integration of thin film metallic absorbers, thick film gold black absorbers, and antenna structures.

Timothy Pope; Michel Doucet; Fabien Dupont; Linda Marchese; Bruno Tremblay; Georges Baldenberger; Sonia Verrault; Frédéric Lamontagne

2009-01-01

103

Cooperative Detection applied to THz Imaging Cameras  

Microsoft Academic Search

THz technology for developing imaging systems has recently aroused great interest, mainly due to the large number of applications in which these frequencies can be used: security, vision in hard environments, etc. The technological difficulty of producing a large number of detectors at these frequencies proves to be a serious constraint that limits the possibilities of the whole system.

J. Lizarraga; D. Betancourt; C. Del Río

104

Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera  

NASA Astrophysics Data System (ADS)

The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras provide unprecedented data on lunar craters. Stereo image pairs are being acquired for detailed topographic mapping. These data allow comparisons of relative ages of the larger young craters, some of which are tied to absolute radiometric ages from Apollo-returned samples. These relative ages, the crater populations at small diameters, and details of crater morphology including ejecta and melt morphologies, allow better delineation of recent lunar history and the formation and modification of impact craters. Crater counts may also reveal differences in the formation and preservation of small diameter craters as a function of target material (e.g., unconsolidated regolith versus solid impact melt). One key question: Is the current cratering rate constant or does it fluctuate? We will constrain the very recent cratering rate (at 10-100 m diameter) by comparing LROC images with those taken by Apollo nearly 40 years ago to determine the number of new impact craters. The current cratering rate and an assumption of constant cratering rate over time may or may not correctly predict the number of craters superimposed over radiometrically-dated surfaces such as South Ray, Cone, and North Ray craters, which range from 2-50 Ma and are not saturated by 10-100 m craters. If the prediction fails with realistic consideration of errors, then the present-day cratering rate must be atypical. Secondary craters complicate this analysis, but the resolution and coverage of LROC enable improved recognition of secondary craters. Of particular interest for the youngest Copernican craters is the possibility of self-cratering. LROC is providing the image quality needed to classify small craters by state of degradation (i.e., relative age); concentrations of craters with uniform size and age indicate secondary formation. Figure caption: Portion of LROC image M103703826LE showing a sparsely-cratered pond of impact melt on the floor of farside Copernican crater Necho (4.95 S, 123.6 E).

McEwen, A. S.; Hiesinger, H.; Thomas, P. C.; Robinson, M. S.; van der Bogert, C.; Ostrach, L.; Plescia, J. B.; Bray, V. J.; Tornabene, L. L.

2009-12-01

105

Image-based camera motion estimation using prior probabilities  

NASA Astrophysics Data System (ADS)

Image-based camera motion estimation from video or still images is a difficult problem in the field of computer vision. Many algorithms have been proposed for estimating intrinsic camera parameters, detecting and matching features between images, calculating extrinsic camera parameters based on those features, and optimizing the recovered parameters with nonlinear methods. These steps in the camera motion inference process all face challenges in practical applications: locating distinctive features can be difficult in many types of scenes given the limited capabilities of current feature detectors, camera motion inference can easily fail in the presence of noise and outliers in the matched features, and the error surfaces in optimization typically contain many suboptimal local minima. The problems faced by these techniques are compounded when they are applied to medical video captured by an endoscope, which presents further challenges such as non-rigid scenery and severe barrel distortion of the images. In this paper, we study these problems and propose the use of prior probabilities to stabilize camera motion estimation for the application of computing endoscope motion sequences in colonoscopy. Colonoscopy presents a special case for camera motion estimation in which it is possible to characterize typical motion sequences of the endoscope. As the endoscope is restricted to move within a roughly tube-shaped structure, forward/backward motion is expected, with only small amounts of rotation and horizontal movement. We formulate a probabilistic model of endoscope motion by maneuvering an endoscope and attached magnetic tracker through a synthetic colon model and fitting a distribution to the observed motion of the magnetic tracker. This model enables us to estimate the probability of the current endoscope motion given previously observed motion in the sequence. We add these prior probabilities into the camera motion calculation as an additional penalty term in RANSAC to help reject improbable motion parameters caused by outliers and other problems with medical data. This paper presents the theoretical basis of our method along with preliminary results on indoor scenes and synthetic colon images.
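The prior-penalized model scoring described above can be sketched as follows. This is a hedged, simplified Python illustration of adding a motion-prior penalty to the cost used to rank RANSAC candidates; `motion_prior_logpdf` stands in for the distribution fitted to the tracked endoscope motion, and `lambda_prior` is an assumed weighting, neither taken from the paper.

```python
# Hedged sketch: RANSAC candidate score = data term + prior penalty.
import numpy as np

def score_candidate(inlier_residuals, motion_params, motion_prior_logpdf,
                    lambda_prior=1.0):
    data_cost = np.sum(inlier_residuals**2)           # reprojection error term
    prior_cost = -motion_prior_logpdf(motion_params)  # improbable motion penalized
    return data_cost + lambda_prior * prior_cost      # lower score is better
```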

Sargent, Dusty; Park, Sun Young; Spofford, Inbar; Vosburgh, Kirby

2011-03-01

106

Digital Camera with Apparatus for Authentication of Images Produced from an Image File  

NASA Technical Reports Server (NTRS)

A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.

Friedman, Gary L. (Inventor)

1996-01-01

107

Radiometric cloud imaging with an uncooled microbolometer thermal infrared camera.  

PubMed

An uncooled microbolometer-array thermal infrared camera has been incorporated into a remote sensing system for radiometric sky imaging. The radiometric calibration is validated and improved through direct comparison with spectrally integrated data from the Atmospheric Emitted Radiance Interferometer (AERI). With the improved calibration, the Infrared Cloud Imager (ICI) system routinely obtains sky images with radiometric uncertainty less than 0.5 W/(m^2 sr) for extended deployments in challenging field environments. We demonstrate the infrared cloud imaging technique with still and time-lapse imagery of clear and cloudy skies, including stratus, cirrus, and wave clouds. PMID:19498585

Shaw, Joseph; Nugent, Paul; Pust, Nathan; Thurairajah, Brentha; Mizutani, Kohei

2005-07-25

108

A virtual wiper - restoration of deteriorated images by using multiple cameras  

Microsoft Academic Search

In this paper, we propose a new method for the restoration of deteriorated images by using multiple cameras. In outdoor environments, it is often the case that scenes taken by the cameras are hard to see because of adherent noises on the surface of the lens-protecting glass of the cameras. Our proposed method analyses multiple camera images describing the same

Atushi Yamashitat; Masayuki KuramotoT; Toru Kaneko; Kenjiro T. Miurat

2003-01-01

109

A hybrid version of the Whipple observatory's air Cherenkov imaging camera for use in moonlight  

Microsoft Academic Search

A hybrid version of the Whipple Observatory's atmospheric Cherenkov imaging camera that permits observation during periods of bright moonlight is described. The hybrid camera combines a blue-light blocking filter with the standard Whipple imaging camera to reduce sensitivity to wavelengths greater than 360 nm. Data taken with this camera are found to be free from the effects of the moonlit

M. C. Chantell; C. W. Akerlof; H. M. Badran; J. Buckley; D. A. Carter-Lewis; M. F. Cawley; V. Connaughton; D. J. Fegan; P. Fleury; J. Gaidos; A. M. Hillas; R. C. Lamb; E. Pare; H. J. Rose; A. C. Rovero; X. Sarazin; G. Sembroski; M. S. Schubnell; M. Urban; T. C. Weekes; C. Wilson

1997-01-01

110

Filtered backprojection reconstruction and redundancy in Compton camera imaging.  

PubMed

During the acquisition process with the Compton gamma-camera, integrals of the intensity distribution of the source over conical surfaces are measured. They represent the Compton projections of the intensity. The inversion of the Compton transform rests on a particular Fourier-slice theorem. This paper proposes a filtered backprojection algorithm for image reconstruction from planar Compton camera data. We show how different projections are related and how they may be combined in the tomographic reconstruction step. Considering a simulated Compton imaging system, we conclude that the proposed method yields accurate reconstructed images for simple sources. An elongation of the source in the direction orthogonal to the camera may be observed and is related to the truncation of the projections induced by the finite extent of the device. This phenomenon was previously observed with other reconstruction methods, e.g., iterative maximum likelihood expectation maximization. The redundancy of the Compton transform is thus an important feature for the reduction of noise in Compton images, since the ideal assumptions of infinite width and observation time are never met in practice. We show that a selection operated on the set of data makes it possible to partially get around projection truncation, at the expense of enhanced noise in the images. PMID:24196864
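For orientation, the hedged Python sketch below shows plain (unfiltered) cone backprojection for a Compton camera: each event votes for image points whose direction from the scatter position lies close to its Compton cone. The filtering step that gives the paper's algorithm its accuracy is deliberately not reproduced, and the event format and tolerance are assumptions.

```python
# Simplified cone backprojection for Compton events (no filtering step).
import numpy as np

def backproject(events, grid_xyz, tol=0.05):
    """events: iterable of (apex, axis_unit, cos_beta); grid_xyz: (N, 3) points."""
    image = np.zeros(len(grid_xyz))
    for apex, axis, cos_beta in events:
        d = grid_xyz - apex
        d /= np.linalg.norm(d, axis=1, keepdims=True)   # unit directions to voxels
        cos_theta = d @ axis                            # angle to the cone axis
        image += np.abs(cos_theta - cos_beta) < tol     # vote near the cone surface
    return image
```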

Maxim, Voichiţa

2014-01-01

111

Measuring SO2 ship emissions with an ultraviolet imaging camera  

NASA Astrophysics Data System (ADS)

Over the last few years fast-sampling ultraviolet (UV) imaging cameras have been developed for use in measuring SO2 emissions from industrial sources (e.g. power plants; typical emission rates ~ 1-10 kg s-1) and natural sources (e.g. volcanoes; typical emission rates ~ 10-100 kg s-1). Generally, measurements have been made from sources rich in SO2 with high concentrations and emission rates. In this work, for the first time, a UV camera has been used to measure the much lower concentrations and emission rates of SO2 (typical emission rates ~ 0.01-0.1 kg s-1) in the plumes from moving and stationary ships. Some innovations and trade-offs have been made so that estimates of the emission rates and path concentrations can be retrieved in real time. Field experiments were conducted at Kongsfjord in Ny Ålesund, Svalbard, where SO2 emissions from cruise ships were measured, and at the port of Rotterdam, Netherlands, where emissions from more than 10 different container and cargo ships were measured. In all cases SO2 path concentrations could be estimated and emission rates determined by measuring ship plume speeds simultaneously using the camera, or by using surface wind speed data from an independent source. Accuracies were compromised in some cases because of the presence of particulates in some ship emissions and the restriction of single-filter UV imagery, a requirement for fast sampling (> 10 Hz) from a single camera. Despite the ease of use and ability to determine SO2 emission rates from the UV camera system, the limitations in accuracy and precision suggest that the system may only be used under rather ideal circumstances and that currently the technology needs further development to serve as a method to monitor ship emissions for regulatory purposes. A dual-camera system or a single, dual-filter camera is required in order to properly correct for the effects of particulates in ship plumes.
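The emission-rate retrieval outlined above reduces to integrating calibrated column densities along a transect across the plume and multiplying by the plume speed. The hedged Python sketch below shows this arithmetic only; the pixel size, speed and calibration inputs are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: SO2 emission rate from a column-density transect.
import numpy as np

def so2_emission_rate(column_kg_m2, pixel_size_m, plume_speed_m_s):
    """column_kg_m2: 1-D SO2 column densities sampled along a plume cross-section."""
    integrated = np.sum(column_kg_m2) * pixel_size_m   # kg per metre of plume length
    return integrated * plume_speed_m_s                # kg/s
```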

Prata, A. J.

2014-05-01

112

Distributed imaging using an array of compressive cameras  

NASA Astrophysics Data System (ADS)

We describe a distributed computational imaging system that employs an array of feature specific sensors, also known as compressive imagers, to directly measure the linear projections of an object. Two different schemes for implementing these non-imaging sensors are discussed. We consider the task of object reconstruction and quantify the fidelity of reconstruction using the root mean squared error (RMSE) metric. We also study the lifetime of such a distributed sensor network. The sources of energy consumption in a distributed feature specific imaging (DFSI) system are discussed and compared with those in a distributed conventional imaging (DCI) system. A DFSI system consisting of 20 imagers collecting DCT, Hadamard, or PCA features has a lifetime of 4.8× that of the DCI system when the noise level is 20% and the reconstruction RMSE requirement is 6%. To validate the simulation results we emulate a distributed computational imaging system using an experimental setup consisting of an array of conventional cameras.
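To illustrate the feature-specific measurement and the RMSE figure of merit used above, here is a hedged Python sketch with a DCT feature basis and a simple least-squares reconstruction; the sensor architecture, noise model and PCA/Hadamard alternatives of the paper are not reproduced, and the sizes and noise level are assumptions.

```python
# Illustrative compressive (feature-specific) measurement and reconstruction.
import numpy as np
from scipy.fft import dct

n, m = 256, 64                                   # scene size, number of features
x = np.random.rand(n)                            # stand-in scene (one line)
Phi = dct(np.eye(n), axis=0, norm='ortho')[:m]   # first m DCT basis functions
y = Phi @ x + 0.01 * np.random.randn(m)          # noisy feature measurements
x_hat = np.linalg.pinv(Phi) @ y                  # minimum-norm reconstruction
rmse = np.sqrt(np.mean((x - x_hat)**2))          # fidelity metric used in the text
```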

Ke, Jun; Shankar, Premchandra; Neifeld, Mark A.

2009-01-01

113

Sub-100g uncooled thermal imaging camera design  

NASA Astrophysics Data System (ADS)

There are many applications for thermal imaging systems where low weight, high performance and high durability are at a premium. These include UAV systems, future warrior programs and thermal weapon sights. Thermal imaging camera design is restricted by a number of external constraints including detector packaging, detector performance and optical design. This paper describes how, by combining the latest 25 µm pitch detector technology, novel optical design and shutter-less image processing, a high-resolution imager with a system weight of 100 g can be achieved. Recently developed detectors have low-mass vacuum packages; in this example a 384x288, 25 µm pitch uncooled microbolometer has a weight of less than 25 g. By comparison, earlier 35 µm and 50 µm devices were in the region of 40 g. Where cameras are used in harsh environments, mechanical shutters present both a reliability issue and additional weight. The low-weight camera utilises Xti Shutter-less technology to generate high quality images without the need for any form of mechanical shutter. The resulting camera has no moving parts. Lenses for long wave infrared (LWIR) thermal imaging are typically manufactured using germanium (Ge) elements. These lenses tend to be designed with f/1.0 apertures and as a result add significant weight to the design. Thanks to the smaller detector pitch and system sensitivity, a lens has been designed with a focal length of 14.95 mm at f/1.3 where the mass of the optical components is 9 g. The final optical assembly, including passive athermalisation, has a mass of no more than 15 g.

Brown, Alistair

2008-10-01

114

Innovative Camera and Image Processing System to Characterize Cryospheric Changes  

NASA Astrophysics Data System (ADS)

The polar regions play an important role in Earth's climatic and geodynamic systems. Digital photogrammetric mapping provides a means for monitoring the dramatic changes observed in the polar regions during the past decades. High-resolution, photogrammetrically processed digital aerial imagery provides complementary information to surface measurements obtained by laser altimetry systems. While laser points accurately sample the ice surface, stereo images allow for the mapping of features, such as crevasses, flow bands, shear margins, moraines, leads, and different types of sea ice. Tracking features in repeat images produces a dense velocity vector field that can either serve as validation for interferometrically derived surface velocities or it constitutes a stand-alone product. A multi-modal, photogrammetric platform consists of one or more high-resolution, commercial color cameras, GPS and inertial navigation system as well as optional laser scanner. Such a system, using a Canon EOS-1DS Mark II camera, was first flown on the Icebridge missions Fall 2009 and Spring 2010, capturing hundreds of thousands of images at a frame rate of about one second. While digital images and videos have been used for quite some time for visual inspection, precise 3D measurements with low cost, commercial cameras require special photogrammetric treatment that only became available recently. Calibrating the multi-camera imaging system and geo-referencing the images are absolute prerequisites for all subsequent applications. Commercial cameras are inherently non-metric, that is, their sensor model is only approximately known. Since these cameras are not as rugged as photogrammetric cameras, the interior orientation also changes, due to temperature and pressure changes and aircraft vibration, resulting in large errors in 3D measurements. It is therefore necessary to calibrate the cameras frequently, at least whenever the system is newly installed. Geo-referencing the images is performed by the Applanix navigation system. Our new method enables a 3D reconstruction of ice sheet surface with high accuracy and unprecedented details, as it is demonstrated by examples from the Antarctic Peninsula, acquired by the IceBridge mission. Repeat digital imaging also provides data for determining surface elevation changes and velocities that are critical parameters for ice sheet models. Although these methods work well, there are known problems with satellite images and the traditional area-based matching, especially over rapidly changing outlet glaciers. To take full advantage of the high resolution, repeat stereo imaging we have developed a new method. The processing starts with the generation of a DEM from geo-referenced stereo images of the first time epoch. The next step is concerned with extracting and matching interest points in object space. Since an interest point moves its spatial position between two time epochs, such points are only radiometrically conjugate but not geometrically. In fact, the geometric displacement of two identical points, together with the time difference, renders velocities. We computed the evolution of the velocity field and surface topography on the floating tongue of the Jakobshavn glacier from historical stereo aerial photographs to illustrate the approach.

Schenk, A.; Csatho, B. M.; Nagarajan, S.

2010-12-01

115

Reflectance Calibration Scheme for Airborne Frame Camera Images  

NASA Astrophysics Data System (ADS)

The image quality of photogrammetric images is influenced by various effects from outside the camera. One effect is the scattered light from the atmosphere that lowers contrast in the images and creates a colour shift towards the blue. Another is the changing illumination during the day, which results in changing image brightness within an image block. In addition, there is the so-called bidirectional reflectance of the ground (BRDF effects) that gives rise to a view- and sun-angle dependent brightness gradient in the image itself. To correct for the first two effects an atmospheric correction with reflectance calibration is chosen. The effects have been corrected successfully for ADS linescan sensor data by using a parametrization of the atmospheric quantities. Following Kaufman et al., the actual atmospheric condition is estimated from the brightness of a dark pixel taken from the image. The BRDF effects are corrected using a semi-empirical modelling of the brightness gradient. Both methods are now extended to frame cameras. Linescan sensors have a viewing geometry that depends only on the cross-track view zenith angle. The difference for frame cameras is the need to include the extra dimension of the view azimuth in the modelling. Since both the atmospheric correction and the BRDF correction require a model inversion with the help of image data, a different image sampling strategy is necessary which includes the azimuth angle dependence. For the atmospheric correction a sixth variable is added to the existing five variables (visibility, view zenith angle, sun zenith angle, ground altitude, and flight altitude), thus multiplying the number of modelling input combinations for the offline inversion. The parametrization has to reflect the view azimuth angle dependence. The BRDF model already contains the view azimuth dependence and is combined with a new sampling strategy.

Beisl, U.

2012-07-01

116

ARNICA, the NICMOS 3 imaging camera of TIRGO.  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 µm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin × 4 arcmin on the NICMOS 3 (256×256 pixels, 40 µm side) detector array. The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames, and controls the timing of the array. The camera is intended for imaging of large extra-galactic and Galactic fields; a large effort has been dedicated to exploring the possibility of achieving precise photometric measurements in the J, H, K astronomical bands, with very promising results.

Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.

117

New high spatial resolution portable camera in medical imaging  

NASA Astrophysics Data System (ADS)

In recent years, many studies have been carried out on portable gamma cameras in order to optimize a device for medical imaging. In this paper, we present a new type of gamma camera for low-energy detection, based on a position-sensitive photomultiplier tube (Hamamatsu Flat Panel H8500) and an innovative technique based on CsI(Tl) scintillation crystals inserted into the square holes of a tungsten collimator. The geometrical features of this collimator-scintillator structure, which affect the camera spatial resolution and sensitivity, were chosen to offer optimal performance in clinical functional examinations. Detector sensitivity, energy resolution and spatial resolution were measured, and the acquired image quality was evaluated with particular attention to the pixel identification capability. This low-weight (about 2 kg) portable gamma camera was developed thanks to a miniaturized resistive chain electronic readout, combined with a dedicated compact 4-channel ADC board. This data acquisition board, designed by our research group, showed excellent performance, with respect to a commercial PCI 6110E card (National Instruments), in terms of sampling period and additional on-board operations for data pre-processing.

Trotta, C.; Massari, R.; Palermo, N.; Scopinaro, F.; Soluri, A.

2007-07-01

118

Missed Diagnostic Opportunities of Streak Camera Imaging in Tokamaks  

NASA Astrophysics Data System (ADS)

Streak camera imaging of tokamak plasmas is shown to provide diagnostic opportunities based on the existence of a long-lived (i.e., identifiable during time periods comparable with slow MHD times, e.g., the period of plasma toroidal rotation) fine structure of filaments (of luminosity) in plasmas. The skeletal structure of such filaments has been found [1] in visible light images taken with a streak camera in earlier experiments on the tokamaks TM-2, T-4 and T-6, with an effective time resolution of one microsecond. The above phenomenon enables us to suggest a new method of streak camera imaging which allows the determination of the time evolution of the direction and magnitude of the rotation velocity of these structures. This exploits the «blurring» of bright spots (within these filaments) in the toroidal and poloidal directions, for a wide enough slit. The above determination in a single discharge requires a special optical scheme and preliminary optimization, using only a few discharges, of the time-dependent velocity of pulling the exposed film (or its electronic analog). Examples are presented of evaluating the velocities from tokamak T-6 data in the ordinary scheme, which requires trying many discharges to optimize the imaging. [1] Proc. 27th EPS PPCF, Budapest, 2000 (http://sgi30.rmki.kfki.hu/EPS2000/P2_029.pdf).

Rantsev-Kartinov, Valentin A.

2002-11-01

119

Camera assembly design proposal for SRF cavity image collection  

SciTech Connect

This project seeks to collect images from the inside of a superconducting radio frequency (SRF) large grain niobium cavity during vertical testing. These images will provide information on multipacting and other phenomena occurring in the SRF cavity during these tests. Multipacting, a process that involves an electron buildup in the cavity and concurrent loss of RF power, is thought to be occurring near the cathode in the SRF structure. Images of electron emission in the structure will help diagnose the source of multipacting in the cavity. Multipacting sources may be eliminated with an alteration of geometric or resonant conditions in the SRF structure. Other phenomena, including unexplained light emissions previously discovered at SLAC, may be present in the cavity. In order to effectively capture images of these events during testing, a camera assembly needs to be installed to the bottom of the RF structure. The SRF assembly operates under extreme environmental conditions: it is kept in a dewar in a bath of 2K liquid helium during these tests, is pumped down to ultra-high vacuum, and is subjected to RF voltages. Because of this, the camera needs to exist as a separate assembly attached to the bottom of the cavity. The design of the camera is constrained by a number of factors that are discussed.

Tuozzolo, S.

2011-10-10

120

Engineering design criteria for an image intensifier/image converter camera  

NASA Technical Reports Server (NTRS)

The design, display, and evaluation of an image intensifier/image converter camera which can be utilized to meet various requirements of space shuttle experiments are described. An image intensifier tube was utilized in combination with two brassboards as a power supply and used for evaluation of night photography in the field. Pictures were obtained showing field details which would have been indistinguishable to the naked eye or to an ordinary camera.

Sharpsteen, J. T.; Lund, D. L.; Stoap, L. J.; Solheim, C. D.

1976-01-01

121

A Virtual Wiper-restoration of Deteriorated Images by using a Pan-tilt Camera  

Microsoft Academic Search

In this paper, we propose a new method that can remove view-disturbing waterdrops from images by processing images taken with a pan-tilt camera system. On rainy days, it is often the case that images taken by the camera are hard to see because of adherent waterdrops on the surface of the protecting glass of the camera. In our method, an

Atsushi Yamashita; Toru Kaneko; Kenjiro T. Miura

2004-01-01

122

Alternative Images for Perpendicular Parking: A Usability Study of a Multi-Camera Parking Assistance System  

Microsoft Academic Search

The parking assistance system evaluated consisted of four outward-facing cameras whose images could be presented on a monitor on the center console. The images presented varied in the location of the virtual eye point of the camera (the height above the vehicle and forward depression angle) and the number of camera images (one or multiple) appearing together.

Sean Michael Walls; John Amann; Brian Cullinane; Paul Green; Sujata Gadgil; Rachel Rubin

123

Camera head of the DSSC X-ray imager  

Microsoft Academic Search

A description of the camera-head electronics of the DSSC mega-pixel X-ray imager is given. The integration of 32 sensor chips, 256 ASICs, 112 PCBs, and 16 flex cables in the space of 241mm × 251mm × 224mm underlines its compactness. Prototypes of subassemblies behind the readout-ASIC layer are designed and fabricated. Measurement results of the main functionalities clock distribution, power

Karsten Hansen; Helmut Klar; Dirk Muntefering

2011-01-01

124

Applications for high-performance thermal imaging cameras  

Microsoft Academic Search

High-performance thermal imaging cameras based on indium antimonide (InSb) focal plane arrays (FPAs) offer excellent sensitivity in the midwave infrared band, notably in the 3-5 micron waveband. Noise levels below 20 mK enable detection of surface temperature differences of 0.1 °C, and the high-speed response of the InSb photodetectors enables the capture of events on time scales as short as

Austin A. Richards

2002-01-01

125

CCD Fingerprint Method-Identification of a Video Camera from Videotaped Images  

Microsoft Academic Search

We propose a new “CCD Fingerprint” method to identify a camcorder from videotaped images. This method enables individual camera identification, which does not mean camera-model identification. The clue to the individual camera identification is obtained from nonuniformity of dark currents on CCD chips, which causes a fixed pattern noise. The pattern is quite unique and intrinsic to an individual camera.
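The idea behind this fingerprinting can be sketched in a few lines: estimate a camera's fixed pattern noise by averaging dark frames, then correlate the noise residual of a questioned frame with each candidate camera's reference pattern. The Python below is a hedged simplification with assumed preprocessing; it is not the authors' exact procedure or threshold.

```python
# Hedged sketch: dark-frame fixed pattern noise estimate and correlation match.
import numpy as np

def fixed_pattern_noise(dark_frames):
    """dark_frames: (N, H, W) stack of dark exposures from one camera."""
    return np.mean(dark_frames, axis=0) - np.mean(dark_frames)

def match_score(residual, reference_fpn):
    """Normalized cross-correlation between a frame residual and a reference FPN."""
    a = residual - residual.mean()
    b = reference_fpn - reference_fpn.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```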

Kenji Kurosawa; Kenro Kuroki; Naoki Saitoh

1999-01-01

126

Gamma-ray Compton camera imaging with a segmented HPGe  

NASA Astrophysics Data System (ADS)

In this paper, we investigate our concept to develop a γ-ray Compton camera out of a single coaxial High Purity Germanium (HPGe) detector. The imaging properties of the HPGe can be realized by way of a segmented outer contact and digital pulse-shape analysis. Limiting factors in performance will be related to the intrinsic electron momentum in Ge and the noise in the preamplifier JFETs. In addition to discussing these issues, we will present experimental and theoretical imaging studies that we have done using an existing segmented HPGe: the GRETA prototype detector at LBNL.

Schmid, G. J.; Beckedahl, D. A.; Kammeraad, J. E.; Blair, J. J.; Vetter, K.; Kuhn, A.

2001-03-01

127

Parallel phase-sensitive three-dimensional imaging camera  

DOEpatents

An apparatus is disclosed for generating a three-dimensional (3-D) image of a scene illuminated by a pulsed light source (e.g. a laser or light-emitting diode). The apparatus, referred to as a phase-sensitive 3-D imaging camera utilizes a two-dimensional (2-D) array of photodetectors to receive light that is reflected or scattered from the scene and processes an electrical output signal from each photodetector in the 2-D array in parallel using multiple modulators, each having inputs of the photodetector output signal and a reference signal, with the reference signal provided to each modulator having a different phase delay. The output from each modulator is provided to a computational unit which can be used to generate intensity and range information for use in generating a 3-D image of the scene. The 3-D camera is capable of generating a 3-D image using a single pulse of light, or alternately can be used to generate subsequent 3-D images with each additional pulse of light.
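The phase-delayed reference channels described in this patent abstract are closely related to multi-phase demodulation in indirect time-of-flight imaging. As a hedged illustration only (not the patented implementation), the Python sketch below recovers per-pixel range from four correlation samples taken with reference delays of 0, 90, 180 and 270 degrees; the modulation frequency is an assumed example value.

```python
# Illustrative four-phase demodulation: range from phase of the returned light.
import numpy as np

C = 299_792_458.0            # speed of light, m/s
F_MOD = 20e6                 # assumed modulation frequency, Hz

def range_from_correlations(c0, c90, c180, c270):
    """cX: per-pixel correlation with the reference delayed by X degrees."""
    phase = np.arctan2(c270 - c90, c0 - c180)        # in [-pi, pi]
    phase = np.mod(phase, 2 * np.pi)
    return (C * phase) / (4 * np.pi * F_MOD)         # range within one ambiguity interval
```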

Smithpeter, Colin L. (Albuquerque, NM); Hoover, Eddie R. (Sandia Park, NM); Pain, Bedabrata (Los Angeles, CA); Hancock, Bruce R. (Altadena, CA); Nellums, Robert O. (Albuquerque, NM)

2007-09-25

128

The Atlases of Vesta derived from Dawn Framing Camera images  

NASA Astrophysics Data System (ADS)

The Dawn Framing Camera acquired during its two HAMO (High Altitude Mapping Orbit) phases in 2011 and 2012 about 6,000 clear filter images with a resolution of about 60 m/pixel. We combined these images into a global ortho-rectified mosaic of Vesta (60 m/pixel resolution). Only very small areas near the northern pole were still in darkness and are missing in the mosaic. The Dawn Framing Camera also acquired about 10,000 high-resolution clear filter images (about 20 m/pixel) of Vesta during its Low Altitude Mapping Orbit (LAMO). Unfortunately, the northern part of Vesta was still in darkness during this phase; good illumination (incidence angle < 70°) was only available for 66.8% of the surface [1]. We used the LAMO images to calculate another global mosaic of Vesta, this time with 20 m/pixel resolution. Both global mosaics were used to produce atlases of Vesta: a HAMO atlas with 15 tiles at a scale of 1:500,000 and a LAMO atlas with 30 tiles at a scale between 1:200,000 and 1:225,180. The nomenclature used in these atlases is based on names and places historically associated with the Roman goddess Vesta, and is compliant with the rules of the IAU. 65 names for geological features were already approved by the IAU; 39 additional names are currently under review. Selected examples of both atlases will be shown in this presentation. Reference: [1] Roatsch, Th., et al., High-resolution Vesta Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images. Planetary and Space Science (2013), http://dx.doi.org/10.1016/j.pss.2013.06.024i

Roatsch, T.; Kersten, E.; Matz, K.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2013-12-01

129

Imaging volcanic SO2 plumes with UV cameras  

NASA Astrophysics Data System (ADS)

Ultraviolet (UV) cameras allow the two-dimensional imaging of SO2 distributions at temporal resolutions on the order of 1Hz. Optical bandpass filters that selectively transmit only UV wavelengths at which SO2 absorption occurs (or, for reference, wavelengths at which absorption is negligible) are positioned in the camera's optical system, thus providing selective sensitivity to SO2. As SO2 is one of the main volatile species associated with high-temperature volcanic degassing, UV camera systems are increasingly being applied to volcanic environments for monitoring and research purposes. The relatively high frame rate of these cameras allows the retrieval of SO2 emission fluxes on time scales comparable to those on which other geophysical parameters (e.g. seismicity, deformation) are recorded, thus making an integrated evaluation possible. While impressive imagery of a volcanic plume's extent is readily obtained, lingering challenges include obtaining an accurate SO2 column density calibration, correcting for complex radiative transfer in and around volcanic plumes (which are often visually opaque), and deriving accurate flux measurements truly representative of volcanic activity and not overly biased by atmospheric turbulence and other secondary effects. Here, these issues are addressed, innovative solutions are presented, and example measurements from Kilauea Volcano (Hawaii) are shown. Although time-averaged SO2 emission rates were typically below 10 kg/s (~900 t/d) during measurements in September and October 2011, the high-resolution camera measurements revealed short-period (order of seconds) peaks of more than twice that value. Measurements were taken from different locations between 2 and 7 km distance from the summit vent, thus giving different perspectives of the gas plume. Where possible, links between SO2 emissions and other monitored parameters are identified and interpreted in regard to their connection to physical processes occurring in the volcanic system.

Kern, C.; Werner, C. A.; Doukas, M. P.; Elias, T.; Kelly, P. J.; Sutton, A. J.

2012-04-01

130

Imaging of Venus from Galileo: Early results and camera performance  

USGS Publications Warehouse

Three images of Venus have been returned so far by the Galileo spacecraft following an encounter with the planet on UT February 10, 1990. The images, taken at effective wavelengths of 4200 and 9900 Å, characterize the global motions and distribution of haze near the Venus cloud tops and, at the latter wavelength, deep within the main cloud. Previously undetected markings are clearly seen in the near-infrared image. The global distribution of these features, which have maximum contrasts of 3%, is different from that recorded at short wavelengths. In particular, the "polar collar," which is omnipresent in short wavelength images, is absent at 9900 Å. The maximum contrast in the features at 4200 Å is about 20%. The optical performance of the camera is described and is judged to be nominal. © 1992.

Belton, M. J. S.; Gierasch, P.; Klaasen, K. P.; Anger, C. D.; Carr, M. H.; Chapman, C. R.; Davies, M. E.; Greeley, R.; Greenberg, R.; Head, J. W.; Neukum, G.; Pilcher, C. B.; Veverka, J.; Fanale, F. P.; Ingersoll, A. P.; Pollock, J. B.; Morrison, D.; Clary, M. C.; Cunningham, W.; Breneman, H.

1992-01-01

131

The optical imaging camera for the NAYUTA telescope I. Design  

NASA Astrophysics Data System (ADS)

An optical imaging camera is very important for Nishi-Harima Astronomical Observatory (NHAO) as a public observatory, because optical images are easily understood and attractive to visitors. NHAO aims at integrating research and education, so this instrument must also be capable of scientific observations. To obtain impressive images with a wide field of view, the camera has a focal reducer converting the F-number from F/12 to F/4.8 and a large-format (2k×2k) CCD. The field of view is about 10 arcminutes. Because the NHAO site has good seeing for Japan, care was taken in the optical design not to degrade image quality.

Ozaki, Shinobu

132

CMOS image sensor noise reduction method for image signal processor in digital cameras and camera phones  

NASA Astrophysics Data System (ADS)

Digital images captured by CMOS image sensors suffer from Gaussian noise and impulsive noise. To efficiently reduce the noise in the Image Signal Processor (ISP), we analyze the noise characteristics of the ISP imaging pipeline where the noise reduction algorithm is performed. Gaussian noise reduction and impulsive noise reduction methods are proposed for proper ISP implementation in the Bayer domain. The proposed method takes advantage of the analyzed noise characteristics to calculate the noise reduction filter coefficients, so noise is adaptively reduced according to the scene environment. Since noise is amplified and its characteristics change as the image sensor signal undergoes several image processing steps, it is better to remove noise at an early stage of the ISP imaging pipeline. Thus, noise reduction is carried out in the Bayer domain on the imaging pipeline of the ISP. The method was tested on the ISP imaging pipeline with images captured by a Samsung 2M CMOS image sensor test module. The experimental results show that the proposed method removes noise while effectively preserving edges.
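In the spirit of the Bayer-domain, noise-adaptive filtering described above, the following hedged Python sketch smooths each Bayer colour plane separately, blending between the original and a smoothed version according to a simple signal-dependent noise estimate. The noise model and weights are assumptions for illustration, not the authors' filter.

```python
# Hedged sketch: per-channel, noise-adaptive smoothing of a raw Bayer image.
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_bayer(raw, sigma_read=2.0, gain=0.5):
    out = np.empty_like(raw, dtype=float)
    for dy in (0, 1):
        for dx in (0, 1):
            plane = raw[dy::2, dx::2].astype(float)          # one Bayer channel
            noise_sigma = np.sqrt(sigma_read**2 + gain * plane.mean())
            smooth = gaussian_filter(plane, sigma=1.0)
            # Large deviations (likely edges) keep the original value.
            weight = np.clip(np.abs(plane - smooth) / (3 * noise_sigma), 0, 1)
            out[dy::2, dx::2] = weight * plane + (1 - weight) * smooth
    return out
```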

Yoo, Youngjin; Lee, SeongDeok; Choe, Wonhee; Kim, Chang-Yong

2007-03-01

133

Focused image recovery from two defocused images recorded with different camera settings.  

PubMed

Two new methods are presented for recovering the focused image of an object from only two blurred images recorded with different camera parameter settings. The camera parameters include lens position, focal length, and aperture diameter. First a blur parameter sigma is estimated using one of our proposed depth-from-defocus methods. Then one of the two blurred images is deconvolved to recover the focused image. The first method is based on a spatial domain convolution/deconvolution transform. This method requires only the knowledge of sigma of the camera's point spread function (PSF). It does not require information about the actual form of the camera's PSF. The second method, in contrast to the first, requires full knowledge of the form of the PSF. As part of the second method, we present a calibration procedure for estimating the camera's PSF for different values of the blur parameter sigma. In the second method, the focused image is obtained through deconvolution in the Fourier domain using a Wiener filter. For both methods, the results of experiments on actual defocused images recorded by a CCD camera are given. The first method requires much less computation than the second method. The first method gives satisfactory results for up to medium levels of blur and the second method gives good results for up to relatively high levels of blur. PMID:18291993
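The recovery step of the second method described above (deconvolution with a Wiener filter once the blur parameter sigma is known) can be sketched as follows. This is a hedged Python illustration that assumes a Gaussian PSF of width sigma and an image normalized to [0, 1]; the depth-from-defocus estimation of sigma itself and the calibrated PSF of the paper are not reproduced.

```python
# Hedged sketch: Wiener deconvolution of one blurred image with a Gaussian PSF.
import numpy as np
from skimage.restoration import wiener

def gaussian_psf(sigma, size=15):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def recover_focused(blurred, sigma, balance=0.01):
    """blurred: 2-D float image in [0, 1]; balance: regularization weight."""
    return wiener(blurred, gaussian_psf(sigma), balance)
```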

Subbarao, M; Wei, T C; Surya, G

1995-01-01

134

Image-intensifier camera studies of shocked metal surfaces  

SciTech Connect

A high-space-resolution image-intensifier camera with luminance gain of up to 5000 and exposure times as short as 30 ns has been applied to the study of the interaction of posts and welds with strongly shocked metal surfaces, which included super-strong steels. The time evolution of a single experiment can be recorded by multiple pulsing of the camera. Phenomena that remain coherent for relatively long durations have been observed. An important feature of the hydrodynamic flow resulting from post-plate interactions is the creation of a wave that propagates outward on the plate; the flow blocks the explosive product gases from escaping through the plate for greater than 10 µs. Electron beam welds were ineffective in blocking product gases from escaping for even short periods of time.

Engelke, R.P.; Thurston, R.S.

1986-01-01

135

Uncooled detector, optics, and camera development for THz imaging  

NASA Astrophysics Data System (ADS)

A prototype THz imaging system based on modified uncooled microbolometer detector arrays, INO MIMICII camera electronics, and a custom f/1 THz optics has been assembled. A variety of new detector layouts and architectures have been designed; the detector THz absorption was optimized via several methods including integration of thin film metallic absorbers, thick film gold black absorbers, and antenna structures. The custom f/1 THz optics is based on high resistivity floatzone silicon with parylene anti-reflection coating matched to the wavelength region of interest. The integrated detector, camera electronics, and optics are combined with a 3 THz quantum cascade laser for initial testing and evaluation. Future work will include the integration of fully optimized detectors and packaging and the evaluation of the achievable NEP with an eye to future applications such as industrial inspection and stand-off detection.

Pope, Timothy; Doucet, Michel; Dupont, Fabien; Marchese, Linda; Tremblay, Bruno; Baldenberger, Georges; Verrault, Sonia; Lamontagne, Frédéric

2009-05-01

136

Image enhancement for fluid lens camera based on color correlation.  

PubMed

The novel field of fluid lens cameras introduces unique image processing challenges. Intended for surgical applications, these fluid optics systems have a number of advantages over traditional glass lens systems. These advantages include improved miniaturization and no moving parts while zooming. However, the liquid medium creates two forms of image degradation: image distortion, which warps the image such that straight lines appear curved, and nonuniform color blur, which degrades the image such that certain color planes appear sharper than others. We propose the use of image processing techniques to reduce these degradations. To deal with image warping, we employ a conventional method that models the warping process as a degree-six polynomial in order to invert the effect. For image blur, we propose an adapted perfect reconstruction filter bank that uses high frequency sub-bands of sharp color planes to improve blurred color planes. The algorithm adjusts the number of levels in the decomposition and alters a prefilter based on crude knowledge of the blurring channel characteristics. While this paper primarily considers the use of a sharp green color plane to improve a blurred blue color plane, these methods can be applied to improve the red color plane as well, or more generally adapted to any system with high edge correlation between two images. PMID:19278917
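The core idea of borrowing detail from the sharp green plane can be illustrated very simply. The hedged Python sketch below adds a Gaussian high-pass of the green channel to the blurred blue channel; the paper's adapted perfect-reconstruction filter bank, prefilter adjustment and multi-level decomposition are deliberately not reproduced, and the sigma/strength values are assumptions.

```python
# Simplified sketch: transfer high-frequency detail from green to blue.
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen_blue_from_green(blue, green, sigma=2.0, strength=1.0):
    detail = green.astype(float) - gaussian_filter(green.astype(float), sigma)
    return np.clip(blue.astype(float) + strength * detail, 0, 255)
```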

Tzeng, Jack; Nguyen, Truong Q

2009-04-01

137

Single-quantum dot imaging with a photon counting camera  

PubMed Central

The expanding spectrum of applications of single-molecule fluorescence imaging ranges from fundamental in vitro studies of biomolecular activity to tracking of receptors in live cells. The success of these assays has relied on progress in organic and non-organic fluorescent probe development as well as improvements in the sensitivity of light detectors. We describe a new type of detector developed with the specific goal of ultra-sensitive single-molecule imaging. It is a wide-field, photon-counting detector providing high temporal and high spatial resolution information for each incoming photon. It can be used as a standard low-light level camera, but also allows access to much more information, such as fluorescence lifetime and spatio-temporal correlations. We illustrate the single-molecule imaging performance of our current prototype using quantum dots and discuss on-going and future developments of this detector.

Michalet, X.; Colyer, R. A.; Antelman, J.; Siegmund, O.H.W.; Tremsin, A.; Vallerga, J.V.; Weiss, S.

2010-01-01

138

High-contrast imaging with a self-coherent camera  

NASA Astrophysics Data System (ADS)

Direct imaging of exoplanets is very challenging because the planet is 10^4 to 10^10 times fainter than the star at a separation of a fraction of an arcsec. Several coronagraphs have been proposed to reduce the contrast ratio but their performance strongly depends on the level of phase and amplitude aberrations that induce speckles in the science image. An active control of the aberrations and a posteriori calibration are thus required to reach very high contrasts. Classical adaptive optics are not sufficient for this purpose because of non-common path aberrations. Our team proposed a self-coherent camera that spatially modulates the speckles in the science image. It is then possible to both actively control a deformable mirror and calibrate the residuals a posteriori. The current paper is an overview of the developments we have been working on for 7 years. We present the principle of the self-coherent camera, laboratory performance obtained in monochromatic light, and upgrades of the technique to make it achromatic.

Galicher, Raphaël.; Mazoyer, Johan; Baudoz, Pierre; Rousset, Gérard

2013-09-01

139

On image sensor dynamic range utilized by security cameras  

NASA Astrophysics Data System (ADS)

The dynamic range is an important quantity used to describe an image sensor. Wide/High/Extended dynamic range is often brought forward as an important feature to compare one device to another. The dynamic range of an image sensor is normally given as a single number, which is often insufficient since a single number will not fully describe the dynamic capabilities of the sensor. A camera is ideally based on a sensor that can cope with the dynamic range of the scene. Otherwise it has to sacrifice some part of the available data. For a security camera the latter may be critical since important objects might be hidden in the sacrificed part of the scene. In this paper we compare the dynamic capabilities of some image sensors utilizing a visual tool. The comparison is based on the use case, common in surveillance, where low contrast objects may appear in any part of a scene that, through its uneven illumination, spans a high dynamic range. The investigation is based on real sensor data that has been measured in our lab, and a synthetic test scene is used to mimic the low contrast objects. With this technique it is possible to compare sensors with different intrinsic dynamic properties as well as some capture techniques used to create an effect of increased dynamic range.

Johannesson, Anders

2012-02-01

140

ARNICA: the Arcetri Observatory NICMOS3 imaging camera  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1" per pixel, with sky coverage of more than 4' X 4' on the NICMOS 3 (256 X 256 pixels, 40 micrometers side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature is 76 K. The camera is remotely controlled by a 486 PC, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the 486 PC, acquires and stores the frames, and controls the timing of the array. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some details on the main parameters of the NICMOS 3 detector.

Lisi, Franco; Baffa, Carlo; Hunt, Leslie K.

1993-10-01

141

The Herschel/PACS 2560 bolometers imaging camera  

NASA Astrophysics Data System (ADS)

The development program of the flight model imaging camera for the PACS instrument on-board the Herschel spacecraft is nearing completion. This camera has two channels covering the 60 to 210 microns wavelength range. The focal plane of the short wavelength channel is made of a mosaic of 2×4 3-sides buttable bolometer arrays (16×16 pixels each) for a total of 2048 pixels, while the long wavelength channel has a mosaic of 2 of the same bolometer arrays for a total of 512 pixels. The 10 arrays have been fabricated, individually tested and integrated in the photometer. They represent the first filled arrays of fully collectively built bolometers with a cold multiplexed readout, allowing for a properly sampled coverage of the full instrument field of view. The camera has been fully characterized and the ground calibration campaign will take place after its delivery to the PACS consortium in mid 2006. The bolometers, working at a temperature of 300 mK, have a NEP close to the BLIP limit and an optical bandwidth of 4 to 5 Hz that will permit the mapping of large sky areas. This paper briefly presents the concept and technology of the detectors as well as the cryocooler and the warm electronics. Then we focus on the performances of the integrated focal planes (responsivity, NEP, low frequency noise, bandwidth).

Billot, Nicolas; Agnèse, Patrick; Auguères, Jean-Louis; Béguin, Alain; Bouère, André; Boulade, Olivier; Cara, Christophe; Cloué, Christelle; Doumayrou, Eric; Duband, Lionel; Horeau, Benoît; le Mer, Isabelle; Lepennec, Jean; Martignac, Jérome; Okumura, Koryo; Revéret, Vincent; Sauvage, Marc; Simoens, François; Vigroux, Laurent

2006-07-01

142

LROC WAC 100 Meter Scale Photometrically Normalized Map of the Moon  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) monthly global observations allowed derivation of a robust empirical photometric solution over a broad range of incidence, emission and phase (i, e, g) angles. Combining the WAC stereo-based GLD100 [1] digital terrain model (DTM) and LOLA polar DTMs [2] enabled precise topographic corrections to photometric angles. Over 100,000 WAC observations at 643 nm were calibrated to reflectance (I/F). Photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. The 6-dimensional data set was then reduced to 3 dimensions by photometrically normalizing I/F with a global solution similar to [3]. The global solution was calculated from three 2°x2° tiles centered on (1°N, 147°E), (45°N, 147°E), and (89°N, 147°E), and included over 40 million WAC pixels. A least squares fit to a multivariate polynomial of degree 4 (f(i,e,g)) was performed, and the result was the starting point for a minimum search solving the non-linear function min[{1 - [I/F / f(i,e,g)]}^2]. The input pixels were filtered to incidence angles (calculated from topography) < 89° and I/F greater than a minimum threshold to avoid shadowed pixels, and the output normalized I/F values were gridded into an equal-area map projection at 100 meters/pixel. At each grid location the median, standard deviation, and count of valid pixels were recorded. The normalized reflectance map is the result of the median of all normalized WAC pixels overlapping that specific 100-m grid cell. There is an average of 86 WAC normalized I/F estimates at each cell [3]. The resulting photometrically normalized mosaic provides the means to accurately compare I/F values for different regions on the Moon (see Nuno et al. [4]). The subtle differences in normalized I/F can now be traced across the local topography at regions that are illuminated at any point during the LRO mission (while the WAC was imaging), including at polar latitudes. This continuous map of reflectance at 643 nm, normalized to a standard geometry of i=30, e=0, g=30, ranges from 0.036 to 0.36 (0.01%-99.99% of the histogram) with a global mean reflectance of 0.115. Immature rays of Copernican craters are typically >0.14 and maria are typically <0.07 with averages for individual maria ranging from 0.046 to 0.060. The materials with the lowest normalized reflectance on the Moon are pyroclastic deposits at Sinus Aestuum (<0.036) and those with the highest normalized reflectance are found on steep crater walls (>0.36)[4]. 1. Scholten et al. (2012) J. Geophys. Res., 117, doi: 10.1029/2011JE003926. 2. Smith et al. (2010), Geophys. Res. Lett., 37, L18204, doi:10.1029/2010GL043751. 3. Boyd et al. (2012) LPSC XLIII, #2795 4. Nuno et al. AGU, (this conference)
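
A minimal sketch of the normalization step described above, assuming a least-squares fit of I/F to a degree-4 polynomial in (i, e, g) followed by scaling to the standard geometry i=30°, e=0°, g=30°. The function names are ours, the subsequent non-linear minimum search is omitted, and no claim is made that this matches the LROC team's actual implementation.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_terms(i, e, g, degree=4):
    """All monomials of (i, e, g), in degrees, up to the given total degree (constant included)."""
    variables = [np.ones_like(i), i, e, g]
    cols = []
    for d in range(degree + 1):
        for combo in combinations_with_replacement([1, 2, 3], d):
            term = np.ones_like(i)
            for idx in combo:
                term = term * variables[idx]
            cols.append(term)
    return np.column_stack(cols)

def fit_photometric_model(i, e, g, i_over_f, degree=4):
    """Linear least-squares fit of I/F as a degree-4 polynomial f(i, e, g)."""
    A = poly_terms(i, e, g, degree)
    coeffs, *_ = np.linalg.lstsq(A, i_over_f, rcond=None)
    return coeffs

def normalize_reflectance(i_over_f, i, e, g, coeffs, degree=4):
    """Scale observed I/F to the standard geometry i=30, e=0, g=30 degrees."""
    f_obs = poly_terms(i, e, g, degree) @ coeffs
    f_std = poly_terms(np.array([30.0]), np.array([0.0]), np.array([30.0]), degree) @ coeffs
    return i_over_f / f_obs * f_std[0]
```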

Boyd, A. K.; Nuno, R. G.; Robinson, M. S.; Denevi, B. W.; Hapke, B. W.

2013-12-01

143

Image reconstruction methods for the PBX-M pinhole camera  

SciTech Connect

This paper describes two methods which have been used to reconstruct the soft x-ray emission profile of the PBX-M tokamak from the projected images recorded by the PBX-M pinhole camera. Both methods must accurately represent the shape of the reconstructed profile while also providing a degree of immunity to noise in the data. The first method is a simple least squares fit to the data. This has the advantage of being fast and small, and thus easily implemented on the PDP-11 computer used to control the video digitizer for the pinhole camera. The second method involves the application of a maximum entropy algorithm to an overdetermined system. This has the advantage of allowing the use of a default profile. This profile contains additional knowledge about the plasma shape which can be obtained from equilibrium fits to the external magnetic measurements. Additionally the reconstruction is guaranteed positive, and the fit to the data can be relaxed by specifying both the amount and distribution of noise in the image. The algorithm described has the advantage of being considerably faster, for an overdetermined system, than the usual Lagrange multiplier approach to finding the maximum entropy solution. 13 refs., 24 figs.
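
The sketch below illustrates the first (least-squares) reconstruction method in its simplest form, with an optional non-negative variant that echoes the positivity property of the maximum-entropy approach. The geometry matrix A, which encodes the pinhole projection, is assumed to be built elsewhere; the maximum-entropy algorithm with a default profile is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

def reconstruct_least_squares(A, b):
    """Plain linear least-squares reconstruction.

    A : (n_pixels, n_profile_bins) matrix mapping an emission profile to the
        projected pinhole-camera image (construction of A not shown).
    b : measured image, flattened to length n_pixels.
    """
    x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
    return x

def reconstruct_nonnegative(A, b):
    """Variant enforcing a positive emission profile, loosely analogous to the
    positivity guarantee of the maximum-entropy method described above."""
    x, residual_norm = nnls(A, b)
    return x
```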

Holland, A.; Powell, E.T.; Fonck, R.J.

1990-03-01

144

A comparison of select image-compression algorithms for an electronic still camera  

NASA Technical Reports Server (NTRS)

This effort is a study of image-compression algorithms for an electronic still camera. An electronic still camera can record and transmit high-quality images without the use of film, because images are stored digitally in computer memory. However, high-resolution images contain an enormous amount of information, and will strain the camera's data-storage system. Image compression will allow more images to be stored in the camera's memory. For the electronic still camera, a compression algorithm that produces a reconstructed image of high fidelity is most important. Efficiency of the algorithm is the second priority. High fidelity and efficiency are more important than a high compression ratio. Several algorithms were chosen for this study and judged on fidelity, efficiency and compression ratio. The transform method appears to be the best choice. At present, the method is compressing images to a ratio of 5.3:1 and producing high-fidelity reconstructed images.

Nerheim, Rosalee

1989-01-01

145

Camera Canvas: Image Editing Software for People with Disabilities  

Microsoft Academic Search

We developed Camera Canvas, photo editing and picture drawing software for individuals who cannot use their hands to operate a computer mouse. Camera Canvas is designed for use with camera-based mouse-replacement interfaces that allow a user with severe motion impairments to control the mouse pointer by moving his or her head in front of a web camera. To make Camera

Christopher Kwan; Margrit Betke

146

Embedded image enhancement for high-throughput cameras  

NASA Astrophysics Data System (ADS)

This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
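
A compact sketch of the local contrast enhancement cascade named above (contrast-preserving gamma followed by single-scale Retinex). For simplicity the surround is estimated with a plain Gaussian rather than the paper's modified bilateral filter, so this version does not suppress halo artifacts; the blend weight and sigma are illustrative values only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(lum, sigma=80.0):
    """Single-scale Retinex on a luminance channel: log(image) - log(surround)."""
    lum = lum.astype(float) + 1.0           # avoid log(0)
    surround = gaussian_filter(lum, sigma)  # Gaussian surround (paper uses a modified bilateral filter)
    return np.log(lum) - np.log(surround)

def contrast_preserving_gamma(lum, gamma=0.6):
    """Simple global gamma applied to normalized luminance."""
    lum = lum.astype(float) / max(lum.max(), 1e-9)
    return lum ** gamma

def local_contrast_enhance(lum, weight=0.3):
    """Cascade gamma and Retinex, then blend; the weight is illustrative."""
    g = contrast_preserving_gamma(lum)
    r = single_scale_retinex(lum)
    r = (r - r.min()) / (r.max() - r.min() + 1e-9)  # rescale Retinex output to [0, 1]
    return np.clip((1.0 - weight) * g + weight * r, 0.0, 1.0)
```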

Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

2014-03-01

147

Measurement of Space Variant PSF for Restoring Degraded Images by Security Cameras  

Microsoft Academic Search

Images recorded by a security camera are often severely degraded due to a dirty lens or secular distortion of the recording system. To restore these images, full determination of the space-variant point spread function (PSF) is required. To measure the PSF, we used a liquid crystal display. We made some experiments to restore the images taken by a CCD camera with intentionally

Tadashi Ito; Y. Fujii; N. Ohta; S. Saitoh; T. Matsuura; T. Yamamoto

2006-01-01

148

Experimental and modeling studies of imaging with curvilinear electronic eye cameras.  

PubMed

Model calculations and experimental measurements of the imaging properties of planar, hemispherical, and elliptic parabolic electronic eye cameras are compared. Numerical methods for comprehensive full-field calculations of image formation are enabled by the use of computationally efficient modes. Quantitative agreement between these calculations and experimentally measured images of test patterns reveals advantages of curvilinear camera systems, and provides guidelines for future designs. PMID:21197013

Malyarchuk, Viktor; Jung, Inhwa; Rogers, John A; Shin, Gunchul; Ha, Jeong Sook

2010-12-20

149

Real-Time Vehicle License Plate Recognition on Road Images from Various Cameras  

Microsoft Academic Search

Vehicle information is very important for maintaining traffic order in today's complex traffic environments. Vehicle images are captured in various ways: by fixed cameras, movable cameras, and vehicle-mounted cameras. This paper proposes a new vehicle plate recognition mechanism that is essential for obtaining vehicle information. The proposed method is a real-time processing system using

Jae-Khun Chang

2010-01-01

150

An industrial camera for color image processing with mixed lighting conditions  

Microsoft Academic Search

An industrial camera is an important electronic device in machine vision, serving as an interface between computers and the real world. There is an increasing demand for high-speed, miniaturized, portable, low-power, and smart industrial cameras. In this paper, we design and develop such a camera for capturing and processing color images from

Wei Liao; Xianqing Tai; Gongyan Li; Xun Li

2010-01-01

151

Image Frequency Analysis for Testing of Fire Service Thermal Imaging Cameras  

Microsoft Academic Search

A growing number of first responders are purchasing infrared cameras (thermal imagers). Thermal imagers provide critical information for many firefighting operations: size up, tracking of fire growth, location of victims and egress routes, overhaul, etc. Although thermal imagers have proven to be useful tools for the fire service over the past 10–15 years, there is no existing comprehensive performance standard for

Francine Amon; Alfred Ducharme

2009-01-01

152

Image deblurring using the direction dependence of camera resolution  

NASA Astrophysics Data System (ADS)

The blurring that occurs in the lens of a camera tends to worsen in areas away from the on-axis region of the image. In addition, the degradation of the blurred image in an off-axis area exhibits directional dependence. Conventional methods have been known to use the Wiener filter or the Richardson-Lucy algorithm to mitigate the problem. These methods use the pre-defined point spread function (PSF) in the restoration process, thereby preventing an increase in the noise elements. However, the nonuniform degradation that depends on the direction is not improved even though the edges are emphasized by these conventional methods. In this paper, we analyze the directional dependence of resolution based on the modeling of an optical system using a blurred image. We propose a novel image deblurring method that employs a reverse filter based on optimizing the directional dependence coefficients of the regularization term in the maximum a posteriori probability (MAP) algorithm. We have improved the directional dependence of resolution by optimizing the weight coefficients of the direction in which the resolution is degraded.
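
The abstract's MAP formulation with direction-dependent regularization weights can be written, in generic notation of our own choosing (not the authors'), as

```latex
\hat{x} \;=\; \arg\min_{x}\; \lVert y - Hx \rVert^{2}
      \;+\; \lambda \sum_{\theta \in \Theta} w_{\theta}\,\lVert D_{\theta} x \rVert^{2}
```

where y is the observed blurred image, H the (field-position-dependent) blur operator, D_theta a finite-difference operator along direction theta, and w_theta the per-direction weights that are optimized so that directions with poorer lens resolution are regularized differently.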

Hirai, Yukio; Yoshikawa, Hiroyasu; Shimizu, Masayoshi

2013-03-01

153

X-ray fluorescence camera for biomedical imaging  

NASA Astrophysics Data System (ADS)

X-Ray Fluorescence (XRF) analysis is useful for measuring density distributions of contrast media in vivo. An XRF camera was developed to carry out mapping for iodine-based contrast media used in medical angiography. In this camera, objects are exposed by an x-ray beam formed using a 3.0-mm-diameter lead hole. Next, cerium K-series characteristic x-rays are absorbed effectively by iodine media in objects, and iodine fluorescences are produced from the objects. Iodine Kα fluorescences are selected out using a 58-μm-thick stannum filter and are detected by a cadmium telluride (CdTe) detector. Kα rays are discriminated out by a multichannel analyzer (MCA), and photon number is counted by a counter board (CB). The objects are moved and scanned using an x-y stage driven by a two-stage controller, and x-ray images obtained by iodine mapping are shown in a personal computer (PC) monitor. In particular, iodine fluorescences were produced from remanent iodine elements in a cancer region of a rabbit ear.

Matsukiyo, Hiroshi; Sato, Eiichi; Abderyim, Purkhet; Osawa, Akihiro; Enomoto, Toshiyuki; Watanabe, Manabu; Nagao, Jiro; Nomiya, Seiichiro; Sato, Koetsu; Ogawa, Akira; Sato, Shigehiro; Ichimaru, Toshio

2008-08-01

154

Distant Supernova Remnant Imaged by Chandra's High Resolution Camera  

NASA Astrophysics Data System (ADS)

The High Resolution Camera (HRC), one of the two X-ray cameras on NASA's Chandra X-ray Observatory, was placed into the focus for the first time on Monday, August 30. The first target was LMC X-1, a point-like source of X rays in the Large Magellanic Cloud. The Large Magellanic Cloud, a companion galaxy to the Milky Way, is 160,000 light years from Earth. After checking the focus with LMC X-1, Chandra observed N132D, a remnant of an exploded star in the Large Magellanic Cloud. "These were preliminary test observations," emphasized Dr. Stephen Murray, of the Harvard-Smithsonian Center for Astrophysics, principal investigator for the High Resolution Camera. "But we are very pleased with the results. All indications are that the HRC will produce X-ray images of unprecedented clarity." The N132D image shows a highly structured remnant, or shell, of 10-million-degree gas that is 80 light years across. Such a shell in the vicinity of the Sun would encompass more than fifty nearby stars. The amount of material in the N132D hot gas remnant is equal to that of 600 suns. The N132D supernova remnant appears to be colliding with a giant molecular cloud, which produces the brightening on the southern rim of the remnant. The molecular cloud, visible with a radio telescope, has the mass of 300,000 suns. The relatively weak x-radiation on the upper left shows that the shock wave is expanding into a less dense region on the edge of the molecular cloud. A number of small circular structures are visible in the central regions and a hint of a large circular loop can be seen in the upper part of the remnant. Whether the peculiar shape of the supernova remnant can be fully explained in terms of these effects, or whether they point to a peculiar cylindrically shaped explosion remains to be seen. "The image is so rich in structure that it will take a while to sort out what is really going on," Murray said. "It could be multiple supernovas, or absorbing clouds in the vicinity of the supernova." The unique capabilities of the HRC stem from the close match of its imaging capability to the focusing power of the mirrors. When used with the Chandra mirrors, the HRC will make images that reveal detail as small as one-half an arc second. This is equivalent to the ability to read a stop sign at a distance of twelve miles. The checkout period for the HRC will continue for the next few weeks, during which time the team expects to acquire images of other supernova remnants, star clusters, and starburst galaxies. To follow Chandra's progress, visit the Chandra News Web site at: http://chandra.harvard.edu AND http://chandra.nasa.gov NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra X-ray Observatory for NASA's Office of Space Science, NASA Headquarters, Washington, D.C. The Smithsonian Astrophysical Observatory's Chandra X-ray Center in Cambridge, Mass., manages the Chandra science program and controls the observatory for NASA. TRW Space and Electronics Group of Redondo Beach, Calif., leads the contractor team that built Chandra. High resolution digital versions of the X-ray image (300 dpi JPG, TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu/photo/0050/ or via links in: http://chandra.harvard.edu

1999-09-01

155

Color calibration of a CMOS digital camera for mobile imaging  

NASA Astrophysics Data System (ADS)

As white balance algorithms employed in mobile phone cameras become increasingly sophisticated by using, e.g., elaborate white-point estimation methods, a proper color calibration is necessary. Without such a calibration, the estimation of the light source for a given situation may go wrong, giving rise to large color errors. At the same time, the demands for efficiency in the production environment require the calibration to be as simple as possible. Thus it is important to find the correct balance between image quality and production efficiency requirements. The purpose of this work is to investigate camera color variations using a simple model where the sensor and IR filter are specified in detail. As input to the model, spectral data of the 24-color Macbeth Colorchecker was used. This data was combined with the spectral irradiance of mainly three different light sources: CIE A, D65 and F11. The sensor variations were determined from a very large population from which 6 corner samples were picked out for further analysis. Furthermore, a set of 100 IR filters were picked out and measured. The resulting images generated by the model were then analyzed in the CIELAB space and color errors were calculated using the ΔE94 metric. The results of the analysis show that the maximum deviations from the typical values are small enough to suggest that a white balance calibration is sufficient. Furthermore, it is also demonstrated that the color temperature dependence is small enough to justify the use of only one light source in a production environment.
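
For reference, the ΔE94 color-difference metric mentioned above can be computed from two CIELAB triplets as in the sketch below. This is the standard CIE94 formula with graphic-arts weighting constants, which may differ from the exact parameterization used in the paper.

```python
import numpy as np

def delta_e94(lab1, lab2, kL=1.0, K1=0.045, K2=0.015):
    """CIE94 color difference (graphic-arts weights) between two CIELAB triplets."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1, C2 = np.hypot(a1, b1), np.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    # Delta-H^2 can dip slightly below zero from rounding; clamp at zero.
    dH2 = max(da**2 + db**2 - dC**2, 0.0)
    SL, SC, SH = 1.0, 1.0 + K1 * C1, 1.0 + K2 * C1
    return np.sqrt((dL / (kL * SL))**2 + (dC / SC)**2 + dH2 / SH**2)

# e.g. delta_e94((52.1, 42.2, 20.5), (50.9, 43.0, 21.3))
```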

Eliasson, Henrik

2010-01-01

156

A Hybrid Camera for simultaneous imaging of gamma and optical photons  

NASA Astrophysics Data System (ADS)

We present a new concept for a medical imaging system, the Hybrid Mini Gamma Camera (HMGC). This combines an optical and a gamma-ray camera in a co-aligned configuration that offers high spatial resolution multi-modality imaging for superimposition of a scintigraphic image on an optical image. This configuration provides visual identification of the sites of localisation of radioactivity that would be especially suited to medical imaging. An extension of the new concept using two hybrid cameras (The StereoScope) offers the potential for stereoscopic imaging with depth estimation for a gamma emitting source.

Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

2012-06-01

157

Camera calibration with lens distortion using weakly localized image features  

Microsoft Academic Search

Camera calibration is a crucial problem for many industrial applications that incorporate visual sensing. In this paper, we present an approach to computing the intrinsic and extrinsic camera parameters taking into account radial lens distortion. The approach consists of directly searching for the camera parameters that best project 3D points of a calibration pattern onto intensity edges in a 2D

Moumen T. Ahmed; Elsayed E. Hemayed; Aly A. Farag

1999-01-01

158

Coded-aperture Compton camera for gamma-ray imaging  

NASA Astrophysics Data System (ADS)

This dissertation describes the development of a novel gamma-ray imaging system concept and presents results from Monte Carlo simulations of the new design. Current designs for large field-of-view gamma cameras suitable for homeland security applications implement either a coded aperture or a Compton scattering geometry to image a gamma-ray source. Both of these systems require large, expensive position-sensitive detectors in order to work effectively. By combining characteristics of both of these systems, a new design can be implemented that does not require such expensive detectors and that can be scaled down to a portable size. This new system has significant promise in homeland security, astronomy, botany and other fields, while future iterations may prove useful in medical imaging, other biological sciences and other areas, such as non-destructive testing. A proof-of-principle study of the new gamma-ray imaging system has been performed by Monte Carlo simulation. Various reconstruction methods have been explored and compared. General-Purpose Graphics-Processor-Unit (GPGPU) computation has also been incorporated. The resulting code is a primary design tool for exploring variables such as detector spacing, material selection and thickness and pixel geometry. The advancement of the system from a simple 1-dimensional simulation to a full 3-dimensional model is described. Methods of image reconstruction are discussed and results of simulations consisting of both a 4 x 4 and a 16 x 16 object space mesh have been presented. A discussion of the limitations and potential areas of further study is also presented.

Farber, Aaron M.

159

Applying and extending ISO/TC42 digital camera resolution standards to mobile imaging products  

NASA Astrophysics Data System (ADS)

There are no fundamental differences between today's mobile telephone cameras and consumer digital still cameras that suggest many existing ISO imaging performance standards do not apply. To the extent that they have lenses, color filter arrays, detectors, apertures, image processing, and are hand held, there really are no operational or architectural differences. Despite this, there are currently differences in the levels of imaging performance. These are driven by physical and economic constraints, and image-capture conditions. Several ISO standards for resolution, well established for consumer digital cameras, require care when applied to the current generation of cell phone cameras. In particular, accommodation of optical flare, shading non-uniformity and distortion are recommended. We offer proposals for the application of existing ISO imaging resolution performance standards to mobile imaging products, and suggestions for extending performance standards to the characteristic behavior of camera phones.

Williams, Don; Burns, Peter D.

2007-01-01

160

SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs  

Microsoft Academic Search

A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer

Roy N. Rumbaugh; Kevin Grealish; Tom Kacir; Barry Arsenault; Robert H. Murphy; Scott Miller

2003-01-01

161

Importance of Developing Image Restoration Techniques for Security Cameras under Severe Conditions  

Microsoft Academic Search

A concept, which was proposed and has been pursued by the authors, for restoring the images of a suspected person taken by a security camera is reviewed. In the concept, the peculiar feature of security camera systems that everything in the image except the suspected person is usually preserved and can be used for investigations

Y. Fujii; T. Ito; N. Ohta; S. Saitoh; T. Matsuura; T. Yamamoto

2006-01-01

162

Image Restoration for Security Cameras with Dirty Lens under Oblique Illumination  

Microsoft Academic Search

An approach for restoring the images of a suspected person taken by a security camera with a dirty lens is proposed. In the approach, the peculiar features of security camera systems, namely that everything in the image except the suspected person is usually preserved and can be used for investigations, are to be exploited as fully as

Yusaku Fujii; Naoya Ohta; Tadashi Ito; Saburou Saitoh; Tsutomu Matsuura; Takao Yamamoto

2006-01-01

163

ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System  

SciTech Connect

This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock which shuts down all the color video imaging system electronics within the 101-SY tank vapor space during loss of nitrogen purge pressure.

Werry, S.M.

1995-06-06

164

Investigation of unsteady regular and Mach reflection in water using an image converter camera  

Microsoft Academic Search

The oblique collisions of two shock waves in water are investigated by an image converter camera. These two shock waves are produced by the detonation of two plane efficient explosives at angles to each other. The propagating shock wave is confirmed to be pseudo-stationary by framing photographs taken by the image converter camera. Configurations of the shock wave are

S. Itoh; S. Nagano; O. Suzuki; I. Morita; M. Fujita

1995-01-01

165

Removal of Adherent Waterdrops from Images Acquired with a Stereo Camera System  

Microsoft Academic Search

In this paper, we propose a new method that can remove view-disturbing noises from stereo images. One of the thorny problems in outdoor surveillance by a camera is that adherent noises such as waterdrops on the protecting glass surface of the lens disturb the view from the camera. Therefore, we propose a method for removing adherent noises from stereo images

Kenjiro T. Miura

166

Color camera computed tomography imaging spectrometer for improved spatial-spectral image accuracy  

NASA Technical Reports Server (NTRS)

Computed tomography imaging spectrometers ("CTIS"s) having color focal plane array detectors are provided. The color FPA detector may comprise a digital color camera including a digital image sensor, such as a Foveon X3® digital image sensor or a Bayer color filter mosaic. In another embodiment, the CTIS includes a pattern imposed either directly on the object scene being imaged or at the field stop aperture. The use of a color FPA detector and the pattern improves the accuracy of the captured spatial and spectral information.

Wilson, Daniel W. (Inventor); Bearman, Gregory H. (Inventor); Johnson, William R. (Inventor)

2011-01-01

167

Covariance imaging experiments using a pixel-imaging mass-spectrometry camera  

NASA Astrophysics Data System (ADS)

The "pixel imaging mass spectrometry" camera is used to perform femtosecond laser-induced Coulomb explosion imaging of 3,5-dibromo-3',5'-difluoro-4'-cyanobiphenyl molecules prealigned in space. The experiment allows the concurrent detection of the correlated two-dimensional momentum images of all the ionic fragments resulting from fragmentation of multiple molecules in each acquisition cycle. The Coulomb explosion studies provide rich information about the parent molecular structure and fragmentation dynamics, and open new opportunities for real-time imaging of intramolecular processes.

Slater, Craig S.; Blake, Sophie; Brouard, Mark; Lauer, Alexandra; Vallance, Claire; John, Jaya John; Turchetta, Renato; Nomerotski, Andrei; Christensen, Lauge; Nielsen, Jens H.; Johansson, Mikael P.; Stapelfeldt, Henrik

2014-01-01

168

Pre-Clinical and Clinical Evaluation of High Resolution, Mobile Gamma Camera and Positron Imaging Devices.  

National Technical Information Service (NTIS)

The objective of this project is to design, build and evaluate a compact and mobile gamma and positron imaging camera. This imaging device has several advantages over conventional systems: (1) greater flexibility in positioning with respect to the target ...

D. R. Gilland

2006-01-01

169

Pre-Clinical and Clinical Evaluation of High Resolution, Mobile Gamma Camera and Positron Imaging Devices.  

National Technical Information Service (NTIS)

The objective of this project is to design, build and evaluate a compact and mobile gamma and positron imaging camera. This imaging device has several advantages over conventional systems: (1) greater flexibility in positioning with respect to the target ...

D. R. Gilland

2007-01-01

170

Megapixel mythology and photospace: estimating photospace for camera phones from large image sets  

NASA Astrophysics Data System (ADS)

It is a myth that more pixels alone result in better images. The marketing of camera phones in particular has focused on their pixel numbers. However, their performance varies considerably according to the conditions of image capture. Camera phones are often used in low-light situations where the lack of a flash and limited exposure time will produce underexposed, noisy and blurred images. Camera utilization can be quantitatively described by photospace distributions, a statistical description of the frequency of pictures taken at varying light levels and camera-subject distances. If the photospace distribution is known, the user-experienced distribution of quality can be determined either directly by direct measurement of subjective quality, or by photospace-weighting of objective attributes. The population of a photospace distribution requires examining large numbers of images taken under typical camera phone usage conditions. ImagePhi was developed as a user-friendly software tool to interactively estimate the primary photospace variables, subject illumination and subject distance, from individual images. Additionally, subjective evaluations of image quality and failure modes for low quality images can be entered into ImagePhi. ImagePhi has been applied to sets of images taken by typical users with a selection of popular camera phones varying in resolution. The estimated photospace distribution of camera phone usage has been correlated with the distributions of failure modes. The subjective and objective data show that photospace conditions have a much bigger impact on image quality of a camera phone than the pixel count of its imager. The 'megapixel myth' is thus seen to be less a myth than an ill framed conditional assertion, whose conditions are to a large extent specified by the camera's operational state in photospace.
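
A minimal sketch of the photospace weighting described above: given a photospace distribution over (illuminance, subject distance) bins and a quality score per bin, the user-experienced mean quality is the distribution-weighted average. The array shapes and names here are illustrative, not the tool's actual interface.

```python
import numpy as np

def photospace_weighted_quality(photospace_counts, quality_map):
    """Weight a per-condition quality score by the photospace usage distribution.

    photospace_counts : 2-D array, frequency of captures in each (illuminance, distance) bin.
    quality_map       : 2-D array of the same shape, a subjective or objective quality
                        score measured for each bin.
    Returns the user-experienced mean quality.
    """
    pdf = np.asarray(photospace_counts, dtype=float)
    pdf = pdf / pdf.sum()  # normalize counts to a probability distribution
    return float((pdf * np.asarray(quality_map, dtype=float)).sum())
```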

Hultgren, Bror O.; Hertel, Dirk W.

2008-01-01

171

Gamma camera-mounted anatomical X-ray tomography: technology, system characteristics and first images  

Microsoft Academic Search

Scintigraphic diagnosis, based on functional image interpretation, becomes more accurate and meaningful when supported by corresponding anatomical data. In order to produce anatomical images that are inherently registered with images of emission computerised tomography acquired with a gamma camera, an X-ray transmission system was mounted on the slip-ring gantry of a GEMS Millennium VG gamma camera. The X-ray imaging

Moshe Bocher; Adi Balan; Yodphat Krausz; Yigal Shrem; Albert Lonn; Michael Wilk; Roland Chisin

2000-01-01

172

Analysis of a multiple reception model for processing images from the solid-state imaging camera  

NASA Technical Reports Server (NTRS)

A detection model to identify the presence of Galileo optical communications from an Earth-based Transmitter (GOPEX) signal by processing multiple signal receptions extracted from the camera images is described. The model decomposes a multi-signal reception camera image into a set of images so that the location of the pixel being illuminated is known a priori and the laser can illuminate only one pixel at each reception instance. Numerical results show that if effects on the pointing error due to atmospheric refraction can be controlled to between 20 to 30 microrad, the beam divergence of the GOPEX laser should be adjusted to be between 30 to 40 microrad when the spacecraft is 30 million km away from Earth. Furthermore, increasing beyond 5 the number of receptions for processing will not produce a significant detection probability advantage.

Yan, T.-Y.

1991-01-01

173

SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs  

NASA Astrophysics Data System (ADS)

A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.

Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott

2003-09-01

174

Toward an image compression algorithm for the high-resolution electronic still camera  

NASA Technical Reports Server (NTRS)

Taking pictures with a camera that uses a digital recording medium instead of film has the advantage of recording and transmitting images without the use of a darkroom or a courier. However, high-resolution images contain an enormous amount of information and strain data-storage systems. Image compression will allow multiple images to be stored in the High-Resolution Electronic Still Camera. The camera is under development at Johnson Space Center. Fidelity of the reproduced image and compression speed are of paramount importance. Lossless compression algorithms are fast and faithfully reproduce the image, but their compression ratios will be unacceptably low due to noise in the front end of the camera. Future efforts will include exploring methods that will reduce the noise in the image and increase the compression ratio.

Nerheim, Rosalee

1989-01-01

175

A novel ultra-high speed camera for digital image processing applications  

NASA Astrophysics Data System (ADS)

Multi-channel gated-intensified cameras are commonly used for capturing images at ultra-high frame rates. The use of image intensifiers reduces the image resolution and increases the error in applications requiring high-quality images, such as digital image correlation. We report the development of a new type of non-intensified multi-channel camera system that permits recording of image sequences at ultra-high frame rates at the native resolution afforded by the imaging optics and the cameras used. This camera system is based upon the concept of using a sequence of short-duration light pulses of different wavelengths for illumination and using wavelength selective elements in the imaging system to route each particular wavelength of light to a particular camera. As such, the duration of the light pulses controls the exposure time and the timing of the light pulses controls the interframe time. A prototype camera system built according to this concept comprises four dual-frame cameras synchronized with four dual-cavity pulsed lasers producing 5 ns pulses in four different wavelengths. The prototype is capable of recording four-frame full-resolution image sequences at frame rates up to 200 MHz and eight-frame image sequences at frame rates up to 8 MHz. This system is built around a stereo microscope to capture stereoscopic image sequences usable for 3D digital image correlation. The camera system is used for imaging the chip-workpiece interface area during high speed machining, and the images are used to map the strain rate in the primary shear zone.

Hijazi, A.; Madhavan, V.

2008-08-01

176

A new nuclear medicine scintillation camera based on image-intensifier tubes.  

PubMed

A large-field scintillation camera for nuclear medicine applications has recently been developed by Old Delft. The system is based on a large-field image-intensifier tube preceded by a scintillator mosaic. A comparison is made with present state-of-the-art scintillation cameras in terms of modulation transfer function (MTF) and sensitivity. These parameters, which determine the performance of scintillation cameras, are not independent of each other. Therefore, a comparative evaluation should be made under well-defined and identical conditions. The new scintillation camera achieves considerable improvement in image quality. In fact, the intrinsic MTF of the new camera is rather close to unity in the spatial frequency range up to 1 line pair per centimeter (lp/cm). Further improvement would require a fundamentally new approach to gamma imaging, free of the limitations of conventional collimators (e.g., coded-aperture imaging techniques). PMID:978249

Mulder, H; Pauwels, E K

1976-11-01

177

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods to acquire multispectral images, but a dynamic band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter-exchangeable frame. Computer simulation is required to visualize the ray paths inside the prism when redesigning the image splitter. The dimensions of the splitter were then determined by computer simulation with options of BK7 glass and non-dichroic coating. These properties have been considered to obtain full-wavelength rays on all film planes. The image splitter was verified with two line lasers with narrow wavebands. The filter-exchangeable frame is designed so that bandpass filters can be swapped without changing the positions of the image sensors on the film plane. The developed 3CCD camera was evaluated for the detection of scab and bruises on Fuji apples. As a result, the filter-exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

178

Hubble Space Telescope Planetary Camera images of R136  

NASA Technical Reports Server (NTRS)

Images obtained with the Planetary Camera on the HST are used here to study the stellar population of R136, the core of the 30 Doradus cluster. It is found that R136a, the brightest knot at the center of R136, is indeed a tight cluster of stars containing at least 12 components in a 1 arcsec region. Three of the stars are of the Wolf-Rayet (W-R) type. The brightest stars have luminosities consistent with their being massive O supergiants or W-R stars. The stellar mass density in R136a is at least a million times that of the solar neighborhood. In the larger region known as R136, 214 stars are detected and their magnitudes measured. A color-magnitude diagram shows a range of stars from luminous O supergiants to ZAMS B3 stars. The diagram is very similar to that of stars outside of R136. A surface brightness profile constructed from stellar photometry is best fit by a pure power law.

Campbell, Bel; Hunter, Deidre A.; Holtzman, Jon A.; Lauer, Tod R.; Shaya, Edward J.; Code, Arthur; Faber, S. M.; Groth, Edward J.; Light, Robert M.; Lynds, Roger

1992-01-01

179

Cloud Detection with the Earth Polychromatic Imaging Camera (EPIC)  

NASA Technical Reports Server (NTRS)

The Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) would provide a unique opportunity for Earth and atmospheric research due not only to its Lagrange point sun-synchronous orbit, but also to the potential for synergistic use of spectral channels in both the UV and visible spectrum. As a prerequisite for most applications, the ability to detect the presence of clouds in a given field of view, known as cloud masking, is of utmost importance. It serves to determine both the potential for cloud contamination in clear-sky applications (e.g., land surface products and aerosol retrievals) and clear-sky contamination in cloud applications (e.g., cloud height and property retrievals). To this end, a preliminary cloud mask algorithm has been developed for EPIC that applies thresholds to reflected UV and visible radiances, as well as to reflected radiance ratios. This algorithm has been tested with simulated EPIC radiances over both land and ocean scenes, with satisfactory results. These test results, as well as algorithm sensitivity to potential instrument uncertainties, will be presented.
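
The sketch below shows the general shape of a threshold-based cloud mask of the kind described above. The specific thresholds, channel choices, and the land/ocean dependence of the real EPIC algorithm are not given in the abstract, so the values and names here are placeholders.

```python
import numpy as np

def cloud_mask(refl_uv, refl_vis, uv_thresh=0.2, vis_thresh=0.3, ratio_thresh=1.1):
    """Toy threshold cloud mask: flag a pixel as cloudy when either reflectance,
    or the visible-to-UV reflectance ratio, exceeds its threshold.

    refl_uv, refl_vis : arrays of reflected UV and visible radiances/reflectances.
    Thresholds are illustrative only and would in practice vary with surface type.
    """
    ratio = refl_vis / np.maximum(refl_uv, 1e-6)  # guard against division by zero
    return (refl_uv > uv_thresh) | (refl_vis > vis_thresh) | (ratio > ratio_thresh)
```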

Meyer, Kerry; Marshak, Alexander; Lyapustin, Alexei; Torres, Omar; Wang, Yugie

2011-01-01

180

MEM-FLIM: all-solid-state camera for fluorescence lifetime imaging  

NASA Astrophysics Data System (ADS)

We have built an all-solid-state camera which is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific CCD design, which is being used in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and contrast modulation transfer function have been studied through experiments. We are able to do lifetime measurements using MEM-FLIM cameras for various objects, e.g. fluorescent plastic test slides, fluorescein solution, fixed GFP cells, and GFP-actin stained live cells.
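
For context, frequency-domain FLIM recovers lifetimes from the phase shift and demodulation of the detected signal relative to the excitation; the sketch below implements the standard single-exponential estimators. These formulas are generic to frequency-domain FLIM and are not specific to the MEM-FLIM camera.

```python
import numpy as np

def fd_flim_lifetimes(phase_shift, mod_depth, mod_freq_hz):
    """Standard frequency-domain FLIM estimators: lifetime from the measured
    phase shift (radians) and from the demodulation depth at the modulation frequency."""
    omega = 2.0 * np.pi * mod_freq_hz
    tau_phase = np.tan(phase_shift) / omega
    tau_mod = np.sqrt(np.maximum(1.0 / mod_depth**2 - 1.0, 0.0)) / omega
    return tau_phase, tau_mod

# Example: a 4 ns fluorophore measured at 40 MHz modulation gives consistent estimates.
# fd_flim_lifetimes(phase_shift=np.arctan(2*np.pi*40e6*4e-9), mod_depth=0.705, mod_freq_hz=40e6)
```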

Zhao, Qiaole; Young, Ian Ted; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Jalink, Kees; de Jong, Sander; van Geest, Bert; Stoop, Karel

2012-02-01

181

The Potential of Dual Camera Systems for Multimodal Imaging of Cardiac Electrophysiology and Metabolism  

PubMed Central

Fluorescence imaging has become a common modality in cardiac electrodynamics. A single fluorescent parameter is typically measured. Given the growing emphasis on simultaneous imaging of more than one cardiac variable, we present an analysis of the potential of dual camera imaging, using as an example our straightforward dual camera system that allows simultaneous measurement of two dynamic quantities from the same region of the heart. The advantages of our system over others include an optional software camera calibration routine that eliminates the need for precise camera alignment. The system allows for rapid setup, dichroic image separation, dual-rate imaging, and high spatial resolution, and it is generally applicable to any two-camera measurement. This type of imaging system offers the potential for recording simultaneously not only transmembrane potential and intracellular calcium, two frequently measured quantities, but also other signals more directly related to myocardial metabolism, such as [K+]e, NADH, and reactive oxygen species, leading to the possibility of correlative multimodal cardiac imaging. We provide a compilation of dye and camera information critical to the design of dual camera systems and experiments.

Holcomb, Mark R.; Woods, Marcella C.; Uzelac, Ilija; Wikswo, John P.; Gilligan, Jonathan M.; Sidorov, Veniamin Y.

2013-01-01

182

Advanced time-of-flight range camera with novel real-time 3D image processing  

Microsoft Academic Search

We present a solid state range camera covering measuring distances from 2 m to 25 m and novel real-time 3D image processing algorithms for object detection, tracking and classification based on the three-dimensional features of the camera's output data. The technology is based on a 64x8 pixel array CMOS image sensor which is capable of capturing three-dimensional images by executing

Bernhard Koenig; Bedrich Hosticka; Peter Mengel; Ludwig Listl

2007-01-01

183

Single-chip color imaging for UHDTV camera with a 33M-pixel CMOS image sensor  

NASA Astrophysics Data System (ADS)

To develop an ultrahigh-definition television (UHDTV) camera, with a resolution 16 times higher than that of HDTV and a frame rate of 60 Hz (progressive), a compact and high-mobility UHDTV camera using a 33M-pixel CMOS image sensor to provide single-chip color imaging was developed. The sensor has a Bayer color-filter array (CFA), and its output signal format is compatible with the conventional UHDTV camera that uses four 8M-pixel image sensors. The theoretical MTF characteristics of the single-chip camera and a conventional four-8M-pixel CMOS camera were first calculated. A new technique for Bayer CFA demosaicing used for the single-chip UHDTV camera was then evaluated. Finally, a pick-up system for single-chip imaging with a 33M-pixel color CMOS image sensor was measured. The measurement results show that the resolution of the single-chip system is equivalent to or surpasses that of the conventional four-8M-pixel CMOS camera. The possibility of a practical compact UHDTV camera that makes use of single-chip color imaging was thereby confirmed.
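
To make the demosaicing step concrete, the sketch below performs plain bilinear demosaicing of an RGGB Bayer mosaic. This is a deliberately simple baseline for illustration, not the demosaicing technique evaluated in the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic_rggb(raw):
    """Bilinear demosaicing of an RGGB Bayer mosaic into an H x W x 3 RGB image."""
    h, w = raw.shape
    r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True   # red samples
    b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True   # blue samples
    g_mask = ~(r_mask | b_mask)                                  # green samples

    # Interpolation kernels: plus-shaped for green, box-shaped for red/blue.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

    def interp(mask, kernel):
        sparse = np.where(mask, raw.astype(float), 0.0)  # keep only that channel's samples
        return convolve(sparse, kernel, mode='mirror')

    return np.dstack([interp(r_mask, k_rb), interp(g_mask, k_g), interp(b_mask, k_rb)])
```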

Funatsu, Ryohei; Yamashita, Takayuki; Mitani, Kohji; Nojiri, Yuji

2011-02-01

184

First results from the Faint Object Camera - Images of the gravitational lens system G2237 + 0305  

NASA Technical Reports Server (NTRS)

Images of the gravitational lens system G2237 + 0305 have been obtained with the Faint Object Camera on board the Hubble Space Telescope. A preliminary analysis of these images is reported here and includes measurements of the relative positions and magnitudes of the lensed images of the QSO, and of the lensing galaxy. No evidence is found for a fifth lensed image.

Crane, P.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.

1991-01-01

185

Cloud level winds from the Venus Express Monitoring Camera imaging  

NASA Astrophysics Data System (ADS)

Six years of continuous monitoring of Venus by the European Space Agency's Venus Express orbiter provides an opportunity to study the dynamics of the atmosphere of our neighbor planet. The Venus Monitoring Camera (VMC) on board the orbiter has acquired the longest and most complete set of ultraviolet images of Venus so far. These images enable a study of the cloud-level circulation by tracking the motion of cloud features. The highly elliptical polar orbit of Venus Express provides optimal conditions for observations of the Southern hemisphere at varying spatial resolution. The images used in the study were acquired over about 2300 orbits of Venus Express and cover about 10 Venus years. Out of these, we tracked cloud features in images obtained in 127 orbits by a manual cloud tracking technique and by a digital correlation method in 576 orbits. The total number of wind vectors derived in this work is 45,600 for the manual tracking and 391,600 for the digital method. This allowed us to determine the mean circulation, its long-term and diurnal trends, orbit-to-orbit variations and periodicities. We also present the first results of tracking features in the VMC near-IR images. In low latitudes the mean zonal wind at cloud tops (67 ± 2 km, following Rossow, W.B., Del Genio, A.T., Eichler, T. [1990]. J. Atmos. Sci. 47, 2053-2084) is about 90 m/s with a maximum of about 100 m/s at 40-50°S. Poleward of 50°S the average zonal wind speed decreases with latitude. The corresponding atmospheric rotation period at cloud tops has a maximum of about 5 days at the equator, decreases to approximately 3 days in middle latitudes and stays almost constant poleward from 50°S. The mean poleward meridional wind slowly increases from zero at the equator to about 10 m/s at 50°S and then decreases to zero at the pole. The error of an individual measurement is 7.5-30 m/s. Wind speeds of 70-80 m/s were derived from near-IR images at low latitudes. The VMC observations indicate a long-term trend for the zonal wind speed at low latitudes to increase from 85 m/s in the beginning of the mission to 110 m/s by the middle of 2012. VMC UV observations also showed significant short-term variations of the mean flow. The velocity difference between consecutive orbits in the region of the mid-latitude jet could reach 30 m/s, which likely indicates vacillation of the mean flow between a jet-like regime and quasi-solid-body rotation at mid-latitudes. Fourier analysis revealed periodicities in the zonal circulation at low latitudes. Within the equatorial region, up to 35°S, the zonal wind shows an oscillation with a period of 4.1-5 days (4.83 days on average) that is close to the super-rotation period at the equator. The wave amplitude is 4-17 m/s and decreases with latitude, a feature of the Kelvin wave. The VMC observations showed a clear diurnal signature. A minimum in the zonal speed was found close to noon (11-14 h) and maxima in the morning (8-9 h) and in the evening (16-17 h). The meridional component peaks in the early afternoon (13-15 h) at around 50°S latitude. The minimum of the meridional component is located at low latitudes in the morning (8-11 h). The horizontal divergence of the mean cloud motions associated with the diurnal pattern suggests upwelling motions in the morning at low latitudes and downwelling flow in the afternoon in the cold collar region.
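
As an illustration of the digital correlation tracking mentioned above, the sketch below estimates a single wind vector from two co-located UV image patches by phase correlation. It omits sub-pixel refinement, map projection, and quality control, all of which matter for real VMC wind retrievals; the function name and parameters are ours.

```python
import numpy as np

def track_cloud_feature(patch_t0, patch_t1, dt_seconds, km_per_pixel):
    """Estimate (zonal, meridional) wind in m/s from two equally sized image patches
    separated by dt_seconds, using integer-pixel phase correlation."""
    f0 = np.fft.fft2(patch_t0)
    f1 = np.fft.fft2(patch_t1)
    # Peak of the inverse transform sits at the displacement of patch_t1 relative to patch_t0.
    cross_power = f1 * np.conj(f0)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the patch size to negative displacements.
    if dy > patch_t0.shape[0] // 2:
        dy -= patch_t0.shape[0]
    if dx > patch_t0.shape[1] // 2:
        dx -= patch_t0.shape[1]
    to_mps = km_per_pixel * 1000.0 / dt_seconds  # pixels of displacement -> m/s
    return dx * to_mps, dy * to_mps
```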

Khatuntsev, I. V.; Patsaeva, M. V.; Titov, D. V.; Ignatiev, N. I.; Turin, A. V.; Limaye, S. S.; Markiewicz, W. J.; Almeida, M.; Roatsch, Th.; Moissl, R.

2013-09-01

186

Electronic Still Camera image of Astronaut Claude Nicollier working with RMS  

NASA Technical Reports Server (NTRS)

The work of Swiss scientist-astronaut Claude Nicollier controlling the robot arm was photographed with an Electronic Still Camera (ESC) and downlinked to ground controllers soon afterward. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

1993-01-01

187

High-Resolution Position-Sensitive Proportional Counter Camera for Radiochromatographic Imaging.  

National Technical Information Service (NTIS)

A high-resolution proportional counter camera for imaging two-dimensional (2-D) distributions of radionuclides is described. The camera can accommodate wet or dry samples that are separated from the counter gas volume by a 6 µm Mylar membrane. Using 9...

D. D. Schuresko M. K. Kopp J. A. Harter W. D. Bostick

1988-01-01

188

Lens distortion correction for thermal cameras to improve aerial imaging with small-scale UAVs  

Microsoft Academic Search

Lens distortion as a result of the shape and construction of a photographic lens is a common problem in image acquisition. Thermal cameras are no exception to this artifact. So far many methods have been developed to formulate the distortion model and almost all of them exploit patterns in the visible range to calibrate the lenses in RGB cameras.

Saeed Yahyanejad; Jakub Misiorny; Bernhard Rinner

2011-01-01

189

Initial Results and Field Applications of a Polarization Imaging Camera.  

National Technical Information Service (NTIS)

The SALSA linear Stokes polarization camera from Bossa Nova Technologies (520-550 nm) uses an electronically rotated polarization filter to measure four states of polarization nearly simultaneously. Some initial imagery results are presented. Preliminary ...

A. M. Puetz C. Esterline M. Eyler R. C. Olsen

2010-01-01

190

Comparison of three thermal cameras with canine hip area thermographic images.  

PubMed

The objective of this study was to compare thermography performed with three thermal cameras of different resolution and basic software for thermographic images, separating the two persons taking the thermographic images (thermographers) from the three persons interpreting them (interpreters). This was accomplished by studying the repeatability between thermographers and interpreters. Forty-nine client-owned dogs of 26 breeds were enrolled in the study. The thermal cameras used were of different resolutions: 80 × 80, 180 × 180 and 320 × 240 pixels. Two trained thermographers took thermographic images of the hip area in all dogs using all three cameras. A total of six thermographic images per dog were taken. The thermographic images were analyzed using appropriate computer software, FLIR QuickReport 2.1. Three trained interpreters independently evaluated the mean temperatures of the hip joint areas in the six thermographic images for each dog. The repeatability between thermographers was >0.975 with the two higher-resolution cameras and 0.927 with the lowest-resolution camera. The repeatability between interpreters was >0.97 with each camera. Thus, the between-interpreter variation was small. The repeatability between thermographers and interpreters was considered high enough to encourage further studies with thermographic imaging in dogs. PMID:22785576

Vainionpää, Mari; Raekallio, Marja; Tuhkalainen, Elina; Hänninen, Hannele; Alhopuro, Noora; Savolainen, Maija; Junnila, Jouni; Hielm-Björkman, Anna; Snellman, Marjatta; Vainio, Outi

2012-12-01

191

Development of a mini gamma camera for prostate imaging  

Microsoft Academic Search

We have tested a concept of a mini gamma camera based on monolithic arrays of MPPCs from Hamamatsu. CsI(Tl) and CsI(Na) arrays and a thin scintillation GSO plate were tested with 122 keV gammas from 57Co sources. The planned application requires placement of this mini-camera in an endorectal probe, so it needs to be very compact and possess high spatial

S. Majewski; J. Proffitt; A. Stolin

2011-01-01

192

Fast visible 2-D camera imaging on NSTX  

NASA Astrophysics Data System (ADS)

Initial applications of 2-D cameras on tokamak devices mainly provided qualitative insight into global dynamics such as plasma formation and motion. Quantitative data were limited, due in large part to the time scales of most plasma events, which were far faster than the frame rate of the camera, and to the inability to resolve structures at the characteristic plasma scale lengths. Recent advances in the development of fast cameras have increased frame rates up to megahertz rates, while exposure times as short as 3 ns have been achieved with intensifiers. NSTX supports 8 fast visible cameras with framing rates ranging from 10^3 to 10^6 frames/s and resolution from 64 X 64 to 1024 X 1080 pixels at the shortest exposure times. These cameras have been used to study plasma phenomena including edge and divertor turbulence, ELMs, and impurity transport. The most recent applications include studies of supersonic gas jets, pellet ablation, incandescent dust behavior, and measurement of fast ion loss of neutral beam injected particles. The capabilities of each camera will be presented, along with movies of plasma phenomena recorded with them.

Roquemore, A. L.; Nishino, N.; Maqueda, R.; Bush, C.; Dorf, L.; Shinohara, K.

2005-10-01

193

Effects of environment factors on imaging performance of long focal length space camera  

NASA Astrophysics Data System (ADS)

During development, testing, launch and operation in orbit, a space camera has to withstand the shock of external loads and a changing environment. The optical performance of a long focal length space camera is largely determined by external mechanical loads and ambient temperature. The performance of the camera is a result of the interaction between these environment factors. The performance of the optical system should be forecast accurately when a modern optical instrument is designed. In this paper, the research methods are reviewed first. Then the related technologies are described. The analysis methods for how environment temperature and structural characteristics affect space camera imaging performance are also discussed.

Guo, Quanfeng; Jin, Guang; Dong, Jihong; Li, Wei; Li, Yanchun; Wang, Haiping; Wang, Kejun; Zhao, Weiguo

2012-10-01

194

Imaging Asteroid 4 Vesta Using the Framing Camera  

NASA Technical Reports Server (NTRS)

The Framing Camera (FC) onboard the Dawn spacecraft serves a dual purpose. Next to its central role as a prime science instrument, it is also used for the complex navigation of the ion-drive spacecraft. The CCD detector with 1024 by 1024 pixels provides the stability for a multiyear mission and meets the high requirements of photometric accuracy over the wavelength band from 400 to 1000 nm covered by 7 band-pass filters. Vesta will be observed from 3 orbit stages with image scales of 227, 63, and 17 m/px, respectively. The mapping of Vesta's surface with medium resolution will only be completed during the exit phase, when the north pole will be illuminated. A detailed pointing strategy will cover the surface at least twice at similar phase angles to provide stereo views for reconstruction of the topography. During approach the phase function of Vesta was determined over a range of angles not accessible from Earth. This is the first step in deriving the photometric function of the surface. Combining the topography based on stereo tie points with the photometry in an iterative procedure will disclose details of the surface morphology at considerably smaller scales than the pixel scale. The 7 color filters are well positioned to provide information on the spectral slope in the visible, the depth of the strong pyroxene absorption band, and their variability over the surface. Cross-calibration with the VIR spectrometer, which extends into the near IR, will provide detailed maps of Vesta's surface mineralogy and physical properties. Georeferencing all these observations will result in a coherent and unique data set. During Dawn's approach and capture FC has already demonstrated its performance. The strong variation observed by the Hubble Space Telescope can now be correlated with surface units and features. We will report on results obtained from images taken during survey mode covering the whole illuminated surface. Vesta is a planet-like differentiated body, but its surface gravity and escape velocity are comparable to those of other asteroids and hence much smaller than those of the inner planets or

Keller, H. Uwe; Nathues, Andreas; Coradini, Angioletta; Jaumann, Ralf; Jorda, Laurent; Li, Jian-Yang; Mittlefehldt, David W.; Mottola, Stefano; Raymond, C. A.; Schroeder, Stefan E.

2011-01-01

195

A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two one-month long periods when a series of spacecraft maneuvers enabled low-altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced the furthest) of the terrain it imaged in the previous orbit. The visible bands of map projected WAC images have a pixel scale of 100 m, while UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March of 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which was comprised of 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC Team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observations for other instruments) and a new photometric correction derived from the LROC WAC dataset. In addition, a higher sun (lower incidence angle) mosaic will also be released. This map has minimal shadows and highlights albedo differences. In addition, seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the East, overhead, and West) will also be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemeris provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.

Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

2012-12-01

196

Phenological research using digital image archives: how important is camera system choice?  

NASA Astrophysics Data System (ADS)

Phenological research has been improved by continuous automated monitoring of vegetation canopies using digital cameras and webcams. Most cameras used for this purpose have a native capture system in the red (R) - green (G) - blue (B) color space, which can be used for simple visual inspection but also for separate extraction of color information as RGB digital numbers that allow for quantitative analysis of vegetation status. One overlooked aspect is the choice of appropriate camera system. Ultimately, camera system choice together with atmospheric and illumination conditions dictates image quality (e.g., sharpness, noise and dynamic range, color accuracy and balance), and thus the usefulness of color information for phenological research. In addition, no standardized protocol exists regarding the extraction of representative RGB time series from digital image archives. In this research we compare image archives (fall 2010) obtained at a temperate broadleaf forest (Harvard Forest) with different types of digital cameras and webcams with different image sensors (i.e., CMOS vs. CCD) to assess the impact of image quality on color information for phenological research. Furthermore, we developed a protocol based on moving window quantiles to extract daily RGB time series to maximize the phenological information content of image archives. Preliminary results suggest that image quality and thus camera system choice is of secondary importance compared to the technique used to extract robust daily RGB time series for phenological research in temperate broadleaf forests.
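
The record mentions a protocol "based on moving window quantiles" for extracting daily RGB time series but gives no details; the following is a minimal sketch of one plausible realization, computing a green chromatic coordinate per image and reducing it to a daily series with a moving-window quantile. The window length and quantile value are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

def green_chromatic_coordinate(rgb):
    """Mean green chromatic coordinate G/(R+G+B) over a region of interest."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    return float(np.mean(g / (r + g + b + 1e-9)))

def daily_gcc_series(timestamps, gcc_values, window_days=3, quantile=0.9):
    """Reduce noisy per-image GCC values to a robust daily series using a
    moving-window quantile (window size and quantile are illustrative choices)."""
    s = pd.Series(gcc_values, index=pd.to_datetime(timestamps)).sort_index()
    daily = s.resample("1D").quantile(quantile)
    return daily.rolling(window=window_days, center=True, min_periods=1).quantile(quantile)
```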

Sonnentag, O.; Hufkens, K.; Teshera-Sterne, C.; Young, A. M.; Richardson, A. D.

2010-12-01

197

The RAW Image File Format Problem - Applications of Digital SLR Cameras in Astronomy and Science  

NASA Astrophysics Data System (ADS)

This work describes the application of digital single lens reflex (SLR) cameras in astronomy and science. A definition of the term raw file format is given. As with formats like FITS or TIFF, the CR2 file format is explained as being a disclosed and well documented raw file format. Results from raw image file decomposition with different software tools, and a comparison of imaging with a digital SLR camera and a professional grade CCD detector, are presented. The analysis of noise and image histograms will help to uncover problems with raw file format decomposition tools and their adjustment. Without doubt, digital SLR cameras are suited for special scientific applications. Further implications from this work are a detailed analysis of noise characteristics to achieve high quality imaging at very low light levels from raw files taken with digital SLR cameras.

Bauer, Thilo

2008-02-01

198

Development of gamma ray imaging cameras. Progress report for second year  

SciTech Connect

In January 1990, the Department of Energy initiated this project with the objective to develop the technology for general purpose, portable gamma ray imaging cameras useful to the nuclear industry. The ultimate goal of this R&D initiative is to develop the analog of the color television camera, where the camera would respond to gamma rays instead of visible photons. The two-dimensional real-time image would be displayed and would indicate the geometric location of the radiation relative to the camera's orientation, while the brightness and "color" would indicate the intensity and energy of the radiation (and hence identify the emitting isotope). There is a strong motivation for developing such a device for applications within the nuclear industry, for both high- and low-level waste repositories, for environmental restoration problems, and for space and fusion applications. At present, there are no general purpose radiation cameras capable of producing spectral images for such practical applications. At the time of this writing, work on this project has been underway for almost 18 months. Substantial progress has been made in the project's two primary areas: mechanically-collimated (MCC) and electronically-collimated camera (ECC) designs. We present developments covering the mechanically-collimated design, and then discuss the efforts on the electronically-collimated camera. The renewal proposal addresses the continuing R&D efforts for the third year. 8 refs.

Wehe, D.K.; Knoll, G.F.

1992-05-28

199

Low-cost printing of computerised tomography (CT) images where there is no dedicated CT camera.  

PubMed

Many developing countries still rely on conventional hard copy images to transfer information among physicians. We have developed a low-cost alternative method of printing computerised tomography (CT) scan images where there is no dedicated camera. A digital camera is used to photograph images from the CT scan screen monitor. The images are then transferred to a PC via a USB port, before being printed on glossy paper using an inkjet printer. The method can be applied to other imaging modalities like ultrasound and MRI and appears worthy of emulation elsewhere in the developing world where resources and technical expertise are scarce. PMID:17785022

Tabari, Abdulkadir M

2007-01-01

200

A 58 x 62 pixel Si:Ga array camera for 5 - 14 micron astronomical imaging  

NASA Technical Reports Server (NTRS)

A new infrared array camera system has been successfully applied to high-background 5 - 14 micron astronomical imaging photometry observations, using a hybrid 58 x 62 pixel Si:Ga array detector. The off-axis reflective optical design, incorporating a parabolic camera mirror, circular variable filter wheel, and cold aperture stop, produces diffraction-limited images with negligible spatial distortion and minimum thermal background loading. The camera electronic system architecture is divided into three subsystems: (1) a high-speed analog front end, including a 2-channel preamp module, array address timing generator, and bias power supplies; (2) two 16 bit, 3 microsec per conversion A/D converters interfaced to an arithmetic array processor; and (3) an LSI 11/73 camera control and data analysis computer. The background-limited observational noise performance of the camera at the NASA/IRTF telescope is NEFD (1 sigma) = 0.05 Jy/pixel min^(1/2).

Gezari, D. Y.; Folz, W. C.; Woods, L. A.; Wooldridge, J. B.

1989-01-01

201

Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging  

NASA Astrophysics Data System (ADS)

We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.

Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; de Jong, Jan Geert Sander; van Geest, Bert; Stoop, Karel; Young, Ian Ted

2012-12-01

202

NaI(Tl) camera crystals: imaging capabilities of hydrated regions on the crystal surface  

Microsoft Academic Search

Sodium iodide (NaI(Tl)) camera crystals may become hydrated. Hydration results in small spots on the crystal surface that at first are white but later turn yellowish. The appearance of these spots--as any artifact--in a diagnostic image is to be avoided, and the crystal should be replaced. The image of the hydrated spots of an aged NaI(Tl) camera crystal has been

Keszthelyi-Landori

1986-01-01

203

Low-Light AutoFocus Enhancement for Digital and CellPhone Camera Image Pipelines  

Microsoft Academic Search

Images captured by a digital or cell-phone camera in low-light environments usually suffer from a lack of sharpness due to the failure of the camera's passive auto-focus (AF) system to locate the peak in-focus position of a sharpness function that is extracted from the image. In low light, the sharpness function becomes flat, making it quite difficult to locate the peak. In
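
As a companion to the passive-AF failure mode described above, here is a minimal sketch of a generic contrast-detect focus search: a gradient-energy sharpness function evaluated at each lens position, with the peak taken as best focus. This is a common textbook measure, not the specific sharpness function or search strategy of the cited work.

```python
import numpy as np

def sharpness(gray):
    """Gradient-energy (Tenengrad-like) focus measure: one common choice of
    sharpness function for passive contrast-detect autofocus."""
    gy, gx = np.gradient(gray.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def best_focus_position(images_by_lens_position):
    """Pick the lens position whose image maximizes the sharpness function.
    In low light this curve flattens, which is the failure mode discussed above."""
    scores = {pos: sharpness(img) for pos, img in images_by_lens_position.items()}
    return max(scores, key=scores.get)
```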

Mark Gamadia; Nasser Kehtarnavaz; Katie Roberts-Hoffman

2007-01-01

204

Development of an ultra-violet digital camera for volcanic SO2 imaging  

Microsoft Academic Search

In an effort to improve monitoring of passive volcano degassing, we have constructed and tested a digital camera for quantifying the sulfur dioxide (SO2) content of volcanic plumes. The camera utilizes a bandpass filter to collect photons in the ultra-violet (UV) region where SO2 selectively absorbs UV light. SO2 is quantified by imaging calibration cells of known SO2 concentrations. Images

G. J. S. Bluth; J. M. Shannon; I. M. Watson; A. J. Prata; V. J. Realmuto

2007-01-01

205

Digital image correlation for whole field out-of-plane displacement measurement using a single camera  

Microsoft Academic Search

Digital image correlation (DIC) using a single camera has been widely used for in-plane displacement and strain measurement. However, in order to obtain out-of-plane displacement, one should employ stereo vision systems that enable detection from multiple directions. In this paper, we develop a simple method for whole field out-of-plane displacement measurement using only one camera. The proposed method employs digital image

C. J. Tay; C. Quan; Y. H. Huang; Y. Fu

2005-01-01

206

Applications of the BAE SYSTEMS MicroIR uncooled infrared thermal imaging cameras  

Microsoft Academic Search

MicroIR uncooled infrared imaging modules (based on VOx microbolometers), developed and manufactured at BAE SYSTEMS, are integrated into ruggedized, weatherproof camera systems and are currently supporting numerous security and surveillance applications. The introduction of uncooled thermal imaging has permitted the expansion of traditional surveillance and security perimeters. MicroIR cameras go beyond the imagery limits of visible and low-light short wavelength

Heather A. Wickman; John J. Henebury Jr.; Dennis R. Long

2003-01-01

207

Application of spatial frequency response as a criterion for evaluating thermal imaging camera performance  

NASA Astrophysics Data System (ADS)

Police, firefighters, and emergency medical personnel are examples of first responders that are utilizing thermal imaging cameras in a very practical way every day. However, few performance metrics have been developed to assist first responders in evaluating the performance of thermal imaging technology. This paper describes one possible metric for evaluating spatial resolution using an application of Spatial Frequency Response (SFR) calculations for thermal imaging. According to ISO 12233, the SFR is defined as the integrated area below the Modulation Transfer Function (MTF) curve derived from the discrete Fourier transform of a camera image representing a knife-edge target. This concept is modified slightly for use as a quantitative analysis of the camera's performance by integrating the area between the MTF curve and the camera's characteristic nonuniformity, or noise floor, determined at room temperature. The resulting value, which is termed the Effective SFR, can then be compared with a spatial resolution value obtained from human perception testing of task specific situations to determine the acceptability of the performance of thermal imaging cameras. The testing procedures described herein are being developed as part of a suite of tests for possible inclusion into a performance standard on thermal imaging cameras for first responders.
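
The "Effective SFR" described above can be illustrated with a much-simplified 1-D sketch: differentiate a knife-edge profile to obtain a line spread function, take the magnitude of its DFT as the MTF, and integrate the area above a constant noise floor. The ISO 12233 procedure uses a slanted edge with oversampling, so this is only an illustrative reduction, not the authors' implementation.

```python
import numpy as np

def effective_sfr(edge_profile, noise_floor, dx=1.0):
    """Simplified 1-D 'Effective SFR': area between a normalized MTF
    (magnitude of the DFT of the line spread function) and a constant
    noise floor, integrated up to the Nyquist frequency."""
    lsf = np.gradient(np.asarray(edge_profile, dtype=float), dx)
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / (mtf[0] + 1e-12)              # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(len(lsf), d=dx)   # cycles per sample (or per mm if dx in mm)
    df = freqs[1] - freqs[0]
    return float(np.sum(np.clip(mtf - noise_floor, 0.0, None)) * df)
```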

Lock, Andrew; Amon, Francine

2008-05-01

208

Reconstruction of face image from security camera based on a measurement of space variant PSF  

Microsoft Academic Search

Images recorded by security cameras are often severely degraded due to a dirty lens or secular distortion of the recording system. To restore these images, full determination of the space variant point spread function (PSF) is required. To measure the PSF, we have proposed a method using a liquid crystal display, and shown some experimental results to restore the images by a

Tadashi Ito; Hitoshi Hoshino; Yusaku Fujii; Naoya Ohta

2009-01-01

209

Video-rate THz imaging using a microbolometer-based camera  

Microsoft Academic Search

A THz 160×120 pixel array camera has been developed at INO. Real-time transmission and reflectance imaging at video rates of 30 frames/s were performed with a low-power 3 THz quantum cascade laser. Various hidden objects were imaged, proving feasibility of real-time THz imaging for security screening applications.

M. Bolduc; L. Marchese; B. Tremblay; M. Doucet; M. Terroux; H. Oulachgar; L. Le Noc; C. Alain; H. Jerominek; A. Bergeron

2010-01-01

210

Image fusion for a digital camera application based on wavelet domain hidden Markov models  

Microsoft Academic Search

Traditional image fusion for a digital camera application may not be satisfactory for classifying pixels in the source image by statistical techniques. In this paper, we present a technique based on wavelet domain hidden Markov models (HMMs) and maximum-likelihood estimation. The method presented here consists of deciding the quality of pixels in source images directly from the statistical

S. X. Hu; Z. W. Liao; Y. Y. Tang

2004-01-01

211

A high speed camera system based on an image sensor in standard CMOS technology  

Microsoft Academic Search

In this contribution a novel camera system developed for high speed imaging will be presented. The core of the system consists of a CMOS image sensor manufactured in a 1 µm standard CMOS process. The special merit of the image sensor is the capability to acquire more than 1000 frames/s using a global electronic shutter in each sensor cell. The

Nenad Stevanovic; Matthias Hillebrand; Bedrich J. Hosticka; Uri Iurgel; Andreas Teuner

1999-01-01

212

An image compression algorithm for a high-resolution digital still camera  

NASA Technical Reports Server (NTRS)

The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.

Nerheim, Rosalee

1989-01-01

213

Source camera identification for heavily JPEG compressed low resolution still images.  

PubMed

In this research, we examined whether fixed pattern noise, or more specifically Photo Response Non-Uniformity (PRNU), can be used to identify the source camera of heavily JPEG compressed digital photographs of resolution 640 x 480 pixels. We extracted PRNU patterns from both reference and questioned images using a two-dimensional Gaussian filter and compared these patterns by calculating the correlation coefficient between them. Both the closed- and open-set problems were addressed; in the closed set, high accuracies were obtained: 83% for single images and 100% for around 20 simultaneously identified questioned images. The correct source camera was chosen from a set of 38 cameras of four different types. For the open-set problem, decision levels were obtained for several numbers of simultaneously identified questioned images. The corresponding false rejection rates were unsatisfactory for single images but improved for simultaneous identification of multiple images. PMID:19432739
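
A minimal sketch of the PRNU workflow described above, assuming greyscale images as NumPy arrays: residuals are obtained by subtracting a two-dimensional Gaussian-filtered version of each image, a reference pattern is averaged from several reference images, and questioned images are compared via the correlation coefficient. The filter width is an assumed parameter.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def prnu_residual(img, sigma=3.0):
    """Noise residual: the image minus a Gaussian-smoothed version of itself."""
    img = np.asarray(img, dtype=float)
    return img - gaussian_filter(img, sigma)

def reference_pattern(reference_images, sigma=3.0):
    """Average the residuals of several reference images from one camera."""
    return np.mean([prnu_residual(im, sigma) for im in reference_images], axis=0)

def correlation(a, b):
    """Pearson correlation coefficient between two residual patterns."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```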

Alles, Erwin J; Geradts, Zeno J M H; Veenman, Cor J

2009-05-01

214

Automated camera-phone experience with the frequency of imaging necessary to capture diet.  

PubMed

Camera-enabled cell phones provide an opportunity to strengthen dietary recall through automated imaging of foods eaten during a specified period. To explore the frequency of imaging needed to capture all foods eaten, we examined the number of images of individual foods consumed in a pilot study of automated imaging using camera phones set to an image-capture frequency of one snapshot every 10 seconds. Food images were tallied from 10 young adult subjects who wore the phone continuously during the work day and consented to share their images. Based on the number of images received for each eating experience, the pilot data suggest that automated capturing of images at a frequency of once every 10 seconds is adequate for recording foods consumed during regular meals, whereas a greater frequency of imaging is necessary to capture snacks and beverages eaten quickly. PMID:20656101

Arab, Lenore; Winter, Ashley

2010-08-01

215

Joint estimation of high resolution images and depth maps from light field cameras  

NASA Astrophysics Data System (ADS)

Light field cameras are attracting much attention as tools for acquiring 3D information of a scene through a single camera. The main drawback of typical lenslet-based light field cameras is the limited resolution. This limitation comes from the structure where a microlens array is inserted between the sensor and the main lens. The microlens array projects the 4D light field onto a single 2D image sensor at the sacrifice of resolution; the angular resolution and the positional resolution trade off under the fixed resolution of the image sensor. This fundamental trade-off remains after the raw light field image is converted to a set of sub-aperture images. The purpose of our study is to estimate a higher resolution image from low resolution sub-aperture images using a framework of super-resolution reconstruction. In this reconstruction, these sub-aperture images should be registered as accurately as possible. This registration is equivalent to depth estimation. Therefore, we propose a method where super-resolution and depth refinement are performed alternately. Most of the process of our method is implemented by image processing operations. We present several experimental results using a Lytro camera, where we increased the resolution of a sub-aperture image by three times horizontally and vertically. Our method can produce clearer images compared to the original sub-aperture images and the case without depth refinement.

Ohashi, Kazuki; Takahashi, Keita; Fujii, Toshiaki

2014-03-01

216

Using camera calibration to apply digital image correlation outside the laboratory  

NASA Astrophysics Data System (ADS)

An innovative single-camera two-dimensional digital image correlation (DIC) technique based on camera self-calibration is developed for use in the field where specialized fixed setups are not practical. The technique only requires attaching a planar calibration cover to the specimen surface and capturing images of the specimen from different orientations before and after deformation. A camera calibration procedure allows the camera to be freely repositioned without fixed mounts or known configurations, after which displacements are calculated with DIC. Computer simulated, random speckle images are used to test the proposed technique and good results are reported. Compared with the classical techniques which require precise fixed setups, the proposed technique is easier to use and more flexible, advancing the DIC beyond the laboratory into the real world.
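
The self-calibration procedure itself is not detailed in this record; one standard building block for relating a planar calibration cover to its image is a direct linear transform (DLT) homography estimate, sketched below without the usual coordinate normalization. This is offered as background, not as the paper's method.

```python
import numpy as np

def dlt_homography(world_pts, image_pts):
    """Estimate the 3x3 homography mapping planar world points (Nx2, N >= 4)
    to image points (Nx2) with the direct linear transform. Coordinate
    normalization, which improves conditioning, is omitted for brevity."""
    A = []
    for (X, Y), (x, y) in zip(world_pts, image_pts):
        A.append([-X, -Y, -1, 0, 0, 0, x * X, x * Y, x])
        A.append([0, 0, 0, -X, -Y, -1, y * X, y * Y, y])
    # The homography is the null vector of A (last row of V^T from the SVD).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```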

Liang, Zhenning; Yin, Bo; Dai, Xin; Mo, Jinqiu; Wang, Shigang

2013-12-01

217

UCXp camera imaging principle and key technologies of data post-processing  

NASA Astrophysics Data System (ADS)

The large format digital aerial camera product UCXp was introduced into the Chinese market in 2008; its images consist of 17310 columns and 11310 rows with a pixel size of 6 µm. The UCXp camera has many advantages compared with cameras of the same generation, with multiple lenses exposed almost at the same time and no oblique lenses. The camera has a complex imaging process whose principle will be detailed in this paper. On the other hand, the UCXp image post-processing method, including data pre-processing and orthophoto production, will be emphasized in this article. Based on data for the new Beichuan County, this paper will describe the data processing and its effects.

Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

2014-03-01

218

Matching Images to Models: Camera Calibration for 3-D Surface Reconstruction  

NASA Technical Reports Server (NTRS)

In a previous paper we described a system which recursively recovers a super-resolved three dimensional surface model from a set of images of the surface. In that paper we assumed that the camera calibration for each image was known. In this paper we solve two problems. Firstly, if an estimate of the surface is already known, the problem is to calibrate a new image relative to the existing surface model. Secondly, if no surface estimate is available, the relative camera calibration between the images in the set must be estimated. This will allow an initial surface model to be estimated. Results of both types of estimation are given.

Morris, Robin D.; Smelyanskiy, Vadim N.; Cheeseman. Peter C.; Norvig, Peter (Technical Monitor)

2001-01-01

219

In-plane displacement and strain measurements using a camera phone and digital image correlation  

NASA Astrophysics Data System (ADS)

In-plane displacement and strain measurements of planar objects by processing the digital images captured by a camera phone using digital image correlation (DIC) are performed in this paper. As a convenient communication tool for everyday use, the principal advantages of a camera phone are its low cost, easy accessibility, and compactness. However, when used as a two-dimensional DIC system for mechanical metrology, the assumed imaging model of a camera phone may be slightly altered during the measurement process due to camera misalignment, imperfect loading, sample deformation, and temperature variations of the camera phone, which can produce appreciable errors in the measured displacements. In order to obtain accurate DIC measurements using a camera phone, the virtual displacements caused by these issues are first identified using an unstrained compensating specimen and then corrected by means of a parametric model. The proposed technique is first verified using in-plane translation and out-of-plane translation tests. Then, it is validated through a determination of the tensile strains and elastic properties of an aluminum specimen. Results of the present study show that accurate DIC measurements can be conducted using a common camera phone provided that an adequate correction is employed.
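
A minimal sketch of the correction idea described above, under the assumption that the "virtual" displacements measured on the unstrained compensating specimen can be modeled by a low-order polynomial surface and subtracted from the test-specimen measurement. The polynomial order and function names are illustrative; the paper's exact parametric model is not reproduced here.

```python
import numpy as np

def _design_matrix(x, y, order):
    """Polynomial terms 1, x, y, x^2, xy, y^2, ... up to the given order."""
    terms = [np.ones_like(x)]
    for i in range(1, order + 1):
        for j in range(i + 1):
            terms.append(x ** (i - j) * y ** j)
    return np.column_stack(terms)

def fit_virtual_displacement(x, y, u_virtual, order=1):
    """Fit a low-order polynomial surface to the apparent ('virtual')
    displacements measured by DIC on the unstrained compensating specimen."""
    A = _design_matrix(x, y, order)
    coeffs, *_ = np.linalg.lstsq(A, u_virtual, rcond=None)
    return coeffs

def correct_displacement(x, y, u_measured, coeffs, order=1):
    """Remove the modeled virtual displacement from the test-specimen measurement."""
    return u_measured - _design_matrix(x, y, order) @ coeffs
```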

Yu, Liping; Pan, Bing

2014-05-01

220

X-ray imaging using a consumer-grade digital camera  

NASA Astrophysics Data System (ADS)

The recent advancements in consumer-grade digital camera technology and the introduction of high-resolution, high sensitivity CsBr:Eu²⁺ storage phosphor imaging plates make possible a new cost-effective technique for X-ray imaging. The imaging plate is bathed with red stimulating light by high-intensity light-emitting diodes, and the photostimulated image is captured with a digital single-lens reflex (SLR) camera. A blue band-pass optical filter blocks the stimulating red light but transmits the blue photostimulated luminescence. Using a Canon D5 Mk II camera and an f1.4 wide-angle lens, the optical image of a 240×180 mm² Konica CsBr:Eu²⁺ imaging plate from a position 230 mm in front of the camera lens can be focussed so as to laterally fill the 35×23.3 mm² camera sensor, and recorded in 2808×1872 pixel elements, corresponding to an equivalent pixel size on the plate of 88 µm. The analogue-to-digital conversion from the camera electronics is 13 bits, but the dynamic range of the imaging system as a whole is limited in practice by noise to about 2.5 orders of magnitude. The modulation transfer function falls to 0.2 at a spatial frequency of 2.2 line pairs/mm. The limiting factor of the spatial resolution is light scattering in the plate rather than the camera optics. The limiting factors for signal-to-noise ratio are shot noise in the light, and dark noise in the CMOS sensor. Good quality images of high-contrast objects can be recorded with doses of approximately 1 mGy. The CsBr:Eu²⁺ plate has approximately three times the readout sensitivity of a similar BaFBr:Eu²⁺ plate.

Winch, N. M.; Edgar, A.

2011-10-01

221

Mid-Wave Infrared Camera for High Speed Hyperspectral Imaging.  

National Technical Information Service (NTIS)

The MWIR imaging system acquired under the DURIP grant was incorporated into a rotating compensator imaging polarimeter. In such systems, several images of the target scene are captured through a retarder and analyzer, with the retarder rotated to a diffe...

D. Sabatke E. Dereniak J. Garcia

2001-01-01

222

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

2011-01-30

223

Imaging high-dimensional spatial entanglement with a camera  

PubMed Central

The light produced by parametric down-conversion shows strong spatial entanglement that leads to violations of EPR criteria for separability. Historically, such studies have been performed by scanning a single-element, single-photon detector across a detection plane. Here we show that modern electron-multiplying charge-coupled device cameras can measure correlations in both position and momentum across a multi-pixel field of view. This capability allows us to observe entanglement of around 2,500 spatial states and demonstrate Einstein–Podolsky–Rosen type correlations by more than two orders of magnitude. More generally, our work shows that cameras can lead to important new capabilities in quantum optics and quantum information science.

Edgar, M.P.; Tasca, D.S.; Izdebski, F.; Warburton, R.E.; Leach, J.; Agnew, M.; Buller, G.S.; Boyd, R.W.; Padgett, M.J.

2012-01-01

224

The Lunar Student Imaging Project (LSIP): Bringing the Excitement of Lunar Exploration to Students Using LRO Mission Data  

NASA Astrophysics Data System (ADS)

In June 2009, NASA launched the Lunar Reconnaissance Orbiter (LRO) - the first mission in NASA's Vision for Space Exploration, a plan to return to the Moon and then to travel to Mars and beyond. LRO is equipped with seven instruments including the Lunar Reconnaissance Orbiter Camera (LROC), a system of two narrow-angle cameras and one wide-angle camera, controlled by scientists in the School of Earth and Space Exploration at Arizona State University. The orbiter will have a one-year primary mission in a 50 km polar orbit. The measurements from LROC will uncover much-needed information about potential landing sites and will help generate a meter-scale map of the lunar surface. With support from NASA Goddard Space Flight Center, the LROC Science Operations Center and the ASU Mars Education Program have partnered to develop an inquiry-based student program, the Lunar Student Imaging Project (LSIP). Based on the nationally recognized Mars Student Imaging Project (MSIP), LSIP uses cutting-edge NASA content and remote sensing data to involve students in authentic lunar exploration. This program offers students (grades 5-14) immersive experiences where they can: 1) target images of the lunar surface, 2) interact with NASA planetary scientists, mission engineers and educators, and 3) gain access to NASA curricula and materials developed to enhance STEM learning. Using a project based learning model, students drive their own research and learn first hand what it's like to do real planetary science. The LSIP curriculum contains a resource manual and program guide (including lunar feature identification charts, classroom posters, and a lunar exploration timeline) and a series of activities covering image analysis, relative age dating and planetary comparisons. LSIP will be based upon the well-tested MSIP model, and will encompass onsite as well as distance learning components.

Taylor, W. L.; Roberts, D.; Burnham, R.; Robinson, M. S.

2009-12-01

225

Design considerations of color image processing pipeline for digital cameras  

Microsoft Academic Search

Although many individual image processing steps have been well addressed in the field, very few good image pipeline designs have been proposed to integrate these processing stages. In this paper, a new color image processing pipeline (IPP), which processes the raw image data captured from CCD/CMOS sensors and converts it to the final color with exposure corrected, is presented; it bridges the

Wen-Chung Kao; Sheng-Hong Wang; Lien-Yang Chen; Sheng-Yuan Lin

2006-01-01

226

Design and evaluation of gamma imaging systems of Compton and hybrid cameras  

NASA Astrophysics Data System (ADS)

Systems for imaging and spectroscopy of gamma-ray emission have been widely applied in environmental and medical applications. The superior performance of LaBr3:Ce detectors establishes them as excellent candidates for imaging and spectroscopy of gamma rays. In this work, Compton cameras and hybrid cameras with a two-plane array of LaBr3:Ce detectors, one plane for the scattering and one for the absorbing detector array, were designed and investigated. The feasibility of using LaBr3 in Compton cameras was evaluated with a bench-top experiment in which two LaBr3:Ce detectors were arranged to mimic a Compton camera with one scattering and eight absorbing detectors. In the hybrid system, the combination of the imaging methods of Compton and coded aperture cameras enables the system to cover the energy range of approximately 100 keV to a few MeV with good efficiency and angular resolution. The imaging performance of the hybrid imaging system was evaluated via Monte Carlo simulations. Image reconstruction algorithms based on direct back-projection were applied for instant or real-time imaging applications; this imaging system is capable of achieving an angular resolution of approximately 0.3 radians (17°). With image reconstruction algorithms of Expectation Maximized Likelihood, the image quality was improved to approximately 0.1 radians (or 6°). For medical applications in proton therapy, a Compton camera system to image the gamma-ray emission during treatment was designed and investigated. Gamma rays and X-rays emitted during treatment illustrate the energy deposition along the path of the proton beams and provide an opportunity for online dose verification. This Compton camera is designed to be capable of imaging gamma rays in 3D and is one of the candidates for imaging gamma emission during proton therapy treatment, besides the approach of positron emission tomography. In order to meet the requirement for a spatial resolution of approximately 5 mm or less to meaningfully verify the dose via imaging gamma rays of 511 keV to 2 MeV, position sensing techniques with pixelated LaBr3(Ce) crystals were applied in each detector. The pixelated LaBr3(Ce) crystal was used in both the scattering and absorbing detectors. Image reconstruction algorithms of OS-EML were applied to obtain 3D images.

Feng, Yuxin

227

Robust real time extraction of plane segments from time-of-flight camera images  

NASA Astrophysics Data System (ADS)

We present a method that extracts plane segments from images of a time-of-flight camera. Future driver assistance systems rely on an accurate description of the vehicle's environment. Time-of-flight cameras can be used for environment perception and for the reconstruction of the environment. Since most structures in urban environments are planar, plane segments extracted from single camera images can be used for the creation of a global map. We present a method for real-time detection of planar surface structures from time-of-flight camera data. The concept is based on a planar surface segmentation that serves as the foundation for a subsequent global planar surface extraction. The evaluation demonstrates the ability of the described algorithm to detect planar surfaces from depth data of complex scenarios in real time. We compare our method to state-of-the-art planar surface extraction algorithms.
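
The paper's segmentation-based plane extraction is not reproduced in this record; as background, the sketch below shows a common alternative building block, a RANSAC fit of a dominant plane to a time-of-flight point cloud. Iteration count and inlier tolerance are assumed values.

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.02, rng=None):
    """Fit a single dominant plane to an Nx3 point cloud with RANSAC.
    Returns (normal, d, inlier_mask) for the plane n.x + d = 0."""
    rng = np.random.default_rng(rng)
    points = np.asarray(points, dtype=float)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -np.dot(n, sample[0])
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    if best_plane is None:
        raise ValueError("no valid plane hypothesis found")
    return best_plane[0], best_plane[1], best_inliers
```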

Dalbah, Yosef; Koltermann, Dirk; Wahl, Friedrich M.

2014-04-01

228

Optical characterization of UV multispectral imaging cameras for SO2 plume measurements  

NASA Astrophysics Data System (ADS)

Only a few years ago, spectral imaging cameras for SO2 plume monitoring were developed for remote sensing of volcanic plumes. We describe the development from a first camera using a single filter in the absorption band of SO2 to more advanced systems using several filters and an integrated spectrometer. The first system was based on the Hamamatsu C8484 UV camera (1344 x 1024 pixels) with high quantum efficiency in the UV region from 280 nm onward. At the heart of the second UV camera system, EnviCam, is a cooled Alta U47 camera, equipped with two on-band (310 and 315 nm) and two off-band (325 and 330 nm) filters. The third system again utilizes the uncooled Hamamatsu camera for faster sampling (~10 Hz) and a four-position filter wheel equipped with two 10 nm filters centered at 310 and 330 nm, a UV broadband view and a blackened plate for dark-current measurement. Both cameras have been tested with lenses of different focal lengths. A co-aligned spectrometer provides a ~0.3 nm resolution spectrum within the field-of-view of the camera. We describe the ground-based imaging camera systems developed and utilized at our Institute. Custom made cylindrical quartz calibration cells with 50 mm diameter, to cover the entire field of view of the camera optics, are filled with various amounts of gaseous SO2 (typically between 100 and 1500 ppm•m). They are used for calibration and characterization of the cameras in the laboratory. We report on the procedures for monitoring and analyzing SO2 path-concentration and fluxes. This includes a comparison of the calibration in the atmosphere using the SO2 cells versus the SO2 retrieval from the integrated spectrometer. The first UV cameras have been used to monitor ship emissions (Ny-Ålesund, Svalbard and Genova, Italy). The second generation of cameras was first tested for industrial stack monitoring during a field campaign close to the Rovinari (Romania) power plant in September 2010, revealing very high SO2 emissions (> 1000 ppm•m). The second generation cameras are now used by students from several universities in Romania. The newest system has been tested for volcanic plume monitoring at Turrialba, Costa Rica in January 2011, at Merapi volcano, Indonesia in February 2011, at Lascar volcano in Chile in July 2011 and at Etna/Stromboli (Italy) in November 2011. Retrievals from some of these campaigns will be presented.
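
A minimal sketch of the usual two-filter SO2 camera retrieval implied above, assuming co-registered on-band and off-band images plus clear-sky background images: compute an apparent absorbance, calibrate its slope against the quartz cells of known column amount, and convert images to ppm·m. Details such as light-dilution corrections are omitted, and the exact retrieval used by the authors may differ.

```python
import numpy as np

def apparent_absorbance(on_band, off_band, on_clear, off_clear):
    """Apparent absorbance from on-band (~310 nm) and off-band (~330 nm)
    images, ratioed against clear-sky background images:
    A = -ln[(I_on / I_on_clear) / (I_off / I_off_clear)]."""
    ratio = (on_band / on_clear) / (off_band / off_clear)
    return -np.log(np.clip(ratio, 1e-6, None))

def fit_calibration(cell_column_amounts, cell_absorbances):
    """Linear calibration A = k * S from cells of known SO2 column amounts
    (ppm·m); returns the slope k by least squares through the origin."""
    S = np.asarray(cell_column_amounts, dtype=float)
    A = np.asarray(cell_absorbances, dtype=float)
    return float(np.sum(S * A) / np.sum(S * S))

def so2_column(on_band, off_band, on_clear, off_clear, k):
    """Convert an image pair to an SO2 column-amount map (ppm·m)."""
    return apparent_absorbance(on_band, off_band, on_clear, off_clear) / k
```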

Stebel, K.; Prata, F.; Dauge, F.; Durant, A.; Amigo, A.,

2012-04-01

229

The Mast Cameras and Mars Descent Imager (MARDI) for the 2009 Mars Science Laboratory  

NASA Technical Reports Server (NTRS)

Based on operational experience gained during the Mars Exploration Rover (MER) mission, we proposed and were selected to conduct two related imaging experiments: (1) an investigation of the geology and short-term atmospheric vertical wind profile local to the Mars Science Laboratory (MSL) landing site using descent imaging, and (2) a broadly-based scientific investigation of the MSL locale employing visible and very near infra-red imaging techniques from a pair of mast-mounted, high resolution cameras. Both instruments share a common electronics design, a design also employed for the MSL Mars Hand Lens Imager (MAHLI) [1]. The primary differences between the cameras are in the nature and number of mechanisms and specific optics tailored to each camera's requirements.

Malin, M. C.; Bell, J. F.; Cameron, J.; Dietrich, W. E.; Edgett, K. S.; Hallet, B.; Herkenhoff, K. E.; Lemmon, M. T.; Parker, T. J.; Sullivan, R. J.

2005-01-01

230

Microchannel plate pinhole camera for 20 to 100 keV x-ray imaging  

SciTech Connect

We present the design and construction of a sensitive pinhole camera for imaging suprathermal x-rays. Our device is a pinhole camera consisting of four filtered pinholes and a microchannel plate electron multiplier for x-ray detection and signal amplification. We report successful imaging of 20, 45, 70, and 100 keV x-ray emissions from fusion targets at our Novette laser facility. Such imaging reveals features of the transport of hot electrons and provides views deep inside the target.

Wang, C.L.; Leipelt, G.R.; Nilson, D.G.

1984-10-03

231

Reduced isotope dose with rapid SPECT MPI imaging: Initial experience with a CZT SPECT camera  

Microsoft Academic Search

Background: Cardiac SPECT imaging needs to become shorter and use lower radiation doses to compete with other available noninvasive imaging modalities. Recently introduced cadmium zinc telluride (CZT) SPECT cameras have the potential to achieve both goals, but clinical experience is so far limited. Methods: Images of all patients undergoing a stress MPI over a 4-month period using a CZT camera (Discovery NM

Lori B. Croft; Tapan Godiwala; Eric Ginsberg; Titus George; Milena J. Henzlova

2010-01-01

232

Reduced isotope dose and imaging time with a high-efficiency CZT SPECT camera  

Microsoft Academic Search

Background: In light of recent focus on diagnostic imaging, cardiac SPECT imaging needs to become a shorter test with lower radiation exposure to patients. Recently introduced Cadmium Zinc Telluride (CZT) cameras have the potential to achieve both goals. Methods: During a 2-month period patients presenting for a Tc-99m sestamibi SPECT MPI study were imaged using a CZT camera using a low-dose rest-stress

Lori B. Croft; Eric S. Ginsberg; Andrew J. Einstein; Krista A. Guma; Titus George; Milena J. Henzlova

233

Hybrid Compton camera/coded aperture imaging system  

DOEpatents

A system in one embodiment includes an array of radiation detectors; and an array of imagers positioned behind the array of detectors relative to an expected trajectory of incoming radiation. A method in another embodiment includes detecting incoming radiation with an array of radiation detectors; detecting the incoming radiation with an array of imagers positioned behind the array of detectors relative to a trajectory of the incoming radiation; and performing at least one of Compton imaging using at least the imagers and coded aperture imaging using at least the imagers. A method in yet another embodiment includes detecting incoming radiation with an array of imagers positioned behind an array of detectors relative to a trajectory of the incoming radiation; and performing Compton imaging using at least the imagers.

Mihailescu, Lucian (Livermore, CA); Vetter, Kai M. (Alameda, CA)

2012-04-10

234

A Digital Camera and Real-time Image correction for use in Edge Location  

Microsoft Academic Search

The accuracy of a depth map acquired using triangulation or stereo techniques is limited by the resolution of the sensor, and by the accuracy with which distortions of the image can be calibrated. This paper lists the sources of error in an imaging system, and concludes that most of them can be removed by careful camera design

D. Hutber; S. Wright

235

Novel color processing architecture for digital cameras with CMOS image sensors  

Microsoft Academic Search

This paper presents a color processing architecture for digital color cameras utilizing complementary metal oxide semiconductor (CMOS) image sensors. The proposed architecture gives due consideration to the peculiar aspects of CMOS image sensors and the human visual perception related to the particular application of digital color photography. A main difference between the proposed method and the conventional systems is the

Chaminda Weerasinghe; Wanqing Li; Igor Kharitonenko; Magnus Nilsson; Sue Twelves

2005-01-01

236

Acoustic Mine Imaging (AMI) project: An underwater acoustic camera for use in mine warfare  

Microsoft Academic Search

This paper is submitted to detail the advances in sonar and imaging techniques and synthetic apertures being made in Australia by Thales Underwater Systems within an Australian Defence Acquisition Project termed Acoustic Mine Imaging (AMI). This paper will detail the development of the AMI underwater acoustic camera for the detection, classification and characterization of mines and other underwater objects in

Colin Ellis; Ed Murphy

2001-01-01

237

Fault Detection on Transmission Lines Using a Microphone Array and an Infrared Thermal Imaging Camera  

Microsoft Academic Search

This paper proposes a hierarchical fault detection method for transmission lines using a microphone array to detect the location of a fault and thermal imaging and charge coupled device (CCD) cameras to verify the fault and store the image, respectively. There are partial arc discharges on faulty insulators which generate specific patterns of sound. By detecting these patterns using the

Hyunuk Ha; Sunsin Han; Jangmyung Lee

2012-01-01

238

CRISP: Coarse-Grained Reconfigurable Image Stream Processor for Digital Still Cameras and Camcorders  

Microsoft Academic Search

To design the hardware for image signal processing pipelines in digital still cameras (DSCs) and video camcorders, it is a dilemma for conventional solutions, such as application-specific integrated circuits (ASICs) and digital signal processors (DSPs), to achieve high processing capability at low cost while maintaining high flexibility for various algorithms. With the observation of the characteristics of image signal-processing pipelines,

Jason C. Chen; Shao-yi Chien

2008-01-01

239

Constructing a Database from Multiple 2D Images for Camera Pose Estimation and Robot Localization  

NASA Technical Reports Server (NTRS)

The LMDB (Landmark Database) Builder software identifies persistent image features (landmarks) in a scene viewed multiple times and precisely estimates the landmarks' 3D world positions. The software receives as input multiple 2D images of approximately the same scene, along with an initial guess of the camera poses for each image, and a table of features matched pair-wise in each frame. LMDB Builder aggregates landmarks across an arbitrarily large collection of frames with matched features. Range data from stereo vision processing can also be passed in to improve the initial guess of the 3D point estimates. The LMDB Builder aggregates feature lists across all frames, manages the process to promote selected features to landmarks, iteratively calculates the 3D landmark positions using the current camera pose estimations (via an optimal ray projection method), and then improves the camera pose estimates using the 3D landmark positions. Finally, it extracts image patches for each landmark from auto-selected key frames and constructs the landmark database. The landmark database can then be used to estimate future camera poses (and therefore localize a robotic vehicle that may be carrying the cameras) by matching current imagery to landmark database image patches and using the known 3D landmark positions to estimate the current pose.
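
The "optimal ray projection" step is not specified further in this record; a common least-squares formulation is to choose the 3D point closest to all camera rays, sketched below under the assumption that camera centers and unit ray directions are available.

```python
import numpy as np

def triangulate_rays(centers, directions):
    """Least-squares 3D point closest to a set of camera rays, given camera
    centers (Nx3) and ray directions (Nx3). One possible realization of a
    ray-projection triangulation step; not necessarily the LMDB Builder method."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(np.asarray(centers, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)
```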

Wolf, Michael; Ansar, Adnan I.; Brennan, Shane; Clouse, Daniel S.; Padgett, Curtis W.

2012-01-01

240

A high-speed, pressurised multi-wire gamma camera for dynamic imaging in nuclear medicine  

NASA Astrophysics Data System (ADS)

High count rate detectors are of particular interest in nuclear medicine as they permit lower radiation doses to be received by the patient and allow dynamic images of high statistical quality to be obtained. We have developed a high-speed gamma camera based on a multi-wire proportional chamber. The chamber is filled with a xenon gas mixture and has been operated at pressures ranging from 5 to 10 bar. With an active imaging area of 25 cm×25 cm, the chamber has been equipped with an advanced, high rate, digital, electronic read-out system which carries out pulse shaping, energy discrimination, XY coincidence and cluster selection at speeds of up to a few megahertz. In order to ensure stable, long-term operation of the camera without degradation in performance, a gas purification system was designed and integrated into the camera. Measurements have been carried out to determine the properties and applicability of the camera using photon sources in the 20-120 keV energy range. We present some design features of the camera and selected results obtained from preliminary measurements carried out to measure its performance characteristics. Initial images obtained from the camera will also be presented.

Barr, A.; Bonaldi, L.; Carugno, G.; Charpak, G.; Iannuzzi, D.; Nicoletto, M.; Pepato, A.; Ventura, S.

2002-01-01

241

Review of the characteristics of 384x288 pixel THz camera for see-through imaging  

NASA Astrophysics Data System (ADS)

Terahertz is a field in constant expansion. Multiple applications are foreseen, including see-through imaging. To develop deployable systems, real-time two-dimensional cameras are needed rather than monopixel detectors or linear arrays that require mechanical scanning systems. INO has recently developed a real-time (video rate) 384x288 THz camera exhibiting excellent sensitivity and low noise levels. The core of the THz imager is the 35 μm pitch detector array that is based on INO's uncooled VOx microbolometer technology and fabricated in INO's clean room. A standard ceramic package is used for final packaging. The detector FPA is finally sealed with a high resistivity float zone silicon (HRFZ-Si) window having an anti-reflective coating consisting of thick Parylene, whose thickness depends on the required optimization wavelength. The FPA is mounted on an INO IRXCAM core, giving a passive THz camera assembly. The additional THz objective consists of a refractive 44 mm focal length F/1 THz lens. In this paper, a review of the characteristics of the THz camera is performed. The sensitivity of the camera at various THz wavelengths is presented along with examples of the resolution obtained with the IRXCAM-384-THz camera core. See-through imaging results are also presented.

Marchese, Linda; Terroux, Marc; Genereux, Francis; Tremblay, Bruno; Bolduc, Martin; Bergeron, Alain

2013-10-01

242

Using commercial photo camera's RAW-based images in optical-digital correlator for pattern recognition  

NASA Astrophysics Data System (ADS)

In optical-digital correlators for pattern recognition, linear registration of correlation signals is significant both for recognition reliability and for possible input image restoration. This is usually achieved with calibrated scientific cameras, but most commercial digital cameras now offer RAW data output. With appropriate software and processing parameters, it is possible to obtain linearized image data from a photo camera's RAW file. Using such photo cameras makes optical-digital systems cheaper and more flexible and promotes their wider adoption. For linear registration of correlation signals, Dave Coffin's open-source RAW converter DCRAW was used in this work. Data from the photo camera were linearized by the DCRAW converter in "totally RAW documental mode" with 16-bit output. Experimental results comparing linearized and non-linearized correlation signals and digitally restored input scene images are presented. It is shown that the applied linearization increases the linear dynamic range of the Canon EOS 400D camera used by more than a factor of 3.
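
For readers wanting to reproduce the linearization step, the sketch below shows one way to drive DCRAW from Python to obtain non-demosaiced, linear 16-bit output, roughly matching the "totally RAW documental mode" mentioned above. The flag choice (-D, -4, -T) is an assumption based on DCRAW's documented options and may differ from the authors' exact invocation; the file name is hypothetical.

```python
# Minimal sketch: linearised 16-bit output from a camera RAW file with DCRAW.
import subprocess

raw_file = "IMG_0001.CR2"   # hypothetical Canon EOS 400D RAW file
# -D : document mode, no scaling or demosaicing; -4 : linear 16-bit; -T : TIFF
subprocess.run(["dcraw", "-D", "-4", "-T", raw_file], check=True)
# DCRAW writes IMG_0001.tiff next to the input: a 16-bit, non-demosaiced,
# linear image suitable for quantitative use of the correlation signals.
```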

Starikov, Sergey N.; Konnik, Mikhail V.

2008-03-01

243

Toward the design of a positron volume imaging camera.  

National Technical Information Service (NTIS)

Three different computing algorithms for performing positron emission image reconstruction have been compared using Monte Carlo phantom simulations. The work was motivated by the recent announcement of the commercial availability of a positron volume imag...

J. G. Rogers; M. Stazyk; R. Harrop; C. J. Dykstra; J. S. Barney

1990-01-01

244

Precise color images a high-speed color video camera system with three intensified sensors  

NASA Astrophysics Data System (ADS)

High speed imaging systems have been used in a large field of science and engineering. Although high speed camera systems have been improved to high performance, most of their applications are only to obtain high speed motion pictures. However, in some fields of science and technology, it is useful to obtain other information as well, such as the temperature of combustion flames, thermal plasmas and molten materials. Recent digital high speed video imaging technology should be able to extract such information from those objects. For this purpose, we have already developed a high speed video camera system with three intensified sensors and a cubic prism image splitter. The maximum frame rate is 40,500 pps (pictures per second) at 64 X 64 pixels and 4,500 pps at 256 X 256 pixels with 256 (8 bit) intensity resolution for each pixel. The camera system can store more than 1,000 pictures continuously in solid state memory. In order to obtain precise color images from this camera system, we need a digital technique, consisting of a computer program and ancillary instruments, to adjust the displacement of images taken from two or three image sensors and to calibrate the relationship between incident light intensity and the corresponding digital output signals. In this paper, the digital technique for pixel-based displacement adjustment is proposed. Although the displacement of the corresponding circle was more than 8 pixels in the original image, it was reduced to at most 0.2 pixels by this method.
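
The paper's own displacement-adjustment program is not described in detail here; as a hedged illustration of sub-pixel registration between two sensor channels, the following sketch uses phase cross-correlation from scikit-image, an assumed substitute rather than the authors' method.

```python
import numpy as np
from skimage.registration import phase_cross_correlation
from scipy.ndimage import shift as nd_shift

def align_channel(reference, moving, upsample_factor=10):
    """Estimate the (row, col) shift of 'moving' relative to 'reference'
    to 1/upsample_factor pixel accuracy and resample 'moving' onto the
    reference grid."""
    estimated_shift, _, _ = phase_cross_correlation(
        reference, moving, upsample_factor=upsample_factor)
    # Apply the estimated shift so the channel overlays the reference image.
    return nd_shift(moving, shift=estimated_shift), estimated_shift
```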

Oki, Sachio; Yamakawa, Masafumi; Gohda, Susumu; Etoh, Takeharu

1999-06-01

245

Initial synchroscan streak camera imaging at the A0 photoinjector  

SciTech Connect

At the Fermilab A0 photoinjector facility, bunch-length measurements of the laser micropulse and the e-beam micropulse have been done in the past with a single-sweep module of the Hamamatsu C5680 streak camera with an intrinsic shot-to-shot trigger jitter of 10 to 20 ps. We have upgraded the camera system with the synchroscan module tuned to 81.25 MHz to provide synchronous summing capability with less than 1.5-ps FWHM trigger jitter and a phase-locked delay box to provide phase stability of ~1 ps over tens of minutes. This allowed us to measure both the UV laser pulse train at 244 nm and the e-beam via optical transition radiation (OTR). Due to the low electron beam energies and OTR signals, we typically summed over 50 micropulses with 1 nC per micropulse. We also did electron beam bunch length vs. micropulse charge measurements to identify a significant e-beam micropulse elongation from 10 to 30 ps (FWHM) for charges from 1 to 4.6 nC. This effect is attributed to space-charge effects in the PC gun as reproduced by ASTRA calculations. Chromatic temporal dispersion effects in the optics were also characterized and will be reported.

Lumpkin, A.H.; Ruan, J.; /Fermilab

2008-04-01

246

3D image processing architecture for camera phones  

NASA Astrophysics Data System (ADS)

Putting high quality and easy-to-use 3D technology into the hands of regular consumers has become a recent challenge as interest in 3D technology has grown. Making 3D technology appealing to the average user requires that it be made fully automatic and foolproof. Designing a fully automatic 3D capture and display system requires: 1) identifying critical 3D technology issues like camera positioning, disparity control rationale, and screen geometry dependency, and 2) designing methodology to automatically control them. Implementing 3D capture functionality on phone cameras necessitates designing algorithms to fit within the processing capabilities of the device. Various constraints like sensor position tolerances, sensor 3A tolerances, post-processing, 3D video resolution and frame rate should be carefully considered for their influence on the 3D experience. Issues with migrating functions such as zoom and pan from the 2D usage model (both during capture and display) to 3D need to be resolved to ensure the highest level of user experience. It is also very important that the 3D usage scenario (including interactions between the user and the capture/display device) is carefully considered. Finally, both the processing power of the device and the practicality of the scheme need to be taken into account while designing the calibration and processing methodology.

Atanassov, Kalin; Ramachandra, Vikas; Goma, Sergio R.; Aleksic, Milivoje

2011-01-01

247

Validation of 3D surface imaging in breath-hold radiotherapy for breast cancer: one central camera unit versus three camera units  

NASA Astrophysics Data System (ADS)

In this work we investigated the benefit of using two lateral camera units in addition to a central camera unit for 3D surface imaging for image guidance in deep-inspiration breath-hold (DIBH) radiotherapy, by comparison with cone-beam computed tomography (CBCT). Ten patients who received DIBH radiotherapy after breast-conserving surgery were included. The performance of surface imaging using one and three camera units was compared to using CBCT for setup verification. Breast-surface registrations were performed for CBCT as well as for 3D surfaces, captured concurrently with CBCT, to planning CT. The resulting setup errors were compared with linear regression analysis. For the differences between setup errors an assessment of the group mean, systematic error, random error, and 95% limits of agreement was made. Correlations between derived surface-imaging [one camera unit; three camera units] and CBCT setup errors were: R2=[0.67; 0.75], [0.76; 0.87], [0.88; 0.91] in the left-right, cranio-caudal, and anterior-posterior directions, respectively. Group mean, systematic and random errors were slightly smaller (sub-millimeter differences) and the limits of agreement were 0.10 to 0.25 cm tighter when using three camera units compared with one. For the majority of the data, the use of three camera units compared with one resulted in setup errors more similar to the CBCT-derived setup errors for the cranio-caudal and anterior-posterior directions (p<0.01, Wilcoxon signed-ranks test). This study shows a better correlation and agreement between 3D surface imaging and CBCT when three camera units are used instead of one, and further outlines the conditions under which the benefit of using three camera units is significant.
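
As a hedged illustration of the statistics quoted above (group mean, systematic error, random error and 95% limits of agreement of the per-fraction differences between the two modalities), the sketch below uses conventional definitions; the paper's exact formulas are not reproduced, and the function and variable names are hypothetical.

```python
import numpy as np

def setup_error_stats(diff_per_patient):
    """diff_per_patient: list of 1-D arrays, one array of per-fraction
    differences (surface imaging minus CBCT, in cm) per patient."""
    patient_means = np.array([d.mean() for d in diff_per_patient])
    patient_sds = np.array([d.std(ddof=1) for d in diff_per_patient])
    all_diffs = np.concatenate(diff_per_patient)

    group_mean = patient_means.mean()                 # M
    systematic = patient_means.std(ddof=1)            # Sigma: SD of patient means
    random_err = np.sqrt(np.mean(patient_sds ** 2))   # sigma: RMS of patient SDs
    loa = (all_diffs.mean() - 1.96 * all_diffs.std(ddof=1),
           all_diffs.mean() + 1.96 * all_diffs.std(ddof=1))  # 95% limits of agreement
    return group_mean, systematic, random_err, loa
```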

Alderliesten, Tanja; Betgen, Anja; van Vliet-Vroegindeweij, Corine; Remeijer, Peter

2013-03-01

248

Real-Time Adjusting of Yaw Angle of Space Camera Based on General Image-Quality Equation  

Microsoft Academic Search

A Strategy of Real-Time Adjusting of Yaw Angle (SRTAYA) of space camera was proposed based on the General Image-Quality Equation (GIQE). During space camera imaging, the yaw angle, which is caused by the rotation of the earth and the attitude changes of the satellite, affects the integration of the TDI (Time Delay and Integration) CCD (Charge Coupled Device), so that the image quality degrades. For

Jinling Wang; Xiaojun He; Shuyan Xu

2009-01-01

249

A 5-18 micron array camera for high-background astronomical imaging  

NASA Technical Reports Server (NTRS)

A new infrared array camera system using a Hughes/SBRC 58 x 62 pixel hybrid Si:Ga array detector has been successfully applied to high-background 5-18-micron astronomical imaging observations. The off-axis reflective optical system minimizes thermal background loading and produces diffraction-limited images with negligible spatial distortion. The noise equivalent flux density (NEFD) of the camera at 10 microns on the 3.0-m NASA/Infrared Telescope Facility with broadband interference filters and 0.26 arcsec pixel is NEFD = 0.01 Jy/sq rt min per pixel (1 sigma), and it operates at a frame rate of 30 Hz with no compromise in observational efficiency. The electronic and optical design of the camera, its photometric characteristics, examples of observational results, and techniques for successful array imaging in a high-background astronomical application are discussed.

Gezari, Daniel Y.; Folz, Walter C.; Woods, Lawrence A.; Varosi, Frank

1992-01-01

250

Evolution of INO Uncooled Infrared Cameras Towards Very High Resolution Imaging  

NASA Astrophysics Data System (ADS)

Over the years, INO has been involved in the development of various uncooled infrared devices. Today, infrared imagers exhibit good resolution and find their niche in numerous applications. Nevertheless, there is still a trend toward higher-resolution imaging for demanding applications. At the same time, low-resolution, low-cost imaging solutions are sought for mass-market applications. These two opposite requirements reflect the evolution of infrared cameras from the early days, when only low pixel-count FPAs were available, to the megapixel-count FPAs of recent years. This paper reviews the evolution of infrared camera technologies at INO, from the uncooled bolometer detector capability up to the recent achievement of a 1280×960 pixel infrared camera core using INO's patented microscan technology.

Bergeron, Alain; Jerominek, Hubert; Chevalier, Claude; Le Noc, Loïc; Tremblay, Bruno; Alain, Christine; Martel, Anne; Blanchard, Nathalie; Morissette, Martin; Mercier, Luc; Gagnon, Lucie; Couture, Patrick; Desnoyers, Nichola; Demers, Mathieu; Lamontagne, Frédéric; Lévesque, Frédéric; Verreault, Sonia; Duchesne, François; Lambert, Julie; Girard, Marc; Savard, Maxime; Châteauneuf, François

2011-02-01

251

Autofluorescence Images with Carl Zeiss versus Topcon Eye Fundus Camera: A Comparative Study  

PubMed Central

Purpose. To compare the autofluorescence images of the Zeiss versus Topcon eye fundus cameras and design an objective way to quantify them. Procedures. The ImageJ software was used to determine the gray level corresponding to the darkest veins and the peripapillary ring (thresholds), the level of white of the brightest perifoveal area, their difference (contrast level), and the suprathreshold area for each photograph. Results. Carl Zeiss has higher contrast values than Topcon. The Topcon contrast presented a crest with further decline as the suprathreshold area continued to increase. In contrast, the Zeiss profile did not decline. Conclusions and Message. The Carl Zeiss camera showed superior contrast ability over the Topcon when performing autofluorescence imaging. We set objective parameters to compare fundus camera FAF images. These parameters could be the basis for objectively measuring changes and for following up areas of hyper- or hypofluorescence.
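
One plausible reading of the contrast metric described above (brightest perifoveal level minus the dark threshold level, plus the suprathreshold area) is sketched below in NumPy; the region selection is assumed to be done manually as in ImageJ, and the authors' exact ImageJ workflow is not reproduced.

```python
import numpy as np

def faf_contrast(dark_region, bright_region, full_image):
    """dark_region: pixels over the darkest veins / peripapillary ring;
    bright_region: pixels over the brightest perifoveal area;
    full_image: the whole autofluorescence frame (all as NumPy arrays)."""
    threshold = dark_region.max()            # darkest-structure gray level
    white_level = bright_region.max()        # brightest perifoveal level
    contrast = white_level - threshold       # "contrast level"
    suprathreshold_area = int(np.count_nonzero(full_image > threshold))
    return contrast, suprathreshold_area
```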

Munoz, Juan M.; Coco, Rosa M.; Sanabria, M. Rosa; Cuadrado, Ruben; Blanco, Eduardo

2013-01-01

252

Modeling of Jovian Lightning Imaged by Galileo SSI Camera  

Microsoft Academic Search

We present a study of lightning on Jupiter and the clouds illuminated by the lightning. The Galileo images have a resolution of 25 km/pixel and are able to resolve the lightning flashes, which have full widths at half maximum in the range 90-160 km (Little et al. 1999). We compare the lightning images with the images produced by our 3D

U. A. Dyudina; A. P. Ingersoll

2000-01-01

253

A Customized Human Fall Detection System Using Omni-Camera Images and Personal Information  

Microsoft Academic Search

This paper proposes a new approach to detect the fall of the elderly. The detection system uses a MapCam (omni-camera) to capture images and performs image processing over the images. The personal information of each individual is considered in the processing task. The MapCam is used to capture 360-degree scenes simultaneously and eliminate any blind viewing zone. The personal information

Shaou-Gang Miaou; Pei-Hsu Sung; Chia-Yuan Huang

2006-01-01

254

Image processing for three-dimensional scans generated by time-of-flight range cameras  

NASA Astrophysics Data System (ADS)

Time-of-flight (TOF) full-field range cameras use a correlative imaging technique to generate three-dimensional measurements of the environment. Though reliable and cheap, they have the disadvantage of high measurement noise and errors that limit the practical use of these cameras in industrial applications. We show how some of these limitations can be overcome with standard image processing techniques specially adapted to TOF camera data. Additional information in the multimodal images recorded in this setting, and not available in standard image processing settings, can be used to improve reduction of measurement noise. Three extensions of standard techniques, wavelet thresholding, adaptive smoothing on a clustering based image segmentation, and an extended anisotropic diffusion filtering, make use of this information and are compared on synthetic data and on data acquired from two different off-the-shelf TOF cameras. Of these methods, the adapted anisotropic diffusion technique gives the best results, and can be implemented to run in real time using current graphics processing unit (GPU) hardware. Like traditional anisotropic diffusion, it requires some parameter adaptation to the scene characteristics, but allows for low visualization delay and improved visualization of moving objects by avoiding long averaging periods when compared to traditional TOF image denoising.
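
Of the three methods compared, anisotropic diffusion is a classical algorithm (Perona-Malik); the sketch below shows a minimal, generic implementation in NumPy, not the authors' TOF-adapted variant that exploits the additional multimodal information.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.15):
    """Perona-Malik diffusion: smooths noise while preserving edges.
    (np.roll wraps at the borders, which is acceptable for a sketch.)"""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # Finite-difference gradients toward the four neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        # Edge-stopping conductance (exponential variant)
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```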

Schöner, Holger; Bauer, Frank; Dorrington, Adrian; Heise, Bettina; Wieser, Volkmar; Payne, Andrew; Cree, Michael J.; Moser, Bernhard

2012-04-01

255

Imaging performance of a multiwire proportional-chamber positron camera  

SciTech Connect

A new, fully three-dimensional Positron Camera is presented, made of six MultiWire Proportional Chamber modules arranged to form the lateral surface of a hexagonal prism. A true coincidence rate of 56000 c/s is expected, with an equal accidental rate, for a 400 μCi activity uniformly distributed in an approximately 3 l water phantom. A detailed Monte Carlo program has been used to investigate the dependence of the spatial resolution on the geometrical and physical parameters. A spatial resolution of 4.8 mm FWHM has been obtained for a 18F point-like source in a 10 cm radius water phantom. The main properties of the limited angle reconstruction algorithms are described in relation to the proposed detector geometry.

Perez-Mandez, V.; Del Guerra, A.; Nelson, W.R.; Tam, K.C.

1982-08-01

256

Imaging from Compton cameras using detectors with no spatial resolution capability  

NASA Astrophysics Data System (ADS)

To mitigate image degradation due to the finite spatial resolution of detectors, a new theory is developed that allows Compton cameras to be comprised of detectors that have no spatial resolution at all. The results of a computer simulation indicate that the new theory can be used to produce reasonable images at least when noiseless simulated data is used. Camera designs that can exploit the new theory for use in areas such as medicine and homeland security are presented. Although the results presented here are promising, further effort is needed to establish their usefulness with real data.

Smith, Bruce

2012-10-01

257

A mobile phone-based retinal camera for portable wide field imaging.  

PubMed

Digital fundus imaging is used extensively in the diagnosis, monitoring and management of many retinal diseases. Access to fundus photography is often limited by patient morbidity, high equipment cost and shortage of trained personnel. Advancements in telemedicine methods and the development of portable fundus cameras have increased the accessibility of retinal imaging, but most of these approaches rely on separate computers for viewing and transmission of fundus images. We describe a novel portable handheld smartphone-based retinal camera capable of capturing high-quality, wide field fundus images. The use of the mobile phone platform creates a fully embedded system capable of acquisition, storage and analysis of fundus images that can be directly transmitted from the phone via the wireless telecommunication system for remote evaluation. PMID:24344230

Maamari, Robi N; Keenan, Jeremy D; Fletcher, Daniel A; Margolis, Todd P

2014-04-01

258

Camera sensor arrangement for crop/weed detection accuracy in agronomic images.  

PubMed

In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects. PMID:23549361

Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

2013-01-01

259

The image pretreatment based on the FPGA inside digital CCD camera  

NASA Astrophysics Data System (ADS)

In a space project, a digital CCD camera that can image clearly in a 1 lux light environment was required. The CCD sensor ICX285AL produced by Sony Corporation has been used in the CCD camera. The FPGA (Field Programmable Gate Array) chip XQR2V1000 has been used as a timing generator and a signal processor inside the CCD camera. In the low-light environment, however, two kinds of random noise become apparent because of the increased variable gain of the CCD camera: dark current noise in the image background and vertical transfer noise. A real-time method for eliminating this noise based on the FPGA inside the CCD camera is introduced. The causes and characteristics of the random noise have been analyzed. First, several approaches for eliminating dark current noise were considered; they were then simulated in VC++ to compare their speed and effect, and a Gaussian filter was chosen for its filtering performance. The vertical transfer noise has the property that the noise points occur at regular ordinates in the image's two-dimensional coordinates, and its behavior is fixed: the gray value of the noise points is 16-20 less than that of the surrounding pixels. According to these characteristics, a local median filter has been used to remove the vertical noise. Finally, these algorithms were ported to the FPGA chip inside the CCD camera. Extensive experiments have shown that the pretreatment has good real-time performance. The pretreatment improves the signal-to-noise ratio of the digital CCD camera by 3-5 dB in the low-light environment.
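
A software analogue of the two pretreatment steps (Gaussian filtering of the dark-current background and local median replacement of the fixed vertical-noise columns) is sketched below with SciPy. It is an illustration only, not the in-camera FPGA implementation; the assumption that the noisy column positions are known reflects the fixed nature of the noise described above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def pretreat(frame, vertical_noise_cols, sigma=1.0):
    """frame: 2-D image array; vertical_noise_cols: known noisy column indices."""
    # Gaussian filter suppresses the dark-current background noise.
    out = gaussian_filter(frame.astype(np.float32), sigma=sigma)
    # Replace the fixed noisy columns with a local median of their
    # 3x3 neighbourhood, mimicking the local median filter in the camera.
    med = median_filter(out, size=3)
    out[:, vertical_noise_cols] = med[:, vertical_noise_cols]
    return out
```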

Tian, Rui; Liu, Yan-Ying

2009-07-01

260

640x512 InGaAs focal plane array camera for visible and SWIR imaging  

NASA Astrophysics Data System (ADS)

We report on our 640x512 pixel InGaAs/InP focal plane array camera for visible and short-wavelength infrared imaging. For this camera, we have fabricated a 640x512 element substrate-removed backside-illuminated InGaAs/InP photodiode array (PDA) with a 25 μm pixel pitch. The PDA is indium bump bonded to a silicon readout integrated circuit. Removing the InP substrate from the focal plane array allows visible wavelengths, which would otherwise be absorbed by the InP substrate due to its 920 nm wavelength cut-off, to reach the pixels' active region. The quantum efficiency is approximately 15% at 500 nm, 70% at 850 nm, 85% at 1310 nm, and 80% at 1550 nm. Features incorporated into this video-rate, 14-bit output camera include external triggering, windowing, individual pixel correction, 8 operational settings of gain and exposure time, and gamma correction. The readout circuit uses a gate-modulated pixel for high sensitivity imaging over a wide illumination range. This camera is usable for visible imaging as well as for imaging eye-safe lasers, and is of particular interest for seeing laser designators, for night vision, and for hyperspectral imaging.

Martin, Tara; Brubaker, Robert; Dixon, Peter; Gagliardi, Mari-Anne; Sudol, Tom

2005-05-01

261

A correction method of the spatial distortion in planar images from γ-Camera systems  

NASA Astrophysics Data System (ADS)

A methodology for correcting spatial distortions in planar images for small Field Of View (FOV) γ-camera systems based on Position Sensitive Photomultiplier Tubes (PSPMT) and pixelated scintillation crystals is described. The process utilizes a correction matrix whose elements are derived from a prototyped planar image obtained through irradiation of the scintillation crystal by a 60Co point source and without a collimator. The method was applied to several planar images of a SPECT experiment with a simple phantom construction at different detection angles. The tomographic images are obtained using the Maximum-Likelihood Expectation-Maximization (MLEM) reconstruction technique. Corrected and uncorrected images are compared and the applied correction methodology is discussed.

Thanasas, D.; Georgiou, E.; Giokaris, N.; Karabarbounis, A.; Maintas, D.; Papanicolas, C. N.; Polychronopoulou, A.; Stiliaris, E.

2009-06-01

262

First Light with RATIR: An Automated 6-band Optical/NIR Imaging Camera  

NASA Astrophysics Data System (ADS)

The Reionization and Transients InfraRed camera (RATIR) is a simultaneous optical/NIR multi-band imaging camera which is 100% time-dedicated to the followup of Gamma-ray Bursts. The camera is mounted on the 1.5-meter Johnson telescope of the Mexican Observatorio Astronomico Nacional on Sierra San Pedro Martir in Baja California. With rapid slew capability and autonomous interrupt capabilities, the system will image GRBs in 6 bands (i, r, Z, Y, J, and H) within minutes of receiving a satellite position, detecting optically faint afterglows in the NIR and quickly alerting the community to potential GRBs at high redshift (z>6-10). We report here on this Spring's first light observing campaign with RATIR. We summarize the instrumental characteristics, capabilities, and observing modes.

Butler, Nat; Klein, Chris; Fox, Ori; Lotkin, Gennadiy; Bloom, Josh; Prochaska, J. Xavier; Ramirez-Ruiz, Enrico; de Diego, José A.; Georgiev, Leonid; González, Jesús; Lee, William H.; Richer, Michael G.; Román, Carlos; Watson, Alan M.; Gehrels, Neil; Kutyrev, Alexander; Bernstein, Rebecca; Alvarez, Luis Carlos; Ceseña, Urania; Clark, David; Colorado, Enrique; Córdova, Antolín.; Farah, Alejandro; García, Benjamín.; Guisa, Gerardo; Herrera, Joel; Lazo, Francisco; López, Eduardo; Luna, Esteban; Martínez, Benjamín.; Murillo, Francisco; Murillo, José Manuel; Núñez, Juan Manuel; Pedrayes, M. Herlinda; Quirós, Fernando; Ochoa, José Luis; Sierra, Gerardo; Moseley, Harvey; Rapchun, David; Robinson, Frederick D.; Samuel, Mathew V.; Sparr, Leroy M.

2012-09-01

263

High-speed two-camera imaging pyrometer for mapping fireball temperatures.  

PubMed

A high-speed imaging pyrometer was developed to investigate the behavior of flames and explosive events. The instrument consists of two monochrome high-speed Phantom v7.3 cameras made by Vision Research Inc., arranged so that one lens assembly collects light for both cameras. The cameras are filtered at 700 or 900 nm with a 10 nm bandpass. The high irradiance produced by blackbody emission, combined with variable shutter time and f-stop, produces properly exposed images. The wavelengths were chosen with the expected temperatures in mind, and also to avoid any molecular or atomic gas-phase emission. Temperatures of exploded TNT charges measured using this pyrometer are presented. PMID:22108886
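
The abstract does not give the data-reduction formula; a common reduction for a two-wavelength imaging pyrometer is ratio (two-colour) pyrometry under the Wien approximation and a graybody assumption, sketched below. This is shown only as a plausible illustration, not the authors' calibration.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant, m*K

def ratio_temperature(i1, i2, lam1=700e-9, lam2=900e-9):
    """Two-colour (ratio) pyrometry under the Wien approximation and a
    graybody assumption (equal emissivity at both wavelengths).

    i1, i2 : calibrated radiance images at lam1 and lam2 (same units).
    Returns a temperature map in kelvin.
    """
    ratio = i1 / i2
    # Wien: I(lam, T) ~ eps * lam**-5 * exp(-C2 / (lam * T))
    # => ln(R * (lam1/lam2)**5) = (C2/T) * (1/lam2 - 1/lam1)
    return C2 * (1.0 / lam2 - 1.0 / lam1) / np.log(ratio * (lam1 / lam2) ** 5)
```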

Densmore, John M; Homan, Barrie E; Biss, Matthew M; McNesby, Kevin L

2011-11-20

264

The application of camera calibration in range-gated 3D imaging technology  

NASA Astrophysics Data System (ADS)

Range-gated laser imaging technology was proposed in 1966 by L. F. Gillespie of the U.S. Army Night Vision Laboratory (NVL). Using a pulsed laser and an intensified charge-coupled device (ICCD) as light source and detector respectively, range-gated laser imaging can realize space-slice imaging while suppressing atmospheric backscatter, and in turn detect the target effectively, by controlling the delay between the laser pulse and the strobe. Owing to constraints on the development of key components such as narrow-pulse lasers and gated imaging devices, research progressed slowly over the following decades. Since the beginning of this century, as the hardware technology has matured, the technique has developed rapidly in fields such as night vision, underwater imaging, biomedical imaging and three-dimensional imaging, especially range-gated three-dimensional (3-D) laser imaging aimed at acquiring target spatial information. 3-D reconstruction is the process of restoring the visible surface geometry of 3-D objects from two-dimensional (2-D) images. Range-gated laser imaging can achieve gated imaging of a slice of space to form a slice image, and in turn provide the distance information corresponding to that slice image. But to invert this into 3-D spatial information, we need to know the imaging field of view of the system, that is, the focal length of the system. Then, based on the distance information of the space slice, the spatial information of the unit of space corresponding to each pixel can be recovered. Camera calibration is an indispensable step in 3-D reconstruction, covering both the intrinsic and the extrinsic camera parameters. In order to meet the technical requirements of range-gated 3-D imaging, this paper studies the calibration of the zoom lens system. After a comprehensive summary of camera calibration techniques, a classic line-based calibration method is selected. A one-to-one correspondence between field of view and focal length of the system is obtained, which provides effective field-of-view information for matching the imaging field and the illumination field in range-gated 3-D imaging. On the basis of the experimental results, combined with depth-of-field theory, the application of camera calibration in range-gated 3-D imaging technology is further studied.
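
As a generic illustration of the intrinsic-calibration step discussed above, the sketch below runs a standard checkerboard calibration with OpenCV to recover the camera matrix (and hence the focal lengths that define the field of view). The paper itself selects a line-based method for a zoom lens, which is not reproduced here; the file names and pattern size are assumptions.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                      # inner corners of the chequerboard (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib_*.png"):              # hypothetical image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# camera_matrix contains the focal lengths (fx, fy), i.e. the field-of-view
# information needed to invert slice images into 3-D space.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```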

Liu, Xiao-quan; Wang, Xian-wei; Zhou, Yan

2013-09-01

265

SAMI: the SCAO module for the E-ELT adaptive optics imaging camera MICADO  

NASA Astrophysics Data System (ADS)

SAMI, the SCAO module for the E-ELT adaptive optics imaging camera MICADO, could be used in the first years of operation of MICADO on the telescope, until MAORY is operational and coupled to MICADO. We present the results of the study made in the framework of the MICADO phase A to design and estimate the performance of this SCAO module.

Clénet, Y.; Bernardi, P.; Chapron, F.; Gendron, E.; Rousset, G.; Hubert, Z.; Davies, R.; Thiel, M.; Tromp, N.; Genzel, R.

2010-07-01

266

Analytical image reconstruction of cone-beam projections from limited-angle Compton camera data  

Microsoft Academic Search

Since the integration of gamma-rays on a cone is measured with Compton cameras, some sort of image reconstruction method is necessary. Parra (2000) developed an analytical reconstruction algorithm based on a spherical harmonics expansion of projection data that covers the entire scattering-angle range. The measurable scattering angle range is limited due to the electrical noise of the detector and to

Takehiro Tomitani; Masahiko Hirasawa

2003-01-01

267

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide field and planetary camera 2/Hubble Space Telescope (WFPC2/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

268

Periscope-Camera System for Visible and Infrared Imaging Diagnostics on TFTR.  

National Technical Information Service (NTIS)

An optical diagnostic consisting of a periscope which relays images of the torus interior to an array of cameras is used on the Tokamak Fusion Test Reactor (TFTR) to view plasma discharge phenomena and inspect vacuum vessel internal structures in both vis...

S. S. Medley; D. L. Dimock; S. Hayes; D. Long; J. L. Lowrence

1985-01-01

269

Reconstruction of Linearly Parameterized Models from Single Images with a Camera of Unknown Focal Length  

Microsoft Academic Search

This paper deals with the problem of recovering the dimensions of an object and its pose from a single image acquired with a camera of unknown focal length. It is assumed that the object in question can be modeled as a polyhedron where the coordinates of the vertices can be expressed as a linear function of a dimension vector, λ.

David Jelinek; Camillo J. Taylor

2001-01-01

270

Investigation of unsteady regular and Mach reflection in water using an image converter camera  

NASA Astrophysics Data System (ADS)

The oblique collisions of two shock waves in water are investigated by an image converter camera. These two shock waves are produced by the detonation of two plane explosives set at angles to each other. The propagating shock wave is confirmed to be pseudo-stationary by framing photographs taken with the image converter camera. Configurations of the shock wave are also obtained by the usual shadowgraph technique using a laser beam as the light source. The velocity of the head wave is calculated from the shadowgraph. Using the Rankine-Hugoniot relation between the front and rear of the head wave, we can calculate the pressure across the underwater shock wave. Streak photographs of the intersection of the two plane shock waves are also obtained with the image converter camera. When the two shock waves collide, Mach reflection occurs at an early stage and transits to regular reflection. The transition from Mach to regular reflection is fully nonstationary, and therefore there is a large discrepancy between these results and Neumann's two-shock theory. We confirmed that this image converter camera is capable of investigating the nonstationary transition process between Mach and regular reflection in water.

Itoh, S.; Nagano, S.; Suzuki, O.; Morita, I.; Fujita, M.

1995-05-01

271

Experimental investigation of a flat plate heat pipe performance using IR thermal imaging camera  

Microsoft Academic Search

This paper presents results and analysis of an experimental investigation into determining the thermal performance of a flat plate heat pipe using infra red (IR) thermal imaging camera. Steady state and transient temperature distribution of the evaporator surface of the flat plate heat pipe were measured using a single heat source with varied heat flux inputs. For performance comparison, the

R. Boukhanouf; A. Haddad; M. T. North; C. Buffone

2006-01-01

272

Fire service and first responder thermal imaging camera (TIC) advances and standards  

Microsoft Academic Search

Fire Service and First Responder Thermal Imaging Camera (TIC) applications are growing, saving lives and preventing injury and property damage. Firefighters face a wide range of serious hazards. TICs help mitigate the risks by protecting Firefighters and preventing injury, while reducing time spent fighting the fire and resources needed to do so. Most fire safety equipment is covered by performance

Lawrence S. Konsin; Stuart Nixdorff

2007-01-01

273

Wall temperature measurements using a thermal imaging camera with temperature-dependent emissivity corrections  

Microsoft Academic Search

A methodology is presented whereby the relationship between temperature and emissivity for fused quartz has been used to correct the temperature values of a quartz impingement plate detected by an SC3000 thermal imaging camera. The methodology uses an iterative approach in which the initial temperature (obtained by assuming a constant emissivity) is used to find the emissivity values, which are then put into

Chloe McDaid; Yang Zhang

2011-01-01

274

THz imaging and radiometric measurements using a microbolometer-based camera  

Microsoft Academic Search

A THz VOx-based 160×120 microbolometer FPA camera with 52 μm pixel pitch has been developed at INO. Radiometric NEP values have been measured at different wavelengths. Real-time video-rate reflectance imaging using a large beam area from a high-power FIR optically pumped THz laser has also been performed, demonstrating high sensitivity for stand-alone applications.

M. Bolduc; M. Terroux; L. Marchese; B. Tremblay; E. Savard; M. Doucet; H. Oulachgar; C. Alain; H. Jeronimek; A. Bergeron

2011-01-01

275

Noise-equivalent power characterization of an uncooled microbolometer-based THz imaging camera  

Microsoft Academic Search

A THz camera based on an uncooled microbolometer 160×120 pixel array with a nominal pitch of 52 μm has been developed at INO, and initial transmission and reflection images showed promise. In the present paper, the characterization of both standard infrared and THz-optimized uncooled microbolometer pixel arrays is presented at both infrared and THz wavelengths. Measurements in the THz region has

Martin Bolduc; Marc Terroux; Bruno Tremblay; Linda Marchese; Eric Savard; Michel Doucet; Hassane Oulachgar; Christine Alain; Hubert Jerominek; Alain Bergeron

2011-01-01

276

Innovative Camera and Image Processing System to Characterize Cryospheric Changes  

Microsoft Academic Search

The polar regions play an important role in Earth's climatic and geodynamic systems. Digital photogrammetric mapping provides a means for monitoring the dramatic changes observed in the polar regions during the past decades. High-resolution, photogrammetrically processed digital aerial imagery provides complementary information to surface measurements obtained by laser altimetry systems. While laser points accurately sample the ice surface, stereo images

A. Schenk; B. M. Csatho; S. Nagarajan

2010-01-01

277

Single-Quantum Dot Imaging with a Photon Counting Camera  

Microsoft Academic Search

The expanding spectrum of applications of single-molecule fluorescence imaging ranges from fundamental in vitro studies of biomolecular activity to tracking of receptors in live cells. The success of these assays has relied on progress in organic and non-organic fluorescent probe developments as well as improvements in the sensitivity of light detectors. We describe a new type of detector

X. Michalet; R. A. Colyer; J. Antelman; O. H. W. Siegmund; A. Tremsin; J. V. Vallerga; S. Weiss

2009-01-01

278

Accurate quantification of 131I distribution by gamma camera imaging  

Microsoft Academic Search

The development of targeted therapy requires that the concentration of the therapeutic agent can be estimated in target and normal tissues. Single photon emission tomography (SPET), with and without scatter correction, and planar imaging using 131I have been compared to develop a method for the investigation of targeted therapy. Compton scatter was investigated using line spread functions in air and water, these

Alan J. Green; Sarah E. Dewhurst; Richard H. J. Begent; Kenneth D. Bagshawe; Simon J. Riggs

1990-01-01

279

Microchannel Plate Intensified, Subnanosecond, X-ray Imaging Camera.  

National Technical Information Service (NTIS)

We have developed a microchannel plate intensified, subnanosecond x-ray detector for x-ray imaging experiments. It consists of an x-ray photocathode, a microchannel plate, electrostatic focusing optics and a subnanosecond phosphor. The detector is used fo...

M. R. Carter; B. J. McKinley; K. G. Tirsell

1989-01-01

280

Film cameras or digital sensors? The challenge ahead for aerial imaging  

USGS Publications Warehouse

Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at 11 μm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid state charge coupled device linear and area arrays can yield quality resolution (7 to 12 μm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and put into a suitable homogeneous scene that is acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems shows that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system for the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
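
A quick check of the 432-million-pixel figure quoted above, assuming the standard 23 cm x 23 cm aerial film frame (an assumption, since the frame size is not stated in the abstract):

```python
# Rough consistency check of the quoted pixel count.
frame_mm = 230.0       # assumed frame side length in millimetres
spot_um = 11.0         # scanning spot size in micrometres
pixels_per_side = frame_mm * 1000.0 / spot_um   # ~20,909 pixels per side
total_pixels = pixels_per_side ** 2             # ~4.37e8, i.e. ~437 million
print(f"{total_pixels / 1e6:.0f} million pixels")
```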

Light, D. L.

1996-01-01

281

Hologram synthesis of three-dimensional real objects using portable integral imaging camera.  

PubMed

We propose a portable hologram capture system based on integral imaging. An integral imaging camera with an integrated micro lens array captures spatio-angular light ray distribution of the three-dimensional scene under incoherent illumination. The captured light ray distribution is then processed to synthesize corresponding hologram. Experimental results show that the synthesized hologram is optically reconstructed successfully, demonstrating accommodation and motion parallax of the reconstructed three-dimensional scene. PMID:24104278

Lee, Sung-Keun; Hong, Sung-In; Kim, Yong-Soo; Lim, Hong-Gi; Jo, Na-Young; Park, Jae-Hyeung

2013-10-01

282

Uncool Science: Photometry and Astrometry with Modified Web Cameras and Uncooled Imagers  

NASA Astrophysics Data System (ADS)

This paper discusses techniques that allow useful scientific observations to be performed with digital still cameras, modified webcams and low cost uncooled astronomical imagers. The paper emphasizes technique changes relative to the use of traditional cooled CCD sensors that are required to get good results. Areas discussed include dark subtraction, statistical considerations in automatic image stacking, QE, sampling theory of color and interlined sensors.

Hoot, J. E.

2004-05-01

283

Quantitative Evaluation of Surface Color of Tomato Fruits Cultivated in Remote Farm Using Digital Camera Images  

NASA Astrophysics Data System (ADS)

To measure quantitative surface color information of agricultural products together with ambient information during cultivation, a color calibration method for digital camera images and a remote monitoring system of color imaging using the Web were developed. Single-lens reflex and web digital cameras were used for image acquisition. Tomato images through the post-ripening process were taken by the digital camera both in the standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with a standard RGB color chart set up just behind the tomato fruit on a black matte, and a color calibration was carried out. The influence of the sunlight could be experimentally eliminated, and the calibrated color information consistently agreed with the standard values acquired in the system through the post-ripening process. Furthermore, the surface color change of a tomato on the tree in a greenhouse was remotely monitored during maturation using the digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time behavior of the tomato surface color change during the maturing process could be measured using a color parameter calculated from the obtained and calibrated color images, along with the ambient atmospheric record. This study is an important step toward surface color analysis for simple and rapid evaluation of crop vigor in the field and toward constructing an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.
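
The colour-calibration step against the RGB chart can be illustrated by fitting a simple affine correction in a least-squares sense. The sketch below is a generic NumPy illustration under that assumption, not the authors' calibration procedure; the function names are hypothetical.

```python
import numpy as np

def fit_color_correction(measured_rgb, reference_rgb):
    """Fit a 4x3 affine colour-correction matrix mapping camera RGB values
    of the chart patches to their reference RGB values (least squares).

    measured_rgb, reference_rgb : (N, 3) arrays for N chart patches.
    """
    n = measured_rgb.shape[0]
    X = np.hstack([measured_rgb, np.ones((n, 1))])         # add offset term
    M, *_ = np.linalg.lstsq(X, reference_rgb, rcond=None)  # shape (4, 3)
    return M

def apply_color_correction(image_rgb, M):
    """Apply the fitted correction to an (H, W, 3) image."""
    h, w, _ = image_rgb.shape
    flat = image_rgb.reshape(-1, 3)
    flat = np.hstack([flat, np.ones((flat.shape[0], 1))])
    return (flat @ M).reshape(h, w, 3)
```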

Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu

284

PN-CCD detector for the European photon imaging camera on XMM  

Microsoft Academic Search

The pn-charge coupled device (pn-CCD) detector system was developed as the focal plane instrument of an x-ray telescope for the European photon imaging camera (EPIC) on the x-ray multi mirror (XMM) mission. The second cornerstone mission of the European Space Agency's Horizon 2000 plan performs high throughput imaging and spectroscopy of the x-ray sky in the domain of 0.1

Norbert Meidinger; Heinrich W. Braeuninger; R. Hartmann; G. Hartner; N. Krause; G. Metzner; Elmar Pfeffermann; M. Popp; C. Reppin; J. Riedel; D. Stoetter; Lothar Strueder; Joachim Truemper; U. Weber; D. Hauff; Peter Holl; J. Kemmer; S. Krisch; Heike Soltau; Christoph van Zanthier; E. Bihler; H. Boettcher; E. Kendziorra; J. Kraemer; B. Pflueger; R. Staubert

1996-01-01

285

CMOS image sensor with APS structure for high-speed video cameras  

Microsoft Academic Search

In this paper, we discuss the required specifications, APS (Active Pixel Sensor) technology, and possible problems to be overcome with next-generation CMOS image sensors for digital high-speed video photography. A custom-designed image sensor with parallel-readout, high-speed-operation and other functions, is needed to develop an advanced high-speed video camera, realizing higher frame rates and producing pictures of proper resolution. The CMOS

Kenji Tajima; Takashi Komuro; Masatoshi Ishikawa

2001-01-01

286

High image quality sub 100 picosecond gated framing camera development  

SciTech Connect

A major challenge for laser fusion is the study of the symmetry and hydrodynamic stability of imploding fuel capsules. Framed x-radiographs of 10-100 ps duration, excellent image quality, minimum geometrical distortion (< 1%), dynamic range greater than 1000, and more than 200 x 200 pixels are required for this application. Recent progress on a gated proximity focused intensifier which meets these requirements is presented.

Price, R.H.; Wiedwald, J.D.

1983-11-17

287

The trustworthy digital camera: Restoring credibility to the photographic image  

NASA Technical Reports Server (NTRS)

The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continues to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
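
As a minimal sketch of the digital-signature idea described above (not the specific scheme proposed in the paper), the following example signs a SHA-256 hash of the image data with an Ed25519 key using the third-party 'cryptography' package; the file name is hypothetical.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The camera would hold the private key; verifiers hold the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

image_bytes = open("photo.raw", "rb").read()      # hypothetical image file
digest = hashlib.sha256(image_bytes).digest()     # hash of the image data
signature = private_key.sign(digest)              # signed at capture time

# Later: any alteration of the image changes the hash and breaks verification.
try:
    public_key.verify(signature, digest)
    print("image is authentic")
except InvalidSignature:
    print("image has been altered or the signature is forged")
```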

Friedman, Gary L.

1994-01-01

288

Preliminary Experience with Small Animal SPECT Imaging on Clinical Gamma Cameras  

PubMed Central

The traditional lack of techniques suitable for in vivo imaging has induced a great interest in molecular imaging for preclinical research. Nevertheless, its use spreads slowly due to the difficulties in justifying the high cost of the current dedicated preclinical scanners. An alternative for lowering the costs is to repurpose old clinical gamma cameras to be used for preclinical imaging. In this paper we assess the performance of a portable device, that is, working coupled to a single-head clinical gamma camera, and we present our preliminary experience in several small animal applications. Our findings, based on phantom experiments and animal studies, provided an image quality, in terms of contrast-noise trade-off, comparable to dedicated preclinical pinhole-based scanners. We feel that our portable device offers an opportunity for recycling the widespread availability of clinical gamma cameras in nuclear medicine departments to be used in small animal SPECT imaging and we hope that it can contribute to spreading the use of preclinical imaging within institutions on tight budgets.

Aguiar, P.; Silva-Rodriguez, J.; Herranz, M.; Ruibal, A.

2014-01-01

289

Preliminary Experience with Small Animal SPECT Imaging on Clinical Gamma Cameras.  

PubMed

The traditional lack of techniques suitable for in vivo imaging has induced a great interest in molecular imaging for preclinical research. Nevertheless, its use spreads slowly due to the difficulties in justifying the high cost of the current dedicated preclinical scanners. An alternative for lowering the costs is to repurpose old clinical gamma cameras to be used for preclinical imaging. In this paper we assess the performance of a portable device, that is, working coupled to a single-head clinical gamma camera, and we present our preliminary experience in several small animal applications. Our findings, based on phantom experiments and animal studies, provided an image quality, in terms of contrast-noise trade-off, comparable to dedicated preclinical pinhole-based scanners. We feel that our portable device offers an opportunity for recycling the widespread availability of clinical gamma cameras in nuclear medicine departments to be used in small animal SPECT imaging and we hope that it can contribute to spreading the use of preclinical imaging within institutions on tight budgets. PMID:24963478

Aguiar, P; Silva-Rodríguez, J; Herranz, M; Ruibal, A

2014-01-01

290

Development of a gas leak detection method based on infrared spectrum imaging utilizing microbolometer camera  

NASA Astrophysics Data System (ADS)

Development of an early gas leak detection system is essential for the safety of energy storage tank fields and chemical plants. Conventional contact-type gas sensors are not suitable for remote surveillance of gas leakage over a wide area. Infrared cameras have been utilized for gas leak detection; however, they are limited to detecting particular gases. In this study a gas leak identification system, which enables us to detect gas leakage and to identify gas type and density, is developed based on an infrared spectrum imaging system utilizing a low-cost, compact microbolometer infrared camera. The feasibility of the proposed system was demonstrated by experimental results on the identification of hydrofluorocarbon gas.

Sakagami, Takahide; Anzai, Hiroaki; Kubo, Shiro

2011-05-01

291

MONICA: a compact, portable dual gamma camera system for mouse whole-body imaging  

SciTech Connect

Introduction We describe a compact, portable dual-gamma camera system (named "MONICA" for MObile Nuclear Imaging CAmeras) for visualizing and analyzing the whole-body biodistribution of putative diagnostic and therapeutic single photon emitting radiotracers in animals the size of mice. Methods Two identical, miniature pixelated NaI(Tl) gamma cameras were fabricated and installed "looking up" through the tabletop of a compact portable cart. Mice are placed directly on the tabletop for imaging. Camera imaging performance was evaluated with phantoms and field performance was evaluated in a weeklong In-111 imaging study performed in a mouse tumor xenograft model. Results Tc-99m performance measurements, using a photopeak energy window of 140 keV ± 10%, yielded the following results: spatial resolution (FWHM, full width at half maximum, at 1 cm), 2.2 mm; sensitivity, 149 cps (counts per second)/MBq (5.5 cps/μCi); energy resolution (FWHM), 10.8%; count rate linearity (count rate vs. activity), r2=0.99 for 0-185 MBq (0-5 mCi) in the field of view (FOV); spatial uniformity, <3% count rate variation across the FOV. Tumor and whole-body distributions of the In-111 agent were well visualized in all animals in 5-min images acquired throughout the 168-h study period. Conclusion Performance measurements indicate that MONICA is well suited to whole-body single photon mouse imaging. The field study suggests that inter-device communications and user-oriented interfaces included in the MONICA design facilitate use of the system in practice. We believe that MONICA may be particularly useful early in the (cancer) drug development cycle, where basic whole-body biodistribution data can direct future development of the agent under study and where logistical factors, e.g., limited imaging space, portability and, potentially, cost are important.

Choyke, Peter L.; Xia, Wenze; Seidel, Jurgen; Kakareka, John W.; Pohida, Thomas J.; Milenic, Diane E.; Proffitt, James; Majewski, Stan; Weisenberger, Andrew G.; Green, Michael V.

2010-04-01

292

MONICA: A Compact, Portable Dual Gamma Camera System for Mouse Whole-Body Imaging  

PubMed Central

Introduction We describe a compact, portable dual-gamma camera system (named “MONICA” for MObile Nuclear Imaging CAmeras) for visualizing and analyzing the whole-body biodistribution of putative diagnostic and therapeutic single photon emitting radiotracers in animals the size of mice. Methods Two identical, miniature pixelated NaI(Tl) gamma cameras were fabricated and installed “looking up” through the tabletop of a compact portable cart. Mice are placed directly on the tabletop for imaging. Camera imaging performance was evaluated with phantoms and field performance was evaluated in a weeklong In-111 imaging study performed in a mouse tumor xenograft model. Results Tc-99m performance measurements, using a photopeak energy window of 140 keV ± 10%, yielded the following results: spatial resolution (FWHM at 1-cm), 2.2-mm; sensitivity, 149 cps/MBq (5.5 cps/μCi); energy resolution (FWHM), 10.8%; count rate linearity (count rate vs. activity), r2 = 0.99 for 0–185 MBq (0–5 mCi) in the field-of-view (FOV); spatial uniformity, < 3% count rate variation across the FOV. Tumor and whole-body distributions of the In-111 agent were well visualized in all animals in 5-minute images acquired throughout the 168-hour study period. Conclusion Performance measurements indicate that MONICA is well suited to whole-body single photon mouse imaging. The field study suggests that inter-device communications and user-oriented interfaces included in the MONICA design facilitate use of the system in practice. We believe that MONICA may be particularly useful early in the (cancer) drug development cycle where basic whole-body biodistribution data can direct future development of the agent under study and where logistical factors, e.g. limited imaging space, portability, and, potentially, cost are important.

Xi, Wenze; Seidel, Jurgen; Karkareka, John W.; Pohida, Thomas J.; Milenic, Diane E.; Proffitt, James; Majewski, Stan; Weisenberger, Andrew G.; Green, Michael V.; Choyke, Peter L.

2009-01-01

293

Skyline matching based camera orientation from images and mobile mapping point clouds  

NASA Astrophysics Data System (ADS)

Mobile mapping is widely used for collecting large amounts of geo-referenced data. Sensor fusion plays an important role, in order to evaluate multiple sensors such as laser scanners and cameras jointly. This requires determining the relative orientation between the sensors. Based on data from a RIEGL VMX-250 mobile mapping system equipped with two laser scanners, four optional cameras, and a highly precise GNSS/IMU system, we propose an approach to improve the camera orientations. A manually determined orientation is used as an initial approximation for matching a large number of points in the optical images and the corresponding projected scan images. The search space of the point correspondences is reduced to skylines found in both the optical and the scan image. The skyline determination is based on alpha shapes, and the actual matching is done via an adapted ICP algorithm. The approximate values of the relative orientation are used as starting values for an iterative resection process. Outliers are removed at several stages of the process. Our approach is fully automatic and improves the camera orientation significantly.
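
The adapted ICP used for the skyline matching is not detailed in the abstract; the sketch below shows a minimal, generic 2-D point-to-point ICP (nearest neighbours via a k-d tree, rigid transform via SVD) as a hedged illustration of the matching step, not the authors' adapted algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(src, dst, n_iter=50):
    """Minimal 2-D point-to-point ICP: aligns skyline points 'src' to 'dst'.
    Returns a 2x2 rotation R and a translation t such that R @ src.T + t ~ dst."""
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    tree = cKDTree(dst)
    for _ in range(n_iter):
        # 1. Nearest-neighbour correspondences
        _, idx = tree.query(cur)
        matched = dst[idx]
        # 2. Best rigid transform for these correspondences (Kabsch / SVD)
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:          # guard against reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_d - R_step @ mu_s
        # 3. Apply the incremental transform and accumulate the total one
        cur = (R_step @ cur.T).T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```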

Hofmann, S.; Eggert, D.; Brenner, C.

2014-05-01

294

Gamma camera-mounted anatomical X-ray tomography: technology, system characteristics and first images.  

PubMed

Scintigraphic diagnosis, based on functional image interpretation, becomes more accurate and meaningful when supported by corresponding anatomical data. In order to produce anatomical images that are inherently registered with images of emission computerised tomography acquired with a gamma camera, an X-ray transmission system was mounted on the slip-ring gantry of a GEMS Millennium VG gamma camera. The X-ray imaging system is composed of an X-ray tube and a set of detectors located on opposite sides of the gantry rotor that moves around the patient along with the nuclear detectors. A cross-sectional anatomical transmission map is acquired as the system rotates around the patient in a manner similar to a third-generation computerised tomography (CT) system. Following transmission, single-photon emission tomography (SPET) or positron emission tomography (PET) coincidence detection images are acquired and the resultant emission images are thus inherently registered to the anatomical maps. The same anatomical maps are also used to generate attenuation maps for attenuation correction of the emission images. Phantom experiments assessing system performance and examples of first SPET and coincidence detection patient images are presented. Despite limitations of the system when compared with a state-of-the-art CT scanner, the transmission anatomical maps allow for precise anatomical localisation and for attenuation correction of the emission images. PMID:10901447

Bocher, M; Balan, A; Krausz, Y; Shrem, Y; Lonn, A; Wilk, M; Chisin, R

2000-06-01

295

A test-based comparison between technologies implemented in commercial cameras for high contrast imaging applications  

NASA Astrophysics Data System (ADS)

LAM is developing high-contrast imaging testbeds for in-lab demonstration of new instrumental concepts requiring high-contrast imaging, in particular for solar and stellar coronagraphy applications. In such applications, a faint target has to be detected close to a very bright source. For these test benches, a high-dynamic-range detector is required to characterize and/or determine the performance of a new concept. Beyond the capability to detect the target, an imaging detector has to be accurate, reliable and provide reproducible performance. In order to identify a commercial camera for the development of laboratory demonstrators working with high-contrast scenes, we carried out a test campaign at the Laboratoire d’Astrophysique de Marseille (LAM) evaluating several cameras implementing different detector technologies. This paper presents the results of that test campaign, providing a quantitative comparison between the investigated technologies.

Pancrazzi, M.; Landini, F.; Vives, S.; Escolle, C.; N'Diaye, M.; Focardi, M.; Guillon, C.

2012-07-01

296

Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras  

NASA Technical Reports Server (NTRS)

Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.

McDowell, Mark; Gray, Elizabeth

2004-01-01

297

Camera-Based Lock-in and Heterodyne Carrierographic Photoluminescence Imaging of Crystalline Silicon Wafers  

NASA Astrophysics Data System (ADS)

Carrierographic (spectrally gated photoluminescence) imaging of a crystalline silicon wafer using an InGaAs camera and two spread super-bandgap illumination laser beams is introduced in both low-frequency lock-in and high-frequency heterodyne modes. Lock-in carrierographic images of the wafer up to 400 Hz modulation frequency are presented. To overcome the frame rate and exposure time limitations of the camera, a heterodyne method is employed for high-frequency carrierographic imaging which results in high-resolution near-subsurface information. The feasibility of the method is guaranteed by the typical superlinearity behavior of photoluminescence, which allows one to construct a slow enough beat frequency component from nonlinear mixing of two high frequencies. Intensity-scan measurements were carried out with a conventional single-element InGaAs detector photocarrier radiometry system, and the nonlinearity exponent of the wafer was found to be around 1.7. Heterodyne images of the wafer up to 4 kHz have been obtained and qualitatively analyzed. With the help of the complementary lock-in and heterodyne modes, camera-based carrierographic imaging in a wide frequency range has been realized for fundamental research and industrial applications toward in-line nondestructive testing of semiconductor materials and devices.

Sun, Q. M.; Melnikov, A.; Mandelis, A.

2014-05-01

298

Image restoration of hybrid time delay and integration camera system with residual motion  

NASA Astrophysics Data System (ADS)

In this paper, we present a hybrid system for time delay and integration (TDI) image restoration. The images are degraded by residual motion, i.e., motion other than the intended along-track scanning motion, which distorts and blurs the TDI images during exposure. The motion trajectory is estimated from the image sequence captured by an auxiliary high-speed camera. In order to make the estimate less sensitive to the imaging conditions and noise, a new method based on cross-correlation is introduced for motion estimation. The geometric distortion of the TDI image is then removed by choosing the correct blurred block according to the center of the corresponding motion trajectory, and the final image is restored row by row with the Richardson-Lucy algorithm. Simulated and experimental results are given to demonstrate the effectiveness of our system.
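
The row-by-row Richardson-Lucy step mentioned in this record can be sketched as follows, assuming one 1D blur kernel per row derived from an estimated motion trajectory (a hypothetical input here); scikit-image's richardson_lucy is used in place of the authors' implementation.

```python
import numpy as np
from skimage.restoration import richardson_lucy

def restore_tdi_rows(blurred, row_kernels, iterations=30):
    """Deblur a TDI image row by row.

    blurred     : 2D float array with values in [0, 1]
    row_kernels : list of 1D blur kernels, one per row, e.g. derived from
                  the motion trajectory estimated with the auxiliary
                  high-speed camera (hypothetical input in this sketch)
    """
    restored = np.empty_like(blurred)
    for r, kernel in enumerate(row_kernels):
        psf = kernel[np.newaxis, :] / kernel.sum()   # 1 x K row PSF
        row = blurred[r:r + 1, :]                    # keep the 2D shape
        # third positional argument is the iteration count
        restored[r:r + 1, :] = richardson_lucy(row, psf, iterations)
    return restored
```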

Zheng, Zhenzhen; Wu, Jiagu; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

2011-06-01

299

Environmental applications of camera images calibrated by means of the Levenberg-Marquardt method  

NASA Astrophysics Data System (ADS)

Although different authors have presented procedures for camera calibration in environmental video monitoring, improvements in the robustness and accuracy of the calibration are always desired. In this work the Levenberg-Marquardt method is included in the camera calibration process for environmental video monitoring images as a way to improve the robustness of the calibration when only a low number of control points is available and no laboratory measurements are used. The pinhole model and the Levenberg-Marquardt method are briefly described, and a four-step camera calibration procedure using them is presented. This procedure allows users to estimate all the pinhole model parameters, including the lens distortion parameters, from ground control points; its results with laboratory data are compared with those presented by other authors. The procedure is also tested with field data obtained with cameras directed toward the beaches of the city of Cartagena, Colombia. The results show that the procedure is robust enough to be used when only a low number of control points is available, although a large number of ground control points (GCPs) is recommended to obtain high accuracy.
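
As a rough illustration of how the Levenberg-Marquardt method enters a pinhole calibration, the sketch below fits a reduced pinhole model (focal lengths, principal point, pose; no lens distortion) to ground control points with SciPy's LM solver. The parameterization and function names are assumptions, not the paper's four-step procedure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(params, world_pts, image_pts):
    """Residuals of a simple pinhole model (no distortion).

    params = [fx, fy, cx, cy, rx, ry, rz, tx, ty, tz]
    """
    fx, fy, cx, cy = params[:4]
    R = Rotation.from_rotvec(params[4:7]).as_matrix()
    t = params[7:10]
    cam = world_pts @ R.T + t                    # world -> camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx          # perspective projection
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.concatenate([u - image_pts[:, 0], v - image_pts[:, 1]])

def calibrate(world_pts, image_pts, x0):
    # method='lm' selects Levenberg-Marquardt; needs >= 5 control points
    # so that the residual count (2N) is at least the parameter count (10).
    return least_squares(reprojection_residuals, x0, method='lm',
                         args=(world_pts, image_pts))
```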

Pérez Muñoz, J. C.; Ortiz Alarcón, C. A.; Osorio, A. F.; Mejía, C. E.; Medina, R.

2013-02-01

300

Automatic Generation of Passer-by Record Images using Internet Camera  

NASA Astrophysics Data System (ADS)

Recently, many brutal crimes have shocked us, while the number of solved crimes has declined. Therefore, the importance of security and self-defense has increased more and more. As an example of self-defense, many surveillance cameras are set up in buildings, homes and offices. But even when we want to detect a suspicious person, we cannot check the surveillance videos immediately, because a huge number of image sequences is stored in each video system. In this paper, we propose an automatic method for generating passer-by record images using an internet camera. In the first step, passers-by are recognized using an image sequence obtained from the internet camera. Our method classifies the subject region into individual persons by using the space-time image. In addition, we obtain the time, direction and number of passers-by from this space-time image. Next, the method detects five characteristics: the center of gravity, the position of the person's head, the brightness, the size, and the shape of the person. Finally, an image of each person is selected from the image sequence by integrating the five characteristics and is added to the passer-by record image. Some experimental results using a simple experimental system are also reported, which indicate the effectiveness of the proposed method. In most scenes, every person was detected by the proposed method and the passer-by record image was generated.
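
A minimal sketch of the space-time image used for passer-by detection, assuming a stationary internet camera and OpenCV for frame access; the person classification and the five characteristics are not reproduced, and the names are illustrative.

```python
import cv2
import numpy as np

def build_space_time_image(video_path, scan_col=None):
    """Stack one pixel column from every frame into a space-time image.

    The result has time on the horizontal axis and the scan line's
    vertical axis on the vertical axis; people crossing the scan line
    appear as connected blobs whose position, slope and extent indicate
    the time, direction and count of passers-by.
    """
    cap = cv2.VideoCapture(video_path)
    columns = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        col = gray.shape[1] // 2 if scan_col is None else scan_col
        columns.append(gray[:, col])
    cap.release()
    return np.stack(columns, axis=1) if columns else None
```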

Terada, Kenji; Atsuta, Koji

301

Optimal Design of Anger Camera for Bremsstrahlung Imaging: Monte Carlo Evaluation  

PubMed Central

A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent <15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy. Monte Carlo (MC) simulations of energy spectra showed that a camera based on a 30-mm-thick BGO crystal and equipped with a high energy pinhole collimator is well-adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor of 10 versus a conventional NaI camera equipped with a high energy parallel hole collimator, enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, a shorter acquisition time and reduced orbit range will be usable, allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast single photon emission computed tomography could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected. Further long-running MC simulations of realistic acquisitions will allow assessing the quantification capability of such a system. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT–BGO blocks coming from a retired PET system is currently under design for further evaluation.

Walrand, Stephan; Hesse, Michel; Wojcik, Randy; Lhommel, Renaud; Jamar, Francois

2014-01-01

302

Imaging of blood vessels with CCD-camera based three-dimensional photoacoustic tomography  

NASA Astrophysics Data System (ADS)

An optical phase contrast full field detection setup in combination with a CCD camera is presented to record acoustic fields for real-time projection and fast three-dimensional imaging. When recording projection images of the wave pattern around the imaging object, the three-dimensional photoacoustic imaging problem is reduced to a set of two-dimensional reconstructions and the measurement setup requires only a single axis of rotation. Using a 10 Hz pulsed laser system for photoacoustic excitation, a three-dimensional image can be obtained in less than 1 min. The sensitivity and resolution of the detection system were estimated experimentally to be 5 kPa mm and 75 µm, respectively. Experiments on biological samples show the applicability of this technique for the imaging of blood vessel distributions.

Nuster, Robert; Slezak, Paul; Paltauf, Guenther

2014-03-01

303

High-speed multispectral three-dimensional imaging with a compound-eye camera TOMBO  

NASA Astrophysics Data System (ADS)

We propose a high-speed multispectral three-dimensional imaging system based on a compact and thin compound-eye camera called TOMBO. Wavelengths and times are assigned to the lenses in TOMBO. The time delays are introduced by the rolling shutter of CMOS image sensors, and wavelength decomposition is realized by attaching several kinds of wavelength filters to the lenses. A depth map is reproduced based on disparities in the unit images captured at the same timing. In reproducing the depth map, the sum of sum of absolute differences (SSAD) is evaluated after average equalization to compare images for different wavelengths. A prototype of TOMBO is built with an SXGA monochrome CMOS image sensor with a rolling shutter, optical crosstalk barriers, a commercial 5x5-element microlens array, and commercial gelatin color filters. Enhancement of the frame rate and reproduction of a depth map and a 5-band deep-focus image are demonstrated.

Kagawa, Keiichiro; Fukata, Naoki; Tanida, Jun

2010-08-01

304

An earth imaging camera simulation using wide-scale construction of reflectance surfaces  

NASA Astrophysics Data System (ADS)

Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.

Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

2013-10-01

305

Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.  

PubMed

This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired at different spectral bands in the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain the reconstruction planes of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained using a minimization of the variance of the contribution of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing and pattern recognition, among others. PMID:23187411
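
A minimal sketch of the computational reconstruction and variance steps described here, under a simplified model in which elemental images on a regular synthetic-aperture grid are shifted by an amount proportional to the camera pitch and inverse depth; the shift model and all names are assumptions, not the authors' exact formulation.

```python
import numpy as np

def reconstruct_depth_plane(elemental, pitch_px, depth, focal_px):
    """Shift-and-average elemental images at one depth plane.

    elemental : dict mapping grid index (i, j) -> 2D image (all same size)
    pitch_px  : camera spacing expressed in pixels
    depth     : reconstruction depth (same units as the baseline model)
    focal_px  : focal length in pixels (hypothetical simple model)
    """
    shift = pitch_px * focal_px / depth          # disparity per grid step
    acc, contribs = None, []
    for (i, j), img in elemental.items():
        shifted = np.roll(img, (int(round(i * shift)),
                                int(round(j * shift))), axis=(0, 1))
        contribs.append(shifted)
        acc = shifted if acc is None else acc + shifted
    plane = acc / len(contribs)
    # low per-pixel variance across contributions indicates an in-focus depth
    variance = np.var(np.stack(contribs), axis=0)
    return plane, variance
```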

Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

2012-11-01

306

Technique for improving the quality of images from digital cameras using ink-jet printers and smoothed RGB transfer curves  

NASA Astrophysics Data System (ADS)

The digital camera market is growing at an explosive rate. At the same time, the quality of photographs printed on ink-jet printers continues to improve. Most consumer cameras are designed with the monitor, not the printer, as the target output device. When users print images from a camera, they need to optimize the camera and printer combination in order to maximize image quality. We describe the details of one such method for improving image quality using an AGFA digital camera and ink-jet printer combination. Using Adobe PhotoShop, we generated optimum red, green and blue transfer curves that match the scene content to the printer's output capabilities. Applying these curves to the original digital image resulted in a print with more shadow detail, no loss of highlight detail, a smoother tone scale, and more saturated colors, producing a visually more pleasing image than one captured and printed without any 'correction'. While we report the results for one camera-printer combination, we tested this technique on numerous digital camera and printer combinations and in each case produced a better-looking image. We also discuss the problems we encountered in implementing this technique.

Sampat, Nitin; Grim, John F.; O'Hara, James E.

1998-04-01

307

Ultrafast two-dimensional x-ray imaging with x-ray streak cameras for laser fusion research (invited)  

Microsoft Academic Search

Ultrafast two-dimensional x-ray imaging is required for diagnosing laser-driven inertial confinement fusion plasmas. Image sampling technique with x-ray streak cameras can meet this requirement. Multi-imaging x-ray streak camera method (MIXS) with temporal and spatial resolutions of 10 ps and 15 µm, respectively, has been developed and successfully utilized for diagnosing the uniformity and heating process of the imploded core plasmas.

H. Shiraga; N. Miyanaga; M. Heya; M. Nakasuji; Y. Aoki; H. Azechi; T. Yamanaka; K. Mima

1997-01-01

308

Real time speed estimation of moving vehicles from side view images from an uncalibrated video camera.  

PubMed

In order to estimate the speed of a moving vehicle from side-view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle is selected, and these points must be accurately tracked over at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system and the displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find their absolute values. This transformation requires image-to-object-space information, which is obtained from the calibration and orientation parameters of the video frame images. This paper presents proposed solutions for the problems involved in using side-view camera images in this way. PMID:22399909
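
The two-step computation described here (pixel displacement, then image-to-object scaling) can be sketched as follows; the metres-per-pixel scale, which in the paper comes from the calibration and orientation parameters, is treated as a given constant for illustration, and all names are hypothetical.

```python
import numpy as np

def estimate_speed(pts_frame1, pts_frame2, frame_dt, metres_per_pixel):
    """Average speed of tracked reference points between two frames.

    pts_frame1, pts_frame2 : (N, 2) pixel coordinates of the same
                             reference points in consecutive frames
    frame_dt               : time between the frames in seconds
    metres_per_pixel       : image-to-object scale; here a hypothetical
                             constant standing in for the calibration
                             and orientation parameters
    """
    disp_px = np.linalg.norm(pts_frame2 - pts_frame1, axis=1)   # pixels
    speeds = disp_px * metres_per_pixel / frame_dt              # m/s
    return speeds.mean() * 3.6                                  # km/h
```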

Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sitki

2010-01-01

309

Digital camera workflow for high dynamic range images using a model of retinal processing  

NASA Astrophysics Data System (ADS)

We propose a complete digital camera workflow to capture and render high dynamic range (HDR) static scenes, from RAW sensor data to an output-referred encoded image. In traditional digital camera processing, demosaicing is one of the first operations done after scene analysis. It is followed by rendering operations, such as color correction and tone mapping. In our workflow, which is based on a model of retinal processing, most of the rendering steps are performed before demosaicing. This reduces the complexity of the computation, as only one third of the pixels are processed. This is especially important as our tone mapping operator applies local and global tone corrections, which is usually needed to well render high dynamic scenes. Our algorithms efficiently process HDR images with different keys and different content.

Tamburrino, Daniel; Alleysson, David; Meylan, Laurence; Süsstrunk, Sabine

2008-03-01

310

Portable retinal imaging for eye disease screening using a consumer-grade digital camera  

NASA Astrophysics Data System (ADS)

The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held, retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50mm focal lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.

Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter

2012-02-01

311

Analytical image reconstruction of cone-beam projections from limited-angle Compton camera data  

Microsoft Academic Search

Since the integration of gamma-rays on a cone is measured with Compton cameras, some sort of image reconstruction method is necessary. Parra (2000) developed an analytical reconstruction algorithm based on a spherical-harmonics expansion of projection data that covers the entire scattering-angle range. The scattering-angle range is limited due to the electrical noise of the detector and due to the finite

Takehiro Tomitani; Masahiko Hirasawa

2002-01-01

312

Performance analysis of opto-mechatronic image stabilization for a compact space camera  

Microsoft Academic Search

The paper presents new performance results for the enhanced concept of an opto-mechatronic camera stabilization assembly consisting of a high-speed onboard optical processor for real-time image motion measurement and a 2-axis piezo-drive assembly for high precision positioning of the focal plane assembly. The proposed visual servoing concept allows minimizing the size of the optics and the sensitivity to attitude disturbances.

K. Janschek; V. Tchernykh; S. Dyblenko

2007-01-01

313

Fluorine18Fluorodeoxyglucose Dual-Head Gamma Camera Coincidence Imaging of Recurrent Colorectal Carcinoma  

Microsoft Academic Search

We report our experience with coincidence detection imaging of 18F-fluorodeoxyglucose (FDG) using a dual-head gamma camera. Scanning of the pelvis and abdomen of a patient with recurrent colorectal carcinoma showed recurrent disease in the pelvic floor and the base of the urinary bladder and metastatic disease in the retroperitoneal space of the pelvis. Although the tumor involving the bladder and

Hussein M. Abdel-Dayem; Arthur I. Radin; Jian-Qiao Luo; Hillel Y. Marans; Steven Wong; Sleiman Y. Naddaf

314

Update and image quality error budget for the LSST camera optical design  

Microsoft Academic Search

The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a refractive camera design with 3 lenses (0.69-1.55m) and a set of broadband filters/corrector lenses. Performance is excellent over a 9.6 square degree field and ultraviolet to near infrared wavelengths. We describe the image

Brian J. Bauman; Gordon Bowden; John Ku; Martin Nordby; Scot Olivier; Vincent Riot; Andrew Rasmussen; Lynn Seppala; Hong Xiao; Nadine Nurita; David Gilmore; Steven Kahn

2010-01-01

315

Energy-Aware Video Encoding for Image Quality Improvement in Battery-Operated Surveillance Camera  

Microsoft Academic Search

Growing needs for surveillance in locations without power lines necessitate the development of a surveillance camera with extremely low-power consumption and an assured stable operation until the time of expected run-out of available energy. This paper proposes an algorithm for scheduling of video encoding configurations in a battery-operated surveillance system to reduce the image distortion while assuring the sustained operation

Younghoon Lee; Jungsoo Kim; Chong-Min Kyung

2012-01-01

316

Volcano geodesy at Santiaguito using ground-based cameras and particle image velocimetry  

NASA Astrophysics Data System (ADS)

The active Santiaguito dome in Guatemala is an exceptional field site for ground-based optical observations owing to the bird's-eye viewing perspective from neighboring Santa Maria Volcano. From the summit of Santa Maria the frequent (1 per hour) explosions and continuous lava flow effusion may be observed from a vantage point, which is at a ~30 degree elevation angle, 1200 m above and 2700 m distant from the active vent. At these distances both video cameras and SLR cameras fitted with high-power lenses can effectively track blocky features translating and uplifting on the surface of Santiaguito's dome. We employ particle image velocimetry in the spatial frequency domain to map movements of ~10x10 m^2 surface patches with better than 10 cm displacement resolution. During three field campaigns to Santiaguito in 2007, 2009, and 2012 we have used cameras to measure dome surface movements for a range of time scales. In 2007 and 2009 we used video cameras recording at 30 fps to track repeated rapid dome uplift (more than 1 m within 2 s) of the 30,000 m^2 dome associated with the onset of eruptive activity. We inferred that these uplift events were responsible for both a seismic long period response and an infrasound bimodal pulse. In 2012 we returned to Santiaguito to quantify dome surface movements over hour-to-day-long time scales by recording time lapse imagery at one minute intervals. These longer time scales reveal dynamic structure to the uplift and subsidence trends, effusion rate, and surface flow patterns that are related to internal conduit pressurization. In 2012 we performed particle image velocimetry with multiple cameras spatially separated in order to reconstruct 3-dimensional surface movements.
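
The frequency-domain cross-correlation underlying the particle image velocimetry described here can be sketched as below for a single grayscale surface patch tracked between two frames; sub-pixel peak fitting and the conversion to metric displacement are omitted, and the names are illustrative.

```python
import numpy as np

def patch_displacement(patch_a, patch_b):
    """Integer-pixel shift between two patches via cross-correlation
    computed in the spatial frequency domain (FFT)."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    # cross-power spectrum; its inverse FFT is the cross-correlation map
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the patch back to negative values
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return shift  # (row, col) shift that moves patch_b onto patch_a
```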

Johnson, J.; Andrews, B. J.; Anderson, J.; Lyons, J. J.; Lees, J. M.

2012-12-01

317

X-ray and gamma-ray imaging with multiple-pinhole cameras using a posteriori image synthesis.  

NASA Technical Reports Server (NTRS)

In 1968, Dicke had suggested that multiple-pinhole camera systems would have significant advantages concerning the SNR in X-ray and gamma-ray astronomy if the multiple images could be somehow synthesized into a single image. The practical development of an image-synthesis method based on these suggestions is discussed. A formulation of the SNR gain theory which is particularly suited for dealing with the proposal by Dicke is considered. It is found that the SNR gain is by no means uniform in all X-ray astronomy applications.

Groh, G.; Hayat, G. S.; Stroke, G. W.

1972-01-01

318

Virtual wiper - removal of adherent noises from images of dynamic scenes by using a pan-tilt camera  

Microsoft Academic Search

In this paper, we propose a new method that can remove view-disturbing noises from images of dynamic scenes. One of the thorny problems in outdoor surveillance by a camera is that adherent noises such as waterdrops or mud blobs on the protecting glass or lens surface disturb the view from the camera. Therefore, we propose a method for removing adherent noises

Atsushi Yamashita; Tomoaki Harada; Toru Kaneko; Kenjiro T. Miura

2005-01-01

319

COMPACT CdZnTe-BASED GAMMA CAMERA FOR PROSTATE CANCER IMAGING  

SciTech Connect

In this paper, we discuss the design of a compact gamma camera for high-resolution prostate cancer imaging using Cadmium Zinc Telluride (CdZnTe or CZT) radiation detectors. Prostate cancer is a common disease in men. Nowadays, a blood test measuring the level of prostate specific antigen (PSA) is widely used for screening for the disease in males over 50, followed by (ultrasound) imaging-guided biopsy. However, PSA tests have a high false-positive rate and ultrasound-guided biopsy has a high likelihood of missing small cancerous tissues. Commercial methods of nuclear medical imaging, e.g. PET and SPECT, can functionally image the organs, and potentially find cancer tissues at early stages, but their applications in diagnosing prostate cancer have been limited by the smallness of the prostate gland and the long working distance between the organ and the detectors comprising these imaging systems. CZT is a semiconductor material with wide band-gap and relatively high electron mobility, and thus can operate at room temperature without additional cooling. CZT detectors are photon-electron direct-conversion devices, thus offering high energy-resolution in detecting gamma rays, enabling energy-resolved imaging, and reducing the background of Compton-scattering events. In addition, CZT material has high stopping power for gamma rays; for medical imaging, a few-mm-thick CZT material provides adequate detection efficiency for many SPECT radiotracers. Because of these advantages, CZT detectors are becoming popular for several SPECT medical-imaging applications. Most recently, we designed a compact gamma camera using CZT detectors coupled to an application-specific-integrated-circuit (ASIC). This camera functions as a trans-rectal probe to image the prostate gland from a distance of only 1-5 cm, thus offering higher detection efficiency and higher spatial resolution. Hence, it potentially can detect prostate cancers at their early stages. The performance tests of this camera have been completed. The results show better than 6-mm resolution at a distance of 1 cm. Details of the test results are discussed in this paper.

CUI, Y.; LALL, T.; TSUI, B.; YU, J.; MAHLER, G.; BOLOTNIKOV, A.; VASKA, P.; DeGERONIMO, G.; O'CONNOR, P.; MEINKEN, G.; JOYAL, J.; BARRETT, J.; CAMARDA, G.; HOSSAIN, A.; KIM, K.H.; YANG, G.; POMPER, M.; CHO, S.; WEISMAN, K.; SEO, Y.; BABICH, J.; LaFRANCE, N.; AND JAMES, R.B.

2011-10-23

320

Compact CdZnTe-based gamma camera for prostate cancer imaging  

NASA Astrophysics Data System (ADS)

In this paper, we discuss the design of a compact gamma camera for high-resolution prostate cancer imaging using Cadmium Zinc Telluride (CdZnTe or CZT) radiation detectors. Prostate cancer is a common disease in men. Nowadays, a blood test measuring the level of prostate specific antigen (PSA) is widely used for screening for the disease in males over 50, followed by (ultrasound) imaging-guided biopsy. However, PSA tests have a high false-positive rate and ultrasound-guided biopsy has a high likelihood of missing small cancerous tissues. Commercial methods of nuclear medical imaging, e.g. PET and SPECT, can functionally image the organs, and potentially find cancer tissues at early stages, but their applications in diagnosing prostate cancer have been limited by the smallness of the prostate gland and the long working distance between the organ and the detectors comprising these imaging systems. CZT is a semiconductor material with wide band-gap and relatively high electron mobility, and thus can operate at room temperature without additional cooling. CZT detectors are photon-electron direct-conversion devices, thus offering high energy-resolution in detecting gamma rays, enabling energy-resolved imaging, and reducing the background of Compton-scattering events. In addition, CZT material has high stopping power for gamma rays; for medical imaging, a few-mm-thick CZT material provides adequate detection efficiency for many SPECT radiotracers. Because of these advantages, CZT detectors are becoming popular for several SPECT medical-imaging applications. Most recently, we designed a compact gamma camera using CZT detectors coupled to an application-specific-integrated-circuit (ASIC). This camera functions as a trans-rectal probe to image the prostate gland from a distance of only 1-5 cm, thus offering higher detection efficiency and higher spatial resolution. Hence, it potentially can detect prostate cancers at their early stages. The performance tests of this camera have been completed. The results show better than 6-mm resolution at a distance of 1 cm. Details of the test results are discussed in this paper.

Cui, Yonggang; Lall, Terry; Tsui, Benjamin; Yu, Jianhua; Mahler, George; Bolotnikov, Aleksey; Vaska, Paul; de Geronimo, Gianluigi; O'Connor, Paul; Meinken, George; Joyal, John; Barrett, John; Camarda, Giuseppe; Hossain, Anwar; Kim, Ki Hyun; Yang, Ge; Pomper, Marty; Cho, Steve; Weisman, Ken; Seo, Youngho; Babich, John; Lafrance, Norman; James, Ralph B.

2011-06-01

321

Contrast reduction in digital images due to x-ray induced damage to a TV camera's CCD image receptor.  

PubMed

The CCD image receptor in a monochrome TV camera was irradiated in the dark with a single-phase 80 kVp narrow beam carrying an additional 0.5 mm Al filter. It was found that the CCD responded to the x-rays with a transient signal which would reach a maximum value at about 70 mR per image frame. The signal's contrast would reach a maximum of about 7%, but was detectable at much lower exposure rates. Continuous irradiation of the CCD image receptor created a 'lesion' at the point of incidence that remained after the x-ray beam was switched off. This signal seemed to be due to permanent damage to the CCD and could be detected after a cumulative exposure of less than 20 R. It was shown that such damage could be created much more effectively if the TV camera was switched off rather than on, with the maximum contrast achieved at about 75 R and 220 R, respectively. The maximum contrast achieved was about 8%. Further increase in the cumulative exposure of a particular location on the CCD image receptor was not investigated, but it seemed reasonable that much higher contrast values could be reached if irradiation was continued. The latter damage described here was found to persist for at least several weeks and hence it will probably prevent any normal use of the TV camera in the future. PMID:10232819

Okkalides, D

1999-04-01

322

Advances In The Image Sensor: The Critical Element In The Performance Of Cameras  

NASA Astrophysics Data System (ADS)

Digital imaging technology and digital imaging products are advancing at a rapid pace. The progress of digital cameras has been particularly impressive. Image sensors now have smaller pixel size, a greater number of pixels, higher sensitivity, lower noise and a higher frame rate. Picture resolution is a function of the number of pixels of the image sensor. The more pixels there are, the smaller each pixel becomes, but the sensitivity and the charge-handling capability of each pixel can be maintained or even increased by raising the quantum efficiency and the saturation capacity per unit area. Sony's many technologies can be successfully applied to CMOS image sensor manufacturing toward pixel pitches below 2.0 µm and beyond.

Narabu, Tadakuni

2011-01-01

323

Intraoperative Imaging Guidance for Sentinel Node Biopsy in Melanoma Using a Mobile Gamma Camera  

SciTech Connect

The objective is to evaluate the sensitivity and clinical utility of intraoperative mobile gamma camera (MGC) imaging in sentinel lymph node biopsy (SLNB) in melanoma. The false-negative rate for SLNB for melanoma is approximately 17%, for which failure to identify the sentinel lymph node (SLN) is a major cause. Intraoperative imaging may aid in detection of SLN near the primary site, in ambiguous locations, and after excision of each SLN. The present pilot study reports outcomes with a prototype MGC designed for rapid intraoperative image acquisition. We hypothesized that intraoperative use of the MGC would be feasible and that sensitivity would be at least 90%. From April to September 2008, 20 patients underwent Tc-99m sulfur colloid lymphoscintigraphy, and SLNB was performed with use of a conventional fixed gamma camera (FGC) and gamma probe, followed by intraoperative MGC imaging. Sensitivity was calculated for each detection method. Intraoperative logistical challenges were scored. Cases in which the MGC provided clinical benefit were recorded. Sensitivity for detecting SLN basins was 97% for the FGC and 90% for the MGC. A total of 46 SLN were identified: 32 (70%) were identified as distinct hot spots by preoperative FGC imaging, 31 (67%) by preoperative MGC imaging, and 43 (93%) by MGC imaging pre- or intraoperatively. The gamma probe identified 44 (96%) independent of MGC imaging. The MGC provided defined clinical benefit as an addition to standard practice in 5 (25%) of 20 patients. Mean score for MGC logistic feasibility was 2 on a scale of 1-9 (1 = best). Intraoperative MGC imaging provides additional information when standard techniques fail or are ambiguous. Sensitivity is 90% and can be increased. This pilot study has identified ways to improve the usefulness of an MGC for intraoperative imaging, which holds promise for reducing false negatives of SLNB for melanoma.

Dengel, Lynn T; Judy, Patricia G; Petroni, Gina R; Smolkin, Mark E; Rehm, Patrice K; Majewski, Stan; Williams, Mark B

2011-04-01

324

Development of a high-speed CT imaging system using EMCCD camera  

NASA Astrophysics Data System (ADS)

The limitations of current CCD-based microCT X-ray imaging systems arise from two important factors. First, readout speeds are curtailed in order to minimize system read noise, which increases significantly with increasing readout rates. Second, the afterglow associated with commercial scintillator films can introduce image lag, leading to substantial artifacts in reconstructed images, especially when the detector is operated at several hundred frames/second (fps). For high speed imaging systems, high-speed readout electronics and fast scintillator films are required. This paper presents an approach to developing a high-speed CT detector based on a novel, back-thinned electron-multiplying CCD (EMCCD) coupled to various bright, high resolution, low afterglow films. The EMCCD camera, when operated in its binned mode, is capable of acquiring data at up to 300 fps with reduced imaging area. CsI:Tl,Eu and ZnSe:Te films, recently fabricated at RMD, apart from being bright, showed very good afterglow properties, favorable for high-speed imaging. Since ZnSe:Te films were brighter than CsI:Tl,Eu films, for preliminary experiments a ZnSe:Te film was coupled to an EMCCD camera at UC Davis Medical Center. A high-throughput tungsten anode X-ray generator was used, as the X-ray fluence from a mini- or micro-focus source would be insufficient to achieve high-speed imaging. A euthanized mouse held in a glass tube was rotated 360 degrees in less than 3 seconds, while radiographic images were recorded at various readout rates (up to 300 fps); images were reconstructed using a conventional Feldkamp cone-beam reconstruction algorithm. We have found that this system allows volumetric CT imaging of small animals in approximately two seconds at ~110 to 190 µm resolution, compared to several minutes at 160 µm resolution needed for the best current systems.

Thacker, Samta C.; Yang, Kai; Packard, Nathan; Gaysinskiy, Valeriy; Burkett, George; Miller, Stuart; Boone, John M.; Nagarkar, Vivek

2009-02-01

325

Application of real-time single camera SLAM technology for image-guided targeting in neurosurgery  

NASA Astrophysics Data System (ADS)

In this paper, we propose an application of augmented reality technology for targeting tumors or anatomical structures inside the skull. The application is a combination of the technologies of MonoSLAM (Single Camera Simultaneous Localization and Mapping) and computer graphics. A stereo vision system is developed to construct geometric data of human face for registration with CT images. Reliability and accuracy of the application is enhanced by the use of fiduciary markers fixed to the skull. The MonoSLAM keeps track of the current location of the camera with respect to an augmented reality (AR) marker using the extended Kalman filter. The fiduciary markers provide reference when the AR marker is invisible to the camera. Relationship between the markers on the face and the augmented reality marker is obtained by a registration procedure by the stereo vision system and is updated on-line. A commercially available Android based tablet PC equipped with a 320×240 front-facing camera was used for implementation. The system is able to provide a live view of the patient overlaid by the solid models of tumors or anatomical structures, as well as the missing part of the tool inside the skull.

Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng

2012-10-01

326

ROSA: A High-cadence, Synchronized Multi-camera Solar Imaging System  

NASA Astrophysics Data System (ADS)

The Rapid Oscillations in the Solar Atmosphere (ROSA) instrument is a synchronized, six-camera high-cadence solar imaging instrument developed by Queen's University Belfast and recently commissioned at the Dunn Solar Telescope at the National Solar Observatory in Sunspot, New Mexico, USA, as a common-user instrument. Consisting of six 1k x 1k Peltier-cooled frame-transfer CCD cameras with very low noise (0.02 - 15 e/pixel/s), each ROSA camera is capable of full-chip readout speeds in excess of 30 Hz, and up to 200 Hz when the CCD is windowed. ROSA will allow for multi-wavelength studies of the solar atmosphere at a high temporal resolution. We will present the current instrument set-up and parameters, observing modes, and future plans, including a new high-QE camera allowing 15 Hz imaging in H-alpha. Interested parties should see https://habu.pst.qub.ac.uk/groups/arcresearch/wiki/de502/ROSA.html

Christian, Damian Joseph; Jess, D. B.; Mathioudakis, M.; Keenan, F. P.

2011-05-01

327

High-speed camera with internal real-time image compression  

NASA Astrophysics Data System (ADS)

High-speed video cameras are powerful tools for investigating, for instance, fluid dynamics or the movements of mechanical parts in manufacturing processes. In the past 5 years the use of CMOS sensors instead of CCDs has facilitated the development of high-speed video cameras offering digital outputs, readout flexibility, and lower manufacturing costs. Still, the huge data flow provided by the sensor cannot be easily transferred or processed and thus must generally be stored temporarily in fast local RAM. Since this RAM is size limited, the recording time in the camera is only a few seconds long. We tried to develop an alternative solution that would allow continuous recording. We developed a real-time image compression in order to reduce the data flow. We tested three algorithms: run-length encoding, block coding, and compression using wavelets. These compression algorithms have been implemented into a FPGA Virtex II-1000 and allow real-time compression factors between 5 and 10 with a PSNR greater than 35 dB. This compression factor allowed us to link a new high-speed CMOS video camera with a PC using a single USB2 connection. The full flow of 500 fps in 1280x1024 format is transferred to the computer in real-time.
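
Of the three compression algorithms tested, run-length encoding is the simplest; a minimal software sketch for one row of 8-bit pixels is shown below (the FPGA implementation and the block and wavelet coders are not reproduced, and the function name is illustrative).

```python
def rle_encode(row):
    """Run-length encode a sequence of 8-bit pixel values.

    Returns a list of (value, run_length) pairs; effective when rows
    contain long runs of identical values.
    """
    if len(row) == 0:
        return []
    encoded, current, count = [], row[0], 1
    for value in row[1:]:
        if value == current and count < 255:   # cap run length at one byte
            count += 1
        else:
            encoded.append((current, count))
            current, count = value, 1
    encoded.append((current, count))
    return encoded
```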

Paindavoine, Michel; Mosqueron, Romulad; Dubois, Julien; Clerc, Cedric; Grapin, Jean-CLaude; Pierrefeu, Lionel; Tomasini, Francois

2005-03-01

328

The core of the nearby S0 galaxy NGC 7457 imaged with the HST planetary camera  

NASA Technical Reports Server (NTRS)

A brief analysis is presented of images of the nearby S0 galaxy NGC 7457 obtained with the HST Planetary Camera. While the galaxy remains unresolved with the HST, the images reveal that any core most likely has r(c) less than 0.052 arcsec. The light distribution is consistent with a gamma = -1.0 power law inward to the resolution limit, with a possible stellar nucleus with a luminosity of about 10 million solar luminosities. This result represents the first observation outside the Local Group of a galaxy nucleus at this spatial resolution, and it suggests that such small, high surface brightness cores may be common.

Lauer, Tod R.; Faber, S. M.; Holtzman, Jon A.; Baum, William A.; Currie, Douglas G.; Ewald, S. P.; Groth, Edward J.; Hester, J. Jeff; Kelsall, T.

1991-01-01

329

Improvement of a snapshot spectroscopic retinal multi-aperture imaging camera  

NASA Astrophysics Data System (ADS)

Measurement of oxygen saturation has proved to give important information about eye health and the onset of eye pathologies such as diabetic retinopathy. Recently, we presented a multi-aperture system enabling snapshot acquisition of human fundus images at six different wavelengths. In our setup a commercial fundus ophthalmoscope was interfaced with the multi-aperture system to acquire spectroscopically sensitive images of the retinal vessels, thus enabling assessment of the oxygen saturation in the retina. Snapshot spectroscopic acquisition is meant to minimize the effects of eye movements. Higher measurement accuracy can be achieved by increasing the number of wavelengths at which the fundus images are taken. In this study we present an improvement of our setup that introduces another multi-aperture camera enabling us to take snapshot images of the fundus at nine different wavelengths. Careful consideration is given to improving image transfer by measuring the optical properties of the fundus camera used in the setup and modeling the optical train in Zemax.

Lemaillet, Paul; Lompado, Art; Ramella-Roman, Jessica C.

2009-02-01

330

Performance of CID camera x-ray imagers at NIF in a harsh neutron environment  

NASA Astrophysics Data System (ADS)

Charge-injection devices (CIDs) are solid-state 2D imaging sensors similar to CCDs, but their distinct architecture makes CIDs more resistant to ionizing radiation. CID cameras have been used extensively for X-ray imaging at the OMEGA Laser Facility with neutron fluences at the sensor approaching 10^9 n/cm^2 (DT, 14 MeV). A CID Camera X-ray Imager (CCXI) system has been designed and implemented at NIF that can be used as a rad-hard electronic-readout alternative for time-integrated X-ray imaging. This paper describes the design and implementation of the system, calibration of the sensor for X-rays in the 3 - 14 keV energy range, and preliminary data acquired on NIF shots over a range of neutron yields. The upper limit of neutron fluence at which CCXI can acquire useable images is ~10^8 n/cm^2 and there are noise problems that need further improvement, but the sensor has proven to be very robust in surviving high yield shots (~10^14 DT neutrons) with minimal damage.

Palmer, Nathan E.; Schneider, Marilyn B.; Bell, Perry M.; Piston, Ken W.; Moody, James D.; James, D. L.; Ness, Ron A.; Haugh, Michael J.; Lee, Joshua J.; Romano, Edward D.

2013-09-01

331

Flame colour characterization in the visible and infrared spectrum using a digital camera and image processing  

NASA Astrophysics Data System (ADS)

An attempt has been made to characterize the colour spectrum of methane flame under various burning conditions using RGB and HSV colour models instead of resolving the real physical spectrum. The results demonstrate that each type of flame has its own characteristic distribution in both the RGB and HSV space. It has also been observed that the averaged B and G values in the RGB model represent well the CH* and C2* emission of premixed methane flame. These features may be utilized for flame measurement and monitoring. The great advantage of using a conventional camera for monitoring flame properties based on the colour spectrum is that it is readily available, easy to interface with a computer, cost effective and has certain spatial resolution. Furthermore, it has been demonstrated that a conventional digital camera is able to image flame not only in the visible spectrum but also in the infrared. This feature is useful in avoiding the problem of image saturation typically encountered in capturing the very bright sooty flames. As a result, further digital imaging processing and quantitative information extraction is possible. It has been identified that an infrared image also has its own distribution in both the RGB and HSV colour space in comparison with a flame image in the visible spectrum.
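
A minimal sketch of extracting the averaged RGB and HSV components of a flame region with OpenCV, as an illustration of the colour-model characterization described here; the flame mask and function names are assumptions, not the authors' processing chain.

```python
import cv2
import numpy as np

def flame_colour_features(bgr_image, mask=None):
    """Average RGB and HSV components of the flame region.

    bgr_image : image as loaded by cv2.imread (B, G, R channel order)
    mask      : optional boolean array selecting flame pixels
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    if mask is None:
        mask = np.ones(bgr_image.shape[:2], dtype=bool)
    b, g, r = (bgr_image[..., i][mask].mean() for i in range(3))
    h, s, v = (hsv[..., i][mask].mean() for i in range(3))
    # per the record, mean B and mean G track CH* and C2* emission
    # in premixed methane flames
    return {"R": r, "G": g, "B": b, "H": h, "S": s, "V": v}
```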

Huang, Hua-Wei; Zhang, Yang

2008-08-01

332

Determining 3D Flow Fields via Multi-camera Light Field Imaging  

PubMed Central

In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture (1). Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3DPIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet.
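
The synthetic aperture refocusing step can be illustrated with a minimal shift-and-average sketch, in which the homography-based reprojection of the actual method is reduced to integer pixel shifts per camera; the shift model and all names are assumptions rather than the authors' implementation.

```python
import numpy as np

def refocus_slice(images, cam_offsets_px, inv_depth):
    """Compute one slice of a synthetic-aperture focal stack.

    images         : list of aligned 2D grayscale frames, one per camera
    cam_offsets_px : list of (dy, dx) camera offsets expressed in pixels
    inv_depth      : 1/z scaling of the shift (larger = nearer focal plane)
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, (dy, dx) in zip(images, cam_offsets_px):
        shift = (int(round(dy * inv_depth)), int(round(dx * inv_depth)))
        acc += np.roll(img, shift, axis=(0, 1))
    # objects lying at this depth add coherently and appear sharp;
    # objects at other depths blur out in the average
    return acc / len(images)
```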

Truscott, Tadd T.; Belden, Jesse; Nielson, Joseph R.; Daily, David J.; Thomson, Scott L.

2013-01-01

333

Determining 3D flow fields via multi-camera light field imaging.  

PubMed

In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture (1). Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3DPIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet. PMID:23486112

Truscott, Tadd T; Belden, Jesse; Nielson, Joseph R; Daily, David J; Thomson, Scott L

2013-01-01

334

Application of adaptive optics in retinal imaging: a quantitative and clinical comparison with standard cameras  

NASA Astrophysics Data System (ADS)

Aim: The objective of this project was to evaluate high resolution images from an adaptive optics retinal imager through comparisons with standard film-based and standard digital fundus imagers. Methods: A clinical prototype adaptive optics fundus imager (AOFI) was used to collect retinal images from subjects with various forms of retinopathy to determine whether improved visibility into the disease could be provided to the clinician. The AOFI achieves low-order correction of aberrations through a closed-loop wavefront sensor and an adaptive optics system. The remaining high-order aberrations are removed by direct deconvolution using the point spread function (PSF) or by blind deconvolution when the PSF is not available. An ophthalmologist compared the AOFI images with standard fundus images and provided a clinical evaluation of all the modalities and processing techniques. All images were also analyzed using a quantitative image quality index. Results: This system has been tested on three human subjects (one normal and two with retinopathy). In the diabetic patient, vascular abnormalities were detected with the AOFI that could not be resolved with the standard fundus camera. Very small features, such as the fine vascular structures on the optic disc and the individual nerve fiber bundles, are easily resolved by the AOFI. Conclusion: This project demonstrated that adaptive optics images have great potential in providing clinically significant detail of anatomical and pathological structures to the ophthalmologist.

Barriga, E. S.; Erry, G.; Yang, S.; Russell, S.; Raman, B.; Soliz, P.

2005-04-01

335

Characterization of digital cameras for reflected ultraviolet photography; implications for qualitative and quantitative image analysis during forensic examination.  

PubMed

Reflected ultraviolet imaging techniques allow for the visualization of evidence normally outside the human visible spectrum. Specialized digital cameras possessing extended sensitivity can be used for recording reflected ultraviolet radiation. Currently, there is a lack of standardized methods for ultraviolet image recording and processing using digital cameras, potentially limiting implementation and interpretation. A methodology is presented for processing ultraviolet images based on linear responses and the sensitivity of the respective color channels. The methodology is applied to a Fuji S3 UVIR camera and a modified Nikon D70s camera to reconstruct their respective spectral sensitivity curves between 320 and 400 nm. This method results in images with low noise and high contrast, suitable for qualitative and/or quantitative analysis. The application of this methodology is demonstrated in the recording of latent fingerprints. PMID:24117678

Garcia, Jair E; Wilksch, Philip A; Spring, Gale; Philp, Peta; Dyer, Adrian

2014-01-01

336

200 ps FWHM and 100 MHz repetition rate ultrafast gated camera for optical medical functional imaging  

NASA Astrophysics Data System (ADS)

The paper describes the realization of a complete optical imaging device for clinical applications such as brain functional imaging by time-resolved, spectroscopic diffuse optical tomography. The entire instrument is assembled in a single setup that includes a light source, an ultrafast time-gated intensified camera and all the electronic control units. The light source is composed of four near-infrared laser diodes driven by a nanosecond electrical pulse generator working in a sequential mode at a repetition rate of 100 MHz. The resulting light pulses, at four wavelengths, are less than 80 ps FWHM. They are injected into a four-furcated optical fiber terminated with a frontal light distributor to obtain a uniform illumination spot directed towards the head of the patient. Photons back-scattered by the subject are detected by the intensified CCD camera; they are resolved according to their time of flight inside the head. The core of the intensified camera system is the image intensifier tube and its associated electrical pulse generator. The ultrafast generator produces 50 V pulses at a repetition rate of 100 MHz, with a width corresponding to the requested 200 ps gate. The photocathode and the micro-channel plate of the intensifier have been specially designed to enhance electromagnetic wave propagation and to reduce the power loss and heat that are prejudicial to image quality. The whole instrumentation system is controlled by an FPGA-based module. The timing of the light pulses and the photocathode gating is precisely adjustable in steps of 9 ps. All the acquisition parameters are configurable via software through a USB connection, and the image data are transferred to a PC via an Ethernet link. The compactness of the instrument makes it well suited for bedside clinical applications.

Uhring, Wilfried; Poulet, Patrick; Hanselmann, Walter; Glazenborg, René; Zint, Virginie; Nouizi, Farouk; Dubois, Benoit; Hirschi, Werner

2012-05-01

337

Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera  

NASA Astrophysics Data System (ADS)

In recent years, high-performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Efficient discrimination and analysis of complex target information from active battlefields requires simultaneous multi-band measurement from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of both visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of around 0.3 m entrance aperture and several relay optical sub-systems capable of delivering both high-spatial-resolution visible and IR images to the detectors. The design benefits from several optical channels packaged in a compact space and from the associated freedom to choose between wide (~3 degrees) and narrow (~1 degree) fields of view. In order to investigate both the imaging and the radiometric performance of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system into an integrated ray-tracing simulation environment using Monte Carlo computation techniques. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performance were then investigated. The simulation results demonstrate that the camera design satisfies the NIIRS 7 detection criterion. The camera concept, details of the performance simulation, and the resulting performance are discussed together with the future development plan.

Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

2009-05-01

338

Spartan infrared camera: high-resolution imaging for the SOAR Telescope  

NASA Astrophysics Data System (ADS)

The Spartan Infrared Camera provides tip-tilt corrected imaging for the SOAR Telescope in the 1-2.5 μm spectral range with four 2048x2048 HAWAII2 detectors. The median image size is expected to be less than 0.25 arcsec (FWHM), and in the H and K bands a significant amount of the light is expected to be in a core having the diffraction-limited width. The camera has two plate scales: 0.04 arcsec/pixel (f/21) for diffraction-limited sampling in the H and K bands and 0.07 arcsec/pixel (f/12) to cover a 5×5 arcmin2 field, over which tip-tilt correction is substantial. Except for CaF2 field-flattening lenses, the optics are all reflective, to achieve the large field size and achromaticity, and all aluminum, to thermally match the aluminum cryogenic-optical box in which the optics are mounted. The Strehl ratio of the camera itself is 0.95-1.00 for the f/21 channel. The optics (including the off-axis aspherical mirrors) will be aligned with precise metrology rather than adjusted using interferometry.

Loh, Edwin D.; Biel, Jason D.; Chen, Jian-Jun; Davis, Michael; Laporte, Rene; Loh, Owen Y.

2004-09-01

339

Fluorine-18-fluorodeoxyglucose dual-head gamma camera coincidence imaging of recurrent colorectal carcinoma.  

PubMed

We report our experience with coincidence detection imaging of 18F-fluorodeoxyglucose (FDG) using a dual-head gamma camera. Scanning of the pelvis and abdomen of a patient with recurrent colorectal carcinoma showed recurrent disease in the pelvic floor and the base of the urinary bladder and metastatic disease in the retroperitoneal space of the pelvis. Although the tumor involving the bladder and pelvic floor was detected by CT and magnetic resonance imaging (MRI), metastatic spread to the retroperitoneal nodes on the left side was detected only by 18F-FDG imaging. Based on the ultrasound, CT, MRI and cystoscopy, a local recurrence of cancer was presumed in our patient. An exploratory laparotomy was performed to resect the tumor in its entirety. At the time of surgery, the retroperitoneal metastasis in the pelvis was confirmed. Had the findings of the coincidence detection imaging study been considered, the patient would have been spared the surgical procedure. PMID:9544675

Abdel-Dayem, H M; Radin, A I; Luo, J Q; Marans, H Y; Wong, S; Naddaf, S Y; El-Zeftawy, H M; Omar, W S; Mithilesh, K; Abujudeh, H; Atay, S

1998-04-01

340

Digital image processing in high resolution infrared camera with use of programmable logic device  

NASA Astrophysics Data System (ADS)

This article describes a digital system for high-resolution infrared camera control and image processing. The camera is built around a bolometric focal plane array of 640 by 480 detectors. Each detector in the array is 25 μm in size and detects incident radiation in the 8-12 μm spectral range thanks to a special filter installed in a specially designed entrance window. The most important tasks of the infrared image processing system are array readout and correction of detector offset and responsivity variations. Further tasks are the conversion of the analog voltage signals from the microbolometers to digital form and the composition of a thermal image. The microbolometer array is controlled via several signals. The signal generator for the readout circuit is capable of changing various timing parameters such as the frame rate or the integration time of the detector array; these parameters are changed via a special set of memory-mapped registers. The infrared data received from the detector array are transferred via a data bus to modules performing image processing, for example image enhancement. The image processing algorithms necessary for infrared image generation are nonuniformity correction, bad pixel replacement and radiometric calibration. Optionally, additional image processing techniques can be applied, such as edge enhancement, dynamic range compression or object identification. The elaborated architecture allows easy changes of system parameters and the adoption of many new algorithms without significant hardware changes. Scientific work funded from the science fund for the years 2009-2011 as a development project.
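Two of the processing steps named above, nonuniformity correction and bad pixel replacement, can be sketched in a few lines; the two-point gain/offset scheme and the 3x3 neighbour averaging below are generic textbook versions, shown in Python for readability rather than as the article's FPGA implementation.

import numpy as np

def two_point_nuc(raw, cold_ref, hot_ref, t_cold, t_hot):
    # Two-point nonuniformity correction from two uniform reference frames
    # recorded at known blackbody temperatures t_cold and t_hot.
    gain = (t_hot - t_cold) / (hot_ref - cold_ref + 1e-12)   # per-pixel gain
    offset = t_cold - gain * cold_ref                        # per-pixel offset
    return gain * raw + offset

def replace_bad_pixels(frame, bad_mask):
    # Replace flagged pixels with the mean of their valid 3x3 neighbours.
    out = frame.copy()
    h, w = frame.shape
    for y, x in zip(*np.nonzero(bad_mask)):
        y0, y1 = max(0, y - 1), min(h, y + 2)
        x0, x1 = max(0, x - 1), min(w, x + 2)
        patch = frame[y0:y1, x0:x1]
        valid = ~bad_mask[y0:y1, x0:x1]
        if valid.any():
            out[y, x] = patch[valid].mean()
    return out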

Sosnowski, Tomasz; Bieszczad, Grzegorz; Kastek, Mariusz; Madura, Henryk

2010-10-01

341

Modeling of three-dimensional camera imaging in a tokamak torus  

SciTech Connect

A procedure is described for precision modeling of the views for imaging diagnostics monitoring tokamak internal components, particularly high heat flux divertor components. Such modeling is required to enable predictions of resolution and viewing angle for the available viewing locations. Since oblique views are typically expected for tokamak divertors, fully three-dimensional (3D) perspective imaging is required. A suite of matched 3D CAD, graphics and animation applications are used to provide a fast and flexible technique for reproducing these views. An analytic calculation of the resolution and viewing incidence angle is developed to validate the results of the modeling procedures. The tokamak physics experiment (TPX) diagnostics for infrared viewing are used as an example to demonstrate the implementation of the tools. As is generally the case in tokamak experiments, the available diagnostic locations for TPX are severely constrained by access limitations and the resulting images can be marginal in both resolution and viewing incidence angle. The procedures described here provide a complete design tool for in-vessel viewing, both for camera location and for identification of viewed surfaces. Additionally, these same tools can be used for the interpretation of the actual images obtained by the diagnostic cameras. © 1997 American Institute of Physics.

Edmonds, P.H. [Fusion Research Center, University of Texas at Austin, Austin, Texas (United States)]; Medley, S.S. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)]

1997-01-01

342

Development of a 3D dynamic imaging technique with a multi-pinhole γ-camera  

NASA Astrophysics Data System (ADS)

We propose a multi-pinhole γ-camera, comprising multi-pinhole collimators and flat-panel detectors (FPDs), in order to facilitate dynamic imaging of low-energy γ-ray emitters such as 99mTc. The collimator comprises an array of 9×9 pinholes with an opening angle of 60°, aligned in a lattice arrangement. In order to experimentally verify the 3D imaging capabilities of this γ-camera design, we have conducted preliminary experiments to measure the profile of a thin disk 57Co (122 keV) γ-ray source with a prototype system, using an imaging plate instead of the FPD. The results indicate that the system would be able to identify the location of the 57Co γ-ray sources with a spatial resolution of 2.5 mm. We have also carried out a preliminary experiment using the FPD. The results show that the proposed system using the FPD (C9250DP) can achieve a time resolution of 13 s per image for a simple distribution of a 37 MBq 99mTc source.

Takemoto, Hisato; Watanabe, Kenichi; Kawarabayashi, Jun; Iguchi, Tetsuo

2007-08-01

343

Real-Time On-Board Processing Validation of MSPI Ground Camera Images  

NASA Technical Reports Server (NTRS)

The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16-channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA including PowerPC440 processors we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real-time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
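To make the least-squares idea concrete, the sketch below recovers intensity and linear-polarization parameters (I, Q, U) from samples taken through an ideal linear analyzer at known angles; this Malus-law model and the example numbers are illustrative only and do not reproduce the MSPI modulation scheme or its FPGA implementation.

import numpy as np

def fit_polarimetry(samples, analyzer_angles_rad):
    # Least-squares fit of Stokes parameters (I, Q, U) from intensity samples
    # recorded through an ideal linear analyzer at the given orientations.
    a = np.asarray(analyzer_angles_rad, dtype=float)
    # Ideal-analyzer model: s = 0.5 * (I + Q*cos(2a) + U*sin(2a))
    A = 0.5 * np.column_stack([np.ones_like(a), np.cos(2 * a), np.sin(2 * a)])
    stokes, *_ = np.linalg.lstsq(A, np.asarray(samples, dtype=float), rcond=None)
    return stokes  # [I, Q, U]

# Noiseless example with I = 1.0, Q = 0.2, U = 0.1
angles = np.deg2rad([0, 45, 90, 135, 30, 60])
true_iqu = np.array([1.0, 0.2, 0.1])
samples = 0.5 * (true_iqu[0] + true_iqu[1] * np.cos(2 * angles) + true_iqu[2] * np.sin(2 * angles))
print(fit_polarimetry(samples, angles))   # recovers approximately [1.0, 0.2, 0.1]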

Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

2010-01-01

344

Optimization of camera exposure durations for multi-exposure speckle imaging of the microcirculation  

PubMed Central

Improved Laser Speckle Contrast Imaging (LSCI) blood flow analyses that incorporate inverse models of the underlying laser-tissue interaction have been used to develop more quantitative implementations of speckle flowmetry such as Multi-Exposure Speckle Imaging (MESI). In this paper, we determine the optimal camera exposure durations required for obtaining flow information with accuracy comparable to the prevailing MESI implementation utilized in recent in vivo rodent studies. A looping leave-one-out (LOO) algorithm was used to identify exposure subsets, which were analyzed for accuracy against flows obtained from analysis with the original full exposure set over 9 animals comprising n = 314 regional flow measurements. From the 15 original exposures, 6 exposures were found using the LOO process to provide comparable accuracy, defined as being no more than 10% deviant from the original flow measurements. The optimal subset of exposures provides a basis set of camera durations for speckle flowmetry studies of the microcirculation and confers a two-fold faster acquisition rate and a 28% reduction in processing time without sacrificing accuracy. Additionally, the optimization process can be used to identify further reductions in the exposure subsets for tailoring imaging over less expansive flow distributions to enable even faster imaging.
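The looping leave-one-out pruning can be sketched as a greedy loop that repeatedly drops the exposure whose removal degrades accuracy the least, stopping before the deviation from the full-set result exceeds the 10% tolerance; the flow_error callable below is a placeholder for the MESI fitting and comparison step and is not the authors' code.

def prune_exposures(exposures, flow_error, max_deviation=0.10):
    # Greedy leave-one-out pruning of camera exposure durations.
    # exposures     : list of exposure times (seconds)
    # flow_error    : callable(subset) -> fractional deviation of subset-derived
    #                 flow estimates from the full-exposure-set estimates
    # max_deviation : keep pruning only while the deviation stays at or below this
    subset = list(exposures)
    while len(subset) > 1:
        # Try removing each remaining exposure and keep the least damaging removal
        trials = [(flow_error(subset[:i] + subset[i + 1:]), i) for i in range(len(subset))]
        best_err, best_i = min(trials)
        if best_err > max_deviation:
            break
        subset.pop(best_i)
    return subset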

Kazmi, S. M. Shams; Balial, Satyajit; Dunn, Andrew K.

2014-01-01

345

A clinical gamma camera-based pinhole collimated system for high resolution small animal SPECT imaging.  

PubMed

The main objective of the present study was to upgrade a clinical gamma camera to obtain high resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for radiotracer dynamic studies. In conclusion, a high resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology. PMID:21085887
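The maximum-likelihood reconstruction referred to above is commonly implemented as the MLEM iteration; the generic, dense-matrix version below illustrates the update rule and is not the authors' pinhole system model or software.

import numpy as np

def mlem(system_matrix, projections, n_iter=20):
    # Basic MLEM reconstruction.
    # system_matrix : (n_bins, n_voxels) probability that a decay in voxel j
    #                 is detected in projection bin i
    # projections   : (n_bins,) measured counts
    A = np.asarray(system_matrix, dtype=np.float64)
    y = np.asarray(projections, dtype=np.float64)
    sens = A.sum(axis=0) + 1e-12            # per-voxel sensitivity
    x = np.ones(A.shape[1])                 # uniform initial estimate
    for _ in range(n_iter):
        expected = A @ x + 1e-12            # forward projection of current estimate
        x *= (A.T @ (y / expected)) / sens  # multiplicative MLEM update
    return x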

Mejia, J; Galvis-Alonso, O Y; Castro, A A de; Braga, J; Leite, J P; Simões, M V

2010-12-01

346

Introducing sub-wavelength pixel THz camera for the understanding of close pixel-to-wavelength imaging challenges  

NASA Astrophysics Data System (ADS)

Conventional guidelines and approximations useful in macro-scale system design can become invalid when applied to smaller scales. An illustration of this is when the camera pixel size becomes smaller than the diffraction-limited resolution of the incident light. It is sometimes believed that there is no benefit in having a pixel width smaller than the resolving limit defined by the Rayleigh criterion, 1.22 λ F/#. Though this rarely occurs in today's imaging technology, terahertz (THz) imaging is one emerging area where the pixel dimensions can be made smaller than the imaging wavelength. With terahertz camera technology, we are able to achieve sub-wavelength pixel sampling pitch, and are therefore capable of directly measuring whether there are image quality benefits to be derived from sub-wavelength sampling. Interest in terahertz imaging is high due to potential uses in security applications because of the greater penetration depth of terahertz radiation compared to the infrared and the visible. This paper discusses the modification by INO of its infrared MEMS microbolometer detector technology toward a THz imaging platform yielding a sub-wavelength pixel THz camera. Images obtained with this camera are reviewed in this paper. Measurements were also obtained using microscanning to increase sampling resolution. Parameters such as imaging resolution and sampling are addressed. A comparison is also made with results obtained with an 8-12 μm band camera having a pixel pitch close to the diffraction limit.
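A quick worked check of the sampling argument, using an assumed F/1 optic (an example value, not the INO camera specification) and a 52 μm microbolometer pitch taken as a representative example:

# Rayleigh-criterion spot size versus pixel pitch (illustrative numbers)
wavelength_um = 100.0      # 3 THz corresponds to a wavelength of about 100 um
f_number = 1.0             # assumed example optic
pixel_pitch_um = 52.0      # representative microbolometer pitch

rayleigh_um = 1.22 * wavelength_um * f_number
print(f"Rayleigh spot: {rayleigh_um:.0f} um, pixel pitch: {pixel_pitch_um:.0f} um")
print(f"Pixels per resolution element: {rayleigh_um / pixel_pitch_um:.1f}")
# About 2.3 pixels fit inside one Rayleigh spot, i.e. the pitch is below the
# imaging wavelength, which is the regime this paper investigates.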

Bergeron, A.; Marchese, L.; Bolduc, M.; Terroux, M.; Dufour, D.; Savard, E.; Tremblay, B.; Oulachgar, H.; Doucet, M.; Le Noc, L.; Alain, C.; Jerominek, H.

2012-05-01

347

The Large Binocular Camera image simulator: predicting the performances of LBC  

NASA Astrophysics Data System (ADS)

The LBC (Large Binocular Camera) Image Simulator is a package for generating artificial images in the typical FITS format. It operates on real or artificial images, simulating the expected performance of real instruments under several observing conditions (filters, air-mass, flat-field, exposure time) and creating images with the LBC instrumental artifacts (optical deformations, noise, CCD architectures). The simulator can also be used to produce artificial images for other existing and future telescopes, since its structure is very flexible. The main aim of LBCSIM is to support the development of pipeline and data analysis procedures able to cope with wide-field imaging and fast reduction of huge amounts of photometric data. The software consists of three stand-alone programs written in C, using IRAF and running under Linux. The LBC Image Simulator is built with particular attention to Virtual Observatory and Data Grid applications. In this paper, we first describe the software, its performance and several tests carried out before the public release, and some examples for the users. In particular, we compared the Hubble Deep Field South (HDFS) as seen by FORS1 with a simulated image and found that the agreement is good. We then use this software to predict the expected performance of the LBC instrument by means of realistic simulations of deep-field observations with the LBT telescope.

Grazian, Andrea; Fontana, Adriano; De Santis, Cristian; Gallozzi, Stefano; Giallongo, Emanuele

2004-09-01

348

Airborne imaging for heritage documentation using the Fotokite tethered flying camera  

NASA Astrophysics Data System (ADS)

Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. As such, the safe operation of these devices is still an issue, certainly when flying at locations which can be crowded (such as students on excavations or tourists walking around historic places). As the future of UAS regulation remains unclear, this talk presents an alternative approach to aerial imaging: the Fotokite. Developed at the ETH Zürich, the Fotokite is a tethered flying camera that is essentially a multi-copter connected to the ground with a taut tether to achieve controlled flight. Crucially, it relies solely on onboard IMU (Inertial Measurement Unit) measurements to fly, launches in seconds, and is not classified as a UAS (Unmanned Aerial System), e.g. in the latest FAA (Federal Aviation Administration) UAS proposal. As a result, it may be used for imaging cultural heritage in a variety of environments and settings with minimal training by non-experienced pilots. Furthermore, it is subject to less extensive certification, regulation and import/export restrictions, making it a viable solution for use at a greater range of sites than traditional methods. Unlike a balloon or a kite, it is not subject to particular weather conditions and, thanks to active stabilization, is capable of a variety of intelligent flight modes. Finally, it is compact and lightweight, making it easy to transport and deploy, and its lack of reliance on GNSS (Global Navigation Satellite System) makes it possible to use in urban, overbuilt areas.
After outlining its operating principles, the talk will present some archaeological case studies in which the Fotokite was used, hereby assessing its capabilities compared to the conventional UAS's on the market.

Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

2014-05-01

349

Toward Real-time quantum imaging with a single pixel camera  

SciTech Connect

We present a workbench for the study of real-time quantum imaging by measuring the frame-by-frame quantum noise reduction of multi-spatial-mode twin beams generated by four wave mixing in Rb vapor. Exploiting the multiple spatial modes of this squeezed light source, we utilize spatial light modulators to selectively transmit macropixels of quantum correlated modes from each of the twin beams to a high quantum efficiency balanced detector. In low-light-level imaging applications, the ability to measure the quantum correlations between individual spatial modes and macropixels of spatial modes with a single pixel camera will facilitate compressive quantum imaging with sensitivity below the photon shot noise limit.

Lawrie, Benjamin J [ORNL; Pooser, Raphael C [ORNL

2013-01-01

350

High-Contrast Exoplanet Imaging with CLIO2, the Magellan Adaptive Optics Infrared Camera  

NASA Astrophysics Data System (ADS)

MagAO is the adaptive-secondary AO system on the 6.5-m Magellan Clay telescope. With a high actuator density and a sensitive pyramid WFS, MagAO achieves down to ~130 nm rms WFE on bright guide stars in median seeing conditions (0.7'' V band) at Las Campanas Observatory in Chile. MagAO's infrared camera, Clio2, has a comprehensive suite of narrow- and broad-band filters that allow direct imaging of faint companions from 1-5 um. We present first-light results from Clio2, including images of the exoplanetary system Beta Pictoris. High-contrast imaging is an important goal of AO for ELTs, and results from MagAO/Clio2 are the next step along that path, particularly for the GMT, which is located very close to the Magellan site.

Morzinski, Katie; Close, Laird; Males, Jared; Hinz, Philip; Puglisi, Alfio; Esposito, Simone; Riccardi, Armando; Pinna, Enrico; Xompero, Marco; Briguglio, Runa; Follette, Kate; Kopon, Derek; Skemer, Andy; Gasho, Victor; Uomoto, Alan; Hare, Tyson; Arcidiacono, Carmelo; Quiros-Pacheco, Fernando; Argomedo, Javier; Busoni, Lorenzo; Rodigas, Timothy; Wu, Ya-Lin

2013-12-01

351

The co-imaging of gamma camera measurements of aerosol deposition and respiratory anatomy.  

PubMed

Gamma camera imaging following the inhalation of a radiolabel has been widely used by researchers to investigate the fate of inhaled aerosols. The application of two-dimensional (2D) planar gamma scintigraphy and single-photon emission computed tomography (SPECT) to the study of inhaled aerosols is discussed in this review. Information on co-localized anatomy can be derived from other imaging techniques such as krypton ventilation scans and low- and high-resolution X-ray computed tomography (CT). Radionuclide imaging, combined with information on anatomy, is a potentially useful approach when the understanding of regional deposition within the lung is central to research objectives, for following disease progression and for the evaluation of therapeutic intervention. PMID:23517170

Conway, Joy; Fleming, John; Bennett, Michael; Havelock, Tom

2013-06-01

352

Gabor zone plate apertures for imaging with the mercuric iodide gamma ray camera  

SciTech Connect

Gabor zone plate (GZP) apertures have been developed for use in EG&G EM's mercuric iodide (HgI2) gamma-ray camera. The purpose of such an aperture is to increase efficiency while maintaining good resolution. The GZP is similar to the Fresnel zone plate (FZP), but it has continuous transitions between opaque and transparent regions. Because there are no sharp transitions in the transmission, the inherent interference noise in GZP imaging is lower than that in FZP imaging. GZP parameters were chosen by considering the effects of constraints such as detector pixel size, number of pixels, minimum field of view required, maximum angle of incidence tolerated, and the Nyquist criterion for the minimum sampling rate. As a result, an aperture was designed and fabricated with eight zones and a diameter of 3 cm. Lead was chosen as the aperture medium due to its high attenuation coefficient. Experimental data were obtained from the camera with the above GZP aperture. The point-spread function was determined and compared to the calculated response; excellent agreement was obtained. The reconstruction process involves simulating, by computer, planar-wave illumination of a scaled transparency of the image and recording the intensity pattern at the focal plane. 6 refs., 7 figs.
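The continuous transitions described above can be made concrete with the standard sinusoidal-in-r-squared transmission profile of a Gabor zone plate; the array size and first-zone radius below are arbitrary illustrative values, not the fabricated 3 cm, eight-zone aperture.

import numpy as np

def gabor_zone_plate(size_px, r1_px):
    # Transmission map of a Gabor zone plate: 0.5 * (1 + cos(pi * (r/r1)^2)),
    # i.e. sinusoidal in r^2, in contrast to the binary 0/1 zones of a
    # Fresnel zone plate with the same zone spacing.
    y, x = np.indices((size_px, size_px)) - (size_px - 1) / 2.0
    r2 = x ** 2 + y ** 2
    return 0.5 * (1.0 + np.cos(np.pi * r2 / r1_px ** 2))

zp = gabor_zone_plate(256, 16.0)   # 256 x 256 map with a first-zone radius of 16 pixels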

Patt, B.E. (EG and G Energy Measurements, Inc., Goleta, CA (USA)); Meyyappan, A.; Cai, A.; Wade, G. (California Univ., Santa Barbara, CA (USA). Dept. of Electrical and Computer Engineering)

1990-01-01

353

Noise-equivalent power characterization of an uncooled microbolometer-based THz imaging camera  

NASA Astrophysics Data System (ADS)

A THz camera based on an uncooled microbolometer 160×120 pixel array with a nominal pitch of 52 μm has been developed at INO, and initial transmission and reflection images showed promise. In the present paper, the characterization of both standard infrared and THz-optimized uncooled microbolometer pixel arrays is presented at both infrared and THz wavelengths. Measurements in the THz region have been performed using a non-uniform low-power quantum-cascade laser (QCL) beam at 3 THz and uniform high-power far-infrared (FIR) laser beams at 4.25 and 2.54 THz. A comparison measurement has been carried out in the infrared using blackbody radiation. Different methods for noise-equivalent power (NEP) measurement have been investigated. These characterization methods are promising, especially for non-uniform laser beams irradiating pixel arrays. The NEP results are in good agreement regardless of the method used in the experiments. The results show a high sensitivity of the THz-optimized pixel array in the THz region. Large-beam-area reflection imaging of obscured materials at 2.54 THz has been performed at video rates of 30 frames per second using the THz-optimized pixel array equipped with a semi-custom fast THz objective, proving that the INO THz camera provides a promising solution for stand-alone imaging systems.

Bolduc, Martin; Terroux, Marc; Tremblay, Bruno; Marchese, Linda; Savard, Eric; Doucet, Michel; Oulachgar, Hassane; Alain, Christine; Jerominek, Hubert; Bergeron, Alain

2011-05-01

354

Light sources and cameras for standard in vitro membrane potential and high-speed ion imaging.  

PubMed

Membrane potential and fast ion imaging are now standard optical techniques routinely used to record dynamic physiological signals in several preparations in vitro. Although detailed resolution of optical signals can be improved by confocal or two-photon microscopy, high spatial and temporal resolution can be obtained using conventional microscopy and affordable light sources and cameras. Thus, standard wide-field imaging methods are still the most common in research laboratories and can often produce measurements with a signal-to-noise ratio that is superior to other optical approaches. This paper seeks to review the most important instrumentation used in these experiments, with particular reference to recent technological advances. We analyse in detail the optical constraints dictating the type of signals that are obtained with voltage and ion imaging and we discuss how to use this information to choose the optimal apparatus. Then, we discuss the available light sources with specific attention to light emitting diodes and solid state lasers. We then address the current state-of-the-art of available charge coupled device, electron multiplying charge coupled device and complementary metal oxide semiconductor cameras and we analyse the characteristics that need to be taken into account for the choice of optimal detector. Finally, we conclude by discussing prospective future developments that are likely to further improve the quality of the signals expanding the capability of the techniques and opening the gate to novel applications. PMID:23692638

Davies, R; Graham, J; Canepari, M

2013-07-01

355

Intensified array camera imaging of solid surface combustion aboard the NASA Learjet  

NASA Technical Reports Server (NTRS)

An intensified array camera has been used to image weakly luminous flames spreading over thermally thin paper samples in a low-gravity environment aboard the NASA-Lewis Learjet. The aircraft offers 10 to 20 sec of reduced gravity during execution of a Keplerian trajectory and allows the use of instrumentation that is delicate or requires higher electrical power than is available in drop towers. The intensified array camera is a charge intensified device type that responds to light between 400 and 900 nm and has a minimum sensitivity of 10(exp 6) footcandles. The paper sample, either ashless filter paper or a lab wiper, burns inside a sealed chamber which is filled with 21, 18, or 15 pct. oxygen in nitrogen at one atmosphere. The camera views the edge of the paper and its output is recorded on videotape. Flame positions are measured every 0.1 sec to calculate flame spread rates. Comparisons with drop tower data indicate that the flame shapes and spread rates are affected by the residual g level in the aircraft.

Weiland, Karen J.

1992-01-01

356

Intensified array camera imaging of solid surface combustion aboard the NASA Learjet  

NASA Technical Reports Server (NTRS)

An intensified array camera was used to image weakly luminous flames spreading over thermally thin paper samples in a low gravity environment aboard the NASA-Lewis Learjet. The aircraft offers 10 to 20 sec of reduced gravity during execution of a Keplerian trajectory and allows the use of instrumentation that is delicate or requires higher electrical power than is available in drop towers. The intensified array camera is a charge intensified device type that responds to light between 400 and 900 nm and has a minimum sensitivity of 10(exp 6) footcandles. The paper sample, either ashless filter paper or a lab wiper, burns inside a sealed chamber which is filled with 21, 18, or 15 pct. oxygen in nitrogen at one atmosphere. The camera views the edge of the paper and its output is recorded on videotape. Flame positions are measured every 0.1 sec to calculate flame spread rates. Comparisons with drop tower data indicate that the flame shapes and spread rates are affected by the residual g level in the aircraft.

Weiland, Karen J.

1992-01-01

357

Synchroscan streak camera imaging at a 15-MeV photoinjector with emittance exchange  

NASA Astrophysics Data System (ADS)

At the Fermilab A0 photoinjector facility, bunch-length measurements of the laser micropulse and the e-beam micropulse have been done in the past with a fast single-sweep module of the Hamamatsu C5680 streak camera with an intrinsic shot-to-shot trigger jitter of 10-20 ps. We have upgraded the camera system with the synchroscan module tuned to 81.25 MHz to provide synchronous summing capability with less than 1.5 ps FWHM trigger jitter and a phase-locked delay box to provide phase stability of ~1 ps over tens of minutes. These steps allowed us to measure both the UV laser pulse train at 263 nm and the e-beam via optical transition radiation (OTR). Due to the low electron beam energies and OTR signals, we typically summed over 50 micropulses with 0.25-1 nC per micropulse. The phase-locked delay box allowed us to assess chromatic temporal effects and instigated another upgrade to an all-mirror input optics barrel. In addition, we added a slow-sweep horizontal deflection plug-in unit to provide dual-sweep capability for the streak camera. We report on a series of measurements made during the commissioning of these upgrades, including bunch-length and phase effects using the emittance exchange beamline and simultaneous imaging of a UV drive laser component, OTR, and the 800 nm diagnostics laser.

Lumpkin, A. H.; Ruan, J.; Thurman-Keup, R.

2012-09-01

358

Mars Orbiter Camera High Resolution Images: Some Results From The First 6 Weeks In Orbit  

NASA Technical Reports Server (NTRS)

The Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) images acquired shortly after orbit insertion were relatively poor in both resolution and image quality. This poor performance was solely the result of low sunlight conditions and the relative distance to the planet, both of which have been progressively improving over the past six weeks. Some of the better images are used here (see PIA01021 through PIA01029) to illustrate how the MOC images provide substantially better views of the martian surface than have ever been recorded previously from orbit.

This U.S. Geological Survey shaded relief map provides an overall context for the MGS MOC images of the Tithonium/Ius Chasma, Ganges Chasma, and Schiaparelli Crater. Closeup images of the Tithonium/Ius Chasma area are visible in PIA01021 through PIA01023. Closeups of Ganges Chasma are available as PIA01027 through PIA01029, and Schiaparelli Crater is shown in PIA01024 through PIA01026. The Mars Pathfinder landing site is shown to the north of the sites of the MGS images.

Launched on November 7, 1996, Mars Global Surveyor entered Mars orbit on Thursday, September 11, 1997. The original mission plan called for using friction with the planet's atmosphere to reduce the orbital energy, leading to a two-year mapping mission from close, circular orbit (beginning in March 1998). Owing to difficulties with one of the two solar panels, aerobraking was suspended in mid-October and resumed on November 8. Many of the original objectives of the mission, and in particular those of the camera, are likely to be accomplished as the mission progresses.

Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

1997-01-01

359

Planetary Camera imaging of the counter-rotating core galaxy NGC 4365  

NASA Technical Reports Server (NTRS)

We analyze F555W(V) band Planetary Camera images of NGC 4365, for which ground-based spectroscopy has revealed a misaligned, counter-rotating core. Line profile analysis by Surma indicates that the counter-rotating component has a disk structure. After deconvolution and galaxy modeling, we find photometric evidence at small radii to support this claim. There is no indication of a central point source or dust lane. The surface brightness profile reveals a steep outer profile and a shallow, but not flat, inner profile, with the inflection radius occurring at 1.8 arcsec. The inner profile is consistent with a cusp.

Forbes, Duncan A.

1994-01-01

360

Update and image quality error budget for the LSST camera optical design  

NASA Astrophysics Data System (ADS)

The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, modified Paul-Baker design, with an 8.4-meter primary mirror, a 3.4-m secondary, and a 5.0-m tertiary feeding a refractive camera design with 3 lenses (0.69-1.55m) and a set of broadband filters/corrector lenses. Performance is excellent over a 9.6 square degree field and ultraviolet to near infrared wavelengths. We describe the image quality error budget analysis methodology which includes effects from optical and optomechanical considerations such as index inhomogeneity, fabrication and null-testing error, temperature gradients, gravity, pressure, stress, birefringence, and vibration.

Bauman, Brian J.; Bowden, Gordon; Ku, John; Nordby, Martin; Olivier, Scot; Riot, Vincent; Rasmussen, Andrew; Seppala, Lynn; Xiao, Hong; Nurita, Nadine; Gilmore, David; Kahn, Steven

2010-07-01

361

High resolution images of AFGL 2688 (Egg Nebula) with the NASA/GSFC array camera (Contributed)  

NASA Astrophysics Data System (ADS)

High spatial resolution photometric imagery, obtained with the NASA/GSFC 10 μm array camera, of the infrared source at the center of AFGL 2688 (Egg Nebula) at 8.3, 9.8, 11.2 and 12.4 μm is used to investigate its morphology. The infrared source appears as a centrally peaked ellipsoid with its major axis parallel to the axis of the visible nebulosity. Maps of the distribution of color temperature and dust opacity in the source, as derived from the images, suggest no evidence for the dust toroid model, but instead indicate that the source consists of a central star surrounded by a thin dust shell.

Jaye, D.; Tresch-Fienberg, R.; Fazio, G. G.; Gezari, D. Y.; Lamb, G. M.; Shu, P. K.; Hoffmann, W. F.; McCreight, C.

362

Digital imaging microscopy: the marriage of spectroscopy and the solid state CCD camera  

NASA Astrophysics Data System (ADS)

Biological samples have been imaged using microscopes equipped with slow-scan CCD cameras. Examples are presented of studies based on the detection of light emission signals in the form of fluorescence and phosphorescence. They include applications in the field of cell biology: (a) replication and topology of mammalian cell nuclei; (b) cytogenetic analysis of human metaphase chromosomes; and (c) time-resolved measurements of DNA-binding dyes in cells and on isolated chromosomes, as well as of mammalian cell surface antigens, using the phosphorescence of acridine orange and fluorescence resonance energy transfer of labeled lectins, respectively.

Jovin, Thomas M.; Arndt-Jovin, Donna J.

1991-12-01

363

Camera Image Transformation and Registration for Safe Spacecraft Landing and Hazard Avoidance  

NASA Technical Reports Server (NTRS)

Inherent geographical hazards of Martian terrain may impede a safe landing for science exploration spacecraft. Surface visualization software for hazard detection and avoidance may accordingly be applied in vehicles such as the Mars Exploration Rover (MER) to induce an autonomous and intelligent descent upon entering the planetary atmosphere. The focus of this project is to develop an image transformation algorithm for coordinate system matching between consecutive frames of terrain imagery taken throughout descent. The methodology involves integrating computer vision and graphics techniques, including affine transformation and projective geometry of an object, with the intrinsic parameters governing spacecraft dynamic motion and camera calibration.
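A minimal sketch of the coordinate-mapping step described above, applying a 3x3 projective (homography) transformation to map pixel coordinates from one descent frame into the next; the matrix values are placeholders and this generic operation does not reproduce the project's actual registration algorithm.

import numpy as np

def apply_homography(H, points_xy):
    # Map 2D pixel coordinates through a 3x3 projective transformation H.
    pts = np.asarray(points_xy, dtype=np.float64)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])   # (x, y, 1)
    mapped = homogeneous @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                        # divide by w

# Placeholder homography relating two consecutive descent frames
H = np.array([[1.02, 0.01, 5.0],
              [-0.01, 1.02, -3.0],
              [1e-5, 2e-5, 1.0]])
corners = [(0, 0), (1023, 0), (1023, 1023), (0, 1023)]
print(apply_homography(H, corners))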

Jones, Brandon M.

2005-01-01

364

A 3D HIDAC-PET camera with sub-millimetre resolution for imaging small animals  

Microsoft Academic Search

A HIDAC-PET camera consisting essentially of 5 million 0.5 mm gas avalanching detectors has been constructed for small-animal imaging. The particular HIDAC advantage-a high 3D spatial resolution-has been improved to 0.95 mm fwhm and to 0.7 mm fwhm when reconstructing with 3D-OSEM methods incorporating resolution recovery. A depth-of-interaction resolution of 2.5 mm is implicit, due to the laminar construction. Scatter-corrected

A. P. Jeavons; R. A. Chandler; C. A. R. Dettmar

1999-01-01

365

DESIGN NOTE: Hybrid Zener-based VDN - a better choice for Cerenkov light imaging cameras  

NASA Astrophysics Data System (ADS)

A Zener-based voltage divider network has been used for the photomultiplier tubes (PMTs) employed in the multi-pixel camera of a Cherenkov imaging telescope. This hybrid network ensures a linear photomultiplier response at a chain current nearly an order of magnitude less than that required for a purely resistive voltage divider network of conventional design. The network has been validated for the Thorn-EMI photomultiplier tube 9135B and the results discussed here are also representative of other PMT makes.

Bhat, C. L.; Koul, R.; Kaul, S. R.; Kaul, S. K.; Kaul, I. K.; Goyal, H. C.

1996-04-01

366

Using different interpolation techniques in unwrapping the distorted images from panoramic annular lens camera  

NASA Astrophysics Data System (ADS)

The camera using a panoramic annular lens (PAL) can capture the surrounding scene over a 360° view without any scanning component. Due to severe distortions, the image formed by the PAL must be unwrapped into a perspective-view image in order to be consistent with normal human viewing. However, unfilled pixels can remain after unwrapping as a result of the non-uniform resolution of the PAL image, hence interpolation should be employed in the forward-projection unwrapping stage. We evaluated the performance of several interpolation techniques for unwrapping the PAL image on a series of frequency-patterned images, as a simulation, using three image quality indexes: MSE, SSIM and S-CIELAB. The experimental results revealed that all interpolation methods performed better on low-frequency PAL images. The Bicubic, Ferguson and Newton interpolations performed relatively better at higher frequencies, while the Bilinear and Bezier interpolations achieved better results at lower frequencies. The nearest-neighbor method had the poorest performance in general, while the Ferguson interpolation was excellent at both high and low frequencies.
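A minimal sketch of forward-projection unwrapping with bilinear interpolation: each pixel of the rectangular panorama is mapped back to polar coordinates in the annular PAL image and sampled from its four neighbours. The annulus center, radii and output size are illustrative placeholders, not the paper's imaging model.

import numpy as np

def unwrap_pal(annular_img, cx, cy, r_in, r_out, out_w, out_h):
    # Unwrap an annular PAL image into a rectangular panorama using
    # bilinear interpolation (illustrative geometry).
    pano = np.zeros((out_h, out_w), dtype=np.float64)
    h, w = annular_img.shape
    for v in range(out_h):
        r = r_in + (r_out - r_in) * v / (out_h - 1)
        for u in range(out_w):
            theta = 2.0 * np.pi * u / out_w
            x = cx + r * np.cos(theta)
            y = cy + r * np.sin(theta)
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            if 0 <= x0 < w - 1 and 0 <= y0 < h - 1:
                fx, fy = x - x0, y - y0
                # Bilinear blend of the four neighbouring source pixels
                pano[v, u] = (annular_img[y0, x0] * (1 - fx) * (1 - fy)
                              + annular_img[y0, x0 + 1] * fx * (1 - fy)
                              + annular_img[y0 + 1, x0] * (1 - fx) * fy
                              + annular_img[y0 + 1, x0 + 1] * fx * fy)
    return pano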

Yu, Guo; Fu, Lingjin; Bai, Jian

2010-11-01

367

Local image quality metric for a distributed smart camera network with overlapping FOVs  

Microsoft Academic Search

A set of camera selection templates, using simple rules based on a local (camera) level metric, are implemented for a twelve camera inward-looking distributed smart camera network. The local metric represents the quality of detection for a given camera node of the target-of-interest and is based on a measurable target parameter. To understand the effectiveness of the camera selections, an

Edward Shen; Richard Hornsey

2011-01-01

368

New Mars Camera's First Image of Mars from Mapping Orbit (Full Frame)  

NASA Technical Reports Server (NTRS)

The high resolution camera on NASA's Mars Reconnaissance Orbiter captured its first image of Mars in the mapping orbit, demonstrating the full resolution capability, on Sept. 29, 2006. The High Resolution Imaging Science Experiment (HiRISE) acquired this first image at 8:16 AM (Pacific Time). With the spacecraft at an altitude of 280 kilometers (174 miles), the image scale is 25 centimeters per pixel (10 inches per pixel). If a person were located on this part of Mars, he or she would just barely be visible in this image.

The image covers a small portion of the floor of Ius Chasma, one branch of the giant Valles Marineris system of canyons. The image illustrates a variety of processes that have shaped the Martian surface. There are bedrock exposures of layered materials, which could be sedimentary rocks deposited in water or from the air. Some of the bedrock has been faulted and folded, perhaps the result of large-scale forces in the crust or from a giant landslide. The image resolves rocks as small as 90 centimeters (3 feet) in diameter. It includes many dunes or ridges of windblown sand.

This image (TRA_000823_1720) was taken by the High Resolution Imaging Science Experiment camera onboard the Mars Reconnaissance Orbiter spacecraft on Sept. 29, 2006. Shown here is the full image, centered at minus 7.8 degrees latitude, 279.5 degrees east longitude. The image is oriented such that north is to the top. The range to the target site was 297 kilometers (185.6 miles). At this distance the image scale is 25 centimeters (10 inches) per pixel (with one-by-one binning) so objects about 75 centimeters (30 inches) across are resolved. The image was taken at a local Mars time of 3:30 PM and the scene is illuminated from the west with a solar incidence angle of 59.7 degrees, thus the sun was about 30.3 degrees above the horizon. The season on Mars is northern winter, southern summer.

[Photojournal note: Due to the large sizes of the high-resolution TIFF and JPEG files, some systems may experience extremely slow downlink time while viewing or downloading these images; some systems may be incapable of handling the download entirely.]

NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The HiRISE camera was built by Ball Aerospace & Technologies Corporation, Boulder, Colo., and is operated by the University of Arizona, Tucson.

2006-01-01

369

Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects  

PubMed Central

3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue “Lamassu”. Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task, where a dense ordered imaging network of 45 high resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented, and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm.

Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

2014-01-01

370

Minimal camera networks for 3D image based modeling of cultural heritage objects.  

PubMed

3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue "Lamassu". Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883-859 BC). Close-range photogrammetry is used for the 3D modeling task, where a dense ordered imaging network of 45 high resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented, and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718

Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

2014-01-01

371

Efficient Smart CMOS Camera Based on FPGAs Oriented to Embedded Image Processing  

PubMed Central

This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high-speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families makes it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal PowerPC (PPC) microprocessor. This, in turn, runs a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely.

Bravo, Ignacio; Balinas, Javier; Gardel, Alfredo; Lazaro, Jose L.; Espinosa, Felipe; Garcia, Jorge

2011-01-01

372

Efficient smart CMOS camera based on FPGAs oriented to embedded image processing.  

PubMed

This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high-speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families makes it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal PowerPC (PPC) microprocessor. This, in turn, runs a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID:22163739

Bravo, Ignacio; Baliñas, Javier; Gardel, Alfredo; Lázaro, José L; Espinosa, Felipe; García, Jorge

2011-01-01

373

High-resolution imaging of the Pluto-Charon system with the Faint Object Camera of the Hubble Space Telescope  

NASA Technical Reports Server (NTRS)

Images of the Pluto-Charon system were obtained with the Faint Object Camera (FOC) of the Hubble Space Telescope (HST) after the refurbishment of the telescope. The images are of superb quality, allowing the determination of radii, fluxes, and albedos. Attempts were made to improve the resolution of the already diffraction limited images by image restoration. These yielded indications of surface albedo distributions qualitatively consistent with models derived from observations of Pluto-Charon mutual eclipses.

Albrecht, R.; Barbieri, C.; Adorf, H.-M.; Corrain, G.; Gemmo, A.; Greenfield, P.; Hainaut, O.; Hook, R. N.; Tholen, D. J.; Blades, J. C.

1994-01-01

374

A camera for imaging hard x-rays from suprathermal electrons during lower hybrid current drive on PBX-M  

SciTech Connect

During lower hybrid current drive (LHCD), suprathermal electrons are generated that emit hard X-ray bremsstrahlung. A pinhole camera has been installed on the PBX-M tokamak that records 128 × 128 pixel images of the bremsstrahlung with a 3 ms time resolution. This camera has identified hollow radiation profiles on PBX-M, indicating off-axis current drive. The detector is a 9 in. dia. intensifier. A detailed account of the construction of the Hard X-ray Camera, its operation, and its performance is given.

von Goeler, S.; Kaita, R.; Bernabei, S.; Davis, W.; Fishman, H.; Gettelfinger, G.; Ignat, D.; Roney, P.; Stevens, J.; Stodiek, W. [Princeton Univ., NJ (United States). Plasma Physics Lab.; Jones, S.; Paoletti, F. [Massachusetts Inst. of Tech., Cambridge, MA (United States). Plasma Fusion Center; Petravich, G. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics; Rimini, F. [JET Joint Undertaking, Abingdon (United Kingdom)

1993-05-01

376

Nondestructive Testing Technique for Defect Detection in LiNbO3 Using Infrared Thermal Imaging Camera  

Microsoft Academic Search

The existence of defects in lithium niobate (LiNbO3) degrades the specific properties required of LiNbO3 as a substrate material, which is a major problem inhibiting device construction. For the purpose of providing a convenient and quick tool for use in a nondestructive testing technique for defect detection, the feasibility of using an infrared thermal imaging camera (IR camera) is examined by

Kunyong Lee; Hisashi Miyazaki; Yoichi Okamoto; Jun Morimoto; Kohji Toda

2010-01-01

377

Wall temperature measurements using a thermal imaging camera with temperature-dependent emissivity corrections  

NASA Astrophysics Data System (ADS)

A methodology is presented whereby the relationship between temperature and emissivity for fused quartz has been used to correct the temperature values of a quartz impingement plate detected by an SC3000 thermal imaging camera. The methodology is iterative: an initial temperature (obtained by assuming a constant emissivity) is used to find the emissivity values, which are then entered into the thermal imaging software and used to find updated temperatures; these in turn yield new emissivities, and so on until the solution converges. This method is applied to a quartz impingement plate that has been heated under various flame conditions, and the results are compared. Radiation losses from the plate are also calculated, and it is shown that even a slight change in temperature greatly affects the radiation loss. It is a general methodology that can be used for any wall material whose emissivity is a function of temperature.
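The iteration described above reduces to a short fixed-point loop. The sketch below assumes two hypothetical helper functions, emissivity_of(T) for the fused-quartz emissivity relation and temperature_from(radiance, eps) for the camera's radiometric inversion; neither is specified in the abstract.

```python
# Minimal sketch of the iterative emissivity correction, assuming hypothetical
# helpers: emissivity_of(T) for the fused-quartz emissivity relation and
# temperature_from(radiance, eps) for the camera's radiometric inversion.
import numpy as np

def correct_temperature(radiance, emissivity_of, temperature_from,
                        eps0=0.9, tol=0.01, max_iter=50):
    """Fixed-point loop: temperature -> emissivity -> temperature, until converged."""
    T = temperature_from(radiance, eps0)         # start from a constant emissivity
    for _ in range(max_iter):
        eps = emissivity_of(T)                   # emissivity at the current temperature
        T_new = temperature_from(radiance, eps)  # corrected temperature
        if np.max(np.abs(T_new - T)) < tol:      # converged to within `tol`
            return T_new
        T = T_new
    return T
```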

McDaid, Chloe; Zhang, Yang

2011-12-01

378

Estimation of Enterococci Input from Bathers and Animals on A Recreational Beach Using Camera Images  

PubMed Central

Enterococci are used nationwide as a water quality indicator for marine recreational beaches. Prior research has demonstrated that enterococci inputs to the study beach site (located in Miami, FL) are dominated by non-point sources (including humans and animals). We have estimated their respective source functions by developing a counting methodology for individuals in order to better understand their non-point source load impacts. The method utilizes camera images of the beach taken at regular time intervals to determine the number of people and animal visitors. The developed method translates raw image counts for weekdays and weekend days into daily and monthly visitation rates. Enterococci source functions were computed from the observed number of unique individuals for average days of each month of the year, and from average load contributions for humans and for animals. Results indicate that dogs represent a larger source of enterococci than humans and birds.
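A rough sketch of how image-derived visitor counts can be turned into a daily source function is given below. The per-visit shedding loads and the weekday/weekend weighting are placeholders for illustration, not values from the study.

```python
# Minimal sketch of converting image-derived visitor counts into a daily
# enterococci source function. The shedding loads per visit and the
# weekday/weekend weighting are placeholders, not values from the study.
ENT_PER_PERSON = 6.0e5   # hypothetical CFU contributed per human visit
ENT_PER_DOG = 2.5e9      # hypothetical CFU contributed per dog visit

def daily_source_function(weekday_counts, weekend_counts,
                          n_weekdays=22, n_weekend_days=8):
    """Average daily enterococci load for one month from image counts.

    weekday_counts / weekend_counts: dicts like {"people": 120, "dogs": 6}
    giving the average number of unique visitors per day of that type.
    """
    n_days = n_weekdays + n_weekend_days
    people = (weekday_counts["people"] * n_weekdays +
              weekend_counts["people"] * n_weekend_days) / n_days
    dogs = (weekday_counts["dogs"] * n_weekdays +
            weekend_counts["dogs"] * n_weekend_days) / n_days
    return people * ENT_PER_PERSON + dogs * ENT_PER_DOG
```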

Wang, John D.; Solo-Gabriele, Helena M.; Abdelzaher, Amir M.; Fleming, Lora E.

2010-01-01

379

High resolution Vesta High Altitude Mapping Orbit (HAMO) Atlas derived from Dawn framing camera images  

NASA Astrophysics Data System (ADS)

The Dawn framing camera (FC) acquired about 2500 clear-filter images of Vesta with a resolution of about 70 m/pixel during the High Altitude Mapping Orbit (HAMO) in fall 2011. We ortho-rectified these images and produced a global high-resolution controlled mosaic of Vesta. This global mosaic is the baseline for a high-resolution Vesta atlas that consists of 15 tiles mapped at a scale of 1:500,000. The nomenclature used in this atlas was proposed by the Dawn team and was approved by the International Astronomical Union (IAU). The whole atlas is available to the public through the Dawn GIS web page [http://dawn_gis.dlr.de/atlas].

Roatsch, Th.; Kersten, E.; Matz, K.-D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2012-12-01

380

A JPEG-like algorithm for compression of single-sensor camera image  

NASA Astrophysics Data System (ADS)

This paper presents a JPEG-like coder for image compression of single-sensor camera images using a Bayer Color Filter Array (CFA). The originality of the method is a joint compression/demosaicking scheme in the DCT domain. In this method, the captured CFA raw data is first separated into four distinct components and then converted to YCbCr. A JPEG compression scheme is then applied. At the decoding level, the bitstream is decompressed until the DCT coefficients are reached; the latter are used for the interpolation stage. The obtained results are better than those obtained by conventional JPEG in terms of CPSNR, ΔE2000 and SSIM. The obtained JPEG-like scheme is also less complex.
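The first stage of such a scheme, separating the CFA raw data into four components, can be sketched as below. An RGGB Bayer layout is assumed (the actual filter arrangement may differ); the YCbCr conversion and DCT coding that follow are standard JPEG steps and are omitted.

```python
# Minimal sketch of the first stage: splitting Bayer CFA raw data into four
# colour planes prior to DCT-domain coding. An RGGB layout is assumed; the
# YCbCr conversion and JPEG coding that follow are omitted.
import numpy as np

def split_cfa(raw):
    """Return the four sub-images (R, G1, G2, B) of an RGGB Bayer mosaic."""
    r = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b = raw[1::2, 1::2]
    return r, g1, g2, b
```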

Benahmed Daho, Omar; Larabi, Mohamed-Chaker; Mukhopadhyay, Jayanta

2011-01-01

381

Improved Digitization of Lunar Mare Ridges with LROC Derived Products  

NASA Astrophysics Data System (ADS)

Lunar wrinkle ridges (mare ridges) are positive-relief structures formed from compressional stress in basin-filling flood basalt deposits [1]. Previous workers have measured wrinkle ridge orientations and lengths to investigate their spatial distribution and infer basin-localized stress fields [2,3]. Although these plots include the most prominent mare ridges and their general trends, they may not have fully captured all of the ridges, particularly the smaller-scale ridges. Using Lunar Reconnaissance Orbiter Wide Angle Camera (WAC) global mosaics and derived topography (100 m pixel scale) [4], we systematically remapped wrinkle ridges in Mare Serenitatis. By comparing two WAC mosaics with different lighting geometry, and shaded relief maps made from a WAC digital elevation model (DEM) [5], we observed that some ridge segments and some smaller ridges are not visible in previous structure maps [2,3]. In the past, mapping efforts were limited by a fixed Sun direction [6,7]. For systematic mapping we created three shaded relief maps from the WAC DEM with solar azimuth angles of 0°, 45°, and 90°, and a fourth map was created by combining the three shaded reliefs into one, using a simple averaging scheme. Along with the original WAC mosaic and the WAC DEM, these four datasets were imported into ArcGIS, and the mare ridges of Imbrium, Serenitatis, and Tranquillitatis were digitized from each of the six maps. Since the mare ridges are often divided into many ridge segments [8], each major component was digitized separately, as opposed to the ridge as a whole. This strategy enhanced our ability to analyze the lengths, orientations, and abundances of these ridges. After the initial mapping was completed, the six products were viewed together to identify and resolve discrepancies in order to produce a final wrinkle ridge map. Comparing this new mare ridge map with past lunar tectonic maps, we found that many mare ridges were not recorded in the previous works. It was noted that, in some cases, the lengths and orientations of previously digitized ridges differ from those of the ridges digitized in this study. This method of multi-map digitizing allows for greater accuracy in the spatial characterization of mare ridges than previous methods. We intend to map mare ridges on a global scale, creating a more comprehensive ridge map from these higher-resolution data. References Cited: [1] Schultz P.H. (1976) Moon Morphology, 308. [2] Wilhelms D.E. (1987) USGS Prof. Paper 1348, 5A-B. [3] Carr, M.H. (1966) USGS Geologic Atlas of the Moon, I-498. [4] Robinson M.S. (2010) Space Sci. Rev., 150:82. [5] Scholten F. et al. (2011) LPSC XLII, 2046. [6] Fielder G. and Kiang T. (1962) The Observatory, No. 926, 8. [7] Watters T.R. and Konopliv A.S. (2001) Planetary and Space Sci. 49, 743-748. [8] Aubele J.C. (1988) LPSC XIX, 19.
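The multi-azimuth shaded-relief step lends itself to a short sketch. The code below computes a simple Lambertian hillshade of a DEM at azimuths of 0°, 45°, and 90° and averages them, mirroring the combination scheme described above; the solar elevation angle, pixel size, and aspect convention are illustrative assumptions.

```python
# Minimal sketch of the multi-azimuth shaded-relief step: hillshades of the
# DEM at azimuths of 0, 45 and 90 degrees, combined by simple averaging.
# The solar elevation angle, pixel size and aspect convention are
# illustrative assumptions.
import numpy as np

def hillshade(dem, azimuth_deg, altitude_deg=30.0, pixel_size=100.0):
    """Simple Lambertian hillshade of a DEM; returns values in [0, 1]."""
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, pixel_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shade = (np.sin(alt) * np.cos(slope) +
             np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade, 0.0, 1.0)

def combined_shade(dem):
    """Average of the three shaded reliefs used for ridge digitizing."""
    return np.mean([hillshade(dem, az) for az in (0.0, 45.0, 90.0)], axis=0)
```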

Crowell, J. M.; Robinson, M. S.; Watters, T. R.; Bowman-Cisneros, E.; Enns, A. C.; Lawrence, S.

2011-12-01

382

Comparison of brain imaging with gamma camera, single-photon emission computed tomography, and transmission computed tomography: concise communication  

Microsoft Academic Search

The utility of a single-photon axial tomographic scanner (ECT) in brain imaging, using routine tracers in 238 patients, has been evaluated. When compared with routine delayed gamma-camera images (DGCI), there was agreement in 191 negative studies and 39 positive studies. Four patients had positive DGCI and negative ECT studies, and four had positive ECT and negative DGCI. In the 102

N. E. Jr. Watson; R. J. Cowan; M. R. Ball; D. M. Moody; D. W. Laster; C. D. Maynard

1980-01-01

383

Study of the Ionic and Neutral Species in the Coma of Comet Halley with an Image Intensifier Camera.  

National Technical Information Service (NTIS)

Imaging Fabry-Perot spectroscopy in the lines of hydrogen Balmer alpha and C2 and International Halley Watch (IHW) filter photography on Comet Halley carried out with a small telescope and an image intensifier camera are described. An event seen in H2O(+)...

T. Chandrasekhar; C. D. Prasad; J. N. Desai; N. M. Ashok; R. Gupta

1987-01-01

384

Image Registration: Color Image Synthesis from Data Obtained from Satellites Pushbroom Cameras  

Microsoft Academic Search

The problem of image registration is discussed. Firstly, the well-known Reddy-Chatterji algorithm is revised. It is shown that the speed may be dramatically increased by using the Hartley transform instead of the Fourier transform, especially when registration is required for more than two images. Secondly, a new registration algorithm is proposed for images obtained from pushbroom cameras that is
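For context, the translation-recovery core of FFT-based registration of this kind can be sketched with phase correlation, as below. The log-polar handling of rotation and scale and the Hartley-transform speed-up discussed above are omitted, and NumPy's complex FFT is used purely for illustration.

```python
# Minimal sketch of the translation-recovery core of FFT-based registration:
# phase correlation between two images. Log-polar handling of rotation and
# scale, and the Hartley-transform variant, are omitted.
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Return the integer (row, col) shift of img_b relative to img_a."""
    A = np.fft.fft2(img_a)
    B = np.fft.fft2(img_b)
    cross = np.conj(A) * B
    cross /= np.maximum(np.abs(cross), 1e-12)   # keep phase only
    corr = np.fft.ifft2(cross).real
    rows, cols = corr.shape
    dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped peak coordinates to signed shifts.
    if dr > rows // 2:
        dr -= rows
    if dc > cols // 2:
        dc -= cols
    return dr, dc
```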

Dmitry V. Yurin

385

Applications and Imaging Techniques of a Si/CdTe Compton Gamma-Ray Camera  

NASA Astrophysics Data System (ADS)

By using a new Compton camera consisting of a silicon double-sided strip detector (Si-DSD) and a CdTe double-sided strip detector (CdTe-DSD), originally developed for the ASTRO-H satellite mission, an experiment involving imaging radioisotopes was conducted to study its feasibility for hotspot monitoring. In addition to the hotspot imaging already provided by commercial imaging systems, identification of various radioisotopes is possible thanks to the good energy resolution obtained by the semiconductor detectors. Three radioisotopes, 133Ba (356 keV), 22Na (511 keV) and 137Cs (662 keV), were individually imaged by applying event selection in the energy window, and the gamma-ray images were correctly overlaid on an optical picture. A detection efficiency of 1.68×10⁻⁴ (effective area: 1.7×10⁻³ cm²) and an angular resolution of 3.8 degrees were obtained by stacking five detector modules for a 662 keV gamma ray. The higher detection efficiency required in specific uses can be achieved by stacking more detector modules.
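A minimal sketch of the energy-window event selection used to separate the three radioisotopes is shown below; the window half-width is an illustrative value, not the one used in the experiment.

```python
# Minimal sketch of the energy-window event selection used to image each
# radioisotope separately. The 15 keV half-width is illustrative, not the
# value used in the experiment.
import numpy as np

LINES_KEV = {"Ba-133": 356.0, "Na-22": 511.0, "Cs-137": 662.0}

def select_events(event_energies_kev, isotope, half_width_kev=15.0):
    """Boolean mask of events whose reconstructed energy falls in the window."""
    e0 = LINES_KEV[isotope]
    e = np.asarray(event_energies_kev, dtype=float)
    return np.abs(e - e0) <= half_width_kev
```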

Takeda, Shin'ichiro; Ichinohe, Yuto; Hagino, Kouichi; Odaka, Hirokazu; Yuasa, Takayuki; Ishikawa, Shin-nosuke; Fukuyama, Taro; Saito, Shinya; Sato, Tamotsu; Sato, Goro; Watanabe, Shin; Kokubun, Motohide; Takahashi, Tadayuki; Yamaguchi, Mitsutaka; Tajima, Hiroyasu; Tanaka, Takaaki; Nakazawa, Kazuhiro; Fukazawa, Yasushi; Nakano, Takashi

386

Camera Calibration.  

National Technical Information Service (NTIS)

Given an image that has been scaled both horizontally and vertically (possibly with different scale factors in the two directions), we determine the camera position and orientation, as well as the scale factors for sampling in the u and v axes of the imag...

I. Rigoutsos C. M. Brown

1986-01-01

387

A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.  

PubMed

The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and a multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. Images observed include those of a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to a large quantity of antiques stored in a museum. PMID:23112656

Rau, Jiann-Yeou; Yeh, Po-Chia

2012-01-01

388

Automated co-registration of images from multiple bands of Liss-4 camera  

NASA Astrophysics Data System (ADS)

Three multi-spectral bands of the Liss-4 camera of the IRS-P6 satellite are physically separated in the focal plane in the along-track direction. The time separation of 2.1 s between the acquisition of the first and last bands causes scan lines acquired by different bands to lie along different lines on the ground which are not parallel. Therefore, the raw images of the multi-spectral bands need to be registered prior to any simple application like data visualization. This paper describes a method for co-registration of the multiple bands of the Liss-4 camera through photogrammetric means using the collinearity equations. A trajectory fit using the given ephemeris and attitude data, followed by direct georeferencing, is employed in this model. It is also augmented with a public domain DEM for the terrain-dependent input to the model. Finer offsets remaining after the application of this parametric technique are addressed by matching a small subsection of the bands (100×100 pixels) using an image-based method. Resampling is done by going back to the original raw data when creating the product, after refining the image coordinates with the offsets. Two types of aligned products are defined in this paper and their operational flow is described. Datasets covering different types of terrain and also viewed with different geometries are studied with an extensive number of points. The band-to-band registration (BBR) accuracies are reported. The algorithm described in this paper for co-registration of the Liss-4 bands is an integral part of the software package Value Added Products generation System (VAPS) for operational generation of IRS-P6 data products.
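The final refinement step, matching a small subsection of the bands to estimate residual offsets, can be sketched as below. The zero-mean cross-correlation via scipy is an illustrative stand-in for the image-based method actually used in VAPS.

```python
# Minimal sketch of the residual-offset refinement: zero-mean cross-correlation
# of a small patch of one band against the corresponding patch of another.
# Using scipy's FFT-based convolution here is an illustrative choice, not the
# operational implementation in VAPS.
import numpy as np
from scipy.signal import fftconvolve

def residual_offset(ref_patch, tgt_patch):
    """Return the integer (row, col) shift to apply to tgt_patch to align it
    with ref_patch (patches are e.g. 100 x 100 pixel subsections)."""
    a = ref_patch - ref_patch.mean()
    b = tgt_patch - tgt_patch.mean()
    corr = fftconvolve(a, b[::-1, ::-1], mode="same")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    return tuple(np.array(peak) - center)
```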

Radhadevi, P. V.; Solanki, S. S.; Jyothi, M. V.; Nagasubramanian, V.; Varadan, Geeta

389

Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor  

NASA Astrophysics Data System (ADS)

We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera with fast auto-exposure control and colour filter array (CFA) demosaicing. The proposed hardware architecture includes the design of charge-coupled device (CCD) drive circuits, image processing circuits, and power supply circuits. The CCD drive circuits convert the TTL (transistor-transistor logic) level timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor can output analog image signals. The image processing circuits convert the analog signals to digital signals for subsequent processing; the TTL timing, auto-exposure control, CFA demosaicing, and gamma correction are accomplished in this module. The power supply circuits provide power for the whole system, which is very important for image quality: power noise affects image quality directly, and we reduce it by hardware means, which is very effective. In this system, the CCD is a KAI-0340, which can output 210 full-resolution frames per second, and our camera works well in this mode. Because traditional auto-exposure control algorithms reach a proper exposure level slowly, it is necessary to develop a fast auto-exposure control method; we present a new auto-exposure algorithm suited to high-speed cameras. Color demosaicing is critical for digital cameras because it converts the Bayer sensor mosaic output into a full color image, which determines the output image quality of the camera. Complex algorithms can achieve high quality but cannot be implemented in hardware; a low-complexity demosaicing method is presented which can be implemented in hardware and satisfies the quality requirements. Experimental results are given at the end of the paper.
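As a rough illustration of the kind of control loop involved, the sketch below drives the mean image brightness toward a target by scaling the exposure time proportionally. This is a generic scheme, not the specific fast auto-exposure algorithm developed in the paper; the target level, gain, and limits are placeholders.

```python
# Minimal sketch of a proportional auto-exposure control loop: drive the mean
# image brightness toward a target by scaling the exposure time. This is a
# generic illustration, not the specific fast algorithm developed in the paper;
# target level, gain and limits are placeholders.
import numpy as np

def update_exposure(frame, exposure_us, target_mean=128.0,
                    gain=0.8, min_us=10.0, max_us=4000.0):
    """Return the exposure time (microseconds) to use for the next frame."""
    mean = float(np.mean(frame))
    if mean <= 0.0:
        return max_us                          # fully dark frame: open up
    # Scale exposure by the brightness error, damped by `gain` to avoid
    # oscillation at high frame rates.
    ratio = 1.0 + gain * (target_mean / mean - 1.0)
    return float(np.clip(exposure_us * ratio, min_us, max_us))
```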

Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

2013-08-01

390

Hubble Space Telescope Wide Field Camera imaging of the gravitational lens 2237 + 0305  

NASA Technical Reports Server (NTRS)

Images of the gravitational lens system 2237 + 0305, taken with the HST Wide Field Camera, are analyzed. Positions for the four quasar images, accurate to ±0.015 arcsec, and relative magnitudes in U and R, accurate to ±0.06 and ±0.04 mag, respectively, are determined. The upper limits on the observed brightness of the fifth image are found to be less than or approximately equal to 7 percent of the brightest quasar image. The mass of the lens inside 0.9 arcsec is found to be (1.08 ± 0.02) × 10¹⁰ solar masses/h100, corresponding to a mass-to-light ratio in B of 12.3 h100. This solar mass/solar luminosity estimate agrees with values obtained from stellar dynamics for other elliptical galaxies. A comparison of predictions from this mass model with the measured central velocity dispersion yields a distance-independent agreement to within 10 percent, assuming isotropic velocity dispersions.

Rix, Hans-Walter; Schneider, Donald P.; Bahcall, John N.

1992-01-01

391

Real-time image processing and fusion for a new high-speed dual-band infrared camera  

Microsoft Academic Search

A dual-band infrared camera system based on a dual-band quantum well infrared photodetector (QWIP) has been developed for acquiring images from both the mid-wavelength (MWIR) and long-wavelength (LWIR) infrared spectral bands. The system delivers exactly pixel-registered, simultaneously acquired images. It has the advantage that appropriate signal and image processing permits the exploitation of differences in the characteristics of those bands. Thus,

Markus Müller; Oliver Schreer; Monica López Sáenz

2007-01-01

392

Limb imaging of the Venus O2 visible nightglow with the Venus Monitoring Camera  

NASA Astrophysics Data System (ADS)

We investigated the Venus O2 visible nightglow with imagery from the Venus Monitoring Camera on Venus Express. Drawing from data collected between April 2007 and January 2011, we study the global distribution of this emission, discovered in the late 1970s by the Venera 9 and 10 missions. The inferred limb-viewing intensities are on the order of 150 kR at the lower latitudes and seem to drop somewhat toward the poles. The emission is generally stable, although there are episodes when the intensities rise up to 500 kR. We compare a set of Venus Monitoring Camera observations with coincident measurements of the O2 nightglow at 1.27 µm made with the Visible and Infrared Thermal Imaging Spectrometer, also on Venus Express. From the evidence gathered in this and past works, we suggest a direct correlation between the instantaneous emissions from the two O2 nightglow systems. Possible implications regarding the uncertain origin of the atomic oxygen green line at 557.7 nm are noted.

García Muñoz, A.; Hueso, R.; Sánchez-Lavega, A.; Markiewicz, W. J.; Titov, D. V.; Witasse, O.; Opitz, A.

2013-06-01

393

ANTS — a simulation package for secondary scintillation Anger-camera type detector in thermal neutron imaging  

NASA Astrophysics Data System (ADS)

A custom and fully interactive simulation package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations) has been developed to optimize the design and operation conditions of secondary scintillation Anger-camera type gaseous detectors for thermal neutron imaging. The simulation code accounts for all physical processes related to the neutron capture, energy deposition pattern, drift of electrons of the primary ionization and secondary scintillation. The photons are traced considering the wavelength-resolved refraction and transmission of the output window. Photo-detection accounts for the wavelength-resolved quantum efficiency, angular response, area sensitivity, gain and single-photoelectron spectra of the photomultipliers (PMTs). The package allows for several geometrical shapes of the PMT photocathode (round, hexagonal and square) and offers a flexible PMT array configuration: up to 100 PMTs in a custom arrangement with the square or hexagonal packing. Several read-out patterns of the PMT array are implemented. Reconstruction of the neutron capture position (projection on the plane of the light emission) is performed using the center of gravity, maximum likelihood or weighted least squares algorithm. Simulation results reproduce well the preliminary results obtained with a small-scale detector prototype. ANTS executables can be downloaded from http://coimbra.lip.pt/~andrei/.
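Of the three reconstruction options mentioned above, the centre-of-gravity estimate is simple enough to sketch directly; the PMT coordinates and signal amplitudes in the example are illustrative only.

```python
# Minimal sketch of the simplest reconstruction option mentioned above:
# centre-of-gravity estimation of the light-emission position from the
# integrated PMT signals. Coordinates and amplitudes below are illustrative.
import numpy as np

def center_of_gravity(pmt_xy, pmt_signals):
    """pmt_xy: (N, 2) PMT centre coordinates; pmt_signals: (N,) amplitudes."""
    w = np.asarray(pmt_signals, dtype=float)
    xy = np.asarray(pmt_xy, dtype=float)
    return (w[:, None] * xy).sum(axis=0) / w.sum()

# Toy event seen by three PMTs
print(center_of_gravity([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0]], [10.0, 5.0, 5.0]))
```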

Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Van Esch, P.; Zeitelhack, K.

2012-08-01

394

Studies of Defect Detection and Thermal Influence in Semi-Insulating 6H-SiC Substrates Using a Long-Wavelength Infrared Thermal Imaging Camera  

Microsoft Academic Search

A long-wavelength infrared thermal imaging camera (IR camera) was applied to visually evaluate the thermal influence of defects in semi-insulating 6H-SiC substrates. Defects in substrates were rapidly and effectively detected by IR camera observation, and the dependence of the temperature on the defect size could be observed precisely. We have applied an IR camera to show clearly, for the first

Passapong Wutimakun; Kunyong Lee; Hisashi Miyazaki; Jun Morimoto

2010-01-01

395

2 MHz modulated photothermal imaging using a CCD camera coupled to a multichannel lock-in detection  

NASA Astrophysics Data System (ADS)

We have developed a new photothermal microscope with which we can make a thermal image of an electronic component working at a "high frequency" (2 MHz) using a CCD camera and a multichannel lock-in scheme. To do this, we have created an electronic "stroboscope" which controls the device and the LED that illuminates it, and analyzes the signal reflected back to the camera, whose amplitude is proportional to the variation of reflectivity, and hence to the variation of temperature at the surface of the device. A thermal image of resistors will be presented.
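The multichannel (per-pixel) lock-in demodulation step can be sketched as below for a stack of frames acquired at evenly spaced phases of the excitation; the 2 MHz stroboscopic timing itself is realised in hardware and is not modelled here.

```python
# Minimal sketch of per-pixel (multichannel) lock-in demodulation of a stack
# of frames acquired at evenly spaced phases of the excitation.
import numpy as np

def lockin_demodulate(frames):
    """frames: (N, H, W) stack sampled at N evenly spaced phases per period.

    Returns (amplitude, phase) images of the modulated reflectivity signal.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    phases = 2.0 * np.pi * np.arange(n) / n
    in_phase = np.tensordot(np.cos(phases), frames, axes=1) * 2.0 / n
    quadrature = np.tensordot(np.sin(phases), frames, axes=1) * 2.0 / n
    return np.hypot(in_phase, quadrature), np.arctan2(quadrature, in_phase)
```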

Grauby, S.; Forget, B. C.; Fournier, D.

1999-03-01

396

Comparison of X-ray pinhole camera images with calculations based on lower-hybrid wave physics  

SciTech Connect

An x-ray pinhole camera helps the PBX-M experiment diagnose the location of lower hybrid current drive, now being used to demonstrate deliberate modification of the current profile to improve plasma stability and beta. One computational model supporting the experiment is the Lower hybrid Simulation Code (LSC), which can predict the image in the camera by tracing multiple rays in general beam-shaped plasmas and computing the quasilinear damping of the waves on electrons. We present experimental and computational images, and discuss the differences and similarities.

Ignat, D.W.; Valeo, E.J.; von Goeler, S. (Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States))

1994-10-15

397

Space-bandwidth extension in parallel phase-shifting digital holography using a four-channel polarization-imaging camera.  

PubMed

We propose a method for extending the space bandwidth (SBW) available for recording an object wave in parallel phase-shifting digital holography using a four-channel polarization-imaging camera. A linear spatial carrier of the reference wave is introduced to an optical setup of parallel four-step phase-shifting interferometry using a commercially available polarization-imaging camera that has four polarization-detection channels. Then a hologram required for parallel two-step phase shifting, which is a technique capable of recording the widest SBW in parallel phase shifting, can be obtained. The effectiveness of the proposed method was numerically and experimentally verified. PMID:23939081
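As a rough sketch of the data handling implied by a four-channel polarization-imaging camera, the code below splits a raw frame into its four polarization sub-images, assuming the common 2×2 superpixel layout; the actual layout depends on the camera, and the carrier-based two-step phase-shifting reconstruction itself is not shown.

```python
# Minimal sketch of splitting a raw frame from a four-channel
# polarization-imaging camera into its polarization sub-images, assuming a
# 2 x 2 superpixel layout (0, 45, 90, 135 degree analysers); the actual
# layout depends on the camera.
import numpy as np

def split_polarization_channels(frame):
    """Return the four sub-images of a 2x2 polarization mosaic."""
    return {
        "0": frame[0::2, 0::2],
        "45": frame[0::2, 1::2],
        "90": frame[1::2, 1::2],
        "135": frame[1::2, 0::2],
    }
```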

Tahara, Tatsuki; Ito, Yasunori; Xia, Peng; Awatsuji, Yasuhiro; Nishio, Kenzo; Ura, Shogo; Kubota, Toshihiro; Matoba, Osamu

2013-07-15

398

Mars Orbiter Camera Acquires High Resolution Stereoscopic Images of the Viking One Landing Site  

NASA Technical Reports Server (NTRS)

Two MOC images of the vicinity of the Viking Lander 1 (MOC 23503 and 25403), acquired separately on 12 April 1998 at 08:32 PDT and 21 April 1998 at 13:54 PDT (respectively), are combined here in a stereoscopic anaglyph. The more recent, slightly better quality image is in the red channel, while the earlier image is shown in the blue and green channels. Only the overlap portion of the images is included in the composite.
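The anaglyph composition described here is simple to reproduce: the more recent image goes into the red channel and the earlier image into the green and blue channels. The sketch below assumes two co-registered 8-bit grayscale arrays of equal size.

```python
# Minimal sketch of the anaglyph composition: the more recent image in the
# red channel, the earlier image in the green and blue channels. Inputs are
# assumed to be co-registered 8-bit grayscale arrays of equal size.
import numpy as np

def make_anaglyph(recent, earlier):
    """Stack two grayscale images into an RGB red/cyan anaglyph."""
    recent = np.asarray(recent, dtype=np.uint8)
    earlier = np.asarray(earlier, dtype=np.uint8)
    return np.dstack([recent, earlier, earlier])  # R, G, B
```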

Image 23503 was taken at a viewing angle of 31.6° from vertical; 25403 was taken at an angle of 22.4°, for a difference of 9.4°. Although this is not as large a difference as is typically used in stereo mapping, it is sufficient to provide some indication of relief, at least in locations of high relief.

The image shows the raised rims and deep interiors of the larger impact craters in the area (the largest crater is about 650 m/2100 feet across). It shows that the relief on the ridges is very subtle, and that, in general, the Viking landing site is very flat. This result is, of course, expected: the VL-1 site was chosen specifically because it was likely to have low to very low slopes that represented potential hazards to the spacecraft.

Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

1998-01-01

399

A Powerful New Imager for HST: Performance and Early Science Results from Wide Field Camera 3  

NASA Technical Reports Server (NTRS)

Wide Field Camera 3 (WFC3) was installed into the Hubble Space Telescope during the highly successful Servicing Mission 4 in May, 2009. WFC3 offers sensitive, high resolution imaging over a broad wavelength range from the near UV through the visible to the near IR (200nm - 1700nm). Its capabilities in the near UV and near IR ends of that range represent particularly large advances vs. those of previous HST instruments. In this talk, I will review the purpose and design of the instrument, describe its performance in flight, and highlight some of the initial scientific results from the instrument, including its use in deep infrared surveys in search of galaxies at very high redshift, in investigations of the global processes of star formation in nearby galaxies, and in the study of the recent impact on Jupiter.

Kimble, Randy

2009-01-01

400

Quantitative microwave imaging with a 2.45-GHz planar microwave camera.  

PubMed

This paper presents microwave tomographic reconstructions of the complex permittivity of lossy dielectric objects immersed in water from experimental multiview near-field data obtained with a 2.45-GHz planar active microwave camera. An iterative reconstruction algorithm based on the Levenberg-Marquardt method was used to solve the nonlinear matrix equation which results when applying a moment method to the electric field integral representation. The effects of uncertainties in experimental parameters such as the exterior medium complex permittivity, the imaging system geometry and the incident field at the object location are illustrated by means of reconstructions from synthetic data. It appears that the uncertainties in the incident field have the strongest impact on the reconstructions. A receiver calibration procedure has been implemented, and some ways of obtaining the incident field at the object location have been assessed. PMID:9845311
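A single iteration of a Levenberg-Marquardt reconstruction of this kind can be sketched as below. The residual and Jacobian functions are placeholders for the moment-method forward model and its sensitivities, which are not given in the abstract.

```python
# Minimal sketch of a single Levenberg-Marquardt update of the kind used in
# the iterative permittivity reconstruction. residual_fn and jacobian_fn are
# placeholders for the moment-method forward model and its sensitivities.
import numpy as np

def lm_step(x, residual_fn, jacobian_fn, damping):
    """One update: solve (J^H J + damping * I) dx = -J^H r and return x + dx."""
    r = residual_fn(x)            # measured minus modelled field data
    J = jacobian_fn(x)            # sensitivity of the model to the unknowns x
    JhJ = J.conj().T @ J
    rhs = -J.conj().T @ r
    dx = np.linalg.solve(JhJ + damping * np.eye(JhJ.shape[0]), rhs)
    return x + dx
```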

Franchois, A; Joisel, A; Pichot, C; Bolomey, J C

1998-08-01

401

Results of shuttle EMU thermal vacuum tests incorporating an infrared imaging camera data acquisition system  

NASA Technical Reports Server (NTRS)

Manned tests in Chamber B at NASA JSC were conducted in May and June of 1990 to better quantify the Space Shuttle Extravehicular Mobility Unit's (EMU) thermal performance in the cold environmental extremes of space. Use of an infrared imaging camera with real-time video monitoring of the output significantly added to the scope, quality and interpretation of the test conduct and data acquisition. Results of this test program have been effective in the thermal certification of a new insulation configuration and the '5000 Series' glove. In addition, the acceptable thermal performance of flight garments with visually deteriorated insulation was successfully demonstrated, thereby saving significant inspection and garment replacement cost. This test program also established a new method for collecting data vital to improving crew thermal comfort in a cold environment.

Anderson, James E.; Tepper, Edward H.; Trevino, Louis A.

1991-01-01

402