These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

2010-01-01

2

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

3

Retrieving lunar topography from multispectral LROC images  

NASA Astrophysics Data System (ADS)

A technique has been developed for retrieving information about lunar topography from any individual multispectral LROC Wide Angle Camera (WAC) image. This technique is possible because images acquired at different wavelengths correspond to different viewing angles, while the influence of color differences between the images on the parallax estimates is small. The method provides Digital Elevation Models (DEMs) with precision comparable to the global lunar 100 m raster DTM retrieved from the LROC WAC stereo model (GLD100), and it potentially allows one to obtain elevation maps with better horizontal resolution than the GLD100. An empirical distortion model for the LROC WAC has been developed and used to correct the initial WAC images. In contrast to the standard pre-flight model, our model compensates almost fully for radial distortion, decentering of the optics, and tilt of the CCD array. The DEMs obtained with our approach in some cases exhibit real morphological details that are invisible in GLD100 maps; thus, our method provides additional independent information about lunar topography. Because our elevation maps have the same projection as the initial images, those images can be validly corrected for topographic effects (i.e., orthorectified), in contrast to the use of the GLD100, which may have slightly different coordinate referencing than individual WAC images.
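As a rough illustration of the parallax-to-height conversion underlying this kind of technique, the standard stereo relation can be sketched as follows. This is a minimal sketch; the band-to-band emission angles and measured parallax below are illustrative values, not WAC calibration data.

```python
import math

def height_from_parallax(parallax_m, e1_deg, e2_deg):
    """Convert a measured horizontal parallax (meters on the ground)
    between two views at emission angles e1 and e2 (degrees) into a
    relative height (meters). Standard stereo relation:
    h = parallax / (tan(e1) - tan(e2))."""
    dtan = math.tan(math.radians(e1_deg)) - math.tan(math.radians(e2_deg))
    if dtan == 0.0:
        raise ValueError("no stereo convergence between the two views")
    return parallax_m / dtan
```

For example, a 10 m parallax between two bands viewed at +7.5° and -7.5° (hypothetical angles) corresponds to a relief of roughly 38 m.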

Korokhin, Viktor V.; Velikodsky, Yuri I.; Shalygin, Eugene V.; Shkuratov, Yuriy G.; Kaydash, Vadym G.; Videen, Gorden

2014-03-01

4

Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data  

USGS Publications Warehouse

We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

2012-01-01

5

Regolith thickness estimation over Sinus Iridum using morphology of small craters from LROC images  

NASA Astrophysics Data System (ADS)

Regolith thickness over the Sinus Iridum region is estimated using the morphology and size-frequency distribution of small craters counted in Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images. Results show that regolith thickness for Sinus Iridum ranges from 2 m to more than 10 m, with a median value between 4.1 m and 6.1 m.
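The crater-morphology approach to regolith thickness rests on the Quaide and Oberbeck (1968) geometry, in which a crater becomes flat-floored when excavation reaches a stronger substrate. A minimal sketch of that geometric relation follows; the 31° wall slope is an assumed angle-of-repose value, not a number from this study.

```python
import math

def regolith_thickness(d_apparent_m, d_floor_m, wall_slope_deg=31.0):
    """Estimate regolith thickness beneath a flat-floored crater from
    the apparent (rim-to-rim) diameter and the flat-floor diameter,
    following the Quaide & Oberbeck (1968) geometry:
    t = tan(slope) * (D_apparent - D_floor) / 2.
    The wall slope (~angle of repose) is an assumed default."""
    half_annulus = 0.5 * (d_apparent_m - d_floor_m)
    return half_annulus * math.tan(math.radians(wall_slope_deg))
```

A 100 m crater with a 40 m flat floor would then imply roughly 18 m of regolith under these assumptions.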

Liu, T.; Fa, W.

2013-09-01

6

Photometric normalization of LROC WAC images  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) acquires near global coverage on a monthly basis. The WAC is a push frame sensor with a 90° field of view (FOV) in BW mode and 60° FOV in 7-color mode (320 nm to 689 nm). WAC images are acquired during each orbit in 10° latitude segments with cross track coverage of ~50 km. Before mosaicking, WAC images are radiometrically calibrated to remove instrumental artifacts and to convert at-sensor radiance to I/F. Images are also photometrically normalized to common viewing and illumination angles (30° phase), a challenge due to the wide angle nature of the WAC, where large differences in phase angle are observed in a single image line (±30°). During a single month the equatorial incidence angle drifts about 28°, and over the course of ~1 year the lighting completes a 360° cycle. The light scattering properties of the lunar surface depend on incidence (i), emission (e), and phase (p) angles as well as soil properties such as single-scattering albedo and roughness that vary with terrain type and state of maturity [1]. We first tested a Lommel-Seeliger Correction (LSC) [cos(i)/(cos(i) + cos(e))] [2] with a phase function defined by an exponential decay plus 4th order polynomial term [3], which did not provide an adequate solution. Next we employed a LSC with an exponential 2nd order decay phase correction that was an improvement, but still exhibited unacceptable frame-to-frame residuals. In both cases we fitted the LSC I/F vs. phase angle to derive the phase corrections. To date, the best results are with a Lunar-Lambert function [4] with exponential 2nd order decay phase correction (LLEXP2) [(A1exp(B1p)+A2exp(B2p)+A3) * cos(i)/(cos(e) + cos(i)) + B3cos(i)]. We derived the parameters for the LLEXP2 from repeat imaging of a small region and then corrected that region with excellent results.
When this correction was applied to the whole Moon the results were less than optimal, which is no surprise given the variability of the regolith from region to region. As the fitting area increases, the accuracy of curve fitting decreases due to the larger variety of albedo, topography, and composition. Thus we have adopted an albedo-dependent photometric normalization routine. Phase curves are derived for discrete bins of preliminary normalized reflectance calculated from the Clementine global mosaic in a fitting area composed predominantly of mare in Oceanus Procellarum. The global WAC mosaic was then corrected pixel-by-pixel according to its preliminary reflectance map, with satisfactory results. We observed that the phase curves per normalized-reflectance bin become steeper as the reflectance value increases. Further filtering by FeO, TiO2, or optical maturity [5] for parameter calculations may help elucidate the effects of surface composition and maturity on photometric properties of the surface. [1] Hapke, B.W. (1993) Theory of Reflectance and Emittance Spectroscopy, Cambridge Univ. Press. [2] Schoenberg (1925) Acta Soc. Sci. Fenn., vol. 50. [3] Hillier et al. (1999) Icarus 141, 205-225. [4] McEwen (1991) Icarus 92, 298-311. [5] Lucey et al. (2000) JGR, v105, no E8, p20377-20386.
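The LLEXP2 normalization described above can be sketched directly from the formula quoted in the abstract: I/F is ratioed between the model at the observed geometry and the model at the standard geometry (30° phase). The coefficients below are placeholders for illustration only; real values come from fitting repeat WAC observations.

```python
import math

# Hypothetical fit coefficients, for illustration only.
EXAMPLE_PARAMS = (0.15, -0.03, 0.35, -0.006, 0.2, 0.05)

def llexp2(p_deg, i_deg, e_deg, A1, B1, A2, B2, A3, B3):
    """Lunar-Lambert function with 2nd-order exponential phase decay
    (LLEXP2), as written in the abstract:
    (A1*exp(B1*p) + A2*exp(B2*p) + A3)*cos(i)/(cos(e)+cos(i)) + B3*cos(i),
    with the phase decay fit in degrees."""
    ci = math.cos(math.radians(i_deg))
    ce = math.cos(math.radians(e_deg))
    phase_term = A1 * math.exp(B1 * p_deg) + A2 * math.exp(B2 * p_deg) + A3
    return phase_term * ci / (ce + ci) + B3 * ci

def normalize_iof(iof_obs, p_deg, i_deg, e_deg, params=EXAMPLE_PARAMS):
    """Photometrically normalize observed I/F to the standard geometry
    (p=30, i=30, e=0) by ratioing the model at the two geometries."""
    return iof_obs * llexp2(30.0, 30.0, 0.0, *params) / llexp2(p_deg, i_deg, e_deg, *params)
```

An observation already at the standard geometry passes through unchanged; observations at higher phase are brightened, since the model value there is lower.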

Sato, H.; Denevi, B.; Robinson, M. S.; Hapke, B. W.; McEwen, A. S.; LROC Science Team

2010-12-01

7

LROC NAC Stereo Anaglyphs  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) acquires high resolution (50 to 200 cm pixel scale) images of the Moon. In operation since June 2009, LROC NAC acquires geometric stereo pairs by rolling off-nadir on subsequent orbits. A new automated processing system currently in development will produce anaglyphs from most of the NAC geometric stereo pairs. An anaglyph is an image formed by placing one image from the stereo pair in the red channel, and the other image from the stereo pair in the green and blue channels, so that together with red-blue or red-cyan glasses, the 3D information in the pair can be readily viewed. These new image products will make qualitative interpretation of the lunar surface in 3D more accessible, without the need for intensive computational resources or special equipment. The LROC NAC is composed of two separate pushbroom CCD cameras (NAC L and R) aligned to increase the full swath width to 5 km from an altitude of 50 km. Development of the anaglyph processing system incorporates stereo viewing geometry, proper alignment of the NAC L and R frames, and optimal contrast normalization of the stereo pair to minimize extreme brightness differences, which can make stereo viewing difficult in an anaglyph. The LROC NAC anaglyph pipeline is based on a similar automated system developed for the HiRISE camera, on the Mars Reconnaissance Orbiter. Improved knowledge of camera pointing and spacecraft position allows for the automatic registration of the L and R frames by map projecting them to a polar stereographic projection. One half of the stereo pair must then be registered to the other so there is no offset in the vertical (y) direction. Stereo viewing depends on parallax only in the horizontal (x) direction. High resolution LROC NAC anaglyphs will be made available to the lunar science community and to the public on the LROC web site (http://lroc.sese.asu.edu).
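The channel assignment described above is simple to sketch, assuming the stereo pair is already map-projected and co-registered (as in the pipeline described) so that parallax remains only in the horizontal direction.

```python
import numpy as np

def make_anaglyph(left_img, right_img):
    """Build a red-cyan anaglyph from a registered stereo pair:
    one image goes into the red channel, the other into the green
    and blue channels, so the pair can be viewed with red-cyan
    glasses. Inputs are 2-D grayscale arrays of identical shape."""
    if left_img.shape != right_img.shape:
        raise ValueError("stereo pair must be registered to the same grid")
    return np.dstack([left_img, right_img, right_img])
```

A real pipeline would also apply the contrast normalization mentioned above before stacking, to minimize extreme brightness differences between the two channels.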

Mattson, S.; McEwen, A. S.; Speyerer, E.; Robinson, M. S.

2012-12-01

8

Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller- scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. 
LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degree. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degree. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ~100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data. Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than that of any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.

Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

9

Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three-month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near-permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high-resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the
number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses, repeat global coverage will be obtained as lighting conditions change, providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale, providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews, a collection of stereo pairs is being acquired that enables 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies, including, for example, lunar pyroclastics, domes, and tectonics, are underway [e.g., 10-17]. The first PDS release of LROC data will be in March 2010 and will include all images from the commissioning phase and the first 3 months of the mapping phase. [1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165

Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

10

Where did the Luna 23 and 24 spacecraft land?: Comparing the spacecraft seen in LROC NAC images with synthetic images  

NASA Astrophysics Data System (ADS)

Shkuratov et al. (2013) described specific photometric anomalies found around the Luna 16, 20 and 23 spacecraft but not around the Luna 24 spacecraft. The authors explained the lack of an anomaly at the Luna 24 site as a result of the misidentification of the Luna 23 and 24 spacecraft in the LROC images by Robinson et al. (2012). To address this question, we synthesized images of the Luna spacecraft as they might appear in the LROC images, using spacecraft models made by the Lavochkin Association, builders of the Luna spacecraft series. We compared the model images of the virtual Luna 23 and Luna 24 spacecraft sitting on the lunar surface with the spacecraft seen in the LROC images and concluded, on the basis of their similarity, that the identification of the spacecraft in these images by Robinson et al. (2012) is likely correct.

Dolgopolov, V. P.; Basilevsky, A. T.; Robinson, M. S.; Plescia, J. B.; Head, J. W.

2013-06-01

11

Secondary Craters and the Size-Velocity Distribution of Ejected Fragments around Lunar Craters Measured Using LROC Images  

NASA Astrophysics Data System (ADS)

Affiliation: Earth and Planetary Sciences, Washington University in St. Louis, MO, United States. We report results from analyzing the size-velocity distribution (SVD) of secondary crater forming fragments from the 93 km diameter Copernicus impact. We measured the diameters of secondary craters and their distances from Copernicus using LROC Wide Angle Camera (WAC) and Narrow Angle Camera (NAC) image data. We then estimated the velocity and size of the ejecta fragment that formed each secondary crater from the range equation for a ballistic trajectory on a sphere and Schmidt-Holsapple scaling relations. Size scaling was carried out in the gravity regime for both non-porous and porous target material properties. We focus on the largest ejecta fragments (d_fmax) at a given ejection velocity (v_ej) and fit the upper envelope of the SVD using quantile regression to an equation of the form d_fmax = A*v_ej^(-β). The velocity exponent, β, describes how quickly fragment sizes fall off with increasing ejection velocity during crater excavation. For Copernicus, we measured 5800 secondary craters, at distances of up to 700 km (15 crater radii), corresponding to an ejecta fragment velocity of approximately 950 m/s. This mapping only includes secondary craters that are part of a radial chain or cluster. The two largest craters in chains near Copernicus that are likely to be secondaries are 6.4 and 5.2 km in diameter. We obtained a velocity exponent, β, of 2.2 ± 0.1 for a non-porous surface. This result is similar to Vickery's [1987, GRL 14] determination of β = 1.9 ± 0.2 for Copernicus using Lunar Orbiter IV data.
The availability of WAC 100 m/pix global mosaics with illumination geometry optimized for morphology allows us to update and extend the work of Vickery [1986, Icarus 67, and 1987], who compared secondary crater SVDs for craters on the Moon, Mercury, and Mars. Additionally, meter-scale NAC images enable characterization of secondary crater morphologies and fields around much smaller primary craters than were previously investigated. Combined results from all previous studies of ejecta fragment SVDs from secondary crater fields show that β ranges between approximately 1 and 3. First-order spallation theory predicts a β of 1 [Melosh 1989, Impact Cratering, Oxford Univ. Press]. Results in Vickery [1987] for the Moon exhibit a generally decreasing β with increasing primary crater size (5 secondary fields mapped). In the same paper, however, this trend is flat for Mercury (3 fields mapped) and opposite for Mars (4 fields mapped). SVDs for craters on large icy satellites (Ganymede and Europa), with gravities not too dissimilar to lunar gravity, show generally low velocity exponents (β between 1 and 1.5), except for the very largest impactor measured: the 585-km-diameter Gilgamesh basin on Ganymede (β = 2.6 ± 0.4) [Singer et al., 2013, Icarus 226]. The present work, focusing initially on lunar craters using LROC data, will attempt to confirm or clarify these trends, and expand the number of examples under a variety of impact conditions and surface materials to evaluate possible causes of variations.
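The range equation for a ballistic trajectory on a sphere, used above to convert secondary-crater distance into ejection velocity, can be sketched as follows. This is a minimal sketch assuming a 45° launch angle (a common convention in secondary-crater studies) and standard lunar constants.

```python
import math

G_MOON = 1.62         # m/s^2, lunar surface gravity
R_MOON = 1_737_400.0  # m, mean lunar radius

def ejection_velocity(range_m, launch_angle_deg=45.0):
    """Ejection velocity from ballistic range on a spherical, airless
    Moon. Solves the spherical range equation
    tan(phi/2) = v^2 sin(t)cos(t) / (g R - v^2 cos^2 t)
    for v, where phi = range/R and t is the launch angle."""
    phi = range_m / R_MOON
    t = math.radians(launch_angle_deg)
    tp = math.tan(phi / 2.0)
    v2 = G_MOON * R_MOON * tp / (math.sin(t) * math.cos(t) + math.cos(t) ** 2 * tp)
    return math.sqrt(v2)
```

For the 700 km maximum range reported above, this gives a velocity in the 950-980 m/s range, close to the ~950 m/s quoted in the abstract.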

Singer, K. N.; Jolliff, B. L.; McKinnon, W. B.

2013-12-01

12

Photometric parameter maps of the Moon derived from LROC WAC images  

NASA Astrophysics Data System (ADS)

Spatially resolved photometric parameter maps were computed from 21 months of Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) images. Due to its 60° field-of-view (FOV), the WAC achieves nearly global coverage of the Moon each month with more than 50% overlap from orbit to orbit. From the repeat observations at various viewing and illumination geometries, we calculated Hapke bidirectional reflectance model parameters [1] for 1°x1° "tiles" from 70°N to 70°S and 0°E to 360°E. About 66,000 WAC images acquired from February 2010 to October 2011 were converted from DN to radiance factor (I/F) through radiometric calibration, partitioned into gridded tiles, and stacked in a time series (tile-by-tile method [2]). Lighting geometries (phase, incidence, emission) were computed using the WAC digital terrain model (100 m/pixel) [3]. The Hapke parameters were obtained by model fitting against I/F within each tile. Among the 9 parameters of the Hapke model, we calculated 3 free parameters (w, b, and hs) by setting constant values for 4 parameters (Bc0 = 0, hc = 1, θ̄, φ = 0) and interpolating 2 parameters (c, Bs0). In this simplification, we ignored the Coherent Backscatter Opposition Effect (CBOE) to avoid competition between CBOE and the Shadow Hiding Opposition Effect (SHOE). We also assumed that surface regolith porosity is uniform across the Moon. The roughness parameter (θ̄) was set to an averaged value from the equator (±3°N). The Henyey-Greenstein double-lobe function (H-G2) parameter (c) was given by the 'hockey stick' relation [4] (negative correlation between b and c) based on laboratory measurements. The amplitude of SHOE (Bs0) was given by the correlation between w and Bs0 at the equator (±3°N). Single scattering albedo (w) is strongly correlated with the photometrically normalized I/F, as expected. The c shows an inverse trend relative to b due to the 'hockey stick' relation.
The parameter c is typically low for the maria (0.08 ± 0.06) relative to the highlands (0.47 ± 0.16). Since c controls the fraction of backward/forward scattering in H-G2, lower c for the maria indicates more forward scattering relative to the highlands. This trend is opposite to what was expected, because darker particles are usually more backscattering. However, the lower albedo of the maria is due to the higher abundance of ilmenite, an opaque mineral that scatters all of the light by specular reflection from its surface. If their surface facets are relatively smooth, the ilmenite particles will be forward scattering. Other factors besides mineralogy (e.g., grain shape, grain size, porosity, maturity) might also be affecting c. The angular width of SHOE (hs) typically shows lower values (0.047 ± 0.02) for the maria relative to the highlands (0.074 ± 0.025). An increase in hs theoretically suggests lower porosity or a narrower grain size distribution [1], but the link between actual materials and hs is not well constrained. Further experiments using both laboratory and spacecraft observations will help to unravel the photometric properties of the surface materials of the Moon. [1] Hapke, B.: Cambridge Univ. Press, 2012. [2] Sato, H. et al.: 42nd LPSC, abstract #1974, 2011. [3] Scholten, F. et al.: JGR, 117, E00H17, 2012. [4] Hapke, B.: Icarus, 221(2), 1079-1083, 2012.
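The 'hockey stick' b-c relation invoked above is often quoted in the closed form published by Hapke (2012, Icarus 221). The coefficients below are quoted from that form as best recalled and should be treated as illustrative rather than authoritative; the qualitative point is the negative correlation between b and c.

```python
import math

def hockey_stick_c(b):
    """Empirical 'hockey stick' relation between the Henyey-Greenstein
    double-lobe asymmetry parameter b and the backscatter fraction c,
    in the form quoted from Hapke (2012, Icarus 221):
    c = 3.29*exp(-17.4*b^2) - 0.908.
    Coefficients are quoted from the literature, not fit here."""
    return 3.29 * math.exp(-17.4 * b * b) - 0.908
```

With this form, a mare-like low c corresponds to a higher b (stronger forward lobe) than a highlands-like high c, matching the inverse trend described in the abstract.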

Sato, H.; Robinson, M. S.; Hapke, B. W.; Denevi, B. W.; Boyd, A. K.

2013-12-01

13

Depths, Diameters, and Profiles of Small Lunar Craters From LROC NAC Stereo Images  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images (pixel scale ~0.5 m) provide new 3-D views of small craters (40 m < D < 200 m). We extracted topographic profiles from 85 of these craters in mare and highland terrains between 18.1-19.1°N and 5.2-5.4°E to investigate relationships among crater shape, age, and target. Obvious secondary craters (e.g., clustered) and moderately to heavily degraded craters were excluded. The freshest craters included in the study have crisp rims, bright ejecta, and no superposed craters. The depth, diameter, and profiles of each crater were determined from a NAC-derived DTM (M119808916/M119815703) tied to LOLA topography with better than 1 m vertical resolution (see [1]). Depth/diameter ratios for the selected craters are generally between 0.12 and 0.2. Crater profiles were classified into one of 3 categories: V-shaped, U-shaped, or intermediate (craters on steep slopes were excluded). Craters were then morphologically classified according to [2], where crater shape is determined by changes in material strength between subsurface layers, resulting in bowl-shaped, flat-bottomed, concentric, or central-mound crater forms. In this study, craters with U-shaped profiles tend to be small (<60 m) and flat-bottomed, while V-shaped craters have steep slopes (~20°), little to no floor, and a range of diameters. Both fresh and relatively degraded craters display the full range of profile shapes (from U to V and all stages in between). We found it difficult to differentiate U-shaped craters from V-shaped craters without the DTM, and we saw no clear correlation between morphologic and profile classification. Further study is still needed to increase our crater statistics and expand on the relatively small population of craters included here.
For the craters in this study, we found that block abundances correlate with relative crater degradation state as defined by [3], where abundant blocks signal fresher craters; however, block abundances do not correlate with U- or V-shaped profiles. The craters examined here show that profile shape cannot be used to determine the relative age or degradation state as might be inferred from [4, for example]. The observed variability in crater profiles may be explained by local variations in regolith thickness [e.g., 2, 5], impactor velocity, and/or possibly bolide density. Ongoing efforts will quantify the possible effects of solitary secondary craters and investigate whether or not depth/diameter ratios and crater profiles vary between different regions of the Moon (thick vs thin regolith, highlands vs mare, and old vs young mare). References: [1] Tran T. et al. (2010) LPSC XXXXI, Abstract 2515. [2] Quaide W. L. and V. R. Oberbeck (1968) JGR, 73: 5247-5270. [3] Basilevsky A. T. (1976) Proc LPSC 7th, p. 1005-1020. [4] Soderblom L. A. and L. A. Lebofsky (1972) JGR, 77: 279-296. [5] Wilcox B. B. et al. (2005) Met. Planet. Sci., 40: 695-710.
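Depth and depth/diameter measurement from a DTM-derived topographic profile can be sketched minimally as follows. This is a simplification of the actual DTM workflow: rim picking here is naive (highest point in each half of the profile), and a real analysis would average many profiles per crater.

```python
import numpy as np

def depth_diameter(profile_x_m, profile_z_m):
    """Crater depth, diameter, and depth/diameter ratio from a single
    topographic profile across a crater. The rim crests are taken as
    the highest point in each half of the profile; depth is the mean
    rim elevation minus the floor minimum; diameter is the rim-to-rim
    horizontal distance."""
    x = np.asarray(profile_x_m, dtype=float)
    z = np.asarray(profile_z_m, dtype=float)
    mid = len(z) // 2
    i_left = int(np.argmax(z[:mid]))            # left rim crest
    i_right = mid + int(np.argmax(z[mid:]))     # right rim crest
    depth = 0.5 * (z[i_left] + z[i_right]) - z.min()
    diameter = x[i_right] - x[i_left]
    return depth, diameter, depth / diameter
```

A synthetic profile with 2 m rims at ±40 m from center and a -10 m floor, for instance, yields a depth/diameter ratio of 0.15, within the 0.12-0.2 range reported above.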

Stopar, J. D.; Robinson, M.; Barnouin, O. S.; Tran, T.

2010-12-01

14

LROC Advances in Lunar Science  

NASA Astrophysics Data System (ADS)

Since entering orbit in 2009 the Lunar Reconnaissance Orbiter Camera (LROC) has acquired over 700,000 Wide Angle Camera (WAC) and Narrow Angle Camera (NAC) images of the Moon. This new image collection is fueling research into the origin and evolution of the Moon. NAC images revealed a volcanic complex 35 x 25 km (60N, 100E), between Compton and Belkovich craters (CB). The CB terrain hosts volcanic domes and irregular depressed areas (caldera-like collapses). The volcanic complex corresponds to an area of high silica content (Diviner) and high Th (Lunar Prospector). A low density of impact craters on the CB complex indicates a relatively young age. The LROC team mapped over 150 volcanic domes and 90 volcanic cones in the Marius Hills (MH), many of which were not previously identified. Morphology and compositional estimates (Diviner) indicate that MH domes are silica poor, and are products of low-effusion mare lavas. Impact melt deposits are observed with Copernican impact craters (>10 km) on exterior ejecta, the rim, inner wall, and crater floors. Preserved impact melt flow deposits are observed around small craters (25 km diam.), and estimated melt volumes exceed predictions. At these diameters the amount of melt predicted is small, and melt that is produced is expected to be ejected from the crater. However, we observe well-defined impact melt deposits on the floors of highland craters down to 200 m diameter. A globally distributed population of previously undetected contractional structures was discovered. Their crisp appearance and associated impact crater populations show that they are young landforms (<1 Ga). NAC images also revealed small extensional troughs. Crosscutting relations with small-diameter craters and depths as shallow as 1 m indicate ages <50 Ma. These features place bounds on the amount of global radial contraction and the level of compressional stress in the crust.
WAC temporal coverage of the poles allowed quantification of highly illuminated regions, including one site that remains lit for 94% of a year (longest eclipse period of 43 hours). Targeted NAC images provide higher resolution characterization of key sites with permanent shadow and extended illumination. Repeat WAC coverage provides an unparalleled photometric dataset allowing spatially resolved solutions (currently 1 degree) to Hapke's photometric equation - data invaluable for photometric normalization and interpreting physical properties of the regolith. The WAC color also provides the means to solve for titanium, and distinguish subtle age differences within Copernican-aged materials. The longevity of the LRO mission allows follow-up NAC and WAC observations of previously known and newly discovered targets over a range of illumination and viewing geometries. Of particular merit is the acquisition of NAC stereo pairs and oblique sequences. With the extended SMD phase, the LROC team is working towards imaging the whole Moon with pixel scales of 50 to 200 cm.

Robinson, M. S.

2012-12-01

15

Dry imaging cameras  

PubMed Central

Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences such as computing, mechanics, thermal physics, optics, electricity, and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-01-01

16

Satellite camera image navigation  

NASA Technical Reports Server (NTRS)

Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).
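The "walking least squares fit" is not spelled out in the abstract; one plausible reading is an ordinary least-squares refit repeated over a sliding window of recent star/landmark measurements. The sketch below illustrates only that interpretation, with a hypothetical `basis` function standing in for the orbit/attitude model terms; it is not the patented algorithm.

```python
import numpy as np

def walking_lsq(times, observations, basis, window=30):
    """Sliding-window ("walking") linear least-squares fit.

    At each step the model coefficients are refit to the most recent
    `window` observations, so the solution tracks slow parameter drift.
    basis(t) returns the row of basis-function values at time t
    (e.g. terms of an orbit or attitude model).
    """
    fits = []
    for i in range(window, len(times) + 1):
        A = np.array([basis(t) for t in times[i - window:i]])
        b = observations[i - window:i]
        coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
        fits.append(coeffs)
    return np.array(fits)
```

With a linear basis `lambda t: [1.0, t]`, each window recovers the intercept and slope of the measured trend; richer bases (periodic orbit terms, attitude drift polynomials) drop in the same way.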

Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

1987-01-01

17

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

2010-01-01

18

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident directions of fast neutrons, E_N > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

19

Computer-Assisted Detection of Collapse Pits in LROC NAC Images  

NASA Astrophysics Data System (ADS)

Pits in mare basalts and impact melt deposits provide unique environments for human shelters and preservation of geologic information. Due to their steep walls, pits are most distinguishable when the Sun is high (pit walls are casting shadows and impact crater walls are not). Because of the large number of NAC images acquired every day (>350), each typically with 5000 samples and 52,224 lines, it is not feasible to carefully search each image manually, so we developed a shadow detection algorithm (Pitscan) which analyzes an image in thirty seconds. It locates blocks of pixels that are below a digital number (DN) cutoff value, indicating that the block of pixels is "in shadow", and then runs a DN profile in the direction of solar lighting, comparing average DN values of the up-Sun and down-Sun sides. If the up-Sun average DN is higher than the down-Sun average, the shadow is assumed to be from a positive relief feature, and ignored. Otherwise, Pitscan saves a 200 x 200 pixel sub-image for later manual review. The algorithm currently generates ~150 false positives for each successful pit identification. This number would be unacceptable for an algorithm designed to catalog a common feature, but since the logic is merely intended to assist humans in locating an unusual type of feature, the false alarm rate is acceptable, and the current version allows a human to effectively check 10,000 NAC images for pits (over 2500 gigapixels) per hour. The false negative rate is not yet known; however, Pitscan detected every pit in a test on a small subset of the images known to contain pits. Pitscan is only effective when the Sun is within 50° of the zenith. When the Sun is closer to the horizon, crater walls often cast shadows, resulting in unacceptable numbers of false positives. Due to the Sun angle limit, only regions within 50° latitude of the equator are searchable. To date, 25.42% of the Moon has been imaged within this constraint. 
Early versions of Pitscan found more than 150 small (average diameter 15m) pits in impact melt deposits of Copernican craters [1]. More recently, improvements to the algorithm revealed two new large mare pits, similar to the three pits discovered in Kaguya images [2]. One is in Schlüter crater, a mare-filled crater near Orientale basin, with a 20 x 40m opening, approximately 60 m deep. The second new pit is in Lacus Mortis (44.96°N, 25.61°E) in a tectonically complex region west of Burg crater. This pit is the largest mare pit found to date, with an opening approximately 100 x 150 m, and a floor more than 90 m below the surrounding terrain. Most interesting from an exploration point of view is the fact that the east wall appears to have collapsed, leaving a relatively smooth ~22° slope from the surrounding mare down to the pit floor. Computer-assisted feature detection is an effective method of locating rare features in the extremely large high-resolution NAC dataset. Pitscan enabled the discovery of unknown collapse pits both in the mare and highlands. These pits are an important resource for future surface exploration, both by providing access to pristine cross-sections of the near-surface and by providing radiation and micrometeorite shielding for human outposts. [1] Wagner, R.V. et al. (2012), LPSC XLIII, #2266 [2] Haruyama, J. et al. (2010), LPSC XLI, #1285
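The up-Sun/down-Sun test at the heart of Pitscan can be sketched in a few lines. This is a minimal illustration of the heuristic as the abstract describes it, not the LROC team's code; the function name, sampling offsets, and window length are hypothetical parameters.

```python
import numpy as np

def classify_shadow(image, y, x, sun_dir, window=20):
    """Classify a shadowed pixel block as positive or negative relief.

    image   -- 2-D array of digital numbers (DN)
    (y, x)  -- center of a block already found below the shadow DN cutoff
    sun_dir -- unit vector (dy, dx) pointing toward the Sun in image coords
    Returns 'positive' (likely boulder or crater rim shadow, to be ignored)
    or 'candidate' (possible pit, to be saved for manual review).
    """
    dy, dx = sun_dir
    # Sample average DN along a profile on the up-Sun and down-Sun sides.
    up = [image[int(y + k * dy), int(x + k * dx)] for k in range(5, 5 + window)]
    down = [image[int(y - k * dy), int(x - k * dx)] for k in range(5, 5 + window)]
    # A positive-relief feature is sunlit on the up-Sun side of its shadow,
    # so a brighter up-Sun average flags it; a pit lacks that asymmetry.
    return 'positive' if np.mean(up) > np.mean(down) else 'candidate'
```

In a full implementation this test would run on every below-cutoff block found by the thresholding pass, with only 'candidate' blocks written out as 200 x 200 pixel sub-images.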

Wagner, R. V.; Robinson, M. S.

2012-12-01

20

Calibration of the Lunar Reconnaissance Orbiter Camera  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with

M. Tschimmel; M. S. Robinson; D. C. Humm; B. W. Denevi; S. J. Lawrence; S. Brylow; M. Ravine; T. Ghaemi

2008-01-01

21

Binarising Camera Images for OCR  

Microsoft Academic Search

In this paper we describe a new binarisation method designed specifically for OCR of low quality camera images: Background Surface Thresholding or BST. This method is robust to lighting variations and produces images with very little noise and consistent stroke width. BST computes a

Mauritius Seeger; Christopher R. Dance

2001-01-01

22

LROC WAC Ultraviolet Reflectance of the Moon  

NASA Astrophysics Data System (ADS)

Earth-based color filter photography, first acquired in the 1960s, showed color differences related to morphologic boundaries on the Moon [1]. These color units were interpreted to indicate compositional differences, thought to be the result of variations in titanium content [1]. Later it was shown that iron abundance (FeO) also plays a dominant role in controlling color in lunar soils [2]. Equally important is the maturity of a lunar soil in terms of its reflectance properties (albedo and color) [3]. Maturity is a measure of the state of alteration of surface materials due to sputtering and high velocity micrometeorite impacts over time [3]. The Clementine (CL) spacecraft provided the first global and digital visible through infrared observations of the Moon [4]. This pioneering dataset allowed significant advances in our understanding of compositional (FeO and TiO2) and maturation differences across the Moon [5,6]. Later, the Lunar Prospector (LP) gamma ray and neutron experiments provided the first global, albeit low resolution, elemental maps [7]. Newly acquired Moon Mineralogy Mapper hyperspectral measurements are now providing the means to better characterize mineralogic variations on a global scale [8]. Our knowledge of ultraviolet color differences between geologic units is limited to low resolution (km scale) nearside telescopic observations, high resolution Hubble Space Telescope images of three small areas [9], and laboratory analyses of lunar materials [10,11]. These previous studies detailed color differences in the UV (100 to 400 nm) related to composition and physical state. HST UV (250 nm) and visible (502 nm) color differences were found to correlate with TiO2, and were relatively insensitive to maturity effects seen in visible ratios (CL) [9]. These two results led to the conclusion that improvements in TiO2 estimation accuracy over existing methods may be possible through a simple UV/visible ratio [9]. 
The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) provides the first global lunar ultraviolet through visible (321 nm to 689 nm) multispectral observations [12]. The WAC is a seven-color push-frame imager with nominal resolutions of 400 m (321, 360 nm) and 100 m (415, 566, 604, 643, 689 nm). Due to its wide field-of-view (60° in color mode) the phase angle within a single line varies ±30°, thus requiring the derivation of a precise photometric characterization [13] before any interpretations of lunar reflectance properties can be made. The current WAC photometric correction relies on multiple WAC observations of the same area over a broad range of phase angles and typically results in relative corrections good to a few percent [13].
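The idea behind this kind of empirical photometric normalization can be sketched in simplified form: fit a phase curve to repeat observations of the same area, then scale each measurement to a common reference phase angle. The sketch below uses a low-order polynomial in log reflectance as a stand-in; the actual WAC correction is based on Hapke's photometric equations with spatially resolved parameters, so the function names and the polynomial model here are illustrative assumptions, not the team's method.

```python
import numpy as np

def fit_phase_curve(phase_deg, reflectance, degree=3):
    """Fit ln(reflectance) versus phase angle with a low-order polynomial,
    using repeat observations of one area over a range of phase angles."""
    coeffs = np.polyfit(phase_deg, np.log(reflectance), degree)
    return np.poly1d(coeffs)

def normalize(reflectance, phase_deg, phase_poly, ref_phase=30.0):
    """Scale a measurement from its observed phase angle to the reference
    geometry (30 degrees here), using the fitted phase curve."""
    correction = np.exp(phase_poly(ref_phase) - phase_poly(phase_deg))
    return reflectance * correction
```

Applied per pixel (or per 1-degree tile, as in the WAC solution), this removes the across-track phase-angle gradient so that mosaicked frames can be compared photometrically.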

Robinson, M. S.; Denevi, B. W.; Sato, H.; Hapke, B. W.; Hawke, B. R.

2011-10-01

23

Height-to-diameter ratios of moon rocks from analysis of Lunokhod-1 and -2 and Apollo 11-17 panoramas and LROC NAC images  

NASA Astrophysics Data System (ADS)

An analysis is performed of 91 panoramic photographs taken by Lunokhod-1 and -2, 17 panoramic images composed of photographs taken by Apollo 11-15 astronauts, and six LROC NAC photographs. The results are used to measure the height-to-visible-diameter (h/d) and height-to-maximum-diameter (h/D) ratios for lunar rocks at three highland and three mare sites on the Moon. The average h/d and h/D for the six sites are found to be indistinguishable at a significance level of 95%. Therefore, our estimates for the average h/d = 0.6 ± 0.03 and h/D = 0.54 ± 0.03 on the basis of 445 rocks are applicable for the entire Moon's surface. Rounding off, an h/D ratio of ~0.5 is suggested for engineering models of the lunar surface. The ratios between the long, medium, and short axes of the lunar rocks are found to be similar to those obtained in high-velocity impact experiments for different materials. It is concluded, therefore, that the degree of penetration of the studied lunar rocks into the regolith is negligible, and micrometeorite abrasion and other factors do not dominate in the evolution of the shape of lunar rocks.

Demidov, N. E.; Basilevsky, A. T.

2014-09-01

24

Digital Elevation Models and Derived Products from Lroc Nac Stereo Observations  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to acquire stereo observations with the Narrow Angle Camera (NAC) to enable production of high resolution digital elevation models (DEMs). This work describes the processes and techniques used in reducing the NAC stereo observations to DEMs through a combination of USGS Integrated Software for Imagers and Spectrometers (ISIS) and SOCET SET® from BAE Systems by a team at Arizona State University (ASU). LROC Science Operations Center personnel have thus far produced DEMs from more than 130 stereo pairs for 11 Constellation Program (CxP) sites and 53 other regions of scientific interest. The NAC DEM spatial sampling is typically 2 meters, and the vertical precision is 1-2 meters. Such high resolution provides the three-dimensional view of the lunar surface required for site selection, hazard avoidance and planning traverses that minimize resource consumption. In addition to exploration analysis, geologists can measure parameters such as elevation, slope, and volume to place constraints on composition and geologic history. The NAC DEMs are released and archived through NASA's Planetary Data System.

Burns, K. N.; Speyerer, E. J.; Robinson, M. S.; Tran, T.; Rosiek, M. R.; Archinal, B. A.; Howington-Kraus, E.; the LROC Science Team

2012-08-01

25

Marius Hills: Surface Roughness from LROC and Mini-RF  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Team is collecting hundreds of high-resolution (0.5 m/pixel) Narrow Angle Camera (NAC) images of lunar volcanic constructs (domes, “cones”, and associated features) [1,2]. Marius Hills represents the largest concentration of volcanic features on the Moon and is a high-priority target for future exploration [3,4]. NAC images of this region provide new insights into the morphology and geology of specific features at the meter scale, including lava flow fronts, tectonic features, layers, and topography (using LROC stereo imagery) [2]. Here, we report initial results from Mini-RF and LROC collaborative studies of the Marius Hills. Mini-RF uses a hybrid polarimetric architecture to measure surface backscatter characteristics and can acquire data in one of two radar bands, S (12 cm) or X (4 cm) [5]. The spatial resolution of Mini-RF (15 m/pixel) enables correlation of features observed in NAC images to Mini-RF data. Mini-RF S-Band zoom-mode data and daughter products, such as circular polarization ratio (CPR), were directly compared to NAC images. Mini-RF S-Band radar images reveal enhanced radar backscatter associated with volcanic constructs in the Marius Hills region. Mini-RF data show that Marius Hills volcanic constructs have enhanced average CPR values (0.5-0.7) compared to the CPR values of the surrounding mare (~0.4). This result is consistent with the conclusions of [6], and implies that the lava flows comprising the domes in this region are blocky. To quantify the surface roughness [e.g., 6,7] block populations associated with specific geologic features in the Marius Hills region are being digitized from NAC images. Only blocks that can be unambiguously identified (>1 m diameter) are included in the digitization process, producing counts and size estimates of the block population. High block abundances occur mainly at the distal ends of lava flows. 
The average size of these blocks is 9 m, and 50% of observed blocks are between 9-12 m in diameter. These blocks are not associated with impact craters and have at most a thin layer of regolith. There is minimal visible evidence for downslope movement. Relatively high block abundances are also seen on the summits of steep-sided asymmetrical positive relief features (“cones”) atop low-sided domes. Digitization efforts will continue as we study the block populations of different geologic features in the Marius Hills region and correlate the results with Mini-RF data, which will provide new information about the emplacement of volcanic features in the region. [1] J.D. Stopar et al., LPI Contribution 1483 (2009) 93-94. [2] S.J. Lawrence et al. (2010) LPSC 41 #1906. [2] S.J. Lawrence et al. (2010) LPSC 41 #2689. [3] C. Coombs & B.R. Hawke (1992) 2nd Proc. Lun. Bases & Space Act. 21st Cent. pp. 219-229. [4] J. Gruener and B. Joosten (2009) LPI Contributions 1483, 50-51. [5] D.B.J. Bussey et al. (2010) LPSC 41 #2319. [6] B.A. Campbell et al. (2009) JGR-Planets, 114, 01001. [7] S.W. Anderson et al. (1998) GSA Bull., 110, 1258-1267.

Lawrence, S.; Hawke, B. R.; Bussey, B.; Stopar, J. D.; Denevi, B.; Robinson, M.; Tran, T.

2010-12-01

26

Image dissector camera system study  

NASA Technical Reports Server (NTRS)

Various aspects of a rendezvous and docking system using an image dissector detector as compared to a GaAs detector were discussed. Investigation into a gimbaled scanning system is also covered and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.

Howell, L.

1984-01-01

27

Case sampling in LROC: a Monte Carlo analysis  

NASA Astrophysics Data System (ADS)

We conducted a series of Monte Carlo simulations to investigate how hypothesis testing for modality effects in multireader localization ROC (LROC) studies is influenced by case effects. One specific goal was to evaluate for LROC studies the Dorfman-Berbaum-Metz (DBM) method of analyzing case effects in reader data acquired from a single case-set. Previous evaluations with ROC study simulations found the DBM method to be moderately conservative. Our simulations, using procedures adapted from those earlier works, showed the DBM method to be a conservative test of modality effects in LROC studies as well. The degree of conservatism was greater for a critical value of α = 0.05 than for α = 0.01, and was not moderated by increased numbers of readers or cases. Other simulations investigated the tradeoff between power and empirical type-I error rate for the DBM method and two standard hypothesis tests. Besides the DBM method, a two-way analysis of variance (ANOVA) was applied to performance indices based on the LROC curve under an assumption of negligible case effects. The third test was a three-way ANOVA applied to performance indices, which required two sets of images per modality. With α = 0.01, the DBM method outperformed the other tests for studies with low numbers of readers and cases. In most other situations, its performance lagged behind that of the other tests.

Gifford, Howard C.; King, Michael A.

2001-06-01

28

Multiplex imaging with multiple-pinhole cameras  

NASA Technical Reports Server (NTRS)

When making photographs in X rays or gamma rays with a multiple-pinhole camera, the individual images of an extended object such as the sun may be allowed to overlap. Then the situation is in many ways analogous to that in a multiplexing device such as a Fourier spectroscope. Some advantages and problems arising with such use of the camera are discussed, and expressions are derived to describe the relative efficacy of three exposure/postprocessing schemes using multiple-pinhole cameras.

Brown, C.

1974-01-01

29

Single-Camera Panoramic-Imaging Systems  

NASA Technical Reports Server (NTRS)

Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled-device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

Lindner, Jeffrey L.; Gilbert, John

2007-01-01

30

MIT Media Lab Camera Culture Coded Computational Imaging  

E-print Network

Slide deck from the MIT Media Lab Camera Culture group: Coded Computational Imaging: Light Fields and Applications, presented by Ankit Mohan (with Agrawal, Veeraraghavan, and Narasimhan). Topics include the Bokode, whose imaged pattern depends on camera angle.

Agrawal, Amit

31

Generating Stereoscopic Television Images With One Camera  

NASA Technical Reports Server (NTRS)

Straightforward technique for generating stereoscopic television images involves use of single television camera translated laterally between left- and right-eye positions. Camera acquires one of the images (left- or right-eye image), and video signal from that image is delayed while camera is translated to position where it acquires other image. Length of delay chosen so both images displayed simultaneously or as nearly simultaneously as necessary to obtain stereoscopic effect. Technique amenable to zooming in on small areas within broad scenes. Potential applications include three-dimensional viewing of geological features and meteorological events from spacecraft and aircraft, inspection of workpieces moving along conveyor belts, and aiding ground and water search-and-rescue operations. Also used to generate and display imagery for public education and general information, and possibly for medical purposes.

Coan, Paul P.

1996-01-01

32

Uncertainty Analysis of LROC NAC Derived Elevation Models  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) [1] is to gather stereo observations with the Narrow Angle Camera (NAC) to generate digital elevation models (DEMs). From an altitude of 50 km, the NAC acquires images with a pixel scale of 0.5 meters, and a dual NAC observation covers approximately 5 km cross-track by 25 km down-track. This low altitude was common from September 2009 to December 2011. Images acquired during the commissioning phase and those acquired from the fixed orbit (after 11 December 2011) have pixel scales that range from 0.35 meters at the south pole to 2 meters at the north pole. Altimetric observations obtained by the Lunar Orbiter Laser Altimeter (LOLA) provide measurements of the spacecraft-to-surface range accurate to ±0.1 m [2]. However, uncertainties in the spacecraft positioning can result in offsets (±20 m) between altimeter tracks over many orbits. The LROC team is currently developing a tool to automatically register altimetric observations to NAC DEMs [3]. Using a generalized pattern search (GPS) algorithm, the new automatic registration adjusts the spacecraft position and pointing information during times when NAC images, as well as LOLA measurements, of the same region are acquired to provide an absolute reference frame for the DEM. This information is then imported into SOCET SET to aid in creating controlled NAC DEMs. For every DEM, a figure of merit (FOM) map is generated using SOCET SET software. This is a valuable tool for determining the relative accuracy of a specific pixel in a DEM. Each pixel in a FOM map is given a value to determine its "quality" by determining if the specific pixel was shadowed, saturated, suspicious, interpolated/extrapolated, or successfully correlated. The overall quality of a NAC DEM is a function of both the absolute and relative accuracies. LOLA altimetry provides the most accurate absolute geodetic reference frame with which the NAC DEMs can be compared. 
Offsets between LOLA profiles and NAC DEMs are used to quantify the absolute accuracy. Small lateral movements in the LOLA points coupled with large changes in topography contribute to sizeable offsets between the datasets. The steep topography of Lichtenberg Crater provides an example of the offsets in the LOLA data. Ten tracks that cross the region of interest were used to calculate the offset with a root mean square (RMS) error of 9.67 m, an average error of 7.02 m, and a standard deviation of 9.61 m. Large areas (>375 sq km) covered by a mosaic of NAC DEMs were compared to the Wide Angle Camera (WAC) derived Global Lunar DTM 100 m topographic model (GLD100) [4]. The GLD100 has a pixel scale of 100 m; therefore, the NAC DEMs were reduced to calculate the offsets between the two datasets. When comparing NAC DEMs to WAC DEMs, it was determined that the vertical offsets were as follows [Site name (average offset in meters, standard deviation in meters)]: Lichtenberg Crater (-7.74, 20.49), Giordano Bruno (-5.31, 28.80), Hortensius Domes (-3.52, 16.00), and Reiner Gamma (-0.99, 14.11). Resources: [1] Robinson et al. (2010) Space Sci. Rev. [2] Smith et al. (2010) Space Sci. Rev. [3] Speyerer et al. (2012) European Lunar Symp. [4] Scholten et al. (2012) JGR-Planets.
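The offset statistics quoted above (RMS, mean, and standard deviation of elevation differences) and the reduction of 2 m NAC DEMs to the 100 m GLD100 grid are straightforward to compute; a minimal NumPy sketch, with hypothetical function names:

```python
import numpy as np

def offset_stats(dem_a, dem_b):
    """RMS, mean, and standard deviation of the elevation differences
    between two co-registered DEMs on the same grid."""
    d = (dem_a - dem_b).ravel()
    return np.sqrt(np.mean(d**2)), np.mean(d), np.std(d)

def block_reduce(dem, factor):
    """Downsample a DEM by block-averaging, e.g. 2 m NAC pixels to a
    100 m GLD100 grid with factor=50. Edge rows/columns that do not
    fill a complete block are dropped."""
    h, w = dem.shape
    h, w = h - h % factor, w - w % factor
    blocks = dem[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))
```

Comparing `block_reduce(nac_dem, 50)` against the corresponding GLD100 tile and passing the pair to `offset_stats` reproduces the kind of per-site (average offset, standard deviation) figures listed above.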

Burns, K.; Yates, D. G.; Speyerer, E.; Robinson, M. S.

2012-12-01

33

Occluded object imaging via optimal camera selection  

NASA Astrophysics Data System (ADS)

High performance occluded object imaging in cluttered scenes is a significant challenge for many computer vision applications. Recently, camera array synthetic aperture imaging has proved to be an effective way of seeing objects through occlusion. However, the imaging quality of the occluded object is often significantly decreased by the shadows of the foreground occluder. Although some works have been presented to label the foreground occluder via object segmentation or 3D reconstruction, these methods will fail in the case of complicated occluders and severe occlusion. In this paper, we present a novel optimal camera selection algorithm to solve the above problem. The main characteristics of this algorithm include: (1) Instead of synthetic aperture imaging, we formulate the occluded object imaging problem as an optimal camera selection and mosaicking problem. To the best of our knowledge, our proposed method is the first one for occluded object mosaicing. (2) A greedy optimization framework is presented to propagate the visibility information among various depth focus planes. (3) A multiple label energy minimization formulation is designed in each plane to select the optimal camera. The energy is estimated in the synthetic aperture image volume and integrates the multi-view intensity consistency, previous visibility property, and camera view smoothness, and is minimized via graph cuts. We compare our method with the state-of-the-art synthetic aperture imaging algorithms, and extensive experimental results with qualitative and quantitative analysis demonstrate the effectiveness and superiority of our approach.

Yang, Tao; Zhang, Yanning; Tong, Xiaomin; Ma, Wenguang; Yu, Rui

2013-12-01

34

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

35

A Nonparametric Procedure for Comparing the Areas Under Correlated LROC Curves  

PubMed Central

In contrast to the receiver operating characteristic (ROC) assessment paradigm, localization ROC (LROC) analysis provides a means to jointly assess the accuracy of localization and detection in an observer study. In a typical multireader, multicase (MRMC) evaluation, the data sets are paired so that correlations arise in observer performance both between readers and across the imaging conditions (e.g., reconstruction methods or scanning parameters) being compared. Therefore, MRMC evaluations motivate the need for a statistical methodology to compare correlated LROC curves. In this paper, we suggest a nonparametric strategy for this purpose. Specifically, we find that the seminal work of Sen on U-statistics can be applied to estimate the covariance matrix for a vector of LROC area estimates. The resulting covariance estimator is the LROC analog of the covariance estimator given by DeLong et al. for ROC analysis. Once the covariance matrix is estimated, it can be used to construct confidence intervals and/or confidence regions for purposes of comparing observer performance across imaging conditions. In addition, given the results of a small-scale pilot study, the covariance estimator may be used to estimate the number of images and observers needed to achieve a desired confidence interval size in a full-scale observer study. The utility of our methodology is illustrated with a human-observer LROC evaluation of three image reconstruction strategies for fan-beam X-ray computed tomography. PMID:22736638
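For the plain ROC case, the DeLong-style variance that this paper generalizes can be sketched from U-statistic structural components; the LROC analog follows the same pattern with correct localization folded into the case scoring. The sketch below covers only a single imaging condition (the paper's contribution is the covariance *matrix* across conditions), and the function names are hypothetical.

```python
import numpy as np

def auc_components(pos, neg):
    """Mann-Whitney AUC with its DeLong structural components.

    pos, neg -- observer scores for signal-present and signal-absent cases.
    Returns (AUC, per-positive-case means, per-negative-case means).
    """
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    # U-statistic kernel psi(x, y): 1 if x > y, 0.5 on ties, 0 otherwise.
    psi = (pos[:, None] > neg[None, :]) + 0.5 * (pos[:, None] == neg[None, :])
    return psi.mean(), psi.mean(axis=1), psi.mean(axis=0)

def delong_variance(pos, neg):
    """Variance of the AUC estimate from the structural components:
    var = S10/m + S01/n, with S10, S01 the component sample variances."""
    auc, v10, v01 = auc_components(pos, neg)
    return auc, v10.var(ddof=1) / len(v10) + v01.var(ddof=1) / len(v01)
```

Stacking the structural components from two correlated conditions (same cases, paired scores) and taking their sample covariance yields the off-diagonal terms needed for a confidence interval on the difference of areas, which is the structure the LROC estimator mirrors.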

Noo, Frederic

2012-01-01

36

LROC NAC Digital Elevation Model of Gruithuisen Gamma  

NASA Astrophysics Data System (ADS)

The Gruithuisen Domes have long been of interest as examples of non-mare volcanism [1]. Their form suggests extrusion of silica-rich magmas, possibly dating to 3.7-3.85 Ga (around the same time as the Iridum event); the domes were subsequently embayed by mare [2,3]. Non-mare volcanism is indicated by spectral features known as “red spots” which have (a) high albedo, (b) strong absorption in the ultraviolet, and (c) a wide range of morphologies [4,5,6]. The composition of red spot domes is still unknown, but dacitic or rhyolitic KREEP-rich compositions [5] and mature, low iron and low titanium agglutinate-rich soils [7] have been suggested. The existence of non-mare volcanism has major implications for the thermal history and crustal evolution of the Moon. A new digital elevation model (DEM), derived from stereo image pairs acquired with the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC), allows detailed investigation of the morphology and thus the origin of Mons Gruithuisen Gamma (36.6° N, 40.5° W). The 10 meter per pixel DEM shows relief of ~1500 meters from the summit plateau of Gruithuisen Gamma to the nearby mare surface. This measurement is close to previous estimates of over 1200 meters from Apollo era images [4]. Previous estimates also suggested that the overall slopes ranged from 15-30° [7]. Radial profiles (n=25) across the eastern two-thirds of the Gruithuisen Gamma DEM show that the overall slope is 17-18° along the north- and northeastern-facing slopes, 14° along the eastern-most edge, 12° on the side facing the contact of the dome material and highlands material, and 11° on the directly southern-facing slope. The north-south diameter of the dome is ~24 km and the east-west diameter is ~18 km. The textures on each slope are remarkably similar and distinct from the highlands and crater slopes, with irregular furrows oriented down-slope.
The same furrowed texture is not seen on mare domes, which are generally much smoother, flatter, and smaller than red spot domes [8]. Two ~2 km diameter craters on Gamma have likely exposed fresh dome material from below the surface texture, as evidenced by boulders visible in the ejecta. Overall, Gruithuisen Gamma has asymmetric slope morphology, but uniform texture. Topographic analysis and models of rheological properties with data from new LROC DEMs may aid in constraining the composition and origin of Gruithuisen Gamma. [1] Scott and Eggleton (1973) I-805, USGS. [2] Wagner, R.J., et al. (2002) LPSC #1619 [3] Wagner, R.J., et al. (2002) JGR. 104. [4] Chevrel, S.D., Pinet, P.C., and Head J.W. (1999) JGR. 104, 16515-16529 [5] Malin, M. (1974) Earth Planet. Sci. Lett. 21, 331 [6] Whitaker, E.A. (1972) Moon, 4, 348. [7] Head, J.W. and McCord, T.B. (1978) Science. 199, 1433-1436 [8] Head, J.W. and Gifford, A. (1980) Moon and Planets.

Braden, S.; Tran, T. N.; Robinson, M. S.

2009-12-01

37

Camera Trajectory from Wide Baseline Images  

NASA Astrophysics Data System (ADS)

Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. Nikon FC-E9 is a megapixel omnidirectional addon convertor with 180° view angle which provides images of photographic quality. Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius r of the image point to the angle θ
of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches detecting different affine covariant feature regions including MSER, Harris Affine, and Hessian Affine in acquired images. These features are alternatives to popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used because the view point can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real-time. Robust 3D structure can be computed by RANSAC which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry. 
We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendin
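The two-parameter radial model quoted in the abstract, θ = ar/(1 + br²), maps an image point's distance from the distortion center to the angle of its 3D ray. A minimal back-projection sketch follows; the distortion center and the a, b values are illustrative assumptions, not the authors' calibrated parameters:

```python
import numpy as np

def pixel_to_ray(u, v, center, a, b):
    """Map an image point (u, v) to a unit 3D ray in the camera frame using
    the two-parameter omnidirectional model theta = a*r / (1 + b*r**2),
    where r is the radial distance from the distortion center.
    a and b here are illustrative, not calibrated values."""
    du, dv = u - center[0], v - center[1]
    r = np.hypot(du, dv)
    theta = a * r / (1.0 + b * r * r)   # angle from the optical axis
    phi = np.arctan2(dv, du)            # azimuth in the image plane
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# With a ~0.002, a point at the rim of an 800-pixel-radius circular view
# maps to ~90 deg, roughly matching a 180-deg field of view.
ray = pixel_to_ray(1500.0, 852.0, center=(1136.0, 852.0), a=0.002, b=1e-8)
```

After this mapping, every pixel corresponds to a known optical ray, which is what the subsequent pose estimation and epipolar-geometry steps consume.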

Havlena, M.; Torii, A.; Pajdla, T.

2008-09-01

38

Evaluation system for camera shake and image stabilizers  

Microsoft Academic Search

Various optical image stabilizers for reducing camera shake have been developed by many camera manufacturers; however, it has not been known exactly how effectively they cancel camera shake. We developed a novel system for analyzing the behavior of camera shake and evaluating the effectiveness of image stabilizers quantitatively. It consists of the custom-made LED display for generating a

Kazuki Nishi; Tsubasa Onda

2010-01-01

39

Analysis of camera's images influenced by light variation  

Microsoft Academic Search

Cameras are excellent sensors for robots in unstructured environments such as human living spaces. However, a camera's extreme sensitivity to variation in light illumination restricts the use of robots in human life. In this paper, the color image of a camera influenced by light variation is studied for practical applications as robot sensors in the future. Firstly, the influence on a camera's image data due

Kuo-Yang Tu; Chien-Hua Chen

2007-01-01

40

Camera identification from cropped and scaled images  

NASA Astrophysics Data System (ADS)

In this paper, we extend our camera identification technology based on sensor noise to a more general setting when the image under investigation has been simultaneously cropped and scaled. The sensor fingerprint detection is formulated using hypothesis testing as a two-channel problem and a detector is derived using the generalized likelihood ratio test. A brute force search is proposed to find the scaling factor which is then refined in a detailed search. The cropping parameters are determined from the maximum of the normalized cross-correlation between two signals. The accuracy and limitations of the proposed technique are tested on images that underwent a wide range of cropping and scaling, including images that were acquired by digital zoom. Additionally, we demonstrate that sensor noise can be used as a template to reverse-engineer in-camera geometrical processing as well as recover from later geometrical transformations, thus offering a possible application for re-synchronizing in digital watermark detection.
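The cropping step the abstract describes (locating the crop from the maximum of a normalized cross-correlation between the sensor fingerprint and the image's noise residual) can be sketched on synthetic data. `ncc_peak` is a hypothetical helper written for illustration, and the exhaustive search below ignores the paper's additional brute-force search over scaling factors:

```python
import numpy as np

def ncc_peak(fingerprint, residual):
    """Locate a cropped patch by the maximum of the normalized
    cross-correlation between a full-frame fingerprint and the noise
    residual of the cropped image (illustrative sketch only)."""
    fh, fw = fingerprint.shape
    rh, rw = residual.shape
    r = residual - residual.mean()
    best, best_pos = -np.inf, (0, 0)
    for y in range(fh - rh + 1):          # slide the residual over every
        for x in range(fw - rw + 1):      # possible crop position
            f = fingerprint[y:y+rh, x:x+rw]
            f = f - f.mean()
            denom = np.linalg.norm(f) * np.linalg.norm(r)
            if denom == 0:
                continue
            ncc = float((f * r).sum() / denom)
            if ncc > best:
                best, best_pos = ncc, (y, x)
    return best_pos, best

# Synthetic check: embed a noisy patch of a random "fingerprint" and
# recover its crop offset from the correlation peak.
rng = np.random.default_rng(0)
fp = rng.standard_normal((32, 32))
patch = fp[10:22, 5:17] + 0.1 * rng.standard_normal((12, 12))
pos, score = ncc_peak(fp, patch)
```

In the real detector the residual comes from a denoising filter and the peak is tested against a decision threshold; here the random fingerprint simply makes the true offset stand out.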

Goljan, Miroslav; Fridrich, Jessica

2008-02-01

41

Digital readout for image converter cameras  

NASA Astrophysics Data System (ADS)

There is an increasing need for fast and reliable analysis of recorded sequences from image converter cameras so that experimental information can be readily evaluated without recourse to more time-consuming photographic procedures. A digital readout system has been developed using a randomly triggerable high-resolution CCD camera, the output of which is suitable for use with an IBM AT compatible PC. Within half a second from receipt of a trigger pulse, the frame reformatter displays the image, and transfer to storage media can be readily achieved via the PC and dedicated software. Two software programmes offer different levels of image manipulation, which include enhancement routines and parameter calculations with accuracy down to pixel levels. Hard copy prints can be acquired using a specially adapted Polaroid printer; outputs for laser and video printers extend the overall versatility of the system.

Honour, Joseph

1991-04-01

42

Imaging spectrometer/camera having convex grating  

NASA Technical Reports Server (NTRS)

An imaging spectrometer has fore-optics coupled to a spectral resolving system with an entrance slit extending in a first direction at an imaging location of the fore-optics for receiving the image; a convex diffraction grating for separating the image into a plurality of spectra of predetermined wavelength ranges; a spectrometer array for detecting the spectra; and at least one concave spherical mirror concentric with the diffraction grating for relaying the image from the entrance slit to the diffraction grating and from the diffraction grating to the spectrometer array. In one embodiment, the spectrometer is configured in a lateral mode in which the entrance slit and the spectrometer array are displaced laterally on opposite sides of the diffraction grating in a second direction substantially perpendicular to the first direction. In another embodiment, the spectrometer is combined with a polychromatic imaging camera array disposed adjacent said entrance slit for recording said image.

Reininger, Francis M. (Inventor)

2000-01-01

43

Lroc Observations of Permanently Shadowed Regions: Seeing into the Dark  

NASA Astrophysics Data System (ADS)

Permanently shadowed regions (PSRs) near the lunar poles that receive secondary illumination from nearby Sun facing slopes were imaged by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC). Typically secondary lighting is optimal in polar areas around respective solstices and when the LRO orbit is nearly coincident with the sub-solar point (low spacecraft beta angles). NAC PSR images provide the means to search for evidence of surface frosts and unusual morphologies from ice rich regolith, and aid in planning potential landing sites for future in-situ exploration. Secondary illumination imaging in PSRs requires NAC integration times typically more than ten times greater than nominal imaging. The increased exposure time results in downtrack smear that decreases the spatial resolution of the NAC PSR images. Most long exposure NAC images of PSRs were acquired with exposure times of 24.2-ms (1-m by 40-m pixels, sampled to 20-m) and 12-ms (1-m by 20-m, sampled to 10-m). The initial campaign to acquire long exposure NAC images of PSRs in the north pole region ran from February 2013 to April 2013. Relative to the south polar region, PSRs near the north pole are generally smaller (D<24-km) and located in simple craters. Long exposure NAC images of PSRs in simple craters are often well illuminated by secondary light reflected from Sun-facing crater slopes during the northern summer solstice, allowing many PSRs to be imaged with the shorter exposure time of 12-ms (resampled to 10-m). With the exception of some craters in Peary crater, most northern PSRs with diameters >6-km were successfully imaged (ex. Whipple, Hermite A, and Rozhestvenskiy U). The third PSR south polar campaign began in April 2013 and will continue until October 2013. The third campaign will expand previous NAC coverage of PSRs and follow up on discoveries with new images of higher signal to noise ratio (SNR), higher resolution, and varying secondary illumination conditions. 
Utilizing images from previous campaigns and the position of the Sun, an individualized approach to targeting each crater drives this campaign. Secondary lighting within the PSRs, though somewhat diffuse, is at low incidence angles and, coupled with nadir NAC imaging, results in large phase angles. Such conditions tend to reduce albedo contrasts, complicating identification of patchy frost or ice deposits. Within the long exposure PSR images, a few small craters (D<200-m) with highly reflective ejecta blankets have been identified and interpreted as small fresh impact craters. Sylvester N and Main L are Copernican-age craters with PSRs; NAC images reveal debris flows, boulders, and morphologically fresh interior walls indicative of their young age. The identification of albedo anomalies associated with these fresh craters and debris flows indicates that strong albedo contrasts (~2x) associated with small fresh impact craters can be distinguished in PSRs. Lunar highland material has an albedo of ~0.2, while pure water frost has an albedo of ~0.9. If features in PSRs have an albedo similar to lunar highlands, significant surface frost deposits could result in detectable reflective anomalies in the NAC images. However, no reflective anomalies attributable to frost have thus far been identified in PSRs.

Koeber, S. D.; Robinson, M. S.

2013-12-01

44

X-ray imaging using digital cameras  

NASA Astrophysics Data System (ADS)

The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.
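For a rough feel of the reported resolution limit (MTF falling to 0.2 around 2 lp/mm), one can model the light scattering in the storage phosphor as a Gaussian blur. This is purely an illustrative assumption; the paper reports the CsBr:Eu plate's MTF is actually bimodal, which a single Gaussian cannot capture:

```python
import numpy as np

# For a Gaussian point spread with standard deviation sigma (mm), the MTF is
# MTF(f) = exp(-2 * pi^2 * sigma^2 * f^2), with f in cycles/mm.
sigma = 0.143  # mm, chosen so MTF drops to ~0.2 near 2 lp/mm as reported

def mtf(f, sigma=sigma):
    return np.exp(-2 * np.pi**2 * sigma**2 * f**2)

# Find the spatial frequency where the modeled MTF crosses 0.2.
f = np.linspace(0.0, 4.0, 401)
f20 = f[np.argmin(np.abs(mtf(f) - 0.2))]
```

Under this model a ~0.14 mm scattering blur reproduces the quoted 2 lp/mm figure, illustrating why the optics and sensor pixelation are not the limiting factors.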

Winch, Nicola M.; Edgar, Andrew

2012-03-01

45

Methods for identification of images acquired with digital cameras  

NASA Astrophysics Data System (ADS)

From the court we were asked whether it is possible to determine if an image has been made with a specific digital camera. This question has to be answered in child pornography cases, where evidence is needed that a certain picture has been made with a specific camera. We have looked into different methods of examining the cameras to determine if a specific image has been made with a camera: defects in CCDs, file formats that are used, noise introduced by the pixel arrays and watermarking in images used by the camera manufacturer.

Geradts, Zeno J.; Bijhold, Jurrien; Kieft, Martijn; Kurosawa, Kenji; Kuroki, Kenro; Saitoh, Naoki

2001-02-01

46

Fast Camera Imaging of Hall Thruster Ignition  

SciTech Connect

Hall thrusters provide efficient space propulsion by electrostatic acceleration of ions. Rotating electron clouds in the thruster overcome the space charge limitations of other methods. Images of the thruster startup, taken with a fast camera, reveal a bright ionization period which settles into steady state operation over 50 μs. The cathode introduces azimuthal asymmetry, which persists for about 30 μs into the ignition. Plasma thrusters are used on satellites for repositioning, orbit correction and drag compensation. The advantage of plasma thrusters over conventional chemical thrusters is that the exhaust energies are not limited by chemical energy to about an electron volt. For xenon Hall thrusters, the ion exhaust velocity can be 15-20 km/s, compared to 5 km/s for a typical chemical thruster.
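The closing comparison of exhaust energies can be checked with a short calculation; the 17.5 km/s figure below is simply the midpoint of the quoted 15-20 km/s range:

```python
# Kinetic energy per xenon ion implied by the quoted exhaust velocities.
AMU = 1.66054e-27       # kg per atomic mass unit
EV = 1.60218e-19        # J per electron volt
m_xe = 131.29 * AMU     # xenon atomic mass

def ion_energy_ev(v_mps):
    """Kinetic energy (eV) of one xenon ion at exhaust velocity v (m/s)."""
    return 0.5 * m_xe * v_mps**2 / EV

e_hall = ion_energy_ev(17.5e3)  # mid-range Hall thruster, ~208 eV
e_chem = ion_energy_ev(5e3)     # typical chemical thruster, ~17 eV
```

The two-order-of-magnitude gap between ~208 eV and the roughly electron-volt scale of chemical bonds is the advantage the abstract refers to.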

C.L. Ellison, Y. Raitses and N.J. Fisch

2011-02-24

47

Image responses to x-ray radiation in ICCD camera  

NASA Astrophysics Data System (ADS)

When used in digital radiography, an ICCD camera will inevitably be irradiated by x-rays and the output image will degrade. In this research, we separated the ICCD camera into two opto-electronic parts, the CCD camera and the MCP image intensifier, and irradiated them on a Co-60 gamma-ray source and a pulsed x-ray source, respectively. By changing the timing between the radiation and the CCD camera shutter, and the power-supply state of the MCP image intensifier, significant differences were observed in the output images. Further analysis revealed the influence of the CCD chip and readout circuit in the CCD camera, and of the photocathode, microchannel plate, and fluorescent screen in the MCP image intensifier, on the image quality of an irradiated ICCD camera. The study demonstrated that, compared with the other parts, the irradiation response of the readout circuit is very slight and in most cases negligible. The interaction of x-rays with the CCD chip usually appears as bright spots or a rough background in output images, depending on the x-ray dose. As to the MCP image intensifier, the photocathode and microchannel plate are the two main stages that degrade output images. When irradiated by x-rays, the microchannel plate tends to contribute a bright background in output images; the background caused by the photocathode looks brighter and more fluctuant. Image responses of the fluorescent screen in the MCP image intensifier and of a coupling fiber bundle are also evaluated.

Ma, Jiming; Duan, Baojun; Song, Yan; Song, Guzhou; Han, Changcai; Zhou, Ming; Du, Jiye; Wang, Qunshu; Zhang, Jianqi

2013-08-01

48

Single Image Frequency Measurements Of Camera Lenses  

NASA Astrophysics Data System (ADS)

Image characteristics of 35mm still camera lenses have been measured at the Laboratory of Modern Photography Magazine for more than fifteen years. In an attempt to speed up measurements and quantitative evaluations, a single image frequency was selected for those measurements on a photoelectric contrast-transfer optical bench. Comparisons were made of MTF curves obtained for a group of lenses, and comparisons were also made of subjectively judged photographic color transparencies and enlarged prints of black-and-white negatives taken with these lenses. When the MTFs and the relative rankings were correlated, it was found that an image frequency of 30 lines (or cycles) per millimeter yields good comparative judgements and good discrimination among the lenses tested. Similar correlations were tried for MTF values at 15 and 45 lines per mm. It was observed that better correlation is obtained when picture rankings and measured contrast at 30 lines per mm are used. When combined with resolving power obtained from measurements of fine-grained black-and-white negative film, a consistent and repeatable relative evaluation is achieved.

Sherman, Bennett

1980-09-01

49

New digital camera sensor architecture for low light imaging  

Microsoft Academic Search

In this paper, a novel digital camera sensor architecture to acquire images with high fidelity under low light conditions is proposed. Today, almost all digital cameras have only red, green, and blue color pixels. Due to their limited sensitivity to the wavelengths of visible light, the images captured using these color pixels under low light conditions are noisy. A new

Mrityunjay Kumar; Efrain O. Morales; James E. Adams; Wei Hao

2009-01-01

50

Plenoptic camera image simulation for reconstruction algorithm verification  

NASA Astrophysics Data System (ADS)

Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to lead to a color and exposure level for that pixel. To speed processing three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

Schwiegerling, Jim

2014-09-01

51

New insight into lunar impact melt mobility from the LRO camera  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) is systematically imaging impact melt deposits in and around lunar craters at meter and sub-meter scales. These images reveal that lunar impact melts, although morphologically similar to terrestrial lava flows of similar size, exhibit distinctive features (e.g., erosional channels). Although generated in a single rapid event, the post-impact mobility and morphology of lunar impact melts is surprisingly complex. We present evidence for multi-stage influx of impact melt into flow lobes and crater floor ponds. Our volume and cooling time estimates for the post-emplacement melt movements noted in LROC images suggest that new flows can emerge from melt ponds an extended time period after the impact event.

Bray, Veronica J.; Tornabene, Livio L.; Keszthelyi, Laszlo P.; McEwen, Alfred S.; Hawke, B. Ray; Giguere, Thomas A.; Kattenhorn, Simon A.; Garry, William B.; Rizk, Bashar; Caudill, C.M.; Gaddis, Lisa R.; van der Bogert, Carolyn H.

2010-01-01

52

LCD display screen performance testing for handheld thermal imaging cameras  

NASA Astrophysics Data System (ADS)

Handheld thermal imaging cameras are an important tool for the first responder community. As their use becomes more prevalent, it will become important for a set of standard test metrics to be available to characterize the performance of these cameras. A major factor in the performance of the imagers is the quality of the image on a display screen. An imager may employ any type of display screen, but the results of this paper will focus on those using liquid crystal displays. First responders, especially firefighters, in the field rely on the performance of this screen to relay vital information during critical situations. Current research on thermal imaging camera performance metrics for first responder applications uses trained observer tests or camera composite output signal measurements. Trained observer tests are subjective and composite output tests do not evaluate the performance of the complete imaging system. It is the goal of this work to develop a non-destructive, objective method that tests the performance of the entire thermal imaging camera system, from the infrared sensor to the display screen. Application of existing display screen performance metrics to thermal imaging cameras requires additional consideration. Most display screen test metrics require a well-defined electronic input, with either full black or white pixel input, often encompassing detailed spatial patterns and resolution. Well-characterized thermal inputs must be used to obtain accurate, repeatable, and non-destructive display screen measurements for infrared cameras. For this work, a thermal target is used to correlate the measured camera output with the actual display luminance. A test method was developed to determine display screen luminance. A well-characterized CCD camera and digital recording device were used to determine an electro-optical transfer function for thermal imaging cameras. 
This value directly relates the composite output signal to the luminance of the display screen, providing a realistic characterization of system performance.

Dinaburg, Joshua B.; Amon, Francine; Hamins, Anthony; Boynton, Paul

2006-05-01

53

Measurement of the nonuniformity of first responder thermal imaging cameras  

NASA Astrophysics Data System (ADS)

Police, firefighters, and emergency medical personnel are examples of first responders that are utilizing thermal imaging cameras in a very practical way every day. However, few performance metrics have been developed to assist first responders in evaluating the performance of thermal imaging technology. This paper describes one possible metric for evaluating the nonuniformity of thermal imaging cameras. Several commercially available uncooled focal plane array cameras were examined. Because of proprietary issues, each camera was considered a 'black box'. In these experiments, an extended area black body (18 cm square) was placed very close to the objective lens of the thermal imaging camera. The resultant video output from the camera was digitized at a resolution of 640x480 pixels and a grayscale depth of 10 bits. The nonuniformity was calculated using the standard deviation of the digitized image pixel intensities divided by the mean of those pixel intensities. This procedure was repeated for each camera at several blackbody temperatures in the range from 30 °C to 260 °C. It was observed that the nonuniformity initially increases with temperature, then asymptotically approaches a maximum value. Nonuniformity also enters the calculation of spatial frequency response, providing a noise floor. The testing procedures described herein are being developed as part of a suite of tests to be incorporated into a performance standard covering thermal imaging cameras for first responders.
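The nonuniformity metric defined in the abstract (standard deviation of the digitized pixel intensities divided by their mean) is a one-liner; the synthetic 10-bit frame below is an assumption for illustration, standing in for a camera viewing a uniform blackbody:

```python
import numpy as np

def nonuniformity(frame):
    """Nonuniformity metric from the paper: standard deviation of the
    digitized pixel intensities divided by their mean."""
    frame = np.asarray(frame, dtype=float)
    return frame.std() / frame.mean()

# Synthetic 640x480, 10-bit frame of a uniform blackbody with a small
# fixed-pattern spread around a mid-scale level of 512 counts.
rng = np.random.default_rng(1)
frame = 512 + rng.normal(0.0, 8.0, size=(480, 640))
nu = nonuniformity(frame)
```

For this synthetic frame the metric comes out near 8/512 ≈ 0.016; on real cameras it is this value's growth with blackbody temperature that the paper tracks.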

Lock, Andrew; Amon, Francine

2008-04-01

54

Photorealistic image synthesis and camera validation from 2D images  

NASA Astrophysics Data System (ADS)

This paper presents a new 3D scene reconstruction technique using the Unity 3D game engine. The method presented here allows us to reconstruct the shape of simple and more complex objects from multiple 2D images (infrared and digital images for indoor scenes, digital images only for outdoor scenes) and then add the reconstructed object to a simulated scene created in Unity 3D; these scenes are then validated against real-world scenes. The method used different camera settings and explores different properties in the reconstructions of the scenes, including light, color, texture, shape, and different views. To achieve the highest possible resolution, it was necessary to extract partial textures from visible surfaces. To recover the 3D shapes and depth of simple objects that can be represented by geometric bodies, their geometric characteristics were used. To estimate the depth of more complex objects, the triangulation method was used; for this, the intrinsic and extrinsic parameters were calculated using geometric camera calibration. The methods mentioned above were implemented in Matlab. The technique presented here also lets us simulate short simple videos by reconstructing a sequence of multiple scenes separated by small margins of time. To measure the quality of the reconstructed images and video scenes, the Fast Low Band Model (FLBM) metric from the Video Quality Measurement (VQM) software was used. Low-bandwidth perception-based features include edges and motion.
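The triangulation step mentioned for complex objects can be sketched with the standard linear (DLT) two-view method, which takes the calibrated camera matrices the abstract refers to. The camera matrices and 3D point below are synthetic, not from the paper:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) two-view triangulation: recover a 3D point from its
    projections x1, x2 under 3x4 camera matrices P1, P2. Each image point
    contributes two linear constraints; the point is the SVD null vector."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # de-homogenize

def proj(P, X):
    """Project a 3D point with camera matrix P to normalized image coords."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic check: two cameras separated by a unit baseline along x,
# observing a known point; triangulation should recover it exactly.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.3, -0.2, 4.0])
X_rec = triangulate(P1, P2, proj(P1, X_true), proj(P2, X_true))
```

With noiseless projections the recovery is exact up to floating-point precision; with real detections the same linear system is solved in a least-squares sense.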

Santos Ferrer, Juan C.; González Chévere, David; Manian, Vidya

2014-06-01

55

Camera and visual veiling glare in HDR images  

Microsoft Academic Search

High-dynamic-range (HDR) images are superior to conventional images. The experiments in this paper measure camera and human responses to calibrated HDR test targets. We calibrated a 4.3-log-unit test target, with minimal and maximal glare from a changeable surround. Glare is an uncontrolled spread of an image-dependent fraction of scene luminance in cameras and in the eye. We use this standard

John J. McCann; Alessandro Rizzi

2007-01-01

56

Dual-camera system for high-speed imaging in particle image velocimetry  

E-print Network

Particle image velocimetry is an important technique in experimental fluid mechanics, for which it has been essential to use a specialized high-speed camera. However, the high speed comes at the expense of other aspects of camera performance, i.e., sensitivity and image resolution. Here, we demonstrate that high-speed imaging is also possible with a pair of still cameras.

Hashimoto, K; Hara, T; Onogi, S; Mouri, H

2012-01-01

57

Textline information extraction from grayscale camera-captured document images  

Microsoft Academic Search

Cameras offer flexible document imaging, but with uneven shading and non-planar page shape. Therefore, camera-captured documents need to go through dewarping before being processed by traditional text recognition methods. Curled textline detection is an important step of dewarping. Previous approaches to curled textline detection use binarization as a pre-processing step, which can negatively affect the detection results under

Syed Saqib Bukhari; Thomas M. Breuel; Faisal Shafait

2009-01-01

58

Imaging Emission Spectra with Handheld and Cellphone Cameras  

ERIC Educational Resources Information Center

As point-and-shoot digital camera technology advances it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon…

Sitar, David

2012-01-01

59

Recording Images Using a Simple Pinhole Camera  

NSDL National Science Digital Library

In this lesson, students develop and expand their observational skills and technological understanding by building and operating a pinhole camera. The interdisciplinary connections are in the realm of application in this motivating activity. The lesson pr

Eichinger, John

2009-05-30

60

Anger camera image generation with microcomputers  

E-print Network

With an array of photomultiplier tubes and an electronics package, the location of a scintillation's epicenter may be calculated. An Anger camera, shown in Figure 4, is comprised of a collimator, a scintillation crystal, a series of photomultiplier...

Williams, Karl Morgan

2012-06-07

61

Imaging Emission Spectra with Handheld and Cellphone Cameras  

NASA Astrophysics Data System (ADS)

As point-and-shoot digital camera technology advances it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon point-and-shoot auto focusing camera and two different cellphone cameras: one at 6.1 MP and the other at 5.1 MP.

Sitar, David

2012-12-01

62

Mobile phone camera benchmarking: combination of camera speed and image quality  

NASA Astrophysics Data System (ADS)

When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed; for example, ISO 15781 defines several measurements of various camera system delays. However, the speed metrics of a mobile phone's camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. Firstly, the most important image quality metrics are collected from standards and papers. Secondly, the speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Thirdly, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made through the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The result of this work gives detailed benchmarking results of mobile phone camera systems on the market. The paper also proposes combined benchmarking metrics that include both quality and speed parameters.
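The abstract does not give the proposed combined metric in closed form; the sketch below shows one plausible shape for such a combination, with the metric names, normalization, and weights all being illustrative assumptions rather than the paper's actual proposal.

```python
# Combine normalized quality and speed metrics into one benchmark score.
def combined_score(quality, speed, w_quality=0.6, w_speed=0.4):
    """quality, speed: dicts of metric name -> value normalized to [0, 1],
    where 1 is best (delays such as shutter lag are assumed to be
    inverted and normalized before being passed in)."""
    q = sum(quality.values()) / len(quality)
    s = sum(speed.values()) / len(speed)
    return w_quality * q + w_speed * s

score = combined_score(
    quality={"sharpness": 0.8, "noise": 0.7, "color": 0.9},
    speed={"shutter_lag": 0.6, "shot_to_shot": 0.5},
)
print(round(score, 3))  # 0.7
```

Any real benchmark would also have to justify the weighting, which is exactly the kind of validation against market devices the paper describes.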

Peltoketo, Veli-Tapani

2014-01-01

63

Latest Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Narrow-Angle Camera Data  

NASA Astrophysics Data System (ADS)

This abstract presents the latest research results and quantitative analysis of topographic mapping using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC) conducted at the Ohio State University.

Li, R.; Wang, W.; He, S.; Yan, L.; Meng, X.; Crawford, J.; Robinson, M. S.; Tran, T.; Archinal, B. A.; Lroc Team

2011-03-01

64

Analysis of Melt Flows observed by SIR-2 and LROC NAC: Stevinus A  

NASA Astrophysics Data System (ADS)

Impact melt structures occur in and around young craters from 200 m to hundreds of kilometers in diameter. Impact events lead to melting and vaporization processes. Through the combination of high-resolution LROC NAC images and measurements made by the SIR-2 point spectrometer carried by Chandrayaan-1, compositional aspects can be investigated in a manner hitherto impossible. We will present and discuss as a specific example first results for the small lunar crater Stevinus A.

Mall, U.; Wöhler, C.; Grumpe, A.; Bugiolacchi, R.

2013-09-01

65

Bin mode estimation methods for Compton camera imaging  

NASA Astrophysics Data System (ADS)

We study the image reconstruction problem of a Compton camera which consists of semiconductor detectors. The image reconstruction is formulated as a statistical estimation problem. We employ a bin-mode estimation (BME) and extend an existing framework to a Compton camera with multiple scatterers and absorbers. Two estimation algorithms are proposed: an accelerated EM algorithm for the maximum likelihood estimation (MLE) and a modified EM algorithm for the maximum a posteriori (MAP) estimation. Numerical simulations demonstrate the potential of the proposed methods.
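The maximum likelihood step the abstract mentions is typically computed with the multiplicative MLEM update; here is a minimal sketch on a toy system matrix. The paper's bin-mode estimation, acceleration, and MAP variants are not reproduced, and the system matrix below is synthetic.

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """EM algorithm for Poisson emission data.

    A: (n_bins, n_voxels) system matrix; y: observed counts per bin.
    """
    lam = np.ones(A.shape[1])   # uniform initial image
    sens = A.sum(axis=0)        # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ lam          # expected counts given current image
        ratio = np.divide(y, proj, out=np.zeros_like(y, dtype=float),
                          where=proj > 0)
        lam *= (A.T @ ratio) / sens  # multiplicative EM update
    return lam

A = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])
truth = np.array([4.0, 2.0])
y = A @ truth               # noise-free "measurement"
print(mlem(A, y))           # converges toward ~[4, 2]
```

The update preserves non-negativity automatically, one reason EM is popular for emission tomography.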

Ikeda, S.; Odaka, H.; Uemura, M.; Takahashi, T.; Watanabe, S.; Takeda, S.

2014-10-01

66

Film reciprocity law failure in scintillation camera imaging  

SciTech Connect

Sensitometric measurements of scintillation camera images show that substantial changes in film response occur with variations in imaging time or dot focus. Both the speed and the slope of the film characteristic curve are affected. The phenomenon responsible for the variation in response is referred to as film reciprocity law failure, and is inherent in the image-forming process.

Grossman, L.W.; Van Tuinen, R.J.; Kruger, J.B.; Scholz, K.L.

1981-03-01

67

A time-resolved image sensor for tubeless streak cameras  

NASA Astrophysics Data System (ADS)

This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tubeless streak cameras. Although the conventional streak camera has high time resolution, it requires a high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with an in-pixel logic, allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11 um CMOS image sensor technology. The image array has 30 (vertical) x 128 (memory length) pixels with a pixel pitch of 22.4 um.

Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

2014-03-01

68

Camera Identification from Printed Images Miroslav Goljan*  

E-print Network

A sensor fingerprint, an electronic equivalent of "biometrics for sensors", was recently proposed for identifying digital cameras. There are several important advantages of using this fingerprint for forensic purposes: sensor defects will likely show up in good-quality printouts and thus form a very useful forensic entity.

Fridrich, Jessica

69

Application of the CCD camera in medical imaging  

NASA Astrophysics Data System (ADS)

Medical fluoroscopy is a set of radiological procedures used in medical imaging for functional and dynamic studies of the digestive system. Major components in the imaging chain include an image intensifier that converts x-ray information into an intensity pattern on its output screen and a CCTV camera that converts the output-screen intensity pattern into video information to be displayed on a TV monitor. Responding properly to such a wide dynamic range on a real-time basis, as in a fluoroscopy procedure, is very challenging. Also, as in all other medical imaging studies, detail resolution is of great importance. Without proper contrast, spatial resolution is compromised. The many inherent advantages of the CCD make it a suitable choice for dynamic studies. Recently, CCD cameras have been introduced as the camera of choice for medical fluoroscopy imaging systems. The objective of our project was to investigate a newly installed CCD fluoroscopy system in the areas of contrast resolution, detail, and radiation dose.

Chu, Wei-Kom; Smith, Chuck; Bunting, Ralph; Knoll, Paul; Wobig, Randy; Thacker, Rod

1999-04-01

70

Space solar EUV and x-ray imaging camera  

NASA Astrophysics Data System (ADS)

The space solar EUV and X-ray imaging camera is the core of the solar EUV and X-ray imaging telescope, which is designed to monitor and predict solar activity such as CMEs, flares, and coronal holes. This paper presents a comprehensive description of the camera, from detector selection to electronics system design, and gives some experimental results. The camera adopts a back-illuminated, X-ray-sensitive CCD as its detector, and the support electronics are based on it. The electronics system should be designed according to the CCD structure. This article describes the three modules and the software in the electronics system, and offers possible solutions to some common problems in CCD camera design. There are detailed descriptions of the driver and the signal amplifying and processing modules.

Peng, Ji-Long; Li, Bao-Quan; Wei, Fei; Zhang, Xin; Liu, Xin; Zeng, Zhi-Rong

2008-10-01

71

Analysis of Camera's Images Influenced by Varying Light Illumination for Design of Color Segmentation  

Microsoft Academic Search

Cameras are excellent sensors in robotic systems. However, a camera's extreme sensitivity to illumination variation restricts its capability. In this paper, color image segmentation for a camera influenced by varying light illumination is studied, to extend the camera's capability in the future. The influence on an image captured by a camera under varying light illumination is analyzed. The components

Kuo-yang Tu

2009-01-01

72

Multiple range imaging camera operation with minimal performance impact  

NASA Astrophysics Data System (ADS)

Time-of-flight range imaging cameras operate by illuminating a scene with amplitude-modulated light and measuring the phase shift of the modulation envelope between the emitted and reflected light. Object distance can then be calculated from this phase measurement. This approach does not work in multiple-camera environments, as the measured phase is corrupted by the illumination from other cameras. To minimize inaccuracies in multiple-camera environments, replacing the traditional cyclic modulation with pseudo-noise amplitude modulation has previously been demonstrated. However, this technique effectively reduces the modulation frequency, thereby decreasing the distance measurement precision (which is proportional to the modulation frequency). A new modulation scheme, using maximum-length pseudo-random sequences binary-phase encoded onto the existing cyclic amplitude modulation, is presented. The effective modulation frequency therefore remains unchanged, providing range measurements with high precision. The effectiveness of the new modulation scheme was verified using a custom time-of-flight camera based on the PMD19-K2 range imaging sensor. The new pseudo-noise modulation has no significant performance decrease in a single-camera environment. In a two-camera environment, the precision is only reduced by the increased photon shot noise from the second illumination source.
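The phase-to-distance relation these cameras rely on, and the unambiguous range set by the modulation frequency, can be sketched directly; the 20 MHz figure below is an assumed, typical value rather than a parameter taken from the paper.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad, mod_freq_hz):
    """Distance for a measured phase shift; the light travels out and
    back, hence the factor of 2 folded into 4*pi*f."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

def max_unambiguous_range(mod_freq_hz):
    """Distance at which the phase wraps around (phase = 2*pi)."""
    return C / (2 * mod_freq_hz)

f = 20e6                                        # assumed 20 MHz modulation
print(round(max_unambiguous_range(f), 3))       # 7.495 m
print(round(phase_to_distance(math.pi, f), 3))  # half the range: 3.747 m
```

This also shows why lowering the effective modulation frequency (as plain pseudo-noise modulation does) hurts precision: the same phase-measurement noise maps to a proportionally larger distance error.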

Whyte, Refael Z.; Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.

2010-01-01

73

Fluorescence lifetime imaging microscopy using a streak camera  

NASA Astrophysics Data System (ADS)

We present the development of a fluorescence lifetime imaging microscopy system using a streak camera (SC-FLIM), which uses ultrafast infrared laser for multiphoton excitation and a streak camera for lifetime measurement. A pair of galvo mirrors are employed to accomplish quick time-resolved scanning on a line and 2D fluorescence lifetime imaging. The SC-FLIM system was calibrated using an F-P etalon and several standard fluorescent dyes, and was also used to perform fluorescence lifetime imaging of fluorescent microspheres and a prepared plant stem slide.

Liu, Lixin; Li, Yahui; Sun, Luogeng; Li, Heng; Peng, Xiao; Qu, Junle

2014-02-01

74

ProxiScan™: A Novel Camera for Imaging Prostate Cancer  

SciTech Connect

ProxiScan is a compact gamma camera suited for high-resolution imaging of prostate cancer. Developed by Brookhaven National Laboratory and Hybridyne Imaging Technologies, Inc., ProxiScan won a 2009 R&D 100 Award, sponsored by R&D Magazine to recognize t

Ralph James

2009-10-27

75

Everywhere-in-focus image fusion using controllable cameras  

Microsoft Academic Search

Imaging parameters such as focus strongly influence data quality and the performance of content extraction techniques. Narrow depth of field gives clear focus but only over a short range of depths. This paper shows results from an algorithm that uses computer-controlled focus and pan camera movement in order to obtain a scene image that is a composite which is in

W. Brent Seales; Sandeep Dutta

1996-01-01

76

ProxiScan™: A Novel Camera for Imaging Prostate Cancer  

ScienceCinema

ProxiScan is a compact gamma camera suited for high-resolution imaging of prostate cancer. Developed by Brookhaven National Laboratory and Hybridyne Imaging Technologies, Inc., ProxiScan won a 2009 R&D 100 Award, sponsored by R&D Magazine to recognize t

Ralph James

2010-01-08

77

Hubble Space Telescope Planetary Camera Imaging of HH 29  

Microsoft Academic Search

We present Hubble Space Telescope Halpha and [S II] images of HH 29. The proximity of HH 29 (140 pc) and the high resolution of the Planetary Camera has resulted in the most detailed images obtained so far of any Herbig-Haro object. The most prominent feature is a linear Halpha ridge leading the working surface of a bow shock with

David Devine; John Bally; Bo Reipurth; John Stocke; Jon Morse

2000-01-01

78

Applying image quality in cell phone cameras: lens distortion  

NASA Astrophysics Data System (ADS)

This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism which tries to predict overall image quality from individual image quality attributes, and was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image is JPEG-compressed and the cellphone camera is set to 'auto' mode. As the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberration (LCA). The goal of this paper is to present the framework of this pilot project, starting with the definition of the individual attributes up to their quantification in JNDs of quality, a requirement of the multivariate formalism; therefore, both objective and subjective evaluations were used. A major distinction of the objective part from the 'DSC imaging world' is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior; therefore, a radial mapping/model cannot be used in this case.
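For context, the classic purely radial (Brown) distortion model, the baseline the study found inadequate for cell-phone lenses, can be sketched as follows; the coefficients are illustrative.

```python
# Classic radial (Brown) lens distortion: a point is displaced along the
# ray from the distortion center by a polynomial function of r^2.
def radial_distort(x, y, k1, k2, cx=0.0, cy=0.0):
    """Map an undistorted normalized point to its distorted position."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale

# Barrel distortion (k1 < 0) pulls points toward the center.
xd, yd = radial_distort(1.0, 0.0, k1=-0.1, k2=0.0)
print(round(xd, 6), round(yd, 6))  # 0.9 0.0
```

The paper's point is that phone-camera LGD/LCA generally cannot be captured by this one-parameter-family radial mapping, so a more general spatial model is needed.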

Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje

2009-01-01

79

Image Forensics of Digital Cameras by Analysing Image Variations using Statistical Process Control  

E-print Network

This work presents the novel use of Statistical Process Control (SPC) as a tool for identifying anomalies in digital cameras, to deduce the cause of inconsistency in a device's image acquisition process. This could ultimately lead

Doran, Simon J.

80

Laser speckle imaging using a consumer-grade color camera.  

PubMed

Laser speckle imaging (LSI) is a noninvasive optical imaging technique able to provide wide-field two-dimensional maps of moving particles. Raw laser speckle images are typically taken with a scientific-grade monochrome camera. We demonstrate that a digital single-lens reflex (dSLR) camera with a Bayer filter is able to provide similar sensitivity despite taking information only from a specific pixel color. Here we demonstrate the effect of changing three primary dSLR exposure settings (i.e., aperture, exposure time/shutter speed, and gain/sensitivity (ISO)) on speckle contrast. In addition, we present data from an in vivo reactive hyperemia experiment that demonstrates the qualitative similarity in blood-flow dynamics visualized with a color dSLR and a scientific-grade monochrome camera. PMID:23027244
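Raw speckle frames are reduced to speckle contrast, K = std/mean over a small sliding window, which is the quantity LSI derives flow maps from. The sketch below assumes a single-channel image (a real dSLR pipeline would first extract one Bayer color channel, as the paper does) and uses synthetic data.

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Local speckle contrast K = std/mean over win x win neighborhoods."""
    img = img.astype(float)
    pad = win // 2
    out = np.zeros_like(img)
    for i in range(pad, img.shape[0] - pad):
        for j in range(pad, img.shape[1] - pad):
            w = img[i - pad:i + pad + 1, j - pad:j + pad + 1]
            m = w.mean()
            out[i, j] = w.std() / m if m > 0 else 0.0
    return out

rng = np.random.default_rng(0)
static = np.full((32, 32), 100.0)          # no fluctuation: K = 0
moving = rng.exponential(100.0, (32, 32))  # fully developed speckle: K ~ 1
print(speckle_contrast(static).max())      # 0.0
print(round(speckle_contrast(moving)[16, 16], 2))
```

Lower K indicates more blurring of the speckle pattern within the exposure, i.e. faster-moving scatterers, which is why exposure time is one of the three settings the paper varies.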

Yang, Owen; Choi, Bernard

2012-10-01

81

Laser speckle imaging using a consumer-grade color camera  

PubMed Central

Laser speckle imaging (LSI) is a noninvasive optical imaging technique able to provide wide-field two-dimensional maps of moving particles. Raw laser speckle images are typically taken with a scientific-grade monochrome camera. We demonstrate that a digital single-lens reflex (dSLR) camera with a Bayer filter is able to provide similar sensitivity despite taking information only from a specific pixel color. Here we demonstrate the effect of changing three primary dSLR exposure settings (i.e., aperture, exposure time/shutter speed, and gain/sensitivity (ISO)) on speckle contrast. In addition, we present data from an in vivo reactive hyperemia experiment that demonstrates the qualitative similarity in blood-flow dynamics visualized with a color dSLR and a scientific-grade monochrome camera. PMID:23027244

Yang, Owen; Choi, Bernard

2013-01-01

82

Efficient height measurement method of surveillance camera image.  

PubMed

As surveillance cameras are increasingly installed, their footage is often submitted as evidence of crime, but very little detailed information, such as facial features and clothing, can be obtained due to limited camera performance. Height, however, is relatively unaffected by camera performance. This paper studied a height measurement method using images from a CCTV camera. The height information was obtained via photogrammetry, using reference points in the photographed area and calculating the relationship between 3D space and the 2D image through linear and nonlinear calibration. Using this correlation, this paper proposes a height measurement method that projects a 3D virtual ruler onto the image. This method has been proven to offer more stable values within the range of data convergence than existing methods. PMID:18096339

Lee, Joong; Lee, Eung-Dae; Tark, Hyun-Oh; Hwang, Jin-Woo; Yoon, Do-Young

2008-05-01

83

Mars Global Surveyor Mars Orbiter Camera Image Gallery  

NSDL National Science Digital Library

This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: A narrow angle system that provides grayscale high resolution views of the planet's surface (typically, 1.5 to 12 meters/pixel), and red and blue wide angle cameras that provide daily global weather monitoring, context images to determine where the narrow angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image is provided and instructions regarding gallery usage are also available on the site.

Malin Space Science Systems

84

CMOS Image Sensors: Electronic Camera On A Chip  

NASA Technical Reports Server (NTRS)

Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On- chip analog to digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low cost uses.

Fossum, E. R.

1995-01-01

85

Handshake characterization and image stabilization for cell-phone cameras  

NASA Astrophysics Data System (ADS)

Image blur due to handshake is a significant problem for cell-phone cameras. A set of new handshake characteristics are established using a high-frame-rate image capture and processing system. Based on these newly established handshake characteristics, an efficient, effective and inexpensive method for minimizing image blur due to handshake is proposed. The results of applying the proposed method under different scene conditions are presented.

Safaee-Rad, Reza; Aleksic, Milivoje

2009-01-01

86

High-resolution, low-light, image-intensified CCD camera  

Microsoft Academic Search

The maturing camera industry has been employing an RS-170-formatted image-intensified CCD camera. This left some users with limited image resolution. In a typical RS-170 camera the CCD sensor's resolving pixel density is on the order of 500 x 400 pixels. To surmount this limitation for a particular user, a high-resolution image-intensified CCD camera

Satoru C. Tanaka; Tom Silvey; Greg Long; Bill Braze

1991-01-01

87

Homeland Security – Image Processing for Intelligent Cameras  

Microsoft Academic Search

Image surveillance systems are a demanding, fast-growing domain for TOSA [Thales Optronique Société Anonyme]. To address this domain, and image processing architectures in general, TOSA bets on the strategy of using a reconfigurable multi-purpose architecture to improve performance, re-use, productivity, and reactivity. The MORPHEUS project allows this concept to be realized and demonstrates its capabilities. Among the two phases that compose

Cyrille Batariere

88

Establishing imaging sensor specifications for digital still cameras  

NASA Astrophysics Data System (ADS)

Digital still cameras (DSCs) have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics, sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.

Kriss, Michael A.

2007-02-01

89

Digital Camera Identification from Images - Estimating False Acceptance Probability  

NASA Astrophysics Data System (ADS)

Photo-response non-uniformity noise present in the output signals of CCD and CMOS sensors has been used as a fingerprint to uniquely identify the source digital camera that took an image. The same fingerprint can establish a link between images according to their common source. In this paper, we review the state-of-the-art identification method and discuss its practical issues. In the camera identification task, when formulated as a binary hypothesis test, a decision threshold is set on the correlation between image noise and the modulated fingerprint. The threshold determines the probability of two kinds of possible errors: false acceptance and missed detection. We will focus on estimation of the false acceptance probability, which we wish to keep very low. A straightforward approach involves testing a large number of different camera fingerprints against one image, or one camera fingerprint against many images from different sources. Such sampling of the correlation probability distribution is time consuming and expensive, while extrapolation of the tails of the distribution is still not reliable. A novel approach is based on cross-correlation analysis and the peak-to-correlation-energy ratio.
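The hypothesis test described above can be sketched as a normalized correlation compared against a decision threshold. Real systems correlate denoised residuals and use the peak-to-correlation-energy ratio, both of which are simplified away here; the threshold and data below are synthetic.

```python
import numpy as np

def normalized_correlation(residual, fingerprint):
    """Zero-mean normalized correlation between a noise residual and a
    camera fingerprint estimate."""
    a = residual - residual.mean()
    b = fingerprint - fingerprint.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(1)
fingerprint = rng.normal(0, 1, (128, 128))  # stand-in PRNU estimate
# Residual from the same camera: weak fingerprint plus independent noise.
same_cam = 0.1 * fingerprint + rng.normal(0, 1, (128, 128))
other_cam = rng.normal(0, 1, (128, 128))    # residual from another camera

threshold = 0.05  # illustrative only; real thresholds target a set FAR
print(normalized_correlation(same_cam, fingerprint) > threshold)   # True
print(normalized_correlation(other_cam, fingerprint) > threshold)  # False
```

The false-acceptance probability the paper estimates is exactly the probability that an `other_cam`-style correlation exceeds the chosen threshold.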

Goljan, Miroslav

90

Pixel-level Image Fusion Algorithms for Multi-camera Imaging System  

E-print Network

This thesis presents image fusion technologies in the multi-sensor image fusion system and their applications. With specific focus on pixel-level image fusion, the processing performed after image registration, we develop a graphic user

Abidi, Mongi A.

91

Digital camera with apparatus for authentication of images produced from an image file  

NASA Technical Reports Server (NTRS)

A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even one bit change in the image hash will cause the image hash to be totally different from the secure hash.
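The sign-then-verify flow of the patent can be sketched with textbook RSA over toy primes. A real camera would use a vetted cryptographic library with proper key sizes and padding, so everything below is illustrative only.

```python
import hashlib

def make_keys():
    p, q = 61, 53                # toy primes, far too small for real use
    n, phi = p * q, (p - 1) * (q - 1)
    e = 17
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)        # public key, private key

def image_hash(data, n):
    # Reduce a SHA-256 digest modulo n so the toy RSA can sign it.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data, priv):
    """In-camera step: hash the image file, encrypt the hash with the
    private key embedded in the camera's processor."""
    d, n = priv
    return pow(image_hash(data, n), d, n)

def verify(data, sig, pub):
    """Authentication step: decrypt the signature with the public key
    and compare against a freshly computed hash."""
    e, n = pub
    return pow(sig, e, n) == image_hash(data, n)

pub, priv = make_keys()
image = b"raw image bytes from the sensor"
sig = sign(image, priv)
print(verify(image, sig, pub))                # True
print(verify(image + b"tampered", sig, pub))  # fails: hash has changed
```

As the abstract notes, even a one-bit change to the file yields a completely different hash, so the recomputed hash no longer matches the decrypted signature.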

Friedman, Gary L. (inventor)

1993-01-01

92

Removing Image Artifacts Due to Dirty Camera Lenses and Thin Occluders Columbia University  

E-print Network

The lenses of digital cameras, or the front windows of security cameras, often accumulate various types ... outdoor security cameras, underwater cameras, or covert surveillance behind a fence. Therefore, we

Nayar, Shree K.

93

Image quality testing of assembled IR camera modules  

NASA Astrophysics Data System (ADS)

Infrared (IR) camera modules for the LWIR (8-12 um) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters like effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces, and last but not least the suitability for fully automated measurements in mass production.

Winters, Daniel; Erichsen, Patrik

2013-10-01

94

Mosaicing of Camera-captured Document Images  

E-print Network

Jian Liang is with Amazon.com, Seattle, WA, USA. Daniel DeMenthon and David Doermann are with the University of Maryland, College Park, MD, USA. Preprint submitted to Elsevier, 11 June 2008. ...transformation-like voting method. In the image composition part, our contribution is a sharpness-based

DeMenthon, Daniel

95

Hyperspectral low-light camera for imaging of biological samples  

Microsoft Academic Search

The capability of acquiring hyperspectral information in low light conditions is potentially important for a variety of applications, ranging from remote sensing to biomedical fluorescence imaging. Particularly interesting is its use in optical analysis of biological samples in which the light level should be kept low to prevent tissue damage. For this purpose a low-light hyperspectral camera has been developed

J. Hernandez-Palacios; L. L. Randeberg; I. Baarstad; T. Loke; T. Skauli

2010-01-01

96

Multiexposure and multifocus image fusion with multidimensional camera shake compensation  

NASA Astrophysics Data System (ADS)

Multiexposure image fusion algorithms are used for enhancing the perceptual quality of an image captured by sensors of limited dynamic range. This is achieved by rendering a single scene based on multiple images captured at different exposure times. Similarly, multifocus image fusion is used when the limited depth of focus on a selected focus setting of a camera results in parts of an image being out of focus. The solution adopted is to fuse together a number of multifocus images to create an image that is focused throughout. A single algorithm that can perform both multifocus and multiexposure image fusion is proposed. This algorithm is a new approach in which a set of unregistered multiexposure/focus images is first registered before being fused, to compensate for the possible presence of camera shake. The registration of images is done by identifying matching key-points in constituent images using the scale-invariant feature transform (SIFT). The random sample consensus algorithm is used to identify inliers of SIFT key-points, removing outliers that can cause errors in the registration process. Finally, the coherent point drift algorithm is used to register the images, preparing them to be fused in the subsequent fusion stage. For the fusion of images, a new approach based on an improved version of a wavelet-based contourlet transform is used. The experimental results and the detailed analysis presented prove that the proposed algorithm is capable of producing high-dynamic-range (HDR) or multifocus images by registering and fusing a set of multiexposure or multifocus images taken in the presence of camera shake. Further, comparison of the performance of the proposed algorithm with a number of state-of-the-art algorithms and commercial software packages is provided.
In particular, our literature review has revealed that this is one of the first attempts where the compensation of camera shake, a very likely practical problem that can result in HDR image capture using handheld devices, has been addressed as a part of a multifocus and multiexposure image enhancement system.

Gomez, Alexis Lluis; Saravi, Sara; Edirisinghe, Eran A.

2013-10-01
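The registration stage described in the record above (SIFT key-point matching followed by RANSAC inlier selection) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it assumes key-point matches are already available and fits only a pure-translation shake model; the function name and parameters are illustrative.

```python
import numpy as np

def ransac_translation(src, dst, n_iter=200, thresh=2.0, seed=0):
    """Estimate a pure-translation camera-shake model between matched
    key-points (e.g. SIFT matches), rejecting outliers RANSAC-style.

    src, dst: (N, 2) arrays of matched key-point coordinates.
    Returns the translation vector and a boolean inlier mask.
    """
    rng = np.random.default_rng(seed)
    best_t = np.zeros(2)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        i = rng.integers(len(src))            # minimal sample: one match
        t = dst[i] - src[i]                   # candidate shift
        err = np.linalg.norm(src + t - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_t, best_inliers = t, inliers
    if best_inliers.any():                    # refit on all inliers
        best_t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return best_t, best_inliers
```

A full pipeline would replace the translation model with a homography and pass the inlier set on to a non-rigid registration step such as coherent point drift.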

97

Spatial calibration of full stokes polarization imaging camera  

NASA Astrophysics Data System (ADS)

Objective and background: We present a new method for the calibration of Bossa Nova Technologies' full-Stokes, passive polarization imaging camera SALSA. The SALSA camera is a division-of-time imaging polarimeter. It uses custom-made ferroelectric liquid crystals (FLCs) mounted directly in front of the camera's CCD. The regular calibration process, based on Data Reduction Matrix (DRM) calculation, assumes perfect spatial uniformity of the FLC. However, the alignment of FLC molecules can be disturbed by external constraints such as mechanical stress from the fixture, temperature variations, and humidity. This disarray of the FLC molecular alignment appears as spatial non-uniformity. With typical DRM condition numbers of 2 to 5, the resulting DOLP and DOCP variations over the field of view can reach 10%. Spatial non-uniformity of commercially available FLC products is the limiting factor for achieving reliable performance over the camera's whole field of view. We developed a field calibration technique based on mapping the CCD into areas of interest, then applying the DRM calculations to those individual areas. Results: First, we provide general background on the SALSA camera's technology, its performance, and its limitations. A detailed analysis of commercially available FLCs is presented, particularly the influence of spatial non-uniformity on the Stokes parameters. Then, the new calibration technique is presented. Several configurations and parameters are tested: even division of the CCD into square-shaped regions, the number of regions, and adaptive regions. Finally, the spatial DRM "stitching" process is described, especially for live calculation and display of Stokes parameters.

Vedel, M.; Breugnot, S.; Lechocinski, N.

2014-05-01
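The region-wise calibration idea in the record above (map the CCD into areas of interest, then apply a separate Data Reduction Matrix per area) can be sketched as follows. This is a hedged illustration, not Bossa Nova's code: it assumes the per-region measurement matrices have already been measured, and all names are made up for the example.

```python
import numpy as np

def stokes_per_region(intensities, W_regions, region_map):
    """Recover Stokes vectors pixel-by-pixel using region-wise DRMs.

    intensities: (4, H, W) stack of intensity images, one per FLC state.
    W_regions:   (R, 4, 4) measurement matrices, one per calibration region.
    region_map:  (H, W) integer map assigning each pixel to a region.
    Returns a (4, H, W) array of Stokes parameters (S0, S1, S2, S3).
    """
    H, Wd = region_map.shape
    stokes = np.empty((4, H, Wd))
    for r in range(W_regions.shape[0]):
        # the DRM is the (pseudo-)inverse of the region's measurement matrix
        drm = np.linalg.pinv(W_regions[r])
        mask = region_map == r
        stokes[:, mask] = drm @ intensities[:, mask]
    return stokes
```

Finer region grids trade calibration effort against residual non-uniformity, which is exactly the parameter study the abstract describes.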

98

A compact gamma camera for biological imaging  

SciTech Connect

A compact detector, sized particularly for imaging a mouse, is described. The active area of the detector is approximately 46 mm × 96 mm. Two flat-panel Hamamatsu H8500 position-sensitive photomultiplier tubes (PSPMTs) are coupled to a pixellated NaI(Tl) scintillator which views the animal through a copper-beryllium (CuBe) parallel-hole collimator specially designed for ¹²⁵I. Although the PSPMTs have insensitive areas at their edges and there is a physical gap, corrections for scintillation light collection at the junction between the two tubes result in a uniform response across the entire rectangular area of the detector. The system described has been developed to optimize both sensitivity and resolution for in-vivo imaging of small animals injected with iodinated compounds. We demonstrate an in-vivo application of this detector, particularly to SPECT, by imaging mice injected with approximately 10-15 µCi of ¹²⁵I.

Bradley, E.L.; Cella, J.; Majewski, S.; Popov, V.; Jianguo Qian; Saha, M.S.; Smith, M.F.; Weisenberger, A.G.; Welsh, R.E.

2006-02-01

99

Radiometric cloud imaging with an uncooled microbolometer thermal infrared camera.  

PubMed

An uncooled microbolometer-array thermal infrared camera has been incorporated into a remote sensing system for radiometric sky imaging. The radiometric calibration is validated and improved through direct comparison with spectrally integrated data from the Atmospheric Emitted Radiance Interferometer (AERI). With the improved calibration, the Infrared Cloud Imager (ICI) system routinely obtains sky images with radiometric uncertainty less than 0.5 W/(m² sr) for extended deployments in challenging field environments. We demonstrate the infrared cloud imaging technique with still and time-lapse imagery of clear and cloudy skies, including stratus, cirrus, and wave clouds. PMID:19498585

Shaw, Joseph; Nugent, Paul; Pust, Nathan; Thurairajah, Brentha; Mizutani, Kohei

2005-07-25

100

The TrICE Prototype MAMPT Imaging Camera  

E-print Network

The Track Imaging Cerenkov Experiment (TrICE) is an air Cerenkov prototype telescope designed to use multi-anode photomultipliers to achieve a high angular resolution for measuring cosmic-ray composition at TeV-PeV energies. The TrICE camera, composed of 16 Hamamatsu R8900 16-channel multi-anode photomultiplier tubes, achieves 0.086 degree angular width per pixel over a 1.5 degree wide field of view. We present a description of the TrICE camera design, calibration and performance.

Byrum, K; Drake, G; Hays, E; Kieda, D; Kovács, E; Magill, S; Nodulmann, L; Norhtrop, R; Swordy, S; Wagner, R G; Wakely, S P; Wissel, S A

2007-01-01

101

The TrICE Prototype MAMPT Imaging Camera  

E-print Network

The Track Imaging Cerenkov Experiment (TrICE) is an air Cerenkov prototype telescope designed to use multi-anode photomultipliers to achieve a high angular resolution for measuring cosmic-ray composition at TeV-PeV energies. The TrICE camera, composed of 16 Hamamatsu R8900 16-channel multi-anode photomultiplier tubes, achieves 0.086 degree angular width per pixel over a 1.5 degree wide field of view. We present a description of the TrICE camera design, calibration and performance.

K. Byrum; J. Cunningham; G. Drake; E. Hays; D. Kieda; E. Kovacs; S. Magill; L. Nodulmann; R. Norhtrop; S. Swordy; R. G. Wagner; S. P. Wakely; S. A. Wissel

2007-10-02

102

The TrICE Prototype MAMPT Imaging Camera  

NASA Astrophysics Data System (ADS)

The Track Imaging Cerenkov Experiment (TrICE) is an air Cerenkov telescope designed to use multi-anode photomultipliers to achieve a high angular resolution for measuring cosmic-ray composition at TeV-PeV energies. The TrICE camera, composed of 16 Hamamatsu R8900 16-channel multi-anode photomultiplier tubes, achieves 0.086 degree pixel spacing over a 1.5 degree field of view. We present a description of the TrICE camera design, calibration and performance.

Byrum, K.; Cunningham, J.; Drake, G.; et al.

103

A novel SPECT camera for molecular imaging of the prostate  

NASA Astrophysics Data System (ADS)

The objective of this work is to develop an improved SPECT camera for dedicated prostate imaging. Complementing the recent advancements in agents for molecular prostate imaging, this device has the potential to assist in distinguishing benign from aggressive cancers, to improve site-specific localization of cancer, to improve accuracy of needle-guided prostate biopsy of cancer sites, and to aid in focal therapy procedures such as cryotherapy and radiation. Theoretical calculations show that the spatial resolution/detection sensitivity of the proposed SPECT camera can rival or exceed 3D PET and further signal-to-noise advantage is attained with the better energy resolution of the CZT modules. Based on photon transport simulation studies, the system has a reconstructed spatial resolution of 4.8 mm with a sensitivity of 0.0001. Reconstruction of a simulated prostate distribution demonstrates the focal imaging capability of the system.

Cebula, Alan; Gilland, David; Su, Li-Ming; Wagenaar, Douglas; Bahadori, Amir

2011-10-01

104

Innovative Camera and Image Processing System to Characterize Cryospheric Changes  

NASA Astrophysics Data System (ADS)

The polar regions play an important role in Earth's climatic and geodynamic systems. Digital photogrammetric mapping provides a means for monitoring the dramatic changes observed in the polar regions during the past decades. High-resolution, photogrammetrically processed digital aerial imagery provides complementary information to surface measurements obtained by laser altimetry systems. While laser points accurately sample the ice surface, stereo images allow for the mapping of features, such as crevasses, flow bands, shear margins, moraines, leads, and different types of sea ice. Tracking features in repeat images produces a dense velocity vector field that can either serve as validation for interferometrically derived surface velocities or constitute a stand-alone product. A multi-modal photogrammetric platform consists of one or more high-resolution commercial color cameras, a GPS/inertial navigation system, and an optional laser scanner. Such a system, using a Canon EOS-1DS Mark II camera, was first flown on the IceBridge missions in Fall 2009 and Spring 2010, capturing hundreds of thousands of images at a rate of about one frame per second. While digital images and videos have been used for quite some time for visual inspection, precise 3D measurements with low-cost commercial cameras require special photogrammetric treatment that only became available recently. Calibrating the multi-camera imaging system and geo-referencing the images are absolute prerequisites for all subsequent applications. Commercial cameras are inherently non-metric, that is, their sensor model is only approximately known. Since these cameras are not as rugged as photogrammetric cameras, the interior orientation also changes due to temperature and pressure changes and aircraft vibration, resulting in large errors in 3D measurements. It is therefore necessary to calibrate the cameras frequently, at least whenever the system is newly installed.
Geo-referencing the images is performed by the Applanix navigation system. Our new method enables 3D reconstruction of the ice sheet surface with high accuracy and unprecedented detail, as demonstrated by examples from the Antarctic Peninsula acquired by the IceBridge mission. Repeat digital imaging also provides data for determining surface elevation changes and velocities, which are critical parameters for ice sheet models. Although these methods work well, there are known problems with satellite images and traditional area-based matching, especially over rapidly changing outlet glaciers. To take full advantage of the high-resolution, repeat stereo imaging, we have developed a new method. The processing starts with the generation of a DEM from geo-referenced stereo images of the first time epoch. The next step extracts and matches interest points in object space. Since an interest point moves its spatial position between two time epochs, such points are only radiometrically conjugate, not geometrically. In fact, the geometric displacement of two identical points, together with the time difference, yields velocities. We computed the evolution of the velocity field and surface topography on the floating tongue of the Jakobshavn glacier from historical stereo aerial photographs to illustrate the approach.

Schenk, A.; Csatho, B. M.; Nagarajan, S.

2010-12-01
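The velocity computation described in the record above (geometric displacement of matched interest points divided by the epoch separation) reduces to a few lines. This sketch assumes matched object-space coordinates in metres and uses illustrative names only.

```python
import numpy as np

def surface_velocities(pts_epoch1, pts_epoch2, dt_years):
    """Velocity vectors from matched interest points at two epochs.

    pts_epoch1, pts_epoch2: (N, 2) object-space coordinates (metres) of the
    same surface features, matched between the two acquisitions.
    dt_years: time separation of the epochs in years.
    Returns (N, 2) velocities in metres per year and their speeds.
    """
    disp = pts_epoch2 - pts_epoch1       # geometric displacement per feature
    vel = disp / dt_years
    speed = np.linalg.norm(vel, axis=1)  # scalar flow speed per feature
    return vel, speed
```

The non-trivial work, of course, is upstream: geo-referencing both epochs into a common object space so that the displacements are physical rather than photogrammetric.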

105

Reflectance Calibration Scheme for Airborne Frame Camera Images  

NASA Astrophysics Data System (ADS)

The image quality of photogrammetric images is influenced by various effects from outside the camera. One effect is scattered light from the atmosphere, which lowers contrast in the images and creates a colour shift towards the blue. Another is the changing illumination during the day, which results in changing image brightness within an image block. In addition, there is the so-called bidirectional reflectance of the ground (BRDF effects), which gives rise to a view- and sun-angle-dependent brightness gradient in the image itself. To correct for the first two effects, an atmospheric correction with reflectance calibration is chosen. These effects have been corrected successfully for ADS linescan sensor data by using a parametrization of the atmospheric quantities. Following Kaufman et al., the actual atmospheric condition is estimated from the brightness of a dark pixel taken from the image. The BRDF effects are corrected using a semi-empirical modelling of the brightness gradient. Both methods are now extended to frame cameras. Linescan sensors have a viewing geometry that depends only on the cross-track view zenith angle. For frame cameras, the difference is the inclusion of the extra dimension of the view azimuth into the modelling. Since both the atmospheric correction and the BRDF correction require a model inversion with the help of image data, a different image sampling strategy is necessary which includes the azimuth angle dependence. For the atmospheric correction, a sixth variable is added to the existing five variables (visibility, view zenith angle, sun zenith angle, ground altitude, and flight altitude), thus multiplying the number of modelling input combinations for the offline inversion. The parametrization has to reflect the view azimuth angle dependence. The BRDF model already contains the view azimuth dependence and is combined with a new sampling strategy.

Beisl, U.

2012-07-01

106

ARNICA, the NICMOS 3 imaging camera of TIRGO.  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 µm that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1″ per pixel, with sky coverage of more than 4 min × 4 min on the NICMOS 3 (256×256 pixels, 40 µm side) detector array. The camera is remotely controlled by a PC 486, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the PC 486, acquires and stores the frames, and controls the timing of the array. The camera is intended for imaging of large extra-galactic and Galactic fields; a large effort has been dedicated to exploring the possibility of achieving precise photometric measurements in the J, H, K astronomical bands, with very promising results.

Lisi, F.; Baffa, C.; Hunt, L.; Stanga, R.

107

Camera assembly design proposal for SRF cavity image collection  

SciTech Connect

This project seeks to collect images from the inside of a superconducting radio frequency (SRF) large grain niobium cavity during vertical testing. These images will provide information on multipacting and other phenomena occurring in the SRF cavity during these tests. Multipacting, a process that involves an electron buildup in the cavity and concurrent loss of RF power, is thought to be occurring near the cathode in the SRF structure. Images of electron emission in the structure will help diagnose the source of multipacting in the cavity. Multipacting sources may be eliminated with an alteration of geometric or resonant conditions in the SRF structure. Other phenomena, including unexplained light emissions previously discovered at SLAC, may be present in the cavity. In order to effectively capture images of these events during testing, a camera assembly needs to be installed to the bottom of the RF structure. The SRF assembly operates under extreme environmental conditions: it is kept in a dewar in a bath of 2K liquid helium during these tests, is pumped down to ultra-high vacuum, and is subjected to RF voltages. Because of this, the camera needs to exist as a separate assembly attached to the bottom of the cavity. The design of the camera is constrained by a number of factors that are discussed.

Tuozzolo, S.

2011-10-10

108

Engineering design criteria for an image intensifier/image converter camera  

NASA Technical Reports Server (NTRS)

The design, display, and evaluation of an image intensifier/image converter camera which can be utilized to meet various requirements of space shuttle experiments are described. An image intensifier tube was combined with two brassboard power supplies and evaluated for night photography in the field. Pictures were obtained showing field details which would have been indistinguishable to the naked eye or to an ordinary camera.

Sharpsteen, J. T.; Lund, D. L.; Stoap, L. J.; Solheim, C. D.

1976-01-01

109

VLSI Architecture and FPGA Prototyping of a Digital Camera for Image Security and Authentication  

E-print Network

This work presents a VLSI architecture and an FPGA prototype of a digital camera that provides a security and authentication mechanism for the images it produces. Building on the proposal of the trustworthy digital camera, the secure digital camera integrates a watermarking unit, a compression unit, an encryption unit, and flash memory (Fig. 1).

Mohanty, Saraju P.

110

Development of omni-directional image stabilization system using camera posture information  

Microsoft Academic Search

A teleoperated robot which works in unstructured environments such as disaster sites is mainly controlled based on images obtained from cameras mounted on the robot. Since it runs on rough terrain, the images from the mounted camera are non-steady. These non-steady images make it difficult for the operator to understand the surroundings and can cause camera-induced motion sickness.

Ryo MIYAUCHI; Naoji SHIROMA; Fumitoshi MATSUNO

2007-01-01

111

Image\\/video deblurring using a hybrid camera  

Microsoft Academic Search

We propose a novel approach to reduce spatially varying motion blur using a hybrid camera system that simultaneously captures high-resolution video at a low frame rate together with low-resolution video at a high frame rate. Our work is inspired by Ben-Ezra and Nayar [3], who introduced the hybrid camera idea for correcting global motion blur for a single still image. We broaden the scope of the problem to address

Yu-wing Tai; Hao Du; Michael S. Brown; Stephen Lin

2008-01-01

112

Parallel phase-sensitive three-dimensional imaging camera  

DOEpatents

An apparatus is disclosed for generating a three-dimensional (3-D) image of a scene illuminated by a pulsed light source (e.g. a laser or light-emitting diode). The apparatus, referred to as a phase-sensitive 3-D imaging camera utilizes a two-dimensional (2-D) array of photodetectors to receive light that is reflected or scattered from the scene and processes an electrical output signal from each photodetector in the 2-D array in parallel using multiple modulators, each having inputs of the photodetector output signal and a reference signal, with the reference signal provided to each modulator having a different phase delay. The output from each modulator is provided to a computational unit which can be used to generate intensity and range information for use in generating a 3-D image of the scene. The 3-D camera is capable of generating a 3-D image using a single pulse of light, or alternately can be used to generate subsequent 3-D images with each additional pulse of light.

Smithpeter, Colin L. (Albuquerque, NM); Hoover, Eddie R. (Sandia Park, NM); Pain, Bedabrata (Los Angeles, CA); Hancock, Bruce R. (Altadena, CA); Nellums, Robert O. (Albuquerque, NM)

2007-09-25
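The patent above describes parallel modulators driven by phase-delayed reference signals whose outputs yield per-pixel range. A classic special case is four-bucket demodulation with 0°/90°/180°/270° reference delays; the sketch below illustrates that variant (not necessarily the patented processing), with illustrative names.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def range_from_phases(a0, a90, a180, a270, f_mod):
    """Range and amplitude images from four modulator outputs with
    0/90/180/270 degree reference phase delays (four-bucket demodulation).

    a0..a270: arrays of per-pixel correlation amplitudes.
    f_mod: modulation frequency in Hz; the unambiguous range is C/(2*f_mod).
    """
    phase = np.arctan2(a90 - a270, a0 - a180)   # per-pixel phase shift
    phase = np.mod(phase, 2 * np.pi)            # wrap to [0, 2*pi)
    rng = C * phase / (4 * np.pi * f_mod)       # two-way travel halves range
    amp = 0.5 * np.hypot(a90 - a270, a0 - a180) # modulation amplitude
    return rng, amp
```

Because all pixels are demodulated in parallel, a full range image can in principle be formed from a single light pulse, as the patent claims.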

113

An Efficient Image Compressor for Charge Coupled Devices Camera  

PubMed Central

Recently, discrete wavelet transform- (DWT-) based compressors, such as JPEG2000 and CCSDS-IDC, have been widely seen as the state-of-the-art compression schemes for charge coupled device (CCD) cameras. However, CCD images projected onto the DWT basis produce a large number of large-amplitude high-frequency coefficients, because these images contain complex texture and contour information; this is a disadvantage for the subsequent coding. In this paper, we propose a low-complexity posttransform coupled with compressed sensing (PT-CS) compression approach for remote sensing images. First, the DWT is applied to the remote sensing image. Then, a paired-base posttransform is applied to the DWT coefficients. The base pair consists of the DCT basis and the Hadamard basis, which are used at high and low bit rates, respectively. The best posttransform is selected by an l_p-norm-based approach. The posttransform is considered as the sparse representation stage of CS. The posttransform coefficients are resampled by a sensing measurement matrix. Experimental results on on-board CCD camera images show that the proposed approach significantly outperforms the CCSDS-IDC-based coder, and its performance is comparable to that of JPEG2000 at low bit rates while avoiding the excessive implementation complexity of JPEG2000. PMID:25114977

Li, Jin; Xing, Fei; You, Zheng

2014-01-01
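The l_p-norm base-selection step described in the record above (choose between a DCT and a Hadamard posttransform according to which yields the sparser representation) can be illustrated with a toy NumPy sketch. This omits the DWT stage and the CS resampling, and all names are illustrative.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] *= np.sqrt(1.0 / n)
    M[1:] *= np.sqrt(2.0 / n)
    return M

def hadamard_matrix(n):
    """Orthonormal Sylvester-Hadamard matrix; n must be a power of 2."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(n)

def best_posttransform(block, p=1):
    """Apply both candidate post-transforms to a square coefficient block
    and keep the one with the smaller l_p norm, i.e. the sparser result."""
    n = block.shape[0]
    cands = {"dct": dct_matrix(n), "hadamard": hadamard_matrix(n)}
    best = min(cands,
               key=lambda k: np.sum(np.abs(cands[k] @ block @ cands[k].T) ** p))
    T = cands[best]
    return best, T @ block @ T.T
```

Smooth content concentrates under the DCT, while piecewise-constant (step-like) content concentrates under the Hadamard basis, which is the intuition behind pairing the two bases.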

114

Goal-oriented rectification of camera-based document images.  

PubMed

Document digitization with either flatbed scanners or camera-based systems results in document images which often suffer from warping and perspective distortions that deteriorate the performance of current OCR approaches. In this paper, we present a goal-oriented rectification methodology to compensate for undesirable document image distortions, aiming to improve the OCR result. Our approach relies upon a coarse-to-fine strategy. First, a coarse rectification is accomplished with the aid of a computationally low-cost transformation which addresses the projection of a curved surface to a 2-D rectangular area. The projection of the curved surface onto the plane is guided only by the appearance of the textual content in the document image, while incorporating a transformation which does not depend on specific model primitives or camera setup parameters. Second, pose normalization is applied at the word level, aiming to restore all the local distortions of the document image. Experimental results on various document images with a variety of distortions demonstrate the robustness and effectiveness of the proposed rectification methodology, using a consistent evaluation methodology that takes into account OCR accuracy and a newly introduced measure based on a semi-automatic procedure. PMID:20876019

Stamatopoulos, Nikolaos; Gatos, Basilis; Pratikakis, Ioannis; Perantonis, Stavros J

2011-04-01

115

The Atlases of Vesta derived from Dawn Framing Camera images  

NASA Astrophysics Data System (ADS)

The Dawn Framing Camera acquired about 6,000 clear filter images with a resolution of about 60 m/pixel during its two HAMO (High Altitude Mapping Orbit) phases in 2011 and 2012. We combined these images into a global ortho-rectified mosaic of Vesta (60 m/pixel resolution). Only very small areas near the northern pole were still in darkness and are missing in the mosaic. The Dawn Framing Camera also acquired about 10,000 high-resolution clear filter images (about 20 m/pixel) of Vesta during its Low Altitude Mapping Orbit (LAMO). Unfortunately, the northern part of Vesta was still in darkness during this phase; good illumination (incidence angle < 70°) was available for only 66.8% of the surface [1]. We used the LAMO images to calculate another global mosaic of Vesta, this time with 20 m/pixel resolution. Both global mosaics were used to produce atlases of Vesta: a HAMO atlas with 15 tiles at a scale of 1:500,000 and a LAMO atlas with 30 tiles at a scale between 1:200,000 and 1:225,180. The nomenclature used in these atlases is based on names and places historically associated with the Roman goddess Vesta, and is compliant with the rules of the IAU. 65 names for geological features have already been approved by the IAU; 39 additional names are currently under review. Selected examples of both atlases will be shown in this presentation. Reference: [1] Roatsch, Th., et al., High-resolution Vesta Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images. Planetary and Space Science (2013), http://dx.doi.org/10.1016/j.pss.2013.06.024i

Roatsch, T.; Kersten, E.; Matz, K.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2013-12-01

116

Imaging of Venus from Galileo: Early results and camera performance  

USGS Publications Warehouse

Three images of Venus have been returned so far by the Galileo spacecraft following an encounter with the planet on UT February 10, 1990. The images, taken at effective wavelengths of 4200 and 9900 Å, characterize the global motions and distribution of haze near the Venus cloud tops and, at the latter wavelength, deep within the main cloud. Previously undetected markings are clearly seen in the near-infrared image. The global distribution of these features, which have maximum contrasts of 3%, is different from that recorded at short wavelengths. In particular, the "polar collar," which is omnipresent in short wavelength images, is absent at 9900 Å. The maximum contrast in the features at 4200 Å is about 20%. The optical performance of the camera is described and is judged to be nominal. © 1992.

Belton, M. J. S.; Gierasch, P.; Klaasen, K. P.; Anger, C. D.; Carr, M. H.; Chapman, C. R.; Davies, M. E.; Greeley, R.; Greenberg, R.; Head, J. W.; Neukum, G.; Pilcher, C. B.; Veverka, J.; Fanale, F. P.; Ingersoll, A. P.; Pollock, J. B.; Morrison, D.; Clary, M. C.; Cunningham, W.; Breneman, H.

1992-01-01

117

Imaging of Venus from Galileo - Early results and camera performance  

NASA Technical Reports Server (NTRS)

Three images of Venus have been returned so far by the Galileo spacecraft following an encounter with the planet on UT February 10, 1990. The images, taken at effective wavelengths of 4200 and 9900 A, characterize the global motions and distribution of haze near the Venus cloud tops and, at the latter wavelength, deep within the main cloud. Previously undetected markings are clearly seen in the near-infrared image. The global distribution of these features, which have maximum contrasts of 3 percent, is different from that recorded at short wavelengths. In particular, the 'polar collar', which is omnipresent in short wavelength images, is absent at 9900 A. The maximum contrast in the features at 4200 A is about 20 percent. The optical performance of the camera is described and is judged to be nominal.

Belton, M. J. S.; Gierasch, P.; Klaasen, K. P.; Anger, C. D.; Carr, M. H.; Chapman, C. R.; Davies, M. E.; Greeley, R.; Greenberg, R.; Head, J. W.

1992-01-01

118

CMOS image sensor noise reduction method for image signal processor in digital cameras and camera phones  

NASA Astrophysics Data System (ADS)

Digital images captured from CMOS image sensors suffer from Gaussian noise and impulsive noise. To efficiently reduce the noise in the Image Signal Processor (ISP), we analyze the noise features of the ISP imaging pipeline where the noise reduction algorithm is performed. Gaussian noise reduction and impulsive noise reduction methods are proposed for proper ISP implementation in the Bayer domain. The proposed method takes advantage of the analyzed noise features to calculate noise reduction filter coefficients; thus, noise is adaptively reduced according to the scene environment. Since noise is amplified and its characteristics change while the image sensor signal undergoes several image processing steps, it is better to remove noise at an early stage of the imaging pipeline. Thus, noise reduction is carried out in the Bayer domain on the imaging pipeline of the ISP. The method is tested on the imaging pipeline of the ISP with images captured from a Samsung 2M CMOS image sensor test module. The experimental results show that the proposed method removes noise while effectively preserving edges.

Yoo, Youngjin; Lee, SeongDeok; Choe, Wonhee; Kim, Chang-Yong

2007-02-01
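A Bayer-domain, noise-adaptive filter of the general kind described in the record above can be sketched as follows. This is not Samsung's ISP algorithm: it simply filters each colour plane of an assumed RGGB mosaic separately, with photometric weights scaled by an assumed noise level sigma; all names are illustrative.

```python
import numpy as np

def denoise_bayer(raw, sigma):
    """Noise-adaptive smoothing performed directly in the Bayer (RGGB) domain.

    Each colour plane is filtered separately so that different colour sites
    are never mixed; the photometric weight adapts to the noise level sigma,
    echoing the scene-adaptive filter coefficients described above.
    """
    out = raw.astype(float).copy()
    for dy, dx in ((0, 0), (0, 1), (1, 0), (1, 1)):      # the four Bayer sites
        plane = raw[dy::2, dx::2].astype(float)
        padded = np.pad(plane, 1, mode="edge")
        acc = np.zeros_like(plane)
        wsum = np.zeros_like(plane)
        for oy in (-1, 0, 1):                            # 3x3 same-colour window
            for ox in (-1, 0, 1):
                nb = padded[1 + oy: 1 + oy + plane.shape[0],
                            1 + ox: 1 + ox + plane.shape[1]]
                # weight falls off with photometric distance relative to sigma,
                # so strong edges are preserved while flat noise is averaged
                w = np.exp(-((nb - plane) ** 2) / (2.0 * sigma ** 2))
                acc += w * nb
                wsum += w
        out[dy::2, dx::2] = acc / wsum
    return out
```

Flat regions are averaged toward their mean, while pixels across a strong edge receive near-zero weight and are left untouched.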

119

First experiences with ARNICA, the ARCETRI observatory imaging camera  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a common-use instrument for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4 arcmin x 4 arcmin on the NICMOS 3 (256 x 256 pixels, 40 micrometer side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature of detector and optics is 76 K. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some preliminary considerations on photometric accuracy.

Lisi, F.; Baffa, C.; Hunt, L.; Maiolino, R.; Moriondo, G.; Stanga, R.

1994-03-01

120

Image-intensifier camera studies of shocked metal surfaces  

SciTech Connect

A high-space-resolution image-intensifier camera with luminance gain of up to 5000 and exposure times as short as 30 ns has been applied to the study of the interaction of posts and welds with strongly shocked metal surfaces, which included super-strong steels. The time evolution of a single experiment can be recorded by multiple pulsing of the camera. Phenomena that remain coherent for relatively long durations have been observed. An important feature of the hydrodynamic flow resulting from post-plate interactions is the creation of a wave that propagates outward on the plate; the flow blocks the explosive product gases from escaping through the plate for greater than 10 µs. Electron beam welds were ineffective in blocking product gases from escaping for even short periods of time.

Engelke, R.P.; Thurston, R.S.

1986-01-01

121

Touchless sensor capturing five fingerprint images by one rotating camera  

NASA Astrophysics Data System (ADS)

Conventional touch-based sensors cannot capture the fingerprint images of all five fingers simultaneously due to their flat surfaces, because the view of the thumb is not parallel to the other fingers. In addition, touch-based sensors have inherent difficulties, including variations in captured images due to partial contact, nonlinear distortion, inconsistent image quality, and latent images. These degrade recognition performance and user acceptance of the sensors. To overcome these difficulties, we propose a device that adopts a contact-free structure composed of a charge-coupled device (CCD) camera, a rotating mirror equipped with a stepping motor, and a green light-emitting diode (LED) illuminator. The device does not make contact with any finger and captures all five fingerprint images simultaneously. We describe and discuss the structure of the proposed device in terms of four aspects: the quality of captured images, verification performance, compatibility with existing touch-based sensors, and ease of use. The experimental results show that the proposed device can capture all five fingerprint images with a high throughput (in 2.5 s), as required at the immigration control office of a country. Also, on average, a captured touchless image covers 57% of a whole rolled image, whereas an image captured from a conventional touch-based sensor covers only 41%, and they contain 63 and 40 true minutiae on average, respectively. Even though touchless images contain 13.18-deg rolling and 9.18-deg pitching distortion on average, a 0% equal error rate (EER) is obtained by using five fingerprint images in the verification stage.

Noh, Donghyun; Choi, Heeseung; Kim, Jaihie

2011-11-01

122

The European Photon Imaging Camera on XMM-Newton: The MOS Cameras  

E-print Network

The EPIC focal plane imaging spectrometers on XMM-Newton use CCDs to record the images and spectra of celestial X-ray sources focused by the three X-ray mirrors. There is one camera at the focus of each mirror; two of the cameras contain seven MOS CCDs, while the third uses twelve PN CCDs, defining a circular field of view of 30 arcmin diameter in each case. The CCDs were specially developed for EPIC, and combine high quality imaging with spectral resolution close to the Fano limit. A filter wheel carrying three kinds of X-ray transparent light blocking filter, a fully closed, and a fully open position, is fitted to each EPIC instrument. The CCDs are cooled passively and are under full closed loop thermal control. A radio-active source is fitted for internal calibration. Data are processed on-board to save telemetry by removing cosmic ray tracks, and generating X-ray event files; a variety of different instrument modes are available to increase the dynamic range of the instrument and to enable fast timing. The instruments were calibrated using laboratory X-ray beams, and synchrotron generated monochromatic X-ray beams before launch; in-orbit calibration makes use of a variety of celestial X-ray targets. The current calibration is better than 10% over the entire energy range of 0.2 to 10 keV. All three instruments survived launch and are performing nominally in orbit. In particular full field-of-view coverage is available, all electronic modes work, and the energy resolution is close to pre-launch values. Radiation damage is well within pre-launch predictions and does not yet impact on the energy resolution. The scientific results from EPIC amply fulfil pre-launch expectations.

M. J. L. Turner; A. Abbey; M. Arnaud

2000-11-27

123

MECHANICAL ADVANCING HANDLE THAT SIMPLIFIES MINIRHIZOTRON CAMERA REGISTRATION AND IMAGE COLLECTION  

EPA Science Inventory

Minirhizotrons in conjunction with a minirhizotron video camera system are becoming widely used tools for investigating root production and survival in a variety of ecosystems. Image collection with a minirhizotron camera can be time consuming and tedious, particularly when hundre...

124

ARNICA: the Arcetri Observatory NICMOS3 imaging camera  

NASA Astrophysics Data System (ADS)

ARNICA (ARcetri Near Infrared CAmera) is the imaging camera for the near infrared bands between 1.0 and 2.5 micrometers that Arcetri Observatory has designed and built as a general facility for the TIRGO telescope (1.5 m diameter, f/20) located at Gornergrat (Switzerland). The scale is 1 arcsec per pixel, with sky coverage of more than 4' X 4' on the NICMOS 3 (256 X 256 pixels, 40 micrometers side) detector array. The optical path is compact enough to be enclosed in a 25.4 cm diameter dewar; the working temperature is 76 K. The camera is remotely controlled by a 486 PC, connected to the array control electronics via a fiber-optics link. A C-language package, running under MS-DOS on the 486 PC, acquires and stores the frames, and controls the timing of the array. We give an estimate of performance, in terms of sensitivity with an assigned observing time, along with some details on the main parameters of the NICMOS 3 detector.

Lisi, Franco; Baffa, Carlo; Hunt, Leslie K.

1993-10-01

125

An Introduction for CIPA Standard "Stereo Still Image Format for Digital Cameras"  

NASA Astrophysics Data System (ADS)

The author introduces the standard 'Stereo still Image Format for digital cameras' (SIF), established by the Camera & Imaging Products Association (CIPA), together with its background. The 'body image file' is a JPEG-compressed image file with tags, holding the stereo pair in a side-by-side arrangement, and has the unique extension 'ssi'.

Yoshida, Hideaki

126

Distortion corrections for better character recognition of camera-based document images  

NASA Astrophysics Data System (ADS)

The usage of cellular camera phones and digital cameras is rapidly increasing, but camera imaging applications remain limited by the lack of practical camera imaging technology. In particular, the acquisition environments of camera images differ greatly from those of scanner images: light illumination, viewing distance and viewing angle vary constantly when pictures are taken indoors and outdoors. These variations make it difficult to extract character areas from images through binarization, and the variation of camera viewing angles distorts the images geometrically. In this paper, these problems are discussed in full and methods for resolving them are suggested for better image recognition. Solutions such as adaptive binarization, color conversion, correction of lens distortion and correction of geometrical distortion are discussed, and a sequence of correction processes is suggested for accurate document image recognition. In the experiments, we used various types of document images captured by mobile phone cameras and digital cameras. The distortion-correction results show that our image processing methods are effective in increasing the accuracy of character recognition for camera-based document images.
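
The adaptive binarization step mentioned above can be sketched with a local-mean threshold computed via an integral image; the window size and offset below are illustrative assumptions, not the authors' tuning.

```python
import numpy as np

def adaptive_binarize(gray, win=15, offset=10):
    """Local-mean (Niblack-style, simplified) binarization: a pixel is
    treated as ink if it is darker than its neighbourhood mean minus an
    offset, which tolerates uneven illumination that defeats a single
    global threshold."""
    h, w = gray.shape
    pad = win // 2
    padded = np.pad(gray.astype(float), pad, mode="edge")
    # integral image gives O(1) window sums
    ii = np.pad(padded, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    sums = (ii[win:, win:] - ii[:-win, win:]
            - ii[win:, :-win] + ii[:-win, :-win])
    local_mean = sums[:h, :w] / (win * win)
    return gray < local_mean - offset   # True = character pixel

# uneven illumination: a brightness gradient with two darker "text" dots
img = np.tile(np.linspace(100, 220, 32), (32, 1))
img[10, 5] -= 60
img[20, 25] -= 60
mask = adaptive_binarize(img)
print(mask.sum())   # only the two dark dots survive -> 2
```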

Chung, YunKoo; Jang, DaeGeun; Yu, WonPil; Chi, SooYoung; Kim, KyeKyung; Soh, Jung

2004-10-01

127

Embedded image enhancement for high-throughput cameras  

NASA Astrophysics Data System (ADS)

This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
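
The AWB choice described above, histogram stretching with a color-to-color gain constraint, might look roughly like this sketch; the percentile limits and the gain-ratio cap are assumptions for illustration, not the paper's tuning.

```python
import numpy as np

def awb_histogram_stretch(img, lo_pct=1, hi_pct=99, max_gain_ratio=1.5):
    """White balance by stretching each channel between its low/high
    percentiles, then clamping the per-channel gains so no channel is
    amplified more than `max_gain_ratio` times the weakest gain (the
    color-to-color gain constraint, for robustness in low light)."""
    img = img.astype(float)
    gains = []
    for c in range(3):
        lo, hi = np.percentile(img[..., c], [lo_pct, hi_pct])
        gains.append(255.0 / max(hi - lo, 1e-6))
    gains = np.minimum(np.array(gains), min(gains) * max_gain_ratio)
    out = np.empty_like(img)
    for c in range(3):
        lo = np.percentile(img[..., c], lo_pct)
        out[..., c] = np.clip((img[..., c] - lo) * gains[c], 0, 255)
    return out

# synthetic frame with a weak blue channel (a color cast)
rng = np.random.default_rng(0)
img = rng.uniform(0, 1, (64, 64, 3)) * np.array([200.0, 200.0, 120.0])
balanced = awb_histogram_stretch(img)
print(balanced.max(axis=(0, 1)))   # channels end up on a similar scale
```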

Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

2014-03-01

128

Image Frequency Analysis for Testing of Fire Service Thermal Imaging Cameras  

Microsoft Academic Search

A growing number of first responders are purchasing infrared cameras (thermal imagers). Thermal imagers provide critical information for many firefighting operations: size-up, tracking of fire growth, location of victims and egress routes, overhaul, etc. Although thermal imagers have proven to be useful tools for the fire service over the past 10-15 years, there is no existing comprehensive performance standard for

Francine Amon; Alfred Ducharme

2009-01-01

129

Image deblurring using the direction dependence of camera resolution  

NASA Astrophysics Data System (ADS)

The blurring that occurs in the lens of a camera has a tendency to further degrade in areas away from the on-axis of the image. In addition, the degradation of the blurred image in an off-axis area exhibits directional dependence. Conventional methods have been known to use the Wiener filter or the Richardson-Lucy algorithm to mitigate the problem. These methods use the pre-defined point spread function (PSF) in the restoration process, thereby preventing an increase in the noise elements. However, the nonuniform degradation that depends on the direction is not improved even though the edges are emphasized by these conventional methods. In this paper, we analyze the directional dependence of resolution based on the modeling of an optical system using a blurred image. We propose a novel image deblurring method that employs a reverse filter based on optimizing the directional dependence coefficients of the regularization term in the maximum a posterior probability (MAP) algorithm. We have improved the directional dependence of resolution by optimizing the weight coefficients of the direction in which the resolution is degraded.
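
For reference, the conventional Richardson-Lucy baseline that the paper compares against can be sketched in one dimension; the paper's actual contribution (direction-dependent regularization coefficients in a MAP framework) is not reproduced here.

```python
import numpy as np

def richardson_lucy(observed, psf, iters=200):
    """Plain Richardson-Lucy deconvolution (1-D, 'same' convolution).
    Iteratively multiplies the estimate by the back-projected ratio of
    observed to re-blurred data."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_flip = psf[::-1]
    for _ in range(iters):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flip, mode="same")
    return estimate

# blur a spike with a small box PSF, then recover it
psf = np.ones(3) / 3.0
truth = np.zeros(21)
truth[10] = 9.0
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
# the restored peak is much sharper than the blurred one
print(round(float(blurred[10]), 2), round(float(restored[10]), 2))
```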

Hirai, Yukio; Yoshikawa, Hiroyasu; Shimizu, Masayoshi

2013-03-01

130

X-ray fluorescence camera for biomedical imaging  

NASA Astrophysics Data System (ADS)

X-Ray Fluorescence (XRF) analysis is useful for measuring density distributions of contrast media in vivo. An XRF camera was developed to carry out mapping for iodine-based contrast media used in medical angiography. In this camera, objects are exposed to an x-ray beam formed using a 3.0-mm-diameter lead hole. Cerium K-series characteristic x-rays are absorbed effectively by the iodine media in the objects, and iodine fluorescences are produced from the objects. Iodine Kα fluorescences are selected out using a 58-μm-thick tin filter and are detected by a cadmium telluride (CdTe) detector. Kα rays are discriminated out by a multichannel analyzer (MCA), and the photon number is counted by a counter board (CB). The objects are moved and scanned using an x-y stage driven by a two-stage controller, and the x-ray images obtained by iodine mapping are shown on a personal computer (PC) monitor. In particular, iodine fluorescences were produced from remanent iodine elements in a cancer region of a rabbit ear.

Matsukiyo, Hiroshi; Sato, Eiichi; Abderyim, Purkhet; Osawa, Akihiro; Enomoto, Toshiyuki; Watanabe, Manabu; Nagao, Jiro; Nomiya, Seiichiro; Sato, Koetsu; Ogawa, Akira; Sato, Shigehiro; Ichimaru, Toshio

2008-08-01

131

Distant Supernova Remnant Imaged by Chandra's High Resolution Camera  

NASA Astrophysics Data System (ADS)

The High Resolution Camera (HRC), one of the two X-ray cameras on NASA's Chandra X-ray Observatory, was placed into the focus for the first time on Monday, August 30. The first target was LMC X-1, a point-like source of X rays in the Large Magellanic Cloud. The Large Magellanic Cloud, a companion galaxy to the Milky Way, is 160,000 light years from Earth. After checking the focus with LMC X-1, Chandra observed N132D, a remnant of an exploded star in the Large Magellanic Cloud. "These were preliminary test observations," emphasized Dr. Stephen Murray, of the Harvard-Smithsonian Center for Astrophysics, principal investigator for the High Resolution Camera. "But we are very pleased with the results. All indications are that the HRC will produce X-ray images of unprecedented clarity." The N132D image shows a highly structured remnant, or shell, of 10-million-degree gas that is 80 light years across. Such a shell in the vicinity of the Sun would encompass more than fifty nearby stars. The amount of material in the N132D hot gas remnant is equal to that of 600 suns. The N132D supernova remnant appears to be colliding with a giant molecular cloud, which produces the brightening on the southern rim of the remnant. The molecular cloud, visible with a radio telescope, has the mass of 300,000 suns. The relatively weak x-radiation on the upper left shows that the shock wave is expanding into a less dense region on the edge of the molecular cloud. A number of small circular structures are visible in the central regions and a hint of a large circular loop can be seen in the upper part of the remnant. Whether the peculiar shape of the supernova remnant can be fully explained in terms of these effects, or whether they point to a peculiar cylindrically shaped explosion remains to be seen. "The image is so rich in structure that it will take a while to sort out what is really going on," Murray said.
"It could be multiple supernovas, or absorbing clouds in the vicinity of the supernova." The unique capabilities of the HRC stem from the close match of its imaging capability to the focusing power of the mirrors. When used with the Chandra mirrors, the HRC will make images that reveal detail as small as one-half an arc second. This is equivalent to the ability to read a stop sign at a distance of twelve miles. The checkout period for the HRC will continue for the next few weeks, during which time the team expects to acquire images of other supernova remnants, star clusters, and starburst galaxies. To follow Chandra's progress, visit the Chandra News Web site at: http://chandra.harvard.edu AND http://chandra.nasa.gov NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra X-ray Observatory for NASA's Office of Space Science, NASA Headquarters, Washington, D.C. The Smithsonian Astrophysical Observatory's Chandra X-ray Center in Cambridge, Mass., manages the Chandra science program and controls the observatory for NASA. TRW Space and Electronics Group of Redondo Beach, Calif., leads the contractor team that built Chandra. High resolution digital versions of the X-ray image (300 dpi JPG, TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu/photo/0050/ or via links in: http://chandra.harvard.edu

1999-09-01

132

High-dynamic-range imaging for digital still camera  

NASA Astrophysics Data System (ADS)

The paper presents a collection of methods and algorithms able to deal with the high dynamic range of real pictures acquired by digital engines (e.g. CCD/CMOS cameras). Accurate image acquisition is difficult under challenging lighting conditions. A few techniques are reported that overcome the usual 8 bit-depth representations by using differently exposed pictures and recovering the original radiance values. This allows capturing both shadow and highlight details, fusing the various pictures into a single map, thus providing a more faithful description of what the real world scene was. However, in order to be viewed on a common computer monitor, the map needs to be re-quantized while preserving visibility of details. The main problem comes from the fact that usually the contrast of the radiance values is far greater than that of the display device. Various related techniques are reviewed and discussed.
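
The recover-then-fuse idea can be sketched under one strong simplifying assumption, a linear sensor response (the full Debevec-Malik-style approach also recovers the response curve from the exposures); the hat weighting and the toy scene below are illustrative.

```python
import numpy as np

def fuse_exposures(images, times):
    """Recover relative scene radiance from differently exposed frames,
    assuming a linear sensor: divide each frame by its exposure time and
    average with a hat weight that trusts mid-tones and ignores clipped
    (saturated or underexposed) pixels."""
    acc = np.zeros_like(images[0], dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, times):
        img = img.astype(float)
        w = 1.0 - np.abs(img / 255.0 - 0.5) * 2.0   # 0 at clip points
        acc += w * (img / t)
        wsum += w
    return acc / np.maximum(wsum, 1e-9)

# a bright patch saturates in the long exposure but not the short one
radiance = np.array([[50.0, 3000.0]])
times = [0.1, 0.01]
shots = [np.clip(radiance * t, 0, 255) for t in times]
hdr = fuse_exposures(shots, times)
print(hdr)   # recovers the true radiances 50 and 3000 on one scale
```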

Battiato, Sebastiano; Castorina, Alfio; Mancuso, Massimo

2002-04-01

133

2000-fps digital imager for replacing 16-mm film cameras  

Microsoft Academic Search

For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne weapon testing, range tracking, and other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost-effective solution. Film-based cameras still produce the best resolving capability. However, film

Kris S. Balch

1999-01-01

134

Solid state television camera has no imaging tube  

NASA Technical Reports Server (NTRS)

Camera with characteristics of vidicon camera and greater resolution than home TV receiver uses mosaic of phototransistors. Because of low power and small size, camera has many applications. Mosaics can be used as cathode ray tubes and analog-to-digital converters.

Huggins, C. T.

1972-01-01

135

Football players and ball trajectories projection from single camera's image  

Microsoft Academic Search

In this paper, we propose a method to track multiple players in a football match video captured by a single camera. The camera pans to bring the players and the ball into view, making it possible to record the whole pitch. The players' trajectories in the frames and the camera movement are obtained to estimate the trajectories on the pitch. Moreover,

Hirokatsu Kataoka; Yoshimitsu Aoki

2011-01-01

136

Applying and extending ISO/TC42 digital camera resolution standards to mobile imaging products  

NASA Astrophysics Data System (ADS)

There are no fundamental differences between today's mobile telephone cameras and consumer digital still cameras that suggest many existing ISO imaging performance standards do not apply. To the extent that they have lenses, color filter arrays, detectors, apertures, image processing, and are hand held, there really are no operational or architectural differences. Despite this, there are currently differences in the levels of imaging performance. These are driven by physical and economic constraints, and image-capture conditions. Several ISO standards for resolution, well established for consumer digital cameras, require care when applied to the current generation of cell phone cameras. In particular, accommodation of optical flare, shading non-uniformity and distortion are recommended. We offer proposals for the application of existing ISO imaging resolution performance standards to mobile imaging products, and suggestions for extending performance standards to the characteristic behavior of camera phones.

Williams, Don; Burns, Peter D.

2007-01-01

137

A simple enhancement algorithm for Time-of-Flight camera range images  

Microsoft Academic Search

The proposed algorithm, based on a modified Time of Flight (TOF) camera, computes a correction vector that corresponds to the average effect of the diffuse and flare light. The enhanced range image was obtained by subtracting this vector from the recorded vector image. I. INTRODUCTION The Time of Flight (ToF) camera is a 3D-movie camera type with a potential utilization
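
A scalar simplification of the described correction, estimating the average offset contributed by diffuse and flare light from a signal-free region and subtracting it everywhere, might look like this; the mask-based estimate is an assumption of the sketch, not the paper's exact vector procedure.

```python
import numpy as np

def enhance_range(range_img, signal_free_mask):
    """Estimate the average offset contributed by diffuse/flare light
    from a region known to carry no signal, then subtract it from the
    whole range image (scalar simplification of the correction vector)."""
    correction = range_img[signal_free_mask].mean()
    return range_img - correction

depth = np.array([[1.2, 1.3], [0.2, 0.2]])   # bottom row: signal-free area
mask = np.zeros_like(depth, dtype=bool)
mask[1] = True
corrected = enhance_range(depth, mask)
print(corrected)   # the constant 0.2 offset is removed
```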

D. Falie; L. C. Ciobotaru

2011-01-01

138

LROC WAC 100 Meter Scale Photometrically Normalized Map of the Moon  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) monthly global observations allowed derivation of a robust empirical photometric solution over a broad range of incidence, emission and phase (i, e, g) angles. Combining the WAC stereo-based GLD100 [1] digital terrain model (DTM) and LOLA polar DTMs [2] enabled precise topographic corrections to photometric angles. Over 100,000 WAC observations at 643 nm were calibrated to reflectance (I/F). Photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. The 6-dimensional data set was then reduced to 3 dimensions by photometrically normalizing I/F with a global solution similar to [3]. The global solution was calculated from three 2°x2° tiles centered on (1°N, 147°E), (45°N, 147°E), and (89°N, 147°E), and included over 40 million WAC pixels. A least squares fit to a multivariate polynomial of degree 4 (f(i,e,g)) was performed, and the result was the starting point for a minimum search solving the non-linear function min[(1 - I/F / f(i,e,g))^2]. The input pixels were filtered to incidence angles (calculated from topography) < 89° and I/F greater than a minimum threshold to avoid shadowed pixels, and the output normalized I/F values were gridded into an equal-area map projection at 100 meters/pixel. At each grid location the median, standard deviation, and count of valid pixels were recorded. The normalized reflectance map is the result of the median of all normalized WAC pixels overlapping that specific 100-m grid cell. There are an average of 86 WAC normalized I/F estimates at each cell [3]. The resulting photometrically normalized mosaic provides the means to accurately compare I/F values for different regions on the Moon (see Nuno et al. [4]). The subtle differences in normalized I/F can now be traced across the local topography at regions that are illuminated at any point during the LRO mission (while the WAC was imaging), including at polar latitudes.
This continuous map of reflectance at 643 nm, normalized to a standard geometry of i=30, e=0, g=30, ranges from 0.036 to 0.36 (0.01%-99.99% of the histogram) with a global mean reflectance of 0.115. Immature rays of Copernican craters are typically >0.14 and maria are typically <0.07 with averages for individual maria ranging from 0.046 to 0.060. The materials with the lowest normalized reflectance on the Moon are pyroclastic deposits at Sinus Aestuum (<0.036) and those with the highest normalized reflectance are found on steep crater walls (>0.36)[4]. 1. Scholten et al. (2012) J. Geophys. Res., 117, doi: 10.1029/2011JE003926. 2. Smith et al. (2010), Geophys. Res. Lett., 37, L18204, doi:10.1029/2010GL043751. 3. Boyd et al. (2012) LPSC XLIII, #2795 4. Nuno et al. AGU, (this conference)
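
The least-squares polynomial fit at the heart of the normalization can be sketched on synthetic data; the disk function below is a stand-in, not real WAC I/F, and the angle scaling is an implementation choice of this sketch.

```python
import numpy as np

def poly_features(i, e, g, deg=4):
    """All monomials i^a * e^b * g^c with a + b + c <= deg, mirroring
    the degree-4 multivariate polynomial fit in the abstract."""
    cols = []
    for a in range(deg + 1):
        for b in range(deg + 1 - a):
            for c in range(deg + 1 - a - b):
                cols.append(i**a * e**b * g**c)
    return np.column_stack(cols)

rng = np.random.default_rng(1)
i, e, g = (rng.uniform(0.0, 80.0, 5000) for _ in range(3))

# synthetic stand-in for calibrated I/F (NOT real lunar data): a
# Lommel-Seeliger-like disk function times a phase-angle falloff
ifov = np.cos(np.radians(i)) / (np.cos(np.radians(i)) + np.cos(np.radians(e)))
ifov *= np.exp(-g / 60.0)

# angles scaled to [0, 1] for a well-conditioned least-squares fit
A = poly_features(i / 80.0, e / 80.0, g / 80.0)
coef, *_ = np.linalg.lstsq(A, ifov, rcond=None)
normalized = ifov / (A @ coef)   # ratio close to 1 where the fit is good
print(np.median(normalized))
```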

Boyd, A. K.; Nuno, R. G.; Robinson, M. S.; Denevi, B. W.; Hapke, B. W.

2013-12-01

139

Automatic Borders Detection of Camera Document Images (N. Stamatopoulos, B. Gatos, A. Kesidis)  

E-print Network

A camera document image is often framed by a noisy black border or includes noisy text regions from neighbouring pages. These artifacts are removed from the image captured by a digital camera by automatically detecting the document borders and cutting out noisy black borders as well

Gatos, Basilios

140

Low light imaging system with expanding spectrum band for digital camera  

Microsoft Academic Search

This paper presents a camera system for increased sensitivity, especially in low light conditions. The camera system consists of two parts: a unique image sensor structure which includes special pixels with a broad-band spectrum response (White+Near IR), and an image processing algorithm which fuses color and White+Near IR (WNIR) information to produce high sensitivity

Byung Kwan Park; Sang-Wook Han; Wonhee Choe; JaeGuyn Lim; Youngjin Yoo; SeongDeok Lee

2012-01-01

141

Improving Resolution and Depth-of-Field of Light Field Cameras Using a Hybrid Imaging System  

E-print Network

Current light field (LF) cameras provide low spatial resolution, while conventional cameras provide high-resolution (megapixel) images but no angular information. Figure 1: fundamental resolution trade-off in light-field imaging.

Mellor-Crummey, John

142

Optimisation of noise equivalent count rates for brain and body FDG imaging using gamma camera PET  

Microsoft Academic Search

Describes the use of Noise Equivalent Count rates to optimise the clinical use of a modified dual headed gamma camera for positron emission tomography. Phantoms were used to simulate the distribution and uptake for brain and body 18F-fluorodeoxyglucose imaging, including the presence of activity outside the imaging field of view. Camera count rates were recorded as a function of activity
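
The figure of merit being optimised here is the standard noise equivalent count (NEC) rate; a minimal sketch with invented count rates, assuming the common k = 2 factor for delayed-window randoms subtraction.

```python
def noise_equivalent_count_rate(trues, scatter, randoms, randoms_factor=2.0):
    """Standard NEC figure of merit: NEC = T^2 / (T + S + k*R), where
    T, S, R are true, scattered and random coincidence rates (counts/s)
    and k = 2 for delayed-window randoms subtraction. The rates below
    are invented for illustration."""
    return trues**2 / (trues + scatter + randoms_factor * randoms)

# as activity rises, randoms grow faster than trues, so NEC
# rises to a peak and then falls: 5556, 8511, 7102
for trues, scatter, randoms in [(1e4, 4e3, 2e3),
                                (2e4, 9e3, 9e3),
                                (2.5e4, 1.3e4, 2.5e4)]:
    print(round(noise_equivalent_count_rate(trues, scatter, randoms)))
```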

D. Visvikis; T. Fryer; S. Downey

1998-01-01

143

IMAGE-BASED PAN-TILT CAMERA CONTROL IN A MULTI-CAMERA SURVEILLANCE ENVIRONMENT  

E-print Network

Elgammal and Larry S. Davis, University of Maryland, College Park, Computer Vision Laboratory, UMIACS. Experimental results show the proposed pan-tilt camera control technique to be highly accurate and reliable.

Davis, Larry

144

Gamma camera-mounted anatomical X-ray tomography: technology, system characteristics and first images  

Microsoft Academic Search

Scintigraphic diagnosis, based on functional image interpretation, becomes more accurate and meaningful when supported by corresponding anatomical data. In order to produce anatomical images that are inherently registered with images of emission computerised tomography acquired with a gamma camera, an X-ray transmission system was mounted on the slip-ring gantry of a GEMS Millennium VG gamma camera. The X-ray imaging

Moshe Bocher; Adi Balan; Yodphat Krausz; Yigal Shrem; Albert Lonn; Michael Wilk; Roland Chisin

2000-01-01

145

An improved wide-field camera for imaging Earth's plasmasphere at 30.4 nm  

NASA Astrophysics Data System (ADS)

The Extreme Ultraviolet Imager (IMAGE/EUV) aboard NASA's IMAGE mission studied the distribution of singly ionized helium (He+) in the Earth's plasmasphere by imaging its emission at 30.4 nm. This instrument consisted of three separate camera heads, each with a 28° field-of-view, with 0.6° resolution. We describe an improved imaging system that can simultaneously image a 40° field-of-view with 0.45° resolution utilizing only one compact camera head and detector. This improved imager also increases sensitivity over the heritage EUV imager by a factor of four due to improvements in optical coatings, detector technology, and a larger entrance aperture.

Davis, Michael W.; Gladstone, G. Randall; Goldstein, Jerry; Sandel, Bill R.; Greathouse, Thomas K.; Retherford, Kurt D.; Winters, Gregory S.

2013-09-01

146

Faint Object Camera imaging and spectroscopy of NGC 4151  

NASA Technical Reports Server (NTRS)

We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including (O III) lambda(5007) shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda(1550) and in the optical continuum near 5500 A, show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

1995-01-01

147

Toward an image compression algorithm for the high-resolution electronic still camera  

NASA Technical Reports Server (NTRS)

Taking pictures with a camera that uses a digital recording medium instead of film has the advantage of recording and transmitting images without the use of a darkroom or a courier. However, high-resolution images contain an enormous amount of information and strain data-storage systems. Image compression will allow multiple images to be stored in the High-Resolution Electronic Still Camera. The camera is under development at Johnson Space Center. Fidelity of the reproduced image and compression speed are of paramount importance. Lossless compression algorithms are fast and faithfully reproduce the image, but their compression ratios will be unacceptably low due to noise in the front end of the camera. Future efforts will include exploring methods that will reduce the noise in the image and increase the compression ratio.
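
The claim that front-end noise caps lossless compression ratios can be illustrated with a small experiment using `zlib` (a generic DEFLATE coder standing in for the camera's candidate algorithms); the synthetic data below is invented for illustration.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

# a smooth synthetic "image" signal compresses extremely well...
clean = np.repeat(np.arange(0, 256, 8, dtype=np.uint8), 2048).tobytes()

# ...while adding sensor-like noise destroys the redundancy that
# lossless coders exploit
noisy = (np.frombuffer(clean, dtype=np.uint8).astype(int)
         + rng.integers(-4, 5, size=len(clean)))
noisy = noisy.clip(0, 255).astype(np.uint8).tobytes()

def ratio(raw):
    """Lossless compression ratio: original size over compressed size."""
    return len(raw) / len(zlib.compress(raw, 9))

# the noisy signal's ratio collapses relative to the clean one
print(round(ratio(clean), 1), round(ratio(noisy), 1))
```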

Nerheim, Rosalee

1989-01-01

148

High performance imaging streak camera for the National Ignition Facility  

NASA Astrophysics Data System (ADS)

An x-ray streak camera platform has been characterized and implemented for use at the National Ignition Facility. The camera has been modified to meet the experiment requirements of the National Ignition Campaign and to perform reliably in conditions that produce high electromagnetic interference. A train of temporal ultra-violet timing markers has been added to the diagnostic in order to calibrate the temporal axis of the instrument and the detector efficiency of the streak camera was improved by using a CsI photocathode. The performance of the streak camera has been characterized and is summarized in this paper. The detector efficiency and cathode measurements are also presented.

Opachich, Y. P.; Kalantar, D. H.; MacPhee, A. G.; Holder, J. P.; Kimbrough, J. R.; Bell, P. M.; Bradley, D. K.; Hatch, B.; Brienza-Larsen, G.; Brown, C.; Brown, C. G.; Browning, D.; Charest, M.; Dewald, E. L.; Griffin, M.; Guidry, B.; Haugh, M. J.; Hicks, D. G.; Homoelle, D.; Lee, J. J.; Mackinnon, A. J.; Mead, A.; Palmer, N.; Perfect, B. H.; Ross, J. S.; Silbernagel, C.; Landen, O.

2012-12-01

149

High performance imaging streak camera for the National Ignition Facility.  

PubMed

An x-ray streak camera platform has been characterized and implemented for use at the National Ignition Facility. The camera has been modified to meet the experiment requirements of the National Ignition Campaign and to perform reliably in conditions that produce high electromagnetic interference. A train of temporal ultra-violet timing markers has been added to the diagnostic in order to calibrate the temporal axis of the instrument and the detector efficiency of the streak camera was improved by using a CsI photocathode. The performance of the streak camera has been characterized and is summarized in this paper. The detector efficiency and cathode measurements are also presented. PMID:23278024

Opachich, Y P; Kalantar, D H; MacPhee, A G; Holder, J P; Kimbrough, J R; Bell, P M; Bradley, D K; Hatch, B; Brienza-Larsen, G; Brown, C; Brown, C G; Browning, D; Charest, M; Dewald, E L; Griffin, M; Guidry, B; Haugh, M J; Hicks, D G; Homoelle, D; Lee, J J; Mackinnon, A J; Mead, A; Palmer, N; Perfect, B H; Ross, J S; Silbernagel, C; Landen, O

2012-12-01

150

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods to acquire multispectral images, but a dynamically band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter exchangeable 3CCD camera, modified from a conventional 3CCD camera. The camera consists of an F-mounted lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter exchangeable frame and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and the filter exchangeable frame. Computer simulation is required to visualize the ray paths inside the prism when redesigning the image splitter. The dimensions of the splitter were then determined by computer simulation with options of BK7 glass and non-dichroic coating; these properties were considered so that rays of all wavelengths reach all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter exchangeable frame is designed so that bandpass filters can be swapped without changing the displacement of the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruise on Fuji apples. As a result, the filter exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

151

Fly's Eye camera system: optical imaging using a hexapod platform  

NASA Astrophysics Data System (ADS)

The Fly's Eye Project is a high resolution, high coverage time-domain survey in multiple optical passbands: our goal is to cover the entire visible sky above the 30° horizontal altitude with a cadence of ~3 min. Imaging is going to be performed by 19 wide-field cameras mounted on a hexapod platform resembling a fly's eye. Using a hexapod developed and built by our team allows us to create a highly fault-tolerant instrument that uses the sky as a reference to define its own tracking motion. The virtual axis of the platform is automatically aligned with the Earth's rotational axis; therefore the same mechanics can be used independently from the geographical location of the device. Its enclosure makes it capable of autonomous observing and withstanding harsh environmental conditions. We briefly introduce the electrical, mechanical and optical design concepts of the instrument and summarize our early results, focusing on sidereal tracking. Because the hexapod design makes the construction independent of the actual location, it is considerably easier to build, install and operate a network of such devices around the world.

Jaskó, Attila; Pál, András; Vida, Krisztián; Mészáros, László; Csépány, Gergely; Mező, György

2014-07-01

152

Cloud Detection with the Earth Polychromatic Imaging Camera (EPIC)  

NASA Technical Reports Server (NTRS)

The Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) would provide a unique opportunity for Earth and atmospheric research due not only to its Lagrange point sun-synchronous orbit, but also to the potential for synergistic use of spectral channels in both the UV and visible spectrum. As a prerequisite for most applications, the ability to detect the presence of clouds in a given field of view, known as cloud masking, is of utmost importance. It serves to determine both the potential for cloud contamination in clear-sky applications (e.g., land surface products and aerosol retrievals) and clear-sky contamination in cloud applications (e.g., cloud height and property retrievals). To this end, a preliminary cloud mask algorithm has been developed for EPIC that applies thresholds to reflected UV and visible radiances, as well as to reflected radiance ratios. This algorithm has been tested with simulated EPIC radiances over both land and ocean scenes, with satisfactory results. These test results, as well as algorithm sensitivity to potential instrument uncertainties, will be presented.
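The threshold logic described in the abstract can be sketched as a toy cloud mask; the reflectance and ratio thresholds below are illustrative assumptions, not the operational EPIC values:

```python
import numpy as np

def cloud_mask(r_uv, r_vis, r_thresh=0.3, ratio_thresh=0.8):
    """Toy threshold cloud mask: flag a pixel as cloudy when its visible
    reflectance is high AND the UV/visible ratio is near 1 (clouds are
    spectrally flat; clear land/ocean scenes generally are not).
    All thresholds here are illustrative, not the EPIC algorithm's values."""
    r_uv = np.asarray(r_uv, dtype=float)
    r_vis = np.asarray(r_vis, dtype=float)
    ratio = np.divide(r_uv, r_vis, out=np.zeros_like(r_vis), where=r_vis > 0)
    return (r_vis > r_thresh) & (ratio > ratio_thresh)

# Bright, spectrally flat pixel -> cloudy; dark ocean pixel -> clear
mask = cloud_mask([0.55, 0.04], [0.60, 0.05])
print(mask.tolist())  # -> [True, False]
```

A real mask would use scene-dependent thresholds over land versus ocean, as the abstract notes.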

Meyer, Kerry; Marshak, Alexander; Lyapustin, Alexei; Torres, Omar; Wang, Yujie

2011-01-01

153

The imaging system design of three-line LMCCD mapping camera  

NASA Astrophysics Data System (ADS)

This paper first introduces the theory of the LMCCD (line-matrix CCD) mapping camera and the composition of its imaging system. It then describes several pivotal designs of the imaging system: the focal plane module, video signal processing, the imaging system controller, and synchronous photography between the forward, nadir and backward cameras and the line-matrix CCD of the nadir camera. Finally, test results of the LMCCD mapping camera imaging system are presented. The results are as follows: the synchronization precision between the forward, nadir and backward cameras is better than 4 ns, as is that of the nadir camera's line-matrix CCD; the photography interval of the nadir camera's line-matrix CCD satisfies the buffer requirements of the LMCCD focal plane module; the SNR of each CCD image, tested in the laboratory, is better than 95 under typical working conditions (solar incidence angle of 30°, surface reflectivity of 0.3); and the temperature of the focal plane module is kept under 30 °C over a 15-minute working period. These results satisfy the requirements for synchronous photography, focal plane temperature control and SNR, guaranteeing the precision needed for satellite photogrammetry.

Zhou, Huai-de; Liu, Jin-Guo; Wu, Xing-Xing; Lv, Shi-Liang; Zhao, Ying; Yu, Da

2011-08-01

154

Simple Camera Calibration From a Single Image Using Five Points on Two Orthogonal 1-D Objects  

Microsoft Academic Search

We propose a simple and practical calibration technique that effectively estimates camera parameters from just five points on two orthogonal 1-D objects, each of which has three collinear points, one of which is shared. We derive the basic equations needed to realize camera calibration from just five points observed in a single image that captures the objects. We describe a new

Isao Miyagawa; Hiroyuki Arai; Hideki Koike

2010-01-01

155

Adaptive image feature prediction and control for visual tracking with a hand-eye coordinated camera  

Microsoft Academic Search

An adaptive method for visually tracking a known moving object with a single mobile camera is described. The method differs from previous methods of motion estimation in that both the camera and the object are moving. The objective is to predict the location of features of the object on the image plane based on past observations and past control inputs

J. T. Feddema; C. S. G. Lee

1990-01-01

156

Sensor Fingerprint Digests for Fast Camera Identification from Geometrically Distorted Images  

E-print Network

In camera identification using sensor fingerprint, it is absolutely essential that the fingerprint to a geometrical transformation, fingerprint detection becomes significantly more complicated. Besides

Fridrich, Jessica

157

Development of Wide-Field Imaging Camera for Zodiacal Light Observation  

Microsoft Academic Search

We have developed a wide-field imaging camera system, called WICZO, to monitor the light of the night sky over extended periods. Such monitoring is necessary for studying the morphology of the interplanetary dust cloud and also the time and spatial variations of airglow emission. The system consists of an electric cooler, a CCD camera with ~60% quantum efficiency at 500 nm, and a

S. M. Kwon; S. S. Hong; K. J. Shin

2004-01-01

158

ON-ORBIT STELLAR CAMERA CALIBRATION BASED ON SPACE RESECTION WITH MULTI-IMAGES  

Microsoft Academic Search

Due to intense vibration and/or changes in space environmental factors such as temperature, pressure, etc., the stellar camera parameters may change during satellite launch or while orbiting the Earth, which degrades satellite attitude accuracy. On-orbit calibration is therefore essential for the stellar camera. In this paper, the selection of star image for

Xie Junfeng; Jiang Wanshou; Gong Jianya

159

I IMAGE FORMATION 1 1 Geometric Camera Models 3  

E-print Network

Table of contents excerpt: 1.1.3 Cameras with Lenses; 1.1.4 The Human Eye; 3 Color; 3.1 Human Color Perception; 3.1.1 Color Matching; 3.1.2 Color Receptors.

Forsyth, David

160

Fast visible 2-D camera imaging on NSTX  

NASA Astrophysics Data System (ADS)

Initial applications of 2-D cameras on tokamak devices were mainly used to gain qualitative insight into global dynamics such as plasma formation and motion. Quantitative data were limited, due in large part to most plasma events occurring on time scales far faster than the camera frame rate and to the inability to resolve structures at the characteristic plasma scale lengths. Recent advances in the development of fast cameras have increased frame rates up to megahertz rates, while exposure times as short as 3 ns have been achieved with intensifiers. NSTX supports 8 fast visible cameras with framing rates ranging from 10^3 to 10^6 frames/sec and resolutions from 64 × 64 to 1024 × 1080 pixels at the shortest exposure times. These cameras have been used to study plasma phenomena including edge and divertor turbulence, ELMs, and impurity transport. The most recent applications include studies of supersonic gas jets, pellet ablation, incandescent dust behavior, and measurement of fast ion loss of neutral-beam-injected particles. The capabilities of each camera will be presented along with movies of plasma phenomena recorded with them.

Roquemore, A. L.; Nishino, N.; Maqueda, R.; Bush, C.; Dorf, L.; Shinohara, K.

2005-10-01

161

Imaging Asteroid 4 Vesta Using the Framing Camera  

NASA Technical Reports Server (NTRS)

The Framing Camera (FC) onboard the Dawn spacecraft serves a dual purpose. Next to its central role as a prime science instrument, it is also used for the complex navigation of the ion-drive spacecraft. The CCD detector with 1024 by 1024 pixels provides the stability needed for a multiyear mission and meets the high requirements of photometric accuracy over the wavelength band from 400 to 1000 nm covered by 7 band-pass filters. Vesta will be observed from 3 orbit stages with image scales of 227, 63, and 17 m/px, respectively. The mapping of Vesta's surface at medium resolution will only be completed during the exit phase, when the north pole will be illuminated. A detailed pointing strategy will cover the surface at least twice at similar phase angles to provide stereo views for reconstruction of the topography. During approach, the phase function of Vesta was determined over a range of angles not accessible from Earth. This is the first step in deriving the photometric function of the surface. Combining the topography based on stereo tie points with the photometry in an iterative procedure will disclose details of the surface morphology at considerably smaller scales than the pixel scale. The 7 color filters are well positioned to provide information on the spectral slope in the visible, the depth of the strong pyroxene absorption band, and their variability over the surface. Cross calibration with the VIR spectrometer, which extends into the near IR, will provide detailed maps of Vesta's surface mineralogy and physical properties. Georeferencing all these observations will result in a coherent and unique data set. During Dawn's approach and capture, FC has already demonstrated its performance. The strong variation observed by the Hubble Space Telescope can now be correlated with surface units and features. We will report on results obtained from images taken during survey mode covering the whole illuminated surface.
Vesta is a planet-like differentiated body, but its surface gravity and escape velocity are comparable to those of other asteroids and hence much smaller than those of the inner planets or

Keller, H. Uwe; Nathues, Andreas; Coradini, Angioletta; Jaumann, Ralf; Jorda, Laurent; Li, Jian-Yang; Mittlefehldt, David W.; Mottola, Stefano; Raymond, C. A.; Schroeder, Stefan E.

2011-01-01

162

Effects of environment factors on imaging performance of long focal length space camera  

NASA Astrophysics Data System (ADS)

During development, testing, launch and on-orbit operation, a space camera must withstand shock from external loads and a changing environment. The optical performance of a long-focal-length space camera is largely determined by external mechanical loads and ambient temperature, and the camera's performance results from the interaction of these environment factors. The performance of the optical system should therefore be forecast accurately when a modern optical instrument is designed. In this paper, the research methods are reviewed first, and the related technologies are then described. The methods for analyzing how environmental temperature and structural characteristics affect space camera imaging performance are also discussed.

Guo, Quanfeng; Jin, Guang; Dong, Jihong; Li, Wei; Li, Yanchun; Wang, Haiping; Wang, Kejun; Zhao, Weiguo

2012-10-01

163

A Prediction Method of TV Camera Image for Space Manual-control Rendezvous and Docking  

NASA Astrophysics Data System (ADS)

Space manual-control rendezvous and docking (RVD) is a key technology for accomplishing the RVD mission in manned space engineering, especially when the automatic control system is out of work. The pilot on the chase spacecraft manipulates the hand-stick using the image of the target spacecraft captured by a TV camera. From the TV image, the relative position and attitude of the chase and target spacecraft can be shown. Therefore, the size, position, brightness and shadow of the target on the TV camera are key to guaranteeing the success of manual-control RVD. A method of predicting the on-orbit TV camera image at different relative positions and light conditions during the process of RVD is discussed. Firstly, the basic principle of capturing the image of the cross drone on the target spacecraft by TV camera is analyzed theoretically, based on which the strategy of manual-control RVD is discussed in detail. Secondly, the relationship between the displayed size or position and the real relative distance of the chase and target spacecraft is presented; the brightness and reflection of the target spacecraft at different light conditions are described, and the shadow on the cross drone caused by the chase or target spacecraft is analyzed. Thirdly, a prediction method for on-orbit TV camera images at a certain orbit and light condition is provided, and the characteristics of the TV camera image during RVD are analyzed. Finally, the size, position, brightness and shadow of the target spacecraft on the TV camera image at a typical orbit are simulated. The result, obtained by comparing the simulated images with the real images captured by the TV camera on the Shenzhou manned spaceship, shows that the prediction method is reasonable.

Zhen, Huang; Qing, Yang; Wenrui, Wu

164

NOTE: Measuring the imaged-object distance with a stationary high-spatial-resolution scintillation camera  

NASA Astrophysics Data System (ADS)

A method to measure the detector-to-object distance from the images obtained with stationary high-spatial-resolution gamma-ray cameras for in vivo studies has been developed. It exploits the shift of the imaged object in the image plane, obtained at a certain tilt of the parallel-hole collimator. A linear dependence of the image displacement on the distance to the object has been measured using a high-spatial-resolution scintillation camera employing an yttrium aluminium perovskite (YAP) scintillator. It is shown that the modified YAP camera can be used to obtain three-dimensional information without moving the camera or the object. The method could be applied in scintimammography and radioguided surgery, in lymphoscintigraphy, as well as in the analysis of the biodistribution of radiopharmaceuticals.
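The measured linear dependence of image displacement on object distance can be sketched as a two-parameter calibration that is then inverted; the calibration numbers below are made up for illustration, not measurements from the YAP camera:

```python
import numpy as np

# Illustrative sketch (values invented): with the parallel-hole collimator
# tilted by a fixed angle, the measured image shift grows linearly with the
# detector-to-object distance, shift = a*d + b. Calibrating a and b with
# sources at known distances lets us invert the relation for an unknown object.
cal_distance_mm = np.array([20.0, 40.0, 60.0])   # known source distances
cal_shift_mm = np.array([1.9, 3.9, 5.9])         # measured image shifts
a, b = np.polyfit(cal_distance_mm, cal_shift_mm, 1)  # least-squares line

def distance_from_shift(shift_mm):
    """Invert the calibrated linear model to estimate object distance."""
    return (shift_mm - b) / a

print(round(distance_from_shift(2.9), 1))  # -> 30.0
```

This is the sense in which a single stationary camera can yield three-dimensional information: the shift supplies the depth coordinate missing from the planar image.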

Uzunov, N.; Bello, M.; Boccaccio, P.; Moschini, G.; Bollini, D.; de Notaristefani, F.

2006-05-01

165

Image Intensifier Modules For Use With Commercially Available Solid State Cameras  

NASA Astrophysics Data System (ADS)

A modular approach to design has contributed greatly to the success of the family of machine vision video equipment produced by EG&G Reticon during the past several years. Internal modularity allows high-performance area (matrix) and line scan cameras to be assembled with two or three electronic subassemblies with very low labor costs, and permits camera control and interface circuitry to be realized by assemblages of various modules suiting the needs of specific applications. Product modularity benefits equipment users in several ways. Modular matrix and line scan cameras are available in identical enclosures (Fig. 1), which allows enclosure components to be purchased in volume for economies of scale and allows field replacement or exchange of cameras within a customer-designed system to be easily accomplished. The cameras are optically aligned (boresighted) at final test; modularity permits optical adjustments to be made with the same precise test equipment for all camera varieties. The modular cameras contain two, or sometimes three, hybrid microelectronic packages (Fig. 2). These rugged and reliable "submodules" perform all of the electronic operations internal to the camera except for the job of image acquisition performed by the monolithic image sensor. Heat produced by electrical power dissipation in the electronic modules is conducted through low resistance paths to the camera case by the metal plates, which results in a thermally efficient and environmentally tolerant camera with low manufacturing costs. A modular approach has also been followed in design of the camera control, video processor, and computer interface accessory called the Formatter (Fig. 3). This unit can be attached directly onto either a line scan or matrix modular camera to form a self-contained unit, or connected via a cable to retain the advantages inherent to a small, lightweight, and rugged image sensing component.
Available modules permit the bus-structured Formatter to be configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig. 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating of the Reticon-supplied camera to a customer-supplied intensifier tube for low light level and/or short exposure time situations.

Murphy, Howard; Tyler, Al; Lake, Donald W.

1989-04-01

166

Estimating canopy cover from color digital camera image of rice field  

Microsoft Academic Search

Canopy cover (CC) is a good predictor variable for plant growth parameters such as leaf area index and aboveground biomass. A nondestructive, low-cost, and convenient method is presented for estimating CC using digital camera image analysis. CC was estimated by the ratio of plant pixels to total pixels of a digital camera image of a rice field. To determine the criteria for
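The pixel-ratio estimate can be sketched in a few lines; the green-dominance classification rule below is a common illustrative choice, not the criteria derived in the paper from field data:

```python
import numpy as np

def canopy_cover(rgb):
    """Estimate canopy cover as the fraction of 'plant' pixels.
    A pixel is classified as plant when its green channel dominates both
    red and blue (an excess-green style rule; the actual paper derives its
    classification criteria empirically, so this rule is illustrative)."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    plant = (g > r) & (g > b)
    return plant.mean()   # plant pixels / total pixels

# 2x2 toy image: two green pixels, one soil-brown pixel, one gray pixel
img = [[[40, 120, 30], [60, 140, 50]],
       [[120, 90, 60], [100, 100, 100]]]
print(canopy_cover(img))  # -> 0.5
```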

Kyu-Jong Lee; Byun-Woo Lee

2011-01-01

167

Development of an ultra-violet digital camera for volcanic SO2 imaging  

Microsoft Academic Search

In an effort to improve monitoring of passive volcano degassing, we have constructed and tested a digital camera for quantifying the sulfur dioxide (SO2) content of volcanic plumes. The camera utilizes a bandpass filter to collect photons in the ultra-violet (UV) region where SO2 selectively absorbs UV light. SO2 is quantified by imaging calibration cells of known SO2 concentrations. Images
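The calibration-cell approach can be sketched as a linear absorbance-to-column conversion (a Beer-Lambert style model); all numbers below are invented for illustration and the real instrument's calibration is more involved:

```python
import numpy as np

# Sketch of calibration-cell quantification (illustrative numbers): the UV
# camera records apparent absorbance A = -ln(I / I0); imaging cells of known
# SO2 column amounts gives a linear A-vs-column relation that converts a
# plume absorbance image into SO2 column amounts.
cell_columns = np.array([0.0, 500.0, 1000.0])   # known SO2 columns, ppm·m
cell_absorb = np.array([0.00, 0.11, 0.22])      # measured absorbances
slope, intercept = np.polyfit(cell_absorb, cell_columns, 1)

def so2_column(i_plume, i_background):
    """Convert plume/background intensities to an SO2 column amount."""
    absorbance = -np.log(i_plume / i_background)
    return slope * absorbance + intercept

col = so2_column(np.exp(-0.11), 1.0)   # pixel with absorbance 0.11
print(round(float(col)))  # -> 500
```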

G. J. S. Bluth; J. M. Shannon; I. M. Watson; A. J. Prata; V. J. Realmuto

2007-01-01

169

Optimisation of noise equivalent count rates for brain and body FDG imaging using gamma camera PET  

Microsoft Academic Search

Describes the use of noise-equivalent count (NEC) rates to optimise the clinical use of a modified dual-headed gamma camera for positron emission tomography (PET). Phantoms were used to simulate the distribution and uptake for brain and body 18F-fluorodeoxyglucose (FDG) imaging, including the presence of activity outside the imaging field of view. Camera count rates were recorded as a function of
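The NEC figure of merit used above has a standard definition, NEC = T²/(T + S + kR), with T, S and R the true, scattered and random coincidence rates; a minimal sketch with made-up rates:

```python
def nec_rate(trues, scatter, randoms, randoms_factor=2.0):
    """Noise-equivalent count rate, NEC = T^2 / (T + S + k*R), where T, S
    and R are the true, scattered and random coincidence rates and k is 2
    for online (delayed-window) randoms subtraction, 1 for a smoothed
    randoms estimate. The rates in the example are made up."""
    total = trues + scatter + randoms_factor * randoms
    return trues ** 2 / total

# Example: 10 kcps trues, 4 kcps scatter, 3 kcps randoms
print(nec_rate(10e3, 4e3, 3e3))  # -> 5000.0
```

Optimizing acquisition parameters then amounts to choosing the activity and settings that maximize this quantity, which is how the phantom measurements above are used.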

D. Visvikis; T. Fryer; S. Downey

1999-01-01

170

Image fusion for a digital camera application based on wavelet domain hidden Markov models  

Microsoft Academic Search

Traditional image fusion for digital camera applications may not classify pixels in the source images satisfactorily using statistical techniques. In this paper, we present a technique based on wavelet-domain hidden Markov models (HMMs) and maximum-likelihood estimation. The method presented here consists of deciding the quality of pixels in source images directly from the statistical

S. X. Hu; Z. W. Liao; Y. Y. Tang

2004-01-01

171

Development of Slow Scan Digital CCD Camera for Low light level Image  

Microsoft Academic Search

This paper studies the development of a low-cost, high-resolving-power scientific-grade camera for low-light-level imaging, whose images can be received by a computer. The main performance parameters and the readout driving signal are introduced, and the overall image-acquisition scheme is designed. Using the computer's Expand Parallel Port and the pipelined readout method,

Yaoyu Cheng; Yan Hu; Yonghong Li

172

Low light level imaging using a standard TV camera with a videorecorder  

Microsoft Academic Search

Low light level imaging is possible using a standard TV camera with a video recorder, if the integration properties of the vidicon tubes or CCDs are exploited. Only a simple additional electronic circuit is necessary. The field of application is for static or slowly variable images, the integration time being about equal to the sampling period. Analogue image display and

G. Zanella; M. Vascon; A. Cavestro

1983-01-01

173

BlurBurst: Removing Blur Due to Camera Shake using Multiple Images  

E-print Network

Keywords and Phrases: Deblurring, Multiple images, Telephoto imaging, Low-light photography, HDR imaging. … telephoto and low-light photography -- where camera shake produces a blur kernel that is large enough … BlurBurst in telephoto and low-light photography and highlight broader uses in hand-held high dynamic-range (HDR

174

Mosaicing of acoustic camera images K. Kim, N. Neretti and N. Intrator  

E-print Network

environments with low visibility. For optical images in those environments, often natural light, inhomogeneous illumination and low frame rate is presented. Imaging geometry of acoustic cameras. It follows that the SNR is significantly lower than in optical images. (iii) Low resolution with respect

Intrator, Nathan

175

Investigation of noise equivalent count rate in positron imaging using a dual head gamma camera  

Microsoft Academic Search

In positron imaging, image quality depends on the scatter, random, and coincidence rates. It is known that the noise equivalent count (NEC) rate is a good indicator of image quality and that it helps optimize acquisition parameters. We measured the NEC curve with a SophyCamera DST (SMV) using a 20 cm cylinder phantom filled with 18F (59 MBq). The field of view

D. Brasse; M. Tararine; O. Lamer; B. Bendriem

1998-01-01

176

A high speed camera system based on an image sensor in standard CMOS technology  

Microsoft Academic Search

In this contribution a novel camera system developed for high speed imaging will be presented. The core of the system consists of a CMOS image sensor manufactured in a 1 µm standard CMOS process. The special merit of the image sensor is the capability to acquire more than 1000 frames/s using a global electronic shutter in each sensor cell. The

Nenad Stevanovic; Matthias Hillebrand; Bedrich J. Hosticka; Uri Iurgel; Andreas Teuner

1999-01-01

177

An image compression algorithm for a high-resolution digital still camera  

NASA Technical Reports Server (NTRS)

The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to Earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.

Nerheim, Rosalee

1989-01-01

178

Be Foil "Filter Knee Imaging" NSTX Plasma with Fast Soft X-ray Camera  

SciTech Connect

A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

2005-08-08

179

LROC NAC Photometry as a Tool for Studying Physical and Compositional Properties of the Lunar Surface  

NASA Astrophysics Data System (ADS)

LROC NAC photometry has been used to study the effects of rocket exhaust on lunar soil properties, and here we apply the same photometric methods to place compositional constraints on regions of silicic volcanism and pure anorthosite on the Moon.

Clegg, R. N.; Jolliff, B. L.; Boyd, A. K.; Stopar, J. D.; Sato, H.; Robinson, M. S.; Hapke, B. W.

2014-10-01

180

Temperature resolution enhancing of commercially available THz passive cameras due to computer processing of images  

NASA Astrophysics Data System (ADS)

As is well known, passive THz cameras are very promising for security applications: they allow a concealed object to be seen without contact and pose no danger to the person being screened. The efficiency of a passive THz camera depends on its temperature resolution, which determines what can be detected: the minimal size of the object, the maximal detection distance, and the image detail. One possible way to enhance image quality is computer processing of the image. By computer processing of THz images of objects concealed on the human body, the images can be improved many times over; consequently, the instrumental resolution of such a device can be increased without any additional engineering effort. We demonstrate new possibilities for seeing clothing details that the raw images produced by THz cameras do not reveal. We achieve good image quality by applying various spatial filters, with the aim of demonstrating that the processed images are independent of the particular mathematical operations used. This result demonstrates the feasibility of seeing such objects. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China).
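A generic spatial filter of the kind mentioned above can be sketched with a median filter; the abstract does not specify which filters were actually applied, so this particular choice is only illustrative:

```python
import numpy as np

def median_filter(img, k=3):
    """Simple k x k median filter (edges handled by reflection padding).
    One of the generic spatial filters that can suppress noise in a
    low-temperature-resolution image; illustrative, not the filters
    actually used on the THz camera images."""
    img = np.asarray(img, dtype=float)
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# A single hot-pixel outlier in a flat scene is removed by the median
noisy = np.full((5, 5), 10.0)
noisy[2, 2] = 100.0
print(median_filter(noisy)[2, 2])  # -> 10.0
```

The claim in the abstract is that results are similar across several such filters, supporting the conclusion that the enhancement is not an artifact of one particular operation.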

Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

2014-06-01

181

Joint estimation of high resolution images and depth maps from light field cameras  

NASA Astrophysics Data System (ADS)

Light field cameras are attracting much attention as tools for acquiring 3D information of a scene through a single camera. The main drawback of typical lenslet-based light field cameras is the limited resolution. This limitation comes from the structure where a microlens array is inserted between the sensor and the main lens. The microlens array projects the 4D light field onto a single 2D image sensor at the sacrifice of resolution; the angular resolution and the position resolution trade off under the fixed resolution of the image sensor. This fundamental trade-off remains after the raw light field image is converted to a set of sub-aperture images. The purpose of our study is to estimate a higher resolution image from low resolution sub-aperture images using a framework of super-resolution reconstruction. In this reconstruction, the sub-aperture images should be registered as accurately as possible. This registration is equivalent to depth estimation. Therefore, we propose a method where super-resolution and depth refinement are performed alternately. Most of the process of our method is implemented by image processing operations. We present several experimental results using a Lytro camera, where we increased the resolution of a sub-aperture image by three times horizontally and vertically. Our method can produce clearer images compared to the original sub-aperture images and the case without depth refinement.
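The alternation between registration (the depth step) and fusion (the super-resolution step) can be sketched in 1-D with integer shifts; this toy skeleton is a simplification under stated assumptions, not the paper's algorithm, which works on 2-D sub-aperture images with sub-pixel registration:

```python
import numpy as np

# Toy alternating scheme: register sub-aperture views by an estimated shift
# (standing in for depth estimation), fuse the aligned views (standing in
# for super-resolution reconstruction, reduced here to denoising fusion),
# and repeat.

def estimate_shift(ref, view, max_shift=3):
    """Depth step: pick the integer shift best aligning view to ref."""
    errs = [np.sum((np.roll(view, s) - ref) ** 2)
            for s in range(-max_shift, max_shift + 1)]
    return int(np.argmin(errs)) - max_shift

def fuse(views, shifts):
    """Fusion step: average the views after undoing their shifts."""
    return np.mean([np.roll(v, s) for v, s in zip(views, shifts)], axis=0)

ref = np.array([0., 0., 1., 2., 1., 0., 0.])
views = [ref, np.roll(ref, 1), np.roll(ref, -2)]   # shifted copies

estimate = ref.copy()
for _ in range(2):                                  # alternate the two steps
    shifts = [estimate_shift(estimate, v) for v in views]
    estimate = fuse(views, shifts)

print(shifts)  # -> [0, -1, 2]
```

In the real method the fusion step increases sampling density rather than merely averaging, and the registration is refined at sub-pixel accuracy, but the alternation structure is the same.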

Ohashi, Kazuki; Takahashi, Keita; Fujii, Toshiaki

2014-03-01

182

UCXp camera imaging principle and key technologies of data post-processing  

NASA Astrophysics Data System (ADS)

The large-format digital aerial camera product UCXp was introduced into the Chinese market in 2008; its image consists of 17310 columns and 11310 rows with a pixel size of 6 µm. The UCXp camera has many advantages over cameras of the same generation, with multiple lenses exposed almost at the same time and no oblique lens. The camera has a complex imaging process, whose principle is detailed in this paper. The UCXp image post-processing method, including data pre-processing and orthophoto production, is also emphasized. Based on data from the new Beichuan County, this paper describes the data processing and its effects.

Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

2014-03-01

183

DEFINITION OF AIRWAY COMPOSITION WITHIN GAMMA CAMERA IMAGES  

EPA Science Inventory

The efficacies on inhaled pharmacologic drugs in the prophylaxis and treatment if airway diseases could be improved if particles were selectively directed to appropriate Sites. n the medical arena, planar gamma scintillation cameras may be employed to study factors affecting such...

184

Color enhancement of image of earth photographed by UV camera  

NASA Technical Reports Server (NTRS)

A color enhancement of an ultraviolet photograph of the geocorona, a halo of low density hydrogen around the Earth. Sunlight is shining from the left, and the geocorona is brighter on that side. The UV camera was operated by Astronaut John W. Young on the Apollo 16 lunar landing mission.

1972-01-01

185

A Low-Cost Imaging Method to Avoid Hand Shake Blur for Cell Phone Cameras  

NASA Astrophysics Data System (ADS)

In this letter, a novel imaging method to reduce the hand shake blur of a cell phone camera without using frame memory is proposed. The method improves the captured image in real time through the use of two additional preview images whose parameters can be calculated in advance and stored in a look-up table. The method does not require frame memory, and thus it can significantly reduce the chip size. The scheme is suitable for integration into a low-cost image sensor of a cell phone camera.

Luo, Lin-Bo; Chong, Jong-Wha

186

Suite of proposed imaging performance metrics and test methods for fire service thermal imaging cameras  

NASA Astrophysics Data System (ADS)

The use of thermal imaging cameras (TIC) by the fire service is increasing as fire fighters become more aware of the value of these tools. The National Fire Protection Association (NFPA) is currently developing a consensus standard for design and performance requirements for TIC as used by the fire service. This standard will include performance requirements for TIC design robustness and image quality. The National Institute of Standards and Technology facilitates this process by providing recommendations for science-based performance metrics and test methods to the NFPA technical committee charged with the development of this standard. A suite of imaging performance metrics and test methods based on the harsh operating environment and limitations of use particular to the fire service has been proposed for inclusion in the standard. The performance metrics include large area contrast, effective temperature range, spatial resolution, nonuniformity, and thermal sensitivity. Test methods to measure TIC performance for these metrics are in various stages of development. An additional procedure, image recognition, has also been developed to facilitate the evaluation of TIC design robustness. The pass/fail criteria for each of these imaging performance metrics are derived from perception tests in which image contrast, brightness, noise, and spatial resolution are degraded to the point that users can no longer consistently perform tasks involving TIC due to poor image quality.

Amon, Francine; Lock, Andrew; Bryner, Nelson

2008-04-01

187

New 1000x1000 frame transfer CCD camera for high-speed high-fidelity digital imaging  

NASA Astrophysics Data System (ADS)

DALSA's new CA-D4-1024 camera is the state-of-the-art in high-resolution, high-speed, high-fidelity CCD area-array image capture camera technology. Its features marry the best of both high-frame-rate cameras and slow-scan scientific cameras. These include ultra-low dark current at room-temperature operation, low noise, high signal capacity, high fill factor, electronic shuttering, low smear, anti-blooming, 1000 × 1000 resolution, progressive scan, user-controlled frame rates up to 40 f.p.s., in-camera digitization, compact camera size, low camera power dissipation, and 8- to 12-bit video digitization options.

Litwiller, David J.

1997-04-01

188

X-ray imaging using a consumer-grade digital camera  

NASA Astrophysics Data System (ADS)

The recent advancements in consumer-grade digital camera technology and the introduction of high-resolution, high-sensitivity CsBr:Eu²⁺ storage phosphor imaging plates make possible a new cost-effective technique for X-ray imaging. The imaging plate is bathed with red stimulating light by high-intensity light-emitting diodes, and the photostimulated image is captured with a digital single-lens reflex (SLR) camera. A blue band-pass optical filter blocks the stimulating red light but transmits the blue photostimulated luminescence. Using a Canon 5D Mk II camera and an f/1.4 wide-angle lens, the optical image of a 240×180 mm² Konica CsBr:Eu²⁺ imaging plate from a position 230 mm in front of the camera lens can be focussed so as to laterally fill the 35×23.3 mm² camera sensor, and recorded in 2808×1872 pixel elements, corresponding to an equivalent pixel size on the plate of 88 µm. The analogue-to-digital conversion from the camera electronics is 13 bits, but the dynamic range of the imaging system as a whole is limited in practice by noise to about 2.5 orders of magnitude. The modulation transfer function falls to 0.2 at a spatial frequency of 2.2 line pairs/mm. The limiting factor of the spatial resolution is light scattering in the plate rather than the camera optics. The limiting factors for signal-to-noise ratio are shot noise in the light, and dark noise in the CMOS sensor. Good quality images of high-contrast objects can be recorded with doses of approximately 1 mGy. The CsBr:Eu²⁺ plate has approximately three times the readout sensitivity of a similar BaFBr:Eu²⁺ plate.
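The quoted geometry and dynamic range can be sanity-checked from the abstract's own numbers. Note the naive pitch works out to about 85.5 µm rather than the quoted 88 µm; the small difference presumably reflects the exact active plate and sensor dimensions the authors used:

```python
# Equivalent pixel pitch on the plate: plate width mapped onto the
# sensor's pixel count (numbers from the abstract).
plate_width_mm = 240.0
pixels_across = 2808
pitch_um = plate_width_mm / pixels_across * 1000.0   # ~85.5 µm

# Noise-limited dynamic range of ~2.5 orders of magnitude as a ratio.
orders = 2.5
dyn_range = 10 ** orders                             # ~316:1
```

The ~316:1 usable range is well below the 13-bit (8192:1) ADC range, consistent with the abstract's statement that noise, not quantization, is the practical limit.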

Winch, N. M.; Edgar, A.

2011-10-01

189

Compact camera for multispectral and conventional imaging based on patterned filters.  

PubMed

A multispectral camera concept is presented. The concept is based on using a patterned filter in the focal plane, combined with scanning of the field of view. The filter layout has stripes of different bandpass filters extending orthogonally to the scan direction. The pattern of filter stripes is such that all bands are sampled multiple times, while minimizing the total duration of the sampling of a given scene point. As a consequence, the filter needs only a small part of the area of an image sensor. The remaining area can be used for conventional 2D imaging. A demonstrator camera has been built with six bands in the visible and near infrared, as well as a panchromatic 2D imaging capability. Image recording and reconstruction are demonstrated, but the quality of image reconstruction is expected to be a main challenge for systems based on this concept. An important advantage is that the camera can potentially be made very compact and low cost. It is shown that, under assumptions that are not unreasonable, the proposed camera concept can be much smaller than a conventional imaging spectrometer. In principle, it can be smaller in volume by a factor on the order of several hundred while collecting the same amount of light per multispectral band. This makes the proposed camera concept very interesting for small airborne platforms and other applications requiring compact spectral imagers. PMID:24921891
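The scan-and-reassemble idea can be sketched with an idealized simulation: filter stripes orthogonal to the scan direction, each scene line eventually seen through each stripe. This assumes perfect registration and a unit, band-independent filter response, which is exactly the reconstruction quality challenge the abstract flags:

```python
import numpy as np

def scan_reconstruct(scene, nbands, stripe_h):
    """Idealized patterned-filter scan: the filtered sensor region has
    nbands stripes of stripe_h rows each; at every scan position, each
    sensor row samples one scene line through its stripe's band.
    Per-band images are reassembled by averaging the samples."""
    H, W = scene.shape
    sensor_h = nbands * stripe_h
    band_of_row = np.repeat(np.arange(nbands), stripe_h)   # filter layout
    acc = np.zeros((nbands, H, W))
    hits = np.zeros((nbands, H))
    for s in range(H - sensor_h + 1):         # scan positions
        for r in range(sensor_h):             # rows under the filter
            b = band_of_row[r]
            acc[b, s + r] += scene[s + r]     # band b samples this line
            hits[b, s + r] += 1
    return acc / np.maximum(hits, 1)[:, :, None]

rng = np.random.default_rng(1)
scene = rng.random((16, 8))
rec = scan_reconstruct(scene, nbands=2, stripe_h=2)
```

With a band-independent scene, interior lines reconstruct exactly; real scenes add spectral variation, motion, and registration error, which is where the reconstruction challenge lies.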

Skauli, Torbjørn; Torkildsen, Hans Erling; Nicolas, Stephane; Opsahl, Thomas; Haavardsholm, Trym; Kåsen, Ingebjørg; Rognmo, Atle

2014-05-01

190

Lobate Scarp Modeling with Lunar Reconnaissance Orbiter Camera Digital Terrain Models  

NASA Astrophysics Data System (ADS)

Lobate scarps are a type of contractional tectonic landform expressed on the Moon's surface in both highlands and maria. Typically only tens of meters in relief, these linear or curvilinear topographic rises are interpreted to be low-angle thrust fault scarps resulting from global radial contraction. Radial contraction of the Moon can be inferred from shortening across the population of lobate scarps and is estimated at ~100 m. However, the geometry and depth of the underlying faults and the mechanical properties of the near-surface lunar crustal materials are not well constrained. The Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NACs) acquire 0.5 to 2.0 m/pixel panchromatic images, and digital terrain models (DTMs) with spatial resolutions of 2 m are derived from NAC stereo pairs. Topographic data are being used to constrain models of the lobate scarp thrust faults. DTMs are analyzed for relief and morphology of the Slipher (48.3°N, 160.6°E), Racah X-1 (10°S, 178°E), and Simpelius-1 (73.5°S, 13°E) scarps. Profiles are extracted, detrended, and compared along strike. LROC Wide Angle Camera (WAC) 100 m/pixel image mosaics and topography provide regional contexts. Using elastic dislocation modeling, the fault dip angles, depths, slip, and taper are each varied until the predicted surface displacement best fits the DTM profiles for each lobate scarp. Preliminary best-fit dip angles vary from 30-40°, maximum fault depths extend to several hundred meters, and the amount of slip varies from 10 to 30 meters for the three scarps. The modeled maximum depths suggest that the thrust faults are not deeply rooted.
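The fitting procedure described (vary dip, depth, and slip until the predicted surface displacement best matches the DTM profile) has the shape of a grid search over a forward model. The sketch below uses a toy tanh-shaped scarp profile as a stand-in for the real elastic-dislocation forward model (which would use e.g. Okada-style half-space solutions); only the search structure is meant to be representative:

```python
import numpy as np

def toy_scarp(x, dip_deg, depth, slip):
    """Toy stand-in for an elastic-dislocation forward model: vertical
    relief set by slip and dip, smoothed over a width set by fault depth.
    NOT the authors' model; illustrative only."""
    relief = slip * np.sin(np.radians(dip_deg))
    return 0.5 * relief * (1.0 + np.tanh(x / depth))

def grid_search(x, observed):
    """Minimize RMS misfit between modeled and observed profiles over a
    coarse grid of dip (deg), depth (m), and slip (m)."""
    best = None
    for dip in range(20, 51, 5):
        for depth in (100.0, 200.0, 400.0, 800.0):
            for slip in (10.0, 20.0, 30.0):
                rms = np.sqrt(np.mean(
                    (toy_scarp(x, dip, depth, slip) - observed) ** 2))
                if best is None or rms < best[0]:
                    best = (rms, dip, depth, slip)
    return best

x = np.linspace(-2000.0, 2000.0, 401)        # distance along profile, m
truth = toy_scarp(x, 35, 400.0, 20.0)        # synthetic 'observed' DTM profile
rms, dip, depth, slip = grid_search(x, truth)
```

Against a synthetic profile the search recovers the generating parameters exactly; with real detrended DTM profiles the misfit surface is what constrains the admissible dip/depth/slip ranges.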

Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.; Robinson, M. S.; Tran, T.

2011-12-01

191

High-frame-rate intensified fast optically shuttered TV cameras with selected imaging applications  

SciTech Connect

This invited paper focuses on high-speed electronic/electro-optic camera development by the Applied Physics Experiments and Imaging Measurements Group (P-15) of Los Alamos National Laboratory's Physics Division over the last two decades. The evolution of TV and image-intensifier sensors and of fast-readout, fast-shuttered cameras is discussed. Their use in nuclear, military, and medical imaging applications is presented. Several salient characteristics and anomalies associated with single-pulse and high-repetition-rate performance of the cameras/sensors are included from earlier studies to emphasize their effects on the radiometric accuracy of electronic framing cameras. The Group's test and evaluation capabilities for characterization of imaging-type electro-optic sensors and sensor components, including focal plane arrays, gated image intensifiers, microchannel plates, and phosphors, are discussed. Two new unique facilities, the High Speed Solid State Imager Test Station (HSTS) and the Electron Gun Vacuum Test Chamber (EGTC), are described. A summary of the Group's current and developmental camera designs and R&D initiatives is included.

Yates, G.J.; King, N.S.P.

1994-08-01

192

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

2011-01-30

193

Design Considerations Of A Compton Camera For Low Energy Medical Imaging  

SciTech Connect

Development of a Compton camera for low energy medical imaging applications is underway. The ProSPECTus project aims to utilize position sensitive detectors to generate high quality images using electronic collimation. This method has the potential to significantly increase the imaging efficiency compared with mechanically collimated SPECT systems, a highly desirable improvement on clinical systems. Design considerations encompass the geometrical optimisation and evaluation of image quality from the system which is to be built and assessed.

Harkness, L. J.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Grint, A. N.; Judson, D. S.; Nolan, P. J.; Oxley, D. C. [Department of Physics, University of Liverpool, Oliver Lodge Laboratory, Liverpool, UK L697ZE (United Kingdom); Lazarus, I.; Simpson, J. [STFC Daresbury Laboratory, Daresbury, Warrington, WA44AD (United Kingdom)

2009-12-02

194

Image enhancement in PET scanning with the Royal Marsden Hospital MWPC camera (MUP-PET)  

NASA Astrophysics Data System (ADS)

This article presents preliminary results of applying space-variant, texture-dependent image processing algorithms to images of the distribution of positron labelled radiopharmaceuticals from the clinical prototype RMH positron camera (MUP-PET). Based on the concept of pixel value significance, an image processing algorithm has been developed which yields tomographic data in a form which should enable the clinician to distinguish real detail from artefactual and noise structures in the images.

Webb, S.; Ott, R. J.; Marsden, P. K.; Flower, M. A.

1988-06-01

195

The Nimbus image dissector camera system - An evaluation of its meteorological applications.  

NASA Technical Reports Server (NTRS)

Brief description of the electronics and operation of the Nimbus image dissector camera system (IDCS). The geometry and distortions of the IDCS are compared to the conventional AVCS camera on board the operational ITOS and ESSA satellites. The unique scanning of the IDCS provides for little distortion of the image, making it feasible to use a strip grid for the IDCS received in real time by local APT stations. The dynamic range of the camera favors the white end (high reflectance) of the gray scale. Thus, the camera is good for detecting cloud structure and ice features through brightness changes. Examples of cloud features, ice, and snow-covered land are presented. Land features, on the other hand, show little contrast. The 2600 x 2600 km coverage by the IDCS is adequate for the early detection of weather systems which may affect the local area. An example of IDCS coverage obtained by an APT station in midlatitudes is presented.

Sabatini, R. R.

1971-01-01

196

Optical characterization of UV multispectral imaging cameras for SO2 plume measurements  

NASA Astrophysics Data System (ADS)

Only a few years ago, spectral imaging cameras for SO2 plume monitoring were developed for remote sensing of volcanic plumes. We describe the development from a first camera using a single filter in the absorption band of SO2 to more advanced systems using several filters and an integrated spectrometer. The first system was based on the Hamamatsu C8484 UV camera (1344 x 1024 pixels) with high quantum efficiency in the UV region from 280 nm onward. At the heart of the second UV camera system, EnviCam, is a cooled Alta U47 camera, equipped with two on-band (310 and 315 nm) and two off-band (325 and 330 nm) filters. The third system again utilizes the uncooled Hamamatsu camera for faster sampling (~10 Hz) and a four-position filter wheel equipped with two 10 nm filters centered at 310 and 330 nm, a UV broadband view, and a blackened plate for dark-current measurement. Both cameras have been tested with lenses of different focal lengths. A co-aligned spectrometer provides a ~0.3 nm resolution spectrum within the field of view of the camera. We describe the ground-based imaging camera systems developed and utilized at our Institute. Custom-made cylindrical quartz calibration cells, 50 mm in diameter to cover the entire field of view of the camera optics, are filled with various amounts of gaseous SO2 (typically between 100 and 1500 ppm·m). They are used for calibration and characterization of the cameras in the laboratory. We report on the procedures for monitoring and analyzing SO2 path-concentrations and fluxes. This includes a comparison of the calibration in the atmosphere using the SO2 cells versus the SO2 retrieval from the integrated spectrometer. The first UV cameras have been used to monitor ship emissions (Ny-Ålesund, Svalbard and Genova, Italy).
The second generation of cameras was first tested for industrial stack monitoring during a field campaign close to the Rovinari (Romania) power plant in September 2010, revealing very high SO2 emissions (>1000 ppm·m). The second-generation cameras are now used by students from several universities in Romania. The newest system has been tested for volcanic plume monitoring at Turrialba, Costa Rica in January 2011, at Merapi volcano, Indonesia in February 2011, at Lascar volcano in Chile in July 2011, and at Etna/Stromboli (Italy) in November 2011. Retrievals from some of these campaigns will be presented.
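The on-band/off-band retrieval that the calibration cells support can be sketched as follows. The absorbance values and calibration numbers below are invented for illustration and are not measurements from these instruments:

```python
import numpy as np

def apparent_absorbance(i_on, i_off, bg_on, bg_off):
    """Two-filter apparent absorbance: SO2 absorbs in the on-band
    (~310 nm) but is nearly transparent in the off-band (~330 nm), so
    ratioing each against gas-free background sky removes broadband
    scattering effects (standard SO2-camera practice)."""
    return -np.log(i_on / bg_on) + np.log(i_off / bg_off)

# Calibration: absorbance measured through quartz cells of known column
# density (ppm·m, spanning the typical range quoted in the text); a
# linear fit gives the conversion factor. The tau values are invented.
cell_columns = np.array([0.0, 100.0, 500.0, 1000.0, 1500.0])  # ppm·m
cell_tau = np.array([0.000, 0.021, 0.105, 0.210, 0.315])      # invented
k = np.polyfit(cell_tau, cell_columns, 1)[0]   # ppm·m per unit absorbance

def so2_column(aa):
    """Convert apparent absorbance to SO2 column density (ppm·m)."""
    return k * aa
```

Multiplying retrieved columns by plume transport speed across the image then yields the flux; the co-aligned spectrometer serves as an in-field cross-check on this calibration.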

Stebel, K.; Prata, F.; Dauge, F.; Durant, A.; Amigo, A.

2012-04-01

197

Reduced isotope dose with rapid SPECT MPI imaging: Initial experience with a CZT SPECT camera  

Microsoft Academic Search

Background: Cardiac SPECT imaging needs to become shorter and use lower radiation doses to compete with other available noninvasive imaging modalities. Recently introduced cadmium zinc telluride (CZT) SPECT cameras have the potential to achieve both goals, but clinical experience is so far limited.
Methods: Images of all patients undergoing a stress MPI over a 4-month period using a CZT camera (Discovery NM

Lori B. Croft; Tapan Godiwala; Eric Ginsberg; Titus George; Milena J. Henzlova

2010-01-01

198

Reduced isotope dose and imaging time with a high-efficiency CZT SPECT camera  

Microsoft Academic Search

Background: In light of recent focus on diagnostic imaging, cardiac SPECT imaging needs to become a shorter test with lower radiation exposure to patients. Recently introduced cadmium zinc telluride (CZT) cameras have the potential to achieve both goals.
Methods: During a 2-month period, patients presenting for a Tc-99m sestamibi SPECT MPI study were imaged using a CZT camera using a low-dose rest-stress

Lori B. Croft; Eric S. Ginsberg; Andrew J. Einstein; Krista A. Guma; Titus George; Milena J. Henzlova

199

Microchannel plate pinhole camera for 20 to 100 keV x-ray imaging  

SciTech Connect

We present the design and construction of a sensitive pinhole camera for imaging suprathermal x-rays. Our device is a pinhole camera consisting of four filtered pinholes and microchannel plate electron multiplier for x-ray detection and signal amplification. We report successful imaging of 20, 45, 70, and 100 keV x-ray emissions from the fusion targets at our Novette laser facility. Such imaging reveals features of the transport of hot electrons and provides views deep inside the target.

Wang, C.L.; Leipelt, G.R.; Nilson, D.G.

1984-10-03

200

Camera simulation engine enables efficient system optimization for super-resolution imaging  

NASA Astrophysics Data System (ADS)

Quantitative fluorescent imaging requires optimization of the complete optical system, from the sample to the detector. Such considerations are especially true for precision localization microscopy such as PALM and (d)STORM where the precision of the result is limited by the noise in both the optical and detection systems. Here, we present a Camera Simulation Engine (CSE) that allows comparison of imaging results from CCD, CMOS and EM-CCD cameras under various sample conditions and can accurately validate the quality of precision localization algorithms and camera performance. To achieve these results, the CSE incorporates the following parameters: 1) Sample conditions including optical intensity, wavelength, optical signal shot noise, and optical background shot noise; 2) Camera specifications including QE, pixel size, dark current, read noise, EM-CCD excess noise; 3) Camera operating conditions such as exposure, binning and gain. A key feature of the CSE is that, from a single image (either real or simulated "ideal") we generate a stack of statistically realistic images. We have used the CSE to validate experimental data showing that certain current scientific CMOS technology outperforms EM-CCD in most super-resolution scenarios. Our results support using the CSE to efficiently and methodically select cameras for quantitative imaging applications. Furthermore, the CSE can be used to robustly compare and evaluate new algorithms for data analysis and image reconstruction. These uses of the CSE are particularly relevant to super-resolution precision localization microscopy and provide a faster, simpler and more cost effective means of system optimization, especially camera selection.
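The core CSE idea, generating a stack of statistically realistic frames from a single "ideal" image using the listed noise and camera parameters, can be sketched as below. The parameter values are illustrative defaults, not Hamamatsu's characterization of any particular sensor:

```python
import numpy as np

def simulate_frame(photons, qe=0.7, dark_e=0.05, read_noise_e=1.5,
                   gain_adu_per_e=0.5, bits=12, rng=None):
    """One statistically realistic frame from an expected-photon image:
    Poisson shot noise on photoelectrons plus dark current, Gaussian
    read noise, gain, and quantization (EM-CCD excess noise omitted)."""
    rng = rng if rng is not None else np.random.default_rng()
    electrons = rng.poisson(photons * qe + dark_e)          # shot + dark
    signal = electrons + rng.normal(0.0, read_noise_e, photons.shape)
    return np.clip(np.round(signal * gain_adu_per_e), 0, 2 ** bits - 1)

ideal = np.full((64, 64), 200.0)     # expected photons per pixel
stack = np.stack([simulate_frame(ideal, rng=np.random.default_rng(i))
                  for i in range(50)])
```

Feeding such stacks to a localization algorithm lets one compare achievable precision across camera models (CCD vs CMOS vs EM-CCD) without acquiring real data for every configuration.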

Fullerton, Stephanie; Bennett, Keith; Toda, Eiji; Takahashi, Teruo

2012-02-01

201

Simultaneous Images of the Northern and Southern Auroras with the Earth Camera on the Polar Spacecraft  

Microsoft Academic Search

During the period 22 October through late November, 2001, the Polar spacecraft was fortuitously positioned near Earth's equatorial plane such that the wide field-of-view of its far-ultraviolet camera was able to capture simultaneous images of the auroras in the Northern and Southern Hemispheres. This Earth Camera provided sufficient resolution of auroral features that the luminosities could be mapped into corrected

L. A. Frank; J. B. Sigwarth

2002-01-01

202

Automatic camera control for AUVs: a comparison of image assessment methods  

Microsoft Academic Search

The authors compare the performance of a spatial domain and a spatial frequency domain control method used to assess image content information from charge coupled device (CCD) camera imagery. Sample underwater images were processed using the developed techniques and are used to illustrate comparative levels of control for different water qualities, object contrasts, and ranges. The implementation algorithm required to

J. S. Chu; L. A. Lieberman; P. Downes

1992-01-01

203

Two detector MWPC positron camera with honeycomb lead converters for medical imaging: Performance and developments  

Microsoft Academic Search

A MWPC camera for Nuclear Medicine imaging with positron emitting isotopes has been built. The two planar detectors are converter coupled MWPC with sensitive area 48 x 48 cm² placed 50 cm apart with clear space in between for the patient. The effective sensitivity of 5.8% for each detector with a spatial resolution of 7 mm FWHM on the image

V. Perez-Mendez; C. B. Lim; D. Ortendahl; R. Hattner; L. Kaufman; D. C. Price

1978-01-01

204

A Comprehensive Market Analysis on Camera and Illumination Sensors for Image Processing and Machine Vision Applications  

Microsoft Academic Search

In the present industrial application space, conventional inspection techniques using manual intervention are getting replaced by automated inspection using image sensors. Image sensors have empowered machines with vision and that has led to increased levels of process automation which used to be manually exhaustive. This paper mainly focuses on the comparative study of Machine Vision hardware aspect specifically on cameras

Ayan Chaki; M. Prashant; P. Sen

2010-01-01

205

Periscope-camera system for visible and infrared imaging diagnostics on TFTR  

SciTech Connect

An optical diagnostic consisting of a periscope which relays images of the torus interior to an array of cameras is used on the Tokamak Fusion Test Reactor (TFTR) to view plasma discharge phenomena and inspect vacuum vessel internal structures in both visible and near-infrared wavelength regions. Three periscopes view through 20-cm-diameter fused-silica windows which are spaced around the torus midplane to provide a viewing coverage of approximately 75% of the vacuum vessel internal surface area. The periscopes have f/8 optics and motor-driven controls for focusing, magnification selection (5°, 20°, and 60° field of view), elevation and azimuth setting, mast rotation, filter selection, iris aperture, and viewing port selection. The four viewing ports on each periscope are equipped with multiple imaging devices which include: (1) an inspection eyepiece, (2) standard (RCA TC2900) and fast (RETICON) framing rate television cameras, (3) a PtSi CCD infrared imaging camera, (4) a 35 mm Nikon F3 still camera, or (5) a 16 mm Locam II movie camera with variable framing up to 500 fps. Operation of the periscope-camera system is controlled either locally or remotely through a computer-CAMAC interface. A description of the equipment and examples of its application are presented.

Medley, S.S.; Dimock, D.L.; Hayes, S.; Long, D.; Lowrence, J.L.; Mastrocola, V.; Renda, G.; Ulrickson, M.; Young, K.M.

1985-05-01

206

Smart camera design for realtime High Dynamic Range imaging  

E-print Network

time, the image loses low-light detail in exchange for improved details in areas of high illumination … successive image acquisition. Our hardware platform is built around a standard low-dynamic-range CMOS sensor … dynamic range (i.e., the ratio between the lightest and darkest pixel), resulting in low-quality images
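The general technique this excerpt alludes to, extending dynamic range by merging successive exposures from a low-dynamic-range sensor, can be sketched generically. This is a naive software merge for illustration, not the paper's real-time hardware pipeline:

```python
import numpy as np

def merge_hdr(frames, exposures, saturation=0.95):
    """Naive multi-exposure HDR merge: divide each normalized frame by
    its exposure time to estimate radiance, then average with weights
    that exclude saturated and near-black pixels, so each pixel is taken
    from exposures where it is well measured."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    num = np.zeros_like(frames[0])
    den = np.zeros_like(frames[0])
    for f, t in zip(frames, exposures):
        w = ((f > 0.05) & (f < saturation)).astype(float)  # validity mask
        num += w * f / t
        den += w
    return num / np.maximum(den, 1e-12)

# Bright pixel clips in the long exposure; dark pixel is lost in the short.
short = np.array([[0.9, 0.02]])   # exposure 1.0
long_ = np.array([[0.95, 0.2]])   # exposure 10.0
radiance = merge_hdr([short, long_], [1.0, 10.0])
```

Each pixel's radiance estimate comes from whichever exposure measured it inside the sensor's usable range, which is exactly what recovers both shadow and highlight detail.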

Paris-Sud XI, Université de

207

Retinal axial focusing and multi-layer imaging with a liquid crystal adaptive optics camera  

NASA Astrophysics Data System (ADS)

With the help of adaptive optics (AO) technology, cellular-level imaging of the living human retina can be achieved. Aiming to reduce patient discomfort and avoid potential drug-induced complications, we attempted to image the retina with a dilated pupil and frozen accommodation, without drugs. An optimized liquid crystal adaptive optics camera was adopted for retinal imaging. A novel eye stared system was used for stimulating accommodation and fixating the imaging area. The illumination sources and imaging camera moved in linkage for focusing and imaging different layers. Four subjects with diverse degrees of myopia were imaged. Based on the optical properties of the human eye, the eye stared system reduced the defocus to less than the typical ocular depth of focus. In this way, the illumination light can be projected onto a certain retinal layer precisely. Since the defocus had been compensated by the eye stared system, the adopted 512 × 512 liquid crystal spatial light modulator (LC-SLM) corrector provided the crucial spatial fidelity to fully compensate high-order aberrations. The Strehl ratio of a subject with -8 diopter myopia was improved to 0.78, close to diffraction-limited imaging. By finely adjusting the axial displacement of the illumination sources and imaging camera, cone photoreceptors, blood vessels and the nerve fiber layer were successfully imaged.

Liu, Rui-Xue; Zheng, Xian-Liang; Li, Da-Yu; Xia, Ming-Liang; Hu, Li-Fa; Cao, Zhao-Liang; Mu, Quan-Quan; Xuan, Li

2014-09-01

208

A 5-18 micron array camera for high-background astronomical imaging  

NASA Technical Reports Server (NTRS)

A new infrared array camera system using a Hughes/SBRC 58 x 62 pixel hybrid Si:Ga array detector has been successfully applied to high-background 5-18-micron astronomical imaging observations. The off-axis reflective optical system minimizes thermal background loading and produces diffraction-limited images with negligible spatial distortion. The noise equivalent flux density (NEFD) of the camera at 10 microns on the 3.0-m NASA/Infrared Telescope Facility with broadband interference filters and 0.26 arcsec pixels is NEFD = 0.01 Jy min^(-1/2) per pixel (1σ), and it operates at a frame rate of 30 Hz with no compromise in observational efficiency. The electronic and optical design of the camera, its photometric characteristics, examples of observational results, and techniques for successful array imaging in a high-background astronomical application are discussed.

Gezari, Daniel Y.; Folz, Walter C.; Woods, Lawrence A.; Varosi, Frank

1992-01-01

209

A novel IR polarization imaging system designed by a four-camera array  

NASA Astrophysics Data System (ADS)

A novel IR polarization staring imaging system employing a four-camera array is designed for target detection and recognition, especially of man-made targets hidden in a complex battlefield. The design is based on the difference in infrared radiation's polarization characteristics, which is particularly remarkable between artificial objects and the natural environment. The system employs four cameras simultaneously to capture the polarization difference, replacing the commonly used systems that engage only one camera. Since both types of systems have to obtain intensity images in four different directions (I₀, I₄₅, I₉₀, I₋₄₅), the four-camera design allows better real-time capability and lower error without the mechanical rotating parts that are essential to one-camera systems. Information extraction and detailed analysis demonstrate that the captured polarization images include valuable polarization information which can effectively increase the images' contrast and make it easier to segment the target, even a hidden target, from various scenes.
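The standard way the four intensity channels are combined is via the linear Stokes parameters, from which the degree and angle of linear polarization follow. A minimal sketch (treating the system's I₋₄₅ channel as I₁₃₅, which measures the same linear component):

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four polarization-filtered
    intensity images; DoLP is the quantity typically used to raise
    man-made-target contrast against natural backgrounds."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)        # total intensity
    s1 = i0 - i90                             # 0°/90° component
    s2 = i45 - i135                           # ±45° component
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)           # polarization angle, rad
    return s0, s1, s2, dolp, aolp

# Fully horizontally polarized light: all signal in the 0° channel.
s0, s1, s2, dolp, aolp = linear_stokes(np.array([1.0]), np.array([0.5]),
                                       np.array([0.0]), np.array([0.5]))
```

Because the four cameras expose simultaneously, the four channels are acquired in one shot, which is the real-time advantage over rotating a single polarizer through the four angles.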

Liu, Fei; Shao, Xiaopeng; Han, Pingli

2014-05-01

210

Achieving thermography with a thermal security camera using uncooled amorphous silicon microbolometer image sensors  

NASA Astrophysics Data System (ADS)

Advancements in uncooled microbolometer technology over the last several years have opened up many commercial applications which had previously been cost prohibitive. Thermal technology is no longer limited to the military and government market segments. One type of thermal sensor with low NETD which is available in the commercial market segment is the uncooled amorphous silicon (α-Si) microbolometer image sensor. Typical thermal security cameras focus on providing the best image quality by auto tone-mapping (contrast enhancing) the image, which provides the best contrast depending on the temperature range of the scene. While this may provide enough information to detect objects and activities, there are further benefits to being able to estimate the actual object temperatures in a scene. This thermographic ability can provide functionality beyond typical security cameras by making it possible to monitor processes. Example applications of thermography[2] with a thermal camera include: monitoring electrical circuits, industrial machinery, building thermal leaks, oil/gas pipelines, power substations, etc.[3][5] This paper discusses the methodology of estimating object temperatures by characterizing/calibrating different components inside a thermal camera utilizing an uncooled amorphous silicon microbolometer image sensor. Plots of system performance across camera operating temperatures will be shown.
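The core of turning a tone-mapped security camera into a thermographic one is a radiometric calibration from raw counts to temperature. A minimal two-point sketch, with purely hypothetical calibration numbers; production cameras layer per-pixel gain/offset tables and FPA-temperature compensation on top of this:

```python
def two_point_calibration(counts_lo, t_lo, counts_hi, t_hi):
    """Linear map from raw microbolometer counts to scene temperature,
    anchored at two blackbody reference measurements (the simplest
    possible radiometric calibration)."""
    gain = (t_hi - t_lo) / (counts_hi - counts_lo)
    offset = t_lo - gain * counts_lo
    return lambda counts: gain * counts + offset

# Hypothetical reference points: 2000 counts at 20 °C, 6000 at 100 °C.
to_temp = two_point_calibration(2000.0, 20.0, 6000.0, 100.0)
```

For example, under these made-up anchors a reading of 4000 counts maps to 60 °C; the paper's characterization across camera operating temperatures addresses the drift of exactly these gain/offset terms.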

Wang, Yu-Wei; Tesdahl, Curtis; Owens, Jim; Dorn, David

2012-06-01

211

Dynamic imaging with high resolution time-of-flight pet camera - TOFPET I  

SciTech Connect

One of the major design goals of the TOFPET I positron camera was to produce a high resolution whole body positron camera capable of dynamically imaging an organ such as the heart. TOFPET I is now nearing completion and preliminary images have been obtained to assess its dynamic and three dimensional imaging capabilities. Multiple gated images of the uptake of Rubidium in the dog heart and three dimensional surface displays of the distribution of the Rubidium-82 in the myocardium have been generated to demonstrate the three dimensional imaging properties. Fast dynamic images of the first pass of a bolus of radio-tracer through the heart have been collected with 4 second integration time and 50% gating (2 second equivalent integration time) with 18 mCi of Rb-82.

Mullani, N.A.; Bristow, D.; Gaeta, J.; Gould, K.L.; Hartz, R.K.; Philipe, E.A.; Wong, W.H.; Yerian, K.

1984-02-01

212

High-resolution imaging of biological and other objects with an X-ray digital camera  

Microsoft Academic Search

A high-resolution CCD X-ray camera based on YAG:Ce or LuAG:Ce thin scintillators is presented. The high resolution in low-energy X-ray radiation is quantified with several test objects. The achieved spatial resolution of the images is <1 µm. The objects used for imaging are grids and small animals with parts of several microns in extent. The high-resolution imaging system can be

J. Touš; K. Blažek; L. Pína; B. Sopko

2010-01-01

213

The Lunar Student Imaging Project (LSIP): Bringing the Excitement of Lunar Exploration to Students Using LRO Mission Data  

NASA Astrophysics Data System (ADS)

In June 2009, NASA launched the Lunar Reconnaissance Orbiter (LRO) - the first mission in NASA's Vision for Space Exploration, a plan to return to the Moon and then to travel to Mars and beyond. LRO is equipped with seven instruments including the Lunar Reconnaissance Orbiter Camera (LROC), a system of two narrow-angle cameras and one wide-angle camera, controlled by scientists in the School of Earth and Space Exploration at Arizona State University. The orbiter will have a one-year primary mission in a 50 km polar orbit. The measurements from LROC will uncover much-needed information about potential landing sites and will help generate a meter-scale map of the lunar surface. With support from NASA Goddard Space Flight Center, the LROC Science Operations Center and the ASU Mars Education Program have partnered to develop an inquiry-based student program, the Lunar Student Imaging Project (LSIP). Based on the nationally recognized Mars Student Imaging Project (MSIP), LSIP uses cutting-edge NASA content and remote sensing data to involve students in authentic lunar exploration. This program offers students (grades 5-14) immersive experiences where they can: 1) target images of the lunar surface, 2) interact with NASA planetary scientists, mission engineers and educators, and 3) gain access to NASA curricula and materials developed to enhance STEM learning. Using a project-based learning model, students drive their own research and learn firsthand what it’s like to do real planetary science. The LSIP curriculum contains a resource manual and program guide (including lunar feature identification charts, classroom posters, and a lunar exploration timeline) and a series of activities covering image analysis, relative age dating, and planetary comparisons. LSIP will be based upon the well-tested MSIP model, and will encompass onsite as well as distance learning components.

Taylor, W. L.; Roberts, D.; Burnham, R.; Robinson, M. S.

2009-12-01

214

A mobile phone-based retinal camera for portable wide field imaging.  

PubMed

Digital fundus imaging is used extensively in the diagnosis, monitoring and management of many retinal diseases. Access to fundus photography is often limited by patient morbidity, high equipment cost and shortage of trained personnel. Advancements in telemedicine methods and the development of portable fundus cameras have increased the accessibility of retinal imaging, but most of these approaches rely on separate computers for viewing and transmission of fundus images. We describe a novel portable handheld smartphone-based retinal camera capable of capturing high-quality, wide field fundus images. The use of the mobile phone platform creates a fully embedded system capable of acquisition, storage and analysis of fundus images that can be directly transmitted from the phone via the wireless telecommunication system for remote evaluation. PMID:24344230

Maamari, Robi N; Keenan, Jeremy D; Fletcher, Daniel A; Margolis, Todd P

2014-04-01

215

UNIFYING IMAGE PLANE LIFTINGS FOR CENTRAL CATADIOPTRIC AND DIOPTRIC CAMERAS  

E-print Network

projection of a line is a conic section. A unifying theory for central catadioptric systems has been proposed … the division model, the image of a line is in general a conic curve. … For both catadioptric and radially

Barreto, Joao

216

Chandra High Resolution Camera Imaging of GRS 1758-258  

E-print Network

We observed the "micro-quasar" GRS 1758-258 four times with Chandra. Two HRC-I observations were made in 2000 September-October spanning an intermediate-to-hard spectral transition (identified with RXTE). Another HRC-I and an ACIS/HETG observation were made in 2001 March following a hard-to-soft transition to a very low flux state. Based on the three HRC images and the HETG zero order image, the accurate position (J2000) of the X-ray source is RA = 18h 01m 12.39s, Dec = -25d 44m 36.1s (90% confidence radius = 0".45), consistent with the purported variable radio counterpart. All three HRC images are consistent with GRS 1758-258 being a point source, indicating that any bright jet is less than ~1 light-month in projected length, assuming a distance of 8.5 kpc.

W. A. Heindl; D. M. Smith

2002-08-19

217

Unifying Image Plane Liftings for Central Catadioptric and Dioptric Cameras  

Microsoft Academic Search

In this paper, we study projection systems with a single viewpoint, including combinations of mirrors and lenses (catadioptric) as well as just lenses with or without radial distortion (dioptric systems). Firstly, we extend a well- known unifying model for catadioptric systems to incorporate a class of dioptric systems with radial distortion. Secondly, we provide a new representation for the image

Joao P. Barreto; Kostas Daniilidis

218

Camera Identification from Cropped and Scaled Images Miroslav Goljan*  

E-print Network

…advantages of using this fingerprint for forensic purposes: a) Stability: the fingerprint is stable in time… when the image under investigation has been simultaneously cropped and scaled. Keywords: sensor fingerprint identification, Photo-Response Non-Uniformity, digital forensics.

Fridrich, Jessica

219

Artificially reproduced image of earth photographed by UV camera  

NASA Technical Reports Server (NTRS)

A reproduction of a color enhancement of a picture photographed in far-ultraviolet light by Astronaut John W. Young, Apollo 16 commander, showing the Earth. Note this is an artificially reproduced image. The three auroral belts, the sunlit atmosphere and background stars are visible.

1972-01-01

220

Innovative Camera and Image Processing System to Characterize Cryospheric Changes  

Microsoft Academic Search

The polar regions play an important role in Earth's climatic and geodynamic systems. Digital photogrammetric mapping provides a means for monitoring the dramatic changes observed in the polar regions during the past decades. High-resolution, photogrammetrically processed digital aerial imagery provides complementary information to surface measurements obtained by laser altimetry systems. While laser points accurately sample the ice surface, stereo images

A. Schenk; B. M. Csatho; S. Nagarajan

2010-01-01

221

X-ray framing cameras for > 5 keV imaging  

SciTech Connect

Recent and proposed improvements in spatial resolution, temporal resolution, contrast, and detection efficiency for x-ray framing cameras are discussed in light of present and future laser-plasma diagnostic needs. In particular, an improvement in image contrast above hard x-ray background levels is demonstrated by using high-aspect-ratio tapered pinholes.

Landen, O.L.; Bell, P.M.; Costa, R.; Kalantar, D.H.; Bradley, D.K.

1995-07-20

222

Image-based visual/force/position control under parametric uncertainties of camera and robot arm  

Microsoft Academic Search

The synthesis of control laws for constrained dynamic robot manipulators driven by image-based camera information is the subject of this paper. The task is to drive the end effector over a constrained surface while simultaneously controlling the force applied along its normal. Such tasks can be found in a variety of applications on the industrial floor, wherein it is

E. C. Dean-Leon; V. Parra-Vega; A. Espinosa-Romero

2004-01-01

223

Positron emission imaging using acquired cone-surfaces from opposing compton cameras  

Microsoft Academic Search

The Compton camera, a method of electronic collimation, introduces angular resolution into a gamma-ray imaging system without the need of a collimating device. Compton kinematics are used to deduce the incident trajectory of the photon, resulting in a cone-surface of possible source locations. So far this type of system has been successfully employed only as a single photon detection device.

John E. Gillam; Toby E. Beveridge; Rob A. Lewis

2004-01-01

224

Interactive alignment and image reconstruction for wafer-level multi-aperture camera systems  

NASA Astrophysics Data System (ADS)

Assembly of miniaturized high-resolution cameras is typically carried out by active alignment. The sensor image is constantly monitored while the lens stack is adjusted. When sharpness is acceptable in all regions of the image, the lens position over the sensor is fixed. For multi-aperture cameras, this approach is not sufficient. During prototyping, it is beneficial to see the complete reconstructed image, assembled from all optical channels. However, typical reconstruction algorithms are high-quality offline methods that require calibration. As the geometric setup of the camera repeatedly changes during assembly, this would require frequent re-calibration. We present a real-time algorithm for an interactive preview of the reconstructed image during camera alignment. With this algorithm, systematic alignment errors can be tracked and corrected during assembly. Known imperfections of optical components can also be included in the reconstruction. Finally, the algorithm easily maps to very simple GPU operations, making it ideal for applications in mobile devices where power consumption is critical.

Oberdörster, Alexander; Brückner, Andreas; Lensch, Hendrik P. A.

2014-09-01

225

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide field and planetary camera 2/Hubble Space Telescope (WFPC2/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

226

Fire service and first responder thermal imaging camera (TIC) advances and standards  

Microsoft Academic Search

Fire Service and First Responder Thermal Imaging Camera (TIC) applications are growing, saving lives and preventing injury and property damage. Firefighters face a wide range of serious hazards. TICs help mitigate the risks by protecting Firefighters and preventing injury, while reducing time spent fighting the fire and resources needed to do so. Most fire safety equipment is covered by performance

Lawrence S. Konsin; Stuart Nixdorff

2007-01-01

227

Research on the effect of the differential-images technique on the resolution of an infrared spatial camera  

NASA Astrophysics Data System (ADS)

The optical system of an infrared spatial camera uses a large relative aperture and a large pixel size on the focal plane element. This makes the system bulky and limits its resolution, so the potential of the optics cannot be fully exploited. A method for improving the resolution of an infrared spatial camera based on multi-frame difference images is therefore introduced. The method uses multiple detectors to acquire several difference images, and then reconstructs a high-resolution image from them through the relationship between pixel grey values. The difference-images technique using more than two detectors is investigated; in theory it can improve the resolution by a factor of 2.5. The relationship between the pixel grey values of the low-resolution difference images and the high-resolution image is found by analyzing the energy of CCD sampling, and a general relationship is given between the achievable resolution gain of the differential method and the minimum number of CCDs required. Based on this theory, the use of the difference-images technique to improve image resolution was simulated in Matlab with a portrait image as the object, and the software outputs the result as an image. The results prove that the technique is effective for high-resolution image reconstruction. By applying the difference-image technique, the resolution of an infrared spatial camera can be improved markedly while keeping the size of the optical structure, or when using a large-pixel detector, so the technique is of high value in optical remote sensing.

Jin, Guang; An, Yuan; Qi, Yingchun; Hu, Fusheng

2007-12-01
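The multi-detector reconstruction described in this record can be illustrated in the ideal, noise-free case: each detector samples the scene at a known sub-pixel offset, and the high-resolution grid is rebuilt by interleaving the samples. This is only a sketch of the general idea, assuming integer half-pixel shifts and a 2× upscaling factor (the paper's grey-value relationship covers the more general case):

```python
import numpy as np

def simulate_lr_frames(hr, shifts):
    """Sample the high-res image at the given (dy, dx) sub-pixel
    offsets, decimating by 2 in each axis (one frame per detector)."""
    return [hr[dy::2, dx::2] for dy, dx in shifts]

def reconstruct_hr(frames, shifts):
    """Interleave the shifted low-res frames back onto the
    high-res grid (the ideal, noise-free case)."""
    h, w = frames[0].shape
    hr = np.zeros((2 * h, 2 * w), dtype=frames[0].dtype)
    for (dy, dx), f in zip(shifts, frames):
        hr[dy::2, dx::2] = f
    return hr

# Four detectors at the four half-pixel offsets give exact recovery.
shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
hr = np.arange(64, dtype=float).reshape(8, 8)
frames = simulate_lr_frames(hr, shifts)
assert np.array_equal(reconstruct_hr(frames, shifts), hr)
```

With fewer detectors than offsets, or with non-integer shifts, the missing grid positions must instead be estimated from the grey-value relationship, which is where the paper's 2.5× theoretical limit comes from.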

228

Removing cosmic-ray hits from multiorbit HST Wide Field Camera images  

Microsoft Academic Search

We present an optimized algorithm that removes cosmic rays ('CRs') from multiorbit Hubble Space Telescope (HST) Wide Field/Planetary Camera ('WF/PC') images. It computes the image noise in every iteration from the WF/PC CCD equation. This includes all known sources of random and systematic calibration errors. We test this algorithm on WF/PC stacks of 2-12 orbits as a function of the

Rogier A. Windhorst; Barbara E. Franklin; Lyman W. Neuschaefer

1994-01-01

229

Hologram synthesis of three-dimensional real objects using portable integral imaging camera.  

PubMed

We propose a portable hologram capture system based on integral imaging. An integral imaging camera with an integrated micro lens array captures spatio-angular light ray distribution of the three-dimensional scene under incoherent illumination. The captured light ray distribution is then processed to synthesize corresponding hologram. Experimental results show that the synthesized hologram is optically reconstructed successfully, demonstrating accommodation and motion parallax of the reconstructed three-dimensional scene. PMID:24104278

Lee, Sung-Keun; Hong, Sung-In; Kim, Yong-Soo; Lim, Hong-Gi; Jo, Na-Young; Park, Jae-Hyeung

2013-10-01

230

Quantitative Evaluation of Surface Color of Tomato Fruits Cultivated in Remote Farm Using Digital Camera Images  

NASA Astrophysics Data System (ADS)

To measure quantitative surface color information of agricultural products, together with the ambient information during cultivation, a color calibration method for digital camera images and a Web-based remote color-imaging monitoring system were developed. Single-lens reflex and web digital cameras were used for image acquisition. Tomato images were taken by digital camera throughout the post-ripening process, both in the standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with the standard RGB color chart set up just behind the tomato fruit on a black matte, and a color calibration was carried out. The influence of sunlight could be experimentally eliminated, and the calibrated color information consistently agreed with the standard values acquired in the system throughout the post-ripening process. Furthermore, the surface color change of tomatoes on the tree in a greenhouse was remotely monitored during maturation using digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time behavior of the tomato surface color change during the maturing process could be measured using a color parameter calculated from the acquired and calibrated color images, along with the ambient atmospheric record. This study is an important step toward simple and rapid surface color analysis for evaluating crop vigor in the field, and toward constructing an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.

Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu
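The chart-based calibration in this record can be sketched as a linear color correction fitted by least squares: the chart patches are captured by the camera, and a 3×3 matrix mapping captured RGB to reference RGB is estimated. The patch values and the distortion matrix below are hypothetical; the paper's actual calibration model is not specified in the abstract:

```python
import numpy as np

# Reference chart colors (rows: patches) and the same patches as
# "captured" under field illumination (simulated with a known matrix).
reference = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0],
                      [0.5, 0.5, 0.5]])
M_true = np.array([[0.90, 0.10, 0.00],
                   [0.05, 0.80, 0.10],
                   [0.00, 0.10, 0.85]])
captured = reference @ M_true.T

# Fit the 3x3 correction matrix C so that captured @ C ~= reference.
C, *_ = np.linalg.lstsq(captured, reference, rcond=None)
corrected = captured @ C
assert np.allclose(corrected, reference, atol=1e-8)
```

Once C is fitted from the chart, it is applied to every pixel of the field image, which is how the influence of varying sunlight can be removed.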

231

The iQID camera: An ionizing-radiation quantum imaging detector  

NASA Astrophysics Data System (ADS)

We have developed and tested a novel, ionizing-radiation Quantum Imaging Detector (iQID). This scintillation-based detector was originally developed as a high-resolution gamma-ray imager, called BazookaSPECT, for use in single-photon emission computed tomography (SPECT). Recently, we have investigated the detector's response and imaging potential with other forms of ionizing radiation including alpha, neutron, beta, and fission fragment particles. The confirmed response to this broad range of ionizing radiation has prompted its new title. The iQID camera operates by coupling a scintillator to an image intensifier. The scintillation light generated by particle interactions is optically amplified by the intensifier and then re-imaged onto a CCD/CMOS camera sensor. The intensifier provides sufficient optical gain that practically any CCD/CMOS camera can be used to image ionizing radiation. The spatial location and energy of individual particles are estimated on an event-by-event basis in real time using image analysis algorithms on high-performance graphics processing hardware. Distinguishing features of the iQID camera include portability, large active areas, excellent detection efficiency for charged particles, and high spatial resolution (tens of microns). Although modest, the energy resolution of iQID is sufficient to discriminate between particles. Additionally, spatial features of individual events can be used for particle discrimination. An important iQID imaging application that has recently been developed is real-time, single-particle digital autoradiography. We present the latest results and discuss potential applications.

Miller, Brian W.; Gregory, Stephanie J.; Fuller, Erin S.; Barrett, Harrison H.; Bradford Barber, H.; Furenlid, Lars R.

2014-12-01
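The event-by-event position and energy estimation described in this record can be sketched with a simple intensity-weighted centroid over a thresholded frame. This is a minimal illustration on a synthetic flash; the actual iQID processing pipeline (GPU-based, multi-event) is more elaborate:

```python
import numpy as np

def locate_event(frame, threshold):
    """Estimate the position (intensity-weighted centroid) and the
    energy (summed signal) of a single scintillation flash."""
    mask = frame > threshold
    ys, xs = np.nonzero(mask)
    w = frame[mask]
    energy = w.sum()
    return (ys @ w / energy, xs @ w / energy), energy

frame = np.zeros((16, 16))
frame[5:8, 9:12] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]  # synthetic flash
(cy, cx), energy = locate_event(frame, 0.5)
assert (cy, cx) == (6.0, 10.0)   # centroid at the flash center
assert energy == 20.0            # summed signal above threshold
```

The per-event energy sum is what gives the modest energy resolution mentioned above, and the shape of the thresholded cluster is what enables particle-type discrimination.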

232

Investigation of noise equivalent count rate in positron imaging using a dual head gamma camera  

Microsoft Academic Search

In positron imaging, Noise Equivalent Count (NEC) rate curves provide a good link between image quality and scatter, random and true coincidence rates. The NEC curve has been measured on a SophyCamera DST (SMV) with a 20 cm cylinder phantom filled with 18F (59 MBq). On that system, the field of view is determined by two rectangular parallel detectors (400*300

D. Brassel; M. Tararine; O. Lamer; B. Bendriem

1997-01-01

233

The trustworthy digital camera: Restoring credibility to the photographic image  

NASA Technical Reports Server (NTRS)

The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continues to increase, the well-known adage that 'the photography doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.

Friedman, Gary L.

1994-01-01
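The digital-signature idea in this record can be illustrated in miniature: the camera computes a cryptographic tag over the raw image bytes, and any later modification invalidates the tag. The sketch below uses a symmetric HMAC purely as a stand-in; the paper's scheme relies on asymmetric signatures, where only the camera holds the signing key, and the key and image bytes here are illustrative:

```python
import hashlib
import hmac

CAMERA_KEY = b"secret-key-burned-into-the-camera"  # hypothetical

def sign_image(image_bytes: bytes) -> bytes:
    """Tag the raw image bytes at capture time."""
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).digest()

def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Check that the image has not been altered since capture."""
    expected = hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

original = b"...raw sensor data..."
sig = sign_image(original)
assert verify_image(original, sig)                    # authentic
assert not verify_image(original + b"tamper", sig)    # altered
```

With an asymmetric scheme, anyone holding the camera's public key can run the verification without being able to forge a signature, which is what makes the "electronic original" enforceable.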

234

Preliminary experience with small animal SPECT imaging on clinical gamma cameras.  

PubMed

The traditional lack of techniques suitable for in vivo imaging has induced great interest in molecular imaging for preclinical research. Nevertheless, its use spreads slowly due to the difficulties in justifying the high cost of the current dedicated preclinical scanners. An alternative for lowering the costs is to repurpose old clinical gamma cameras for preclinical imaging. In this paper we assess the performance of a portable device that works coupled to a single-head clinical gamma camera, and we present our preliminary experience in several small animal applications. Our findings, based on phantom experiments and animal studies, provided an image quality, in terms of contrast-noise trade-off, comparable to dedicated preclinical pinhole-based scanners. We feel that our portable device offers an opportunity for recycling the widespread availability of clinical gamma cameras in nuclear medicine departments for small animal SPECT imaging, and we hope that it can contribute to spreading the use of preclinical imaging within institutions on tight budgets. PMID:24963478

Aguiar, P; Silva-Rodríguez, J; Herranz, M; Ruibal, A

2014-01-01

235

Real-time full-field photoacoustic imaging using an ultrasonic camera  

NASA Astrophysics Data System (ADS)

A photoacoustic imaging system that incorporates a commercial ultrasonic camera for real-time imaging of two-dimensional (2-D) projection planes in tissue at video rate (30 Hz) is presented. The system uses a Q-switched frequency-doubled Nd:YAG pulsed laser for photoacoustic generation. The ultrasonic camera consists of a 2-D 12×12 mm CCD chip with 120×120 piezoelectric sensing elements used for detecting the photoacoustic pressure distribution radiated from the target. An ultrasonic lens system is placed in front of the chip to collect the incoming photoacoustic waves, providing the ability for focusing and imaging at different depths. Compared with other existing photoacoustic imaging techniques, the camera-based system is attractive because it is relatively inexpensive and compact, and it can be tailored for real-time clinical imaging applications. Experimental results detailing the real-time photoacoustic imaging of rubber strings and buried absorbing targets in chicken breast tissue are presented, and the spatial resolution of the system is quantified.

Balogun, Oluwaseyi; Regez, Brad; Zhang, Hao F.; Krishnaswamy, Sridhar

2010-03-01

236

High-speed camera with real time processing for frequency domain imaging  

PubMed Central

We describe a high-speed camera system for frequency domain imaging suitable for applications such as in vivo diffuse optical imaging and fluorescence lifetime imaging. 14-bit images are acquired at 2 gigapixels per second and analyzed with real-time pipeline processing using field programmable gate arrays (FPGAs). Performance of the camera system has been tested both for RF-modulated laser imaging in combination with a gain-modulated image intensifier and a simpler system based upon an LED light source. System amplitude and phase noise are measured and compared against theoretical expressions in the shot noise limit presented for different frequency domain configurations. We show the camera itself is capable of shot noise limited performance for amplitude and phase in as little as 3 ms, and when used in combination with the intensifier the noise levels are nearly shot noise limited. The best phase noise in a single pixel is 0.04 degrees for a 1 s integration time. PMID:21750770

Shia, Victor; Watt, David; Faris, Gregory W.

2011-01-01
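The amplitude and phase extraction at the heart of frequency-domain imaging can be sketched as per-pixel IQ demodulation: each pixel's time series is projected onto a complex reference at the modulation frequency. This is a generic sketch, not the FPGA pipeline of the paper; the frame rate, modulation frequency, and image size below are illustrative:

```python
import numpy as np

def demodulate(frames, f_mod, fs):
    """Per-pixel lock-in demodulation: project a stack of frames
    (shape [n, h, w]) onto a complex reference at f_mod, returning
    per-pixel amplitude and phase."""
    n = frames.shape[0]
    t = np.arange(n) / fs
    ref = np.exp(-2j * np.pi * f_mod * t)
    iq = np.tensordot(ref, frames, axes=(0, 0)) * (2.0 / n)
    return np.abs(iq), np.angle(iq)

fs, f_mod, n = 1000.0, 100.0, 1000    # integer cycles per record
t = np.arange(n) / fs
truth_amp, truth_phase = 0.5, 0.3
signal = 1.0 + truth_amp * np.cos(2 * np.pi * f_mod * t + truth_phase)
frames = np.repeat(signal[:, None, None], 4, axis=1).repeat(4, axis=2)

amp, phase = demodulate(frames, f_mod, fs)
assert np.allclose(amp, truth_amp) and np.allclose(phase, truth_phase)
```

Averaging over an integer number of modulation cycles rejects the DC offset exactly, which is why the amplitude and phase noise of such systems can approach the shot-noise limit discussed in the abstract.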

237

Development of a gas leak detection method based on infrared spectrum imaging utilizing microbolometer camera  

NASA Astrophysics Data System (ADS)

Development of an early gas leak detection system is essential for the safety of energy storage tank fields and chemical plants. Contact-type conventional gas sensors are not suitable for remote surveillance of gas leakage over a wide area. Infrared cameras have been utilized for gas leak detection; however, they are limited to detecting particular gases. In this study a gas leak identification system, which enables us to detect gas leakage and to identify gas type and density, is developed based on an infrared spectrum imaging system utilizing a low-cost and compact microbolometer infrared camera. Feasibility of the proposed system was demonstrated by experimental results on identification of hydrofluorocarbon gas.

Sakagami, Takahide; Anzai, Hiroaki; Kubo, Shiro

2011-05-01

238

Wide Field Camera 3: A Powerful New Imager for the Hubble Space Telescope  

NASA Technical Reports Server (NTRS)

Wide Field Camera 3 (WFC3) is a powerful UV/visible/near-infrared camera in development for installation into the Hubble Space Telescope during upcoming Servicing Mission 4. WFC3 provides two imaging channels. The UVIS channel incorporates a 4096 x 4096 pixel CCD focal plane with sensitivity from 200 to 1000 nm. The IR channel features a 1024 x 1024 pixel HgCdTe focal plane covering 850 to 1700 nm. We report here on the design of the instrument, the performance of its flight detectors, results of the ground test and calibration program, and the plans for the Servicing Mission installation and checkout.

Kimble, Randy

2008-01-01

239

Multi-camera: interactive rendering of abstract digital images  

E-print Network

Figure captions: (11) Multiple-view images of a cube composed from (A) a grid of individual views, (B) individual views with random values added, and (C) individual views with random and overlap values added; (12) two-dimensional view… …simultaneous views allowed a reality of the object to be shown that was more clear and complete than by means of a single perspective. Time and memory were now added to the overall experience of a subject through painting. Later, non-painted objects were added…

Smith, Jeffrey Statler

2004-09-30

240

MONICA: a compact, portable dual gamma camera system for mouse whole-body imaging  

SciTech Connect

Introduction: We describe a compact, portable dual gamma camera system (named "MONICA" for MObile Nuclear Imaging CAmeras) for visualizing and analyzing the whole-body biodistribution of putative diagnostic and therapeutic single photon emitting radiotracers in animals the size of mice. Methods: Two identical, miniature pixelated NaI(Tl) gamma cameras were fabricated and installed "looking up" through the tabletop of a compact portable cart. Mice are placed directly on the tabletop for imaging. Camera imaging performance was evaluated with phantoms, and field performance was evaluated in a weeklong In-111 imaging study performed in a mouse tumor xenograft model. Results: Tc-99m performance measurements, using a photopeak energy window of 140 keV ± 10%, yielded the following results: spatial resolution (FWHM, full width at half maximum, at 1 cm), 2.2 mm; sensitivity, 149 cps (counts per second)/MBq (5.5 cps/μCi); energy resolution (FWHM), 10.8%; count rate linearity (count rate vs. activity), r² = 0.99 for 0-185 MBq (0-5 mCi) in the field of view (FOV); spatial uniformity, <3% count rate variation across the FOV. Tumor and whole-body distributions of the In-111 agent were well visualized in all animals in 5-min images acquired throughout the 168-h study period. Conclusion: Performance measurements indicate that MONICA is well suited to whole-body single photon mouse imaging. The field study suggests that inter-device communications and user-oriented interfaces included in the MONICA design facilitate use of the system in practice. We believe that MONICA may be particularly useful early in the (cancer) drug development cycle, where basic whole-body biodistribution data can direct future development of the agent under study and where logistical factors, e.g., limited imaging space, portability, and, potentially, cost are important.

Choyke, Peter L.; Xia, Wenze; Seidel, Jurgen; Kakareka, John W.; Pohida, Thomas J.; Milenic, Diane E.; Proffitt, James; Majewski, Stan; Weisenberger, Andrew G.; Green, Michael V.

2010-04-01

241

Engineering performance of IRIS2 infrared imaging camera and spectrograph  

NASA Astrophysics Data System (ADS)

IRIS2, the infrared imager and spectrograph for the Cassegrain focus of the Anglo Australian Telescope, has been in service since October 2001. IRIS2 incorporated many novel features, including multiple cryogenic multislit masks, a dual chambered vacuum vessel (the smaller chamber used to reduce thermal cycle time required to change sets of multislit masks), encoded cryogenic wheel drives with controlled backlash, a deflection compensating structure, and use of teflon impregnated hard anodizing for gear lubrication at low temperatures. Other noteworthy features were: swaged foil thermal link terminations, the pupil imager, the detector focus mechanism, phased getter cycling to prevent detector contamination, and a flow-through LN2 precooling system. The instrument control electronics was designed to allow accurate positioning of the internal mechanisms with minimal generation of heat. The detector controller was based on the AAO2 CCD controller, adapted for use on the HAWAII1 detector (1024 x 1024 pixels) and is achieving low noise and high performance. We describe features of the instrument design, the problems encountered and the development work required to bring them into operation, and their performance in service.

Churilov, Vladimir; Dawson, John; Smith, Greg A.; Waller, Lew; Whittard, John D.; Haynes, Roger; Lankshear, Allan; Ryder, Stuart D.; Tinney, Chris G.

2004-09-01

242

A portable device for small animal SPECT imaging in clinical gamma-cameras  

NASA Astrophysics Data System (ADS)

Molecular imaging is reshaping clinical practice in the last decades, providing practitioners with non-invasive ways to obtain functional in-vivo information on a diversity of relevant biological processes. The use of molecular imaging techniques in preclinical research is equally beneficial, but spreads more slowly due to the difficulties to justify a costly investment dedicated only to animal scanning. An alternative for lowering the costs is to repurpose parts of old clinical scanners to build new preclinical ones. Following this trend, we have designed, built, and characterized the performance of a portable system that can be attached to a clinical gamma-camera to make a preclinical single photon emission computed tomography scanner. Our system offers an image quality comparable to commercial systems at a fraction of their cost, and can be used with any existing gamma-camera with just an adaptation of the reconstruction software.

Aguiar, P.; Silva-Rodríguez, J.; González-Castaño, D. M.; Pino, F.; Sánchez, M.; Herranz, M.; Iglesias, A.; Lois, C.; Ruibal, A.

2014-07-01

243

Validation of spectral sky radiance derived from all-sky camera images - a case study  

NASA Astrophysics Data System (ADS)

Spectral sky radiance (380-760 nm) is derived from measurements with a Hemispherical Sky Imager (HSI) system. The HSI consists of a commercial compact CCD (charge coupled device) camera equipped with a fish-eye lens and provides hemispherical sky images in three reference bands, namely red, green, and blue. To obtain the spectral sky radiance from these images, non-linear regression functions for various sky conditions have been derived. The camera-based spectral sky radiance was validated against spectral sky radiance measured with a CCD spectroradiometer. The spectral sky radiance over the complete hemisphere deviates between the two instruments by less than 20% at 500 nm for all sky conditions and for zenith angles less than 80°. The spectra reconstructed between 380 nm and 760 nm at various directions deviate between the two instruments by less than 20% for all sky conditions.

Tohsing, K.; Schrempf, M.; Riechelmann, S.; Seckmeyer, G.

2014-01-01
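The regression step in this record, mapping the camera's three band values to radiance at each wavelength, can be sketched with a quadratic feature expansion fitted by least squares. The feature set and the synthetic data below are assumptions for illustration; the paper's actual non-linear regression form is not stated in the abstract:

```python
import numpy as np

def features(rgb):
    """Quadratic feature expansion of camera R, G, B values
    (the exact regression form is assumed, not from the paper)."""
    r, g, b = rgb.T
    return np.column_stack([np.ones_like(r), r, g, b,
                            r * g, g * b, r * b,
                            r**2, g**2, b**2])

rng = np.random.default_rng(0)
rgb = rng.uniform(0, 1, size=(200, 3))        # sky pixels (synthetic)
w_true = rng.normal(size=10)                  # "true" model, one wavelength
radiance = features(rgb) @ w_true             # synthetic reference radiance

# Fit the regression weights from (RGB, spectroradiometer) pairs.
w_fit, *_ = np.linalg.lstsq(features(rgb), radiance, rcond=None)
assert np.allclose(w_fit, w_true, atol=1e-6)
```

One such regression per wavelength bin, fitted separately per sky-condition class, turns the three-band camera image into an estimated spectrum for every direction in the hemisphere.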

244

High Contrast Imaging with the Magellan VisAO Camera  

NASA Astrophysics Data System (ADS)

The Magellan Adaptive Optics system (MagAO) saw first light in November 2012 at Las Campanas Observatory on the 6.5m Clay telescope. Here we present an introduction to MagAO's visible wavelength diffraction limited imager, VisAO. VisAO delivers Strehl ratios greater than 30% from 0.62 microns (r') through 1 micron, where Strehl is even higher, and achieved resolutions as small as 20 milli-arcseconds. There are several design considerations which allow VisAO to achieve such good performance on a large telescope, and these will have important implications for ELT AO systems. We took advantage of the excellent performance of MagAO/VisAO to conduct high contrast observations of an exoplanet in the optical.

Males, Jared; Close, Laird; Morzinski, Katie; Kopon, Derek; Follette, Kate; Hinz, Philip; Rodigas, Timothy; Puglisi, Alfio; Esposito, Simone; Riccardi, Armando; Pinna, Enrico; Xompero, Marco; Briguglio, Runa; Wu, Ya-Lin; Uomoto, Alan; Hare, Tison

2013-12-01

245

Camera-Based Lock-in and Heterodyne Carrierographic Photoluminescence Imaging of Crystalline Silicon Wafers  

NASA Astrophysics Data System (ADS)

Carrierographic (spectrally gated photoluminescence) imaging of a crystalline silicon wafer using an InGaAs camera and two spread super-bandgap illumination laser beams is introduced in both low-frequency lock-in and high-frequency heterodyne modes. Lock-in carrierographic images of the wafer up to 400 Hz modulation frequency are presented. To overcome the frame rate and exposure time limitations of the camera, a heterodyne method is employed for high-frequency carrierographic imaging which results in high-resolution near-subsurface information. The feasibility of the method is guaranteed by the typical superlinearity behavior of photoluminescence, which allows one to construct a slow enough beat frequency component from nonlinear mixing of two high frequencies. Intensity-scan measurements were carried out with a conventional single-element InGaAs detector photocarrier radiometry system, and the nonlinearity exponent of the wafer was found to be around 1.7. Heterodyne images of the wafer up to 4 kHz have been obtained and qualitatively analyzed. With the help of the complementary lock-in and heterodyne modes, camera-based carrierographic imaging in a wide frequency range has been realized for fundamental research and industrial applications toward in-line nondestructive testing of semiconductor materials and devices.

Sun, Q. M.; Melnikov, A.; Mandelis, A.

2014-05-01

246

Long bone X-ray image stitching using Camera Augmented Mobile C-arm.  

PubMed

X-ray images are widely used during surgery for long bone fracture fixation. Mobile C-arms provide X-ray images which are used to determine the quality of trauma reduction, i.e. the extremity length and mechanical axis of long bones. Standard X-ray images have a narrow field of view and cannot visualize the entire long bone in a single image. In this paper, we propose a novel method to generate panoramic X-ray images in real time by using the previously introduced Camera Augmented Mobile C-arm. This advanced mobile C-arm system acquires registered X-ray and optical images by construction, which facilitates the generation of panoramic X-ray images by first stitching the optical images and then embedding the X-ray images. We additionally introduce a method to reduce the parallax effect that leads to blurring and measurement error in panoramic X-ray images. Visual marker tracking is employed to automatically stitch the sequence of video images and to rectify images. Our proposed method is suitable for intra-operative generation of panoramic X-ray images, which enable metric measurements, with less radiation and without the requirement of a fronto-parallel setup or overlapping X-ray images. The results show that the panoramic X-ray images generated by our method are accurate enough (errors less than 1%) for metric measurements and suitable for many clinical applications in trauma reduction. PMID:18982651

Wang, Lejing; Traub, Joerg; Heining, Sandro Michael; Benhimane, Selim; Euler, Ekkehard; Graumann, Rainer; Navab, Nassir

2008-01-01
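The optical stitching step in this record rests on estimating a planar homography from tracked marker correspondences. A minimal sketch of the standard direct linear transform (DLT) estimator follows; the abstract does not state which estimator the authors use, and the points and matrix below are synthetic:

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: estimate H (up to scale) such that
    dst ~ H @ src in homogeneous coordinates, from >= 4 matches."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)        # null vector of the constraints

def apply_h(H, pts):
    """Apply a homography to 2-D points (with perspective divide)."""
    p = np.column_stack([pts, np.ones(len(pts))]) @ H.T
    return p[:, :2] / p[:, 2:3]

src = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.5, 0.5]], float)
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.0, 0.9, -3.0],
                   [1e-3, 2e-3, 1.0]])
dst = apply_h(H_true, src)             # simulated marker matches
H = fit_homography(src, dst)
assert np.allclose(apply_h(H, src), dst, atol=1e-6)
```

With the homography between consecutive video frames known, the optical images can be warped into a common panorama, and the co-registered X-ray images are then embedded at the same coordinates.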

247

[The hyperspectral camera side-scan geometric imaging in any direction considering the spectral mixing].  

PubMed

In order to correct image distortion in hyperspectral camera side-scan geometric imaging, a pixel geo-referencing algorithm is derived in detail in the present paper; it is suitable for linear push-broom camera side-scan imaging on the ground in any direction and takes the orientation of objects in the navigation coordinate system into account. Combined with the ground sampling distance of the geo-referenced image and the area covered by push-broom imaging, the general process of dividing the geo-referenced image into grids is also presented: the new image rows and columns are obtained by dividing the geo-referenced image extent by the ground sampling distance. Considering the error introduced by rounding when generating the pixel grid, and the spectral mixing caused by the traditional direct spectral sampling method during image correction, an improved spectral sampling method based on weighted fusion is proposed. It takes the area proportions of the adjacent pixels within each newly generated pixel as coefficients, which are then normalized to avoid spectral overflow, so the spectrum of the new pixel is a combination of the spectra of the adjacent geo-referenced pixels. Finally, a number of push-broom imaging experiments were performed on the ground, and the distorted images were corrected with the proposed algorithm. The results show that the linear image distortion correction algorithm is valid and robust. Multiple samples were selected in the corrected images to verify the spectral data; the results indicate that the improved spectral sampling method outperforms direct spectral sampling. This provides a reference for the application of similar products on the ground. PMID:25269321

Wang, Shu-Min; Zhang, Ai-Wu; Hu, Shao-Xing; Sun, Wei-Dong

2014-07-01

248

Miniature indium gallium arsenide short-wave infrared camera for unattended imaging applications  

NASA Astrophysics Data System (ADS)

Indium gallium arsenide (InGaAs) focal plane arrays and cameras have demonstrated significant potential in battlefield applications. Room-temperature detectivities, D*, in excess of 10^14 cm·√Hz/W have enabled night-vision imaging under low-light-level conditions. The 0.9 µm to 1.7 µm wavelength band allows the use of eye-safe lasers for target designation and covert active illumination. We report here a miniature InGaAs camera designed for unattended ground sensor and robot-mounted applications. The camera is approximately the size of a D-cell battery, weighs less than 200 g, has a 320 × 240 pixel spatial resolution, and maintains D* > 10^14 cm·√Hz/W. The miniature camera is fully self-contained; the only input is DC power (3.6 V). The camera has both analog (RS170) and 12-bit digital (LVDS) video outputs. It is intended as a demonstration vehicle for battlefield distributed robotic vision but will find use in other applications as an unattended sensor or rifle sight.

Cohen, Marshall J.; O'Grady, Matthew T.; Vermaak, Jacobus S.; Groppe, Joseph V.; Olsen, Gregory H.

2000-07-01

249

An improved image restoration method for the high definition panoramic camera system  

NASA Astrophysics Data System (ADS)

An improved panoramic annular camera system with very high spatial bandwidth at a compact size, which images the surrounding environment through an annular lens onto an image sensor in quasi-real time, is described here. The image improvements are realized by using a larger annular lens and a larger CCD array. The flat cylinder perspective method, which is adopted by the annular lens, and the mapping relationship between the three-dimensional object space and the two-dimensional annular imagery were investigated. The annular images can be transformed into rectangular ones by an improved panoramic image restoration method proposed here. The calculation basis of that method is the circular image at a 90° viewing angle in the object space, which is always included in the field of view of annular lenses. As the large-format CCD requires a light shutter to function properly, a dual-layer twisted nematic liquid crystal box was adopted to switch the incident light on and off as a camera shutter built inside the lens assembly. It was, to the best of our knowledge, the first attempt to extend the use of a liquid crystal box as a camera light shutter. The lens resolution degradation induced by the liquid crystal light shutter is not evident, and supporting experimental results are provided. Subsequent digital image processing methods, which are necessary because of the imperfect liquid crystal light shutter, are described here to suppress the transfer smears occurring on the CCD array. Finally, a calibration experiment was set up to verify the image restoration method, and several outdoor experiments were conducted to demonstrate the high quality and rich information of the resulting images.
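At its core, transforming an annular image into a rectangular one is a polar-to-Cartesian resampling. A minimal nearest-neighbour sketch is shown below; all names and parameters are illustrative assumptions, and the paper's actual restoration additionally uses the 90° viewing-angle circle as its calibration basis:

```python
import math

def unwrap_annular(img, cx, cy, r_inner, r_outer, out_w, out_h):
    """Unwrap an annular panoramic image into a rectangular one.

    img: 2-D list of pixel values (rows of columns).
    (cx, cy): center of the annulus in pixel coordinates.
    Output row 0 maps to the inner radius, the last row to the outer
    radius; columns sweep azimuth 0..2*pi. Nearest-neighbour only.
    """
    h, w = len(img), len(img[0])
    out = [[0] * out_w for _ in range(out_h)]
    for v in range(out_h):
        r = r_inner + (r_outer - r_inner) * v / (out_h - 1)
        for u in range(out_w):
            theta = 2.0 * math.pi * u / out_w
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if 0 <= x < w and 0 <= y < h:
                out[v][u] = img[y][x]
    return out

# Synthetic 5x5 image whose pixel value encodes its coordinates (y*5 + x).
img = [[y * 5 + x for x in range(5)] for y in range(5)]
rect = unwrap_annular(img, cx=2, cy=2, r_inner=1, r_outer=2, out_w=4, out_h=2)
```

A production implementation would use bilinear or bicubic interpolation instead of nearest-neighbour sampling to avoid aliasing along the unwrapped rows.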

Zhao, Liefeng; Feng, Huajun; Bai, Jian; Liu, Xu; Li, Haifeng

2009-09-01

250

Simultaneous Images of the Northern and Southern Auroras with the Earth Camera on the Polar Spacecraft  

NASA Astrophysics Data System (ADS)

During the period 22 October through late November, 2001, the Polar spacecraft was fortuitously positioned near Earth's equatorial plane such that the wide field-of-view of its far-ultraviolet camera was able to capture simultaneous images of the auroras in the Northern and Southern Hemispheres. This Earth Camera provided sufficient resolution of auroral features that the luminosities could be mapped into corrected geomagnetic coordinates for the first time. A few such images were previously available from the Dynamics Explorer-1 spacecraft, but the viewing aspect and the spatial resolution of the camera were insufficient to map the luminosities onto the Earth's surface. The viewing period for simultaneous imaging of the two auroras with the Polar spacecraft was about one hour during the spacecraft orbital period of 18.3 hours. The results of such imaging were exciting in two important areas of auroral research. These areas were the behavior of the auroral luminosities during substorm activity and the mapping of magnetic field lines from auroral altitudes into the equatorial magnetosphere. One of the principal results concerning substorms was the unequal auroral intensities between the two hemispheres during the onset and the expansive phase of substorms which imply strong coupling between the ionosphere and magnetosphere during these events. With regard to the mapping of field lines it is found that further work on the global magnetic models is required in order to provide accuracies of 1 RE earthward of radial distances of 20 RE.

Frank, L. A.; Sigwarth, J. B.

2002-05-01

251

New design of a gamma camera detector with reduced edge effect for breast imaging  

NASA Astrophysics Data System (ADS)

In recent years, there has been a growing interest in developing small gamma cameras dedicated to breast imaging. We designed a new detector with a trapezoidal shape to expand the field of view (FOV) of the camera without increasing its dimensions. To find optimal parameters, images of point sources at the edge area, as functions of the trapezoidal angle and the optical treatment of the crystal side surface, were simulated using DETECT2000. Our detector employs monolithic CsI(Tl) with dimensions of 48.0 × 48.0 × 6.0 mm coupled to an array of photo-sensors. The side surfaces of the crystal were treated with three different surface finishes: black absorber, metal reflector and white reflector. The trapezoidal angle varied from 45° to 90° in steps of 15°. Gamma events were generated at 15 evenly spaced points at 1.0 mm intervals along the X-axis, starting 1.0 mm away from the side surface. Ten thousand gamma events were simulated at each location and images were formed by applying Anger logic. The results demonstrated that all 15 points could be identified only for the crystal with a trapezoidal shape having a 45° angle and a white reflector on the side surface. In conclusion, our new detector proved to be a reliable design for expanding the FOV of a small gamma camera for breast imaging.
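The Anger logic used to form these images is the classic signal-weighted centroid of the photo-sensor outputs. A minimal sketch, with illustrative sensor positions rather than the paper's actual array geometry:

```python
def anger_position(signals, positions):
    """Anger-logic event localization.

    The estimated interaction position is the signal-weighted mean of
    the photo-sensor center positions.
    signals: light output measured by each photo-sensor.
    positions: (x, y) center of each photo-sensor, e.g. in mm.
    """
    total = sum(signals)
    x = sum(s * p[0] for s, p in zip(signals, positions)) / total
    y = sum(s * p[1] for s, p in zip(signals, positions)) / total
    return x, y

# Equal light on two sensors at x = 0 mm and x = 2 mm places the
# event midway between them.
pos = anger_position([1.0, 1.0], [(0.0, 0.0), (2.0, 0.0)])
```

The edge effect the abstract addresses arises because, near the crystal boundary, light is truncated asymmetrically and this centroid estimate is biased inward; the trapezoidal crystal with a white side reflector mitigates that bias.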

Yeon Hwang, Ji; Lee, Seung-Jae; Baek, Cheol-Ha; Hyun Kim, Kwang; Hyun Chung, Yong

2011-05-01

252

First responder thermal imaging cameras: establishment of representative performance testing conditions  

NASA Astrophysics Data System (ADS)

Thermal imaging cameras are rapidly becoming integral equipment for first responders for use in structure fires and other emergencies. Currently there are no standardized performance metrics or test methods available to the users and manufacturers of these instruments. The Building and Fire Research Laboratory (BFRL) at the National Institute of Standards and Technology is conducting research to establish test conditions that best represent the environment in which these cameras are used. First responders may use thermal imagers for field operations ranging from fire attack and search/rescue in burning structures, to hot spot detection in overhaul activities, to detecting the location of hazardous materials. In order to develop standardized performance metrics and test methods that capture the harsh environment in which these cameras may be used, information has been collected from the literature and from full-scale tests conducted at BFRL. Initial experimental work has focused on temperature extremes and the presence of obscuring media such as smoke. In full-scale tests, thermal imagers viewed a target through smoke, dust, and steam, with and without flames in the field of view. The fuels tested were hydrocarbons (methanol, heptane, propylene, toluene), wood, upholstered cushions, and carpeting with padding. Gas temperatures; CO, CO2, and O2 volume fractions; emission spectra; and smoke concentrations were measured. Simple thermal bar targets and a heated mannequin fitted in firefighter gear were used as targets. The imagers were placed at three distances from the targets, ranging from 3 m to 12 m.

Amon, Francine; Hamins, Anthony; Rowe, Justin

2006-04-01

253

Imaging of blood vessels with CCD-camera based three-dimensional photoacoustic tomography  

NASA Astrophysics Data System (ADS)

An optical phase contrast full field detection setup in combination with a CCD camera is presented to record acoustic fields for real-time projection and fast three-dimensional imaging. When recording projection images of the wave pattern around the imaging object, the three-dimensional photoacoustic imaging problem is reduced to a set of two-dimensional reconstructions and the measurement setup requires only a single axis of rotation. Using a 10 Hz pulsed laser system for photoacoustic excitation, a three-dimensional image can be obtained in less than 1 min. The sensitivity and resolution of the detection system were estimated experimentally to be 5 kPa·mm and 75 µm, respectively. Experiments on biological samples show the applicability of this technique for imaging blood vessel distributions.

Nuster, Robert; Slezak, Paul; Paltauf, Guenther

2014-03-01

254

Technique for improving the quality of images from digital cameras using ink-jet printers and smoothed RGB transfer curves  

NASA Astrophysics Data System (ADS)

The digital camera market is growing at an explosive rate. At the same time, the quality of photographs printed on ink-jet printers continues to improve. Most consumer cameras are designed with the monitor, not the printer, as the target output device. When users print images from a camera, they need to optimize the camera and printer combination in order to maximize image quality. We describe the details of one such method for improving image quality using an Agfa digital camera and ink-jet printer combination. Using Adobe Photoshop, we generated optimum red, green and blue transfer curves that match the scene content to the printer's output capabilities. Application of these curves to the original digital image resulted in a print with more shadow detail, no loss of highlight detail, a smoother tone scale, and more saturated colors. The image also exhibited an improved tonal scale and was visually more pleasing than those captured and printed without any 'correction'. While we report the results for one camera-printer combination, we tested this technique on numerous digital camera and printer combinations and in each case produced a better-looking image. We also discuss the problems we encountered in implementing this technique.

Sampat, Nitin; Grim, John F.; O'Hara, James E.

1998-04-01

255

Real Time Speed Estimation of Moving Vehicles from Side View Images from an Uncalibrated Video Camera  

PubMed Central

In order to estimate the speed of a moving vehicle from side-view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle is selected, and these points must be accurately tracked across at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system, and the displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find the absolute values of these magnitudes. This transformation requires image-to-object-space information in a mathematical sense, which is obtained from the calibration and orientation parameters of the video frame images. This paper presents proposed solutions for the problems of using side-view camera images mentioned here. PMID:22399909
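The second step above reduces to a unit conversion once calibration provides an image-to-object scale. A minimal sketch, assuming the calibration step yields a single ground sampling distance in metres per pixel (the real paper derives a full image-to-object transformation; the names here are illustrative):

```python
def speed_from_displacement(dx_px, dy_px, gsd_m, dt_s):
    """Convert an image-space displacement between two frames into an
    object-space speed.

    dx_px, dy_px: tracked point displacement in pixels.
    gsd_m: ground sampling distance in metres/pixel (from calibration).
    dt_s: time elapsed between the two frames in seconds.
    Returns speed in m/s.
    """
    mag_px = (dx_px ** 2 + dy_px ** 2) ** 0.5   # displacement magnitude, px
    return mag_px * gsd_m / dt_s

# A 50-pixel displacement at 0.01 m/px over 0.04 s -> 12.5 m/s (45 km/h).
v = speed_from_displacement(30, 40, 0.01, 0.04)
```

In practice the metres-per-pixel scale varies across a side-view image with depth and perspective, which is exactly why the paper needs the calibration and orientation parameters rather than one constant factor.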

Dogan, Sedat; Temiz, Mahir Serhan; Kulur, Sıtkı

2010-01-01

256

Development of a high-speed camera system for neutron imaging at a pulsed neutron source  

NASA Astrophysics Data System (ADS)

A neutron energy-resolved imaging system using a time-of-flight technique has been newly developed and installed at the Japan Proton Accelerator Research Complex (J-PARC), with the aim of investigating more precisely and rapidly the spatial distribution of several elements and crystals in various kinds of materials or substances. A system equipped with a high-speed video camera (CMOS, 1300 k frame/s) makes it possible to obtain TOF images consecutively resolved into narrow energy ranges from a single neutron pulse, whereas a conventional CCD camera imaging system could obtain only one TOF image in an arbitrary neutron energy region within the pulsed neutron energy range from 0.01 eV to a few keV. Qualities of the images obtained with the system, such as spatial resolution (defined by the modulation transfer function, 0.8 line-pairs/mm at E_n ≈ 0.01 eV), the dependence of brightness on neutron energy, and the measurement errors (≈2%) of the system, were examined experimentally and evaluated by comparison with those of the conventional imaging system. The results obtained in the experiments show that the system can visualize neutron energy-resolved images with small error even at high speed.
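The time-of-flight technique assigns each detected neutron an energy from its flight time over a known path, via the non-relativistic relation E = ½ m (L/t)². A short sketch (the flight-path length used below is illustrative, not the J-PARC beamline value):

```python
MN = 1.674927498e-27   # neutron mass, kg (CODATA)
EV = 1.602176634e-19   # joules per electronvolt

def neutron_energy_ev(flight_path_m, tof_s):
    """Neutron kinetic energy from time of flight.

    E = (1/2) * m * (L/t)^2, converted to eV. The non-relativistic
    form is adequate for the eV-to-keV range discussed above.
    """
    v = flight_path_m / tof_s          # neutron speed, m/s
    return 0.5 * MN * v * v / EV

# A ~2187 m/s neutron over a hypothetical 10 m flight path is thermal
# (~0.025 eV); later-arriving neutrons map to lower energies.
e_thermal = neutron_energy_ev(10.0, 10.0 / 2187.0)
```

Because energy falls as 1/t², each camera frame after the pulse corresponds to a progressively narrower slice of low energies, which is what lets the high-speed camera build an energy-resolved image stack from one pulse.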

Segawa, M.; Kai, T.; Sakai, T.; Ooi, M.; Kureta, M.

2013-01-01

257

Applications of a streak-camera-based imager with simultaneous high space and time resolution  

NASA Astrophysics Data System (ADS)

A high-speed imaging device has been built that is capable of recording several hundred images over a time span of 25 to 400 ns. The imager is based on a streak camera, which provides both spatial and temporal resolution. The system's current angular resolution is 16 × 16 pixels, with a time resolution of 250 ps. It was initially employed to provide 3-D images of objects, in conjunction with a short-pulse (approximately 100 ps) laser. For the 3-D (angle-angle-range) laser radar, the 250 ps time resolution corresponds to a range resolution of 4 cm. In the 3-D system, light from a short-pulse laser (a frequency-doubled, Q-switched, mode-locked Nd:YAG laser operating at a wavelength of 532 nm) flood-illuminates a target of linear dimension approximately 1 m. The returning light from the target is imaged, and the image is dissected by a 16 × 16 array of optical fibers. At the other end of the fiber optic image converter, the 256 fibers form a vertical line array, which is input to the slit of a streak camera. The streak camera sweeps the input line across the output phosphor screen so that horizontal position is directly proportional to time. The resulting 2-D image (fiber location vs. time) at the phosphor is read by an intensified (SIT) vidicon TV tube, and the image is digitized and stored. A computer subsequently decodes the image, unscrambling the linear pixels into an angle-angle image at each time or range bin. We are left with a series of snapshots, each one depicting the portion of target surface in a given range bin. The pictures can be combined to form a 3-D realization of the target. Continuous recording of many images over a short time span is of use in imaging other transient phenomena. These applications share a need for multiple images from a nonrepeatable transient event of time duration on the order of nanoseconds.
Applications discussed for the imager include (1) pulsed laser beam diagnostics -- measuring laser beam spatial and temporal structure, (2) reflectivity monitoring during pulsed laser annealing of microelectronics, and (3) detonics or shock wave research, especially microscopic studies of shocks produced by laser pulses.
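The quoted 4 cm range resolution follows directly from the 250 ps time resolution: for round-trip laser ranging, a time bin Δt corresponds to a range bin ΔR = cΔt/2. A one-line check:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_resolution(dt_s):
    """Range bin depth for round-trip (radar/ladar) timing: dR = c*dt/2.

    The factor of 2 accounts for light travelling to the target and back.
    """
    return C * dt_s / 2.0

# 250 ps of streak-camera time resolution -> about 3.7 cm of range,
# consistent with the ~4 cm figure quoted in the abstract.
dr = range_resolution(250e-12)
```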

Klick, David I.; Knight, Frederick K.

1993-01-01

258

Imaging Observations of Thermal Emissions from Augustine Volcano Using a Small Astronomical Camera  

USGS Publications Warehouse

Long-exposure visible-light images of Augustine Volcano were obtained using a charge-coupled device (CCD) camera during several nights of the 2006 eruption. The camera was located 105 km away, at Homer, Alaska, yet showed persistent bright emissions from the north flank of the volcano corresponding to steam releases, pyroclastic flows, and rockfalls originating near the summit. The apparent brightness of the emissions substantially exceeded that of the background nighttime scene. The bright signatures in the images are shown to probably be thermal emissions detected near the long-wavelength limit (~1 µm) of the CCD. Modeling of the emissions as black-body brightness yields an apparent temperature of 400 to 450 °C that likely reflects an unresolved combination of emissions from hot ejecta and cooler material.
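The black-body modelling rests on Planck's law: near the CCD's ~1 µm cutoff, spectral radiance climbs very steeply with temperature, so a brightness measurement constrains an apparent temperature well. A sketch of the underlying radiance calculation (generic physics, not the authors' fitting code):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance B(lambda, T) in W * sr^-1 * m^-3.

    B = (2hc^2 / lambda^5) / (exp(hc / (lambda k T)) - 1)
    """
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    x = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(x) - 1.0)

# At 1 um the radiance of a 450 C surface is several times that of a
# 400 C surface, which is what makes the apparent-temperature fit sharp.
b_400 = planck_radiance(1e-6, 400.0 + 273.15)
b_450 = planck_radiance(1e-6, 450.0 + 273.15)
```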

Sentman, Davis D.; McNutt, Stephen R.; Stenbaek-Nielsen, Hans C.; Tytgat, Guy; DeRoin, Nicole

2010-01-01

259

A curve fitting method for extrinsic camera calibration from a single image of a cylindrical object  

NASA Astrophysics Data System (ADS)

An important step in the process of optical steel coil quality assurance is to measure the proportions of width and radius of steel coils as well as the relative position and orientation of the camera. This work attempts to estimate these extrinsic parameters from single images by using the cylindrical coil itself as the calibration target. Therefore, an adaptive least-squares algorithm is applied to fit parametrized curves to the detected true coil outline in the acquisition. The employed model allows for strictly separating the intrinsic and the extrinsic parameters. Thus, the intrinsic camera parameters can be calibrated beforehand using available calibration software. Furthermore, a way to segment the true coil outline in the acquired images is motivated. The proposed optimization method yields highly accurate results and can be generalized even to measure other solids which cannot be characterized by the identification of simple geometric primitives.

Winkler, A. W.; Zagar, B. G.

2013-08-01

260

Manufacturing and integration of the IRDIS dual imaging camera and spectrograph for SPHERE  

NASA Astrophysics Data System (ADS)

SPHERE is a planet hunting instrument for the VLT 8m telescope in Chile whose prime objective is the discovery and characterization of young Jupiter-sized planets outside of the solar system. It is a complex instrument, consisting of an extreme Adaptive Optics System (SAXO), various coronagraphs, an infrared differential imaging camera (IRDIS), an infrared integral field spectrograph (IFS) and a visible differential polarimeter (ZIMPOL). The performance of the IRDIS camera is directly related to various wavefront error budgets of the instrument, in particular the differential aberrations occurring after separation of the two image beams. We report on the ongoing integration and testing activities in terms of optical, mechanical, and cryo-vacuum instrument parts. In particular, we show results of component level tests of the optics and indicate expected overall performance in comparison with design-level budgets. We also describe the plans for instrumental performance and science testing of the instrument, foreseen to be conducted during coming months.

Dohlen, Kjetil; Carle, Michael; Madec, Fabrice; Langlois, Maud; Le Mignant, David; Saisse, Michel; Vigan, Arthur; Arthaud, Gilles; Barette, Rudy; Benedetti, Jean-Antoine; Blanc, Jean-Claude; Blanchard, Patrick; Bon, William; Castinel, Louis; Fabron, Christophe; Hill, Lucien; Jaquet, Marc; Laurent, Philippe; Llored, Marc; Manzone, Nataly; Mazzanti, Silvio; Melkonian, Jeanne; Moreaux, Gabriel; Moutou, Claire; Origne, Alain; Feldt, Markus; Naranjo, Vianak; Rohloff, Ralf-Rainer; Beuzit, Jean-Luc; Gluck, Laurence; Mouillet, David; Puget, Pascal; Baruffolo, Andrea; Wildi, Francois; Dorn, Reinhold; Finger, Gert; Hubin, Norbert; Kasper, Markus; Lizon, Jean-Louis

2010-07-01

261

Mars Exploration Rover (MER) Panoramic Camera (Pancam) Twilight Image Analysis for Determination of Planetary Boundary Layer and Dust Particle Size Parameters  

E-print Network

MARS EXPLORATION ROVER (MER) PANORAMIC CAMERA (PANCAM) TWILIGHT IMAGE ANALYSIS FOR DETERMINATION OF PLANETARY BOUNDARY LAYER AND DUST PARTICLE SIZE PARAMETERS. A Thesis by STEPHANIE BETH GROUNDS. Submitted to the Office of Graduate... Copyright 2010 Stephanie Beth Grounds.

Grounds, Stephanie Beth

2012-02-14

262

Imaging performance of the mini-mosaic camera at the WIYN telescope  

Microsoft Academic Search

The goal of many recent engineering improvements at the 3.5 m WIYN telescope has been to improve imaging performance that utilizes the good intrinsic seeing at Kitt Peak. This direction complements the efforts of high-order adaptive optics by maximizing the usable field. The new 'mini-mosaic' camera, which is a mosaic of two 4K × 2K SITe CCDs, is in the

Abhijit Saha; Taft Armandroff; David G. Sawyer; Charles Corson

2000-01-01

263

Cosmic Rays In Multi-Orbit Images With The HST Wide Field Planetary Camera 2  

Microsoft Academic Search

In the July 1994 PASP (Vol. 106, p. 798), we presented an optimized algorithm that removes cosmic rays (CRs) from multi-orbit HST Wide Field/Planetary Camera (WF/PC-1) images. The algorithm was shown to work best at clipping levels of ≈ 2.0–2.5 σ_Poisson when 4–6 unshifted exposures are available, while for 6–12 exposures the optimal clipping level is ≈ 1.8–2.0 σ_Poisson.

Barbara E. Franklin; Rogier A. Windhorst

1994-01-01

264

Geiger-mode APD camera system for single-photon 3D LADAR imaging  

NASA Astrophysics Data System (ADS)

The unparalleled sensitivity of 3D LADAR imaging sensors based on single photon detection provides substantial benefits for imaging at long stand-off distances and minimizing laser pulse energy requirements. To obtain 3D LADAR images with single photon sensitivity, we have demonstrated focal plane arrays (FPAs) based on InGaAsP Geiger-mode avalanche photodiodes (GmAPDs) optimized for use at either 1.06 µm or 1.55 µm. These state-of-the-art FPAs exhibit excellent pixel-level performance and the capability for 100% pixel yield on a 32 × 32 format. To realize the full potential of these FPAs, we have recently developed an integrated camera system providing turnkey operation based on FPGA control. This system implementation enables the extremely high frame-rate capability of the GmAPD FPA, and frame rates in excess of 250 kHz (for 0.4 µs range gates) can be accommodated using an industry-standard CameraLink interface in full configuration. Real-time data streaming for continuous acquisition of 2 µs range gate point cloud data with 13-bit time-stamp resolution at 186 kHz frame rates has been established using multiple solid-state storage drives. Range gate durations spanning 4 ns to 10 µs provide broad operational flexibility. The camera also provides real-time signal processing in the form of multi-frame gray-scale contrast images and single-frame time-stamp histograms, and automated bias control has been implemented to maintain a constant photon detection efficiency in the presence of ambient temperature changes. A comprehensive graphical user interface has been developed to provide complete camera control using a simple serial command set, and this command set supports highly flexible end-user customization.

Entwistle, Mark; Itzler, Mark A.; Chen, Jim; Owens, Mark; Patel, Ketan; Jiang, Xudong; Slomkowski, Krystyna; Rangwala, Sabbir

2012-06-01

265

A SPECT Scanner for Rodent Imaging Based on Small-Area Gamma Cameras  

Microsoft Academic Search

We developed a cost-effective SPECT scanner prototype (rSPECT) for in vivo imaging of rodents based on small-area gamma cameras. Each detector consists of a position-sensitive photomultiplier tube (PS-PMT) coupled to a 30 × 30 NaI(Tl) scintillator array and electronics attached to the PS-PMT sockets for adapting the detector signals to an in-house developed data acquisition system. The detector components are

Eduardo Lage; José L. Villena; Gustavo Tapias; Naira P. Martínez; Maria L. Soto-Montenegro; Mónica Abella; Alejandro Sisniega; Francisco Pino; Domènec Ros; Javier Pavia; Manuel Desco; Juan J. Vaquero

2010-01-01

266

Portable retinal imaging for eye disease screening using a consumer-grade digital camera  

NASA Astrophysics Data System (ADS)

The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held, retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50mm focal lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.

Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter

2012-03-01

267

Volcano geodesy at Santiaguito using ground-based cameras and particle image velocimetry  

NASA Astrophysics Data System (ADS)

The active Santiaguito dome in Guatemala is an exceptional field site for ground-based optical observations owing to the bird's-eye viewing perspective from neighboring Santa Maria Volcano. From the summit of Santa Maria the frequent (1 per hour) explosions and continuous lava flow effusion may be observed from a vantage point at a ~30 degree elevation angle, 1200 m above and 2700 m distant from the active vent. At these distances both video cameras and SLR cameras fitted with high-power lenses can effectively track blocky features translating and uplifting on the surface of Santiaguito's dome. We employ particle image velocimetry in the spatial frequency domain to map movements of ~10 × 10 m^2 surface patches with better than 10 cm displacement resolution. During three field campaigns to Santiaguito in 2007, 2009, and 2012 we used cameras to measure dome surface movements over a range of time scales. In 2007 and 2009 we used video cameras recording at 30 fps to track repeated rapid dome uplift (more than 1 m within 2 s) of the 30,000 m^2 dome associated with the onset of eruptive activity. We inferred that these uplift events were responsible for both a seismic long-period response and a bimodal infrasound pulse. In 2012 we returned to Santiaguito to quantify dome surface movements over hour-to-day-long time scales by recording time-lapse imagery at one-minute intervals. These longer time scales reveal dynamic structure in the uplift and subsidence trends, effusion rate, and surface flow patterns that is related to internal conduit pressurization. In 2012 we also performed particle image velocimetry with multiple, spatially separated cameras in order to reconstruct three-dimensional surface movements.
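Particle image velocimetry in the spatial frequency domain typically estimates patch displacement from the peak of a cross-correlation computed via FFTs. A minimal integer-pixel sketch with NumPy (a generic formulation, not the authors' implementation, which also achieves sub-pixel precision):

```python
import numpy as np

def piv_shift(patch_a, patch_b):
    """Integer-pixel displacement of patch_b relative to patch_a.

    Computes the circular cross-correlation in the frequency domain
    (ifft2 of conj(FFT(a)) * FFT(b)) and takes its peak location,
    unwrapping shifts larger than half the patch into negative values.
    """
    fa = np.fft.fft2(patch_a)
    fb = np.fft.fft2(patch_b)
    xcorr = np.fft.ifft2(fa.conj() * fb).real
    dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    h, w = xcorr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

# Synthetic check: circularly shift a random patch by (3, 5) and recover it.
rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = np.roll(a, (3, 5), axis=(0, 1))
```

Sub-pixel (better than 10 cm at ~10 m patch scales) accuracy is usually obtained by fitting a paraboloid or Gaussian to the correlation peak and its neighbours rather than taking the raw argmax.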

Johnson, J.; Andrews, B. J.; Anderson, J.; Lyons, J. J.; Lees, J. M.

2012-12-01

268

Camera model and calibration process for high-accuracy digital image metrology of inspection planes  

NASA Astrophysics Data System (ADS)

High-accuracy digital image-based metrology must rely on an integrated model of image generation that simultaneously considers the geometry of camera-versus-object positioning and the conversion of the optical image on the sensor into an electronic digital format. In applications of automated visual inspection involving the analysis of approximately planar objects, these models are generally simplified in order to facilitate the process of camera calibration. In this context, the lack of rigor in the determination of the intrinsic parameters of such models is particularly relevant. Aiming at high-accuracy metrology of the contours of objects lying on an analysis plane, and involving sub-pixel measurements, this paper presents a three-stage camera model that includes an extrinsic component of perspective distortion and the intrinsic components of radial lens distortion and sensor misalignment. The latter two factors are crucial in applications of machine vision that rely on low-cost optical components. A polynomial model for the negative radial lens distortion of wide-field-of-view CCTV lenses is also established.
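A polynomial radial distortion model of the general kind mentioned here maps ideal image coordinates to distorted ones with even powers of the radius. A minimal sketch (standard two-coefficient form with hypothetical coefficient values; the paper's exact polynomial and coefficients are not reproduced):

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Apply an even-order polynomial radial distortion model.

    (x, y) are ideal, distortion-free normalized image coordinates
    relative to the principal point. The distorted coordinates are
    scaled by 1 + k1*r^2 + k2*r^4; a negative k1 gives the barrel
    (negative radial) distortion typical of wide-FOV CCTV lenses.
    """
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# With k1 = -0.1, a point at radius 1 is pulled inward to radius 0.9,
# while points near the principal point are nearly unaffected.
xd, yd = apply_radial_distortion(1.0, 0.0, k1=-0.1)
```

Calibration estimates k1 (and k2) by minimizing the reprojection error of known targets; metrology then inverts this mapping, typically by iteration, since the polynomial has no closed-form inverse.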

Correia, Bento A. B.; Dinis, Joao

1998-10-01

269

Gamma camera imaging for studying intestinal absorption and whole-body distribution of selenomethionine.  

PubMed

Se metabolism in humans is not well characterised. Currently, the estimates of Se absorption, whole-body retention and excretion are being obtained from balance and tracer studies. In the present study, we used gamma camera imaging to evaluate the whole-body retention and distribution of radiolabelled selenomethionine (SeMet), the predominant form of Se present in foods. A total of eight healthy young men participated in the study. After consumption of a meal containing 4 MBq [??Se]L-SeMet ([??Se]SeMet), whole-body gamma camera scanning was performed for 45 min every hour over a 6 h period, every second hour for the next 18 h and once on each of the subsequent 6 d. Blood, urine and faecal samples were collected to determine the plasma content of [??Se]SeMet as well as its excretion in urine and faeces. Imaging showed that 87·9 (sd 3·3)% of the administered activity of [??Se]SeMet was retained within the body after 7 d. In contrast, the measured excretion in urine and faeces for the 7 d period was 8·2 (sd 1·1)% of the activity. Time-activity curves were generated for the whole body, stomach, liver, abdomen (other than the stomach and the liver), brain and femoral muscles. Gamma camera imaging allows for the assessment of the postprandial absorption of SeMet. This technique may also permit concurrent studies of organ turnover of SeMet. PMID:23930999

Madsen, Jan L; Sjögreen-Gleisner, Katarina; Elema, Dennis R; Søndergaard, Lasse R; Rasmussen, Palle; Fuglsang, Stefan; Ljungberg, Michael; Damgaard, Morten

2014-02-01

270

Improving estimates of leaf area index by processing RAW images in upward-pointing-digital cameras  

NASA Astrophysics Data System (ADS)

Leaf Area Index (LAI) measurement using upward-pointing digital cameras on the forest floor has gained great attention owing to the feasibility of measuring LAI continuously at high accuracy. However, upward-pointing digital cameras can underestimate LAI when photos are overexposed, which makes leaves against the bright sky disappear from the photo. Processing RAW images can reduce the likelihood of LAI underestimation. This study aims to develop a RAW image processing workflow and compare RAW-derived LAI with JPEG-derived LAI. Digital photos were automatically taken three times per day (0.5, 1, and 1.5 hours before sunset) in both RAW and JPEG formats at the Gwangreung deciduous and evergreen forests in South Korea. We used the blue channel of the RAW images to quantify gap fraction, and then LAI. LAI estimates from JPEG and RAW images do not show substantial differences in the deciduous forest. However, LAI derived from RAW images at the evergreen forest, where the forest floor is fairly dark even in daytime, shows substantially less noise and greater values than JPEG-derived LAI. This study concludes that LAI estimates should be derived from RAW images for more accurate measurement of LAI.
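Gap-fraction methods of this kind usually invert a Beer-Lambert light-extinction model: the fraction of sky visible through the canopy decays exponentially with LAI. A simplified sketch (a single-angle inversion with an assumed extinction coefficient; the study's actual processing chain is more involved):

```python
import math

def gap_fraction(blue_channel, sky_threshold):
    """Fraction of pixels classified as sky in the blue channel.

    blue_channel: flat iterable of blue-channel pixel values.
    sky_threshold: values at or above this count as sky (a gap).
    """
    pixels = list(blue_channel)
    sky = sum(1 for v in pixels if v >= sky_threshold)
    return sky / len(pixels)

def lai_from_gap_fraction(gap, k=0.5):
    """Invert the Beer-Lambert model gap = exp(-k * LAI).

    k is the canopy extinction coefficient; 0.5 (a spherical
    leaf-angle distribution) is an assumption for illustration.
    """
    return -math.log(gap) / k

# Half the pixels classified as sky -> gap fraction 0.5 -> LAI ~1.39.
g = gap_fraction([10, 200, 250, 30], sky_threshold=100)
lai = lai_from_gap_fraction(g)
```

The overexposure problem the abstract describes shows up here directly: blown-out leaf pixels cross the sky threshold, inflating the gap fraction and hence deflating the estimated LAI, which RAW processing mitigates.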

Jeon, S.; Ryu, Y.

2013-12-01

271

Imaging microscopic structures in pathological retinas using a flood-illumination adaptive optics retinal camera  

NASA Astrophysics Data System (ADS)

This research is aimed at characterizing in vivo differences between healthy and pathological retinal tissues at the microscopic scale using a compact adaptive optics (AO) retinal camera. Tests were performed in 120 healthy eyes and 180 eyes suffering from 19 different pathological conditions, including age-related maculopathy (ARM), glaucoma and rare diseases such as inherited retinal dystrophies. Each patient was first examined using SD-OCT and infrared SLO. Retinal areas of 4°×4° were imaged using an AO flood-illumination retinal camera based on a large-stroke deformable mirror. Contrast was then enhanced by registering and averaging the raw images using classical algorithms. Cellular-resolution images could be obtained in most cases. In ARM, AO images revealed granular contents in drusen, which were invisible in SLO or OCT images, and allowed the observation of the cone mosaic between drusen. In glaucoma cases, visual field findings were correlated with changes in cone visibility. In inherited retinal dystrophies, AO helped to evaluate cone loss across the retina. Other microstructures, slightly larger in size than cones, were also visible in several retinas. AO provided potentially useful diagnostic and prognostic information in various diseases. In addition to cones, other microscopic structures revealed by AO images may also be of interest in monitoring retinal diseases.

Viard, Clément; Nakashima, Kiyoko; Lamory, Barbara; Pâques, Michel; Levecq, Xavier; Château, Nicolas

2011-03-01

272

COMPACT CdZnTe-BASED GAMMA CAMERA FOR PROSTATE CANCER IMAGING  

SciTech Connect

In this paper, we discuss the design of a compact gamma camera for high-resolution prostate cancer imaging using Cadmium Zinc Telluride (CdZnTe or CZT) radiation detectors. Prostate cancer is a common disease in men. Nowadays, a blood test measuring the level of prostate specific antigen (PSA) is widely used for screening for the disease in males over 50, followed by (ultrasound) imaging-guided biopsy. However, PSA tests have a high false-positive rate, and ultrasound-guided biopsy has a high likelihood of missing small cancerous tissues. Commercial methods of nuclear medical imaging, e.g. PET and SPECT, can functionally image the organs and potentially find cancer tissues at early stages, but their application in diagnosing prostate cancer has been limited by the smallness of the prostate gland and the long working distance between the organ and the detectors comprising these imaging systems. CZT is a semiconductor material with a wide band-gap and relatively high electron mobility, and thus can operate at room temperature without additional cooling. CZT detectors are photon-electron direct-conversion devices, thus offering high energy resolution in detecting gamma rays, enabling energy-resolved imaging, and reducing the background of Compton-scattering events. In addition, CZT material has high stopping power for gamma rays; for medical imaging, a few-mm-thick layer of CZT provides adequate detection efficiency for many SPECT radiotracers. Because of these advantages, CZT detectors are becoming popular for several SPECT medical-imaging applications. Most recently, we designed a compact gamma camera using CZT detectors coupled to an application-specific integrated circuit (ASIC). This camera functions as a trans-rectal probe to image the prostate gland from a distance of only 1-5 cm, thus offering higher detection efficiency and higher spatial resolution. Hence, it potentially can detect prostate cancers at their early stages.
The performance tests of this camera have been completed. The results show better than 6-mm resolution at a distance of 1 cm. Details of the test results are discussed in this paper.

CUI, Y.; LALL, T.; TSUI, B.; YU, J.; MAHLER, G.; BOLOTNIKOV, A.; VASKA, P.; DeGERONIMO, G.; O'CONNOR, P.; MEINKEN, G.; JOYAL, J.; BARRETT, J.; CAMARDA, G.; HOSSAIN, A.; KIM, K.H.; YANG, G.; POMPER, M.; CHO, S.; WEISMAN, K.; SEO, Y.; BABICH, J.; LaFRANCE, N.; AND JAMES, R.B.

2011-10-23

273

Diffuse reflection imaging of sub-epidermal tissue haematocrit using a simple RGB camera  

NASA Astrophysics Data System (ADS)

This paper describes the design and evaluation of a novel, easy-to-use tissue viability imaging system (TiVi). The system is based on the methods of diffuse reflectance spectroscopy and polarization spectroscopy. The technique has been developed as an alternative to current imaging technology in the area of microcirculation imaging, most notably optical coherence tomography (OCT) and laser Doppler perfusion imaging (LDPI). The system is based on standard digital camera technology and is sensitive to red blood cells (RBCs) in the microcirculation. The lack of clinical acceptance of both OCT and LDPI fuels the need for an objective, simple, reproducible and portable imaging method that can provide accurate measurements related to stimulus vasoactivity in the microvasculature. The limitations of these technologies are discussed in this paper. Uses of the Tissue Viability system include the evaluation of skin care products, drug development, and the assessment of spatial and temporal aspects of vasodilation (erythema) and vasoconstriction (blanching).

Leahy, Martin J.; O'Doherty, Jim; McNamara, Paul; Henricson, Joakim; Nilsson, Gert E.; Anderson, Chris; Sjoberg, Folke

2007-05-01

274

Intraoperative Imaging Guidance for Sentinel Node Biopsy in Melanoma Using a Mobile Gamma Camera  

SciTech Connect

The objective is to evaluate the sensitivity and clinical utility of intraoperative mobile gamma camera (MGC) imaging in sentinel lymph node biopsy (SLNB) in melanoma. The false-negative rate for SLNB for melanoma is approximately 17%, for which failure to identify the sentinel lymph node (SLN) is a major cause. Intraoperative imaging may aid in detection of SLN near the primary site, in ambiguous locations, and after excision of each SLN. The present pilot study reports outcomes with a prototype MGC designed for rapid intraoperative image acquisition. We hypothesized that intraoperative use of the MGC would be feasible and that sensitivity would be at least 90%. From April to September 2008, 20 patients underwent Tc-99m sulfur colloid lymphoscintigraphy, and SLNB was performed with use of a conventional fixed gamma camera (FGC) and gamma probe, followed by intraoperative MGC imaging. Sensitivity was calculated for each detection method. Intraoperative logistical challenges were scored. Cases in which the MGC provided clinical benefit were recorded. Sensitivity for detecting SLN basins was 97% for the FGC and 90% for the MGC. A total of 46 SLN were identified: 32 (70%) were identified as distinct hot spots by preoperative FGC imaging, 31 (67%) by preoperative MGC imaging, and 43 (93%) by MGC imaging pre- or intraoperatively. The gamma probe identified 44 (96%) independent of MGC imaging. The MGC provided defined clinical benefit as an addition to standard practice in 5 (25%) of 20 patients. The mean score for MGC logistic feasibility was 2 on a scale of 1-9 (1 = best). Intraoperative MGC imaging provides additional information when standard techniques fail or are ambiguous. Sensitivity is 90% and can be increased. This pilot study has identified ways to improve the usefulness of an MGC for intraoperative imaging, which holds promise for reducing the false-negative rate of SLNB for melanoma.

Dengel, Lynn T; Judy, Patricia G; Petroni, Gina R; Smolkin, Mark E; Rehm, Patrice K; Majewski, Stan; Williams, Mark B

2011-04-01

275

Application of real-time single camera SLAM technology for image-guided targeting in neurosurgery  

NASA Astrophysics Data System (ADS)

In this paper, we propose an application of augmented reality technology for targeting tumors or anatomical structures inside the skull. The application is a combination of the technologies of MonoSLAM (Single Camera Simultaneous Localization and Mapping) and computer graphics. A stereo vision system is developed to construct geometric data of the human face for registration with CT images. Reliability and accuracy of the application are enhanced by the use of fiduciary markers fixed to the skull. MonoSLAM keeps track of the current location of the camera with respect to an augmented reality (AR) marker using the extended Kalman filter. The fiduciary markers provide a reference when the AR marker is invisible to the camera. The relationship between the markers on the face and the augmented reality marker is obtained through a registration procedure using the stereo vision system and is updated on-line. A commercially available Android-based tablet PC equipped with a 320×240 front-facing camera was used for the implementation. The system is able to provide a live view of the patient overlaid with solid models of tumors or anatomical structures, as well as the part of the tool hidden inside the skull.

Chang, Yau-Zen; Hou, Jung-Fu; Tsao, Yi Hsiang; Lee, Shih-Tseng

2012-10-01

276

Formulation of image quality prediction criteria for the Viking lander camera  

NASA Technical Reports Server (NTRS)

Image quality criteria are defined and mathematically formulated for the prediction computer program to be developed for the Viking lander imaging experiment. The general objective of broad-band (black and white) imagery to resolve small spatial details and slopes is formulated as the detectability of a right-circular cone with the surface properties of the surrounding terrain. The general objective of narrow-band (color and near-infrared) imagery to observe spectral characteristics is formulated as the minimum detectable albedo variation. The general goal of encompassing, but not exceeding, the range of the scene radiance distribution within a single, commandable camera dynamic-range setting is also considered.

Huck, F. O.; Jobson, D. J.; Taylor, E. J.; Wall, S. D.

1973-01-01

277

The core of the nearby S0 galaxy NGC 7457 imaged with the HST planetary camera  

SciTech Connect

A brief analysis is presented of images of the nearby S0 galaxy NGC 7457 obtained with the HST Planetary Camera. While the galaxy remains unresolved with the HST, the images reveal that any core most likely has r(c) less than 0.052 arcsec. The light distribution is consistent with a gamma = -1.0 power law inward to the resolution limit, with a possible stellar nucleus with a luminosity of 10 million solar luminosities. This result represents the first observation outside the Local Group of a galaxy nucleus at this spatial resolution, and it suggests that such small, high surface brightness cores may be common. 20 refs.

Lauer, T.R.; Faber, S.M.; Holtzman, J.A.; Baum, W.A.; Currie, D.G.; Ewald, S.P.; Groth, E.J.; Hester, J.J.; Kelsall, T. (Kitt Peak National Observatory, Tucson, AZ (USA) Lick Observatory, Santa Cruz, CA (USA) Lowell Observatory, Flagstaff, AZ (USA) Washington Univ., Seattle (USA) Maryland Univ., College Park (USA) Space Telescope Science Institute, Baltimore, MD (USA) Princeton Univ., NJ (USA) California Institute of Technology, Pasadena (USA) NASA, Goddard Space Flight Center, Greenbelt, MD (USA))

1991-03-01

278

Electron-tracking Compton gamma-ray camera for small animal and phantom imaging  

NASA Astrophysics Data System (ADS)

We have developed an electron-tracking Compton camera (ETCC) for medical use. Our ETCC has a wide energy dynamic range (200-1300 keV) and wide field of view (3 sr), and thus has potential for advanced medical use. To evaluate the ETCC, we imaged the head (brain) and bladder of mice that had been administered with F-18-FDG. We also imaged the head and thyroid gland of mice using double tracers of F-18-FDG and I-131 ions.

Kabuki, Shigeto; Kimura, Hiroyuki; Amano, Hiroo; Nakamoto, Yuji; Kubo, Hidetoshi; Miuchi, Kentaro; Kurosawa, Shunsuke; Takahashi, Michiaki; Kawashima, Hidekazu; Ueda, Masashi; Okada, Tomohisa; Kubo, Atsushi; Kunieda, Etuso; Nakahara, Tadaki; Kohara, Ryota; Miyazaki, Osamu; Nakazawa, Tetsuo; Shirahata, Takashi; Yamamoto, Etsuji; Ogawa, Koichi; Togashi, Kaori; Saji, Hideo; Tanimori, Toru

2010-11-01

279

The facsimile camera - Its potential as a planetary lander imaging system  

NASA Technical Reports Server (NTRS)

The facsimile camera is an optical-mechanical scanning device which is an attractive candidate for planetary lander imaging systems and has been selected for the Viking/Mars mission because of its light weight, small size, and low power requirement. Other advantages are that it can provide good radiometric and photogrammetric accuracy because the complete field of view is scanned with a single photodetector located on or near the optical axis of the objective lens. In addition, this device has the potential capability of multispectral imaging and spectrometric measurements.

Huck, F. O.; Katzberg, S. J.; Kelly, W. L.

1973-01-01

280

A gamma camera count rate saturation correction method for whole-body planar imaging.  

PubMed

Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are high enough to saturate the camera. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed that takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h.
Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that accounts for the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task. PMID:20071766
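The time-dependent algorithm itself is not given in the abstract, but its static building block, inverting a paralyzable dead-time model by Newton's method, can be sketched as follows. The dead-time and count-rate values are illustrative assumptions, not parameters from the study:

```python
import math

def true_rate(observed, tau, tol=1e-9, max_iter=100):
    """Recover the true count rate n from an observed rate m under a
    paralyzable dead-time model, m = n * exp(-n * tau), using Newton's
    method.  This is the static building block only; the paper's
    algorithm applies such a correction with a time-varying rate."""
    n = observed                                    # initial guess: no saturation
    for _ in range(max_iter):
        f = n * math.exp(-n * tau) - observed       # model residual
        fp = math.exp(-n * tau) * (1.0 - n * tau)   # derivative of the model
        step = f / fp
        n -= step
        if abs(step) < tol:
            break
    return n

# illustrative numbers: 1 microsecond dead time, 150 kcps observed
n_true = true_rate(150e3, 1e-6)   # corrected rate exceeds the observed rate
```

Feeding the corrected rate back through the model should reproduce the observed rate, which is a convenient self-check when fitting the saturation curve.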

Hobbs, Robert F; Baechler, Sébastien; Senthamizhchelvan, Srinivasan; Prideaux, Andrew R; Esaias, Caroline E; Reinhardt, Melvin; Frey, Eric C; Loeb, David M; Sgouros, George

2010-02-01

281

Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples.  

PubMed

Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

2014-01-01

282

Microfluidic oxygen imaging using integrated optical sensor layers and a color camera.  

PubMed

In this work we present a high-resolution oxygen imaging approach which can be used to study the 2D oxygen distribution inside microfluidic environments. The presented setup comprises a fabrication process for microfluidic chips with integrated luminescent sensing films, combined with referenced oxygen imaging using a color CCD camera. Enhancement of the sensor performance was achieved by applying the principle of light harvesting. This principle enabled ratiometric imaging employing the red and the green channels of a color CCD camera. The oxygen-sensitive emission of platinum(II)-5,10,15,20-tetrakis(2,3,4,5,6-pentafluorophenyl)porphyrin (PtTFPP) was detected by the red channel, while the emission of a reference dye was detected by the green channel. This measurement setup allowed for accurate real-time 2D oxygen imaging with superior quality compared to intensity imaging. The sensor films were subsequently used to measure the respiratory activity of human cell cultures (HeLa carcinoma cells and normal human dermal fibroblasts) in a microfluidic system. The sensor setup is well suited for different applications, from spatially and temporally resolving the oxygen concentration inside microfluidic channels to the parallelization of oxygen measurements, and paves the way to novel cell-based assays, e.g. in tissue engineering, tumor biology and hypoxia-reperfusion phenomena. PMID:23443957
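A minimal sketch of the ratiometric read-out described above: the red channel carries the oxygen-quenched PtTFPP emission, the green channel the reference emission, and a Stern-Volmer inversion maps the ratio to an oxygen level. The calibration constants `R0` and `Ksv` are hypothetical placeholders that would come from a calibration, not values from the paper:

```python
import numpy as np

def oxygen_map(rgb, R0, Ksv):
    """Ratiometric oxygen imaging sketch.  The red channel holds the
    O2-quenched indicator emission, the green channel the reference dye.
    Stern-Volmer inversion: R0 / R = 1 + Ksv * pO2, solved for pO2.
    R0 (anoxic ratio) and Ksv are assumed calibration constants."""
    red = rgb[..., 0].astype(float)
    green = rgb[..., 1].astype(float) + 1e-9   # guard against division by zero
    R = red / green                            # referenced intensity ratio
    return (R0 / R - 1.0) / Ksv                # pixel-wise oxygen estimate

# synthetic 2x2 frame with a uniform red/green ratio of 0.5
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[..., 0] = 100    # red channel
frame[..., 1] = 200    # green channel
po2 = oxygen_map(frame, R0=1.0, Ksv=0.05)
```

Because the reference dye shares the illumination and optical path, the ratio is insensitive to lamp drift and film-thickness variations, which is what makes the two-channel read-out more robust than plain intensity imaging.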

Ungerböck, Birgit; Charwat, Verena; Ertl, Peter; Mayr, Torsten

2013-04-21

283

Determining 3D Flow Fields via Multi-camera Light Field Imaging  

PubMed Central

In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture [1]. Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3DPIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet. PMID:23486112
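The synthetic aperture refocusing step can be sketched as a shift-and-average over the camera array. This integer-pixel version is a deliberate simplification of the full reparameterization (real implementations use calibrated homographies rather than pure translations), and the camera offsets below are invented for illustration:

```python
import numpy as np

def sa_refocus(images, offsets, alpha):
    """Synthetic-aperture refocusing sketch: shift each camera's image by
    its baseline offset scaled by alpha (alpha selects the focal plane),
    then average the stack.  Features on the chosen plane align and
    reinforce; off-plane features smear out.  Integer-pixel shifts only."""
    stack = []
    for img, (dx, dy) in zip(images, offsets):
        sx, sy = int(round(alpha * dx)), int(round(alpha * dy))
        stack.append(np.roll(np.roll(img, sy, axis=0), sx, axis=1))
    return np.mean(stack, axis=0)

# three cameras on a horizontal baseline see a dot displaced by parallax
images, offsets = [], [(0, 0), (1, 0), (2, 0)]
for dx, dy in offsets:
    img = np.zeros((10, 10))
    img[5, 5 + dx] = 1.0
    images.append(img)
refocused = sa_refocus(images, offsets, alpha=-1.0)   # dot aligns at (5, 5)
```

Sweeping `alpha` produces the 3D focal stack mentioned in the abstract; occluded cameras simply contribute background at that voxel, which is why the average still reveals partially occluded features.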

Truscott, Tadd T.; Belden, Jesse; Nielson, Joseph R.; Daily, David J.; Thomson, Scott L.

2013-01-01

284

Application of adaptive optics in retinal imaging: a quantitative and clinical comparison with standard cameras  

NASA Astrophysics Data System (ADS)

Aim: The objective of this project was to evaluate high-resolution images from an adaptive optics retinal imager through comparisons with standard film-based and standard digital fundus imagers. Methods: A clinical prototype adaptive optics fundus imager (AOFI) was used to collect retinal images from subjects with various forms of retinopathy to determine whether improved visibility into the disease could be provided to the clinician. The AOFI achieves low-order correction of aberrations through a closed-loop wavefront sensor and an adaptive optics system. The remaining high-order aberrations are removed by direct deconvolution using the point spread function (PSF), or by blind deconvolution when the PSF is not available. An ophthalmologist compared the AOFI images with standard fundus images and provided a clinical evaluation of all the modalities and processing techniques. All images were also analyzed using a quantitative image quality index. Results: The system has been tested on three human subjects (one normal and two with retinopathy). In the diabetic patient, vascular abnormalities were detected with the AOFI that could not be resolved with the standard fundus camera. Very small features, such as the fine vascular structures on the optic disc and the individual nerve fiber bundles, are easily resolved by the AOFI. Conclusion: This project demonstrated that adaptive optics images have great potential in providing clinically significant detail of anatomical and pathological structures to the ophthalmologist.

Barriga, E. S.; Erry, G.; Yang, S.; Russell, S.; Raman, B.; Soliz, P.

2005-04-01

285

Characterization of digital cameras for reflected ultraviolet photography; implications for qualitative and quantitative image analysis during forensic examination.  

PubMed

Reflected ultraviolet imaging techniques allow for the visualization of evidence normally outside the human visible spectrum. Specialized digital cameras possessing extended sensitivity can be used for recording reflected ultraviolet radiation. Currently, there is a lack of standardized methods for ultraviolet image recording and processing using digital cameras, potentially limiting their implementation and interpretation. A methodology is presented for processing ultraviolet images based on linear responses and the sensitivity of the respective color channels. The methodology is applied to a Fuji S3 UVIR camera and a modified Nikon D70s camera to reconstruct their respective spectral sensitivity curves between 320 and 400 nm. This method results in images with low noise and high contrast, suitable for qualitative and/or quantitative analysis. The application of this methodology is demonstrated in the recording of latent fingerprints. PMID:24117678

Garcia, Jair E; Wilksch, Philip A; Spring, Gale; Philp, Peta; Dyer, Adrian

2014-01-01

286

Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics  

NASA Astrophysics Data System (ADS)

A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed and used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, free of structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in a gradient from 0 N to 500 N, simulating the double-feet standing stance. Anterior and lateral images of the specimen were obtained through two CCD cameras. The digital 8-bit images were processed with Image J, digital image processing software freely available from the National Institutes of Health. The procedure includes recognition of the digital markers, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The digital image measurements showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in the experiment was about 0.018 pixels, and the relative error was about 1.11‰. The average vertical displacement of S1 was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°.
The load-displacement curves obtained from the optical measurement system matched clinical results. Digital image measurement of specimen deformation based on CCD cameras and Image J software has good prospects for application in biomechanical research, with the advantages of a simple optical setup, non-contact measurement, high precision, and no special requirements on the test environment.
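The center-of-mass step of the marker-tracking procedure can be sketched as an intensity-weighted centroid. This assumes the marker image has already been inverted and segmented as described, so the marker is bright on a dark background:

```python
import numpy as np

def marker_centroid(gray):
    """Sub-pixel marker localization by intensity-weighted center of mass:
    each pixel's gray value weights its x and y coordinates.  Assumes the
    marker is bright on a dark background (invert first for a black dot
    on white, as in the pelvis study)."""
    gray = gray.astype(float)
    total = gray.sum()
    ys, xs = np.indices(gray.shape)         # per-pixel row/column coordinates
    return (xs * gray).sum() / total, (ys * gray).sum() / total

# a marker straddling two columns yields a centroid between them
img = np.zeros((5, 5))
img[2, 1] = 100.0
img[2, 2] = 100.0
cx, cy = marker_centroid(img)               # cx = 1.5, cy = 2.0
```

Tracking these centroids between load steps gives displacements well below one pixel, which is how the study reaches its quoted 0.018-pixel precision.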

Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

2008-03-01

287

A Gaseous Compton Camera using a 2D-sensitive gaseous photomultiplier for Nuclear Medical Imaging  

NASA Astrophysics Data System (ADS)

A new Compton Camera (CC) concept based on a High Pressure Scintillation Chamber coupled to a position-sensitive Gaseous PhotoMultiplier for Nuclear Medical Imaging applications is proposed. The main goal of this work is to describe the development of a ⌀25 × 12 cm³ cylindrical prototype, which will be suitable for scintimammography and for small-animal imaging applications. The possibility of scaling it to a useful human-size device is also under study. The idea is to develop a device capable of competing with the standard Anger Camera. Despite its large success, the Anger Camera still presents some limitations, such as low position resolution and only fair energy resolution at 140 keV. The CC offers a different solution, as it provides information about the incoming photon direction, avoiding the use of a collimator, which is responsible for a huge (10⁻⁴) reduction in sensitivity. The main problem of CCs is the Doppler broadening, which is responsible for the loss of angular resolution. In this work, calculations of the Doppler broadening in Xe, Ar, Ne and their mixtures are presented. Simulations of the detector performance, together with a discussion of the gas choice, are also included.

Azevedo, C. D. R.; Pereira, F. A.; Lopes, T.; Correia, P. M. M.; Silva, A. L. M.; Carramate, L. F. N. D.; Covita, D. S.; Veloso, J. F. C. A.

2013-12-01

288

High performance gel imaging with a commercial single lens reflex camera  

NASA Astrophysics Data System (ADS)

A high performance gel imaging system was constructed using a digital single lens reflex camera with epi-illumination to image 19 × 23 cm agarose gels with up to 10,000 DNA bands each. It was found to give equivalent performance to a laser scanner in this high throughput DNA fingerprinting application using the fluorophore SYBR Green®. The specificity and sensitivity of the imager and scanner were within 1% using the same band identification software. Low and high cost color filters were also compared and it was found that with care, good results could be obtained with inexpensive dyed acrylic filters in combination with more costly dielectric interference filters, but that very poor combinations were also possible. Methods for determining resolution, dynamic range, and optical efficiency for imagers are also proposed to facilitate comparison between systems.

Slobodan, J.; Corbett, R.; Wye, N.; Schein, J. E.; Marra, M. A.; Coope, R. J. N.

2011-03-01

289

Optimal camera placement for cardiac imaging using rotating multi-segment slant-hole single photon emission computed tomography  

Microsoft Academic Search

Rotating multi-segment slant-hole (RMSSH) collimators provide much higher detection efficiency and allow fully 3D cardiac SPECT using a much lower number of camera positions as compared to conventional low-energy high-resolution parallel-hole collimators. In this study, the effects of number and placement of camera positions in RMSSH SPECT for cardiac imaging were studied, to achieve the best possible reconstruction image quality

Yuchuan Wang; Benjamin M. W. Tsui; William H. Baird

2002-01-01

290

Performance of a slow-scan CCD camera for macromolecular imaging in a 400 kV electron cryomicroscope  

Microsoft Academic Search

The feasibility and limitations of a 1024 × 1024 slow-scan charge-coupled device (CCD) camera were evaluated for imaging in a 400 kV electron cryomicroscope. Catalase crystals and amorphous carbon film were used as test specimens. Using catalase crystals, it was found that the finite (24 μm) pixel size of the slow-scan CCD camera governs the ultimate resolution in the acquired images.

Michael B. Sherman; Jacob Brink; Wah Chiu

1996-01-01

291

Megapixel imaging camera for expanded H⁻ beam measurements  

SciTech Connect

A charge coupled device (CCD) imaging camera system has been developed as part of the Ground Test Accelerator project at the Los Alamos National Laboratory to measure the properties of a large diameter, neutral particle beam. The camera is designed to operate in the accelerator vacuum system for extended periods of time. It would normally be cooled to reduce dark current. The CCD contains 1024 × 1024 pixels with a pixel size of 19 × 19 μm² and with four-phase parallel clocking and two-phase serial clocking. The serial clock rate is 2.5×10⁵ pixels per second. Clock sequence and timing are controlled by an external logic-word generator. The DC bias voltages are likewise located externally. The camera contains circuitry to generate the analog clocks for the CCD and also contains the output video signal amplifier. Reset switching noise is removed by an external signal processor that employs delay elements to provide noise suppression by the method of double-correlated sampling. The video signal is digitized to 12 bits in an analog to digital converter (ADC) module controlled by a central processor module. Both modules are located in a VME-type computer crate that communicates via ethernet with a separate workstation where overall control is exercised and image processing occurs. Under cooled conditions the camera shows good linearity with a dynamic range of 2000 and with dark noise fluctuations of about ±1/2 ADC count. Full well capacity is about 5×10⁵ electron charges.
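The double-correlated sampling mentioned above can be illustrated numerically: subtracting each pixel's reset-level sample from its signal-level sample cancels the reset switching noise common to both. The pixel values below are synthetic, chosen only to show the cancellation:

```python
import numpy as np

def correlated_double_sample(reset_samples, signal_samples):
    """Double-correlated sampling sketch: subtract each pixel's reset-level
    sample from its signal-level sample, cancelling the reset switching
    (kTC) noise and slow offsets that are common to both samples."""
    return np.asarray(signal_samples, dtype=float) - np.asarray(reset_samples, dtype=float)

# per-pixel reset noise appears in both samples, so it cancels exactly
rng = np.random.default_rng(0)
offset = rng.normal(0.0, 5.0, size=4)                 # random reset levels
reset = 100.0 + offset
signal = 100.0 + offset + np.array([10.0, 20.0, 30.0, 40.0])
video = correlated_double_sample(reset, signal)       # recovers [10, 20, 30, 40]
```

In the hardware described, the same subtraction is done in analog with delay elements, before the 12-bit digitization.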

Simmons, J.E.; Lillberg, J.W.; McKee, R.J.; Slice, R.W.; Torrez, J.H. [Los Alamos National Lab., NM (United States); McCurnin, T.W.; Sanchez, P.G. [EG and G Energy Measurements, Inc., Los Alamos, NM (United States). Los Alamos Operations

1994-02-01

292

Optimization of camera exposure durations for multi-exposure speckle imaging of the microcirculation  

PubMed Central

Improved Laser Speckle Contrast Imaging (LSCI) blood flow analyses that incorporate inverse models of the underlying laser-tissue interaction have been used to develop more quantitative implementations of speckle flowmetry such as Multi-Exposure Speckle Imaging (MESI). In this paper, we determine the optimal camera exposure durations required for obtaining flow information with accuracy comparable to that of the prevailing MESI implementation utilized in recent in vivo rodent studies. A looping leave-one-out (LOO) algorithm was used to identify exposure subsets, which were analyzed for accuracy against flows obtained from analysis with the original full exposure set over 9 animals comprising n = 314 regional flow measurements. From the 15 original exposures, the LOO process found that 6 exposures provide comparable accuracy, defined as deviating by no more than 10% from the original flow measurements. The optimal subset of exposures provides a basis set of camera durations for speckle flowmetry studies of the microcirculation and confers a two-fold faster acquisition rate and a 28% reduction in processing time without sacrificing accuracy. Additionally, the optimization process can be used to identify further reductions in the exposure subsets for tailoring imaging over less expansive flow distributions to enable even faster imaging. PMID:25071956
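A hedged sketch of a looping leave-one-out pruning of the exposure set: repeatedly drop any exposure whose removal keeps flow estimates within 10% of the full-set reference. The `fit_flow` callback stands in for the MESI inverse-model fit and is a hypothetical placeholder, as are the exposure values:

```python
import numpy as np

def loo_prune(exposures, fit_flow, reference, max_dev=0.10):
    """Greedy leave-one-out pruning sketch: drop an exposure whenever the
    flows fitted from the remaining subset stay within `max_dev` (10%) of
    the full-set reference; stop when no exposure can be removed.
    `fit_flow(subset)` is a hypothetical stand-in for the MESI fit."""
    subset = list(exposures)
    improved = True
    while improved and len(subset) > 1:
        improved = False
        for exp in list(subset):
            trial = [e for e in subset if e != exp]
            flows = fit_flow(trial)
            if np.max(np.abs(flows - reference) / reference) <= max_dev:
                subset = trial          # this exposure is expendable
                improved = True
                break
    return subset

# toy model: the fit stays accurate as long as at least 2 exposures remain
reference = np.array([1.0])
toy_fit = lambda subset: reference if len(subset) >= 2 else reference * 2
kept = loo_prune([5, 10, 20, 40], toy_fit, reference)   # keeps [20, 40]
```

The real study loops this evaluation over all animals and regions, so the retained subset is the one that is accurate for every measurement, not just one toy curve.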

Kazmi, S. M. Shams; Balial, Satyajit; Dunn, Andrew K.

2014-01-01

293

Orienting the camera and firing lasers to enhance large scale particle image velocimetry for streamflow monitoring  

NASA Astrophysics Data System (ADS)

Large scale particle image velocimetry (LSPIV) is a nonintrusive methodology for continuous surface flow monitoring in natural environments. Recent experimental studies demonstrate that LSPIV is a promising technique to estimate flow discharge in riverine systems. Traditionally, LSPIV implementations are based on the use of angled cameras to capture extended fields of view; images are then orthorectified and calibrated through the acquisition of ground reference points. As widely documented in the literature, the identification of ground reference points and image orthorectification are major hurdles in LSPIV. Here we develop an experimental apparatus to address both of these issues. The proposed platform includes a laser system for remote frame calibration and a low-cost camera that is maintained orthogonal with respect to the water surface to minimize image distortions. We study the feasibility of the apparatus on two complex natural riverine environments where the acquisition of ground reference points is prevented and illumination and seeding density conditions are challenging. While our results confirm that velocity estimations can be severely affected by inhomogeneously seeded surface tracers and adverse illumination settings, they demonstrate that LSPIV implementations can benefit from the proposed apparatus. Specifically, the presented system opens novel avenues in the development of stand-alone platforms for remote surface flow monitoring.
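The core PIV step, estimating tracer displacement between consecutive frames, can be sketched with a minimal FFT cross-correlation; the frames here are synthetic, and a real LSPIV pipeline adds interrogation windows, sub-pixel peak fitting, and metric calibration:

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Integer-pixel tracer displacement from the peak of the FFT-based
    cross-correlation of two consecutive frames."""
    A = np.fft.fft2(frame_a - frame_a.mean())
    B = np.fft.fft2(frame_b - frame_b.mean())
    corr = np.fft.ifft2(np.conj(A) * B).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrapped indices back to signed shifts
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(2)
a = rng.random((64, 64))                # textured water-surface patch
b = np.roll(a, (3, 5), axis=(0, 1))     # tracers advected 3 px down, 5 px right
print(displacement(a, b))               # → (3, 5)
```

Keeping the camera orthogonal to the surface, as the proposed apparatus does, means a pixel shift like this maps to a physical displacement by a single scale factor, with no orthorectification step.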

Tauro, Flavia; Porfiri, Maurizio; Grimaldi, Salvatore

2014-09-01

294

Real-Time On-Board Processing Validation of MSPI Ground Camera Images  

NASA Technical Reports Server (NTRS)

The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA including PowerPC440 processors we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real-time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
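The on-board least-squares reduction can be illustrated with a generic modulated-signal model; the phase schedule, sample count, and three-parameter model here are assumptions for the sketch, not the actual MSPI signal chain:

```python
import numpy as np

# Hypothetical modulation phases for the raw samples of one pixel/channel
n_samples = 64
phases = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)

# Model each raw sample as d_k = I + q*cos(phase_k) + u*sin(phase_k)
A = np.column_stack([np.ones(n_samples), np.cos(phases), np.sin(phases)])

rng = np.random.default_rng(3)
true_params = np.array([1000.0, 40.0, -25.0])        # intensity + 2 pol. terms
data = A @ true_params + rng.normal(0, 1.0, n_samples)

# On-board reduction: one linear least-squares solve per pixel collapses
# 64 raw samples to 3 science parameters, a ~20x data-volume cut, which is
# the kind of reduction that takes 95 Mbytes/sec down toward 0.45 Mbytes/sec.
params, *_ = np.linalg.lstsq(A, data, rcond=None)
print(params)
```

Because the design matrix is fixed, the solve reduces to a few multiply-accumulates per sample, which is why it maps well onto FPGA fabric with an embedded PowerPC for sequencing.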

Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

2010-01-01

295

Airborne imaging for heritage documentation using the Fotokite tethered flying camera  

NASA Astrophysics Data System (ADS)

Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have spread their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft.
As such, the safe operation of these devices is still an issue, certainly when flying over locations that can be crowded (such as students on excavations or tourists walking around historic places). As the future of UAS regulation remains unclear, this talk presents an alternative approach to aerial imaging: the Fotokite. Developed at the ETH Zürich, the Fotokite is a tethered flying camera that is essentially a multi-copter connected to the ground with a taut tether to achieve controlled flight. Crucially, it relies solely on onboard IMU (Inertial Measurement Unit) measurements to fly, launches in seconds, and is not classified as a UAS (Unmanned Aerial System), e.g. in the latest FAA (Federal Aviation Administration) UAS proposal. As a result it may be used for imaging cultural heritage in a variety of environments and settings with minimal training by non-experienced pilots. Furthermore, it is subject to less extensive certification, regulation and import/export restrictions, making it a viable solution for use at a greater range of sites than traditional methods. Unlike a balloon or a kite it is not subject to particular weather conditions and, thanks to active stabilization, is capable of a variety of intelligent flight modes. Finally, it is compact and lightweight, making it easy to transport and deploy, and its lack of reliance on GNSS (Global Navigation Satellite System) makes it possible to use in urban, overbuilt areas. After outlining its operating principles, the talk will present some archaeological case studies in which the Fotokite was used, thereby assessing its capabilities compared to the conventional UASs on the market.

Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

2014-05-01

296

High-Contrast Exoplanet Imaging with CLIO2, the Magellan Adaptive Optics Infrared Camera  

NASA Astrophysics Data System (ADS)

MagAO is the adaptive-secondary AO system on the 6.5-m Magellan Clay telescope. With a high actuator density and a sensitive pyramid WFS, MagAO achieves down to ~130 nm rms WFE on bright guide stars in median seeing conditions (0.7'' V band) at Las Campanas Observatory in Chile. MagAO's infrared camera, Clio2, has a comprehensive suite of narrow and broad band filters that allow direct imaging of faint companions from 1-5 um. We present first-light results from Clio2, including images of the exoplanetary system Beta Pictoris. High-contrast imaging is an important goal of AO for ELTs, and results from MagAO/Clio2 are the next step along that path; this holds particularly true for the GMT, which is located very close to the Magellan site.

Morzinski, Katie; Close, Laird; Males, Jared; Hinz, Philip; Puglisi, Alfio; Esposito, Simone; Riccardi, Armando; Pinna, Enrico; Xompero, Marco; Briguglio, Runa; Follette, Kate; Kopon, Derek; Skemer, Andy; Gasho, Victor; Uomoto, Alan; Hare, Tyson; Arcidiacono, Carmelo; Quiros-Pacheco, Fernando; Argomedo, Javier; Busoni, Lorenzo; Rodigas, Timothy; Wu, Ya-Lin

2013-12-01

297

First experience DaTSCAN imaging using cadmium-zinc-telluride gamma camera SPECT.  

PubMed

We report our first experience of brain DaTSCAN SPECT imaging using a cadmium-zinc-telluride gamma camera (CZT-GC) in 2 cases: a 64-year-old patient suffering from essential tremor and a 73-year-old patient presenting with atypical bilateral extrapyramidal syndrome. In both cases, 2 different acquisitions were performed and compared: one using a double-head Anger-GC, followed immediately by a second acquisition on the CZT-GC. There were no significant visual differences between the images generated by the two GCs. Our first results suggest that DaTSCAN SPECT is feasible on a CZT-GC, allowing both injected-dose and acquisition-time reductions without compromising image quality. This experience needs to be evaluated in larger series. PMID:22785531

Farid, Karim; Queneau, Mathieu; Guernou, Mohamed; Lussato, David; Poullias, Xavier; Petras, Slavomir; Caillat-Vigneron, Nadine; Songy, Bernard

2012-08-01

298

Toward Real-time quantum imaging with a single pixel camera  

SciTech Connect

We present a workbench for the study of real-time quantum imaging by measuring the frame-by-frame quantum noise reduction of multi-spatial-mode twin beams generated by four wave mixing in Rb vapor. Exploiting the multiple spatial modes of this squeezed light source, we utilize spatial light modulators to selectively transmit macropixels of quantum correlated modes from each of the twin beams to a high quantum efficiency balanced detector. In low-light-level imaging applications, the ability to measure the quantum correlations between individual spatial modes and macropixels of spatial modes with a single pixel camera will facilitate compressive quantum imaging with sensitivity below the photon shot noise limit.

Lawrie, Benjamin J [ORNL; Pooser, Raphael C [ORNL

2013-01-01

299

A new tubeless nanosecond streak camera based on optical deflection and direct CCD imaging  

SciTech Connect

A new optically deflected streaking camera with nanosecond-range resolution, superior imaging quality, high signal detectability, and large format recording has been conceived and developed. Its construction is composed of an optomechanical deflector that deflects the line-shape image of spatially distributed, time-varying signals across the sensing surface of a cooled scientific two-dimensional CCD array with slow-readout driving electronics, a lens assembly, and a desk-top computer for prompt digital data acquisition and processing. Its development utilizes the synergism of modern technologies in sensors, optical deflectors, optics, and microcomputers. With laser light as the signal carrier, the deflecting optics produces near diffraction-limited streak images resolving to a single pixel size of 25 μm. A 1k × 1k-pixel array can thus provide a vast record of 1,000 digital data points along each spatial or temporal axis. Since only one photon-to-electron conversion exists in the entire signal recording path, the camera responds linearly to the incident light over a wide dynamic range in excess of 10⁴:1. Various image deflection techniques are assessed for imaging fidelity, deflection speed, and capacity for external triggering. Innovative multiple-pass deflection methods for utilizing the optomechanical deflector have been conceived and developed to attain multi-fold amplification of the optical scanning speed across the CCD surface at a given angular deflector speed. Without significantly compromising imaging quality or flux throughput efficiency, these optical methods enable a sub-10 ns/pixel streak speed with the deflector moving benignly at 500 radians/second, or equivalently 80 revolutions/second. Test results of the prototype performance are summarized, including a spatial resolution of 10 lp/mm at 65% CTF and a temporal resolution of 11.4 ns at 3.8 ns/pixel.

Lai, C.C.

1992-12-01


301

Synchroscan streak camera imaging at a 15-MeV photoinjector with emittance exchange  

NASA Astrophysics Data System (ADS)

At the Fermilab A0 photoinjector facility, bunch-length measurements of the laser micropulse and the e-beam micropulse have been done in the past with a fast single-sweep module of the Hamamatsu C5680 streak camera with an intrinsic shot-to-shot trigger jitter of 10-20 ps. We have upgraded the camera system with the synchroscan module tuned to 81.25 MHz to provide synchronous summing capability with less than 1.5 ps FWHM trigger jitter and a phase-locked delay box to provide phase stability of ~1 ps over tens of minutes. These steps allowed us to measure both the UV laser pulse train at 263 nm and the e-beam via optical transition radiation (OTR). Due to the low electron beam energies and OTR signals, we typically summed over 50 micropulses with 0.25-1 nC per micropulse. The phase-locked delay box allowed us to assess chromatic temporal effects and instigated another upgrade to an all-mirror input optics barrel. In addition, we added a slow-sweep horizontal deflection plug-in unit to provide dual-sweep capability for the streak camera. We report on a series of measurements made during the commissioning of these upgrades, including bunch-length and phase effects using the emittance exchange beamline and simultaneous imaging of a UV drive laser component, OTR, and the 800 nm diagnostics laser.

Lumpkin, A. H.; Ruan, J.; Thurman-Keup, R.

2012-09-01

302

A 3D HIDAC-PET camera with sub-millimetre resolution for imaging small animals  

Microsoft Academic Search

A HIDAC-PET camera consisting essentially of 5 million 0.5 mm gas avalanching detectors has been constructed for small-animal imaging. The particular HIDAC advantage, a high 3D spatial resolution, has been improved to 0.95 mm fwhm, and to 0.7 mm fwhm when reconstructing with 3D-OSEM methods incorporating resolution recovery. A depth-of-interaction resolution of 2.5 mm is implicit, due to the laminar construction. Scatter-corrected

A. P. Jeavons; R. A. Chandler; C. A. R. Dettmar

1999-01-01

303

Distinguishing between Camera and Scanned Images by Means of Frequency Analysis  

NASA Astrophysics Data System (ADS)

Distinguishing the kind of sensor which has acquired a digital image could be crucial in many scenarios where digital forensic techniques are called to give answers. In this paper a new methodology is proposed that makes it possible to determine whether a digital photo has been taken by a camera or has been scanned by a scanner. The technique exploits the specific geometrical features of the sensor pattern noise introduced by the sensor in both cases and, by resorting to a frequency analysis, can infer whether a periodicity is present and consequently what the origin of the digital content is. Experimental results are presented to support the theoretical framework.
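The geometric intuition behind the method can be sketched with a simplified, correlation-based variant (the paper itself works in the frequency domain) using synthetic noise residuals: a scanner imposes the same 1D sensor-noise pattern on every row, while a camera's pattern noise is a 2D array with no such periodicity.

```python
import numpy as np

def row_pattern_strength(residual):
    """Mean correlation of each row of the noise residual with the average
    row: near 0.7 for the synthetic scanned image below (one 1D sensor
    pattern repeated every row), near 0 for the camera image."""
    mean_row = residual.mean(axis=0)
    mean_row = mean_row - mean_row.mean()
    rows = residual - residual.mean(axis=1, keepdims=True)
    num = (rows * mean_row).sum(axis=1)
    den = np.linalg.norm(rows, axis=1) * np.linalg.norm(mean_row)
    return float(np.mean(num / den))

rng = np.random.default_rng(4)
pattern = rng.normal(0, 1, 256)                           # 1D linear-sensor noise
scanned = np.tile(pattern, (256, 1)) + rng.normal(0, 1, (256, 256))
camera = rng.normal(0, 1, (256, 256))
print(row_pattern_strength(scanned), row_pattern_strength(camera))
```

In the frequency-domain formulation of the paper, the same repetition shows up as a strong peak along the scan direction of the residual's spectrum, which is what the classifier thresholds on.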

Caldelli, Roberto; Amerini, Irene; Picchioni, Francesco

304

Camera Image Transformation and Registration for Safe Spacecraft Landing and Hazard Avoidance  

NASA Technical Reports Server (NTRS)

Inherent geographical hazards of Martian terrain may impede a safe landing for science exploration spacecraft. Surface visualization software for hazard detection and avoidance may accordingly be applied in vehicles such as the Mars Exploration Rover (MER) to induce an autonomous and intelligent descent upon entering the planetary atmosphere. The focus of this project is to develop an image transformation algorithm for coordinate system matching between consecutive frames of terrain imagery taken throughout descent. The methodology involves integrating computer vision and graphics techniques, including affine transformation and projective geometry of an object, with the intrinsic parameters governing spacecraft dynamic motion and camera calibration.
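The affine piece of the coordinate-matching step can be sketched as a least-squares fit over matched landmarks between two frames; the point coordinates and the transform below are hypothetical, and a full descent pipeline would add projective terms and the camera calibration model.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares estimate of the 2x3 affine map taking src points to
    dst points (landmark registration between consecutive frames)."""
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2], A[0::2, 2] = src, 1.0   # rows for x' = a*x + b*y + c
    A[1::2, 3:5], A[1::2, 5] = src, 1.0   # rows for y' = d*x + e*y + f
    b = dst.reshape(-1)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p.reshape(2, 3)

# Hypothetical landmarks matched between two descent frames
src = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
M_true = np.array([[0.9, -0.1, 12.0],     # mild rotation/scale plus a shift
                   [0.1,  0.9, -7.0]])
dst = src @ M_true[:, :2].T + M_true[:, 2]

print(np.allclose(fit_affine(src, dst), M_true))   # → True
```

With more than three correspondences the system is overdetermined, so the least-squares solve also averages down landmark-detection noise.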

Jones, Brandon M.

2005-01-01

305

New Mars Camera's First Image of Mars from Mapping Orbit (Full Frame)  

NASA Technical Reports Server (NTRS)

The high resolution camera on NASA's Mars Reconnaissance Orbiter captured its first image of Mars in the mapping orbit, demonstrating the full resolution capability, on Sept. 29, 2006. The High Resolution Imaging Science Experiment (HiRISE) acquired this first image at 8:16 AM (Pacific Time). With the spacecraft at an altitude of 280 kilometers (174 miles), the image scale is 25 centimeters per pixel (10 inches per pixel). If a person were located on this part of Mars, he or she would just barely be visible in this image.

The image covers a small portion of the floor of Ius Chasma, one branch of the giant Valles Marineris system of canyons. The image illustrates a variety of processes that have shaped the Martian surface. There are bedrock exposures of layered materials, which could be sedimentary rocks deposited in water or from the air. Some of the bedrock has been faulted and folded, perhaps the result of large-scale forces in the crust or from a giant landslide. The image resolves rocks as small as 90 centimeters (3 feet) in diameter. It includes many dunes or ridges of windblown sand.

This image (TRA_000823_1720) was taken by the High Resolution Imaging Science Experiment camera onboard the Mars Reconnaissance Orbiter spacecraft on Sept. 29, 2006. Shown here is the full image, centered at minus 7.8 degrees latitude, 279.5 degrees east longitude. The image is oriented such that north is to the top. The range to the target site was 297 kilometers (185.6 miles). At this distance the image scale is 25 centimeters (10 inches) per pixel (with one-by-one binning) so objects about 75 centimeters (30 inches) across are resolved. The image was taken at a local Mars time of 3:30 PM and the scene is illuminated from the west with a solar incidence angle of 59.7 degrees, thus the sun was about 30.3 degrees above the horizon. The season on Mars is northern winter, southern summer.

[Photojournal note: Due to the large sizes of the high-resolution TIFF and JPEG files, some systems may experience extremely slow downlink time while viewing or downloading these images; some systems may be incapable of handling the download entirely.]

NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The HiRISE camera was built by Ball Aerospace & Technologies Corporation, Boulder, Colo., and is operated by the University of Arizona, Tucson.

2006-01-01

306

Minimal camera networks for 3D image based modeling of cultural heritage objects.  

PubMed

3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue "Lamassu". Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883-859 BC). Close-range photogrammetry is used for the 3D modeling task, where a dense ordered imaging network of 45 high resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718

Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

2014-01-01

307

Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects  

PubMed Central

3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue “Lamassu”. Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task, where a dense ordered imaging network of 45 high resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718

Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

2014-01-01

308

Efficient Smart CMOS Camera Based on FPGAs Oriented to Embedded Image Processing  

PubMed Central

This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families makes it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal Power PC (PPC) microprocessor. In turn, this contains a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID:22163739

Bravo, Ignacio; Balinas, Javier; Gardel, Alfredo; Lazaro, Jose L.; Espinosa, Felipe; Garcia, Jorge

2011-01-01

309

Resolving depth-measurement ambiguity with commercially available range imaging cameras  

NASA Astrophysics Data System (ADS)

Time-of-flight range imaging is typically performed with the amplitude modulated continuous wave method. This involves illuminating a scene with amplitude modulated light. Reflected light from the scene is received by the sensor with the range to the scene encoded as a phase delay of the modulation envelope. Due to the cyclic nature of phase, an ambiguity in the measured range occurs every half wavelength in distance, thereby limiting the maximum useable range of the camera. This paper proposes a procedure to resolve depth ambiguity using software post processing. First, the range data is processed to segment the scene into separate objects. The average intensity of each object can then be used to determine which pixels are beyond the non-ambiguous range. The results demonstrate that depth ambiguity can be resolved for various scenes using only the available depth and intensity information. This proposed method reduces the sensitivity to objects with very high and very low reflectance, normally a key problem with basic threshold approaches. This approach is very flexible as it can be used with any range imaging camera. Furthermore, capture time is not extended, keeping the artifacts caused by moving objects at a minimum. This makes it suitable for applications such as robot vision where the camera may be moving during captures. The key limitation of the method is its inability to distinguish between two overlapping objects that are separated by a distance of exactly one non-ambiguous range. Overall the reliability of this method is higher than the basic threshold approach, but not as high as the multiple frequency method of resolving ambiguity.
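The phase-to-range relation and a minimal intensity-based disambiguation check can be sketched as follows; the modulation frequency is an assumption, and the per-pixel check stands in for the paper's object-level segmentation and average-intensity test:

```python
import numpy as np

C = 3.0e8                    # speed of light, m/s
F_MOD = 30e6                 # modulation frequency (assumed), Hz
D_AMB = C / (2 * F_MOD)      # non-ambiguous range: 5 m at 30 MHz

def phase_to_range(phase):
    """Range encoded in the modulation-envelope phase delay:
    d = c * phi / (4 * pi * f_mod)."""
    return C * phase / (4 * np.pi * F_MOD)

# A target at 7.5 m wraps past the 5 m ambiguity interval and reads as 2.5 m
true_d = 7.5
phase = (4 * np.pi * F_MOD * true_d / C) % (2 * np.pi)
measured = phase_to_range(phase)

# Returned intensity falls off roughly as 1/d^2, so a return that is far
# dimmer than its apparent range predicts is flagged as wrapped
intensity = 1.0 / true_d ** 2
expected = 1.0 / measured ** 2
corrected = measured + D_AMB if intensity < 0.5 * expected else measured
print(measured, corrected)   # ≈ 2.5 then 7.5
```

Averaging the intensity over a segmented object, as the paper does, is what makes this check robust to pixels of unusually high or low reflectance.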

McClure, Shane H.; Cree, Michael J.; Dorrington, Adrian A.; Payne, Andrew D.

2010-01-01

310

Real time plume and laser spot recognition in IR camera images  

SciTech Connect

It is desirable to automatically guide the laser spot onto the effluent plume for maximum IR DIAL system sensitivity. This requires the use of a 2D focal plane array. The authors have demonstrated that a wavelength-filtered IR camera is capable of 2D imaging of both the plume and the laser spot. In order to identify the centers of the plume and the laser spot, it is first necessary to segment these features from the background. They report a demonstration of real time plume segmentation based on velocity estimation. They also present results of laser spot segmentation using simple thresholding. Finally, they describe current research on both advanced segmentation and recognition algorithms and on reconfigurable real time image processing hardware based on field programmable gate array technology.
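The laser-spot step, simple thresholding followed by locating the center, can be sketched as below; the frame is synthetic, standing in for the wavelength-filtered IR image, and the threshold value is arbitrary:

```python
import numpy as np

def spot_centroid(frame, thresh):
    """Threshold segmentation of the laser spot followed by an
    intensity-weighted centroid of the surviving pixels."""
    ys, xs = np.nonzero(frame > thresh)
    w = frame[ys, xs]
    return (ys * w).sum() / w.sum(), (xs * w).sum() / w.sum()

frame = np.zeros((32, 32))
frame[10:13, 20:23] = [[1, 2, 1],         # synthetic laser spot
                       [2, 8, 2],
                       [1, 2, 1]]
print(spot_centroid(frame, 0.5))          # → (11.0, 21.0)
```

The plume center requires the velocity-based segmentation described above rather than a fixed threshold, since the plume has no stable brightness contrast against the background.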

Moore, K.R.; Caffrey, M.P.; Nemzek, R.J.; Salazar, A.A.; Jeffs, J. [Los Alamos National Lab., NM (United States); Andes, D.K.; Witham, J.C. [Naval Air Weapons Station, China Lake, CA (United States)

1997-08-01

311

The postcollapse core of M15 imaged with the HST planetary camera  

SciTech Connect

It is shown here that, despite the severe spherical aberration present in the HST, the Wide Field/Planetary Camera (WFPC) images still present useful high-resolution information on M15, the classic candidate for a cluster with a collapsed core. The stars in M15 have been resolved down to the main-sequence turnoff and have been subtracted from the images. The remaining faint, unresolved stars form a diffuse background with a surprisingly large core with r(c) = 0.13 pc. The existence of a large core interior to the power-law cusp may imply that M15 has evolved well past maximum core collapse and may rule out the presence of a massive central black hole as well.

Lauer, T.R.; Holtzman, J.A.; Faber, S.M.; Baum, W.A.; Currie, D.G.; Ewald, S.P.; Groth, E.J.; Hester, J.J.; Kelsall, T. (Kitt Peak National Observatory, Tucson, AZ (USA) Lowell Observatory, Flagstaff, AZ (USA) Lick Observatory, Santa Cruz, CA (USA) Washington Univ., Seattle (USA) Maryland Univ., College Park (USA) Space Telescope Science Institute, Baltimore, MD (USA) Princeton Univ., NJ (USA) California Institute of Technology, Pasadena (USA) NASA, Goddard Space Flight Center, Greenbelt, MD (USA))

1991-03-01

312

Flame spectra-temperature estimation based on a color imaging camera and a spectral reconstruction technique.  

PubMed

In this paper a low-cost, practical pixel-based flame spectrum and temperature estimation system based on flame color images is proposed. A spectral resolution of ~0.4 nm is achieved with an optical system formed by a color camera, a linear model, a flame's spectral training data, and a spectral reconstruction procedure. As a proof of concept, the estimated spectra are compared to local measurements performed with a commercial spectrometer. In order to estimate the absolute flame-temperature maps, two radiometric images at different wavelengths are reconstructed and the two-color pyrometry method is applied. Experiments show errors of about 2.0% in the estimated temperature, making this system a practical tool for flame sensing in combustion-process monitoring. PMID:25322218
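The two-color pyrometry step can be sketched under the Wien approximation; the channel wavelengths and flame temperature below are assumptions for the sketch, and the real system uses reconstructed radiometric images rather than ideal radiances:

```python
import numpy as np

H, C, KB = 6.626e-34, 3.0e8, 1.381e-23
C2 = H * C / KB                        # second radiation constant, ~0.0144 m*K

def wien_radiance(lam, T):
    """Wien approximation of blackbody spectral radiance (up to a
    constant factor that cancels in the ratio)."""
    return lam ** -5 * np.exp(-C2 / (lam * T))

def two_color_temperature(s1, s2, lam1, lam2):
    """Temperature from the radiance ratio at two wavelengths; a gray
    (wavelength-independent) emissivity cancels out of the ratio."""
    lhs = np.log(s1 / s2) + 5.0 * np.log(lam1 / lam2)
    return C2 * (1.0 / lam2 - 1.0 / lam1) / lhs

lam1, lam2 = 650e-9, 550e-9            # red and green wavelengths (assumed)
s1 = wien_radiance(lam1, 1800.0)
s2 = wien_radiance(lam2, 1800.0)
print(round(two_color_temperature(s1, s2, lam1, lam2)))   # → 1800
```

Because only the ratio of the two reconstructed radiometric images enters, any gray emissivity or common optical attenuation drops out, which is the main appeal of the two-color method for sooty flames.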

Toro N, Carlos; Arias P, Luis; Torres, Sergio; Sbarbaro, Daniel

2014-10-01

313

Demonstration of three-dimensional imaging with a germanium Compton camera  

SciTech Connect

A proof-of-concept experiment was performed to demonstrate the three-dimensional imaging capabilities of an all-germanium Compton camera for near-field sources. Eight high purity germanium (HPGe) coaxial detectors were configured in two planes of four detectors each. The data acquisition system was assembled from available hardware and off-the-shelf electronics using standard NIM and CAMAC modules. Sixteen-parameter data were recorded event-by-event for later analysis and image reconstruction. The energy resolution of the system was 0.3% at 1333 keV. Preliminary analysis indicates the position resolution for a source at one meter and a specific detector geometry was less than a centimeter.

McKisson, J.E.; Haskins, P.S. (Inst. for Space Science and Technology, Gainesville, FL (United States)); Phillips, G.W.; King, S.E.; August, R.A. (Naval Research Lab., Washington, DC (United States)); Piercey, R.B. (Sparrow Corp., Mississippi State, MS (United States)); Mania, R.C. (Kentucky State Univ., Frankfort, KY (United States). Division of Math and Sciences)

1994-08-01

314

Design and fabrication of MEMS-based thermally-actuated image stabilizer for cell phone camera  

NASA Astrophysics Data System (ADS)

A micro-electro-mechanical system (MEMS)-based image stabilizer is proposed to counteract shaking in cell phone cameras. The proposed stabilizer (dimensions 8.8 × 8.8 × 0.2 mm³) includes a two-axis decoupling XY stage and has sufficient strength to suspend the image sensor (IS) used for the anti-shaking function. The XY stage is designed to route electrical signals from the suspended IS through eight signal springs to 24 signal outputs. The maximum actuating distance of the stage is larger than 25 μm, which is sufficient to resolve the shaking problem. The applied voltage for the 25 μm travel is lower than 20 V; the dynamic resonant frequency of the actuating device is 4485 Hz, and the rise time is 21 ms.

Lin, Chun-Ying; Chiou, Jin-Chern

2012-11-01

315

Estimation of Enterococci Input from Bathers and Animals on A Recreational Beach Using Camera Images  

PubMed Central

Enterococci are used nationwide as a water quality indicator for marine recreational beaches. Prior research has demonstrated that enterococci inputs to the study beach site (located in Miami, FL) are dominated by non-point sources (including humans and animals). We have estimated their respective source functions by developing a counting methodology for individuals to better understand their non-point source load impacts. The method utilizes camera images of the beach taken at regular time intervals to determine the number of human and animal visitors. The developed method translates raw image counts for weekdays and weekend days into daily and monthly visitation rates. Enterococci source functions were computed from the observed number of unique individuals for average days of each month of the year, and from average load contributions for humans and for animals. Results indicate that dogs represent the largest source of enterococci relative to humans and birds. PMID:20381094

Wang, John D.; Solo-Gabriele, Helena M.; Abdelzaher, Amir M.; Fleming, Lora E.

2010-01-01

316

Performance characteristics of a 511keV collimator for imaging positron emitters with a standard gamma-camera  

Microsoft Academic Search

Line-source experiments were conducted to assess the performance of a gamma-camera equipped with a specially designed 511-keV collimator for the planar imaging of positron emitters. The results were compared with the camera performance with routinely used collimators and radionuclides (thallium-201, technetium-99m and gallium-67). With positron emitters, scatter contributed less to the widening of the line spread function than with radionuclides

Arthur Van Lingen; Peter C. Huijgens; Frans C. Visser; Gert J. Ossenkoppele; Otto S. Hoekstra; Harry J. M. Martens; Hans Huitink; Koos D. M. Herscheid; Michael V. Greens; Gerrit J. J. Teule

1992-01-01

317

Imaging system for cardiac planar imaging using a dedicated dual-head gamma camera  

DOEpatents

A cardiac imaging system employing dual gamma imaging heads co-registered with one another to provide two dynamic simultaneous views of the heart sector of a patient torso. A first gamma imaging head is positioned in a first orientation with respect to the heart sector and a second gamma imaging head is positioned in a second orientation with respect to the heart sector. An adjustment arrangement is capable of adjusting the distance between the separate imaging heads and the angle between the heads. With the angle between the imaging heads set to 180 degrees and operating in a range of 140-159 keV and at a rate of up to 500 kHz, the imaging heads are co-registered to produce simultaneous dynamic recording of two stereotactic views of the heart. The use of co-registered imaging heads maximizes the uniformity of detection sensitivity of blood flow in and around the heart over the whole heart volume and minimizes radiation absorption effects. A normalization/image fusion technique is implemented pixel-by-corresponding-pixel to increase signal for any cardiac region viewed in two images obtained from the two opposed detector heads for the same time bin. The imaging system is capable of producing enhanced first-pass studies, blood-pool studies including planar, gated and non-gated EKG studies, planar EKG perfusion studies, and planar hot-spot imaging.
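A minimal sketch of pixel-by-corresponding-pixel fusion of two opposed, co-registered head images; the mirroring of the second head and the simple normalize-and-sum rule are illustrative assumptions, since the patent's exact normalization is not given in the abstract.

```python
import numpy as np

def fuse_opposed_views(head1, head2, sens1=None, sens2=None):
    """Pixel-by-corresponding-pixel fusion of two co-registered views from
    opposed gamma detector heads (a generic sketch, not the patented
    method). head2 is mirrored so each pixel pair views the same line
    through the patient; each head is divided by its detection-sensitivity
    map before summing, so signal adds while sensitivity is equalized."""
    head2_aligned = np.fliplr(head2)  # opposed head sees a mirrored view
    if sens1 is None:
        sens1 = np.ones_like(head1, dtype=float)
    if sens2 is None:
        sens2 = np.ones_like(head2, dtype=float)
    return head1 / sens1 + head2_aligned / sens2

# toy example: two uniform frames for one time bin
a = np.full((4, 4), 3.0)
b = np.full((4, 4), 5.0)
fused = fuse_opposed_views(a, b)
```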

Majewski, Stanislaw (Morgantown, VA); Umeno, Marc M. (Woodinville, WA)

2011-09-13

318

Performance of the Aspect Camera Assembly for the Advanced X-Ray Astrophysics Facility: Imaging  

NASA Technical Reports Server (NTRS)

The Aspect Camera Assembly (ACA) is a state-of-the-art star tracker that provides real-time attitude information to the Advanced X-Ray Astrophysics Facility - Imaging (AXAF-I) and provides imaging data for post-facto ground processing. The ACA consists of a telescope with a CCD focal plane, associated focal plane read-out electronics, and an on-board processor that processes the focal plane data to produce star image location reports. On-board star image locations are resolved to 0.8 arcsec, and post-facto algorithms yield 0.2 arcsec star location accuracies (at end of life). The protoflight ACA has been built, along with a high-accuracy vacuum test facility. Image position determination has been verified to < 0.2 arcsec accuracies. This paper is a follow-on to one presented by the author at the AeroSense '95 conference. It presents the "as built" configuration, the tested performance, and the test facility's design and demonstrated accuracy. The ACA has been delivered in anticipation of an August 1998 shuttle launch.

Michaels, Dan

1998-01-01

319

CMOS detector arrays in a virtual 10-kilopixel camera for coherent terahertz real-time imaging.  

PubMed

We demonstrate the principle applicability of antenna-coupled complementary metal oxide semiconductor (CMOS) field-effect transistor arrays as cameras for real-time coherent imaging at 591.4 GHz. By scanning a few detectors across the image plane, we synthesize a focal-plane array of 100×100 pixels with an active area of 20×20 mm², which is applied to imaging in transmission and reflection geometries. Individual detector pixels exhibit a voltage conversion loss of 24 dB and a noise figure of 41 dB for 16 μW of local oscillator (LO) drive. For object illumination, we use a radio-frequency (RF) source with 432 μW at 590 GHz. Coherent detection is realized by quasioptical superposition of the image and the LO beam with 247 μW. At an effective frame rate of 17 Hz, we achieve a maximum dynamic range of 30 dB in the center of the image and more than 20 dB within a disk of 18 mm diameter. The system has been used for surface reconstruction, resolving height differences in the μm range. PMID:22344098

Boppel, Sebastian; Lisauskas, Alvydas; Max, Alexander; Krozer, Viktor; Roskos, Hartmut G

2012-02-15

320

Calibration of HST wide field camera for quantitative analysis of faint galaxy images  

NASA Technical Reports Server (NTRS)

We present the methods adopted to optimize the calibration of images obtained with the Hubble Space Telescope (HST) Wide Field Camera (WFC) (1991-1993). Our main goal is to improve quantitative measurement of faint images, with special emphasis on the faint (I approximately 20-24 mag) stars and galaxies observed as a part of the Medium-Deep Survey. Several modifications to the standard calibration procedures have been introduced, including improved bias and dark images, and a new supersky flatfield obtained by combining a large number of relatively object-free Medium-Deep Survey exposures of random fields. The supersky flat has a pixel-to-pixel rms error of about 2.0% in F555W and of 2.4% in F785LP; large-scale variations are smaller than 1% rms. Overall, our modifications improve the quality of faint images with respect to the standard calibration by about a factor of five in photometric accuracy and about 0.3 mag in sensitivity, corresponding to about a factor of two in observing time. The relevant calibration images have been made available to the scientific community.
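The supersky-flat idea above, combining many relatively object-free exposures into a flatfield, can be sketched generically: each frame is scaled to unit median so varying sky levels do not bias the stack, and the per-pixel median rejects faint sources. This is a sketch of the general technique under those assumptions, not the authors' pipeline.

```python
import numpy as np

def supersky_flat(frames):
    """Build a 'supersky' flatfield from many object-free sky exposures.
    frames: array-like of shape (n_frames, ny, nx)."""
    frames = np.asarray(frames, dtype=float)
    # scale each frame to unit median so differing sky levels stack fairly
    scaled = frames / np.median(frames, axis=(1, 2), keepdims=True)
    # per-pixel median rejects faint sources present in only a few frames
    flat = np.median(scaled, axis=0)
    return flat / np.median(flat)  # normalize the flat to unit median

rng = np.random.default_rng(0)
# simulate 50 sky frames: common pixel response * varying sky level + noise
response = 1.0 + 0.05 * rng.standard_normal((32, 32))
stack = [s * response + 0.01 * rng.standard_normal((32, 32))
         for s in rng.uniform(50, 150, size=50)]
flat = supersky_flat(stack)
```

In the simulation the recovered flat tracks the injected pixel response closely, which is the property the supersky approach relies on.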

Ratnatunga, Kavan U.; Griffiths, Richard E.; Casertano, Stefano; Neuschaefer, Lyman W.; Wyckoff, Eric W.

1994-01-01

321

Self-coherent camera: first results of a high-contrast imaging bench in visible light  

NASA Astrophysics Data System (ADS)

Extreme adaptive optics and coronagraphy are mandatory for direct imaging of exoplanets. Quasi-static aberrations limit instrument performance, producing speckle noise in the focal plane. We propose a Self-Coherent Camera (SCC) to both control a deformable mirror that actively compensates wavefront errors and calibrate the speckle noise. We create a reference beam to spatially modulate the coronagraphic speckle pattern with Fizeau fringes. In a first step, we extract wavefront aberrations from the science image and correct for them using a deformable mirror. In a second step, we apply a post-processing algorithm to discriminate the companion image from the residual speckle field. To validate the instrumental concept, we developed a high-contrast imaging bench in visible light. We combined the SCC with a four-quadrant phase mask coronagraph and a deformable mirror (DM) with a high number of actuators (32x32 Boston Micromachines MEMS). We present this bench and show first experimental results of focal-plane wavefront sensing and high-contrast imaging. The measurements are compared to numerical simulations.

Mas, Marion; Baudoz, Pierre; Rousset, Gérard; Galicher, Raphaël; Baudrand, Jacques

2010-07-01

322

A Semi-Automatic Image-Based Close Range 3D Modeling Pipeline Using a Multi-Camera Configuration  

PubMed Central

The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. Objects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to the large quantity of antiques stored in museums. PMID:23112656

Rau, Jiann-Yeou; Yeh, Po-Chia

2012-01-01

323

Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor  

NASA Astrophysics Data System (ADS)

We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera with fast auto-exposure control and color filter array (CFA) demosaicing. The proposed hardware architecture includes the design of the charge-coupled device (CCD) drive circuits, image processing circuits, and power supply circuits. The CCD drive circuits translate the TTL (transistor-transistor logic) level timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor outputs analog image signals. The image processing circuits convert the analog signals to digital signals for subsequent processing; TTL timing, auto-exposure control, CFA demosaicing, and gamma correction are accomplished in this module. The power supply circuits provide power for the whole system, which is very important for image quality: power noise affects image quality directly, and we reduce it effectively in hardware. The CCD in this system is the KAI-0340, which can output 210 full-resolution frames per second, and our camera works well in this mode. Because traditional auto-exposure control algorithms reach a proper exposure level slowly, a fast auto-exposure control method is needed; we present a new auto-exposure algorithm suited to high-speed cameras. Color demosaicing is critical for digital cameras because it converts the Bayer sensor mosaic output to a full-color image, which determines the output image quality of the camera. Complex algorithms achieve high quality but cannot be implemented in hardware, so a low-complexity demosaicing method that can be implemented in hardware while satisfying the quality requirement is presented. Experimental results are given at the end of the paper.
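A classic low-complexity, hardware-friendly demosaicing method of the kind the abstract alludes to is plain bilinear interpolation on the Bayer mosaic. The sketch below assumes an RGGB layout (an assumption; the paper's algorithm is not specified) and uses normalized convolution so image borders remain correct.

```python
import numpy as np

def conv2_same(img, k):
    """Tiny 'same'-size 2-D convolution with zero padding (no SciPy)."""
    kh, kw = k.shape
    p = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros_like(img, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def demosaic_bilinear(raw):
    """Bilinear demosaicing of an RGGB Bayer mosaic (generic sketch,
    not the authors' hardware algorithm)."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    out = np.empty((h, w, 3))
    for c, (mask, k) in enumerate([(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]):
        # normalized convolution: interpolate the known samples of each
        # channel, renormalizing by the local sample density at borders
        out[..., c] = conv2_same(raw * mask, k) / conv2_same(mask, k)
    return out

# sanity check: a flat gray mosaic must demosaic to a flat gray image
raw = np.full((8, 8), 0.5)
rgb = demosaic_bilinear(raw)
```

Only shifts, adds, and small fixed kernels are involved, which is why this family of methods maps well onto FPGA pipelines.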

Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

2013-08-01

324

Experimental comparison of the high-speed imaging performance of an EM-CCD and sCMOS camera in a dynamic live-cell imaging test case.  

PubMed

The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178

Beier, Hope T; Ibey, Bennett L

2014-01-01

325

Experimental Comparison of the High-Speed Imaging Performance of an EM-CCD and sCMOS Camera in a Dynamic Live-Cell Imaging Test Case  

PubMed Central

The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178

Beier, Hope T.; Ibey, Bennett L.

2014-01-01

326

Limb imaging of the Venus O2 visible nightglow with the Venus Monitoring Camera  

E-print Network

We investigated the Venus O2 visible nightglow with imagery from the Venus Monitoring Camera on Venus Express. Drawing from data collected between April 2007 and January 2011, we study the global distribution of this emission, discovered in the late 1970s by the Venera 9 and 10 missions. The inferred limb-viewing intensities are on the order of 150 kilorayleighs at the lower latitudes and seem to drop somewhat towards the poles. The emission is generally stable, although there are episodes when the intensities rise up to 500 kR. We compare a set of Venus Monitoring Camera observations with coincident measurements of the O2 nightglow at 1.27 μm made with the Visible and Infrared Thermal Imaging Spectrometer, also on Venus Express. From the evidence gathered in this and past works, we suggest a direct correlation between the instantaneous emissions from the two O2 nightglow systems. Possible implications regarding the uncertain origin of the atomic oxygen green line at 557.7 nm are noted.

García Muñoz, A.; Sánchez-Lavega, A.; Markiewicz, W. J.; Titov, D. V.; Witasse, O.; Opitz, A.

2013-01-01

327

A Novel Method of Object Detection from a Moving Camera Based on Image Matching and Frame Coupling  

PubMed Central

A new method based on image matching and frame coupling to handle the problems of object detection caused by a moving camera and object motion is presented in this paper. First, feature points are extracted from each frame and matched, from which the camera motion parameters are obtained. Sub-images are then extracted from the corresponding frame via these motion parameters. Furthermore, a novel search over potential orientations improves efficiency and accuracy. Finally, a frame-coupling method is adopted, which improves the accuracy of object detection. The results demonstrate the effectiveness and feasibility of the proposed method for a moving object with changing posture observed by a moving camera. PMID:25354301

Chen, Yong; Zhang, Rong hua; Shang, Lei

2014-01-01

328

An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction  

SciTech Connect

Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. 
The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly parallel to the image plane. This effect decreases the sum of the image, thereby also affecting the mean, standard deviation, and SNR of the image. All back-projected events associated with a simulated point source intersected the voxel containing the source and the FWHM of the back-projected image was similar to that obtained from the marching method. Conclusions: The slight deficit to image quality observed with the threshold-based back-projection algorithm described here is outweighed by the 75% reduction in computation time. The implementation of this method requires the development of an optimum threshold function, which determines the overall accuracy of the method. This makes the algorithm well-suited to applications involving the reconstruction of many large images, where the time invested in threshold development is offset by the decreased image reconstruction time. Implemented in a parallel-computing environment, the threshold-based algorithm has the potential to provide real-time dose verification for radiation therapy.
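The per-slice back-projection described above can be sketched as follows: for each voxel in an image slice, compute its angular deviation from the event cone, scale by range so the value approximates a distance, and threshold to obtain the binary intersection curve. This is a generic illustration; the paper's solution matrix and threshold function differ in detail, and all geometry values below are made up for the example.

```python
import numpy as np

def backproject_cone_slice(apex, axis, half_angle, z, xs, ys, thresh):
    """Threshold-based back-projection of one Compton event cone onto the
    image slice at height z. apex: cone apex (scatter position),
    axis: unit cone-axis vector, half_angle: Compton scatter angle (rad)."""
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    V = np.stack([X - apex[0], Y - apex[1],
                  np.full_like(X, z - apex[2])], axis=-1)
    r = np.linalg.norm(V, axis=-1)
    # angle between each voxel direction and the cone axis
    cos_t = (V @ axis) / r
    # 'solution matrix': angular deviation from the cone surface,
    # scaled by range so it approximates a spatial distance
    dist = np.abs(np.arccos(np.clip(cos_t, -1.0, 1.0)) - half_angle) * r
    return (dist < thresh).astype(np.uint8)  # binary intersection curve

# example: apex at origin, axis +z, 45-degree cone, slice at z = 1:
# the intersection is a circle of radius 1 in that slice
xs = ys = np.linspace(-2, 2, 81)
mask = backproject_cone_slice(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                              np.deg2rad(45.0), 1.0, xs, ys, 0.05)
```

Accumulating such masks over all events and slices yields the three-dimensional back-projection image that the filtered or iterative reconstruction then refines.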

Mundy, Daniel W.; Herman, Michael G. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota 55905 (United States)

2011-01-15

329

Apogee Imaging Systems Alta F42 CCD Camera with Back-Illuminated EV2 CCD42-40 Sensor

E-print Network

The Apogee Alta F42 CCD Camera has a back-illuminated full-frame 4-megapixel EV2 CCD42-40 sensor with very high quantum efficiency. Midband, broadband, and UV-enhanced versions of the CCD are available.

Kleinfeld, David

330

Performance of a slow-scan CCD camera for macromolecular imaging in a 400 kV electron cryomicroscope.  

PubMed

The feasibility and limitations of a 1024 x 1024 slow-scan charge-coupled device (CCD) camera were evaluated for imaging in a 400 kV electron cryomicroscope. Catalase crystals and amorphous carbon film were used as test specimens. Using catalase crystals, it was found that the finite (24 μm) pixel size of the slow-scan CCD camera governs the ultimate resolution in the acquired images. For instance, spot-scan images of ice-embedded catalase crystals showed resolutions of 8 Å and 4 Å at effective magnifications of 67,000× and 132,000×, respectively. Using an amorphous carbon film, the damping effect of the modulation transfer function (MTF) of the slow-scan CCD camera on the specimen's Fourier spectrum relative to that of photographic film was evaluated. The MTF of the slow-scan CCD camera fell off more rapidly than that of the photographic film, reaching a value of 0.2 at the Nyquist frequency. Despite this attenuation, the signal-to-noise ratio of the CCD data, as determined from reflections of negatively stained catalase crystals, was found to decrease to approximately 50% of that of the photographic film data. The phases computed from images of the same negatively stained catalase crystals recorded consecutively on both the slow-scan CCD camera and photographic film were found to agree within 12 degrees. Ways of minimizing the effect of the MTF of the slow-scan CCD camera on the acquired images are also presented. PMID:8858867
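One generic way to minimize the effect of a detector MTF is to divide the image spectrum by a measured MTF model, clamped to avoid amplifying noise where the MTF is small. The sketch below assumes a radially symmetric Gaussian MTF model falling to 0.2 at the Nyquist frequency, matching the figure quoted above; it is not the paper's procedure.

```python
import numpy as np

def mtf_compensate(img, mtf_radial, eps=0.05):
    """Divide the image spectrum by a radially symmetric detector MTF.
    mtf_radial maps spatial frequency (cycles/pixel, 0.5 = Nyquist) to an
    MTF value; eps clamps the divisor to limit noise amplification."""
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    f = np.hypot(fx, fy)                      # radial spatial frequency
    mtf = np.maximum(mtf_radial(f), eps)      # clamped MTF surface
    return np.real(np.fft.ifft2(np.fft.fft2(img) / mtf))

# assumed Gaussian-like MTF model: 1 at DC, ~0.2 at Nyquist (f = 0.5)
mtf_model = lambda f: np.exp(-(f / 0.394) ** 2)

# sanity check: a flat field has only a DC component (MTF = 1 there),
# so compensation must leave it unchanged
img = np.ones((16, 16))
restored = mtf_compensate(img, mtf_model)
```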

Sherman, M B; Brink, J; Chiu, W

1996-04-01

331

Measurement of effective temperature range of fire service thermal imaging cameras  

NASA Astrophysics Data System (ADS)

The use of thermal imaging cameras (TIC) by the fire service is increasing as fire fighters become more aware of the value of these tools. The National Fire Protection Association (NFPA) is currently developing a consensus standard for design and performance requirements of TIC as used by the fire service. The National Institute of Standards and Technology facilitates this process by providing recommendations for science-based performance metrics and test methods to the NFPA technical committee charged with the development of this standard. A suite of imaging performance metrics and test methods, based on the harsh operating environment and limitations of use particular to the fire service, has been proposed for inclusion in the standard. The Effective Temperature Range (ETR) measures the range of temperatures that a TIC can view while still providing useful information to the user. Specifically, extreme heat in the field of view tends to inhibit a TIC's ability to discern surfaces having intermediate temperatures, such as victims and fire fighters. The ETR measures the contrast of a target having alternating 25 °C and 30 °C bars while an increasing temperature range is imposed on other surfaces in the field of view. The ETR also indicates the thermal conditions that trigger a shift in integration time common to TIC employing microbolometer sensors. The reported values for this imaging performance metric are the hot surface temperature range within which the TIC provides adequate bar contrast, and the hot surface temperature at which the TIC shifts integration time.
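The bar-contrast measurement at the heart of the ETR can be sketched as a Michelson-style contrast between the hot and cold bars of the target; the exact NFPA metric definition is not reproduced here, and the target geometry below is illustrative.

```python
import numpy as np

def bar_contrast(image, bar_mask):
    """Michelson-style contrast of an alternating-bar target.
    bar_mask is True on the 'hot' (30 C) bars, False on the 'cold'
    (25 C) bars; image holds the TIC's rendered values."""
    hot = image[bar_mask].mean()
    cold = image[~bar_mask].mean()
    return (hot - cold) / (hot + cold)

# synthetic TIC frame: a perfect rendering of the 25/30 degree bar target
bars = np.zeros((8, 64), dtype=bool)
bars[:, (np.arange(64) // 8) % 2 == 1] = True   # 8-pixel-wide bars
frame = np.where(bars, 30.0, 25.0)
c = bar_contrast(frame, bars)
```

In the ETR test this contrast is tracked as increasingly hot surfaces are added to the field of view; the reported value is the hot-surface temperature range over which the contrast stays adequate.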

Amon, Francine; Bryner, Nelson

2008-04-01

332

The Nuclear Region of M51 Imaged with the HST Planetary Camera  

E-print Network

We present high-resolution, broad- and narrow-band, pre-refurbishment images of the central region of M51 taken with the Planetary Camera of the Hubble Space Telescope. The V-band images show a rather chaotic distribution of dust lanes, though some are oriented radially, roughly aligned with the major axis of the bar, and may be transporting gas to the AGN in the nucleus. The dust lane obscuring the nucleus of the galaxy, which was previously thought to be an edge-on accretion disk feeding the AGN, is not centered on the nucleus. It is unlikely that this is a stable configuration, suggesting that the material has only recently entered the nuclear region. The nucleus is contained within a cluster of stars having a total luminosity of order 5×10^7 L_⊙. Fitting a King model to the least obscured portions of the cluster yields a maximum core radius of 14 pc. The morphology apparent in the forbidden-line images of the extra-nuclear cloud is consistent with a narrow jet striking and scattering off the boundary of a relatively dense cocoon of gas in the disk of the galaxy. The emission-line regions are concentrated along the inner borders of dust filaments, supporting the view that the nuclear jet is ramming into and stirring up the ISM of the disk.

Carl J. Grillmair; S. M. Faber; Tod R. Lauer; J. J. Hester; C. R. Lynds; E. J. O'Neil, Jr.; P. A. Scowen

1996-10-15

333

HERSCHEL/SCORE, imaging the solar corona in visible and EUV light: CCD camera characterization.  

PubMed

The HERSCHEL (helium resonant scattering in the corona and heliosphere) experiment is a rocket mission that was successfully launched last September from White Sands Missile Range, New Mexico, USA. HERSCHEL was conceived to investigate the solar corona in the extreme UV (EUV) and in the visible broadband polarized brightness, and provided, for the first time, a global map of helium in the solar environment. The HERSCHEL payload consisted of a telescope, the HERSCHEL EUV Imaging Telescope (HEIT), and two coronagraphs, HECOR (helium coronagraph) and SCORE (sounding coronagraph experiment). The SCORE instrument, designed and developed mainly by Italian research institutes, is an imaging coronagraph that observes the solar corona from 1.4 to 4 solar radii. SCORE has two detectors for the EUV lines at 121.6 nm (HI) and 30.4 nm (HeII) and for the visible broadband polarized brightness. The SCORE UV detector is an intensified CCD with a microchannel plate coupled to a CCD through a fiber-optic bundle. The SCORE visible light detector is a frame-transfer CCD coupled to a polarimeter based on a liquid crystal variable retarder plate. The SCORE coronagraph is described together with the performance of the cameras for imaging the solar corona. PMID:20428852

Pancrazzi, M; Focardi, M; Landini, F; Romoli, M; Fineschi, S; Gherardi, A; Pace, E; Massone, G; Antonucci, E; Moses, D; Newmark, J; Wang, D; Rossi, G

2010-07-01

334

Calibration and Validation of Images from the Mars Reconnaissance Orbiter Mars Color Imager (MARCI) and Context Camera (CTX) Instruments  

NASA Astrophysics Data System (ADS)

The MRO CTX instrument is a monochrome (611 ± 189 nm), linear-array CCD pushbroom camera with a nominal surface resolution of 6 m/pixel. The MARCI instrument is a 2-D CCD framing camera with 5 visible (420, 550, 600, 650, and 720 nm) and 2 UV (260 and 320 nm) filters, a 180° field of view, and a nominal resolution of about 1 km/pixel at nadir. Following Mars Orbital Insertion (MOI) in March 2006, CTX and MARCI images were acquired for initial instrument checkouts and validation of the pre-flight and in-flight calibration pipeline. CTX in-flight bias and dark current levels are derived from masked pixels at the edges of the array. A dark current model derived during pre-flight calibration is applied if the masked pixels exhibit a gradient across the field or noise above an acceptable threshold. The CTX flatfield removes residual pixel non-uniformities and a subtle "jail bar" effect caused by the CCD's alternating register readout. Radiances are derived from bias-, dark-, and flat-corrected images using pre-flight scaling factors. Dividing the average radiances by the solar spectral radiance convolved over the CTX filter transmission and applying a Minnaert phase angle correction yields an average I/F level in the CTX post-MOI Mars images near the expected value of 0.2. Bias and dark current subtraction of the MARCI images uses either a pre-flight model or dark-sky data from the far left or far right parts of the field (nominally off the Mars limb). The pre-flight flatfield data were modified based on in-flight performance to remove residual pixel non-uniformities. Some residual pixel-dependent bias non-uniformities were also corrected using in-flight data. Bias-, dark-, and flat-corrected images were converted to radiance using pre-flight scaling factors. Phase-corrected 7-filter I/F values for the region of Mars imaged during the post-MOI campaign are consistent with previous data.
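The photometric step described for CTX, dividing radiance by the band-convolved solar spectrum and applying a Minnaert correction, can be sketched as below. The Minnaert exponent k, the angles, and all numerical values are illustrative assumptions, not the pipeline's constants.

```python
import numpy as np

def radiance_to_iof(L, F_solar, mu0, mu, k=0.7):
    """Convert calibrated band radiance to Minnaert-corrected I/F
    (a generic sketch of the photometric step, not the CTX pipeline).
    L:       band radiance, W m^-2 sr^-1 nm^-1
    F_solar: solar spectral irradiance at Mars convolved over the
             filter bandpass, W m^-2 nm^-1
    mu0, mu: cosines of the incidence and emission angles"""
    iof = np.pi * L / F_solar                    # radiance factor I/F
    return iof / (mu0 ** k * mu ** (k - 1.0))    # Minnaert photometric correction

# round-trip check: a surface with assumed Minnaert albedo 0.2
k, mu0, mu = 0.7, np.cos(np.deg2rad(30.0)), np.cos(np.deg2rad(10.0))
F_solar = 1.8e-2                                 # made-up band irradiance
L = 0.2 * mu0 ** k * mu ** (k - 1.0) * F_solar / np.pi
iof_corr = radiance_to_iof(L, F_solar, mu0, mu, k)
```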

Schaeffer, Derek; Bell, J. F., III; Malin, M.; Caplinger, M.; Calvin, W. M.; Cantor, B.; Clancy, R. T.; Haberle, R. M.; James, P. B.; Lee, S.; Thomas, P.; Wolff, M. J.

2006-09-01

335

The social camera: a case-study in contextual image recommendation  

Microsoft Academic Search

The digital camera revolution has changed the world of photography and now most people have access to, and even regularly carry, a digital camera. Often these cameras have been designed with simplicity in mind: they harness a variety of sophisticated technologies in order to automatically take care of all manner of complex settings (aperture, shutter speed, flash etc.) for point-and-shoot

Steven Bourke; Kevin McCarthy; Barry Smyth

2011-01-01

336

Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitkin Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET° from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) is used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. 
Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.
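The LOLA co-registration step lends itself to a simple illustration. The sketch below estimates and removes only the vertical bias between a NAC DTM and LOLA altimeter points; the grid `transform` layout, nearest-neighbor sampling, and median offset are illustrative assumptions, and the actual ISIS/SOCET SET workflow also solves for horizontal alignment.

```python
import numpy as np

def coregister_vertical(dtm, transform, lola_points):
    """Estimate and remove the vertical bias between a DTM and LOLA
    altimeter points (nearest-neighbor sampling; a full pipeline would
    also solve for horizontal shifts)."""
    x0, y0, dx, dy = transform  # grid origin and pixel size of the DTM
    offsets = []
    for x, y, z in lola_points:
        col = int(round((x - x0) / dx))
        row = int(round((y - y0) / dy))
        if 0 <= row < dtm.shape[0] and 0 <= col < dtm.shape[1]:
            offsets.append(z - dtm[row, col])  # LOLA minus DTM elevation
    bias = float(np.median(offsets))           # robust to outlier returns
    return dtm + bias, bias
```

Using the median rather than the mean keeps a few bad altimeter returns from skewing the estimated offset.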

Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

2011-12-01

337

Evaluating intensified camera systems  

SciTech Connect

This paper describes image evaluation techniques used to standardize camera system characterizations. The authors group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.
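The frame-difference method is a common way to separate temporal noise from fixed-pattern structure when characterizing sensitivity and noise. A minimal sketch of that kind of metric, not the authors' analysis tool, assuming two aligned flat-field exposures:

```python
import numpy as np

def flat_field_metrics(frame_a, frame_b):
    """Estimate mean signal, temporal noise, and SNR from two flat-field
    frames.  Differencing the frames cancels fixed-pattern noise, leaving
    sqrt(2) times the per-frame temporal noise."""
    a = frame_a.astype(float)
    b = frame_b.astype(float)
    signal = 0.5 * (a.mean() + b.mean())
    noise = np.std(a - b) / np.sqrt(2.0)  # per-frame temporal noise
    return {"signal": signal, "noise": noise, "snr": signal / noise}
```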

S. A. Baker

2000-06-30

338

Tower Camera Handbook  

SciTech Connect

The tower camera in Barrow provides hourly images of ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during spring time, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved in hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.
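A hedged sketch of how fractional snow cover might be derived from such an image by brightness thresholding; the threshold value and the ground `mask` are illustrative assumptions, not the actual ARM processing chain:

```python
import numpy as np

def snow_fraction(gray, threshold=200, mask=None):
    """Fraction of ground pixels brighter than `threshold` (0-255 gray).
    `mask` selects the ground region, excluding sky and tower hardware."""
    gray = np.asarray(gray)
    if mask is None:
        mask = np.ones(gray.shape, dtype=bool)
    ground = gray[mask]
    return float((ground >= threshold).sum()) / ground.size
```

In practice the result would be compared against the albedo computed from the downward-looking radiometers mentioned above.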

Moudry, D

2005-01-01

339

Real-time imaging using a 2.8 THz quantum cascade laser and uncooled infrared microbolometer camera.  

PubMed

Real-time imaging in the terahertz (THz) spectral range was achieved using a milliwatt-scale, 2.8 THz quantum cascade laser and an uncooled, 160 x 120 pixel microbolometer camera modified with Picarin optics. Noise equivalent temperature difference of the camera in the 1-5 THz frequency range was estimated to be at least 3 K, confirming the need for external THz illumination when imaging in this frequency regime. Despite the appearance of fringe patterns produced by multiple diffraction effects, single-frame and extended video imaging of obscured objects show high-contrast differentiation between metallic and plastic materials, supporting the viability of this imaging approach for use in future security screening applications. PMID:18311285

Behnken, Barry N; Karunasiri, Gamani; Chamberlin, Danielle R; Robrish, Peter R; Faist, Jérôme

2008-03-01

340

Radioisotope guided surgery with imaging probe, a hand-held high-resolution gamma camera  

NASA Astrophysics Data System (ADS)

Since 1997, our physics group, together with nuclear medicine physicians, has studied imaging probes (IPs): hand-held, high-resolution gamma cameras for radio-guided surgery (RGS). The present work aims to verify the usefulness of two updated IPs in different surgical operations. Forty patients scheduled for breast cancer sentinel node (SN) biopsy, five patients with nodal recurrence of thyroid cancer, seven patients with parathyroid adenomas, and five patients with neuroendocrine tumours (NETs) underwent surgery guided by the IPs. We used two different IPs with fields of view of 1 in.² and 4 in.², respectively, and an intrinsic spatial resolution of about 2 mm. The radioisotopes were 99mTc, 123I and 111In. The 1 in.² IP detected the SN in all 40 patients and more than one node in 24, whereas the Anger camera (AC) failed to locate the SN in four patients and detected true-positive second nodes in only nine patients. The 4 in.² IP was used for RGS of thyroid, parathyroid and NETs. It detected eight latero-cervical nodes; in the same patients, the AC detected five invaded nodes. The IP detected 10 parathyroid adenomas in seven patients and five NETs in five patients. Both the 1 in.² and 4 in.² IPs proved useful in all operations. Initial studies on SN biopsy were carried out on small series of patients to validate the IPs and to demonstrate their effectiveness and usefulness alone or against conventional probes. We propose the use of the IP as a control method for legal documentation and as a guide to surgical strategy before and after lesion removal.

Soluri, A.; Trotta, C.; Scopinaro, F.; Tofani, A.; D'Alessandria, C.; Pasta, V.; Stella, S.; Massari, R.

2007-12-01

341

The JANUS camera onboard JUICE mission for Jupiter system optical imaging  

NASA Astrophysics Data System (ADS)

JANUS (Jovis, Amorum ac Natorum Undique Scrutator) is the visible camera selected for the ESA JUICE mission to the Jupiter system. Resource constraints, S/C characteristics, mission design, environment, and the great variability of observing conditions for several targets put stringent constraints on the instrument architecture. In addition to the usual requirements for a planetary mission, the problem of mass and power consumption is particularly stringent due to the long cruise and to operations at large distance from the Sun. The JANUS design shall cope with a wide range of targets, from the Jupiter atmosphere to solid satellite surfaces, exospheres, rings, and lightning, all to be observed in several color and narrow-band filters. All targets shall be tracked during the mission, and in some specific cases DTMs will be derived from stereo imaging. The mission design allows a quite long time range for observations in the Jupiter system, with orbits around Jupiter and multiple fly-bys of satellites for 2.5 years, followed by about 6 months in orbit around Ganymede at altitudes varying from about 10^4 km down to a few hundred km. Our concept is based on a single optical channel, fine-tuned to cover all scientific objectives from low- to high-resolution imaging. A catoptric telescope with excellent optical quality is coupled with a rectangular detector, avoiding any scanning mechanism. In this paper the present JANUS design and its foreseen scientific capabilities are discussed.

Della Corte, Vincenzo; Schmitz, Nicole; Zusi, Michele; Castro, José Maria; Leese, Mark; Debei, Stefano; Magrin, Demetrio; Michalik, Harald; Palumbo, Pasquale; Jaumann, Ralf; Cremonese, Gabriele; Hoffmann, Harald; Holland, Andrew; Lara, Luisa Maria; Fiethe, Björn; Friso, Enrico; Greggio, Davide; Herranz, Miguel; Koncz, Alexander; Lichopoj, Alexander; Martinez-Navajas, Ignacio; Mazzotta Epifani, Elena; Michaelis, Harald; Ragazzoni, Roberto; Roatsch, Thomas; Rodrigo, Julio; Rodriguez, Emilio; Schipani, Pietro; Soman, Matthew; Zaccariotto, Mirco

2014-08-01

342

4 Vesta in Color: High Resolution Mapping from Dawn Framing Camera Images  

NASA Technical Reports Server (NTRS)

Rotational surface variations on asteroid 4 Vesta have been known from ground-based and HST observations, and they have been interpreted as evidence of compositional diversity. NASA's Dawn mission entered orbit around Vesta on July 16, 2011 for a year-long global characterization. The framing cameras (FC) onboard the Dawn spacecraft will image the asteroid in one clear (broad) and seven narrow-band filters covering the wavelength range between 0.4-1.0 microns. We present color mapping results from the Dawn FC observations of Vesta obtained during Survey orbit (approx. 3000 km) and the High-Altitude Mapping Orbit (HAMO) (approx. 950 km). Our aim is to create global color maps of Vesta using multispectral FC images to identify the spatial extent of compositional units and link them with other available data sets to extract the basic mineralogy. While the VIR spectrometer onboard Dawn has higher spectral resolution (864 channels), allowing precise mineralogical assessment of Vesta's surface, the FC has three times higher spatial resolution in any given orbital phase. In an effort to extract maximum information from FC data we have developed algorithms using laboratory spectra of pyroxenes and HED meteorites to derive parameters associated with the 1-micron absorption band wing. These parameters will help map the global distribution of compositionally related units on Vesta's surface. Interpretation of these units will involve the integration of FC and VIR data.
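The band-wing parameters mentioned above can be illustrated with the standard continuum-removed band-depth calculation. This is a generic sketch, not the authors' algorithm; the shoulder and center wavelengths passed to it are illustrative, not the actual FC filter set.

```python
def band_depth(r_shoulder1, r_center, r_shoulder2,
               w_shoulder1, w_center, w_shoulder2):
    """Band depth at w_center relative to a straight-line continuum
    drawn between the two band shoulders: depth = 1 - R_c / R_continuum."""
    t = (w_center - w_shoulder1) / (w_shoulder2 - w_shoulder1)
    continuum = r_shoulder1 + t * (r_shoulder2 - r_shoulder1)
    return 1.0 - r_center / continuum
```

Applied per pixel across a multispectral cube, such a parameter maps the relative strength of the pyroxene 1-micron absorption.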

Reddy, V.; LeCorre, L.; Nathues, A.; Sierks, H.; Christensen, U.; Hoffmann, M.; Schroeder, S. E.; Vincent, J. B.; McSween, H. Y.; Denevi, B. W.; Li, J.-Y.; Pieters, C. M.; Gaffey, M.; Mittlefehldt, D.; Buratti, B.; Hicks, M.; McCord, T.; Combe, J.-P.; DeSantis, M. C.; Russell, C. T.; Raymond, C. A.; Marques, P. Gutierrez; Maue, T.; Hall, I.

2011-01-01

343

Imaging the anisotropic elastic properties of paper with the INEEL laser ultrasonic camera  

NASA Astrophysics Data System (ADS)

An important material property in the paper industry is the anisotropic stiffness distribution due to the fibrous microstructure of paper. Ultrasonic methods offer a means of determining the stiffness of sheets of paper from the anisotropic propagation characteristics of elastic Lamb waves along the machine direction (MD) and the cross direction (CD). Currently, piezoelectric ultrasonic methods are employed in the industry to measure the elastic polar diagram of paper through multiple contacting measurements made in all directions. This paper describes a new approach utilizing the INEEL Laser Ultrasonic Camera to provide a complete image of the elastic waves traveling in all directions in the plane of the paper sheet. This approach is based on optical dynamic holographic methods that record the out of plane ultrasonic motion over the entire paper surface simultaneously without scanning. The full-field imaging technique offers great potential for increasing the speed of the measurement and it ultimately provides a great deal of information concerning local property variations and flaws in the paper. This report shows the success of the method and the manner in which it yields the elastic polar diagram for the paper from the dispersive flexural or antisymmetric Lamb wave.

Deason, V. A.; Telschow, K. L.; Schley, R. S.; Watson, S. M.

2000-05-01

344

Fire service and first responder thermal imaging camera (TIC) advances and standards  

NASA Astrophysics Data System (ADS)

Fire Service and First Responder Thermal Imaging Camera (TIC) applications are growing, saving lives and preventing injury and property damage. Firefighters face a wide range of serious hazards. TICs help mitigate the risks by protecting Firefighters and preventing injury, while reducing time spent fighting the fire and resources needed to do so. Most fire safety equipment is covered by performance standards. Fire TICs, however, are not covered by such standards and are also subject to inadequate operational performance and insufficient user training. Meanwhile, advancements in Fire TICs and lower costs are driving product demand. The need for a Fire TIC Standard was spurred in late 2004 through a Government sponsored Workshop where experts from the First Responder community, component manufacturers, firefighter training, and those doing research on TICs discussed strategies, technologies, procedures, best practices and R&D that could improve Fire TICs. The workshop identified pressing image quality, performance metrics, and standards issues. Durability and ruggedness metrics and standard testing methods were also seen as important, as was TIC training and certification of end-users. A progress report on several efforts in these areas and their impact on the IR sensor industry will be given. This paper is a follow up to the SPIE Orlando 2004 paper on Fire TIC usage (entitled Emergency Responders' Critical Infrared) which explored the technological development of this IR industry segment from the viewpoint of the end user, in light of the studies and reports that had established TICs as a mission critical tool for firefighters.

Konsin, Lawrence S.; Nixdorff, Stuart

2007-04-01

345

Image Fusion Using an Integrated, Dual-Head Coincidence Camera with X-Ray Tube-Based Attenuation Maps  

Microsoft Academic Search

The purpose of this study was to characterize a dual-head gamma camera capable of FDG imaging using coincidence detection and equipped with an integrated x-ray transmission system for attenuation correction, anatomic mapping, and image fusion. Methods: Radiation dose (425 mrads skin dose) and tissue contrast (0.7% deviation from expected values) were assessed for the x-ray system. Registration of transmission and

James A. Patton; Dominique Delbeke; Martin P. Sandier

346

Development of a pixelated GSO gamma camera system with tungsten parallel hole collimator for single photon imaging  

SciTech Connect

Purpose: In small animal imaging using a single photon emitting radionuclide, a high resolution gamma camera is required. Recently, position sensitive photomultiplier tubes (PSPMTs) with high quantum efficiency have been developed. By combining these with nonhygroscopic scintillators with a relatively low light output, a high resolution gamma camera can become useful for low energy gamma photons. Therefore, the authors developed a gamma camera by combining a pixelated Ce-doped Gd2SiO5 (GSO) block with a high quantum efficiency PSPMT. Methods: GSO was selected for the scintillator, because it is not hygroscopic and does not contain any natural radioactivity. An array of 1.9 mm × 1.9 mm × 7 mm individual GSO crystal elements was constructed. These GSOs were combined with a 0.1-mm thick reflector to form a 22 × 22 matrix and optically coupled to a high quantum efficiency PSPMT (H8500C-100 MOD8). The GSO gamma camera was encased in a tungsten gamma-ray shield with tungsten pixelated parallel hole collimator, and the basic performance was measured for Co-57 gamma photons (122 keV). Results: In a two-dimensional position histogram, all pixels were clearly resolved. The energy resolution was ~15% FWHM. With the 20-mm thick tungsten pixelated collimator, the spatial resolution was 4.4-mm FWHM 40 mm from the collimator surface, and the sensitivity was ~0.05%. Phantom and small animal images were successfully obtained with our developed gamma camera. Conclusions: These results confirmed that the developed pixelated GSO gamma camera has potential as an effective instrument for low energy gamma photon imaging.

Yamamoto, S.; Watabe, H.; Kanai, Y.; Shimosegawa, E.; Hatazawa, J. [Kobe City College of Technology, 8-3 Gakuen-Higashi-machi, Nishi-ku, Kobe 651-2194 (Japan); Department of Molecular Imaging in Medicine, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan); Department of Nuclear Medicine and Tracer Kinetics, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan); Department of Molecular Imaging in Medicine, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan) and Department of Nuclear Medicine and Tracer Kinetics, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan)

2012-02-15

347

Educational Applications for Digital Cameras.  

ERIC Educational Resources Information Center

Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

Cavanaugh, Terence; Cavanaugh, Catherine

1997-01-01

348

Streak camera time calibration procedures  

NASA Technical Reports Server (NTRS)

Time calibration procedures for streak cameras utilizing a modulated laser beam are described. The time calibration determines a writing rate accuracy of 0.15% with a rotating mirror camera and 0.3% with an image converter camera.

Long, J.; Jackson, I.

1978-01-01

349

Improved Digitization of Lunar Mare Ridges with LROC Derived Products  

NASA Astrophysics Data System (ADS)

Lunar wrinkle ridges (mare ridges) are positive-relief structures formed from compressional stress in basin-filling flood basalt deposits [1]. Previous workers have measured wrinkle ridge orientations and lengths to investigate their spatial distribution and infer basin-localized stress fields [2,3]. Although these plots include the most prominent mare ridges and their general trends, they may not have fully captured all of the ridges, particularly the smaller-scale ridges. Using Lunar Reconnaissance Orbiter Wide Angle Camera (WAC) global mosaics and derived topography (100m pixel scale) [4], we systematically remapped wrinkle ridges in Mare Serenitatis. By comparing two WAC mosaics with different lighting geometry, and shaded relief maps made from a WAC digital elevation model (DEM) [5], we observed that some ridge segments and some smaller ridges are not visible in previous structure maps [2,3]. In the past, mapping efforts were limited by a fixed Sun direction [6,7]. For systematic mapping we created three shaded relief maps from the WAC DEM with solar azimuth angles of 0°, 45°, and 90°, and a fourth map was created by combining the three shaded reliefs into one, using a simple averaging scheme. Along with the original WAC mosaic and the WAC DEM, these four datasets were imported into ArcGIS, and the mare ridges of Imbrium, Serenitatis, and Tranquillitatis were digitized from each of the six maps. Since the mare ridges are often divided into many ridge segments [8], each major component was digitized separately, as opposed to the ridge as a whole. This strategy enhanced our ability to analyze the lengths, orientations, and abundances of these ridges. After the initial mapping was completed, the six products were viewed together to identify and resolve discrepancies in order to produce a final wrinkle ridge map. Comparing this new mare ridge map with past lunar tectonic maps, we found that many mare ridges were not recorded in the previous works. 
It was noted that in some cases the lengths and orientations of previously digitized ridges differed from those of the ridges digitized in this study. This method of multi-map digitizing allows for greater accuracy in the spatial characterization of mare ridges than previous methods. We intend to map mare ridges on a global scale, creating a more comprehensive ridge map enabled by the higher resolution of these datasets. References Cited: [1] Schultz P.H. (1976) Moon Morphology, 308. [2] Wilhelms D.E. (1987) USGS Prof. Paper 1348, 5A-B. [3] Carr, M.H. (1966) USGS Geologic Atlas of the Moon, I-498. [4] Robinson M.S. (2010) Space Sci. Rev., 150:82. [5] Scholten F. et al. (2011) LPSC XLII, 2046. [6] Fielder G. and Kiang T. (1962) The Observatory: No. 926, 8. [7] Watters T.R. and Konopliv A.S. (2001) Planetary and Space Sci. 49, 743-748. [8] Aubele J.C. (1988) LPSC XIX, 19.
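The multi-azimuth shaded-relief scheme can be sketched with a standard Lambertian hillshade. The 100 m pixel scale comes from the WAC products described above; the altitude angle and the equal-weight averaging are assumptions consistent with the "simple averaging scheme" mentioned, not the authors' exact processing.

```python
import numpy as np

def hillshade(dem, azimuth_deg, altitude_deg=45.0, pixel_size=100.0):
    """Standard Lambertian hillshade of a DEM for one illumination
    direction (azimuth measured clockwise from north)."""
    dz_dy, dz_dx = np.gradient(dem, pixel_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dy, -dz_dx)
    az = np.radians(360.0 - azimuth_deg + 90.0)  # map azimuth to math angle
    zenith = np.radians(90.0 - altitude_deg)
    shaded = (np.cos(zenith) * np.cos(slope) +
              np.sin(zenith) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

def combined_relief(dem, azimuths=(0.0, 45.0, 90.0)):
    """Average hillshades from the three azimuths used in the mapping,
    so ridges of any trend remain visible in a single product."""
    return np.mean([hillshade(dem, a) for a in azimuths], axis=0)
```

Digitizing from each single-azimuth product and from the combined product, then reconciling, mirrors the workflow described above.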

Crowell, J. M.; Robinson, M. S.; Watters, T. R.; Bowman-Cisneros, E.; Enns, A. C.; Lawrence, S.

2011-12-01

350

Retrieval of sulfur dioxide from a ground-based thermal infrared imaging camera  

NASA Astrophysics Data System (ADS)

Recent advances in uncooled detector technology now offer the possibility of using relatively inexpensive thermal (7 to 14 μm) imaging devices as tools for studying and quantifying the behaviour of hazardous gases and particulates in atmospheric plumes. An experimental fast-sampling (60 Hz) ground-based uncooled thermal imager (Cyclops), operating with four spectral channels at central wavelengths of 8.6, 10, 11 and 12 μm and one broadband channel (7-14 μm) has been tested at several volcanoes and at an industrial site, where SO2 was a major constituent of the plumes. This paper presents new algorithms, which include atmospheric corrections to the data and better calibrations, to show that SO2 slant column density can be reliably detected and quantified. Our results indicate that it is relatively easy to identify and discriminate SO2 in plumes, but more challenging to quantify the column densities. A full description of the retrieval algorithms, illustrative results and a detailed error analysis are provided. The noise-equivalent temperature difference (NEΔT) of the spectral channels, a fundamental measure of the quality of the measurements, lies between 0.4 and 0.8 K, resulting in slant column density errors of 20%. Frame averaging and improved NEΔT's can reduce this error to less than 10%, making stand-off, day or night operation of an instrument of this type very practical both for monitoring industrial SO2 emissions and for SO2 column density and emission measurements at active volcanoes. The imaging camera system may also be used to study thermal radiation from meteorological clouds and the atmosphere.

Prata, A. J.; Bernardo, C.

2014-09-01

351

Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis  

NASA Astrophysics Data System (ADS)

Quantitative and speedy characterization of volcanic ash particles is needed for petrologic monitoring of an ongoing eruption. We develop a new simple system using CCD camera images for quantitatively characterizing ash properties, and apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the shape of the particles. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles that are set on a glass plate, under which white paper or a polarizing plate is placed. Images of 1390 × 1080 pixels are taken through three color filters (red, green and blue) under incident light and under light transmitted through the polarizing plate. The brightness of the light sources is held constant, and luminance is calibrated against white and black papers. About fifteen ash particles are set on the plate at a time, and their images are saved in bitmap format. We first extract the outlines of particles from the image taken under light transmitted through the polarizing plate. Then, the luminance of each color is represented by 256 tones at each pixel within a particle, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of the ash particles. We perform box counting with boxes of 1×1 and 128×128 pixels that cover the area of the ash particle, and the qfd is estimated as the ratio of the former count to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 09, 2009 and Jan 13, 2010), and apply cluster analyses. 
Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf=R/(R+G+B), G=G/(R+G+B), B=B/(R+G+B), and total luminance=(R+G+B)/665. We classify the volcanic ash particles from the dendrograms into three groups based on Euclidean distance. The groups are named Group A, B and C in order of increasing average total luminance. The classification shows that the numbers of particles belonging to Groups A, B and C are 77, 25 and 6 in the Feb 09, 2009 sample, and 102, 19 and 6 in the Jan 13, 2010 sample, respectively. Examination under the stereoscopic microscope suggests that Groups A, B and C mainly correspond to juvenile, altered and free-crystal particles, respectively, so the classification obtained with the present method demonstrates a difference in the contribution of juvenile material between the two days. To evaluate the reliability of our classification, we classify pseudo-samples in which errors of 10% are added to the measured parameters. Applying our method to one thousand pseudo-samples, we find that the numbers of particles classified into the three groups vary by less than 20% of the total of 235 particles. Our system can classify 120 particles within 6 minutes, so we can easily increase the number of ash particles analyzed, which improves the reliability and resolution of the classification and allows us to speedily capture temporal changes in the properties of ash particles from active volcanoes.
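The two-scale box-counting measure behind the qfd can be sketched directly from the description above (occupied 1×1 boxes versus occupied 128×128 boxes). This is an illustrative reimplementation, not the authors' R code.

```python
import numpy as np

def box_count(mask, box):
    """Number of box-by-box cells that contain at least one particle pixel."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, box):
        for j in range(0, w, box):
            if mask[i:i + box, j:j + box].any():
                count += 1
    return count

def quasi_fractal_dimension(mask, small=1, large=128):
    """Ratio of occupied small boxes to occupied large boxes, following
    the two-scale box-counting measure described in the abstract."""
    return box_count(mask, small) / box_count(mask, large)
```

For a solid, compact outline the ratio approaches the area in pixels; ragged, convoluted outlines give systematically different values, which is what makes it useful as a shape parameter.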

Miwa, T.; Shimano, T.; Nishimura, T.

2012-12-01

352

Active hyperspectral imaging using a quantum cascade laser (QCL) array and digital-pixel focal plane array (DFPA) camera.  

PubMed

We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s. PMID:24977536
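The background subtraction and reference normalization can be written out explicitly. A minimal sketch, assuming per-wavelength frame stacks and a diffuse reference measured under the same illumination; the DFPA performs the subtraction on-chip, whereas here it is explicit, and the names and array shapes are illustrative.

```python
import numpy as np

def spectral_reflectance(sample_active, sample_passive,
                         reference_active, reference_passive):
    """Per-wavelength reflectance of a sample relative to a diffuse
    reference, with the passive thermal background subtracted from both.
    Inputs are stacks of shape (n_wavelengths, ny, nx)."""
    sample = sample_active - sample_passive
    reference = reference_active - reference_passive
    # Guard against division by near-zero reference signal
    return sample / np.maximum(reference, 1e-12)
```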

Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico

2014-06-16

353

Removing cosmic-ray hits from multiorbit HST Wide Field Camera images  

NASA Technical Reports Server (NTRS)

We present an optimized algorithm that removes cosmic rays ('CRs') from multiorbit Hubble Space Telescope (HST) Wide Field/Planetary Camera ('WF/PC') images. It computes the image noise in every iteration from the WF/PC CCD equation, which includes all known sources of random and systematic calibration error. We test this algorithm on WF/PC stacks of 2-12 orbits as a function of the number of available orbits and the formal Poissonian sigma-clipping level. We find that the algorithm needs >=4 WF/PC exposures to locate the minimal sky signal (which is noticeably affected by CRs), with an optimal clipping level at 2-2.5 × σ_Poisson. We analyze the CR flux detected on multiorbit 'CR stacks,' which are constructed by subtracting the best CR-filtered images from the unfiltered 8-12 orbit average. We use an automated object finder to determine the surface density of CRs as a function of the apparent magnitude (or ADU flux) they would have generated in the images had they not been removed. The power-law slope of the CR 'counts' (γ ≈ 0.6 for N(m) ∝ m^γ) is steeper than that of the faint galaxy counts down to V ≈ 28 mag. The CR counts show a drop-off between 28 ≲ V ≲ 30 mag (the latter is our formal 2σ point-source sensitivity without spherical aberration). This prevents the CR sky integral from diverging, and is likely due to a real cutoff in the CR energy distribution below ~11 ADU per orbit. The integral CR surface density is ≲10^8 per sq. deg, and their sky signal is V ≈ 25.5-27.0 mag/sq. arcsec, or 3%-13% of our NEP sky background (V = 23.3 mag/sq. arcsec), and well above the EBL integral of the deepest galaxy counts (B_J ≈ 28.0 mag/sq. arcsec). We conclude that faint CRs will always contribute to the sky signal in the deepest WF/PC images. Since WFPC2 has ~2.7× lower read noise and a thicker CCD, this will result in more CR detections than in WF/PC, potentially affecting ~10%-20% of the pixels in multiorbit WFPC2 data cubes.
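The iterative noise-model clipping can be sketched as follows. The read-noise and gain values are placeholders rather than actual WF/PC calibration constants, and the median starting estimate is an implementation choice; this illustrates the approach, not the authors' optimized algorithm.

```python
import numpy as np

def clip_cosmic_rays(stack, nsigma=2.5, read_noise=13.0, gain=7.5, iters=4):
    """Iteratively reject CR-hit pixels from a stack of aligned exposures
    using a CCD noise model, then average the surviving values.
    `stack` has shape (n_exposures, ny, nx); read_noise is in DN."""
    stack = stack.astype(float)
    est = np.median(stack, axis=0)              # robust starting estimate
    for _ in range(iters):
        # CCD equation (simplified): variance = signal/gain + read_noise^2
        sigma = np.sqrt(np.maximum(est, 0.0) / gain + read_noise ** 2)
        good = np.abs(stack - est) <= nsigma * sigma
        n = np.maximum(good.sum(axis=0), 1)     # avoid division by zero
        est = np.sum(np.where(good, stack, 0.0), axis=0) / n
    return est
```

With four or more exposures, a pixel hit by a CR in one orbit is flagged as a high outlier against the noise-model sigma and excluded from the average, consistent with the >=4-exposure requirement found above.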

Windhorst, Rogier A.; Franklin, Barbara E.; Neuschaefer, Lyman W.

1994-01-01

354

Digital Electronic Still Camera  

NASA Technical Reports Server (NTRS)

Digital electronic still camera part of electronic recording, processing, tansmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

Holland, Samuel D.; Yeates, Herbert D.

1993-01-01

355

A double photomultiplier Compton camera and its readout system for mice imaging  

SciTech Connect

We have designed a Compton Camera (CC) to image the bio-distribution of gamma-emitting radiopharmaceuticals in mice. A CC employs 'electronic collimation', i.e. a technique that traces the gamma-rays instead of selecting them with physical lead or tungsten collimators. To perform this task, a CC measures the parameters of the Compton interaction that occurs in the device itself. At least two detectors are required: one (the tracker), where the primary gamma undergoes a Compton interaction, and a second one (the calorimeter), in which the scattered gamma is completely absorbed. From these measurements the polar angle, and hence a 'cone' of possible incident directions, is obtained (an event with 'incomplete geometry'). Different solutions for the two detectors have been proposed in the literature; our design foresees two similar position-sensitive photomultipliers (PMT, Hamamatsu H8500). Each PMT has 64 output channels that are reduced to 4 using a charge-multiplexed readout system, i.e. a series charge-multiplexing network of resistors. Triggering of the system is provided by the coincidence of fast signals extracted at the last dynodes of the PMTs. Advantages are the low cost and the simplicity of design and operation, since just one type of device is used; drawbacks include a lower resolution with respect to more sophisticated trackers with full 64-channel readout. This paper compares our two-Hamamatsu CC design to other solutions and shows that its spatial and energy accuracy is suitable for the inspection of radioactivity in mice.
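Reconstructing the Compton cone from the two energy deposits uses only Compton kinematics. A minimal sketch with energies in keV; the function name and interface are illustrative, not the authors' reconstruction code.

```python
import math

ME_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle(e_tracker_kev, e_calorimeter_kev):
    """Opening angle (degrees) of the Compton cone from the energy E1
    deposited in the tracker and the fully absorbed scattered photon E2:
    cos(theta) = 1 - me*c^2 * (1/E2 - 1/(E1 + E2))."""
    e_total = e_tracker_kev + e_calorimeter_kev
    cos_theta = 1.0 - ME_C2_KEV * (1.0 / e_calorimeter_kev - 1.0 / e_total)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("kinematically impossible energy pair")
    return math.degrees(math.acos(cos_theta))
```

Each coincidence event thus yields a cone of possible incidence directions; intersecting many cones reconstructs the source distribution.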

Fontana, Cristiano Lino [Physics Department Galileo Galilei, University of Padua, Via Marzolo 8, Padova 35131 (Italy) and INFN Padova, Via Marzolo 8, Padova 35131 (Italy); Atroshchenko, Kostiantyn [Physics Department Galileo Galilei, University of Padua, Via Marzolo 8, Padova 35131 (Italy) and INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Baldazzi, Giuseppe [Physics Department, University of Bologna, Viale Berti Pichat 6/2, Bologna 40127, Italy and INFN Bologna, Viale Berti Pichat 6/2, Bologna 40127 (Italy); Bello, Michele [INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Uzunov, Nikolay [Department of Natural Sciences, Shumen University, 115 Universitetska str., Shumen 9712, Bulgaria and INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Di Domenico, Giovanni [Physics Department, University of Ferrara, Via Saragat 1, Ferrara 44122 (Italy) and INFN Ferrara, Via Saragat 1, Ferrara 44122 (Italy)

2013-04-19

356

Camera Calibration Using Multiple Images Paul Beardsley, David Murray, Andrew Zisserman *  

E-print Network

which is a conic section, and the vanishing line generates an envelope which is also a conic section. We describe how such conics can be used to determine the camera's focal length and the principal point.

Murray, David

357

A compact, discrete CsI(Tl) scintillator/Si photodiode gamma camera for breast cancer imaging  

SciTech Connect

Recent clinical evaluations of scintimammography (radionuclide breast imaging) are promising and suggest that this modality may prove a valuable complement to X-ray mammography and traditional breast cancer detection and diagnosis techniques. Scintimammography, however, typically has difficulty revealing tumors that are less than 1 cm in diameter, are located in the medial part of the breast, or are located in the axillary nodes. These shortcomings may in part be due to the use of large, conventional Anger cameras not optimized for breast imaging. In this thesis I present compact single photon camera technology designed specifically for scintimammography which strives to alleviate some of these limitations by allowing better and closer access to sites of possible breast tumors. Specific applications are outlined. The design is modular, thus a camera of the desired size and geometry can be constructed from an array (or arrays) of individual modules and a parallel hole lead collimator for directional information. Each module consists of: (1) an array of 64 discrete, optically-isolated CsI(Tl) scintillator crystals 3 × 3 × 5 mm³ in size, (2) an array of 64 low-noise Si PIN photodiodes matched 1-to-1 to the scintillator crystals, (3) an application-specific integrated circuit (ASIC) that amplifies the 64 photodiode signals and selects the signal with the largest amplitude, and (4) connectors and hardware for interfacing the module with a motherboard, thereby allowing straightforward computer control of all individual modules within a camera.
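The ASIC's winner-take-all selection (item 3 above) can be mimicked in software: pick the channel with the largest amplitude and map its index to a crystal position. The row-major 8×8 channel-to-crystal mapping here is an assumption for illustration, not the module's documented layout:

```python
def locate_event(amplitudes, ncols=8):
    """Return (row, col) of the crystal whose channel saw the largest signal.

    amplitudes: 64 samples, one per CsI(Tl)/photodiode channel, assumed
    to be in row-major order over the 8x8 crystal array.
    """
    idx = max(range(len(amplitudes)), key=lambda i: amplitudes[i])
    return divmod(idx, ncols)

# channel 27 firing strongest maps to crystal row 3, column 3
amps = [0.0] * 64
amps[27] = 1.0
print(locate_event(amps))
```

Because each crystal is optically isolated and matched one-to-one to a photodiode, a single maximum is normally enough to localize the interaction to one 3 × 3 mm² crystal face.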

Gruber, Gregory J.

2000-12-01

358

Automated Camera Calibration  

NASA Technical Reports Server (NTRS)

Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
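The 3D-to-2D correspondences that ACAL produces are what a camera-model fit consumes. A standard way to solve a linear pinhole model from such data is the Direct Linear Transform, sketched below (a generic textbook method, not necessarily the model ACAL calibrates; coordinate normalization is omitted for brevity):

```python
import numpy as np

def dlt_camera_matrix(pts3d, pts2d):
    """Estimate a 3x4 pinhole projection matrix P from known 3D-to-2D
    point correspondences (needs >= 6 non-degenerate points)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Solution: right singular vector of A with the smallest singular
    # value (the null space of A for exact correspondences).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, pt3d):
    """Apply P to a 3D point and dehomogenize to pixel coordinates."""
    x = P @ np.append(np.asarray(pt3d, dtype=float), 1.0)
    return x[:2] / x[2]
```

In practice a linear fit like this seeds a nonlinear refinement that also models lens distortion, which is where accurate, plentiful fiducial measurements of the kind ACAL automates pay off.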

Chen, Siqi; Cheng, Yang; Willson, Reg

2006-01-01

359

Variations of zonal wind speed at Venus cloud tops from Venus Monitoring Camera UV images  

NASA Astrophysics Data System (ADS)

7 years of continuous monitoring of Venus by ESA's Venus Express provided an opportunity to study the dynamics of the atmosphere of Venus. The Venus Monitoring Camera (VMC) [1] delivered the longest and most complete set of UV images so far for studying the cloud-level circulation by tracking the motion of cloud features. We analyzed 130 orbits with manual cloud tracking and 600 orbits with a digital correlation method. Here we present the latest update of our results. The total number of wind vectors derived in this work is approximately half a million. During the Venus Express observations the mean zonal speed was in the range of 85-110 m/s. VMC observations indicated a long-term trend for the zonal wind speed at low latitudes to increase. The origin of this low-frequency trend, with a period of about 3000 days, is unclear. Fourier analysis [2-3] revealed quasi-periodicities in the zonal circulation at low latitudes. Two groups of periods were found. The first group is close to the period of superrotation at low latitudes (4.83±0.1 days), with periods of 4.1-5.1 days and amplitudes ranging from ±4.2 to ±17.4 m/s. The amplitude and phase of the oscillations depend on latitude and vary in time, while preserving stable oscillation parameters for at least 70 days. These short-term oscillations may be caused by wave processes in the mesosphere of Venus at the cloud-top level; the wave number of the observed oscillations is 1. The second group consists of long-term periods caused by the orbital motion of Venus (116 days, 224 days) and is related to the periodicity of the VMC observations. VMC UV observations also showed a clear diurnal pattern of the mean circulation. The zonal wind demonstrated semi-diurnal variations, with minimum speed close to noon (11-14 h) and maxima in the morning (8-9 h) and evening (16-17 h). The meridional component clearly peaks in the early afternoon (13-15 h) at latitudes near 50°S.
The minimum of the meridional wind is located at low latitudes in the morning (8-11 h). References: [1] Markiewicz W. J. et al.: Venus Monitoring Camera for Venus Express. Planet. Space Sci., V.55(12), pp. 1701-1711, doi:10.1016/j.pss.2007.01.004, 2007. [2] Deeming T.J.: Fourier analysis with unequally-spaced data. Astrophys. and Sp. Sci., V.36, pp. 137-158, 1975. [3] Terebizh, V.Yu.: Time series analysis in astrophysics. Moscow: Nauka, 1992. In Russian.

Khatuntsev, Igor; Patsaeva, Marina; Ignatiev, Nikolai; Titov, Dmitri; Markiewicz, Wojciech J.

2013-04-01

360

Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions  

USGS Publications Warehouse

This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ≈10³ features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ≈3 × 10⁵ closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used. Copyright 1999 by the American Geophysical Union.

Kirk, R.L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E.M.; Gaddis, L.R.; Johnson, J.R.; Soderblom, L.A.; Ward, A.W.; Smith, P.H.; Britt, D.T.

1999-01-01

361

Dual-view microscopy with a single camera: real-time imaging of molecular orientations and calcium  

PubMed Central

A new microscope technique, termed "W" (double view video) microscopy, enables simultaneous observation of two different images of an object through a single video camera or by eye. The image pair may, for example, be transmission and fluorescence, fluorescence at different wavelengths, or mutually perpendicular components of polarized fluorescence. Any video microscope can be converted into a dual imager by simple insertion of a small optical device. The continuous appearance of the dual image assures the best time resolution in existing and future video microscopes. As an application, orientations of actin protomers in individual, moving actin filaments have been imaged at the video rate. Asymmetric calcium influxes into a cell exposed to an intense electric pulse have also been visualized. PMID:1918140

1991-01-01

362

Observation of asymmetrically imploded core plasmas with a two-dimensional sampling image x-ray streak camera.  

PubMed

A shell target with a cone for guiding the heating beam has been proposed for the fast ignition scheme. The implosion of such a target is no longer symmetric because of the cone. A fast two-dimensional x-ray imaging technique, the two-dimensional (2D) sampling-image x-ray streak camera, was applied for the first time to observe the dynamics of the implosion and the core plasma. The x-ray emission image of the plasma was sampled at two-dimensionally distributed image sampling points, streaked with the streak tube, and the recorded signals were reconstructed as sequential 2D frame images. The shape and movement of the core plasma were clearly observed. PMID:19044575

Shiraga, Hiroyuki; Lee, Myongdok; Mahigashi, Norimitsu; Fujioka, Shinsuke; Azechi, Hiroshi

2008-10-01

363

Trajectory Reconstruction of CE-3 Soft Landing by Landing Camera Images  

NASA Astrophysics Data System (ADS)

In this paper, a method of trajectory reconstruction by space resection of the landing camera images is proposed. The obstacle-avoidance phase at an altitude of about 100 meters is studied in detail, and some useful results are given.

Liu, J. J.; Yan, W.; Li, C. L.; Ren, X.

2014-06-01

364

Non-photorealistic camera: depth edge detection and stylized rendering using multi-flash imaging  

Microsoft Academic Search

We present a non-photorealistic rendering approach to capture and convey shape features of real-world scenes. We use a camera with multiple flashes that are strategically positioned to cast shadows along depth discontinuities in the scene. The projective-geometric relationship of the camera-flash setup is then exploited to detect depth discontinuities and distinguish them from intensity edges due to material discontinuities. We introduce

Ramesh Raskar; Kar-Han Tan; Rogerio Feris; Jingyi Yu; Matthew Turk

2004-01-01

365

High-frame-rate CCD cameras with fast optical shutters for military and medical imaging applications  

Microsoft Academic Search

Los Alamos National Laboratory (LANL) has designed and prototyped high-frame-rate intensified/shuttered charge-coupled-device (CCD) cameras capable of operating at kilohertz frame rates (non-interlaced mode) with optical shutters capable of acquiring nanosecond-to-microsecond exposures each frame. These cameras utilize an Interline Transfer CCD, Loral Fairchild CCD-222 with 244 (vertical) × 380 (horizontal) pixels operated at pixel rates approaching 100 MHz. Initial

Nicholas S. King; Kevin L. Albright; Steven A. Jaramillo; Thomas E. McDonald; George J. Yates; Bojan T. Turko

1994-01-01

366

UVUDF: Ultraviolet Imaging of the Hubble Ultra Deep Field with Wide-Field Camera 3  

NASA Astrophysics Data System (ADS)

We present an overview of a 90 orbit Hubble Space Telescope treasury program to obtain near-ultraviolet imaging of the Hubble Ultra Deep Field using the Wide Field Camera 3 UVIS detector with the F225W, F275W, and F336W filters. This survey is designed to: (1) investigate the episode of peak star formation activity in galaxies at 1 < z < 2.5; (2) probe the evolution of massive galaxies by resolving sub-galactic units (clumps); (3) examine the escape fraction of ionizing radiation from galaxies at z ~ 2-3; (4) greatly improve the reliability of photometric redshift estimates; and (5) measure the star formation rate efficiency of neutral atomic-dominated hydrogen gas at z ~ 1-3. In this overview paper, we describe the survey details and data reduction challenges, including both the necessity of specialized calibrations and the effects of charge transfer inefficiency. We provide a stark demonstration of the effects of charge transfer inefficiency on resultant data products, which when uncorrected, result in uncertain photometry, elongation of morphology in the readout direction, and loss of faint sources far from the readout. We agree with the STScI recommendation that future UVIS observations that require very sensitive measurements use the instrument's capability to add background light through a "post-flash." Preliminary results on number counts of UV-selected galaxies and morphology of galaxies at z ~ 1 are presented. We find that the number density of UV dropouts at redshifts 1.7, 2.1, and 2.7 is largely consistent with the number predicted by published luminosity functions. We also confirm that the image mosaics have sufficient sensitivity and resolution to support the analysis of the evolution of star-forming clumps, reaching 28-29th magnitude depth at 5σ in a 0.″2 radius aperture depending on filter and observing epoch. 
Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with program #12534.

Teplitz, Harry I.; Rafelski, Marc; Kurczynski, Peter; Bond, Nicholas A.; Grogin, Norman; Koekemoer, Anton M.; Atek, Hakim; Brown, Thomas M.; Coe, Dan; Colbert, James W.; Ferguson, Henry C.; Finkelstein, Steven L.; Gardner, Jonathan P.; Gawiser, Eric; Giavalisco, Mauro; Gronwall, Caryl; Hanish, Daniel J.; Lee, Kyoung-Soo; de Mello, Duilia F.; Ravindranath, Swara; Ryan, Russell E.; Siana, Brian D.; Scarlata, Claudia; Soto, Emmaris; Voyer, Elysse N.; Wolfe, Arthur M.

2013-12-01

367

Camera Projector  

NSDL National Science Digital Library

In this activity (posted on March 14, 2011), learners follow the steps to construct a camera projector to explore lenses and refraction. First, learners use relatively simple materials to construct the projector. Then, learners discover that lenses project images upside down and backwards. They explore this phenomenon by creating their own slides (which must be drawn upside down and backwards to appear normal). Use this activity also to introduce learners to spherical aberration and chromatic aberration.

Center, Oakland D.

2011-01-01

368

Estimation of the energy release and thermal properties of ejected clasts from explosive eruptions using a thermal imaging camera  

NASA Astrophysics Data System (ADS)

Thermal images were obtained at Popocatépetl, central Mexico, during the period of high lava-dome destruction activity between 1998 and 2002. Similarly, thermal cameras have operated at Colima volcano, western Mexico, during episodes of similar explosive activity in 2005 and 2007. We have developed a method to estimate the relative thermal energy release among explosions, and the degree of conversion into mechanical energy spent in the fragmentation of the ejecta, based on the cooling rate inferred from successive thermal images obtained immediately after each explosion. The thermal imaging cameras were located at about 11 km from the crater at Popocatépetl, and at about 6 km from the crater at Colima. The selected explosions threw significant amounts of hot debris onto the volcano flanks. The cooling rate was then measured on selected pixels of the thermal images, and compared with different possible distributions of fragment sizes considering weighted averages of fragments in the pixels. The optimal fitting of fragment distributions reveals the degree of fragmentation of individual explosions and, along with a model for the cooling process, permits estimation of the relative thermal energy released over the area covered by the image. Additionally, the results indicate that the radiative thermal conductivity plays a significant role in the outer shell of the fragments, suggesting a mean free path of thermal infrared photons that may reach several millimeters or even a few centimeters.

De la Cruz-Reyna, S.; Cárdenas-Sánchez, E.

2012-04-01

369

Multiformat video and laser cameras: history, design considerations, acceptance testing, and quality control. Report of AAPM Diagnostic X-Ray Imaging Committee Task Group No. 1.  

PubMed

Acceptance testing and quality control of video and laser cameras is relatively simple, especially with the use of the SMPTE test pattern. Photographic quality control is essential if one wishes to be able to maintain the quality of video and laser cameras. In addition, photographic quality control must be carried out with the film used clinically in the video and laser cameras, and with a sensitometer producing a light spectrum similar to that of the video or laser camera. Before the end of the warranty period a second acceptance test should be carried out. At this time the camera should produce the same results as noted during the initial acceptance test. With the appropriate acceptance and quality control the video and laser cameras should produce quality images throughout the life of the equipment. PMID:8497235

Gray, J E; Anderson, W F; Shaw, C C; Shepard, S J; Zeremba, L A; Lin, P J

1993-01-01

370

Design of a Compton camera for 3D prompt-γ imaging during ion beam therapy  

NASA Astrophysics Data System (ADS)

We investigate, by means of Geant4 simulations, a real-time method to control the position of the Bragg peak during ion therapy, based on a Compton camera in combination with a beam tagging device (hodoscope) in order to detect the prompt gamma emitted during nuclear fragmentation. The proposed set-up consists of a stack of 2 mm thick silicon strip detectors and a LYSO absorber detector. The γ emission points are reconstructed analytically by intersecting the ion trajectories given by the beam hodoscope and the Compton cones given by the camera. The camera response to a polychromatic point source in air is analyzed with regard to both spatial resolution and detection efficiency. Various geometrical configurations of the camera have been tested. In the proposed configuration, for a typical polychromatic photon point source, the spatial resolution of the camera is about 8.3 mm FWHM and the detection efficiency 2.5×10⁻⁴ (reconstructable photons/emitted photons in 4π). Finally, the clinical applicability of our system is considered and possible starting points for further developments of a prototype are discussed.
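The analytic reconstruction described above intersects each ion trajectory with a Compton cone; algebraically this reduces to a quadratic in the line parameter. A geometric sketch (not the authors' code; the nappe test assumes the axis points back toward possible sources):

```python
import numpy as np

def line_cone_intersections(o, d, apex, axis, theta):
    """Points where the beam line p(t) = o + t*d meets a Compton cone of
    half-angle theta with the given apex (first interaction point)."""
    o = np.asarray(o, float)
    d = np.asarray(d, float)
    apex = np.asarray(apex, float)
    axis = np.asarray(axis, float) / np.linalg.norm(np.asarray(axis, float))
    w = o - apex
    c2 = np.cos(theta) ** 2
    # ((w + t*d)·axis)^2 = cos^2(theta) |w + t*d|^2 is quadratic in t
    a = (d @ axis) ** 2 - c2 * (d @ d)
    b = 2.0 * ((d @ axis) * (w @ axis) - c2 * (d @ w))
    c = (w @ axis) ** 2 - c2 * (w @ w)
    disc = b * b - 4.0 * a * c
    if a == 0 or disc < 0:
        return []  # degenerate geometry or no intersection: skip event
    roots = [(-b + s * np.sqrt(disc)) / (2.0 * a) for s in (1.0, -1.0)]
    # keep only the nappe the scattered photon could have come from
    return [o + t * d for t in roots if ((o + t * d - apex) @ axis) >= 0.0]
```

With zero or two candidate points per event, accumulating many events and weighting by measurement uncertainty is what turns these intersections into an emission-point distribution along the beam.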

Roellinghoff, F.; Richard, M.-H.; Chevallier, M.; Constanzo, J.; Dauvergne, D.; Freud, N.; Henriquet, P.; Le Foulher, F.; Létang, J. M.; Montarou, G.; Ray, C.; Testa, E.; Testa, M.; Walenta, A. H.

2011-08-01

371

Calibration and Validation of Images from the Mars Reconnaissance Orbiter Mars Color Imager (MARCI) and Context Camera (CTX) Instruments  

Microsoft Academic Search

The MRO CTX instrument is a monochrome (611±189 nm), linear array CCD pushbroom camera with a nominal surface resolution of 6 m/pixel. The MARCI instrument is a 2-D CCD framing camera with 5 visible (420, 550, 600, 650, and 720 nm) and 2 UV (260 and 320 nm) filters, a 180° field of view, and a nominal resolution of about

Derek Schaeffer; J. F. Bell III; M. Malin; M. Caplinger; W. M. Calvin; B. Cantor; R. T. Clancy; R. M. Haberle; P. B. James; S. Lee; P. Thomas; M. J. Wolff

2006-01-01

372

Omnifocus video camera  

NASA Astrophysics Data System (ADS)

The omnifocus video camera takes videos in which objects at different distances are all in focus in a single video display. The omnifocus video camera consists of an array of color video cameras combined with a unique distance mapping camera called the Divcam. The color video cameras are all aimed at the same scene, but each is focused at a different distance. The Divcam provides real-time distance information for every pixel in the scene. A pixel selection utility uses the distance information to select individual pixels from the multiple video outputs focused at different distances, in order to generate the final single video display that is everywhere in focus. This paper presents the principle of operation, design considerations, detailed construction, and overall performance of the omnifocus video camera. The major emphasis of the paper is the proof of concept, but the prototype has been developed enough to demonstrate the superiority of this video camera over a conventional video camera. The resolution of the prototype is high, capturing even fine details such as fingerprints in the image. Just as the movie camera was a significant advance over the still camera, the omnifocus video camera represents a significant advance over all-focus cameras for still images.
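The pixel selection utility described above can be sketched in a few lines of NumPy: for each pixel, take the value from the camera whose fixed focus distance best matches the Divcam's per-pixel depth. The array shapes and the nearest-distance rule are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def omnifocus_composite(image_stack, focus_distances, depth_map):
    """Compose an everywhere-in-focus frame from fixed-focus frames.

    image_stack: (N, H, W) frames, one per fixed-focus camera.
    focus_distances: (N,) focus distance of each camera.
    depth_map: (H, W) per-pixel distance (the Divcam's role).
    """
    stack = np.asarray(image_stack, float)
    dists = np.asarray(focus_distances, float)
    depth = np.asarray(depth_map, float)
    # index of the best-focused camera for every pixel
    best = np.abs(depth[None, :, :] - dists[:, None, None]).argmin(axis=0)
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]
```

Because the selection is a per-pixel gather, it runs at video rate as long as the depth map arrives in real time, which is exactly what the Divcam provides.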

Iizuka, Keigo

2011-04-01

373

Detection of Road Markings Recorded in In-Vehicle Camera Images by Using Position-Dependent Classifiers  

NASA Astrophysics Data System (ADS)

In this study, we propose a method for detecting road markings recorded in an image captured by an in-vehicle camera by using a position-dependent classifier. Road markings are symbols painted on the road surface that help prevent traffic accidents and ensure smooth traffic flow. Driver support systems that detect road markings, such as a system that warns when a traffic sign is overlooked or that supports stopping the vehicle, are therefore required. It is difficult to detect road markings because their appearance changes with the actual traffic conditions, e.g. their shape and resolution change. These variations in appearance depend on the positional relation between the vehicle and the road markings, and on the vehicle posture. Although these variations are quite large over an entire image, they are relatively small within a local area of the image. Therefore, we try to improve the detection performance by taking the local variations in appearance into account. We propose a method in which a position-dependent classifier is used to detect road markings recorded in images captured by an in-vehicle camera. Further, to train the classifier efficiently, we propose a generative learning method that takes into consideration the positional relation between the vehicle and road markings, as well as the vehicle posture. Experimental results showed that the detection performance of the proposed method was better than that of a method using a single classifier.

Noda, Masafumi; Takahashi, Tomokazu; Deguchi, Daisuke; Ide, Ichiro; Murase, Hiroshi; Kojima, Yoshiko; Naito, Takashi

374

The effect of acquisition interval and spatial resolution on dynamic cardiac imaging with a stationary SPECT camera.  

PubMed

The current SPECT scanning paradigm that acquires images by slow rotation of multiple detectors in body-contoured orbits around the patient is not suited to the rapid collection of tomographically complete data. During rapid image acquisition, mechanical and patient safety constraints limit the detector orbit to circular paths at increased distances from the patient, resulting in decreased spatial resolution. We consider a novel dynamic rotating slant-hole (DyRoSH) SPECT camera that can collect full tomographic data every 2 s, employing three stationary detectors mounted with slant-hole collimators that rotate at 30 rpm. Because the detectors are stationary, they can be placed much closer to the patient than is possible with conventional SPECT systems. We propose that the decoupling of the detector position from the mechanics of rapid image acquisition offers an additional degree of freedom which can be used to improve accuracy in measured kinetic parameter estimates. With simulations and list-mode reconstructions, we consider the effects of different acquisition intervals on dynamic cardiac imaging, comparing a conventional three detector SPECT system with the proposed DyRoSH SPECT system. Kinetic parameters of a two-compartment model of myocardial perfusion for technetium-99m-teboroxime were estimated. When compared to a conventional SPECT scanner for the same acquisition periods, the proposed DyRoSH system shows equivalent or reduced bias or standard deviation values for the kinetic parameter estimates. The DyRoSH camera with a 2 s acquisition period does not show any improvement compared to a DyRoSH camera with a 10 s acquisition period. PMID:17634648
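The two-compartment perfusion model whose rate constants are being estimated can be sketched as a simple washin/washout ODE. This generic forward-Euler integrator (an illustrative stand-in, not the authors' list-mode reconstruction code; k1 and k2 are the uptake and washout rate constants) generates a tissue time-activity curve from a blood input curve:

```python
import numpy as np

def tissue_curve(t, blood_curve, k1, k2):
    """Tissue activity for the two-compartment model
    dC/dt = k1*Ca(t) - k2*C, integrated with forward Euler.

    t: sample times; blood_curve: arterial input Ca sampled at t.
    """
    c = np.zeros_like(np.asarray(t, float))
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        c[i] = c[i - 1] + dt * (k1 * blood_curve[i - 1] - k2 * c[i - 1])
    return c
```

Fitting k1 and k2 to curves like this, frame by frame, is where the acquisition interval matters: coarser temporal sampling blurs the early washin, which biases the estimates the paper compares across scanner designs.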

Roberts, J; Maddula, R; Clackdoyle, R; DiBella, E; Fu, Z

2007-08-01

375

Cameras for Astronomy  

Microsoft Academic Search

This chapter describes the cameras most commonly used for astronomical photography. The main component in these cameras is a concave mirror S that forms the image of an object at infinity on its focal plane. The image formed by such a system usually exhibits aberrations that are too large to be acceptable.

Antonio Romano

376

Embedded Smart Camera Performance Analysis  

Microsoft Academic Search

Increasingly powerful integrated circuits are making an entire range of new applications possible. Recent technological advances are enabling a new generation of smart cameras that represent a quantum leap in sophistication. While today's digital cameras capture images, smart cameras capture high-level descriptions of the scene and analyze what they see. A smart camera combines video sensing, high-level video processing and

N. F. Kahar; R. B. Ahmad; Z. Hussin; A. N. C. Rosli

2009-01-01

377

Hourly and Spatial Analysis of Degrees of Congestion in the Observation Camera Scene by Accumulating Local Changes in the Image Sequence  

NASA Astrophysics Data System (ADS)

Observation cameras are now common at public and private facilities such as stations, halls, shopping malls, hallways and gates. Facility custodians often want to detect and investigate passing people or areas of congestion in the camera scene. To respond to these requests, we extracted moving areas in the image, accumulated these data, and displayed them on the monitor. Consequently, we obtained spatial distributions and hourly changes of congestion in the observation camera scene. To extract moving areas, we evaluated the changes in correlation values of small segmented windows in the image. This paper describes our procedure for analyzing congestion from the image sequence of an observation camera, and presents some experimental results.
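The accumulation idea can be sketched as follows. The paper measures changes in the correlation values of small windows; this minimal stand-in uses per-pixel frame differencing with a hypothetical threshold, which captures the same accumulate-and-display structure:

```python
import numpy as np

def congestion_map(frames, threshold=10.0):
    """Accumulate per-pixel change counts over an image sequence.

    frames: iterable of equal-size grayscale frames. A pixel whose
    absolute inter-frame difference exceeds `threshold` is counted as
    'moving'; the accumulated counts show where congestion occurs.
    """
    it = iter(frames)
    prev = np.asarray(next(it), float)
    acc = np.zeros_like(prev)
    for f in it:
        f = np.asarray(f, float)
        acc += (np.abs(f - prev) > threshold)
        prev = f
    return acc
```

Binning the accumulated maps by hour of day then yields the hourly congestion profiles the title refers to.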

Niwakawa, Makoto; Onda, Toshikazu

378

Beyond GPS: determining the camera viewing direction of a geotagged image  

Microsoft Academic Search

Increasingly, geographic information is being associated with personal photos. Recent research results have shown that the additional global positioning system (GPS) information helps visual recognition for geotagged photos by providing location context. However, the current GPS data only identifies the camera location, leaving the viewing direction uncertain. To produce more precise location information, i.e. the viewing direction for geotagged photos,

Minwoo Park; Jiebo Luo; Robert T. Collins; Yanxi Liu

2010-01-01

379

Multi-frame x ray imaging with a large area 40ps camera  

Microsoft Academic Search

The authors have developed a large area short pulse framing camera that is capable of sixteen frames and shutter times of 40 ps per frame. This is accomplished with a high fidelity electrical circuit and a L/D = 20 microchannel plate, driven by a short pulse (80 ps) high amplitude electrical driver. They show results of this work they have done to

P. M. Bell; J. D. Kilkenny; O. L. Landen; R. L. Hanks; J. D. Wiedwald; D. K. Bradley

1992-01-01

380

Multiframe x-ray imaging with a large-area 40ps camera  

Microsoft Academic Search

We have developed a large area short pulse framing camera that is capable of sixteen frames and shutter times of 40 ps per frame. This is accomplished with a high fidelity electrical circuit and a L/D = 20 microchannel plate, driven by a short pulse (80 ps) high amplitude electrical driver. We show results of this work we have done

Perry M. Bell; Joseph D. Kilkenny; Otto L. Landen; Roy L. Hanks; Jay D. Wiedwald; David K. Bradley

1993-01-01

381

An Integrated Image Motion Sensor for Micro Camera Module F. Gensolen1,2  

E-print Network

typically produced by handheld devices movement. We extract this global motion from local motion measures plane [2]. Nevertheless, this is a costly and burdening solution for embedded devices. The electronic, so called global motion (G.M.) or camera motion. In order to separate the intentional motion from

Paris-Sud XI, Université de

382

3D papillary image capturing by the stereo fundus camera system for clinical diagnosis on retina and optic nerve  

NASA Astrophysics Data System (ADS)

Glaucoma is the second leading cause of blindness in the world, and the number of cases tends to increase as the life expectancy of the population rises. Glaucoma comprises eye conditions that damage the optic nerve. This nerve carries visual information from the eye to the brain; if it is damaged, the patient's vision is compromised. In most cases the damage to the optic nerve is irreversible and is caused by increased intraocular pressure. A main diagnostic challenge is detecting the disease early, because no symptoms are present in the initial stage; when it is detected, it is often already at an advanced stage. Currently the evaluation of the optic disc is made with sophisticated fundus cameras, which are inaccessible to the majority of the Brazilian population. The purpose of this project is to develop a specific fundus camera, without fluorescein angiography and red-free systems, to capture 3D images of the optic disc region. The innovation is a new simplified stereo-optical design that enables 3D image capture together with quantitative measurements of the excavation and topography of the optic nerve, something traditional fundus cameras do not provide. Dedicated hardware and software are developed for this ophthalmic instrument to permit quick capture and printing of high-resolution 3D images and videos of the optic disc region (20° field of view) in mydriatic and nonmydriatic modes.

Motta, Danilo A.; Serillo, André; de Matos, Luciana; Yasuoka, Fatima M. M.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.

2014-03-01

383

Experimental evaluation of an online gamma-camera imaging of permanent seed implantation (OGIPSI) prototype for partial breast irradiation  

SciTech Connect

Previously, our team used Monte Carlo simulation to demonstrate that a gamma camera could potentially be used as an online image guidance device to visualize seeds during permanent breast seed implant procedures. This could allow for intraoperative correction if seeds have been misplaced. The objective of this study is to describe an experimental evaluation of an online gamma-camera imaging of permanent seed implantation (OGIPSI) prototype. The OGIPSI device is intended to be able to detect a seed misplacement of 5 mm or more within an imaging time of 2 min or less. The device was constructed by fitting a custom built brass collimator (16 mm height, 0.65 mm hole pitch, 0.15 mm septal thickness) on a 64 pixel linear array CZT detector (eValuator-2000, eV Products, Saxonburg, PA). Two-dimensional projection images of seed distributions were acquired by the use of a digitally controlled translation stage. Spatial resolution and noise characteristics of the detector were measured. The ability and time needed for the OGIPSI device to image the seeds and to detect cold spots was tested using an anthropomorphic breast phantom. Mimicking a real treatment plan, a total of 52 ¹⁰³Pd seeds of 65.8 MBq each were placed on three different layers at appropriate depths within the phantom. The seeds were reliably detected within 30 s with a median error in localization of 1 mm. In conclusion, an OGIPSI device can potentially be used for image guidance of permanent brachytherapy applications in the breast and, possibly, other sites.

Ravi, Ananth; Caldwell, Curtis B.; Pignol, Jean-Philippe [Department of Medical Biophysics, University of Toronto, Toronto, Ontario, M4N 3M5 (Canada); Department of Medical Biophysics and Department of Medical Imaging, University of Toronto, Toronto, Ontario, M4N 3M5 (Canada) and Department of Medical Physics, Sunnybrook Health Sciences Centre, Toronto, Ontario, M4N 3M5 (Canada); Department of Medical Biophysics, University of Toronto, Toronto, Ontario, M4N 3M5 (Canada) and Department of Radiation Oncology, Sunnybrook Health Sciences Centre, Toronto, Ontario, M4N 3M5 (Canada)

2008-06-15

384

Robotic Arm Camera Image of the South Side of the Thermal and Evolved-Gas Analyzer (Door TA4)  

NASA Technical Reports Server (NTRS)

The Thermal and Evolved-Gas Analyzer (TEGA) instrument aboard NASA's Phoenix Mars Lander is shown with one set of oven doors open and dirt from a sample delivery. After the 'seventh shake' of TEGA, a portion of the dirt sample entered the oven via a screen for analysis. This image was taken by the Robotic Arm Camera on Sol 18 (June 13, 2008), the 18th Martian day of the mission.

The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

2008-01-01

385

Camera Obscura  

NSDL National Science Digital Library

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera = Latin for room; Obscura = Latin for dark. The resource includes illustrations such as "The Magic Mirror of Life," a French drawing camera with supplies, and drawing camera obscuras with the lens at the top. Read the first three paragraphs of this article. Under the section Early Observations and Use in Astronomy you will find the answers to the ...

Engelman, Mr.

2008-10-28

386

Multi-frame x-ray imaging with a large area 40ps camera  

SciTech Connect

The authors have developed a large area short pulse framing camera that is capable of sixteen frames and shutter times of 40ps per frame. This is accomplished with a high fidelity electrical circuit and a L/D = 20 microchannel plate, driven by a short pulse (80ps) high amplitude electrical driver. They present the results of the work done to support this type of shutter time and discuss the difficulties associated with large area high speed shuttering.

Bell, P.M.; Kilkenny, J.D.; Landen, O.L.; Hanks, R.L.; Wiedwald, J.D. [Lawrence Livermore National Lab., CA (US); Bradley, D.K. [Univ. of Rochester, NY (US). Lab. for Laser Energetics

1992-11-09

387

Multiframe x-ray imaging with a large-area 40ps camera  

NASA Astrophysics Data System (ADS)

We have developed a large area short pulse framing camera that is capable of sixteen frames and shutter times of 40 ps per frame. This is accomplished with a high fidelity electrical circuit and a L/D equals 20 microchannel plate, driven by a short pulse (80 ps) high amplitude electrical driver. We present the results of the work done to support this type of shutter time and discuss the difficulties associated with large area high speed shuttering.

Bell, Perry M.; Kilkenny, Joseph D.; Landen, Otto L.; Hanks, Roy L.; Wiedwald, Jay D.; Bradley, David K.

1993-01-01

388

Multi-frame x ray imaging with a large area 40ps camera  

NASA Astrophysics Data System (ADS)

The authors have developed a large area short pulse framing camera that is capable of sixteen frames and shutter times of 40ps per frame. This is accomplished with a high fidelity electrical circuit and a L/D = 20 microchannel plate, driven by a short pulse (80ps) high amplitude electrical driver. They present the results of the work done to support this type of shutter time and discuss the difficulties associated with large area high speed shuttering.

Bell, P. M.; Kilkenny, J. D.; Landen, O. L.; Hanks, R. L.; Wiedwald, J. D.; Bradley, D. K.

1992-11-01

389

Miniaturized fundus camera  

NASA Astrophysics Data System (ADS)

We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, as well as in other applications, a small, lightweight, digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics are truly portable, fitting in a standard board case. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project for ROP screening. Telemedical applications are an ideal use for this camera system, exploiting both of its advantages: its portability and its digital imagery.

Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

2003-07-01

390

Digital monochrome CCD camera for robust pixel correspondant, data compression, and preprocessing in an integrated PC-based image-processing environment  

NASA Astrophysics Data System (ADS)

This paper describes the development of a compact digital CCD camera which performs image digitization and processing on board and interfaces to a personal computer (PC) via a standard enhanced parallel port. Precise digitization of pixel samples, coupled with a single-chip FPGA for data processing, forms the main digital stage of the camera before the data is sent to the PC. A compression scheme is applied so that the digital images may be transferred within the existing parallel-port bandwidth. The data is decompressed on the PC for real-time display of the video images using only native processor resources. Frame capture is built into the camera so that a full uncompressed digital image can be sent for special processing.
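The abstract does not specify which compression scheme the camera uses. As a hedged illustration of the kind of simple lossless scheme that can fit video into a constrained parallel-port link, the sketch below combines delta coding (neighboring pixels are similar, so differences cluster near zero) with run-length coding of repeated differences; the function names are hypothetical, not from the paper.

```python
def delta_rle_encode(pixels):
    """Delta-encode a row of 8-bit pixels, then run-length encode.

    Differences between neighbors are stored modulo 256, so the scheme
    is exactly reversible; runs of equal differences (flat or smoothly
    ramping image regions) collapse into (run_length, value) pairs.
    """
    deltas = [pixels[0]] + [(pixels[i] - pixels[i - 1]) & 0xFF
                            for i in range(1, len(pixels))]
    encoded = []
    i = 0
    while i < len(deltas):
        run = 1
        while i + run < len(deltas) and deltas[i + run] == deltas[i] and run < 255:
            run += 1
        encoded.append((run, deltas[i]))
        i += run
    return encoded


def delta_rle_decode(encoded):
    """Invert delta_rle_encode, recovering the original pixel row."""
    deltas = []
    for run, value in encoded:
        deltas.extend([value] * run)
    pixels = [deltas[0]]
    for d in deltas[1:]:
        pixels.append((pixels[-1] + d) & 0xFF)
    return pixels
```

A scheme like this decompresses cheaply enough for the "purely native processor resources" display path the paper describes, at the cost of a data-dependent compression ratio.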

Arshad, Norhashim M.; Harvey, David M.; Hobson, Clifford A.

1996-12-01

391

Cellphone camera imaging of a periodically patterned chip as a potential method for point-of-care diagnostics.  

PubMed

In this study, we demonstrate that a disposable chip periodically patterned with suitable ligands, an ordinary cellphone camera, and a simple pattern recognition software, can potentially be used for quantitative diagnostics. A key factor in this demonstration is the design of a calibration grid around the chip that, through a contrast transfer process, enables reliable analysis of the images collected under variable ambient lighting conditions. After exposure to a dispersion of amine terminated silica beads used as analyte mimicking pathogens, an epoxy-terminated glass substrate microcontact printed with octadecyltrichlorosilane (250 µm periodicity) developed a characteristic pattern of beads which could be easily imaged with a 3.2-megapixel cellphone camera. A simple pattern recognition algorithm using the fast Fourier transform produced a quantitative estimate of the analyte concentration present in the test solution. Importantly, in this method neither the chip fabrication process nor the fill factor of the periodic pattern needs to be perfect to arrive at a conclusive diagnosis. The method suggests a viable platform that may potentially find use in fault-tolerant and robust point-of-care diagnostic applications. PMID:24564576
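The paper's FFT-based pattern recognition step is not given in detail in the abstract. A minimal sketch of the underlying idea, under the assumption that the analysis reduces to measuring spectral power at the known 250 µm pattern period in an intensity profile (the function name and the power-ratio metric are illustrative, not the authors' algorithm):

```python
import numpy as np


def periodic_signal_strength(profile, period):
    """Estimate the strength of a periodic pattern of known period in a
    1-D intensity profile via the FFT (illustrative metric).

    Returns the fraction of non-DC spectral power concentrated in the
    FFT bin nearest the pattern frequency; more beads captured on the
    periodic pattern means a stronger peak.
    """
    profile = np.asarray(profile, dtype=float)
    spectrum = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freqs = np.fft.rfftfreq(len(profile), d=1.0)  # cycles per pixel
    k = np.argmin(np.abs(freqs - 1.0 / period))   # bin nearest pattern freq
    total = spectrum[1:].sum()                    # exclude the DC term
    return spectrum[k] / total if total > 0 else 0.0
```

A metric of this shape is inherently tolerant of imperfect fill factor: missing beads reduce the peak's height but do not shift its location, which matches the fault-tolerance claim in the abstract.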

Gupta, Ritu; Reifenberger, Ronald G; Kulkarni, Giridhar U

2014-03-26

392

Antenna-coupled microbolometer based uncooled 2D array and camera for 2D real-time terahertz imaging  

NASA Astrophysics Data System (ADS)

CEA-Leti has developed a monolithic large focal plane array bolometric technology optimized for 2D real-time imaging in the terahertz range. Each pixel consists of a silicon microbolometer coupled to specific antennas and a resonant quarter-wavelength cavity. First prototypes of imaging arrays have been designed and manufactured for optimized sensing in the 1-3.5 THz range, where THz quantum cascade lasers deliver high optical power. An NEP on the order of 1 pW/sqrt(Hz) has been assessed at 2.5 THz. This paper reports the steps of this development, starting from the pixel level, moving to an array associated monolithically with its CMOS ROIC, and finally to a stand-alone camera. For each step, modeling, technological prototyping and experimental characterization are presented.

Simoens, F.; Meilhan, J.; Gidon, S.; Lasfargues, G.; Lalanne Dera, J.; Ouvrier-Buffet, J. L.; Pocas, S.; Rabaud, W.; Guellec, F.; Dupont, B.; Martin, S.; Simon, A. C.

2013-09-01

393

The Automatically Triggered Video or Imaging Station (ATVIS): An Inexpensive Way to Catch Geomorphic Events on Camera  

NASA Astrophysics Data System (ADS)

To understand how single events can affect landscape change, we must catch the landscape in the act. Direct observations are rare and often dangerous. While video is a good alternative, commercially-available video systems for field installation cost ~$11,000, weigh ~100 pounds (45 kg), and shoot 640x480 pixel video at 4 frames per second. This is the same resolution as a cheap point-and-shoot camera, with a frame rate that is nearly an order of magnitude worse. To overcome these limitations of resolution, cost, and portability, I designed and built a new observation station. This system, called ATVIS (Automatically Triggered Video or Imaging Station), costs $450--500 and weighs about 15 pounds. It can take roughly 3 hours of 1280x720 pixel video, 6.5 hours of 640x480 video, or 98,000 1600x1200 pixel photos (one photo every 7 seconds for 8 days). The design calls for a simple Canon point-and-shoot camera fitted with custom firmware that allows 5V pulses through its USB cable to trigger it to take a picture or to initiate or stop video recording. These pulses are provided by a programmable microcontroller that can take input from either sensors or a data logger. The design is easily modifiable to a variety of camera and sensor types, and can also be used for continuous time-lapse imagery. We currently have prototypes set up at a gully near West Bijou Creek on the Colorado high plains and at tributaries to Marble Canyon in northern Arizona. Hopefully, a relatively inexpensive and portable system such as this will allow geomorphologists to supplement sensor networks with photo or video monitoring and allow them to see—and better quantify—the fantastic array of processes that modify landscapes as they unfold. Camera station set up at Badger Canyon, Arizona. Inset: view into box. Clockwise from bottom right: camera, microcontroller (blue), DC converter (red), solar charge controller, 12V battery. 
Materials and installation assistance courtesy of Ron Griffiths and the USGS Grand Canyon Monitoring and Research Center.
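The quoted photo capacity can be checked directly from the stated capture interval: one 1600x1200 photo every 7 seconds, sustained for 8 days.

```python
# Sanity check of the ATVIS capacity figure: one photo every 7 s for 8 days.
SECONDS_PER_DAY = 24 * 3600
interval_s = 7
days = 8

photos = days * SECONDS_PER_DAY // interval_s
print(photos)  # 98742, consistent with the ~98,000 figure in the abstract
```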

Wickert, A. D.

2010-12-01

394

EDGE DETECTION PERFORMANCE IN SUPER-RESOLUTION IMAGE RECONSTRUCTION FROM CAMERA ARRAYS  

E-print Network

complex image analysis for both image processing systems and biological systems. This paper explores with identifiable null spaces and arrays with diverse magnification or shift variant systems which provide

Rajan, Dinesh

395

Assessment of the metrological performance of an in situ storage image sensor ultra-high speed camera for full-field deformation measurements  

NASA Astrophysics Data System (ADS)

Ultra-high speed (UHS) cameras allow us to acquire images typically up to about 1 million frames s⁻¹ at a full spatial resolution of the order of 1 Mpixel. Different technologies are available nowadays to achieve this performance; an interesting one is the so-called in situ storage image sensor architecture, where the image storage is incorporated into the sensor chip. Such an architecture is all solid state and contains no movable devices, unlike, for instance, rotating-mirror UHS cameras. One of the disadvantages of this system is the low fill factor (around 76% in the vertical direction and 14% in the horizontal direction), since most of the space on the sensor is occupied by memory. This peculiarity introduces a series of systematic errors when the camera is used to perform full-field strain measurements. The aim of this paper is to develop an experimental procedure to thoroughly characterize the performance of such cameras in full-field deformation measurement and identify the best operating conditions to minimize the measurement errors. A series of tests was performed on a Shimadzu HPV-1 UHS camera, first using uniform scenes and then grids under rigid movements. The grid method was used as the full-field optical measurement technique here. From these tests, it has been possible to characterize the camera's behaviour and use this information to improve actual measurements.

Rossi, Marco; Pierron, Fabrice; Forquin, Pascal

2014-02-01

396

Spatial resolution limit study of a CCD camera and scintillator based neutron imaging system according to MTF determination and analysis.  

PubMed

The spatial resolution limit is a very important parameter of an imaging system that should be taken into consideration before examination of any object. The objective of this work is the determination of a neutron imaging system's response in terms of spatial resolution. The proposed procedure is based on establishment of the Modulation Transfer Function (MTF). The imaging system being studied is based on a high sensitivity CCD neutron camera (2×10⁻⁵ lx at f/1.4). The neutron beam used is from the horizontal beam port (H.6) of the Algerian Es-Salam research reactor. Our contribution to the MTF determination is an accurate edge identification method and a procedure for resolving the line spread function undersampling problem. These methods and procedures are integrated into a MATLAB code. The methods, procedures and approaches proposed in this work are applicable to any other neutron imaging system and allow for judging the ability of a neutron imaging system to resolve the spatial (internal detail) properties of any object under examination. PMID:22014891
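The edge-based MTF chain the abstract refers to follows a standard pipeline: sample the edge spread function (ESF) across an edge, differentiate it to get the line spread function (LSF), and take the magnitude of the LSF's Fourier transform, normalized to 1 at zero frequency. A minimal sketch of that pipeline (in Python rather than the paper's MATLAB, and omitting the edge-identification and oversampling steps that are the paper's actual contribution):

```python
import numpy as np


def mtf_from_edge(esf):
    """Compute an MTF estimate from a sampled edge spread function.

    The LSF is the spatial derivative of the ESF; the MTF is the
    magnitude of the LSF's Fourier transform, normalized so that
    MTF(0) = 1. A real pipeline adds accurate edge localization and
    supersampling of the ESF to avoid the undersampling problem
    addressed in the paper.
    """
    lsf = np.gradient(np.asarray(esf, dtype=float))  # ESF -> LSF
    mtf = np.abs(np.fft.rfft(lsf))                   # LSF -> |FT|
    return mtf / mtf[0]                              # normalize at DC
```

For a perfectly sharp edge this returns an MTF close to 1 across frequencies; a blurred edge (wider LSF) pushes the curve down, and the spatial frequency where it crosses a chosen threshold (e.g. 10%) is one common definition of the resolution limit.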

Kharfi, F; Denden, O; Bourenane, A; Bitam, T; Ali, A

2012-01-01

397

Frequency-domain streak camera for ultrafast imaging of evolving light-velocity objects.  

PubMed

We demonstrate a frequency-domain streak camera (FDSC) that captures the picosecond time evolution of luminal-velocity refractive index structures in a single shot. In our prototype FDSC, a probe-refer