These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel in the visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and permanently sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

2010-01-01

2

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

3

Retrieving lunar topography from multispectral LROC images  

NASA Astrophysics Data System (ADS)

A technique has been developed for retrieving information about the lunar topography from any individual multispectral LROC Wide Angle Camera (WAC) image. The approach is possible because images acquired at different wavelengths correspond to different viewing angles, while the influence of color differences between the bands on the parallax estimates is small. The method yields Digital Elevation Models (DEMs) with precision comparable to the global lunar 100 m raster DTM derived from the LROC WAC stereo model (GLD100), and it potentially allows elevation maps with better horizontal resolution than the GLD100. An empirical distortion model for the LROC WAC has been developed and used to correct the initial WAC images. In contrast to the standard pre-flight model, our model compensates almost fully for radial distortion, decentering of the optics, and tilt of the CCD array. In some cases the DEMs obtained with our approach exhibit real morphological details that are invisible in GLD100 maps; the method thus supplies additional independent information about the lunar topography. Because our elevation maps share the projection of the initial images, they allow valid correction of those images for topographic effects (i.e., orthorectification), in contrast to the GLD100, whose coordinate referencing may differ slightly from that of individual WAC images.
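The geometry behind this kind of cross-band parallax retrieval can be sketched as follows (a simplified illustration, not the authors' pipeline; the emission angles and pixel scale below are made-up values):

```python
import math

def height_from_parallax(disparity_px, pixel_scale_m, e1_deg, e2_deg):
    """Convert cross-band parallax (in pixels) to relative height (m).

    For two framelets viewing the same surface point at emission angles
    e1 and e2, a point raised by h shifts on the ground by roughly
    h * |tan(e1) - tan(e2)|, so the disparity encodes the elevation.
    """
    baseline = abs(math.tan(math.radians(e1_deg)) - math.tan(math.radians(e2_deg)))
    return disparity_px * pixel_scale_m / baseline

# hypothetical numbers: 2 px of disparity at 100 m/pixel between
# bands viewed at 10 and 5 degrees emission
h = height_from_parallax(2.0, 100.0, 10.0, 5.0)
```

The small angular separation between bands is why the achievable vertical precision is comparable to, rather than far better than, stereo-derived products such as the GLD100.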

Korokhin, Viktor V.; Velikodsky, Yuri I.; Shalygin, Eugene V.; Shkuratov, Yuriy G.; Kaydash, Vadym G.; Videen, Gorden

2014-03-01

4

Initial Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Camera (LROC) Stereo Imagery  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter (LRO), launched June 18, 2009, carries the Lunar Reconnaissance Orbiter Camera (LROC) as one of seven remote sensing instruments on board. The camera system is equipped with a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NAC) for systematic lunar surface mapping and detailed site characterization for potential landing site selection and resource identification. The

R. Li; J. Oberst; A. S. McEwen; B. A. Archinal; R. A. Beyer; P. C. Thomas; Y. Chen; J. Hwangbo; J. D. Lawver; F. Scholten; S. S. Mattson; A. E. Howington-Kraus; M. S. Robinson

2009-01-01

5

LROC NAC Stereo Anaglyphs  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) acquires high resolution (50 to 200 cm pixel scale) images of the Moon. In operation since June 2009, LROC NAC acquires geometric stereo pairs by rolling off-nadir on subsequent orbits. A new automated processing system currently in development will produce anaglyphs from most of the NAC geometric stereo pairs. An anaglyph is an image formed by placing one image from the stereo pair in the red channel, and the other image from the stereo pair in the green and blue channels, so that together with red-blue or red-cyan glasses, the 3D information in the pair can be readily viewed. These new image products will make qualitative interpretation of the lunar surface in 3D more accessible, without the need for intensive computational resources or special equipment. The LROC NAC is composed of two separate pushbroom CCD cameras (NAC L and R) aligned to increase the full swath width to 5 km from an altitude of 50 km. Development of the anaglyph processing system incorporates stereo viewing geometry, proper alignment of the NAC L and R frames, and optimal contrast normalization of the stereo pair to minimize extreme brightness differences, which can make stereo viewing difficult in an anaglyph. The LROC NAC anaglyph pipeline is based on a similar automated system developed for the HiRISE camera, on the Mars Reconnaissance Orbiter. Improved knowledge of camera pointing and spacecraft position allows for the automatic registration of the L and R frames by map projecting them to a polar stereographic projection. One half of the stereo pair must then be registered to the other so there is no offset in the vertical (y) direction. Stereo viewing depends on parallax only in the horizontal (x) direction. High resolution LROC NAC anaglyphs will be made available to the lunar science community and to the public on the LROC web site (http://lroc.sese.asu.edu).
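The channel assignment described above is simple to express in code. A minimal sketch (assuming two already co-registered, y-aligned grayscale arrays; contrast normalization is omitted):

```python
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """Build a red-cyan anaglyph: one half of the stereo pair goes into
    the red channel, the other into both green and blue, so red-cyan
    glasses present each eye with a different image."""
    h, w = left_gray.shape
    rgb = np.empty((h, w, 3), dtype=left_gray.dtype)
    rgb[..., 0] = left_gray    # red   <- first image of the pair
    rgb[..., 1] = right_gray   # green <- second image of the pair
    rgb[..., 2] = right_gray   # blue  <- second image of the pair
    return rgb
```

In a real pipeline the two frames would first be map-projected, registered so residual offsets in y vanish (stereo depth comes from x-parallax only), and contrast-matched to avoid retinal rivalry.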

Mattson, S.; McEwen, A. S.; Speyerer, E.; Robinson, M. S.

2012-12-01

6

Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller- scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. 
LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degrees. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25 km long or a 2x2 binned image 100 km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR and the other in the UV. Each has a cross-track FOV of 90 degrees. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible and a swath width of ~100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data. Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.
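The quoted NAC buffer capacity is easy to sanity-check with back-of-envelope arithmetic (a sketch assuming one stored byte per pixel after conversion of the 12-bit samples to 8 bits, which the abstract implies):

```python
PIXEL_SCALE_M = 0.5          # NAC ground pixel scale, m
SWATH_PX = 5000              # pixels per CCD line
BUFFER_BYTES = 256 * 10**6   # 256 MB internal buffer (decimal MB assumed)

def image_bytes(length_km, binning=1, bytes_per_px=1):
    """Bytes needed for a NAC image of the given along-track length."""
    lines = int(length_km * 1000 / (PIXEL_SCALE_M * binning))
    return lines * (SWATH_PX // binning) * bytes_per_px

full_swath = image_bytes(25)           # full-resolution image, 25 km long
binned = image_bytes(100, binning=2)   # 2x2 binned image, 100 km long
# both come to 250,000,000 bytes, just under the 256 MB buffer,
# matching the two capacity figures quoted in the abstract
```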

Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

7

Where did the Luna 23 and 24 spacecraft land?: Comparing the spacecraft seen in LROC NAC images with synthetic images  

NASA Astrophysics Data System (ADS)

Shkuratov et al. (2013) described specific photometric anomalies found around the Luna 16, 20, and 23 spacecraft but not around the Luna 24 spacecraft. The authors explained this lack of an anomaly at the Luna 24 site as a result of the misidentification of the Luna 23 and 24 spacecraft in the LROC images by Robinson et al. (2012). To address this question, we synthesized images of the Luna spacecraft (built by the Lavochkin Association) as they might appear in the LROC images. We compared the model images of the virtual Luna 23 and Luna 24 spacecraft sitting on the lunar surface with the spacecraft seen in the LROC images and concluded, on the basis of their similarity, that the identification of the spacecraft in these images by Robinson et al. (2012) is likely correct.

Dolgopolov, V. P.; Basilevsky, A. T.; Robinson, M. S.; Plescia, J. B.; Head, J. W.

2013-06-01

8

Secondary Craters and the Size-Velocity Distribution of Ejected Fragments around Lunar Craters Measured Using LROC Images  

NASA Astrophysics Data System (ADS)

We report results from analyzing the size-velocity distribution (SVD) of secondary-crater-forming fragments from the 93 km diameter Copernicus impact. We measured the diameters of secondary craters and their distances from Copernicus using LROC Wide Angle Camera (WAC) and Narrow Angle Camera (NAC) image data. We then estimated the velocity and size of the ejecta fragment that formed each secondary crater from the range equation for a ballistic trajectory on a sphere and Schmidt-Holsapple scaling relations. Size scaling was carried out in the gravity regime for both non-porous and porous target material properties. We focus on the largest ejecta fragments (dfmax) at a given ejection velocity (vej) and fit the upper envelope of the SVD using quantile regression to an equation of the form dfmax = A*vej^(-β). The velocity exponent, β, describes how quickly fragment sizes fall off with increasing ejection velocity during crater excavation. For Copernicus, we measured 5800 secondary craters, at distances of up to 700 km (15 crater radii), corresponding to an ejecta fragment velocity of approximately 950 m/s. This mapping only includes secondary craters that are part of a radial chain or cluster. The two largest craters in chains near Copernicus that are likely to be secondaries are 6.4 and 5.2 km in diameter. We obtained a velocity exponent, β, of 2.2 ± 0.1 for a non-porous surface. This result is similar to Vickery's [1987, GRL 14] determination of β = 1.9 ± 0.2 for Copernicus using Lunar Orbiter IV data.
The availability of WAC 100 m/pix global mosaics with illumination geometry optimized for morphology allows us to update and extend the work of Vickery [1986, Icarus 67, and 1987], who compared secondary crater SVDs for craters on the Moon, Mercury, and Mars. Additionally, meter-scale NAC images enable characterization of secondary crater morphologies and fields around much smaller primary craters than were previously investigated. Combined results from all previous studies of ejecta fragment SVDs from secondary crater fields show that β ranges between approximately 1 and 3. First-order spallation theory predicts a β of 1 [Melosh 1989, Impact Cratering, Oxford Univ. Press]. Results in Vickery [1987] for the Moon exhibit a generally decreasing β with increasing primary crater size (5 secondary fields mapped). In the same paper, however, this trend is flat for Mercury (3 fields mapped) and opposite for Mars (4 fields mapped). SVDs for craters on large icy satellites (Ganymede and Europa), with gravities not too dissimilar to lunar gravity, show generally low velocity exponents (β between 1 and 1.5), except for the very largest impactor measured: the 585-km-diameter Gilgamesh basin on Ganymede (β = 2.6 ± 0.4) [Singer et al., 2013, Icarus 226]. The present work, focusing initially on lunar craters using LROC data, will attempt to confirm or clarify these trends, and expand the number of examples under a variety of impact conditions and surface materials to evaluate possible causes of variations.
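The velocity estimate quoted above can be reproduced from the range equation the authors cite. A sketch (assuming a 45° launch angle and standard lunar constants; not the authors' code):

```python
import math

G_MOON = 1.62      # lunar surface gravity, m/s^2
R_MOON = 1.737e6   # lunar radius, m

def ejection_velocity(range_m, launch_angle_deg=45.0):
    """Invert the ballistic range equation on a sphere,
        tan(d / 2R) = v^2 sin(a) cos(a) / (g R - v^2 cos^2(a)),
    for the launch speed v given the downrange distance d."""
    a = math.radians(launch_angle_deg)
    t = math.tan(range_m / (2.0 * R_MOON))
    v2 = t * G_MOON * R_MOON / (math.sin(a) * math.cos(a) + t * math.cos(a) ** 2)
    return math.sqrt(v2)

# the most distant Copernicus secondaries mapped (~700 km downrange)
# imply fragment speeds near the ~950 m/s quoted in the abstract
v = ejection_velocity(700e3)
```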

Singer, K. N.; Jolliff, B. L.; McKinnon, W. B.

2013-12-01

9

Photometric parameter maps of the Moon derived from LROC WAC images  

NASA Astrophysics Data System (ADS)

Spatially resolved photometric parameter maps were computed from 21 months of Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) images. Due to a 60° field-of-view (FOV), the WAC achieves nearly global coverage of the Moon each month with more than 50% overlap from orbit-to-orbit. From the repeat observations at various viewing and illumination geometries, we calculated Hapke bidirectional reflectance model parameters [1] for 1°x1° "tiles" from 70°N to 70°S and 0°E to 360°E. About 66,000 WAC images acquired from February 2010 to October 2011 were converted from DN to radiance factor (I/F) through radiometric calibration, partitioned into gridded tiles, and stacked in a time series (tile-by-tile method [2]). Lighting geometries (phase, incidence, emission) were computed using the WAC digital terrain model (100 m/pixel) [3]. The Hapke parameters were obtained by model fitting against I/F within each tile. Among the 9 parameters of the Hapke model, we calculated 3 free parameters (w, b, and hs) by setting constant values for 4 parameters (Bc0 = 0, hc = 1, θ̄, φ = 0) and interpolating 2 parameters (c, Bs0). In this simplification, we ignored the Coherent Backscatter Opposition Effect (CBOE) to avoid competing CBOE and Shadow Hiding Opposition Effect (SHOE) terms. We also assumed that surface regolith porosity is uniform across the Moon. The roughness parameter (θ̄) was set to an averaged value from the equator (±3°N). The Henyey-Greenstein double-lobe function (H-G2) parameter (c) was given by the 'hockey stick' relation [4] (a negative correlation between b and c) based on laboratory measurements. The amplitude of SHOE (Bs0) was given by the correlation between w and Bs0 at the equator (±3°N). Single scattering albedo (w) is strongly correlated with the photometrically normalized I/F, as expected. The c shows an inverse trend relative to b due to the 'hockey stick' relation.
The parameter c is typically low for the maria (0.08 ± 0.06) relative to the highlands (0.47 ± 0.16). Since c controls the fraction of backward/forward scattering in H-G2, lower c for the maria indicates more forward scattering relative to the highlands. This trend is opposite to what was expected, because darker particles are usually more backscattering. However, the lower albedo of the maria is due to the higher abundance of ilmenite, an opaque mineral that scatters all of the light by specular reflection from its surface. If their surface facets are relatively smooth, the ilmenite particles will be forward scattering. Other factors besides mineralogy (e.g., grain shape, grain size, porosity, maturity) might also affect c. The angular width of SHOE (hs) typically shows lower values (0.047 ± 0.02) for the maria relative to the highlands (0.074 ± 0.025). An increase in hs theoretically suggests lower porosity or a narrower grain size distribution [1], but the link between actual materials and hs is not well constrained. Further experiments using both laboratory and spacecraft observations will help to unravel the photometric properties of the surface materials of the Moon. [1] Hapke, B.: Cambridge Univ. Press, 2012. [2] Sato, H. et al.: 42nd LPSC, abstract #1974, 2011. [3] Scholten, F. et al.: JGR, 117, E00H17, 2012. [4] Hapke, B.: Icarus, 221(2), p. 1079-1083, 2012.
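The role of c can be made concrete with the standard double-lobed Henyey-Greenstein form used in Hapke modeling (a sketch of the commonly used convention: a backscatter lobe weighted (1+c)/2 and a forward lobe weighted (1-c)/2, both with asymmetry b; conventions vary between papers, so treat the signs here as an assumption):

```python
import math

def hg2(g_rad, b, c):
    """Double-lobed Henyey-Greenstein phase function of phase angle g.

    The first lobe peaks at g = 0 (backscatter), the second at g = pi
    (forward scatter); c sets their relative weights."""
    num = 1.0 - b * b
    back = num / (1.0 - 2.0 * b * math.cos(g_rad) + b * b) ** 1.5
    fwd = num / (1.0 + 2.0 * b * math.cos(g_rad) + b * b) ** 1.5
    return 0.5 * (1.0 + c) * back + 0.5 * (1.0 - c) * fwd
```

With this form, plugging in the mare value c ≈ 0.08 gives a larger phase function at high phase angles than the highland value c ≈ 0.47, which is the "more forward scattering" behavior described above.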

Sato, H.; Robinson, M. S.; Hapke, B. W.; Denevi, B. W.; Boyd, A. K.

2013-12-01

10

Depths, Diameters, and Profiles of Small Lunar Craters From LROC NAC Stereo Images  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images (pixel scale ~0.5 m) provide new 3-D views of small craters (40 m < D < 200 m). We extracted topographic profiles from 85 of these craters in mare and highland terrains between 18.1-19.1°N and 5.2-5.4°E to investigate relationships among crater shape, age, and target. Obvious secondary craters (e.g., clustered) and moderately to heavily degraded craters were excluded. The freshest craters included in the study have crisp rims, bright ejecta, and no superposed craters. The depth, diameter, and profile of each crater were determined from a NAC-derived DTM (M119808916/M119815703) tied to LOLA topography with better than 1 m vertical resolution (see [1]). Depth/diameter ratios for the selected craters are generally between 0.12 and 0.2. Crater profiles were classified into one of 3 categories: V-shaped, U-shaped, or intermediate (craters on steep slopes were excluded). Craters were then morphologically classified according to [2], where crater shape is determined by changes in material strength between subsurface layers, resulting in bowl-shaped, flat-bottomed, concentric, or central-mound crater forms. In this study, craters with U-shaped profiles tend to be small (<60 m) and flat-bottomed, while V-shaped craters have steep slopes (~20°), little to no floor, and a range of diameters. Both fresh and relatively degraded craters display the full range of profile shapes (from U to V and all stages in between). We found it difficult to differentiate U-shaped craters from V-shaped craters without the DTM, and we saw no clear correlation between morphologic and profile classification. Further study is still needed to increase our crater statistics and expand on the relatively small population of craters included here.
For the craters in this study, we found that block abundances correlate with relative crater degradation state as defined by [3], where abundant blocks signal fresher craters; however, block abundances do not correlate with U- or V-shaped profiles. The craters examined here show that profile shape cannot be used to determine the relative age or degradation state as might be inferred from [4, for example]. The observed variability in crater profiles may be explained by local variations in regolith thickness [e.g., 2, 5], impactor velocity, and/or possibly bolide density. Ongoing efforts will quantify the possible effects of solitary secondary craters and investigate whether or not depth/diameter ratios and crater profiles vary between different regions of the Moon (thick vs thin regolith, highlands vs mare, and old vs young mare). References: [1] Tran T. et al. (2010) LPSC XLI, Abstract 2515. [2] Quaide W. L. and V. R. Oberbeck (1968) JGR, 73: 5247-5270. [3] Basilevsky A. T. (1976) Proc LPSC 7th, p. 1005-1020. [4] Soderblom L. A. and L. A. Lebofsky (1972) JGR, 77: 279-296. [5] Wilcox B. B. et al. (2005) Met. Planet. Sci., 40: 695-710.
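Extracting a depth/diameter ratio from a topographic profile, as done above, can be sketched like this (a simplified illustration, not the authors' procedure; it assumes a single profile passing through the crater center with a visible rim on each side):

```python
import numpy as np

def depth_and_diameter(x_m, z_m):
    """Estimate rim-to-floor depth and rim-to-rim diameter from one
    topographic profile (x in meters along track, z in meters)."""
    floor_i = int(np.argmin(z_m))                       # lowest point = floor
    left_rim = int(np.argmax(z_m[:floor_i]))            # highest point left of floor
    right_rim = floor_i + int(np.argmax(z_m[floor_i:])) # highest point right of floor
    depth = 0.5 * (z_m[left_rim] + z_m[right_rim]) - z_m[floor_i]
    diameter = x_m[right_rim] - x_m[left_rim]
    return depth, diameter
```

Applied to a DTM-derived profile, the ratio depth/diameter can then be compared against the 0.12-0.2 range reported for fresh small craters.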

Stopar, J. D.; Robinson, M.; Barnouin, O. S.; Tran, T.

2010-12-01

11

LROC assessment of non-linear filtering methods in Ga-67 SPECT imaging  

NASA Astrophysics Data System (ADS)

In emission tomography, iterative reconstruction is usually followed by a linear smoothing filter to make the images more appropriate for visual inspection and diagnosis by a physician. This results in a global blurring of the images, smoothing across edges and possibly discarding valuable image information for detection tasks. The purpose of this study is to investigate what advantages a non-linear, edge-preserving postfilter could offer for lesion detection in Ga-67 SPECT imaging. Image quality can be defined based on the task that has to be performed on the image. This study used LROC observer studies based on a dataset created by CPU-intensive GATE Monte Carlo simulations of a voxelized digital phantom. The filters considered in this study were a linear Gaussian filter, a bilateral filter, the Perona-Malik anisotropic diffusion filter, and the Catté filtering scheme. The 3D MCAT software phantom was used to simulate the distribution of Ga-67 citrate in the abdomen. Tumor-present cases had a 1-cm diameter tumor randomly placed near the edges of the anatomical boundaries of the kidneys, bone, liver, and spleen. Our data set was generated out of a single noisy background simulation using the bootstrap method, to significantly reduce the simulation time and to allow for a larger observer data set. Lesions were simulated separately and added to the background afterwards. These were then reconstructed with an iterative approach, using a sufficiently large number of MLEM iterations to establish convergence. The output of a numerical observer was used in a simplex optimization method to estimate an optimal set of parameters for each postfilter. No significant improvement was found for using edge-preserving filtering techniques over standard linear Gaussian filtering.
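The Perona-Malik filter named above is compact enough to sketch in full (a minimal 2-D implementation with the exponential conductance function; parameter values are illustrative, not those tuned in the study):

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Edge-preserving anisotropic diffusion: the conductance
    g(|grad I|) = exp(-(|grad I| / kappa)^2) lets noise diffuse in flat
    regions while nearly stopping diffusion across strong edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # one-sided differences to the four neighbors (zero flux at borders)
        dn = np.roll(u, -1, 0) - u; dn[-1, :] = 0
        ds = np.roll(u, 1, 0) - u;  ds[0, :] = 0
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0
        u = u + dt * sum(np.exp(-(d / kappa) ** 2) * d
                         for d in (dn, ds, de, dw))
    return u
```

The contrast with a Gaussian postfilter is visible on a noisy step edge: the diffusion suppresses noise on either side while leaving the edge contrast essentially intact, which is exactly the property whose detection benefit the study set out to test.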

De Clercq, Stijn; Staelens, Steven; De Beenhouwer, Jan; D'Asseler, Yves; Lemahieu, Ignace

2006-03-01

12

Human-observer LROC study of lesion detection in Ga-67 SPECT images reconstructed using MAP with anatomical priors.  

PubMed

We compare the image quality of SPECT reconstruction with and without an anatomical prior. Area under the localization-response operating characteristic (LROC) curve is our figure of merit. Simulated images of Ga-67 citrate, a SPECT lymphoma imaging agent, were generated using the MCAT digital phantom. Reconstructed images were read by human observers. Several reconstruction strategies are compared, including rescaled block iterative (RBI) and maximum-a-posteriori (MAP) with various priors. We find that MAP reconstruction using prior knowledge of organ and lesion boundaries significantly improves lesion-detection performance (p < 0.05). Pseudo-lesion boundaries, regions without increased uptake which are incorrectly treated as prior knowledge of lesion boundaries, do not decrease performance. PMID:19412357
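The LROC figure of merit can be computed empirically from observer ratings. A schematic sketch (not the authors' analysis code; scores and localization flags below are synthetic):

```python
import numpy as np

def lroc_auc(absent_scores, present_scores, loc_correct):
    """Empirical area under the LROC curve.

    x-axis: false-positive fraction on lesion-absent images;
    y-axis: fraction of lesion-present images rated above threshold
    AND correctly localized. Unlike ROC, the curve tops out at the
    overall correct-localization rate rather than at 1."""
    absent_scores = np.asarray(absent_scores, float)
    present_scores = np.asarray(present_scores, float)
    loc_correct = np.asarray(loc_correct, bool)
    thresholds = np.sort(np.concatenate([absent_scores, present_scores]))[::-1]
    fpf, tpf = [0.0], [0.0]
    for t in thresholds:
        fpf.append(float(np.mean(absent_scores >= t)))
        tpf.append(float(np.mean((present_scores >= t) & loc_correct)))
    fpf.append(1.0)
    tpf.append(float(np.mean(loc_correct)))
    # trapezoidal integration over the (fpf, tpf) polyline
    return sum(0.5 * (tpf[i] + tpf[i - 1]) * (fpf[i] - fpf[i - 1])
               for i in range(1, len(fpf)))
```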

Lehovich, Andre; Bruyant, Philippe P; Gifford, Howard C; Schneider, Peter B; Squires, Shane; Licho, Robert; Gindi, Gene; King, Michael A

2006-01-01

13

Dry imaging cameras  

PubMed Central

Dry imaging cameras are important hard-copy devices in radiology. Using a dry imaging camera, multiformat images from digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse fields: computing, mechanics, thermodynamics, optics, electricity, and radiography. Broadly, hard-copy devices are classified as laser-based or non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-01-01

14

LROC Observations of Geologic Features in the Marius Hills  

NASA Astrophysics Data System (ADS)

Lunar volcanic cones, domes, and their associated geologic features are important objects of study for the LROC science team because they represent possible volcanic endmembers that may yield important insights into the history of lunar volcanism and are potential sources of lunar resources. Several hundred domes, cones, and associated volcanic features are currently targeted for high-resolution LROC Narrow Angle Camera [NAC] imagery [1]. The Marius Hills, located in Oceanus Procellarum (centered at ~13.4°N, 55.4°W), represent the largest concentration of these volcanic features on the Moon, including sinuous rilles, volcanic cones, domes, and depressions [e.g., 2-7]. The Marius region is thus a high priority for future human lunar exploration, as signified by its inclusion in the Project Constellation list of notional future human lunar exploration sites [8], and will be an intense focus of interest for LROC science investigations. Previous studies of the Marius Hills have utilized telescopic, Lunar Orbiter, Apollo, and Clementine imagery to study the morphology and composition of the volcanic features in the region. Complementary LROC studies of the Marius region will focus on high-resolution NAC images of specific features for studies of morphology (including flow fronts, dome/cone structure, and possible layering) and topography (using stereo imagery). Preliminary studies of the new high-resolution images of the Marius Hills region reveal small-scale features in the sinuous rilles, including possible outcrops of bedrock and lobate lava flows from the domes. The observed Marius Hills are characterized by rough surface textures, including the presence of large boulders at the summits (~3-5 m diameter), which is consistent with the radar-derived conclusions of [9].
Future investigations will involve analysis of LROC stereo photoclinometric products and coordinating NAC images with the multispectral images collected by the LROC WAC, especially the ultraviolet data, to enable measurements of color variations within and amongst deposits and provide possible compositional insights, including the location of possibly related pyroclastic deposits. References: [1] J. D. Stopar et al. (2009), LRO Science Targeting Meeting, Abs. 6039 [2] Greeley R (1971) Moon, 3, 289-314 [3] Guest J. E. (1971) Geol. and Phys. of the Moon, p. 41-53. [4] McCauley J. F. (1967) USGS Geologic Atlas of the Moon, Sheet I-491 [5] Weitz C. M. and Head J. W. (1999) JGR, 104, 18933-18956 [6] Heather D. J. et al. (2003) JGR, doi:10.1029/2002JE001938 [7] Whitford-Stark, J. L., and J. W. Head (1977) Proc. LSC 8th, 2705-2724 [8] Gruener J. and Joosten B. K. (2009) LRO Science Targeting Meeting, Abs. 6036 [9] Campbell B. A. et al. (2009) JGR, doi:10.1029/2008JE003253.

Lawrence, S.; Stopar, J. D.; Hawke, R. B.; Denevi, B. W.; Robinson, M. S.; Giguere, T.; Jolliff, B. L.

2009-12-01

15

Computer-Assisted Detection of Collapse Pits in LROC NAC Images  

NASA Astrophysics Data System (ADS)

Pits in mare basalts and impact melt deposits provide unique environments for human shelters and preservation of geologic information. Due to their steep walls, pits are most distinguishable when the Sun is high (pit walls cast shadows while impact crater walls do not). Because of the large number of NAC images acquired every day (>350), each typically with 5000 samples and 52,224 lines, it is not feasible to carefully search each image manually, so we developed a shadow detection algorithm (Pitscan) that analyzes an image in thirty seconds. It locates blocks of pixels that are below a digital number (DN) cutoff value, indicating that the block of pixels is "in shadow", and then runs a DN profile in the direction of solar lighting, comparing average DN values of the up-Sun and down-Sun sides. If the up-Sun average DN is higher than the down-Sun average, the shadow is assumed to be from a positive relief feature, and ignored. Otherwise, Pitscan saves a 200 x 200 pixel sub-image for later manual review. The algorithm currently generates ~150 false positives for each successful pit identification. This number would be unacceptable for an algorithm designed to catalog a common feature, but since the logic is merely intended to assist humans in locating an unusual type of feature, the false alarm rate is acceptable, and the current version allows a human to effectively check 10,000 NAC images for pits (over 2500 gigapixels) per hour. The false negative rate is not yet known; however, Pitscan detected every pit in a test on a small subset of the images known to contain pits. Pitscan is only effective when the Sun is within 50° of the zenith. When the Sun is closer to the horizon, crater walls often cast shadows, resulting in unacceptable numbers of false positives. Due to the Sun angle limit, only regions within 50° latitude of the equator are searchable. To date, 25.42% of the Moon has been imaged within this constraint.
Early versions of Pitscan found more than 150 small (average diameter 15 m) pits in impact melt deposits of Copernican craters [1]. More recently, improvements to the algorithm revealed two new large mare pits, similar to the three pits discovered in Kaguya images [2]. One is in Schlüter crater, a mare-filled crater near Orientale basin, with a 20 x 40 m opening, approximately 60 m deep. The second new pit is in Lacus Mortis (44.96°N, 25.61°E) in a tectonically complex region west of Burg crater. This pit is the largest mare pit found to date, with an opening approximately 100 x 150 m, and a floor more than 90 m below the surrounding terrain. Most interesting from an exploration point of view is the fact that the east wall appears to have collapsed, leaving a relatively smooth ~22° slope from the surrounding mare down to the pit floor. Computer-assisted feature detection is an effective method of locating rare features in the extremely large high-resolution NAC dataset. Pitscan enabled the discovery of unknown collapse pits both in the mare and highlands. These pits are an important resource for future surface exploration, both by providing access to pristine cross-sections of the near-surface and by providing radiation and micrometeorite shielding for human outposts. [1] Wagner, R.V. et al. (2012), LPSC XLIII, #2266 [2] Haruyama, J. et al. (2010), LPSC XLI, #1285
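The up-Sun/down-Sun test described above can be sketched in a few lines (a schematic reimplementation, not the LROC team's code; the DN values and window size are invented):

```python
import numpy as np

def shadow_is_pit_candidate(profile, s0, s1, margin=10):
    """Pitscan-style test on a 1-D DN profile taken along the solar
    illumination direction (index 0 = up-Sun end), where profile[s0:s1]
    is a block of pixels already flagged as shadowed (below a DN cutoff).

    A positive-relief feature (e.g. a boulder) has its bright, Sun-facing
    side up-Sun of its shadow, so if the up-Sun mean DN exceeds the
    down-Sun mean the shadow is rejected; otherwise it is kept for
    manual review as a possible pit."""
    up = profile[max(0, s0 - margin):s0]   # up-Sun side of the shadow
    down = profile[s1:s1 + margin]         # down-Sun side of the shadow
    return float(np.mean(up)) <= float(np.mean(down))
```

In the full system this test runs on every sub-cutoff pixel block of each NAC image, and surviving detections are cut out as 200 x 200 pixel thumbnails for human review.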

Wagner, R. V.; Robinson, M. S.

2012-12-01

16

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.
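The reconstruction principle stated above — momentum conservation gives the incident neutron direction as the vector sum of the measured fragment momenta (target ³He at rest) — can be illustrated with a short, non-relativistic sketch; this is not the NIC analysis code:

```python
import numpy as np

def neutron_direction(p_proton, p_triton):
    """Unit vector of the incident neutron direction from the measured
    proton and triton momenta (any consistent units, e.g. MeV/c)
    produced in a 3He(n,p)3H interaction.

    Momentum conservation with the 3He target at rest gives
    p_n = p_p + p_t; normalizing yields the direction.
    """
    p_n = np.asarray(p_proton, float) + np.asarray(p_triton, float)
    return p_n / np.linalg.norm(p_n)
```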

Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

2008-01-01

17

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

2010-01-01

18

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident direction of fast neutrons, E(sub N) > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

19

Marius Hills: Surface Roughness from LROC and Mini-RF  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Team is collecting hundreds of high-resolution (0.5 m/pixel) Narrow Angle Camera (NAC) images of lunar volcanic constructs (domes, “cones”, and associated features) [1,2]. Marius Hills represents the largest concentration of volcanic features on the Moon and is a high-priority target for future exploration [3,4]. NAC images of this region provide new insights into the morphology and geology of specific features at the meter scale, including lava flow fronts, tectonic features, layers, and topography (using LROC stereo imagery) [2]. Here, we report initial results from Mini-RF and LROC collaborative studies of the Marius Hills. Mini-RF uses a hybrid polarimetric architecture to measure surface backscatter characteristics and can acquire data in one of two radar bands, S (12 cm) or X (4 cm) [5]. The spatial resolution of Mini-RF (15 m/pixel) enables correlation of features observed in NAC images to Mini-RF data. Mini-RF S-Band zoom-mode data and daughter products, such as circular polarization ratio (CPR), were directly compared to NAC images. Mini-RF S-Band radar images reveal enhanced radar backscatter associated with volcanic constructs in the Marius Hills region. Mini-RF data show that Marius Hills volcanic constructs have enhanced average CPR values (0.5-0.7) compared to the CPR values of the surrounding mare (~0.4). This result is consistent with the conclusions of [6], and implies that the lava flows comprising the domes in this region are blocky. To quantify the surface roughness [e.g., 6,7] block populations associated with specific geologic features in the Marius Hills region are being digitized from NAC images. Only blocks that can be unambiguously identified (>1 m diameter) are included in the digitization process, producing counts and size estimates of the block population. High block abundances occur mainly at the distal ends of lava flows. 
The average size of these blocks is 9 m, and 50% of observed blocks are between 9-12 m in diameter. These blocks are not associated with impact craters and have at most a thin layer of regolith. There is minimal visible evidence for downslope movement. Relatively high block abundances are also seen on the summits of steep-sided asymmetrical positive relief features (“cones”) atop low-sided domes. Digitization efforts will continue as we study the block populations of different geologic features in the Marius Hills region and correlate the results with Mini-RF data, which will provide new information about the emplacement of volcanic features in the region. [1] J.D. Stopar et al., LPI Contribution 1483 (2009) 93-94. [2] S.J. Lawrence et al. (2010) LPSC 41 #1906. [2] S.J. Lawrence et al. (2010) LPSC 41 # 2689. [3] C. Coombs & B.R. Hawke (1992) 2nd Proc. Lun. Bases & Space Act. 21st Cent pp. 219-229. [4]J.Gruener and B. Joosten (2009) LPI Contributions 1483 50-51. [5] D.B.J. Bussey et al. (2010) LPSC 41 # 2319. [6] B.A. Campbell et al. (2009) JGR-Planets, 114, 01001. [7] S.W. Anderson et al. (1998) GSA Bull, 110, 1258-1267.
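The circular polarization ratio (CPR) daughter product used above is, for hybrid-polarimetric radar such as Mini-RF, commonly computed from the first and fourth Stokes parameters of the received echo. A sketch; the sign convention for S4 depends on the transmitted circular polarization and is an assumption here:

```python
import numpy as np

def circular_polarization_ratio(s1, s4):
    """CPR from Stokes parameters of a hybrid-polarimetric radar echo.

    Same-sense circular power     SC = (S1 - S4) / 2
    Opposite-sense circular power OC = (S1 + S4) / 2
    CPR = SC / OC = (S1 - S4) / (S1 + S4)

    Smooth surfaces mostly flip the transmitted circular sense
    (CPR << 1); rough, blocky surfaces return enhanced same-sense
    power, pushing CPR toward or above 1.
    """
    s1 = np.asarray(s1, float)
    s4 = np.asarray(s4, float)
    return (s1 - s4) / (s1 + s4)
```

With this convention, the enhanced CPR values quoted above (0.5-0.7 over the constructs vs. ~0.4 over surrounding mare) correspond to relatively more same-sense backscatter from blocky flows.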

Lawrence, S.; Hawke, B. R.; Bussey, B.; Stopar, J. D.; Denevi, B.; Robinson, M.; Tran, T.

2010-12-01

20

Digital Elevation Models and Derived Products from Lroc Nac Stereo Observations  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to acquire stereo observations with the Narrow Angle Camera (NAC) to enable production of high resolution digital elevation models (DEMs). This work describes the processes and techniques used in reducing the NAC stereo observations to DEMs through a combination of the USGS Integrated Software for Imagers and Spectrometers (ISIS) and SOCET SET® from BAE Systems by a team at Arizona State University (ASU). LROC Science Operations Center personnel have thus far reduced more than 130 stereo observations to DEMs, covering 11 Constellation Program (CxP) sites and 53 other regions of scientific interest. The NAC DEM spatial sampling is typically 2 meters, and the vertical precision is 1-2 meters. Such high resolution provides the three-dimensional view of the lunar surface required for site selection, hazard avoidance, and planning traverses that minimize resource consumption. In addition to exploration analysis, geologists can measure parameters such as elevation, slope, and volume to place constraints on composition and geologic history. The NAC DEMs are released and archived through NASA's Planetary Data System.
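As an illustration of the kind of derived product mentioned above, a slope map can be computed from a gridded DEM by finite differences. This is a minimal sketch using the 2 m spatial sampling quoted for NAC DEMs, not the ISIS/SOCET SET pipeline:

```python
import numpy as np

def slope_degrees(dem, pixel_scale=2.0):
    """Slope map in degrees from a DEM via central differences.

    dem         : 2-D elevation array (meters)
    pixel_scale : ground distance per pixel (meters); 2 m is the
                  typical NAC DEM spatial sampling quoted above.
    """
    dz_dy, dz_dx = np.gradient(dem, pixel_scale)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
```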

Burns, K. N.; Speyerer, E. J.; Robinson, M. S.; Tran, T.; Rosiek, M. R.; Archinal, B. A.; Howington-Kraus, E.; the LROC Science Team

2012-08-01

21

Intelligent thermal imaging camera with network interface  

NASA Astrophysics Data System (ADS)

In recent years, a significant increase in the use of thermal imaging cameras can be observed in both the public and commercial sectors, due to the lower cost and expanding availability of uncooled microbolometer infrared radiation detectors. Devices on the market vary in their parameters and output interfaces. However, all these thermographic cameras are only a source of an image, which is then analyzed in an external image processing unit; there is no possibility to run a user's dedicated image processing algorithms on the thermal imaging camera itself. This paper presents a concept, architecture, and hardware implementation of an "intelligent thermal imaging camera with network interface" that combines modern technologies, standards, and approaches in a single device.

Sielewicz, Krzysztof M.; Kasprowicz, Grzegorz; Poźniak, Krzysztof T.; Romaniuk, R. S.

2011-10-01

22

Uncertainty Analysis of LROC NAC Derived Elevation Models  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) [1] is to gather stereo observations with the Narrow Angle Camera (NAC) to generate digital elevation models (DEMs). From an altitude of 50 km, the NAC acquires images with a pixel scale of 0.5 meters, and a dual NAC observation covers approximately 5 km cross-track by 25 km down-track. This low altitude was common from September 2009 to December 2011. Images acquired during the commissioning phase and those acquired from the fixed orbit (after 11 December 2011) have pixel scales that range from 0.35 meters at the south pole to 2 meters at the north pole. Altimetric observations obtained by the Lunar Orbiter Laser Altimeter (LOLA) provide range measurements accurate to ±0.1 m between the spacecraft and the surface [2]. However, uncertainties in the spacecraft positioning can result in offsets (±20 m) between altimeter tracks over many orbits. The LROC team is currently developing a tool to automatically register altimetric observations to NAC DEMs [3]. Using a generalized pattern search (GPS) algorithm, the new automatic registration adjusts the spacecraft position and pointing information during times when NAC images, as well as LOLA measurements, of the same region are acquired to provide an absolute reference frame for the DEM. This information is then imported into SOCET SET to aid in creating controlled NAC DEMs. For every DEM, a figure of merit (FOM) map is generated using SOCET SET software. This is a valuable tool for determining the relative accuracy of a specific pixel in a DEM. Each pixel in a FOM map is assigned a "quality" value indicating whether the specific pixel was shadowed, saturated, suspicious, interpolated/extrapolated, or successfully correlated. The overall quality of a NAC DEM is a function of both the absolute and relative accuracies. LOLA altimetry provides the most accurate absolute geodetic reference frame with which the NAC DEMs can be compared.
Offsets between LOLA profiles and NAC DEMs are used to quantify the absolute accuracy. Small lateral movements in the LOLA points coupled with large changes in topography contribute to sizeable offsets between the datasets. The steep topography of Lichtenberg Crater provides an example of the offsets in the LOLA data. Ten tracks that cross the region of interest were used to calculate the offset, with a root mean square (RMS) error of 9.67 m, an average error of 7.02 m, and a standard deviation of 9.61 m. Large areas (>375 sq. km) covered by a mosaic of NAC DEMs were compared to the Wide Angle Camera (WAC) derived Global Lunar DTM 100 m topographic model (GLD100) [4]. The GLD100 has a pixel scale of 100 m; therefore, the NAC DEMs were downsampled to calculate the offsets between the two datasets. When comparing NAC DEMs to WAC DEMs, the vertical offsets were as follows [site name (average offset in meters, standard deviation in meters)]: Lichtenberg Crater (-7.74, 20.49), Giordano Bruno (-5.31, 28.80), Hortensius Domes (-3.52, 16.00), and Reiner Gamma (-0.99, 14.11). Resources: [1] Robinson et al. (2010) Space Sci. Rev. [2] Smith et al. (2010) Space Sci. Rev. [3] Speyerer et al. (2012) European Lunar Symp. [4] Scholten et al. (2012) JGR-Planets.
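The offset statistics quoted above (RMS, average, and standard deviation of LOLA-minus-DEM elevation differences) are straightforward to compute; a sketch:

```python
import numpy as np

def offset_statistics(lola_z, dem_z):
    """Summary statistics for elevation offsets between co-located
    LOLA altimetry points and DEM elevations, as used to quantify
    absolute DEM accuracy.

    Returns (rms, mean, std) of the differences lola_z - dem_z.
    """
    d = np.asarray(lola_z, float) - np.asarray(dem_z, float)
    rms = np.sqrt(np.mean(d ** 2))
    return rms, d.mean(), d.std()
```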

Burns, K.; Yates, D. G.; Speyerer, E.; Robinson, M. S.

2012-12-01

23

On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.  

NASA Astrophysics Data System (ADS)

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially, the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, perceptible correlation between LOLA and LROC-DTM estimates appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al.
Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
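The roughness definition used above — simple RMS after slope correction — can be sketched as a least-squares plane fit over the elevations inside one footprint, followed by the RMS of the residuals. An illustrative implementation, not the authors' code:

```python
import numpy as np

def footprint_roughness(z):
    """RMS surface roughness after slope correction.

    z : 2-D array of DTM elevations (meters) clipped to one LOLA
        footprint. A best-fit plane (the local slope) is removed by
        least squares, and the RMS of the residuals is returned.
    """
    z = np.asarray(z, float)
    ny, nx = z.shape
    y, x = np.mgrid[:ny, :nx]
    # Design matrix for z ~ a*x + b*y + c
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
    coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    residual = z.ravel() - A @ coeffs
    return np.sqrt(np.mean(residual ** 2))
```

A perfectly planar (sloped but smooth) patch yields zero roughness; only deviations from the plane contribute.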

Muller, Jan-Peter; Poole, William

2013-04-01

24

Image dissector camera system study  

NASA Technical Reports Server (NTRS)

Various aspects of a rendezvous and docking system using an image dissector detector, as compared to a GaAs detector, are discussed. Investigation into a gimbaled scanning system is also covered, and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.
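Tone ranging of the kind described works by measuring the round-trip phase shift of each amplitude-modulation tone; multiple tones of different frequencies resolve the single-tone ambiguity. A sketch of the basic relations (the abstract does not give the tone frequencies, so the example value in the test is an assumption):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tone_phase(phase_shift_rad, tone_freq_hz):
    """One-way range from the round-trip phase shift of a single
    amplitude-modulation tone: R = c * dphi / (4 * pi * f)."""
    return C * phase_shift_rad / (4.0 * math.pi * tone_freq_hz)

def unambiguous_range(tone_freq_hz):
    """Maximum range a single tone can measure without ambiguity,
    c / (2 f); lower-frequency tones extend this, higher-frequency
    tones refine precision, which is why several tones are combined."""
    return C / (2.0 * tone_freq_hz)
```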

Howell, L.

1984-01-01

25

Apogee Imaging Systems Camera Installation Guide  

E-print Network

camera 4) Begin using your Apogee camera! 1.1 Install MaxIm DL/CCD Your Apogee camera may have included a copy of MaxIm DL/CCD image capture and processing software, developed by Diffraction Limited. If you will be using the MaxIm DL/CCD software, we recommend that you install it prior to setting up your Apogee camera

Kleinfeld, David

26

LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin  

NASA Astrophysics Data System (ADS)

We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones. 
We will show examples of LROC data including those for Constellation sites on the SPA rim and interior, a site between Bose and Alder Craters, sites east of Bhabha Crater, and sites on and near the “Mafic Mound” [see Pieters et al., this conference]. Together the LROC data and complementary products provide essential information for ensuring identification of safe landing and sampling sites within SPA basin that has never before been available for a planetary mission.

Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

2010-12-01

27

The European Photon Imaging Camera on XMM-Newton: The MOS cameras : The MOS cameras  

Microsoft Academic Search

The EPIC focal plane imaging spectrometers on XMM-Newton use CCDs to record the images and spectra of celestial X-ray sources focused by the three X-ray mirrors. There is one camera at the focus of each mirror; two of the cameras contain seven MOS CCDs, while the third uses twelve PN CCDs, defining a circular field of view of 30' diameter

M. J. L. Turner; A. Abbey; M. Arnaud; M. Balasini; M. Barbera; E. Belsole; P. J. Bennie; J. P. Bernard; G. F. Bignami; M. Boer; U. Briel; I. Butler; C. Cara; C. Chabaud; R. Cole; A. Collura; M. Conte; A. Cros; M. Denby; P. Dhez; G. Di Coco; J. Dowson; P. Ferrando; S. Ghizzardi; F. Gianotti; C. V. Goodall; L. Gretton; R. G. Griffiths; O. Hainaut; J. F. Hochedez; A. D. Holland; E. Jourdain; E. Kendziorra; A. Lagostina; R. Laine; N. La Palombara; M. Lortholary; D. Lumb; P. Marty; S. Molendi; C. Pigot; E. Poindron; K. A. Pounds; J. N. Reeves; C. Reppin; R. Rothenflug; P. Salvetat; J. L. Sauvageot; D. Schmitt; S. Sembay; A. D. T. Short; J. Spragg; J. Stephen; L. Strüder; A. Tiengo; M. Trifoglio; J. Trümper; S. Vercellone; L. Vigroux; G. Villa; M. J. Ward; S. Whitehead; E. Zonca

2001-01-01

28

Lroc Observations of Permanently Shadowed Regions: Seeing into the Dark  

NASA Astrophysics Data System (ADS)

Permanently shadowed regions (PSRs) near the lunar poles that receive secondary illumination from nearby Sun facing slopes were imaged by the Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NAC). Typically secondary lighting is optimal in polar areas around respective solstices and when the LRO orbit is nearly coincident with the sub-solar point (low spacecraft beta angles). NAC PSR images provide the means to search for evidence of surface frosts and unusual morphologies from ice rich regolith, and aid in planning potential landing sites for future in-situ exploration. Secondary illumination imaging in PSRs requires NAC integration times typically more than ten times greater than nominal imaging. The increased exposure time results in downtrack smear that decreases the spatial resolution of the NAC PSR images. Most long exposure NAC images of PSRs were acquired with exposure times of 24.2-ms (1-m by 40-m pixels, sampled to 20-m) and 12-ms (1-m by 20-m, sampled to 10-m). The initial campaign to acquire long exposure NAC images of PSRs in the north pole region ran from February 2013 to April 2013. Relative to the south polar region, PSRs near the north pole are generally smaller (D<24-km) and located in simple craters. Long exposure NAC images of PSRs in simple craters are often well illuminated by secondary light reflected from Sun-facing crater slopes during the northern summer solstice, allowing many PSRs to be imaged with the shorter exposure time of 12-ms (resampled to 10-m). With the exception of some craters in Peary crater, most northern PSRs with diameters >6-km were successfully imaged (e.g., Whipple, Hermite A, and Rozhestvenskiy U). The third PSR south polar campaign began in April 2013 and will continue until October 2013. The third campaign will expand previous NAC coverage of PSRs and follow up on discoveries with new images of higher signal to noise ratio (SNR), higher resolution, and varying secondary illumination conditions.
Utilizing previous campaign images and the Sun's position, this campaign takes an individualized approach to targeting each crater. Secondary lighting within the PSRs, though somewhat diffuse, arrives at low incidence angles, which, coupled with nadir NAC imaging, results in large phase angles. Such conditions tend to reduce albedo contrasts, complicating identification of patchy frost or ice deposits. Within the long exposure PSR images, a few small craters (D<200-m) with highly reflective ejecta blankets have been identified and interpreted as small fresh impact craters. Sylvester N and Main L are Copernican-age craters with PSRs; NAC images reveal debris flows, boulders, and morphologically fresh interior walls indicative of their young age. The identification of albedo anomalies associated with these fresh craters and debris flows indicates that strong albedo contrasts (~2x) associated with small fresh impact craters can be distinguished in PSRs. Lunar highland material has an albedo of ~0.2, while pure water frost has an albedo of ~0.9. If features in PSRs have an albedo similar to lunar highlands, significant surface frost deposits could result in detectable reflective anomalies in the NAC images. However, no reflective anomalies attributable to frost have thus far been identified in PSRs.

Koeber, S. D.; Robinson, M. S.

2013-12-01

29

Heart imaging by cadmium telluride gamma camera  

Microsoft Academic Search

Cadmium telluride semiconductor detectors (CdTe) operating at room temperature are attractive for medical imaging because of their good energy resolution providing excellent spatial and contrast resolution. The compactness of the detection system allows the building of small light camera heads which can be used for bedside imaging. A mobile pixellated gamma camera based on 2304 CdTe (pixel size: 3×3mm, field

Ch. Scheiber; B. Eclancher; J. Chambron; V. Prat; A. Kazandjan; A. Jahnke; R. Matz; S. Thomas; S. Warren; M. Hage-Hali; R. Regal; P. Siffert; M. Karman

1999-01-01

30

High performance imaging using large camera arrays  

Microsoft Academic Search

The advent of inexpensive digital image sensors and the ability to create photographs that combine information from a number of sensed images are changing the way we think about photography. In this paper, we describe a unique array of 100 custom video cameras that we have built, and we summarize our experiences using this array in a range of imaging

Bennett Wilburn; Neel Joshi; Vaibhav Vaish; Eino-Ville Talvala; Emilio R. Antúnez; Adam Barth; Andrew Adams; Mark Horowitz; Marc Levoy

2005-01-01

31

Digital image processing of metric camera imagery  

Microsoft Academic Search

The use of digitized Spacelab metric camera imagery for map updating is demonstrated for an area of Germany featuring agricultural and industrial areas, and a region of the White Nile. LANDSAT and Spacelab images were combined, and digital image processing techniques used for image enhancement. Updating was achieved by semiautomatic techniques, but for many applications manual editing may be feasible.

P. Lohmann

1985-01-01

32

Classroom multispectral imaging using inexpensive digital cameras.  

NASA Astrophysics Data System (ADS)

The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360-1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation on either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board-mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.
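Multispectral exercises of the kind described typically convert raw camera digital numbers (DN) to approximate reflectance by ratioing against a reference target of known albedo imaged in the same scene. A hedged sketch of that calibration step; the panel approach and albedo value are illustrative, not taken from the abstract:

```python
import numpy as np

def reflectance(dn_target, dn_reference, ref_albedo=0.99):
    """Approximate reflectance of a target from raw camera DN.

    dn_target    : DN values of the surface of interest
    dn_reference : DN values of a reference panel in the same scene,
                   under the same illumination and filter
    ref_albedo   : known panel albedo (assumed here; e.g. ~0.99 for
                   a white calibration standard)
    Ratioing cancels the unknown illumination and camera gain, so
    reflectance curves can be compared across filter bands.
    """
    return ref_albedo * np.asarray(dn_target, float) / np.asarray(dn_reference, float)
```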

Fortes, A. D.

2007-12-01

33

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

34

Improvement of passive THz camera images  

NASA Astrophysics Data System (ADS)

Terahertz technology is one of the emerging technologies that has the potential to change our lives. There are many attractive applications in fields like security, astronomy, biology, and medicine. Until recent years, terahertz (THz) waves were an undiscovered, or more importantly, an unexploited region of the electromagnetic spectrum. The reasons for this were the difficulties in generating and detecting THz waves. Recent advances in hardware technology have started to open up the field to new applications such as THz imaging. THz waves can penetrate various materials; however, automated processing of THz images can be challenging. The THz frequency band is especially suited to imaging through clothing because the radiation has no harmful ionizing effects and is therefore safe for human beings. Strong technology development in this band has produced a few interesting devices. Even though the development of THz cameras is an emerging topic, commercially available passive cameras still offer images of poor quality, mainly because of their low resolution and low detector sensitivity. THz image processing is therefore a challenging and urgent topic, and digital THz image processing is a promising and cost-effective approach for demanding security and defense applications. In this article we demonstrate the results of image quality enhancement and image fusion applied to images captured by a commercially available passive THz camera, using various combined methods. Our research is focused on the detection of dangerous objects - guns, knives, and bombs - hidden under common types of clothing.

Kowalski, Marcin; Piszczek, Marek; Palka, Norbert; Szustakowski, Mieczyslaw

2012-10-01

35

Camera Trajectory fromWide Baseline Images  

NASA Astrophysics Data System (ADS)

Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions including MSER, Harris Affine, and Hessian Affine in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAFs) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used because the view point can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real-time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry.
We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendin
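The two-parameter radial model and the back-projection of image points to optical rays described above can be sketched as follows. The function names and the sample calibration constants are illustrative assumptions, not values from the authors' implementation.

```python
import math

def radius_to_angle(r, a, b):
    """Mičušík two-parameter model: theta = a*r / (1 + b*r**2),
    mapping image radius r (pixels) to the ray angle theta (radians)
    measured from the optical axis."""
    return a * r / (1.0 + b * r * r)

def pixel_to_ray(u, v, cx, cy, a, b):
    """Back-project an image point (u, v) to a unit 3D ray in the
    camera frame, given the distortion centre (cx, cy) and the model
    parameters a, b (all hypothetical values here)."""
    du, dv = u - cx, v - cy
    r = math.hypot(du, dv)
    if r == 0.0:
        return (0.0, 0.0, 1.0)          # point on the optical axis
    theta = radius_to_angle(r, a, b)
    s = math.sin(theta) / r             # scale for the radial direction
    return (du * s, dv * s, math.cos(theta))
```

After this mapping, each 2D match corresponds to a pair of 3D rays, which is what the epipolar-geometry RANSAC in the abstract operates on.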

Havlena, M.; Torii, A.; Pajdla, T.

2008-09-01

36

Imaging characteristics of photogrammetric camera systems  

USGS Publications Warehouse

In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were established, along with procedures for analyzing image quality and predicting and comparing performance capabilities. © 1973.

Welch, R.; Halliday, J.

1973-01-01

37

COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES  

EPA Science Inventory

COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES T Martonen1 and J Schroeter2 1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

38

Imaging spectrometer/camera having convex grating  

NASA Technical Reports Server (NTRS)

An imaging spectrometer has fore-optics coupled to a spectral resolving system with an entrance slit extending in a first direction at an imaging location of the fore-optics for receiving the image, a convex diffraction grating for separating the image into a plurality of spectra of predetermined wavelength ranges; a spectrometer array for detecting the spectra; and at least one concave spherical mirror concentric with the diffraction grating for relaying the image from the entrance slit to the diffraction grating and from the diffraction grating to the spectrometer array. In one embodiment, the spectrometer is configured in a lateral mode in which the entrance slit and the spectrometer array are displaced laterally on opposite sides of the diffraction grating in a second direction substantially perpendicular to the first direction. In another embodiment, the spectrometer is combined with a polychromatic imaging camera array disposed adjacent said entrance slit for recording said image.

Reininger, Francis M. (Inventor)

2000-01-01

39

Beta camera low activity tumor imaging.  

PubMed

A new technique, the beta camera, to complement film autoradiography with fast quantitative imaging of beta particle-emitting radionuclides has been developed. It consists of a thin plastic scintillator and a light-sensitive microchannel plate detector. The thin tissue sample is mounted on the scintillator. Our first system had a high background and a moderate spatial resolution of 900 microns. We now report an improved system with a photomultiplier tube mounted on the scintillator of the microchannel plate detector. Only events registered by both detectors are accepted. A fast coincidence unit processes the signals, and if a time overlap exists, an event is generated in the beta camera. In the coincidence mode, images with a low activity distribution of 201Tl (count rate 1 s-1) in 50-micron-thick slices of a human glioma tumor could be recorded with a spatial resolution of 500 microns. PMID:8305238
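The coincidence gating described above (accepting only events registered by both detectors within a time overlap) can be sketched roughly as below. The two-pointer scan, the sorted event lists, and the time units are assumptions for illustration, not details from the paper.

```python
def coincidence_events(mcp_times, pmt_times, window):
    """Keep only microchannel-plate events that overlap a
    photomultiplier event within +/- window (same time units for
    all arguments). Both input lists are assumed sorted ascending;
    a simple two-pointer scan avoids quadratic comparison."""
    accepted = []
    j = 0
    for t in mcp_times:
        # advance the PMT pointer past events too early to coincide
        while j < len(pmt_times) and pmt_times[j] < t - window:
            j += 1
        if j < len(pmt_times) and abs(pmt_times[j] - t) <= window:
            accepted.append(t)
    return accepted
```

In a real system this logic is implemented in hardware by the fast coincidence unit; the sketch only mirrors the accept/reject decision.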

Ljunggren, K; Strand, S E; Ceberg, C P; Sjöholm, H; Elmqvist, D; Brun, A; Salford, L G

1993-01-01

40

Why do the image widths from the various cameras change?  

Atmospheric Science Data Center

The nadir camera Level 1 images have a swath width of about 376 km, compared to 414 km ... viewing the same Earth area with a more oblique-pointing camera the focal length must be greater in order to preserve resolution. All of the camera images are 1504 pixels wide, and the focal lengths of the D, C, B, and ...

2014-12-08

41

Cervical SPECT Camera for Parathyroid Imaging  

SciTech Connect

Primary hyperparathyroidism, characterized by one or more enlarged parathyroid glands, has become one of the most common endocrine diseases in the world, affecting about 1 per 1000 in the United States. Standard treatment is a highly invasive exploratory neck surgery called "parathyroidectomy". The surgery has a notable mortality rate because of the close proximity to vital structures. The move to minimally invasive parathyroidectomy is hampered by the lack of high resolution pre-surgical imaging techniques that can accurately localize the parathyroid with respect to surrounding structures. We propose to develop a dedicated ultra-high resolution (~1 mm) and high sensitivity (10x conventional camera) cervical scintigraphic imaging device. It will be based on a multiple pinhole-camera SPECT system comprising a novel solid state CZT detector that offers the required performance. The overall system will be configured to fit around the neck and comfortably image a patient.

None

2012-08-31

42

New insight into lunar impact melt mobility from the LRO camera  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) is systematically imaging impact melt deposits in and around lunar craters at meter and sub-meter scales. These images reveal that lunar impact melts, although morphologically similar to terrestrial lava flows of similar size, exhibit distinctive features (e.g., erosional channels). Although generated in a single rapid event, the post-impact mobility and morphology of lunar impact melts are surprisingly complex. We present evidence for multi-stage influx of impact melt into flow lobes and crater floor ponds. Our volume and cooling time estimates for the post-emplacement melt movements noted in LROC images suggest that new flows can emerge from melt ponds an extended time period after the impact event.

Bray, Veronica J.; Tornabene, Livio L.; Keszthelyi, Laszlo P.; McEwen, Alfred S.; Hawke, B. Ray; Giguere, Thomas A.; Kattenhorn, Simon A.; Garry, William B.; Rizk, Bashar; Caudill, C.M.; Gaddis, Lisa R.; van der Bogert, Carolyn H.

2010-01-01

43

Tectonic Mapping of Mare Frigoris Using Lunar Reconnaissance Orbiter Camera Images  

NASA Astrophysics Data System (ADS)

Conventional wisdom has been that extensional tectonism on the Moon largely ended ~3.6 billion years ago and that contractional deformation ended ~1.2 billion years ago. New NASA Lunar Reconnaissance Orbiter Camera (LROC) high resolution images are forcing a re-assessment of this view. Mapping in Mare Frigoris and the surrounding area has revealed many tectonic landforms enabling new investigations of the region's structural evolution. Sinuous wrinkle ridges with hundreds of meters of relief are interpreted as folded basalt layers overlying thrust faults. They have often been associated with lunar mascons identified by positive free-air gravity anomalies where thick basaltic lava causes flexure and subsidence to form ridges. No mascon-like gravity anomaly is associated with Mare Frigoris, yet large ridges deform the mare basalts. Lobate scarps are also found near Frigoris. These asymmetric linear hills inferred to be surface expressions of thrust faults are distributed globally and thought to originate from cooling and radial contraction of the lunar interior. Clusters of meter-scale extensional troughs or graben bounded by normal faults also occur in Frigoris. Tectonic landforms are being mapped in and around Mare Frigoris using LROC Narrow Angle Camera (NAC) images. Preliminary results show that wrinkle ridges in Frigoris occur both near and distal to the basin perimeter, trend E/W in western and central Frigoris, and form a polygonal pattern in the eastern section. Several complex wrinkle ridges are observed to transition into morphologically simpler lobate scarps at mare/highland boundaries, with the contrast in tectonic morphology likely due to the change from layered (mare) to un-layered (highlands) substrate. Lobate scarps in Frigoris occur primarily in the highlands, tend to strike E/W, and often but not always follow the boundary between mare and highlands. 
Small graben mapped in Frigoris occur in several clusters adjacent to or atop ridges and scarps, and are often oriented nearly parallel or perpendicular to the nearest ridge or scarp. Ridges are generally thought to form shortly after lava emplacement (in Frigoris, ~2.6-3.8 Ga). Conversely, lobate scarps are estimated to be < 1.0 Ga. If deformation across ridge-scarp transitions was concurrent, either a) some wrinkle ridges are younger than previous estimates, b) some lobate scarps are older than 1 Ga, or c) late-stage compression reactivated pre-existing mare ridges near the basin margin and thrust faults extended into the highlands to form scarps. The crisp undegraded morphology of many ridges and scarps, coupled with the presence of shallow unfilled graben, suggest some deformation within Mare Frigoris occurred recently, perhaps within 50 Ma. These new populations of lobate scarps, wrinkle ridges, and graben discovered in Mare Frigoris imply a more complex and longer-lasting history of lunar crustal deformation.

Williams, N. R.; Bell, J. F.; Watters, T. R.; Banks, M. E.; Robinson, M. S.

2012-12-01

44

Digital Camera Identification from Images Estimating False Acceptance Probability  

E-print Network

…time and date, to detect image forgeries and manipulations, reverse-engineer cameras, and more… introduced a large number of image forensic tools [4] that can reveal forgeries. Forensic analysis… Digital Camera Identification from Images – Estimating False Acceptance Probability. Miroslav Goljan

Fridrich, Jessica

45

Image Alignment for Precise Camera Fixation and Aim  

Microsoft Academic Search

Two important problems in camera control are how to keep a moving camera fixated on a target point, and how to precisely aim a camera, whose approximate pose is known, towards a given 3D position. This paper describes how electronic image alignment techniques can be used to solve these problems, as well as provide other benefits such as stabilized video.

Lambert E. Wixson; Jayakrishnan Eledath; Michael W. Hansen; Robert Mandelbaum; Deepam Mishra

1998-01-01

46

A Survey on Digital Camera Image Forensic Methods  

Microsoft Academic Search

There are two main interests in Digital Camera Image Forensics, namely source identification and forgery detection. In this paper, we first briefly provide an introduction to the major processing stages inside a digital camera and then review several methods for source digital camera identification and forgery detection. Existing methods for source identification explore the various processing stages inside a digital

Tran Van Lanh; Kai-sen Chong; Sabu Emmanuel; Mohan S. Kankanhalli

2007-01-01

47

Control Camera and Light Source Positions using Image Gradient Information  

E-print Network

Control Camera and Light Source Positions using Image Gradient Information Eric Marchand Abstract-- In this paper, we propose an original approach to control camera position and/or lighting conditions and maximizing the contrast in the image. Solutions are proposed to consider either a static light and a moving

Paris-Sud XI, Université de

48

Photorealistic image synthesis and camera validation from 2D images  

NASA Astrophysics Data System (ADS)

This paper presents a new 3D scene reconstruction technique using the Unity 3D game engine. The method presented here allows us to reconstruct the shapes of both simple and more complex objects from multiple 2D images, including infrared and digital images for indoor scenes and only digital images for outdoor scenes, and then add the reconstructed objects to the simulated scene created in Unity 3D; these scenes are then validated against real-world scenes. The method used different camera settings and explores different properties in the reconstructions of the scenes, including light, color, texture, shape, and different views. To achieve the highest possible resolution, it was necessary to extract partial textures from visible surfaces. To recover the 3D shape and depth of simple objects that can be represented by geometric bodies, their geometric characteristics were used. To estimate the depth of more complex objects, the triangulation method was used; for this, the intrinsic and extrinsic parameters were calculated using geometric camera calibration. The methods mentioned above were implemented using the Matlab tool. The technique presented here also lets us simulate short simple videos by reconstructing a sequence of multiple scenes of the video separated by small margins of time. To measure the quality of the reconstructed images and video scenes, the Fast Low Band Model (FLBM) metric from the Video Quality Measurement (VQM) software was used. Low-bandwidth perception-based features include edges and motion.
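The triangulation step mentioned above is not spelled out in the abstract; a standard linear (DLT) two-view triangulation is one common way to implement it, sketched here under the assumption that the projection matrices come from the calibrated intrinsic and extrinsic parameters.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two 3x4
    camera projection matrices P1, P2 and a matched image point
    pair x1 = (u1, v1), x2 = (u2, v2). The homogeneous solution is
    the right singular vector for the smallest singular value."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # de-homogenize
```

With noisy matches the same construction is used, but the SVD then returns a least-squares solution rather than an exact intersection.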

Santos Ferrer, Juan C.; González Chévere, David; Manian, Vidya

2014-06-01

49

Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics  

NASA Astrophysics Data System (ADS)

When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become a more and more important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are done against the application programming interfaces of the different operating systems. Finally, the results are evaluated and conclusions are made. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
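The paper's actual combination formula is not given in the abstract. As a hedged illustration only, one plausible way to fold normalized quality and speed metrics into a single benchmarking score is a weighted geometric mean; the metric names, scores, and weights below are hypothetical.

```python
import math

def combined_score(metrics, weights):
    """Combine normalized per-metric scores (each in (0, 1]) into a
    single benchmark value via a weighted geometric mean, so a very
    poor result on any one metric drags the total down."""
    total_w = sum(weights.values())
    log_sum = sum(w * math.log(metrics[name]) for name, w in weights.items())
    return math.exp(log_sum / total_w)

# Hypothetical example mixing quality metrics with a speed metric.
scores = {"sharpness": 0.9, "visual_noise": 0.8, "shot_to_shot_time": 0.6}
weights = {"sharpness": 2.0, "visual_noise": 1.0, "shot_to_shot_time": 1.0}
```

A weighted arithmetic mean is the other obvious choice; the geometric mean is shown because it penalizes imbalanced results, which matches the spirit of combining quality with speed.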

Peltoketo, Veli-Tapani

2014-11-01

50

NV-CMOS HD camera for day/night imaging  

NASA Astrophysics Data System (ADS)

SRI International (SRI) has developed a new multi-purpose day/night video camera with low-light imaging performance comparable to an image intensifier, while offering the size, weight, ruggedness, and cost advantages enabled by the use of SRI's NV-CMOS HD digital image sensor chip. The digital video output is ideal for image enhancement, sharing with others through networking, video capture for data analysis, or fusion with thermal cameras. The camera provides Camera Link output with HD/WUXGA resolution of 1920 x 1200 pixels operating at 60 Hz. Windowing to smaller sizes enables operation at higher frame rates. High sensitivity is achieved through use of backside illumination, providing high Quantum Efficiency (QE) across the visible and near infrared (NIR) bands (peak QE >90%), as well as projected low noise (<2 h+) readout. Power consumption is minimized in the camera, which operates from a single 5V supply. The NV-CMOS HD camera provides a substantial reduction in size, weight, and power (SWaP), ideal for SWaP-constrained day/night imaging platforms such as UAVs, ground vehicles, and fixed mount surveillance, and may be reconfigured for mobile soldier operations such as night vision goggles and weapon sights. In addition, the camera with the NV-CMOS HD imager is suitable for high performance digital cinematography/broadcast systems, biofluorescence/microscopy imaging, day/night security and surveillance, and other high-end applications which require HD video imaging with high sensitivity and wide dynamic range. The camera comes with an array of lens mounts including C-mount and F-mount. The latest test data from the NV-CMOS HD camera will be presented.

Vogelsong, T.; Tower, J.; Sudol, Thomas; Senko, T.; Chodelka, D.

2014-06-01

51

Imaging Emission Spectra with Handheld and Cellphone Cameras  

ERIC Educational Resources Information Center

As point-and-shoot digital camera technology advances, it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon…

Sitar, David

2012-01-01

52

Automatic Camera Recovery for Closed or Open Image Sequences  

Microsoft Academic Search

We describe progress in completely automatically recovering 3D scene structure together with 3D camera positions from a sequence of images acquired by an unknown camera undergoing unknown movement. The main departure from previous structure from motion strategies is that processing is not sequential. Instead, a hierarchical approach is employed, building from image triplets and associated trifocal tensors. This is advantageous both in obtaining correspondences and

Andrew W. Fitzgibbon; Andrew Zisserman

1998-01-01

53

Thermal analysis of the ultraviolet imager camera and electronics  

NASA Technical Reports Server (NTRS)

The Ultraviolet Imaging experiment has undergone design changes that necessitate updating the reduced thermal models (RTM's) for both the Camera and Electronics. In addition, there are several mission scenarios that need to be evaluated in terms of the thermal response of the instruments. The impact of these design changes and mission scenarios on the thermal performance of the Camera and Electronics assemblies is discussed.

Dirks, Gregory J.

1991-01-01

54

Performance characterization of an image converter based streak camera  

Microsoft Academic Search

The performance response of an electronic subnanosecond streak camera to a spatially distributed optical signal varies significantly with the image location on the output screen. The variations are due mainly to the combined effects of (1) electron-optics aberrations, (2) camera sweep ramp and gating waveform imperfections, (3) photocathode and phosphor quantum efficiency nonuniformities, and (4) excessive incident intensity or power. Consequently,

C. C. Lai; L. B. Olk

1985-01-01

55

Recording Images Using a Simple Pinhole Camera  

NSDL National Science Digital Library

In this lesson, students develop and expand their observational skills and technological understanding by building and operating a pinhole camera. The interdisciplinary connections are in the realm of application in this motivating activity. The lesson pr

John Eichinger

2009-05-30

56

An Evaluation of Iterative Reconstruction Strategies on Mediastinal Lesion Detection Using Hybrid Ga-67 SPECT Images.  

PubMed

Hybrid LROC studies can be used to more realistically assess the impact of reconstruction strategies, compared to those constructed with digital phantoms. This is because hybrid data provides the background variability that is present in clinical imaging, as well as control over critical imaging parameters, which is required to conduct meaningful tests. Hybrid data is obtained by adding Monte Carlo simulated lesions to disease-free clinical projection data. Because Ga-67 is a particularly challenging radionuclide for imaging, we use Ga-67 hybrid SPECT data to study the effectiveness of the various correction strategies developed to account for degradations in SPECT imaging. Our data was obtained using a GE-VG dual-detector SPECT-CT camera. After determining a target lesion contrast, we conduct pilot LROC studies to obtain a near-optimal set of reconstruction parameters for each strategy individually. These near-optimal parameters are then used to reconstruct the final evaluation study sets. All LROC study results reported here were obtained employing human observers only. We use the final LROC study results to assess the impact of attenuation compensation, scatter compensation, and detector resolution compensation on data reconstructed with the RBI-EM algorithm. We also compare these with FBP reconstructions of the same dataset. Our experiment indicates an improvement in detection accuracy as the various degradations inherent in the image acquisition process are compensated for in the reconstruction process. PMID:19169427

Pereira, N F; Gifford, H C; Pretorius, P H; Farncombe, T; Smyczynski, M; Licho, R; Schneider, P; King, M A

2007-01-01

57

Image Quality and Performance of the LSST Camera  

NASA Astrophysics Data System (ADS)

The LSST camera, which will be the largest digital camera built to date, presents a number of novel challenges. The field of view will be 3.5 degrees in diameter and will be sampled by a 3.2 billion pixel array of sensors to be read-out in under 2 seconds, which leads to demanding constraints on the sensor architecture and read-out electronics. The camera also incorporates three large refractive lenses, an array of five wide-band large filters mounted on a carousel, and a mechanical shutter. Given the fast optical beam (f/1.2) and tight tolerances for image quality and throughput specifications, the requirements on the optical design, assembly and alignment, and contamination control of the optical elements and focal plane are crucial. We present an overview of the LSST camera, with an emphasis on models of camera image quality and throughput performance that are characterized by various analysis packages and design considerations.

Gilmore, D. Kirk; Kahn, S.; Rasmussen, A.; Singel, J.

2012-01-01

58

Digital camera zooming based on unified CFA image processing steps  

Microsoft Academic Search

A unified camera image processing system that performs zooming and full color image reconstruction for single-sensor digital cameras is introduced. Compact and low-cost single-sensor solutions often lack optical zooming capabilities and thus depend on digital techniques. However, the computational power required for high-quality output using traditional techniques is generally too prohibitive to implement in such devices. The proposed scheme employs

R. Lukac; K. Martin; K. N. Plataniotis

2004-01-01

59

Bin mode estimation methods for Compton camera imaging  

NASA Astrophysics Data System (ADS)

We study the image reconstruction problem of a Compton camera which consists of semiconductor detectors. The image reconstruction is formulated as a statistical estimation problem. We employ a bin-mode estimation (BME) and extend an existing framework to a Compton camera with multiple scatterers and absorbers. Two estimation algorithms are proposed: an accelerated EM algorithm for the maximum likelihood estimation (MLE) and a modified EM algorithm for the maximum a posteriori (MAP) estimation. Numerical simulations demonstrate the potential of the proposed methods.
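The accelerated EM and MAP algorithms themselves are not reproduced in the abstract. The plain MLEM update they build on can be sketched as follows for binned emission data; the system matrix, dimensions, and iteration count are placeholders, and this is the textbook update, not the authors' exact algorithms.

```python
import numpy as np

def mlem(system_matrix, counts, n_iter=50):
    """Standard MLEM for binned emission data, counts ~ Poisson(A @ x):
    multiplicative update x <- x * (A^T (y / A x)) / (A^T 1)."""
    A = np.asarray(system_matrix, dtype=float)   # (n_bins, n_voxels)
    y = np.asarray(counts, dtype=float)          # (n_bins,)
    x = np.ones(A.shape[1])                      # uniform initial image
    sens = A.sum(axis=0)                         # per-voxel sensitivity
    for _ in range(n_iter):
        proj = A @ x                             # expected bin counts
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

A MAP variant would add a prior-derived term to the denominator; an accelerated variant would take larger multiplicative steps, but both keep this update as their core.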

Ikeda, S.; Odaka, H.; Uemura, M.; Takahashi, T.; Watanabe, S.; Takeda, S.

2014-10-01

60

[New medical imaging based on electron tracking Compton camera (ETCC)].  

PubMed

We have developed an Electron-Tracking Compton Camera (ETCC) for medical imaging, exploiting its wide energy dynamic range (200-1,500 keV) and wide field of view (FOV, 3 sr). This camera has the potential to support the development of new imaging reagents. We have carried out several imaging-reagent studies as examples: (1) simultaneous 18F-FDG and 131I-MIBG imaging for double-tracer clinical imaging, and (2) imaging of some minerals (Mn-54, Zn-65, Fe-59) in mice and plants. In addition, ETCC has the potential for real-time monitoring of the Bragg peak location by imaging prompt gamma rays during beam therapy. We carried out a water phantom experiment using a 140 MeV proton beam and obtained images of both the 511 keV and the high-energy gamma rays (800-2,000 keV); a better correlation of the latter image with the Bragg peak was observed. Another potential of ETCC is to reconstruct the 3D image using only a one-head camera, without rotation of either the target or the camera. Good 3D images of the thyroid gland phantom and of a mouse with a tumor were observed. In order to advance these features to practical use, we are improving all components and will then construct a multi-head ETCC system. PMID:24592680

Tanimori, Toru; Kubo, Hidetoshi; Kabuki, Shigeto; Kimura, Hiroyuki

2012-01-01

61

Multiview: a novel multispectral IR imaging camera  

NASA Astrophysics Data System (ADS)

The Surveillance Sciences Directorate of the Northrop Grumman Advanced Systems and Technology organization is developing a novel multispectral IR camera known as Multiview. This prototype system is capable of simultaneously acquiring 4-color SWIR/MWIR 2D imagery that is both spatially and temporally registered, utilizing a single 256×256 HgCdTe snapshot IR FPA capable of frame rates in excess of 240 Hz. The patented design offers an extremely compact package that contains the entire optomechanical assembly (lenses, interchangeable filters, and cold shield) in less than a 3 in³ volume. The unique imagery collected with this camera has the potential to significantly improve the effectiveness of clutter suppression algorithms, multi-color target detection, and target-background discrimination for a wide variety of mission scenarios. This paper describes the key aspects of the Multiview prototype camera design and operation. Multiview's ability to dynamically manage flux imbalances between the four subbands is discussed. Radiometric performance predictions are presented along with laboratory validation of many of these performance metrics. Several examples of field-collected imagery are shown, including examples of transient rocket plume data measured at a 240 Hz sample rate. The importance and utility of spatio-temporal multi-band imagery is also discussed.

Soel, Michael A.; Rudman, Stanley; Ryan, Robert; Fonneland, Nils J.; Milano, Steve J.

1997-06-01

62

Acquisition and evaluation of radiography images by digital camera.  

PubMed

To determine applicability of low-cost digital imaging for different radiographic modalities used in consultations from remote areas of the Ecuadorian rainforest with limited resources, both medical and financial. Low-cost digital imaging, consisting of hand-held digital cameras, was used for image capture at a remote location. Diagnostic radiographic images were captured in Ecuador by digital camera and transmitted to a password-protected File Transfer Protocol (FTP) server at VCU Medical Center in Richmond, Virginia, using standard Internet connectivity with standard security. After capture and subsequent transfer of images via low-bandwidth Internet connections, attending radiologists in the United States compared diagnoses to those from Ecuador to evaluate quality of image transfer. Corroborative diagnoses were obtained with the digital camera images for greater than 90% of the plain film and computed tomography studies. Ultrasound (U/S) studies demonstrated only 56% corroboration. Images of radiographs captured utilizing commercially available digital cameras can provide quality sufficient for expert consultation for many plain film studies for remote, underserved areas without access to advanced modalities. PMID:15857253

Cone, Stephen W; Carucci, Laura R; Yu, Jinxing; Rafiq, Azhar; Doarn, Charles R; Merrell, Ronald C

2005-04-01

63

Advances in imaging fission neutrons with a neutron scatter camera  

Microsoft Academic Search

Special nuclear material (SNM) emits high energy radiation during active and passive interrogation. This radiation can be imaged thus allowing visualization of shielded and\\/or smuggled SNM. Lower backgrounds and higher penetration through hi-Z materials make neutrons the preferred detectable in many scenarios. We have developed a neutron scatter camera that directly images fast fission neutrons from SNM sources while simultaneously

P. Marleau; J. Brennan; K. Krenz; N. Mascarenhas; S. Mrowka

2007-01-01

64

Imaging concert hall acoustics using visual and audio cameras  

Microsoft Academic Search

Using a recently developed real-time audio camera that uses the output of a spherical microphone array beamformer, steered in all directions, to create central-projection acoustic intensity images, we present a technique to measure the acoustics of rooms and halls. A panoramic mosaiced visual image of the space is also created. Since both the visual and the

Adam O'Donovan; Ramani Duraiswami; Dmitry N. Zotkin

2008-01-01

65

Quantitative imaging of holmium-166 with an Anger camera.  

PubMed

The objective of this study was to develop a quantitative Anger camera imaging approach for 166Ho in the skeletal system of patients. A dual energy window method was designed to subtract the interference from septal penetration and bremsstrahlung events in Anger camera images acquired with the 80 keV x-rays emitted by 166Ho. The validity of this scatter subtraction method for 166Ho images was demonstrated as improvements of the line spread function and modulation transfer function. Camera sensitivity was found to be nearly independent of source-to-collimator distance only for images acquired with a high-energy collimator. Studies in an Alderson abdominal water phantom demonstrated scatter subtraction can provide quantitative Anger camera images of 166Ho with a scatter multiplier of k = 1.0 and a correction for attenuation. Attenuation correction factors derived from a transmission image were measured for the phantom and verified with water-equivalent blocks of known thickness. Whole-body scan images of 166Ho localized in the skeletal system of patients were significantly improved with this simple scatter subtraction method, and when used to estimate the activity distribution within separate bone regions of the skeleton. PMID:15552124
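The dual-energy-window correction with scatter multiplier k = 1.0 and attenuation correction described above amounts to a simple per-pixel operation. A minimal sketch, assuming the photopeak image, scatter-window image, and attenuation correction factors are supplied as arrays:

```python
import numpy as np

def scatter_corrected(photopeak, scatter_window, k=1.0, atten=None):
    """Dual-energy-window correction: subtract k times the counts in
    the adjacent scatter window from the photopeak image, clamp at
    zero, then optionally apply per-pixel attenuation correction
    factors (e.g. derived from a transmission image)."""
    corrected = np.maximum(photopeak - k * scatter_window, 0.0)
    if atten is not None:
        corrected = corrected * atten
    return corrected
```

The clamp at zero reflects that subtracted count images cannot be physically negative; how negative pixels are handled in the actual study is not stated in the abstract.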

Bayouth, J E; Macey, D J

1994-02-01

66

Color image restoration based on camera microscanning  

NASA Astrophysics Data System (ADS)

In this work, we propose a method to restore color images from a set of degraded color images obtained with a microscanning imaging system. Using the set of observed images, image restoration is carried out by solving a system of equations that is derived from optimization of an objective function. Since the proposed method possesses a high computational complexity, a fast algorithm is developed. Experimental and computer simulation results obtained with the proposed method are analyzed in terms of restoration accuracy and tolerance to additive input noise.
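The abstract's "system of equations that is derived from optimization of an objective function" is, for a quadratic objective, a set of stacked normal equations. A minimal regularized least-squares sketch follows; the degradation matrices H_i, assumed known from the microscanning geometry, and the regularization weight are illustrative assumptions, not the authors' fast algorithm.

```python
import numpy as np

def restore(observations, operators, reg=1e-3):
    """Least-squares restoration from several degraded observations
    y_i = H_i @ x: minimize sum_i ||y_i - H_i x||^2 + reg*||x||^2 by
    accumulating and solving the normal equations
    (sum_i H_i^T H_i + reg*I) x = sum_i H_i^T y_i."""
    n = operators[0].shape[1]
    AtA = reg * np.eye(n)
    Atb = np.zeros(n)
    for y, H in zip(observations, operators):
        AtA += H.T @ H
        Atb += H.T @ y
    return np.linalg.solve(AtA, Atb)
```

For full-size color frames a direct solve like this is what makes the naive method expensive, which is presumably why the paper develops a fast algorithm.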

López Martínez, José L.; Kober, Vitaly; Escalante-Torres, Manuel

2014-09-01

67

Mars Global Surveyor Mars Orbiter Camera Image Gallery  

NSDL National Science Digital Library

This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: A narrow angle system that provides grayscale high resolution views of the planet's surface (typically, 1.5 to 12 meters/pixel), and red and blue wide angle cameras that provide daily global weather monitoring, context images to determine where the narrow angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image is provided and instructions regarding gallery usage are also available on the site.

Malin Space Science Systems

68

Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera  

NASA Astrophysics Data System (ADS)

The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras provide unprecedented data on lunar craters. Stereo image pairs are being acquired for detailed topographic mapping. These data allow comparisons of relative ages of the larger young craters, some of which are tied to absolute radiometric ages from Apollo-returned samples. These relative ages, the crater populations at small diameters, and details of crater morphology including ejecta and melt morphologies, allow better delineation of recent lunar history and the formation and modification of impact craters. Crater counts may also reveal differences in the formation and preservation of small diameter craters as a function of target material (e.g., unconsolidated regolith versus solid impact melt). One key question: Is the current cratering rate constant or does it fluctuate? We will constrain the very recent cratering rate (at 10-100 m diameter) by comparing LROC images with those taken by Apollo nearly 40 years ago to determine the number of new impact craters. The current cratering rate and an assumption of constant cratering rate over time may or may not correctly predict the number of craters superimposed over radiometrically-dated surfaces such as South Ray, Cone, and North Ray craters, which range from 2-50 Ma and are not saturated by 10-100 m craters. If the prediction fails with realistic consideration of errors, then the present-day cratering rate must be atypical. Secondary craters complicate this analysis, but the resolution and coverage of LROC enables improved recognition of secondary craters. Of particular interest for the youngest Copernican craters is the possibility of self-cratering.
LROC is providing the image quality needed to classify small craters by state of degradation (i.e., relative age); concentrations of craters with uniform size and age indicate secondary formation. Portion of LROC image M103703826LE showing a sparsely-cratered pond of impact melt on the floor of the farside Copernican crater Necho (4.95 S, 123.6 E).

McEwen, A. S.; Hiesinger, H.; Thomas, P. C.; Robinson, M. S.; van der Bogert, C.; Ostrach, L.; Plescia, J. B.; Bray, V. J.; Tornabene, L. L.

2009-12-01

69

High-Resolution Mars Camera Test Image of Moon (Infrared)  

NASA Technical Reports Server (NTRS)

This crescent view of Earth's Moon in infrared wavelengths comes from a camera test by NASA's Mars Reconnaissance Orbiter spacecraft on its way to Mars. The mission's High Resolution Imaging Science Experiment camera took the image on Sept. 8, 2005, while at a distance of about 10 million kilometers (6 million miles) from the Moon. The dark feature on the right is Mare Crisium. From that distance, the Moon would appear as a star-like point of light to the unaided eye. The test verified the camera's focusing capability and provided an opportunity for calibration. The spacecraft's Context Camera and Optical Navigation Camera also performed as expected during the test.

The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.

2005-01-01

70

Cheetah: A high frame rate, high resolution SWIR image camera  

NASA Astrophysics Data System (ADS)

A high resolution, high frame rate InGaAs-based image sensor and associated camera have been developed. The sensor and the camera are capable of recording and delivering more than 1700 full 640 × 512 pixel frames per second. The FPA utilizes a low-lag CTIA current integrator in each pixel, enabling integration times shorter than one microsecond. On-chip logic allows four different sub-windows to be read out simultaneously at even higher rates. The spectral sensitivity of the FPA is situated in the SWIR range [0.9-1.7 µm] and can be further extended into the visible and NIR range. The Cheetah camera has up to 16 GB of on-board memory to store the acquired images and transfer the data over a Gigabit Ethernet connection to the PC. The camera is also equipped with a full Camera Link interface to directly stream the data to a frame grabber or dedicated image processing unit. The Cheetah camera is completely under software control.

Neys, Joel; Bentell, Jonas; O'Grady, Matt; Vermeiren, Jan; Colin, Thierry; Hooylaerts, Peter; Grietens, Bob

2008-10-01

71

Digital camera with apparatus for authentication of images produced from an image file  

NASA Technical Reports Server (NTRS)

A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even one bit change in the image hash will cause the image hash to be totally different from the secure hash.
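The authentication flow described in the patent abstract (hash the image file, encrypt the hash with the private key to form a signature, and later verify by decrypting with the public key and re-hashing) can be sketched with textbook RSA. The tiny key pair and the modular reduction of the digest are toy assumptions for readability, not the camera's actual scheme:

```python
import hashlib

# Toy RSA key pair (demonstration only; real systems use 2048+ bit keys).
p, q = 61, 53
n = p * q          # public modulus, 3233
e = 17             # public exponent
d = 2753           # private exponent: d * e == 1 (mod (p-1)*(q-1))

def image_hash(data: bytes) -> int:
    # Reduce the SHA-256 digest mod n so the toy key can sign it;
    # a real implementation signs the full digest.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(h: int) -> int:
    return pow(h, d, n)  # encrypt the hash with the private key -> signature

def verify(data: bytes, signature: int) -> bool:
    # Decrypt the signature with the public key and compare to a fresh hash.
    return pow(signature, e, n) == image_hash(data)

original = b"raw image file bytes"
sig = sign(image_hash(original))
# An untouched file authenticates; any alteration changes the hash and fails.
```

Because even a one-bit change produces a completely different digest, comparing the decrypted signature with a freshly computed hash detects any alteration, exactly as the abstract describes.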

Friedman, Gary L. (inventor)

1993-01-01

72

Removing Image Artifacts Due to Dirty Camera Lenses and Thin Occluders Columbia University  

E-print Network

The lenses of digital cameras, or the front windows of security cameras, often accumulate various types of contaminants; typical settings include outdoor security cameras, underwater cameras, or covert surveillance behind a fence.

Nayar, Shree K.

73

Anger camera image generation with microcomputers  

E-print Network

The software system generated images on the microcomputer monitor, controlled the movement of data, and was able to extract data from an image series.

Williams, Karl Morgan

2012-06-07

74

Uncooled detector, optics, and camera development for THz imaging  

Microsoft Academic Search

A prototype THz imaging system based on modified uncooled microbolometer detector arrays, INO MIMICII camera electronics, and a custom f/1 THz optics has been assembled. A variety of new detector layouts and architectures have been designed; the detector THz absorption was optimized via several methods including integration of thin film metallic absorbers, thick film gold black absorbers, and antenna structures.

Timothy Pope; Michel Doucet; Fabien Dupont; Linda Marchese; Bruno Tremblay; Georges Baldenberger; Sonia Verrault; Frédéric Lamontagne

2009-01-01

75

Coincidence ion imaging with a fast frame camera.  

PubMed

A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductors) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragments pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide. PMID:25554285
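The multi-hit correlation step described above, pairing centroided camera spots with time-of-flight peaks by matching spot intensity to peak height, can be sketched as a simple rank matching. The spot and peak values below are invented for illustration:

```python
import numpy as np

def correlate_hits(spot_intensities, peak_heights):
    """Pair each camera spot with a ToF peak by rank of intensity/height.

    Returns (spot_index, peak_index) tuples: brightest spot with tallest
    peak, second brightest with second tallest, and so on.
    """
    spot_order = np.argsort(spot_intensities)[::-1]  # brightest first
    peak_order = np.argsort(peak_heights)[::-1]      # tallest first
    return list(zip(spot_order.tolist(), peak_order.tolist()))

spots = [0.4, 0.9, 0.1]       # integrated brightness of three ion spots (arbitrary units)
peaks = [220.0, 80.0, 510.0]  # PMT peak heights on the ToF trace (arbitrary units)
pairs = correlate_hits(spots, peaks)
```

A production system would add a tolerance check on the intensity/height agreement before accepting a pair; this sketch shows only the ordering idea.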

Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H; Fan, Lin; Li, Wen

2014-12-01

76

Coincidence ion imaging with a fast frame camera  

NASA Astrophysics Data System (ADS)

A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductors) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragments pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.

Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen

2014-12-01

77

Image quality against metric capability of an aerial camera  

NASA Technical Reports Server (NTRS)

Glass plate targets containing multiple contrast three bar patterns and a military standard resolution target were photographed through a light-source collimator-camera system. The glass plate targets and images were then scanned. Through a computer plotting program selected portions of the scanned items were presented. The plots were then measured and analyzed to obtain information of image quality parameters and effects. The measurement errors are discussed together with aspects of resolution determination.

Ghosh, S. K.

1973-01-01

78

Halftoning-Inspired Methods for Foveation in Variable-Acuity Superpixel Imager* Cameras  

E-print Network

We foveate a set of images using control signals generated from the binary control signals used by Variable Acuity Superpixel Imager (VASI™) foveating cameras.

Evans, Brian L.

79

The TrICE Prototype MAMPT Imaging Camera  

E-print Network

The Track Imaging Cherenkov Experiment (TrICE) is an air-Cherenkov prototype telescope designed to use multi-anode photomultipliers to achieve a high angular resolution for measuring cosmic-ray composition at TeV-PeV energies. The TrICE camera, composed of 16 Hamamatsu R8900 16-channel multi-anode photomultiplier tubes, achieves 0.086-degree angular width per pixel over a 1.5-degree-wide field of view. We present a description of the TrICE camera design, calibration, and performance.

K. Byrum; J. Cunningham; G. Drake; E. Hays; D. Kieda; E. Kovacs; S. Magill; L. Nodulmann; R. Norhtrop; S. Swordy; R. G. Wagner; S. P. Wakely; S. A. Wissel

2007-10-02

80

A compact gamma camera for biological imaging  

SciTech Connect

A compact detector, sized particularly for imaging a mouse, is described. The active area of the detector is approximately 46 mm × 96 mm. Two flat-panel Hamamatsu H8500 position-sensitive photomultiplier tubes (PSPMTs) are coupled to a pixellated NaI(Tl) scintillator which views the animal through a copper-beryllium (CuBe) parallel-hole collimator specially designed for {sup 125}I. Although the PSPMTs have insensitive areas at their edges and there is a physical gap, corrections for scintillation light collection at the junction between the two tubes result in a uniform response across the entire rectangular area of the detector. The system described has been developed to optimize both sensitivity and resolution for in-vivo imaging of small animals injected with iodinated compounds. We demonstrate an in-vivo application of this detector, particularly to SPECT, by imaging mice injected with approximately 10-15 µCi of {sup 125}I.

Bradley, E.L.; Cella, J.; Majewski, S.; Popov, V.; Jianguo Qian; Saha, M.S.; Smith, M.F.; Weisenberger, A.G.; Welsh, R.E.

2006-02-01

81

A novel SPECT camera for molecular imaging of the prostate  

NASA Astrophysics Data System (ADS)

The objective of this work is to develop an improved SPECT camera for dedicated prostate imaging. Complementing the recent advancements in agents for molecular prostate imaging, this device has the potential to assist in distinguishing benign from aggressive cancers, to improve site-specific localization of cancer, to improve accuracy of needle-guided prostate biopsy of cancer sites, and to aid in focal therapy procedures such as cryotherapy and radiation. Theoretical calculations show that the spatial resolution/detection sensitivity of the proposed SPECT camera can rival or exceed 3D PET and further signal-to-noise advantage is attained with the better energy resolution of the CZT modules. Based on photon transport simulation studies, the system has a reconstructed spatial resolution of 4.8 mm with a sensitivity of 0.0001. Reconstruction of a simulated prostate distribution demonstrates the focal imaging capability of the system.

Cebula, Alan; Gilland, David; Su, Li-Ming; Wagenaar, Douglas; Bahadori, Amir

2011-10-01

82

Filtered backprojection reconstruction and redundancy in Compton camera imaging.  

PubMed

During the acquisition process with the Compton gamma camera, integrals of the intensity distribution of the source on conical surfaces are measured. They represent the Compton projections of the intensity. The inversion of the Compton transform rests on a particular Fourier-slice theorem. This paper proposes a filtered backprojection algorithm for image reconstruction from planar Compton camera data. We show how different projections are related together and how they may be combined in the tomographic reconstruction step. Considering a simulated Compton imaging system, we conclude that the proposed method yields accurate reconstructed images for simple sources. An elongation of the source in the direction orthogonal to the camera may be observed and is related to the truncation of the projections induced by the finite extent of the device. This phenomenon was previously observed with other reconstruction methods, e.g., iterative maximum likelihood expectation maximization. The redundancy of the Compton transform is thus an important feature for the reduction of noise in Compton images, since the ideal assumptions of infinite width and observation time are never met in practice. We show that a selection operated on the set of data allows one to partially circumvent projection truncation, at the expense of an enhancement of the noise in the images. PMID:24196864

Maxim, Voichița

2014-01-01

83

Measuring SO2 ship emissions with an ultraviolet imaging camera  

NASA Astrophysics Data System (ADS)

Over the last few years fast-sampling ultraviolet (UV) imaging cameras have been developed for use in measuring SO2 emissions from industrial sources (e.g. power plants; typical emission rates ~ 1-10 kg s-1) and natural sources (e.g. volcanoes; typical emission rates ~ 10-100 kg s-1). Generally, measurements have been made from sources rich in SO2 with high concentrations and emission rates. In this work, for the first time, a UV camera has been used to measure the much lower concentrations and emission rates of SO2 (typical emission rates ~ 0.01-0.1 kg s-1) in the plumes from moving and stationary ships. Some innovations and trade-offs have been made so that estimates of the emission rates and path concentrations can be retrieved in real time. Field experiments were conducted at Kongsfjord in Ny Ålesund, Svalbard, where SO2 emissions from cruise ships were measured, and at the port of Rotterdam, Netherlands, measuring emissions from more than 10 different container and cargo ships. In all cases SO2 path concentrations could be estimated and emission rates determined by measuring ship plume speeds simultaneously using the camera, or by using surface wind speed data from an independent source. Accuracies were compromised in some cases because of the presence of particulates in some ship emissions and the restriction of single-filter UV imagery, a requirement for fast-sampling (> 10 Hz) from a single camera. Despite the ease of use and ability to determine SO2 emission rates from the UV camera system, the limitation in accuracy and precision suggest that the system may only be used under rather ideal circumstances and that currently the technology needs further development to serve as a method to monitor ship emissions for regulatory purposes. A dual-camera system or a single, dual-filter camera is required in order to properly correct for the effects of particulates in ship plumes.
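The emission-rate retrieval sketched in the abstract amounts to integrating the SO2 column amounts along a transect across the plume and multiplying by the plume speed. A toy calculation with invented column densities and speeds, not measured values:

```python
import numpy as np

def emission_rate(column_kg_m2, pixel_width_m, plume_speed_m_s):
    """SO2 emission rate (kg/s) from per-pixel column densities on a transect.

    Summing column density (kg/m^2) times pixel width (m) gives the mass per
    metre of plume length; multiplying by plume speed (m/s) gives kg/s.
    """
    integrated = np.sum(column_kg_m2) * pixel_width_m
    return integrated * plume_speed_m_s

# Hypothetical transect across a ship plume (kg/m^2 per pixel).
cols = np.array([0.0, 1e-4, 3e-4, 1e-4, 0.0])
rate = emission_rate(cols, pixel_width_m=2.0, plume_speed_m_s=5.0)
# Of the same order as the ship emission rates quoted in the abstract.
```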

Prata, A. J.

2014-05-01

84

Imaging tissues with a polarized light video camera  

NASA Astrophysics Data System (ADS)

A method for imaging the superficial epidermal and papillary dermal layers of the skin is needed when assessing many skin lesions. We have developed an imaging modality using a video camera whose mechanism of contrast is the reflectance of polarized light from superficial skin. By selecting only polarized light to create the image, one rejects the large amount of diffusely reflected light from the deeper dermis. The specular reflectance (or glare) from the skin surface is also avoided in the setup. The resulting polarization picture maximally accents the details of the superficial layer of the skin and removes the effects of melanin pigmentation from the image. For example, freckles simply disappear and nevi lose their dark pigmentation to reveal the details of abnormal cellular growth. An initial clinical study demonstrated that the polarization camera could identify the margins of sclerosing basal cell carcinoma while the doctor's unaided eye underestimated the margin. The camera identified an 11-mm-diameter lesion while the unaided eye identified a 6-mm-diameter lesion.
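The contrast mechanism described above can be sketched as a normalized difference of co- and cross-polarized frames: the diffusely reflected (depolarized) dermal light cancels, leaving the superficially scattered component. The 2x2 intensity arrays below are hypothetical:

```python
import numpy as np

def polarization_image(i_par, i_per):
    """Normalized polarization difference of co- and cross-polarized frames."""
    total = np.clip(i_par + i_per, 1e-9, None)  # guard against division by zero
    return (i_par - i_per) / total

i_par = np.array([[0.8, 0.5], [0.6, 0.4]])  # co-polarized capture
i_per = np.array([[0.4, 0.4], [0.2, 0.3]])  # cross-polarized capture
pol = polarization_image(i_par, i_per)
# Pixels dominated by diffuse (depolarized) light have i_par ~ i_per and map
# near zero; superficially scattered light keeps its polarization and stands out.
```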

Jacques, Steven L.; Lee, Kenneth

1999-09-01

85

A four-step camera calibration procedure with implicit image correction  

Microsoft Academic Search

In geometrical camera calibration the objective is to determine a set of camera parameters that describe the mapping between 3-D reference coordinates and 2-D image coordinates. Various methods for camera calibration can be found in the literature. However, surprisingly little attention has been paid to the whole calibration procedure, i.e., control point extraction from images, model fitting, image correction, and

Janne Heikkila; O. Silven

1997-01-01

86

VLSI Architecture and FPGA Prototyping of a Digital Camera for Image Security and Authentication  

E-print Network

VLSI Architecture and FPGA Prototyping of a Digital Camera for Image Security and Authentication and security mechanism for images produced by it. Since the proposal of the trustworthy digital camera Watermarking Unit Flash Memory Compression Unit Encryption Unit Fig. 1. Secure digital camera for image

Mohanty, Saraju P.

87

Automatic DEM Generation from CE-1's CCD Stereo Camera Images  

NASA Astrophysics Data System (ADS)

The goal of the CCD stereo camera is to acquire 3-D images of the lunar surface between 70°S and 70°N. We describe the image acquisition process, the configuration of the imaging system, the camera sensor model, the camera trajectory model, and the EFP photogrammetric triangulation algorithm.

Liu, J. J.; Ren, X.; Mu, L. L.; Zhao, B. C.; Xiangli, B.; Yang, J. F.; Zou, Y. L.; Zhang, H. B.; Lu, C.; Liu, J. Z.; Zuo, W.; Su, Y.; Wen, W. B.; Bian, W.; Zou, X. D.; Li, C. L.

2009-03-01

88

Arbitrary viewpoint image generation method of unknown objects in known environment using a single camera  

Microsoft Academic Search

In this paper, we propose a method to generate arbitrary viewpoint images of unknown objects using a single camera set in a known environment. In recent years, various methods for generating arbitrary viewpoint images have been proposed. These methods often use multiple cameras. When we can generate the arbitrary viewpoint images using only a single camera, we can apply the

Asumi Yamachika; Katsuya Kondo; Syoji Kobashi; Yutaka Hata

2004-01-01

89

A Comparative Study of Microscopic Images Captured by a Box Type Digital Camera Versus a Standard Microscopic Photography Camera Unit  

PubMed Central

Introduction: Obtaining images of slides viewed by a microscope can be invaluable for both diagnosis and teaching. They can be transferred among technologically-advanced hospitals for further consultation and evaluation. But a standard microscopic photography camera unit (MPCU) (MIPS, Microscopic Image Projection System) is costly and not available in resource-poor settings. The aim of our endeavour was to find a comparable and cheaper alternative method for photomicrography. Materials and Methods: We used a NIKON Coolpix S6150 camera (box-type digital camera) with an Olympus CH20i microscope and a fluorescent microscope for the purpose of this study. Results: We got comparable results for capturing images of light microscopy, but the results were not as satisfactory for fluorescent microscopy. Conclusion: A box-type digital camera is a comparable, less expensive and convenient alternative to a microscopic photography camera unit. PMID:25478350

Desai, Nandini J.; Gupta, B. D.; Patel, Pratik Narendrabhai

2014-01-01

90

TIRCAM2: The TIFR near infrared imaging camera  

NASA Astrophysics Data System (ADS)

TIRCAM2 (TIFR near infrared imaging camera - II) is a closed-cycle cooled imager that has been developed by the Infrared Astronomy Group at the Tata Institute of Fundamental Research for observations in the near infrared band of 1 to 3.7 µm with existing Indian telescopes. In this paper, we describe some of the technical details of TIRCAM2 and report its observing capabilities, measured performance and limiting magnitudes with the 2-m IUCAA Girawali telescope and the 1.2-m PRL Gurushikhar telescope. The main highlight is the camera's capability of observing in the nbL (3.59 µm) band, enabling our primary motivation of mapping Polycyclic Aromatic Hydrocarbon (PAH) emission at 3.3 µm.

Naik, M. B.; Ojha, D. K.; Ghosh, S. K.; Poojary, S. S.; Jadhav, R. B.; Meshram, G. S.; Sandimani, P. R.; Bhagat, S. B.; D'Costa, S. L. A.; Gharat, S. M.; Bakalkar, C. B.; Ninan, J. P.; Joshi, J. S.

2012-12-01

91

Scalar wave-optical reconstruction of plenoptic camera images.  

PubMed

We investigate the reconstruction of plenoptic camera images in a scalar wave-optical framework. Previous publications relating to this topic numerically simulate light propagation on the basis of ray tracing. However, due to continuing miniaturization of hardware components it can be assumed that in combination with low-aperture optical systems this technique may not be generally valid. Therefore, we study the differences between ray- and wave-optical object reconstructions of true plenoptic camera images. For this purpose we present a wave-optical reconstruction algorithm, which can be run on a regular computer. Our findings show that a wave-optical treatment is capable of increasing the detail resolution of reconstructed objects. PMID:25321378

Junker, André; Stenau, Tim; Brenner, Karl-Heinz

2014-09-01

92

Traffic monitoring with serial images from airborne cameras  

NASA Astrophysics Data System (ADS)

The classical means to measure traffic density and velocity depend on local measurements from induction loops and other on site instruments. This information does not give the whole picture of the two-dimensional traffic situation. In order to obtain precise knowledge about the traffic flow of a large area, only airborne cameras or cameras positioned at very high locations (towers, etc.) can provide an up-to-date image of all roads covered. The paper aims at showing the potential of using image time series from these cameras to derive traffic parameters on the basis of single car measurements. To be able to determine precise velocities and other parameters from an image time series, exact geocoding is one of the first requirements for the acquired image data. The methods presented here for determining several traffic parameters for single vehicles and vehicle groups involve recording and evaluating a number of digital or analog aerial images from high altitude and with a large total field of view. Visual and automatic methods for the interpretation of images are compared. It turns out that the recording frequency of the individual images should be at least 1/3 Hz (visual interpretation), but is preferably 3 Hz or more, especially for automatic vehicle tracking. The accuracy and potentials of the methods are analyzed and presented, as well as the usage of a digital road database for improving the tracking algorithm and for integrating the results for further traffic applications. Shortcomings of the methods are given as well as possible improvements regarding methodology and sensor platform.
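The single-vehicle velocity estimate described above reduces to displacement in pixels between geocoded frames, scaled by the ground sampling distance and the frame rate. A toy example; the pixel displacement and ground sampling distance are invented, with the frame rate taken from the 3 Hz figure quoted in the text:

```python
def vehicle_speed(pixels_moved, gsd_m, frame_rate_hz):
    """Ground speed (m/s) from inter-frame pixel displacement.

    pixels_moved:  displacement of the tracked vehicle between frames (px)
    gsd_m:         ground sampling distance (metres per pixel)
    frame_rate_hz: image recording frequency (frames per second)
    """
    return pixels_moved * gsd_m * frame_rate_hz

v = vehicle_speed(pixels_moved=12, gsd_m=0.5, frame_rate_hz=3.0)
# 12 px * 0.5 m/px * 3 Hz = 18 m/s (about 65 km/h)
```

This also shows why precise geocoding matters: an error in the per-frame registration translates directly into a speed error scaled by the frame rate.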

Reinartz, Peter; Lachaise, Marie; Schmeer, Elisabeth; Krauss, Thomas; Runge, Hartmut

93

Segmentation of Camera Captured Business Card Images for Mobile Devices  

Microsoft Academic Search

Due to huge deformation in camera-captured images, the variety in the nature of business cards, and the computational constraints of mobile devices, the design of an efficient Business Card Reader (BCR) is challenging for researchers. Extraction of text regions and segmenting them into characters is one such challenge. In this paper, we have presented an efficient character

Ayatullah Faruk Mollah; Subhadip Basu; Mita Nasipuri

2011-01-01

94

Camera head of the DSSC X-ray imager  

Microsoft Academic Search

A description of the camera-head electronics of the DSSC mega-pixel X-ray imager is given. The integration of 32 sensor chips, 256 ASICs, 112 PCBs, and 16 flex cables in the space of 241 mm × 251 mm × 224 mm underlines its compactness. Prototypes of subassemblies behind the readout-ASIC layer are designed and fabricated. Measurement results of the main functionalities clock distribution, power

Karsten Hansen; Helmut Klar; Dirk Muntefering

2011-01-01

95

New technique of three-dimensional imaging through a 3-mm single lens camera  

NASA Astrophysics Data System (ADS)

We present a technique for capturing full-color 3-D images with a single camera. Unlike a typical 3-D imaging system comprising two independent cameras, each contributing one viewpoint, the technique presented here creates two viewpoints using a single-lens camera with a bipartite filter whose bandpass characteristics are complementary to each other. The bipartite filter divides the camera's limiting aperture into two spatially separated apertures, or viewpoints, that alternately image an object field using filter-passband-matched, time-sequenced illumination. This technique was applied to construct a 3-D camera to image scenes at a working distance of 10 mm. We evaluated the effectiveness of the 3-D camera in generating stereo images using a statistical comparison of the depth resolutions achieved by the 3-D camera and a similar 2-D camera arrangement. The comparison showed that the complementary filters produce effective stereopsis at prescribed working distances.

Bae, Sam Y.; Korniski, Ron; Ream, Allen; Shahinian, Hrayr; Manohara, Harish M.

2012-02-01

96

Electronic imaging system incorporating a hand-held fundus camera for canine ophthalmology.  

PubMed

An electronic imaging system incorporating a hand-held fundus camera was used to collect images of the canine ocular fundus. The electronic imaging system comprised a hand-held fundus camera, an IBM personal computer (PC 350), Microsoft Windows NT 4.0, Adobe Photoshop, and a color printer (Tektronix Phaser 550) and was used to store, edit, and print the images captured by the fundus camera. Hand-held fundus cameras are essential for use in canine ophthalmology. The Nidek NM-100 hand-held fundus camera digitizes images, enabling their direct transfer into reports and their storage on writeable CDs. PMID:11703058

Hoang, H D; Brant, L M; Jaksetic, M D; Lake, S G; Stuart, B P

2001-11-01

97

AstraLux: the Calar Alto Lucky Imaging Camera  

E-print Network

AstraLux is the Lucky Imaging camera for the Calar Alto 2.2-m telescope, based on an electron-multiplying high speed CCD. By selecting only the best 1-10% of several thousand short exposure frames, AstraLux provides nearly diffraction limited imaging capabilities in the SDSS i' and z' filters over a field of view of 24x24 arcseconds. By choosing commercially available components wherever possible, the instrument could be built in a short time and at comparatively low cost. We present the instrument design, the data reduction pipeline, and summarise the performance and characteristics.

Felix Hormuth; Stefan Hippler; Wolfgang Brandner; Karl Wagner; Thomas Henning

2008-07-03

98

LROC WAC 100 Meter Scale Photometrically Normalized Map of the Moon  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) monthly global observations allowed derivation of a robust empirical photometric solution over a broad range of incidence, emission and phase (i, e, g) angles. Combining the WAC stereo-based GLD100 [1] digital terrain model (DTM) and LOLA polar DTMs [2] enabled precise topographic corrections to photometric angles. Over 100,000 WAC observations at 643 nm were calibrated to reflectance (I/F). Photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. The 6-dimensional data set was then reduced to 3 dimensions by photometrically normalizing I/F with a global solution similar to [3]. The global solution was calculated from three 2° × 2° tiles centered on (1°N, 147°E), (45°N, 147°E), and (89°N, 147°E), and included over 40 million WAC pixels. A least squares fit to a multivariate polynomial of degree 4 (f(i,e,g)) was performed, and the result was the starting point for a minimum search solving the non-linear function min[(1 - (I/F)/f(i,e,g))^2]. The input pixels were filtered to incidence angles (calculated from topography) < 89° and I/F greater than a minimum threshold to avoid shadowed pixels, and the output normalized I/F values were gridded into an equal-area map projection at 100 meters/pixel. At each grid location the median, standard deviation, and count of valid pixels were recorded. The normalized reflectance map is the result of the median of all normalized WAC pixels overlapping that specific 100-m grid cell. There are an average of 86 WAC normalized I/F estimates at each cell [3]. The resulting photometrically normalized mosaic provides the means to accurately compare I/F values for different regions on the Moon (see Nuno et al. [4]). The subtle differences in normalized I/F can now be traced across the local topography at regions that are illuminated at any point during the LRO mission (while the WAC was imaging), including at polar latitudes.
This continuous map of reflectance at 643 nm, normalized to a standard geometry of i=30°, e=0°, g=30°, ranges from 0.036 to 0.36 (0.01%-99.99% of the histogram) with a global mean reflectance of 0.115. Immature rays of Copernican craters are typically >0.14 and maria are typically <0.07, with averages for individual maria ranging from 0.046 to 0.060. The materials with the lowest normalized reflectance on the Moon are pyroclastic deposits at Sinus Aestuum (<0.036), and those with the highest normalized reflectance are found on steep crater walls (>0.36) [4]. 1. Scholten et al. (2012) J. Geophys. Res., 117, doi:10.1029/2011JE003926. 2. Smith et al. (2010) Geophys. Res. Lett., 37, L18204, doi:10.1029/2010GL043751. 3. Boyd et al. (2012) LPSC XLIII, #2795. 4. Nuno et al., AGU (this conference).
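As an illustration of the normalization pipeline described above, the following sketch fits a degree-4 polynomial f(i,e,g) by least squares and normalizes I/F to the standard geometry. Synthetic samples stand in for calibrated WAC pixels; the 0.115 scaling and the simple Lommel-Seeliger-like toy model are assumptions for the demo only, not the WAC data model.

```python
import numpy as np
from itertools import combinations_with_replacement

def design_matrix(i, e, g, degree=4):
    """Columns are monomials i^a * e^b * g^c with a + b + c <= degree."""
    cols = [np.ones_like(i)]
    for d in range(1, degree + 1):
        for combo in combinations_with_replacement((i, e, g), d):
            term = np.ones_like(i)
            for v in combo:
                term = term * v
            cols.append(term)
    return np.column_stack(cols)

# Synthetic (i, e, g, I/F) samples stand in for calibrated WAC pixels.
rng = np.random.default_rng(0)
i_ang, e_ang, g_ang = (rng.uniform(0.0, 1.4, 5000) for _ in range(3))
i_over_f = 0.115 * np.cos(i_ang) / (np.cos(i_ang) + np.cos(e_ang))

A = design_matrix(i_ang, e_ang, g_ang)                 # degree-4 polynomial terms
coeffs, *_ = np.linalg.lstsq(A, i_over_f, rcond=None)  # least-squares fit

# Normalize each observation to the standard geometry i=30 deg, e=0, g=30 deg.
std = np.radians([30.0, 0.0, 30.0])
f_std = design_matrix(*[np.array([v]) for v in std])[0] @ coeffs
f_obs = A @ coeffs
valid = f_obs > 1e-3                                   # crude shadow/outlier filter
normalized = np.where(valid, i_over_f / np.where(valid, f_obs, 1.0) * f_std, np.nan)
```

In practice the fitted solution would feed a subsequent non-linear minimum search, as the abstract describes; the linear fit here is only the starting point.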

Boyd, A. K.; Nuno, R. G.; Robinson, M. S.; Denevi, B. W.; Hapke, B. W.

2013-12-01

99

The role of camera-bundled image management software in the consumer digital imaging value chain  

NASA Astrophysics Data System (ADS)

This research was undertaken by the Convergence Center at the Syracuse University School of Information Studies (www.digital-convergence.info). Project ICONICA, the name for the research, focuses on the strategic implications of digital Images and the CONvergence of Image management and image CApture. Consumer imaging - the activity that we once called "photography" - is now recognized as being in the throes of a digital transformation. At the end of 2003, market researchers estimated that about 30% of the households in the U.S. and 40% of the households in Japan owned digital cameras. In 2004, of the 86 million new cameras sold (excluding one-time use cameras), a majority (56%) were estimated to be digital cameras. Sales of photographic film, while still profitable, are declining precipitously.

Mueller, Milton; Mundkur, Anuradha; Balasubramanian, Ashok; Chirania, Virat

2005-02-01

100

An efficient image compressor for charge coupled devices camera.  

PubMed

Recently, discrete wavelet transform (DWT) based compressors, such as JPEG2000 and CCSDS-IDC, have been widely seen as the state-of-the-art compression schemes for charge coupled device (CCD) cameras. However, CCD images projected onto the DWT basis produce a large number of large-amplitude high-frequency coefficients, because these images contain substantial complex texture and contour information, which is a disadvantage for the subsequent coding. In this paper, we propose a low-complexity posttransform coupled with compressive sensing (PT-CS) compression approach for remote sensing images. First, the DWT is applied to the remote sensing image. Then, a posttransform over a pair of bases is applied to the DWT coefficients. The pair comprises the DCT basis and the Hadamard basis, used at high and low bit rates, respectively. The best posttransform is selected by an lp-norm-based approach. The posttransform is regarded as the sparse representation stage of CS, and the posttransform coefficients are resampled by a sensing measurement matrix. Experimental results on on-board CCD camera images show that the proposed approach significantly outperforms the CCSDS-IDC-based coder; its performance is comparable to that of JPEG2000 at low bit rates, without the excessive implementation complexity of JPEG2000. PMID:25114977
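The lp-norm selection between the DCT and Hadamard bases described above can be sketched as follows. The block size (8×8) and p = 0.7 are illustrative assumptions, not values from the paper; the sparser (smaller lp quasi-norm) representation wins.

```python
import numpy as np
from scipy.fft import dct
from scipy.linalg import hadamard

def best_posttransform(block, p=0.7):
    """Pick the basis (DCT or Hadamard) whose coefficients have the
    smaller l_p quasi-norm, i.e. the sparser representation."""
    n = block.shape[0]
    # 2-D orthonormal DCT of the block.
    c_dct = dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')
    # 2-D orthonormal Hadamard transform (n must be a power of two).
    H = hadamard(n) / np.sqrt(n)
    c_had = H @ block @ H.T
    lp = lambda c: np.sum(np.abs(c) ** p)
    return ('dct', c_dct) if lp(c_dct) < lp(c_had) else ('hadamard', c_had)

# A smooth synthetic 8x8 patch standing in for a block of DWT coefficients.
block = np.outer(np.cos(np.arange(8)), np.cos(np.arange(8)))
name, coeffs = best_posttransform(block)
```

The selected coefficients would then be resampled by a sensing measurement matrix in the CS stage, which this sketch omits.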

Li, Jin; Xing, Fei; You, Zheng

2014-01-01

101

Parallel phase-sensitive three-dimensional imaging camera  

DOEpatents

An apparatus is disclosed for generating a three-dimensional (3-D) image of a scene illuminated by a pulsed light source (e.g. a laser or light-emitting diode). The apparatus, referred to as a phase-sensitive 3-D imaging camera utilizes a two-dimensional (2-D) array of photodetectors to receive light that is reflected or scattered from the scene and processes an electrical output signal from each photodetector in the 2-D array in parallel using multiple modulators, each having inputs of the photodetector output signal and a reference signal, with the reference signal provided to each modulator having a different phase delay. The output from each modulator is provided to a computational unit which can be used to generate intensity and range information for use in generating a 3-D image of the scene. The 3-D camera is capable of generating a 3-D image using a single pulse of light, or alternately can be used to generate subsequent 3-D images with each additional pulse of light.
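A common way to process such phase-delayed modulator outputs is four-phase demodulation, recovering range from the phase of the demodulated return. This is a generic illustration of that scheme under stated assumptions (0/90/180/270° reference delays, single-frequency modulation), not the patented processing itself.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def range_from_phases(i0, i90, i180, i270, f_mod):
    """Recover target range from four modulator outputs whose reference
    signals are delayed by 0, 90, 180 and 270 degrees."""
    phase = np.arctan2(i90 - i270, i0 - i180) % (2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)

# Simulate mixer outputs for a target 5 m away (round-trip phase delay).
f_mod = 10e6                                  # 10 MHz modulation
true_range = 5.0
delay = 4 * np.pi * f_mod * true_range / C    # round-trip phase, radians
samples = {ph: np.cos(delay - np.radians(ph)) for ph in (0, 90, 180, 270)}
est = range_from_phases(samples[0], samples[90], samples[180], samples[270], f_mod)
```

Note the unambiguous range of this scheme is C / (2 f_mod), i.e. about 15 m at 10 MHz; longer ranges wrap around.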

Smithpeter, Colin L. (Albuquerque, NM); Hoover, Eddie R. (Sandia Park, NM); Pain, Bedabrata (Los Angeles, CA); Hancock, Bruce R. (Altadena, CA); Nellums, Robert O. (Albuquerque, NM)

2007-09-25

102

An Efficient Image Compressor for Charge Coupled Devices Camera  

PubMed Central

Recently, discrete wavelet transform (DWT) based compressors, such as JPEG2000 and CCSDS-IDC, have been widely seen as the state-of-the-art compression schemes for charge coupled device (CCD) cameras. However, CCD images projected onto the DWT basis produce a large number of large-amplitude high-frequency coefficients, because these images contain substantial complex texture and contour information, which is a disadvantage for the subsequent coding. In this paper, we propose a low-complexity posttransform coupled with compressive sensing (PT-CS) compression approach for remote sensing images. First, the DWT is applied to the remote sensing image. Then, a posttransform over a pair of bases is applied to the DWT coefficients. The pair comprises the DCT basis and the Hadamard basis, used at high and low bit rates, respectively. The best posttransform is selected by an lp-norm-based approach. The posttransform is regarded as the sparse representation stage of CS, and the posttransform coefficients are resampled by a sensing measurement matrix. Experimental results on on-board CCD camera images show that the proposed approach significantly outperforms the CCSDS-IDC-based coder; its performance is comparable to that of JPEG2000 at low bit rates, without the excessive implementation complexity of JPEG2000. PMID:25114977

Li, Jin; Xing, Fei; You, Zheng

2014-01-01

103

Goal-oriented rectification of camera-based document images.  

PubMed

Document digitization with either flatbed scanners or camera-based systems results in document images which often suffer from warping and perspective distortions that deteriorate the performance of current OCR approaches. In this paper, we present a goal-oriented rectification methodology to compensate for undesirable document image distortions, aiming to improve the OCR result. Our approach relies upon a coarse-to-fine strategy. First, a coarse rectification is accomplished with the aid of a computationally low-cost transformation which addresses the projection of a curved surface to a 2-D rectangular area. The projection of the curved surface on the plane is guided only by the appearance of the textual content in the document image, using a transformation which does not depend on specific model primitives or camera setup parameters. Second, pose normalization is applied at the word level, aiming to restore all the local distortions of the document image. Experimental results on various document images with a variety of distortions demonstrate the robustness and effectiveness of the proposed rectification methodology, using a consistent evaluation methodology that considers both OCR accuracy and a newly introduced measure obtained via a semi-automatic procedure. PMID:20876019

Stamatopoulos, Nikolaos; Gatos, Basilis; Pratikakis, Ioannis; Perantonis, Stavros J

2011-04-01

104

Quantifying biodiversity using digital cameras and automated image analysis.  

NASA Astrophysics Data System (ADS)

Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. 
Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and enabling automatic deletion of images generated by erroneous triggering (e.g. cloud movements). This is the first step to a hierarchical image processing framework, where situation subclasses such as birds or climatic conditions can be fed into more appropriate automated or semi-automated data mining software.

Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

2009-04-01

105

The Atlases of Vesta derived from Dawn Framing Camera images  

NASA Astrophysics Data System (ADS)

The Dawn Framing Camera acquired about 6,000 clear-filter images at a resolution of about 60 m/pixel during its two HAMO (High Altitude Mapping Orbit) phases in 2011 and 2012. We combined these images in a global ortho-rectified mosaic of Vesta (60 m/pixel resolution). Only very small areas near the northern pole were still in darkness and are missing in the mosaic. The Dawn Framing Camera also acquired about 10,000 high-resolution clear-filter images (about 20 m/pixel) of Vesta during its Low Altitude Mapping Orbit (LAMO). Unfortunately, the northern part of Vesta was still in darkness during this phase; good illumination (incidence angle < 70°) was only available for 66.8 % of the surface [1]. We used the LAMO images to calculate another global mosaic of Vesta, this time with 20 m/pixel resolution. Both global mosaics were used to produce atlases of Vesta: a HAMO atlas with 15 tiles at a scale of 1:500,000 and a LAMO atlas with 30 tiles at a scale between 1:200,000 and 1:225,180. The nomenclature used in these atlases is based on names and places historically associated with the Roman goddess Vesta, and is compliant with the rules of the IAU. 65 names for geological features were already approved by the IAU; 39 additional names are currently under review. Selected examples of both atlases will be shown in this presentation. Reference: [1] Roatsch, Th., et al., High-resolution Vesta Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images. Planetary and Space Science (2013), http://dx.doi.org/10.1016/j.pss.2013.06.024

Roatsch, T.; Kersten, E.; Matz, K.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2013-12-01

106

Imaging of Venus from Galileo: Early results and camera performance  

USGS Publications Warehouse

Three images of Venus have been returned so far by the Galileo spacecraft following an encounter with the planet on UT February 10, 1990. The images, taken at effective wavelengths of 4200 and 9900 Å, characterize the global motions and distribution of haze near the Venus cloud tops and, at the latter wavelength, deep within the main cloud. Previously undetected markings are clearly seen in the near-infrared image. The global distribution of these features, which have maximum contrasts of 3%, is different from that recorded at short wavelengths. In particular, the "polar collar," which is omnipresent in short wavelength images, is absent at 9900 Å. The maximum contrast in the features at 4200 Å is about 20%. The optical performance of the camera is described and is judged to be nominal. © 1992.

Belton, M.J.S.; Gierasch, P.; Klaasen, K.P.; Anger, C.D.; Carr, M.H.; Chapman, C.R.; Davies, M.E.; Greeley, R.; Greenberg, R.; Head, J.W.; Neukum, G.; Pilcher, C.B.; Veverka, J.; Fanale, F.P.; Ingersoll, A.P.; Pollock, J.B.; Morrison, D.; Clary, M.C.; Cunningham, W.; Breneman, H.

1992-01-01

107

A two-camera imaging system for pest detection and aerial application  

Technology Transfer Automated Retrieval System (TEKTRAN)

This presentation reports on the design and testing of an airborne two-camera imaging system for pest detection and aerial application assessment. The system consists of two digital cameras with 5616 x 3744 effective pixels. One camera captures normal color images with blue, green and red bands, whi...

108

Image-intensifier camera studies of shocked metal surfaces  

SciTech Connect

A high-space-resolution image-intensifier camera with luminance gain of up to 5000 and exposure times as short as 30 ns has been applied to the study of the interaction of posts and welds with strongly shocked metal surfaces, which included super-strong steels. The time evolution of a single experiment can be recorded by multiple pulsing of the camera. Phenomena that remain coherent for relatively long durations have been observed. An important feature of the hydrodynamic flow resulting from post-plate interactions is the creation of a wave that propagates outward on the plate; the flow blocks the explosive product gases from escaping through the plate for greater than 10 μs. Electron beam welds were ineffective in blocking product gases from escaping for even short periods of time.

Engelke, R.P.; Thurston, R.S.

1986-01-01

109

Segmentation of Camera Captured Business Card Images for Mobile Devices  

E-print Network

Due to the large deformations in camera-captured images, the wide variety of business card designs, and the computational constraints of mobile devices, the design of an efficient Business Card Reader (BCR) is challenging for researchers. Extracting text regions and segmenting them into characters is one such challenge. In this paper, we present an efficient character segmentation technique for business card images captured by a cell-phone camera, designed in our present work towards developing an efficient BCR. At first, text regions are extracted from the card images and then the skewed ones are corrected using a computationally efficient skew correction technique. At last, these skew-corrected text regions are segmented into lines and characters based on horizontal and vertical histograms. Experiments show that the present technique is efficient and applicable for mobile devices, and a mean segmentation accuracy of 97.48% is achieved with 3 mega-pixel (500-600 dpi) images. It takes only 1.1 se...
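The histogram-based line segmentation step can be sketched as follows. This is a minimal illustration of horizontal-projection splitting on a binarized region; the paper's full method also performs skew correction and character-level (vertical histogram) segmentation.

```python
import numpy as np

def segment_lines(binary):
    """Split a binarized text region (1 = ink) into (top, bottom) row
    ranges using the horizontal projection histogram."""
    rows = binary.sum(axis=1) > 0     # which rows contain any ink
    lines, start = [], None
    for r, ink in enumerate(rows):
        if ink and start is None:
            start = r                 # a text line begins
        elif not ink and start is not None:
            lines.append((start, r))  # a blank row ends the line
            start = None
    if start is not None:
        lines.append((start, len(rows)))
    return lines

# Two synthetic text lines separated by a blank band.
img = np.zeros((20, 50), dtype=int)
img[2:6, 5:45] = 1
img[12:17, 5:45] = 1
print(segment_lines(img))  # → [(2, 6), (12, 17)]
```

Character segmentation proceeds the same way per line, using the vertical projection (column sums) instead.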

Mollah, Ayatullah Faruk; Nasipuri, Mita

2011-01-01

110

MECHANICAL ADVANCING HANDLE THAT SIMPLIFIES MINIRHIZOTRON CAMERA REGISTRATION AND IMAGE COLLECTION  

EPA Science Inventory

Minirhizotrons in conjunction with a minirhizotron video camera system are becoming widely used tools for investigating root production and survival in a variety of ecosystems. Image collection with a minirhizotron camera can be time consuming and tedious, particularly when hundre...

111

Superresolution imaging in optical tweezers using high-speed cameras.  

PubMed

High-speed cameras are reliable alternatives for the direct characterization of optical trap force and particle motion in optical tweezers setups, replacing indirect motion measurements often performed by quadrant detectors. In the present approach, subpixel motion data of the trapped particle is retrieved from a high-speed low-resolution video sequence. Due to the rich motion diversity of microscopic trapped particles, which are subject to Brownian motion, we propose to also use the obtained motion information for tackling the inherent lack of resolution by applying superresolution algorithms on the low-resolution image sequence. The obtained results, both for trapping calibration beads and for living bacteria, show that the proposed approach allows the proper characterization of the optical tweezers by obtaining the real particle motion directly from the image domain, while still providing high resolution imaging. PMID:20389339
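Sub-pixel motion retrieval from low-resolution frames can be illustrated with a simple centre-of-mass tracker. This is an assumption for illustration; the abstract does not specify the authors' actual tracking algorithm.

```python
import numpy as np

def subpixel_centroid(frame):
    """Centre-of-mass estimate of a bright spot's position, giving
    sub-pixel coordinates from a low-resolution intensity frame."""
    frame = frame.astype(float)
    ys, xs = np.indices(frame.shape)
    total = frame.sum()
    return (xs * frame).sum() / total, (ys * frame).sum() / total

# A Gaussian spot centred between pixels at (x, y) = (10.3, 7.6),
# standing in for an image of a trapped bead.
ys, xs = np.indices((16, 16))
spot = np.exp(-((xs - 10.3) ** 2 + (ys - 7.6) ** 2) / 2.0)
x, y = subpixel_centroid(spot)
```

The sequence of such sub-pixel positions across frames is exactly the kind of motion information a superresolution algorithm can exploit to register and fuse the low-resolution frames.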

Staforelli, Juan Pablo; Vera, Esteban; Brito, José Manuel; Solano, Pablo; Torres, Sergio; Saavedra, Carlos

2010-02-15

112

Frequency Identification of Vibration Signals Using Video Camera Image Data  

PubMed Central

This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture the most dominant modes of a vibration signal, but may introduce non-physical modes induced by insufficient frame rates. Using a simple model, the frequencies of these modes are properly predicted and excluded. Two experimental designs, which involve using an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera from, for instance, 0 to 256 levels, was enhanced by summing gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line on the surface of the vibration system in operation to increase the gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has a critical frequency for inducing false modes of 60 Hz, whereas that of the webcam is 7.8 Hz. Several factors were proven to have the effect of partially suppressing the non-physical modes, but they cannot eliminate them completely. Two examples, the prominent vibration modes of which are less than the associated critical frequencies, are examined to demonstrate the performances of the proposed systems. In general, the experimental data show that the non-contact type image data acquisition systems are potential tools for collecting the low-frequency vibration signal of a system. PMID:23202026
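The folding (aliasing) relation commonly used to predict such non-physical modes can be sketched as follows. The 30 fps example rate is illustrative, not a rate from the paper.

```python
def aliased_frequency(f_signal, frame_rate):
    """Frequency at which a vibration of f_signal Hz appears in video
    sampled at frame_rate fps (standard spectral folding relation)."""
    f = f_signal % frame_rate
    return min(f, frame_rate - f)

# A 60 Hz vibration filmed at 30 fps folds to 0 Hz (it appears static),
# while a 52 Hz vibration folds to a spurious 8 Hz mode.
print(aliased_frequency(60.0, 30.0))  # → 0.0
print(aliased_frequency(52.0, 30.0))  # → 8.0
```

Predicting these folded frequencies in advance lets them be excluded from the measured spectrum, as the abstract describes.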

Jeng, Yih-Nen; Wu, Chia-Hung

2012-01-01

113

High-contrast imaging with a self-coherent camera  

NASA Astrophysics Data System (ADS)

Direct imaging of exoplanets is very challenging because the planet is 10^4 to 10^10 times fainter than the star at a separation of a fraction of an arcsecond. Several coronagraphs have been proposed to reduce the contrast ratio, but their performance strongly depends on the level of phase and amplitude aberrations that induce speckles in the science image. An active control of the aberrations and a posteriori calibration are thus required to reach very high contrasts. Classical adaptive optics are not sufficient for this purpose because of non-common path aberrations. Our team proposed a self-coherent camera that spatially modulates the speckles in the science image. It is then possible to both actively control a deformable mirror and calibrate the residuals a posteriori. The current paper is an overview of the developments we have been working on for 7 years. We present the principle of the self-coherent camera, laboratory performance obtained in monochromatic light, and upgrades of the technique to make it achromatic.

Galicher, Raphaël.; Mazoyer, Johan; Baudoz, Pierre; Rousset, Gérard

2013-09-01

114

The Herschel/PACS 2560 bolometers imaging camera  

E-print Network

The development program of the flight model imaging camera for the PACS instrument on-board the Herschel spacecraft is nearing completion. This camera has two channels covering the 60 to 210 microns wavelength range. The focal plane of the short wavelength channel is made of a mosaic of 2x4 3-sides buttable bolometer arrays (16x16 pixels each) for a total of 2048 pixels, while the long wavelength channel has a mosaic of 2 of the same bolometer arrays for a total of 512 pixels. The 10 arrays have been fabricated, individually tested and integrated in the photometer. They represent the first filled arrays of fully collectively built bolometers with a cold multiplexed readout, allowing for a properly sampled coverage of the full instrument field of view. The camera has been fully characterized and the ground calibration campaign will take place after its delivery to the PACS consortium in mid 2006. The bolometers, working at a temperature of 300 mK, have a NEP close to the BLIP limit and an optical bandwidth of 4 to 5 Hz that will permit the mapping of large sky areas. This paper briefly presents the concept and technology of the detectors as well as the cryocooler and the warm electronics. Then we focus on the performances of the integrated focal planes (responsivity, NEP, low frequency noise, bandwidth).

Nicolas Billot; Patrick Agnese; Jean-Louis Augueres; Alain Beguin; Andre Bouere; Olivier Boulade; Christophe Cara; Christelle Cloue; Eric Doumayrou; Lionel Duband; Benoit Horeau; Isabelle Le Mer; Jean Le Pennec; Jerome Martignac; Koryo Okumura; Vincent Reveret; Marc Sauvage; Francois Simoens; Laurent Vigroux

2006-06-26

115

Noise evaluation of Compton camera imaging for proton therapy.  

PubMed

Compton cameras emerged as an alternative for real-time dose monitoring techniques for Particle Therapy (PT), based on the detection of prompt gammas. As a consequence of the Compton scattering process, the gamma origin point can be restricted onto the surface of a cone (Compton cone). Through image reconstruction techniques, the distribution of the gamma emitters can be estimated, using cone-surface backprojections of the Compton cones through the image space, along with more sophisticated statistical methods to improve the image quality. To calculate the Compton cone required for image reconstruction, either two interactions, the last being photoelectric absorption, or three scatter interactions are needed. Because of the high energy of the photons in PT the first option might not be adequate, as the photon is in general not absorbed. However, the second option is less efficient. That is the reason to resort to spectral reconstructions, where the incoming γ energy is considered as a variable in the reconstruction inverse problem. Jointly with prompt gammas, secondary neutrons and scattered photons, not strongly correlated with the dose map, can also reach the imaging detector and produce false events. These events deteriorate the image quality. Also, high-intensity beams can produce particle accumulation in the camera, which leads to an increase in random coincidences, i.e., events that gather measurements from different incoming particles. The noise scenario is expected to be different if double or triple events are used, and consequently, the reconstructed images can be affected differently by spurious data. The aim of the present work is to study the effect of false events in the reconstructed image, evaluating their impact in the determination of the beam particle ranges. A simulation study that includes misidentified events (neutrons and random coincidences) in the final image of a Compton Telescope for PT monitoring is presented. 
The complete chain of detection, from the beam particle entering a phantom to the event classification, is simulated using FLUKA. The range determination is later estimated from the reconstructed image obtained from a two and three-event algorithm based on Maximum Likelihood Expectation Maximization. The neutron background and random coincidences due to a therapeutic-like time structure are analyzed for mono-energetic proton beams. The time structure of the beam is included in the simulations, which will affect the rate of particles entering the detector. PMID:25658644
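For the two-event case with full absorption, the Compton cone half-angle follows from the standard Compton scattering formula, using the energy deposited in the first scatter and the remaining (absorbed) photon energy. A minimal sketch; the 4 MeV example energy is illustrative, not a value from the paper.

```python
import numpy as np

MEC2 = 0.511  # electron rest energy, MeV

def compton_cone_angle(e_deposited, e_remaining):
    """Half-angle (degrees) of the Compton cone from the energy deposited
    in the first scatter and the remaining photon energy, via
    cos(theta) = 1 - me*c^2 * (1/E' - 1/E)."""
    e_in = e_deposited + e_remaining          # incident photon energy
    cos_theta = 1.0 - MEC2 * (1.0 / e_remaining - 1.0 / e_in)
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# A 4 MeV prompt gamma depositing 1 MeV in the scatterer:
angle = compton_cone_angle(1.0, 3.0)
```

Misidentified events (neutrons, random coincidences) feed wrong energies into this relation, producing cones that do not intersect the true emitter distribution, which is the noise mechanism the study evaluates.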

Ortega, P G; Torres-Espallardo, I; Cerutti, F; Ferrari, A; Gillam, J E; Lacasta, C; Llosá, G; Oliver, J F; Sala, P R; Solevi, P; Rafecas, M

2015-02-21

116

A comparison of select image-compression algorithms for an electronic still camera  

NASA Technical Reports Server (NTRS)

This effort is a study of image-compression algorithms for an electronic still camera. An electronic still camera can record and transmit high-quality images without the use of film, because images are stored digitally in computer memory. However, high-resolution images contain an enormous amount of information, and will strain the camera's data-storage system. Image compression will allow more images to be stored in the camera's memory. For the electronic still camera, a compression algorithm that produces a reconstructed image of high fidelity is most important. Efficiency of the algorithm is the second priority. High fidelity and efficiency are more important than a high compression ratio. Several algorithms were chosen for this study and judged on fidelity, efficiency and compression ratio. The transform method appears to be the best choice. At present, the method is compressing images to a ratio of 5.3:1 and producing high-fidelity reconstructed images.
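The fidelity and compression-ratio criteria can be illustrated with PSNR, a common reconstructed-image fidelity measure; the study's exact metric is not stated in the abstract, so this is a stand-in.

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means better fidelity."""
    mse = np.mean((np.asarray(original, float) - np.asarray(reconstructed, float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64))
noisy = np.clip(img + rng.normal(0.0, 2.0, img.shape), 0, 255)  # mild distortion
fidelity = psnr(img, noisy)

# Compression ratio: raw size over compressed size (hypothetical sizes).
ratio = (64 * 64 * 8) / 6200.0   # ≈ 5.3:1, the ratio quoted in the abstract
```

Ranking candidate codecs by fidelity first and ratio second, as the abstract prioritizes, is then a straightforward sort over these two numbers.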

Nerheim, Rosalee

1989-01-01

117

Embedded image enhancement for high-throughput cameras  

NASA Astrophysics Data System (ADS)

This paper presents image enhancement for a novel Ultra-High-Definition (UHD) video camera offering 4K images and higher. Conventional image enhancement techniques need to be reconsidered for the high-resolution images and the low-light sensitivity of the new sensor. We study two image enhancement functions and evaluate and optimize the algorithms for embedded implementation in programmable logic (FPGA). The enhancement study involves high-quality Auto White Balancing (AWB) and Local Contrast Enhancement (LCE). We have compared multiple algorithms from literature, both with objective and subjective metrics. In order to objectively compare Local Contrast (LC), an existing LC metric is modified for LC measurement in UHD images. For AWB, we have found that color histogram stretching offers a subjective high image quality and it is among the algorithms with the lowest complexity, while giving only a small balancing error. We impose a color-to-color gain constraint, which improves robustness of low-light images. For local contrast enhancement, a combination of contrast preserving gamma and single-scale Retinex is selected. A modified bilateral filter is designed to prevent halo artifacts, while significantly reducing the complexity and simultaneously preserving quality. We show that by cascading contrast preserving gamma and single-scale Retinex, the visibility of details is improved towards the level appropriate for high-quality surveillance applications. The user is offered control over the amount of enhancement. Also, we discuss the mapping of those functions on a heterogeneous platform to come to an effective implementation while preserving quality and robustness.
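Color histogram stretching with a color-to-color gain constraint, as selected above for AWB, might look like the following sketch. The percentile choices and the gain-ratio limit are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def awb_histogram_stretch(img, low=0.01, high=0.99, max_gain_ratio=1.5):
    """Per-channel histogram stretching for auto white balance.  A simple
    color-to-color gain constraint limits how far channel gains may
    diverge, for robustness on low-light images."""
    los = np.quantile(img, low, axis=(0, 1))     # per-channel low percentile
    his = np.quantile(img, high, axis=(0, 1))    # per-channel high percentile
    gains = 1.0 / np.maximum(his - los, 1e-6)
    # Constraint: no channel gain may exceed the smallest gain by more
    # than max_gain_ratio.
    gains = np.minimum(gains, gains.min() * max_gain_ratio)
    return np.clip((img - los) * gains, 0.0, 1.0)

rng = np.random.default_rng(2)
img = rng.uniform(0.2, 0.6, (32, 32, 3))   # low-contrast, color-cast input
balanced = awb_histogram_stretch(img)
```

Without the gain constraint, a noisy near-empty channel in a dark scene would receive an extreme gain and tint the whole image; clamping the ratio trades some balancing accuracy for robustness, matching the trade-off described above.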

Geerts, Stan J. C.; Cornelissen, Dion; de With, Peter H. N.

2014-03-01

118

Measurement of Space Variant PSF for Restoring Degraded Images by Security Cameras  

Microsoft Academic Search

Images recorded by a security camera are often severely degraded due to a dirty lens or secular distortion of the recording system. To restore these images, full determination of the space-variant point spread function (PSF) is required. To measure the PSF, we used a liquid crystal display. We carried out experiments to restore the images from a CCD camera with intentionally

Tadashi Ito; Y. Fujii; N. Ohta; S. Saitoh; T. Matsuura; T. Yamamoto

2006-01-01

119

LROC NAC Photometry as a Tool for Studying Physical and Compositional Properties of the Lunar Surface  

NASA Astrophysics Data System (ADS)

LROC NAC photometry has been used to study the effects of rocket exhaust on lunar soil properties, and here we apply the same photometric methods to place compositional constraints on regions of silicic volcanism and pure anorthosite on the Moon.

Clegg, R. N.; Jolliff, B. L.; Boyd, A. K.; Stopar, J. D.; Sato, H.; Robinson, M. S.; Hapke, B. W.

2014-10-01

120

A Hybrid Camera for simultaneous imaging of gamma and optical photons  

NASA Astrophysics Data System (ADS)

We present a new concept for a medical imaging system, the Hybrid Mini Gamma Camera (HMGC). This combines an optical and a gamma-ray camera in a co-aligned configuration that offers high spatial resolution multi-modality imaging for superimposition of a scintigraphic image on an optical image. This configuration provides visual identification of the sites of localisation of radioactivity that would be especially suited to medical imaging. An extension of the new concept using two hybrid cameras (The StereoScope) offers the potential for stereoscopic imaging with depth estimation for a gamma emitting source.

Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

2012-06-01

121

Distant Supernova Remnant Imaged by Chandra's High Resolution Camera  

NASA Astrophysics Data System (ADS)

The High Resolution Camera (HRC), one of the two X-ray cameras on NASA's Chandra X-ray Observatory, was placed into the focus for the first time on Monday, August 30. The first target was LMC X-1, a point-like source of X rays in the Large Magellanic Cloud. The Large Magellanic Cloud, a companion galaxy to the Milky Way, is 160,000 light years from Earth. After checking the focus with LMC X-1, Chandra observed N132D, a remnant of an exploded star in the Large Magellanic Cloud. "These were preliminary test observations," emphasized Dr. Stephen Murray, of the Harvard-Smithsonian Center for Astrophysics, principal investigator for the High Resolution Camera. "But we are very pleased with the results. All indications are that the HRC will produce X-ray images of unprecedented clarity." The N132D image shows a highly structured remnant, or shell, of 10-million-degree gas that is 80 light years across. Such a shell in the vicinity of the Sun would encompass more than fifty nearby stars. The amount of material in the N132D hot gas remnant is equal to that of 600 suns. The N132D supernova remnant appears to be colliding with a giant molecular cloud, which produces the brightening on the southern rim of the remnant. The molecular cloud, visible with a radio telescope, has the mass of 300,000 suns. The relatively weak x-radiation on the upper left shows that the shock wave is expanding into a less dense region on the edge of the molecular cloud. A number of small circular structures are visible in the central regions and a hint of a large circular loop can be seen in the upper part of the remnant. Whether the peculiar shape of the supernova remnant can be fully explained in terms of these effects, or whether they point to a peculiar cylindrically shaped explosion remains to be seen. "The image is so rich in structure that it will take a while to sort out what is really going on," Murray said. 
"It could be multiple supernovas, or absorbing clouds in the vicinity of the supernova." The unique capabilities of the HRC stem from the close match of its imaging capability to the focusing power of the mirrors. When used with the Chandra mirrors, the HRC will make images that reveal detail as small as one-half an arc second. This is equivalent to the ability to read a stop sign at a distance of twelve miles. The checkout period for the HRC will continue for the next few weeks, during which time the team expects to acquire images of other supernova remnants, star clusters, and starburst galaxies. To follow Chandra's progress, visit the Chandra News Web site at: http://chandra.harvard.edu AND http://chandra.nasa.gov NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra X-ray Observatory for NASA's Office of Space Science, NASA Headquarters, Washington, D.C. The Smithsonian Astrophysical Observatory's Chandra X-ray Center in Cambridge, Mass., manages the Chandra science program and controls the observatory for NASA. TRW Space and Electronics Group of Redondo Beach, Calif., leads the contractor team that built Chandra. High resolution digital versions of the X-ray image (300 dpi JPG, TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu/photo/0050/ or via links in: http://chandra.harvard.edu

1999-09-01

122

Accurate camera calibration with distortion models using sphere images  

NASA Astrophysics Data System (ADS)

To improve the accuracy of camera calibration based on sphere images, this paper proposes a novel approach that calibrates both the linear parameters and the distortion coefficients simultaneously. The major axis and bitangent lines of the projected conics are used to solve for the principal point and the projections of the sphere centers. The focal length is then computed using the rotational symmetry of the projective cone. Finally, the distortion coefficients are estimated by an optimization search algorithm. Synthetic data experiments analyzed the main error factors, and real-data results showed that the re-projection error of the proposed method was less than 0.1 pixel, more accurate than existing sphere-based methods and at the same accuracy level as the planar-target method.
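As background to the abstract above, a minimal sketch of the quantities involved: a pinhole projection with a two-term radial distortion model, and the RMS reprojection error used to report accuracy (the "less than 0.1 pixel" figure is an error of this kind). All parameter values below are invented for illustration; this is not the paper's algorithm.

```python
import numpy as np

def project(points_3d, fx, fy, cx, cy, k1, k2):
    """Project 3-D points (N,3) in camera coordinates to pixel
    coordinates, applying a two-term radial distortion model."""
    x = points_3d[:, 0] / points_3d[:, 2]
    y = points_3d[:, 1] / points_3d[:, 2]
    r2 = x**2 + y**2
    d = 1.0 + k1 * r2 + k2 * r2**2      # radial distortion factor
    u = fx * (x * d) + cx
    v = fy * (y * d) + cy
    return np.stack([u, v], axis=1)

def reprojection_error(observed, projected):
    """Root-mean-square reprojection error in pixels."""
    return np.sqrt(np.mean(np.sum((observed - projected) ** 2, axis=1)))

# Example: with the true parameters the reprojection error is zero;
# a perturbed focal length produces a nonzero error.
pts = np.array([[0.1, -0.2, 2.0], [0.3, 0.1, 1.5], [-0.2, 0.2, 3.0]])
obs = project(pts, 800.0, 800.0, 320.0, 240.0, -0.1, 0.01)
print(reprojection_error(obs, project(pts, 800.0, 800.0, 320.0, 240.0, -0.1, 0.01)))  # 0.0
print(reprojection_error(obs, project(pts, 810.0, 800.0, 320.0, 240.0, -0.1, 0.01)))
```

A calibration method of the kind described searches over (fx, fy, cx, cy, k1, k2) to minimize this error against observed image features.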

Sun, Junhua; Chen, Xu; Gong, Zheng; Liu, Zhen; Zhao, Yuntao

2015-01-01

123

Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection  

SciTech Connect

An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

Chai, Kil-Byoung; Bellan, Paul M. [Applied Physics, Caltech, 1200 E. California Boulevard, Pasadena, California 91125 (United States)]

2013-12-15

124

Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection  

NASA Astrophysics Data System (ADS)

An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

Chai, Kil-Byoung; Bellan, Paul M.

2013-12-01

125

Coded-aperture Compton camera for gamma-ray imaging  

NASA Astrophysics Data System (ADS)

This dissertation describes the development of a novel gamma-ray imaging system concept and presents results from Monte Carlo simulations of the new design. Current designs for large field-of-view gamma cameras suitable for homeland security applications implement either a coded aperture or a Compton scattering geometry to image a gamma-ray source. Both of these systems require large, expensive position-sensitive detectors in order to work effectively. By combining characteristics of both of these systems, a new design can be implemented that does not require such expensive detectors and that can be scaled down to a portable size. This new system has significant promise in homeland security, astronomy, botany and other fields, while future iterations may prove useful in medical imaging, other biological sciences and other areas, such as non-destructive testing. A proof-of-principle study of the new gamma-ray imaging system has been performed by Monte Carlo simulation. Various reconstruction methods have been explored and compared. General-Purpose Graphics-Processor-Unit (GPGPU) computation has also been incorporated. The resulting code is a primary design tool for exploring variables such as detector spacing, material selection, material thickness, and pixel geometry. The advancement of the system from a simple 1-dimensional simulation to a full 3-dimensional model is described. Methods of image reconstruction are discussed, and results of simulations consisting of both a 4 x 4 and a 16 x 16 object space mesh are presented. A discussion of the limitations and potential areas of further study is also presented.

Farber, Aaron M.

126

Progress beyond ISIS: combined triple-ISIS camera, video trigger, and terraced image sensor  

NASA Astrophysics Data System (ADS)

In 2001, we developed a video camera of 1,000,000 fps with an in-situ storage image sensor (ISIS). Its performance is briefly described first. We are now developing innovative technologies to provide the ultra-high-speed video camera with a higher level of performance and more useful functions, including the combined triple-ISIS camera, the built-in video trigger system, and the terraced image sensor. Their concepts are explained together with the expected performance.

Etoh, Takeharu G.

2003-07-01

127

Progress beyond ISIS: combined triple-ISIS camera, video trigger, and terraced image sensor  

Microsoft Academic Search

In 2001, we developed a video camera of 1,000,000 fps with an in-situ storage image sensor (ISIS). Its performance is briefly described first. We are now developing innovative technologies to provide the ultra-high-speed video camera with a higher level of performance and more useful functions, including the combined triple-ISIS camera, the built-in video trigger system, and the terraced image sensor.

Takeharu G. Etoh

2003-01-01

128

Imaging Cameras Imagers can be put at almost any focus, but  

E-print Network

[Slide excerpts: 400x400; Pixels/Camera; Direct Imaging; filter systems (photometry); point sources (aperture, PSF fitting); extended sources (surface photometry); star-galaxy separation; filter systems and the bandpass are affected by temperature and beam speed (CWL shifts with temperature and with beam speed, e.g. f/13 vs. f/2.8); Photometry.]

Masci, Frank

129

A CCD CAMERA-BASED HYPERSPECTRAL IMAGING SYSTEM FOR STATIONARY AND AIRBORNE APPLICATIONS  

Technology Transfer Automated Retrieval System (TEKTRAN)

This paper describes a charge coupled device (CCD) camera-based hyperspectral imaging system designed for both stationary and airborne remote sensing applications. The system consists of a high performance digital CCD camera, an imaging spectrograph, an optional focal plane scanner, and a PC comput...

130

Importance of Developing Image Restoration Techniques for Security Cameras under Severe Conditions  

Microsoft Academic Search

A concept, proposed and pursued by the authors, for restoring the images of a suspect taken by a security camera is reviewed. The concept exploits a peculiarity of security camera systems: everything in the image except the suspect is usually preserved and can therefore be used for investigation.

Y. Fujii; T. Ito; N. Ohta; S. Saitoh; T. Matsuura; T. Yamamoto

2006-01-01

131

Image Restoration for Security Cameras with Dirty Lens under Oblique Illumination  

Microsoft Academic Search

An approach for restoring the images of a suspect taken by a security camera with a dirty lens is proposed. The approach exploits a peculiarity of security camera systems: everything in the image except the suspect is usually preserved and is to be used as fully as possible for investigation.

Yusaku Fujii; Naoya Ohta; Tadashi Ito; Saburou Saitoh; Tsutomu Matsuura; Takao Yamamoto

2006-01-01

132

Experimental and modeling studies of imaging with curvilinear electronic eye cameras  

E-print Network

Experimental and modeling studies of imaging with curvilinear electronic eye cameras: the imaging properties of planar, hemispherical, and elliptic parabolic electronic eye cameras are compared. [Cited reference fragment: ...-J. Yu, J. B. Geddes 3rd, J. Xiao, S. Wang, Y. Huang, and J. A. Rogers, "A hemispherical electronic eye..."]

Rogers, John A.

133

ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System  

SciTech Connect

This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all the color video imaging system electronics within the 101-SY tank vapor space during loss of nitrogen purge pressure.

Werry, S.M.

1995-06-06

134

Watermarking of raw digital images in camera firmware: embedding and detection  

Microsoft Academic Search

In this paper we investigate ‘real-time’ watermarking of single-sensor digital camera images (often called ‘raw’ images) and blind watermark detection in demosaicked images. We describe the software-only implementation of simple additive spread-spectrum embedding in the firmware of a digital camera. For blind watermark detection, we develop a scheme which adaptively combines the polyphase components of the demosaicked image, taking advantage

Peter Meerwald

2009-01-01

135

An improved wide-field camera for imaging Earth's plasmasphere at 30.4 nm  

NASA Astrophysics Data System (ADS)

The Extreme Ultraviolet Imager (IMAGE/EUV) aboard NASA's IMAGE mission studied the distribution of singly ionized helium (He+) in the Earth's plasmasphere by imaging its emission at 30.4 nm. This instrument consisted of three separate camera heads, each with a 28° field-of-view, with 0.6° resolution. We describe an improved imaging system that can simultaneously image a 40° field-of-view with 0.45° resolution utilizing only one compact camera head and detector. This improved imager also increases sensitivity over the heritage EUV imager by a factor of four due to improvements in optical coatings, detector technology, and a larger entrance aperture.

Davis, Michael W.; Gladstone, G. Randall; Goldstein, Jerry; Sandel, Bill R.; Greathouse, Thomas K.; Retherford, Kurt D.; Winters, Gregory S.

2013-09-01

136

Toward an image compression algorithm for the high-resolution electronic still camera  

NASA Technical Reports Server (NTRS)

Taking pictures with a camera that uses a digital recording medium instead of film has the advantage of recording and transmitting images without the use of a darkroom or a courier. However, high-resolution images contain an enormous amount of information and strain data-storage systems. Image compression will allow multiple images to be stored in the High-Resolution Electronic Still Camera. The camera is under development at Johnson Space Center. Fidelity of the reproduced image and compression speed are of paramount importance. Lossless compression algorithms are fast and faithfully reproduce the image, but their compression ratios will be unacceptably low due to noise in the front end of the camera. Future efforts will include exploring methods that will reduce the noise in the image and increase the compression ratio.
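The point about noise capping lossless ratios can be demonstrated in a few lines: the same slowly varying "image" data compresses far better without per-sample noise than with it. This is a generic illustration with synthetic byte streams, not the camera's actual data or algorithm.

```python
import random
import zlib

random.seed(0)
n = 100_000

# A noise-free gradient "image" versus the same gradient with +/-2 counts
# of simulated front-end sensor noise added to each sample.
smooth = bytes((i // 400) % 256 for i in range(n))
noisy = bytes(((i // 400) + random.randint(-2, 2)) % 256 for i in range(n))

def ratio(data: bytes) -> float:
    """Lossless compression ratio achieved by zlib at maximum effort."""
    return len(data) / len(zlib.compress(data, 9))

print(f"smooth: {ratio(smooth):.1f}x  noisy: {ratio(noisy):.1f}x")
```

The noisy stream's ratio collapses because the random low-order bits are incompressible, which is exactly why the abstract pairs compression work with noise reduction.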

Nerheim, Rosalee

1989-01-01

137

Analysis of a multiple reception model for processing images from the solid-state imaging camera  

NASA Technical Reports Server (NTRS)

A detection model to identify the presence of Galileo optical communications from an Earth-based Transmitter (GOPEX) signal by processing multiple signal receptions extracted from the camera images is described. The model decomposes a multi-signal reception camera image into a set of images so that the location of the pixel being illuminated is known a priori and the laser can illuminate only one pixel at each reception instance. Numerical results show that if effects on the pointing error due to atmospheric refraction can be controlled to between 20 and 30 microrad, the beam divergence of the GOPEX laser should be adjusted to between 30 and 40 microrad when the spacecraft is 30 million km away from Earth. Furthermore, increasing the number of receptions used for processing beyond five will not produce a significant detection probability advantage.
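The diminishing-returns claim about using more than five receptions can be illustrated with a simplified model: if each reception independently detects the laser with probability p, the probability of at least one detection in n receptions is 1 - (1-p)^n, and the marginal gain shrinks geometrically. The independence assumption and the value of p are illustrative only, not the paper's actual model.

```python
# Probability of at least one detection in n independent receptions,
# each with per-reception detection probability p (illustrative value).
p = 0.5

prev = 0.0
for n in range(1, 9):
    prob = 1.0 - (1.0 - p) ** n
    print(f"n={n}: P(detect) = {prob:.4f}  (gain {prob - prev:.4f})")
    prev = prob
```

With p = 0.5 the gain from the sixth reception is already under 2 percentage points, consistent in spirit with the abstract's conclusion.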

Yan, T.-Y.

1991-01-01

138

A novel ultra-high speed camera for digital image processing applications  

NASA Astrophysics Data System (ADS)

Multi-channel gated-intensified cameras are commonly used for capturing images at ultra-high frame rates. The use of image intensifiers reduces the image resolution and increases the error in applications requiring high-quality images, such as digital image correlation. We report the development of a new type of non-intensified multi-channel camera system that permits recording of image sequences at ultra-high frame rates at the native resolution afforded by the imaging optics and the cameras used. This camera system is based upon the concept of using a sequence of short-duration light pulses of different wavelengths for illumination and using wavelength selective elements in the imaging system to route each particular wavelength of light to a particular camera. As such, the duration of the light pulses controls the exposure time and the timing of the light pulses controls the interframe time. A prototype camera system built according to this concept comprises four dual-frame cameras synchronized with four dual-cavity pulsed lasers producing 5 ns pulses in four different wavelengths. The prototype is capable of recording four-frame full-resolution image sequences at frame rates up to 200 MHz and eight-frame image sequences at frame rates up to 8 MHz. This system is built around a stereo microscope to capture stereoscopic image sequences usable for 3D digital image correlation. The camera system is used for imaging the chip-workpiece interface area during high speed machining, and the images are used to map the strain rate in the primary shear zone.

Hijazi, A.; Madhavan, V.

2008-08-01

139

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods to acquire multispectral images, but a dynamically band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mounted lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter exchangeable frame, and an electric circuit for parallel image signal processing; in addition, firmware and application software have been developed. The remarkable improvements over a conventional 3CCD camera are the redesigned image splitter and the filter exchangeable frame. Computer simulation is required to visualize the pathway of rays inside the prism when redesigning the image splitter. The dimensions of the splitter are then determined by a computer simulation with options for BK7 glass and non-dichroic coating; these properties have been considered to obtain full-wavelength rays on all film planes. The image splitter is verified by two line lasers with narrow wavebands. The filter exchangeable frame is designed so that bandpass filters can be swapped without displacing the image sensors on the film plane. The developed 3CCD camera is evaluated in an application detecting scab and bruises on Fuji apples. As a result, the filter exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

140

High Performance Imaging Streak Camera for the National Ignition Facility  

SciTech Connect

An x-ray streak camera platform has been characterized and implemented for use at the National Ignition Facility. The camera has been modified to meet the experiment requirements of the National Ignition Campaign and to perform reliably in conditions that produce high EMI. A train of temporal UV timing markers has been added to the diagnostic in order to calibrate the temporal axis of the instrument and the detector efficiency of the streak camera was improved by using a CsI photocathode. The performance of the streak camera has been characterized and is summarized in this paper. The detector efficiency and cathode measurements are also presented.

Opachich, Y. P. [LLNL]; Kalantar, D. [LLNL]; MacPhee, A. [LLNL]; Holder, J. [LLNL]; Kimbrough, J. [LLNL]; Bell, P. M. [LLNL]; Bradley, D. [LLNL]; Hatch, B. [LLNL]; Brown, C. [LLNL]; Landen, O. [LLNL]; Perfect, B. H. [LLNL, HMC]; Guidry, B. [LLNL]; Mead, A. [NSTec]; Charest, M. [NSTec]; Palmer, N. [LLNL]; Homoelle, D. [LLNL]; Browning, D. [LLNL]; Silbernagel, C. [NSTec]; Brienza-Larsen, G. [NSTec]; Griffin, M. [NSTec]; Lee, J. J. [NSTec]; Haugh, M. J. [NSTec]

2012-12-01

141

High performance imaging streak camera for the National Ignition Facility  

NASA Astrophysics Data System (ADS)

An x-ray streak camera platform has been characterized and implemented for use at the National Ignition Facility. The camera has been modified to meet the experiment requirements of the National Ignition Campaign and to perform reliably in conditions that produce high electromagnetic interference. A train of temporal ultra-violet timing markers has been added to the diagnostic in order to calibrate the temporal axis of the instrument and the detector efficiency of the streak camera was improved by using a CsI photocathode. The performance of the streak camera has been characterized and is summarized in this paper. The detector efficiency and cathode measurements are also presented.

Opachich, Y. P.; Kalantar, D. H.; MacPhee, A. G.; Holder, J. P.; Kimbrough, J. R.; Bell, P. M.; Bradley, D. K.; Hatch, B.; Brienza-Larsen, G.; Brown, C.; Brown, C. G.; Browning, D.; Charest, M.; Dewald, E. L.; Griffin, M.; Guidry, B.; Haugh, M. J.; Hicks, D. G.; Homoelle, D.; Lee, J. J.; Mackinnon, A. J.; Mead, A.; Palmer, N.; Perfect, B. H.; Ross, J. S.; Silbernagel, C.; Landen, O.

2012-12-01

142

A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface and providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two one-month long periods when a series of spacecraft maneuvers enabled low altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced the furthest) of the terrain it imaged in the previous orbit. The visible bands of map projected WAC images have a pixel scale of 100 m, while UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March of 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which comprised 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC Team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observations by other instruments) and a new photometric correction derived from the LROC WAC dataset.
In addition, a higher sun (lower incidence angle) mosaic will be released. This map has minimal shadows and highlights albedo differences. Furthermore, seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the East, overhead, and West) will be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemeris provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.

Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

2012-12-01

143

MEM-FLIM: all-solid-state camera for fluorescence lifetime imaging  

NASA Astrophysics Data System (ADS)

We have built an all-solid-state camera which is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific CCD design, which is being used in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and contrast modulation transfer function have been studied through experiments. We are able to perform lifetime measurements using MEM-FLIM cameras on various objects, e.g. fluorescent plastic test slides, fluorescein solution, fixed GFP cells, and GFP-actin stained live cells.
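For readers unfamiliar with frequency-domain FLIM, the standard textbook relations behind such a measurement (not the MEM-FLIM camera's internals) are: at modulation frequency f, the measured phase shift φ and modulation ratio m each yield a lifetime estimate, τ_φ = tan(φ)/(2πf) and τ_m = sqrt(1/m² − 1)/(2πf). The numbers below are illustrative.

```python
import math

def tau_phase(phi_rad: float, f_hz: float) -> float:
    """Lifetime estimate from the measured phase shift."""
    return math.tan(phi_rad) / (2.0 * math.pi * f_hz)

def tau_mod(m: float, f_hz: float) -> float:
    """Lifetime estimate from the measured modulation ratio."""
    return math.sqrt(1.0 / m**2 - 1.0) / (2.0 * math.pi * f_hz)

# A single-exponential fluorophore gives consistent estimates.
f = 40e6          # 40 MHz, a typical FLIM modulation frequency
tau = 4e-9        # 4 ns lifetime, roughly fluorescein-like
omega = 2.0 * math.pi * f
phi = math.atan(omega * tau)                    # expected phase shift
m = 1.0 / math.sqrt(1.0 + (omega * tau) ** 2)   # expected modulation ratio

print(tau_phase(phi, f), tau_mod(m, f))   # both recover ~4e-9 s
```

When the two estimates disagree, the decay is not single-exponential, which is one reason FLIM systems record both quantities.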

Zhao, Qiaole; Young, Ian Ted; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Jalink, Kees; de Jong, Sander; van Geest, Bert; Stoop, Karel

2012-03-01

144

Cloud Detection with the Earth Polychromatic Imaging Camera (EPIC)  

NASA Technical Reports Server (NTRS)

The Earth Polychromatic Imaging Camera (EPIC) on board the Deep Space Climate Observatory (DSCOVR) would provide a unique opportunity for Earth and atmospheric research due not only to its Lagrange point sun-synchronous orbit, but also to the potential for synergistic use of spectral channels in both the UV and visible spectrum. As a prerequisite for most applications, the ability to detect the presence of clouds in a given field of view, known as cloud masking, is of utmost importance. It serves to determine both the potential for cloud contamination in clear-sky applications (e.g., land surface products and aerosol retrievals) and clear-sky contamination in cloud applications (e.g., cloud height and property retrievals). To this end, a preliminary cloud mask algorithm has been developed for EPIC that applies thresholds to reflected UV and visible radiances, as well as to reflected radiance ratios. This algorithm has been tested with simulated EPIC radiances over both land and ocean scenes, with satisfactory results. These test results, as well as algorithm sensitivity to potential instrument uncertainties, will be presented.
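A minimal sketch of a threshold-based cloud mask of the kind the abstract describes: flag a pixel as cloudy when its reflectance exceeds a threshold or when a band ratio falls outside a clear-sky range. The band pairing, threshold values, and function name are invented for illustration; they are not EPIC's actual algorithm or coefficients.

```python
import numpy as np

def cloud_mask(refl_uv, refl_vis, r_thresh=0.3, ratio_lo=0.8, ratio_hi=1.2):
    """Return a boolean mask, True where a pixel is flagged cloudy.

    A pixel is cloudy if its visible reflectance exceeds r_thresh, or
    if the UV/visible reflectance ratio leaves [ratio_lo, ratio_hi].
    """
    ratio = np.divide(refl_uv, refl_vis,
                      out=np.ones_like(refl_vis),   # neutral ratio where vis==0
                      where=refl_vis > 0)
    return (refl_vis > r_thresh) | (ratio < ratio_lo) | (ratio > ratio_hi)

# Three illustrative pixels: clear, bright cloud, and a ratio outlier.
uv = np.array([0.05, 0.45, 0.10])
vis = np.array([0.06, 0.50, 0.40])
print(cloud_mask(uv, vis))   # [False  True  True]
```

A real mask would use scene-dependent thresholds (land versus ocean, solar geometry), which is what the abstract's sensitivity tests probe.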

Meyer, Kerry; Marshak, Alexander; Lyapustin, Alexei; Torres, Omar; Wang, Yugie

2011-01-01

145

Fly's Eye camera system: optical imaging using a hexapod platform  

NASA Astrophysics Data System (ADS)

The Fly's Eye Project is a high resolution, high coverage time-domain survey in multiple optical passbands: our goal is to cover the entire visible sky above the 30° horizontal altitude with a cadence of ~3 min. Imaging is going to be performed by 19 wide-field cameras mounted on a hexapod platform resembling a fly's eye. Using a hexapod developed and built by our team allows us to create a highly fault-tolerant instrument that uses the sky as a reference to define its own tracking motion. The virtual axis of the platform is automatically aligned with the Earth's rotational axis; therefore the same mechanics can be used independently of the geographical location of the device. Its enclosure makes it capable of autonomous observing and withstanding harsh environmental conditions. We briefly introduce the electrical, mechanical and optical design concepts of the instrument and summarize our early results, focusing on sidereal tracking. Because the hexapod design makes the construction independent of the actual location, it is considerably easier to build, install and operate a network of such devices around the world.

Jaskó, Attila; Pál, András; Vida, Krisztián; Mészáros, László; Csépány, Gergely; Mező, György

2014-07-01

146

The Potential of Dual Camera Systems for Multimodal Imaging of Cardiac Electrophysiology and Metabolism  

PubMed Central

Fluorescence imaging has become a common modality in cardiac electrodynamics. A single fluorescent parameter is typically measured. Given the growing emphasis on simultaneous imaging of more than one cardiac variable, we present an analysis of the potential of dual camera imaging, using as an example our straightforward dual camera system that allows simultaneous measurement of two dynamic quantities from the same region of the heart. The advantages of our system over others include an optional software camera calibration routine that eliminates the need for precise camera alignment. The system allows for rapid setup, dichroic image separation, dual-rate imaging, and high spatial resolution, and it is generally applicable to any two-camera measurement. This type of imaging system offers the potential for recording simultaneously not only transmembrane potential and intracellular calcium, two frequently measured quantities, but also other signals more directly related to myocardial metabolism, such as [K+]e, NADH, and reactive oxygen species, leading to the possibility of correlative multimodal cardiac imaging. We provide a compilation of dye and camera information critical to the design of dual camera systems and experiments. PMID:19657065

Holcomb, Mark R.; Woods, Marcella C.; Uzelac, Ilija; Wikswo, John P.; Gilligan, Jonathan M.; Sidorov, Veniamin Y.

2013-01-01

147

First results from the Faint Object Camera - Images of the gravitational lens system G2237 + 0305  

NASA Technical Reports Server (NTRS)

Images of the gravitational lens system G2237 + 0305 have been obtained with the Faint Object Camera on board the Hubble Space Telescope. A preliminary analysis of these images is reported here and includes measurements of the relative positions and magnitudes of the lensed images of the QSO, and of the lensing galaxy. No evidence is found for a fifth lensed image.

Crane, P.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.

1991-01-01

148

Compact large FoV gamma camera for breast molecular imaging  

NASA Astrophysics Data System (ADS)

The very low sensitivity of scintimammography for tumours under 1 cm in diameter with current nuclear medicine cameras is the major limitation in recommending this test modality for screening purposes. To improve this diagnostic technique, a new concept of scintillation gamma camera, which best fits the requirements for functional breast imaging, has been developed under the "Integrated Mammographic Imaging" (IMI) project. This camera consists of a large detection head (6″×7″), very compact in size and light in weight, so that it can be easily positioned in the same X-ray geometry. The detection head consists of a matrix of 42 Hamamatsu 1-inch-square H8520-C12 PSPMTs, which are closely packed and coupled to a NaI(Tl) scintillating array with individual crystal pixels of 2×2×6 mm³. The large FoV camera shows very good pixel identification in the detection dead zones between tubes, allowing an accurate LUT correction of the final image reconstruction. The electronic read-out was specially designed to optimize the intrinsic spatial resolution and camera compactness. With respect to an Anger camera, the overall spatial resolution is improved by up to 40%, while the overall energy resolution value is ~16% at 140 keV. The large FoV dedicated camera was characterized and tested in phantom studies, and clinical trials are currently being performed. For all patients, compression views have been acquired for both breasts in craniocaudal projections and are compared with standard gamma camera images.

Pani, R.; Cinti, M. N.; Pellegrini, R.; Betti, M.; Devincentis, G.; Bennati, P.; Ridolfi, S.; Iurlaro, G.; Montani, L.; Scafè, R.; Marini, M.; Porfiri, L. M.; Giachetti, G.; Baglini, F.; Salvadori, G.; Madesani, M.; Pieracci, M.; Catarsi, F.; Bigongiari, A.

2006-12-01

149

Sequential Motion and Scene Reconstruction from Image Sequences Captured by a Multi-camera System  

Microsoft Academic Search

This paper deals with structure and motion estimation using image sequences captured by a multi-camera system moving in a scene. Single camera systems with a relatively small field of view have limited accuracy because of the inherent confusion between translation and rotation. This is also the case for stereo camera systems if the systems need

Jae-Hean Kim; Myung Jin Chung

150

An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing  

Technology Transfer Automated Retrieval System (TEKTRAN)

This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

151

Adaptive image feature prediction and control for visual tracking with a hand-eye coordinated camera  

Microsoft Academic Search

An adaptive method for visually tracking a known moving object with a single mobile camera is described. The method differs from previous methods of motion estimation in that both the camera and the object are moving. The objective is to predict the location of features of the object on the image plane based on past observations and past control inputs.

J. T. Feddema; C. S. G. Lee

1990-01-01

152

Sensor Fingerprint Digests for Fast Camera Identification from Geometrically Distorted Images  

E-print Network

Sensor Fingerprint Digests for Fast Camera Identification from Geometrically Distorted Images. [Abstract fragment:] In camera identification using sensor fingerprint, it is absolutely essential that the fingerprint [...] to a geometrical transformation, fingerprint detection becomes significantly more complicated. Besides

Fridrich, Jessica

153

Electronic Still Camera image of Astronaut Claude Nicollier working with RMS  

NASA Technical Reports Server (NTRS)

The robot arm controlling work of Swiss scientist Astronaut Claude Nicollier was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

1993-01-01

154

Effects of environment factors on imaging performance of long focal length space camera  

NASA Astrophysics Data System (ADS)

During development, testing, launch, and in-orbit operation, a space camera has to withstand the shock of external loads and a changing environment. The optical performance of a long focal length space camera is largely determined by external mechanical loads and ambient temperature, and the camera's performance is a result of the interaction between these environmental factors. The performance of the optical system should therefore be forecast accurately when a modern optical instrument is designed. In this paper, the research methods are reviewed first, and then the related technologies are described. The methods for analyzing how environmental temperature and structural characteristics affect space camera imaging performance are also discussed.

Guo, Quanfeng; Jin, Guang; Dong, Jihong; Li, Wei; Li, Yanchun; Wang, Haiping; Wang, Kejun; Zhao, Weiguo

2012-10-01

155

A gated microchannel plate image intensifier packaged in a reflex camera back  

NASA Astrophysics Data System (ADS)

A fast (exposure down to approximately 10 ns) electrically triggered camera back/shutter designed for use with the Hasselblad 500C still camera is described. This camera back contains a microchannel plate image intensifier which can be gated and provides a gain of approximately 1000×. It reads out onto Polaroid film through a fiber-optic face plate. The gated camera back is interchangeable with the regular Polaroid back (or any other) for the Hasselblad 500C and maintains through-the-lens focusing.

Hauver, S.; Pechacek, R. E.; Grieg, J. R.; Murphy, D. P.; Raleigh, M.

1985-12-01

156

2D Hyperspectral Frame Imager Camera Data in Photogrammetric Mosaicking

NASA Astrophysics Data System (ADS)

A new 2D hyperspectral frame camera system has been developed by VTT (Technical Research Centre of Finland) and Rikola Ltd. It is a very light, frame-based camera with an RGB-NIR sensor, suitable for lightweight, cost-effective UAV aircraft. MosaicMill Ltd. has converted the camera data into a format suitable for photogrammetric processing, and the camera's geometric accuracy and stability are evaluated to guarantee the accuracies required for end-user applications. MosaicMill Ltd. has also applied its EnsoMOSAIC technology to process the hyperspectral data into orthomosaics. This article describes the main steps and results of applying a hyperspectral sensor in orthomosaicking. The most promising results, as well as challenges, in agriculture and forestry are also described.

Mäkeläinen, A.; Saari, H.; Hippi, I.; Sarkeala, J.; Soukkamäki, J.

2013-08-01

157

A small field of view camera for hybrid gamma and optical imaging  

NASA Astrophysics Data System (ADS)

The development of compact low profile gamma-ray detectors has allowed the production of small field of view, hand held imaging devices for use at the patient bedside and in operating theatres. The combination of an optical and a gamma camera, in a co-aligned configuration, offers high spatial resolution multi-modal imaging giving a superimposed scintigraphic and optical image. This innovative introduction of hybrid imaging offers new possibilities for assisting surgeons in localising the site of uptake in procedures such as sentinel node detection. Recent improvements to the camera system along with results of phantom and clinical imaging are reported.

Lees, J. E.; Bugby, S. L.; Bhatia, B. S.; Jambi, L. K.; Alqahtani, M. S.; McKnight, W. R.; Ng, A. H.; Perkins, A. C.

2014-12-01

158

New opportunities for quality enhancing of images captured by passive THz camera  

NASA Astrophysics Data System (ADS)

As is well known, a passive THz camera can see concealed objects without contact and poses no danger to the person being screened. The camera's efficiency depends on its temperature resolution, which determines the limits of detection for a concealed object: its minimal size, the maximal detection distance, and the image quality. Computer processing of the THz image can improve image quality many times over without additional engineering effort, so developing modern computer codes for THz images is an urgent problem. With appropriate new methods, one may expect a temperature resolution that would reveal a banknote in a person's pocket without any physical contact. Modern algorithms can also reveal objects inside the human body from the temperature trace they leave on the skin; this substantially broadens the counterterrorism applications of passive THz cameras. We demonstrate the detection, currently achievable through computer processing, of both concealed objects and clothing components in images captured by passive THz cameras from various manufacturers. Another important result discussed in the paper is the observation of both THz radiation emitted by an incandescent lamp and an image reflected from a ceramic floor plate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian authors. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floorplate, incandescent lamp.

Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

2014-10-01

159

Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging  

NASA Astrophysics Data System (ADS)

We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.
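The frequency-domain FLIM measurement described above estimates lifetime from the phase shift and modulation depth of the emission relative to the excitation. The standard single-frequency estimators (general textbook formulas, not MEM-FLIM-specific code) can be sketched as:

```python
import math

# Frequency-domain FLIM lifetime estimators (general formulas, not
# instrument-specific code): tau_phase from the phase shift and
# tau_mod from the demodulation ratio, at modulation frequency f.

def lifetimes(phase_rad, modulation, freq_hz):
    """Return (tau_phase, tau_mod) in seconds."""
    w = 2 * math.pi * freq_hz
    tau_phi = math.tan(phase_rad) / w
    tau_m = math.sqrt(1.0 / modulation**2 - 1.0) / w
    return tau_phi, tau_m

# Synthetic check: a 4 ns mono-exponential fluorophore at 40 MHz
# produces phase atan(w*tau) and modulation 1/sqrt(1 + (w*tau)**2).
f = 40e6
tau = 4e-9
phi = math.atan(2 * math.pi * f * tau)
m = 1.0 / math.sqrt(1.0 + (2 * math.pi * f * tau) ** 2)
tp, tm = lifetimes(phi, m, f)
print(tp, tm)  # both recover ~4e-9 s for a single-exponential decay
```

For a single-exponential decay both estimators agree; multi-exponential mixtures split them, which is itself a diagnostic.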

Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; de Jong, Jan Geert Sander; van Geest, Bert; Stoop, Karel; Young, Ian Ted

2012-12-01

160

Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging.  

PubMed

We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM. PMID:23323290

Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; Geert Sander de Jong, Jan; van Geest, Bert; Stoop, Karel; Young, Ian Ted

2012-12-01

161

Coarse-grained reconfigurable image stream processor architecture for high-definition cameras and camcorders  

Microsoft Academic Search

The coarse-grained reconfigurable image stream processor (CRISP) architecture is introduced for the image processing demands of high-definition (HD) cameras and camcorders. With several architectural concepts of the reconfigurable architecture, the CRISP architecture is proposed to meet the performance and flexibility requirements of the HD cameras. A multi-frame processing system with CRISP is implemented to achieve the real-time HD video recording

Teng-Yuan Cheng; Tsung-Huang Chen; Jason C. Chen; Shao-Yi Chien

2010-01-01

162

Remarks on 3D human body posture reconstruction from multiple camera images  

Microsoft Academic Search

This paper proposes a human body posture estimation method based on back projection of human silhouette images extracted from multi-camera images. To achieve real-time 3D human body posture estimation, a server-client system is introduced into the multi-camera system, and improvements to the background subtraction and back projection are investigated. To evaluate the feasibility of the proposed method, 3D estimation experiments of

Yusuke Nagasawa; Takako Ohta; Yukiko Mutsuji; Kazuhiko Takahashi; Masafumi Hashimoto

2007-01-01

163

Low-Light AutoFocus Enhancement for Digital and CellPhone Camera Image Pipelines  

Microsoft Academic Search

Images captured by a digital or cell-phone camera in low-light environments usually suffer from a lack of sharpness due to the failure of the camera's passive auto-focus (AF) system to locate the peak in-focus position of a sharpness function extracted from the image. In low light, the sharpness function becomes flat, making it quite difficult to locate the peak. In
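The sharpness function mentioned in the snippet is typically a gradient-based focus measure evaluated at each lens position. A minimal sketch, using a squared-difference metric (one common choice, not necessarily the authors'), on toy row-major grayscale data:

```python
# Sketch of a passive-AF focus measure: sum of squared horizontal
# differences. The abstract's point is that in low light this
# function flattens, hiding the in-focus peak.

def sharpness(img):
    """img: list of rows of pixel intensities; higher = more in focus."""
    s = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            s += (b - a) ** 2
    return s

blurry = [[10, 11, 12, 11], [11, 12, 12, 11]]   # gentle gradients
sharp = [[0, 40, 0, 40], [40, 0, 40, 0]]        # strong edges
print(sharpness(blurry) < sharpness(sharp))  # True: edges score higher
```

An AF loop sweeps the lens, evaluates this metric per position, and stops at the maximum; low-light enhancement methods try to keep that maximum detectable when the curve flattens.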

Mark Gamadia; Nasser Kehtarnavaz; Katie Roberts-Hoffman

2007-01-01

164

PCA-Based Spatially Adaptive Denoising of CFA Images for Single-Sensor Digital Cameras  

Microsoft Academic Search

Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by a separate denoising process.

Lei Zhang; Rastislav Lukac; Xiaolin Wu; David Zhang

2009-01-01

165

Status of an Atmospheric Cherenkov Imaging Camera for the CANGAROO-III  

E-print Network

Status of an Atmospheric Cherenkov Imaging Camera for the CANGAROO-III Experiment and Perspective. [Slide fragments: dark site in the Southern hemisphere; galactic objects; very-high-energy gamma-rays; Cherenkov light; atmospheric Cherenkov telescope; EGRET satellite; imaging analysis of image axis, length, and width.]

Enomoto, Ryoji

166

RECONSTRUCTION OF ISOMETRICALLY DEFORMABLE FLAT SURFACES IN 3D FROM MULTIPLE CAMERA IMAGES  

E-print Network

RECONSTRUCTION OF ISOMETRICALLY DEFORMABLE FLAT SURFACES IN 3D FROM MULTIPLE CAMERA IMAGES. The goal is to infer the structure of a non-rigid isometric surface observed in multiple images; an experiment consisting of real images of a sheet of paper is shown. Index terms: machine vision, isometric surfaces.

Instituto de Sistemas e Robotica

167

PhotoDoc: A Toolbox for Processing Document Images Acquired Using Portable Digital Cameras  

Microsoft Academic Search

This paper introduces PhotoDoc a software toolbox designed to process document images acquired with portable digital cameras. PhotoDoc was developed as an ImageJ plug-in. It performs border removal, perspective and skew correction, and image binarization. PhotoDoc interfaces with Tesseract, an open source Optical Character Recognizer originally developed by HP and distributed by Google.

Gabriel Pereira; Rafael Dueire Lins

168

Reconstruction of face image from security camera based on a measurement of space variant PSF  

Microsoft Academic Search

Images recorded by security cameras are often severely degraded due to dirty lenses or secular distortion of the recording system. To restore these images, full determination of the space-variant point spread function (PSF) is required. To measure the PSF, we have proposed a method using a liquid crystal display, and shown some experimental results to restore the images by a

Tadashi Ito; Hitoshi Hoshino; Yusaku Fujii; Naoya Ohta

2009-01-01

169

IMAGE SPLICING DETECTION USING CAMERA RESPONSE FUNCTION CONSISTENCY AND AUTOMATIC SEGMENTATION  

E-print Network

As image integrity has become increasingly important, image forgery detection is an intuitive and promising approach. ABSTRACT: We propose a fully automatic spliced-image detection method based on consistency checking.

Chang, Shih-Fu

170

Joint Depth Estimation and Camera Shake Removal from Single Blurry Image

E-print Network

Joint Depth Estimation and Camera Shake Removal from Single Blurry Image, by Zhe Hu, Li Xu, and Ming-Hsuan Yang. [Figure caption fragments: (b) image restored by the method of [10] assuming constant scene depth; (c) depth map estimated by our method; (d) image restored by our method using the estimated depth map.] Depth-aware motion deblurring methods have recently been proposed.

Yang, Ming-Hsuan

171

Digital-image capture system for the IR camera used in Alcator C-Mod  

SciTech Connect

An infrared imaging system, based on an Amber Radiance 1 infrared camera, is used at Alcator C-Mod to measure the surface temperatures in the lower divertor region. Due to the supra-linear dependence of the thermal radiation with temperature it is important to make use of the 12-bit digitization of the focal plane array of the Amber camera and not be limited by the 8 bits inherent to the video signal. It is also necessary for the image capture device (i.e., fast computer) to be removed from the high magnetic field environment surrounding the experiment. Finally, the coupling between the digital camera output and the capture device should be nonconductive for isolation purposes (i.e., optical coupling). A digital video remote camera interface (RCI) coupled to a PCI bus fiber optic interface board is used to accomplish this task. Using this PCI-RCI system, the 60 Hz images from the Amber Radiance 1 camera, each composed of 256x256 pixels and 12 bits/pixel, are captured by a Windows NT computer. An electrical trigger signal is given directly to the RCI module to synchronize the image stream with the experiment. The RCI can be programmed from the host computer to work with a variety of digital cameras, including the Amber Radiance 1 camera.
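The abstract's point about 12-bit versus 8-bit digitization can be illustrated with a back-of-envelope calculation, assuming for simplicity a total-radiance ~ T**4 scaling (the actual in-band dependence for the Amber camera differs in detail; this is only a sketch of why supra-linearity compresses the usable temperature range):

```python
# Why keep all 12 bits: with radiance rising steeply with temperature,
# 8-bit video quantization collapses the resolvable temperature span.

def resolvable_ratio(bits):
    """Largest radiance ratio representable at 1-count resolution."""
    return 2 ** bits - 1

# Under the simplifying radiance ~ T**4 assumption, the corresponding
# temperature ratio is the fourth root of the radiance ratio:
for bits in (8, 12):
    t_ratio = resolvable_ratio(bits) ** 0.25
    print(bits, round(t_ratio, 2))  # 8 -> ~4x span, 12 -> ~8x span
```

Doubling the representable temperature ratio is significant when a single divertor frame contains both cool tiles and transiently heated strike points.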

Maqueda, R. J.; Wurden, G. A.; Terry, J. L.; Gaffke, J.

2001-01-01

172

Digital-image capture system for the IR camera used in Alcator C-Mod  

NASA Astrophysics Data System (ADS)

An infrared imaging system, based on an Amber Radiance 1 infrared camera, is used at Alcator C-Mod to measure the surface temperatures in the lower divertor region. Due to the supra-linear dependence of the thermal radiation with temperature it is important to make use of the 12-bit digitization of the focal plane array of the Amber camera and not be limited by the 8 bits inherent to the video signal. It is also necessary for the image capture device (i.e., fast computer) to be removed from the high magnetic field environment surrounding the experiment. Finally, the coupling between the digital camera output and the capture device should be nonconductive for isolation purposes (i.e., optical coupling). A digital video remote camera interface (RCI) coupled to a PCI bus fiber optic interface board is used to accomplish this task. Using this PCI-RCI system, the 60 Hz images from the Amber Radiance 1 camera, each composed of 256×256 pixels and 12 bits/pixel, are captured by a Windows NT computer. An electrical trigger signal is given directly to the RCI module to synchronize the image stream with the experiment. The RCI can be programmed from the host computer to work with a variety of digital cameras, including the Amber Radiance 1 camera.

Maqueda, R. J.; Wurden, G. A.; Terry, J. L.; Gaffke, J.

2001-01-01

173

An image compression algorithm for a high-resolution digital still camera  

NASA Technical Reports Server (NTRS)

The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to Earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.

Nerheim, Rosalee

1989-01-01

174

Temperature resolution enhancing of commercially available THz passive cameras due to computer processing of images  

NASA Astrophysics Data System (ADS)

As is well known, the passive THz camera is very promising for security applications: it can see a concealed object without contact and poses no danger to the person being screened. Its efficiency depends on its temperature resolution, which determines the limits of detection for a concealed object: its minimal size, the maximal detection distance, and the image detail. One probable way to enhance image quality is computer processing of the image. By computer-processing THz images of objects concealed on the human body, one may improve them many times over; consequently, the instrumental resolution of such a device may be increased without any additional engineering effort. We demonstrate new possibilities for seeing clothing details that the raw images produced by THz cameras do not reveal. We achieve good image quality by applying various spatial filters, with the aim of demonstrating that the processed images are independent of the particular mathematical operations used. This result demonstrates the feasibility of object detection. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China).
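The "various spatial filters" are not specified in the abstract; as a purely illustrative stand-in (not the authors' algorithm), a 3×3 median filter shows the kind of operation that suppresses single-pixel noise without extra hardware:

```python
# Illustrative spatial filter of the kind the abstract credits with
# raising effective temperature resolution: a 3x3 median over the
# interior of a small synthetic frame (borders left untouched).

def median3x3(img):
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            win = sorted(img[rr][cc]
                         for rr in (r - 1, r, r + 1)
                         for cc in (c - 1, c, c + 1))
            out[r][c] = win[4]  # median of the 9 neighbors
    return out

noisy = [[5, 5, 5], [5, 90, 5], [5, 5, 5]]   # one hot pixel
print(median3x3(noisy)[1][1])  # hot pixel replaced by local median, 5
```

Real THz pipelines combine several such filters and compare results across them, per the abstract's independence argument.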

Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

2014-06-01

175

UCXp camera imaging principle and key technologies of data post-processing  

NASA Astrophysics Data System (ADS)

The large-format digital aerial camera UCXp was introduced to the Chinese market in 2008; its images consist of 17310 columns and 11310 rows with a pixel size of 6 µm. Compared with cameras of the same generation, the UCXp has many advantages: its multiple lenses are exposed almost simultaneously, and it has no oblique lens. The camera has a complex imaging process, whose principle is detailed in this paper. The article also emphasizes the UCXp image post-processing method, including data pre-processing and orthophoto production. Based on data from the new Beichuan County, this paper describes the data processing and its effects.

Yuan, Fangyan; Li, Guoqing; Zuo, Zhengli; Liu, Jianmin; Wu, Liang; Yu, Xiaoping; Zhao, Haitao

2014-03-01

176

Matching Images to Models: Camera Calibration for 3-D Surface Reconstruction  

NASA Technical Reports Server (NTRS)

In a previous paper we described a system which recursively recovers a super-resolved three dimensional surface model from a set of images of the surface. In that paper we assumed that the camera calibration for each image was known. In this paper we solve two problems. Firstly, if an estimate of the surface is already known, the problem is to calibrate a new image relative to the existing surface model. Secondly, if no surface estimate is available, the relative camera calibration between the images in the set must be estimated. This will allow an initial surface model to be estimated. Results of both types of estimation are given.

Morris, Robin D.; Smelyanskiy, Vadim N.; Cheeseman, Peter C.; Norvig, Peter (Technical Monitor)

2001-01-01

177

Proposal for real-time terahertz imaging system with palm-size terahertz camera and compact quantum cascade laser  

E-print Network

This paper describes a real-time terahertz (THz) imaging system, using the combination of a palm-size THz camera with a compact quantum cascade laser (QCL). The THz camera contains a 320x240 microbolometer focal plane array ...

Oda, Naoki

178

DEFINITION OF AIRWAY COMPOSITION WITHIN GAMMA CAMERA IMAGES  

EPA Science Inventory

The efficacies of inhaled pharmacologic drugs in the prophylaxis and treatment of airway diseases could be improved if particles were selectively directed to appropriate sites. In the medical arena, planar gamma scintillation cameras may be employed to study factors affecting such...

179

Color enhancement of image of earth photographed by UV camera  

NASA Technical Reports Server (NTRS)

A color enhancement of an ultraviolet photograph of the geocorona, a halo of low density hydrogen around the Earth. Sunlight is shining from the left, and the geocorona is brighter on that side. The UV camera was operated by Astronaut John W. Young on the Apollo 16 lunar landing mission.

1972-01-01

180

Scalable Virtual Viewpoint Image Synthesis for Multiple Camera Environments  

Microsoft Academic Search

One of the main aims of emerging audio-visual (AV) applications is to provide interactive navigation within a captured event or scene. This paper presents a view synthesis algorithm that provides a scalable and flexible approach to virtual viewpoint synthesis in multiple camera environments. The Multi-View Synthesis (MVS) process consists of four different phases that are described

E. Cooke; N. O'Connor

2005-01-01

181

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

182

Optimal geometrical configuration of a double-scattering compton camera for maximum imaging resolution and sensitivity  

NASA Astrophysics Data System (ADS)

A novel type of Compton camera, called a double-scattering Compton imager (DOCI), is under development for nuclear medicine and molecular imaging applications. Two plane-type position-sensitive semiconductor detectors are employed as the scatterer detectors, and a 3″×3″ cylindrical NaI(Tl) scintillation detector is employed as the absorber detector. This study determined the optimal geometrical configuration of these component detectors to maximize the performance of the Compton camera in imaging resolution and sensitivity. To that end, the Compton camera was simulated very realistically, with the GEANT4 detector simulation toolkit, including various detector characteristics such as energy resolution, spatial resolution, energy discrimination, and Doppler energy broadening. According to our simulation results, the Compton camera is expected to show its maximum performance when the two scatterer detectors are positioned in parallel, with ~8 cm of separation. The Compton camera will show the maximum performance also when the gamma-ray energy is about 500 keV, which suggests that the Compton camera is a suitable device to image the distribution of the positron emission tomography (PET) isotopes in the human body.
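The reconstruction at the heart of any Compton camera follows from standard Compton kinematics: the energy deposited in the scatterer fixes the opening angle of a cone on which the source must lie. A sketch of that general formula (not this group's reconstruction code):

```python
import math

# Compton-cone angle from scatterer energy deposit (general kinematics):
#   cos(theta) = 1 - m_e c^2 * (1/E_scattered - 1/E_0)

ME_C2 = 511.0  # electron rest energy, keV

def scatter_angle(e0_kev, e_dep_kev):
    """Cone half-angle (radians) for a photon of initial energy e0_kev
    that deposits e_dep_kev in the scatterer; the scattered photon
    carries the remainder into the absorber."""
    e_scat = e0_kev - e_dep_kev
    cos_t = 1.0 - ME_C2 * (1.0 / e_scat - 1.0 / e0_kev)
    return math.acos(cos_t)

# A 511 keV annihilation photon depositing ~170.3 keV scatters near 60 deg:
print(round(math.degrees(scatter_angle(511.0, 170.33)), 1))
```

The abstract's observation that performance peaks near 500 keV fits this geometry: PET annihilation photons (511 keV) scatter with well-resolved angles in the DOCI configuration.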

Seo, Hee; Lee, Se Hyung; Kim, Chan Hyeong; An, So Hyun; Lee, Ju Hahn; Lee, Chun Sik

2008-06-01

183

High speed short wave infrared (SWIR) imaging and range gating cameras  

NASA Astrophysics Data System (ADS)

Imaging in the Short Wave Infrared (SWIR) provides unique surveillance capabilities, both with passive illumination from the night glow in the atmosphere or with active illumination from covert LED or eye-safe lasers. Spectral effects specific to the 0.9 to 1.7 µm wavelength range reveal camouflage and chemical signatures of ordnance. The longer wavelength range also improves image resolution over visible cameras in foggy or dusty environments. Increased military interest in cameras that image all laser range finders and target designators on the battlefield has driven development of a new class of uncooled InGaAs cameras with higher resolution and larger field of view than previously available. Current and upcoming needs include: imaging in all lighting conditions, from direct sunlight to partial starlight, while using minimal power; range gating the camera to image through obscurants or beyond unimportant objects; and high speed capture of muzzle flare, projectile tracking, guide star and communications laser-beam tracking and wavefront correction. This paper will present images from new COTS cameras now available to address these needs and discuss the technology roadmap for further improvements.

Malchow, Douglas; Battaglia, Jesse; Brubaker, Robert; Ettenberg, Martin

2007-04-01

184

High-frame-rate intensified fast optically shuttered TV cameras with selected imaging applications  

SciTech Connect

This invited paper focuses on high speed electronic/electro-optic camera development by the Applied Physics Experiments and Imaging Measurements Group (P-15) of Los Alamos National Laboratory's Physics Division over the last two decades. The evolution of TV and image intensifier sensors and fast readout, fast shuttered cameras is discussed. Their use in nuclear, military, and medical imaging applications is presented. Several salient characteristics and anomalies associated with single-pulse and high repetition rate performance of the cameras/sensors are included from earlier studies to emphasize their effects on radiometric accuracy of electronic framing cameras. The Group's test and evaluation capabilities for characterization of imaging type electro-optic sensors and sensor components, including Focal Plane Arrays, gated Image Intensifiers, microchannel plates, and phosphors, are discussed. Two new unique facilities, the High Speed Solid State Imager Test Station (HSTS) and the Electron Gun Vacuum Test Chamber (EGTC), are described. A summary of the Group's current and developmental camera designs and R&D initiatives is included.

Yates, G.J.; King, N.S.P.

1994-08-01

185

Design Considerations Of A Compton Camera For Low Energy Medical Imaging  

SciTech Connect

Development of a Compton camera for low energy medical imaging applications is underway. The ProSPECTus project aims to utilize position sensitive detectors to generate high quality images using electronic collimation. This method has the potential to significantly increase the imaging efficiency compared with mechanically collimated SPECT systems, a highly desirable improvement on clinical systems. Design considerations encompass the geometrical optimisation and evaluation of image quality from the system which is to be built and assessed.

Harkness, L. J.; Boston, A. J.; Boston, H. C.; Cresswell, J. R.; Grint, A. N.; Judson, D. S.; Nolan, P. J.; Oxley, D. C. [Department of Physics, University of Liverpool, Oliver Lodge Laboratory, Liverpool, UK L697ZE (United Kingdom); Lazarus, I.; Simpson, J. [STFC Daresbury Laboratory, Daresbury, Warrington, WA44AD (United Kingdom)

2009-12-02

186

MOSAIC. [Mosaicked Optical Self-scanned Array Imaging Camera for high resolution UV space astronomy missions  

NASA Technical Reports Server (NTRS)

The Mosaicked Optical Self-scanned Array Imaging Camera (MOSAIC) is a 50 sq cm active area detector system encompassing 5.76 million picture elements. This camera is being developed for use in astronomical instrumentation requiring the use of large area imaging detectors with high resolution photon counting capability in the space ultraviolet. This paper gives a descriptive outline of the MOSAIC camera system including: the 100 mm diameter microchannel plate intensifier, the 3 x 3 array of 800 x 800 pixel charge coupled devices, the signal processing and buffer storage subsystem, the electronic support and control subsystem, and the data processing subsystem. Performance characteristics of the camera and its subsystems are presented, based on system analysis.

Williams, J. T.; Weistrop, D.

1984-01-01

187

Imaging high-dimensional spatial entanglement with a camera  

PubMed Central

The light produced by parametric down-conversion shows strong spatial entanglement that leads to violations of EPR criteria for separability. Historically, such studies have been performed by scanning a single-element, single-photon detector across a detection plane. Here we show that modern electron-multiplying charge-coupled device cameras can measure correlations in both position and momentum across a multi-pixel field of view. This capability allows us to observe entanglement of around 2,500 spatial states and demonstrate Einstein–Podolsky–Rosen type correlations by more than two orders of magnitude. More generally, our work shows that cameras can lead to important new capabilities in quantum optics and quantum information science. PMID:22871804

Edgar, M.P.; Tasca, D.S.; Izdebski, F.; Warburton, R.E.; Leach, J.; Agnew, M.; Buller, G.S.; Boyd, R.W.; Padgett, M.J.

2012-01-01

188

Imaging high-dimensional spatial entanglement with a camera  

E-print Network

The light produced by parametric down-conversion shows strong spatial entanglement that leads to violations of EPR criteria for separability. Historically, such studies have been performed by scanning a single-element, single-photon detector across a detection plane. Here we show that modern electron-multiplying charge-coupled device cameras can measure correlations in both position and momentum across a multi-pixel field of view. This capability allows us to observe entanglement of around 2,500 spatial states and demonstrate Einstein-Podolsky-Rosen type correlations by more than two orders of magnitude. More generally, our work shows that cameras can lead to important new capabilities in quantum optics and quantum information science.

Matthew P. Edgar; Daniel S. Tasca; Frauke Izdebski; Ryan E. Warburton; Jonathan Leach; Megan Agnew; Gerald S. Buller; Robert W. Boyd; Miles J. Padgett

2012-08-09

189

Optical characterization of UV multispectral imaging cameras for SO2 plume measurements  

NASA Astrophysics Data System (ADS)

Only a few years ago, spectral imaging cameras for SO2 plume monitoring were developed for remote sensing of volcanic plumes. We describe the development from a first camera using a single filter in the absorption band of SO2 to more advanced systems using several filters and an integrated spectrometer. The first system was based on the Hamamatsu C8484 UV camera (1344 x 1024 pixels) with high quantum efficiency in the UV region from 280 nm onward. At the heart of the second UV camera system, EnviCam, is a cooled Alta U47 camera, equipped with two on-band (310 and 315 nm) and two off-band (325 and 330 nm) filters. The third system again utilizes the uncooled Hamamatsu camera for faster sampling (~10 Hz) and a four-position filter wheel equipped with two 10 nm filters centered at 310 and 330 nm, a UV broadband view and a blackened plate for dark-current measurement. Both cameras have been tested with lenses of different focal lengths. A co-aligned spectrometer provides a ~0.3 nm resolution spectrum within the field of view of the camera. We describe the ground-based imaging camera systems developed and utilized at our Institute. Custom-made cylindrical quartz calibration cells with 50 mm diameter, to cover the entire field of view of the camera optics, are filled with various amounts of gaseous SO2 (typically between 100 and 1500 ppm•m). They are used for calibration and characterization of the cameras in the laboratory. We report on the procedures for monitoring and analyzing SO2 path-concentrations and fluxes. This includes a comparison of the calibration in the atmosphere using the SO2 cells versus the SO2 retrieval from the integrated spectrometer. The first UV cameras have been used to monitor ship emissions (Ny-Ålesund, Svalbard and Genova, Italy). The second generation of cameras was first tested for industrial stack monitoring during a field campaign close to the Rovinari (Romania) power plant in September 2010, revealing very high SO2 emissions (> 1000 ppm•m). The second-generation cameras are now used by students from several universities in Romania. The newest system has been tested for volcanic plume monitoring at Turrialba, Costa Rica in January 2011, at Merapi volcano, Indonesia in February 2011, at Lascar volcano in Chile in July 2011 and at Etna/Stromboli (Italy) in November 2011. Retrievals from some of these campaigns will be presented.
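The two-filter retrieval and cell calibration described above can be sketched as follows. This is a hedged illustration: the apparent-absorbance formula is the standard two-band UV-camera convention, and all numerical values and function names are invented, not taken from the abstract.

```python
import numpy as np

# Apparent absorbance per pixel from on-band (~310 nm) and off-band (~330 nm)
# images, referenced to clear-sky background images I0:
#   AA = -ln(I_on / I0_on) + ln(I_off / I0_off)
def apparent_absorbance(i_on, i_off, i0_on, i0_off):
    return -np.log(i_on / i0_on) + np.log(i_off / i0_off)

# Calibration: linear fit of known cell column densities (ppm·m) against
# the apparent absorbance measured through each cell (synthetic values).
cells_ppmm = np.array([0.0, 100.0, 500.0, 1000.0, 1500.0])
aa_measured = np.array([0.000, 0.021, 0.105, 0.210, 0.315])
slope, intercept = np.polyfit(aa_measured, cells_ppmm, 1)

def column_density(aa):
    """Convert apparent absorbance to an SO2 column in ppm·m."""
    return slope * aa + intercept
```

In practice the cell-based calibration is compared against the column retrieved from the co-aligned spectrometer, as the abstract notes.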

Stebel, K.; Prata, F.; Dauge, F.; Durant, A.; Amigo, A.

2012-04-01

190

Microchannel plate pinhole camera for 20 to 100 keV x-ray imaging  

SciTech Connect

We present the design and construction of a sensitive pinhole camera for imaging suprathermal x-rays. Our device is a pinhole camera consisting of four filtered pinholes and microchannel plate electron multiplier for x-ray detection and signal amplification. We report successful imaging of 20, 45, 70, and 100 keV x-ray emissions from the fusion targets at our Novette laser facility. Such imaging reveals features of the transport of hot electrons and provides views deep inside the target.

Wang, C.L.; Leipelt, G.R.; Nilson, D.G.

1984-10-03

191

Temporal-noise measurements of a CMOS camera used for image quality measurements  

NASA Astrophysics Data System (ADS)

We have used a CMOS camera as a multi-purpose instrument for measuring the image quality of high-resolution LCD displays for use in medical diagnosis. The camera uses a Foveon sensor, so the RGB pixels are co-located, making for simplified spatial sampling. In this presentation we examine the temporal-noise characteristics of the CMOS camera. We use one of several LEDs in an integrating sphere as the light source to provide nominally uniform illumination for each experiment. In the first experiments we place the camera sensor near the opening of the integrating sphere, without an imaging lens, in order to initially eliminate the characteristics of the lens. The capture parameters of the camera control software are set to give us as nearly raw data as possible. We suppress the native processing because our application is much different from the usual image-reproduction function. We capture sequences of images with the same illumination and same capture parameters and write them to TIFF files. The Matlab-based software we have written first pre-processes the images and then calculates the pixel-wise mean and standard deviation over the sequence of frames. The calculation is repeated for each of the three (RGB) channels of the CMOS camera. These statistics are displayed as graytone images in three windows, with the maxima and minima displayed in the title bars. The short-term goal of the experiments is to prepare the way to compare the performance of cooled and un-cooled versions of the camera.
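The pixel-wise temporal statistics described above reduce to a mean and standard deviation over the frame axis of an image stack. A minimal NumPy sketch (shapes and illumination values are illustrative, not from the paper, which used Matlab):

```python
import numpy as np

def temporal_noise_stats(frames):
    """Per-pixel temporal mean and standard deviation over a frame stack.

    frames: array of shape (n_frames, height, width, 3) for an RGB sensor.
    Returns (mean, std), each of shape (height, width, 3).
    """
    stack = np.asarray(frames, dtype=np.float64)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0, ddof=1)  # sample std over the frame sequence
    return mean, std

# Synthetic flat-field sequence: 50 frames of a 4x4 RGB sensor with
# Gaussian temporal noise (hypothetical values).
rng = np.random.default_rng(0)
frames = 100.0 + rng.normal(0.0, 2.0, size=(50, 4, 4, 3))
mean, std = temporal_noise_stats(frames)
```

The per-channel repetition in the paper corresponds to the last axis here; displaying `mean` and `std` as graytone images completes the described workflow.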

Roehrig, Hans; Dallas, William J.; Redford, Gary R.

2008-03-01

192

Camera simulation engine enables efficient system optimization for super-resolution imaging  

NASA Astrophysics Data System (ADS)

Quantitative fluorescent imaging requires optimization of the complete optical system, from the sample to the detector. Such considerations are especially true for precision localization microscopy such as PALM and (d)STORM where the precision of the result is limited by the noise in both the optical and detection systems. Here, we present a Camera Simulation Engine (CSE) that allows comparison of imaging results from CCD, CMOS and EM-CCD cameras under various sample conditions and can accurately validate the quality of precision localization algorithms and camera performance. To achieve these results, the CSE incorporates the following parameters: 1) Sample conditions including optical intensity, wavelength, optical signal shot noise, and optical background shot noise; 2) Camera specifications including QE, pixel size, dark current, read noise, EM-CCD excess noise; 3) Camera operating conditions such as exposure, binning and gain. A key feature of the CSE is that, from a single image (either real or simulated "ideal") we generate a stack of statistically realistic images. We have used the CSE to validate experimental data showing that certain current scientific CMOS technology outperforms EM-CCD in most super-resolution scenarios. Our results support using the CSE to efficiently and methodically select cameras for quantitative imaging applications. Furthermore, the CSE can be used to robustly compare and evaluate new algorithms for data analysis and image reconstruction. These uses of the CSE are particularly relevant to super-resolution precision localization microscopy and provide a faster, simpler and more cost effective means of system optimization, especially camera selection.
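The core idea of generating a stack of statistically realistic frames from a single "ideal" image can be sketched with a simple shot-noise, dark-current and read-noise model. This is a generic sketch under assumed parameter values, not the CSE's actual model (which also covers EM-CCD excess noise, binning, etc.):

```python
import numpy as np

def simulate_frames(ideal_photons, n_frames, qe=0.7, dark_e=1.0,
                    read_noise_e=1.5, gain_adu_per_e=0.5, seed=0):
    """Generate statistically realistic frames from one ideal image.

    ideal_photons: expected photon count per pixel per exposure.
    All parameter values are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    frames = []
    for _ in range(n_frames):
        signal_e = rng.poisson(qe * ideal_photons)            # photon shot noise
        dark_e_n = rng.poisson(dark_e, size=ideal_photons.shape)  # dark current
        read = rng.normal(0.0, read_noise_e, size=ideal_photons.shape)
        frames.append(gain_adu_per_e * (signal_e + dark_e_n + read))
    return np.stack(frames)

# Flat-field example: 400 expected photons per pixel, 200 frames.
ideal = np.full((8, 8), 400.0)
stack = simulate_frames(ideal, n_frames=200)
```

Localization algorithms can then be run on `stack` and their precision compared across assumed camera parameter sets, which is the comparison the CSE enables.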

Fullerton, Stephanie; Bennett, Keith; Toda, Eiji; Takahashi, Teruo

2012-02-01

193

A high-resolution airborne four-camera imaging system for agricultural remote sensing  

Technology Transfer Automated Retrieval System (TEKTRAN)

This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

194

Automatic Compensation for Camera Settings for Images Taken under Different Illuminants  

Microsoft Academic Search

The combination of two images shot of the same scene but under different illumination has been used in applications ranging from estimating scene illumination, to enhancing photographs shot in dark environments, to shadow removal. An example is the use of a pair of images shot with and without a flash. However, for consumer-grade digital cameras, due to

Cheng Lu; Mark S. Drew

2006-01-01

195

Managing the Storage and Battery Resources in an Image Capture Device (Digital Camera) using Dynamic  

E-print Network

…of battery power. Storage technologies with expensive read and write operations (such as micro disks) can … be matched with intelligent image storage mechanisms that are aware of local storage and battery …

Ellis, Carla

196

Fast Zonal DCT-based image compression for Wireless Camera Sensor Networks  

E-print Network

Here we propose to solve this problem by the design of a fast zonal DCT-based image compression … taking into account the DCT energy compaction property, which allows high-frequency coefficients to be discarded. (Paper authors: Leila Makkaoui, Vincent …)
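Zonal DCT coding keeps only a fixed low-frequency zone of each transformed block and discards the rest, exploiting the DCT's energy compaction. A hedged sketch on an 8x8 block with a triangular zone (the zone shape, block size and `keep` threshold are assumptions, not the paper's exact scheme):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II transform matrix."""
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def zonal_compress(block, keep=4):
    """Keep only the low-frequency zone (u + v < keep) of an 8x8 block."""
    c = dct_matrix(8)
    coeffs = c @ block @ c.T
    u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
    coeffs[u + v >= keep] = 0.0       # discard high-frequency coefficients
    return c.T @ coeffs @ c           # inverse DCT of the retained zone

# A smooth horizontal ramp survives aggressive zonal truncation well.
block = np.tile(np.linspace(0.0, 255.0, 8), (8, 1))
rec = zonal_compress(block, keep=4)
```

Only the retained zone's coefficients need to be computed and transmitted, which is what makes the zonal variant cheap enough for camera sensor nodes.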

Paris-Sud XI, Université de

197

Algorithm for turbidity mapping using digital camera images from a low-altitude light aircraft  

Microsoft Academic Search

Due to the high cost and the difficulty of obtaining cloud-free satellite images in the equatorial region, new techniques have been developed and introduced. Moreover, in situ measurements are accurate but slow, costly, and difficult in adverse weather conditions. In this study, high-spatial-resolution digital airborne images were captured by a conventional digital camera and used for turbidity mapping. Turbidity distribution

H. S. Lim; M. Z. MatJafri; K. Abdullah

2009-01-01

198

Novel color processing architecture for digital cameras with CMOS image sensors  

Microsoft Academic Search

This paper presents a color processing architecture for digital color cameras utilizing complementary metal oxide semiconductor (CMOS) image sensors. The proposed architecture gives due consideration to the peculiar aspects of CMOS image sensors and the human visual perception related to the particular application of digital color photography. A main difference between the proposed method and the conventional systems is the

Chaminda Weerasinghe; Wanqing Li; Igor Kharitonenko; Magnus Nilsson; Sue Twelves

2005-01-01

199

An Effective Edge-Adaptive Color Demosaicking Algorithm for Single Sensor Digital Camera Images  

Microsoft Academic Search

Most digital cameras use a color filter array of mosaic pattern to capture the colors of the scene. To render a full-resolution color image using a single-chip camera, the missing information must be estimated from the surrounding pixels. In this paper, we present an edge-adaptive demosaicking method for color demosaicking. The algorithm first estimates missing green samples in red and

Mohammad Reza Alsharif; Katsumi Yamashita

2009-01-01

200

Simultaneous Images of the Northern and Southern Auroras with the Earth Camera on the Polar Spacecraft  

Microsoft Academic Search

During the period 22 October through late November, 2001, the Polar spacecraft was fortuitously positioned near Earth's equatorial plane such that the wide field-of-view of its far-ultraviolet camera was able to capture simultaneous images of the auroras in the Northern and Southern Hemispheres. This Earth Camera provided sufficient resolution of auroral features that the luminosities could be mapped into corrected

L. A. Frank; J. B. Sigwarth

2002-01-01

201

Digital video camera for application of particle image velocimetry in high-speed flows  

Microsoft Academic Search

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter, permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 µs with a

Christian Willert; Boleslaw Stasicki; Markus Raffel; Juergen Kompenhans

1995-01-01

202

Using the EMVA1288 standard to select an image sensor or camera  

NASA Astrophysics Data System (ADS)

Selecting an image sensor or a camera for an industrial application is a difficult task. Data sheets usually provide incomplete performance information. Even if detailed performance information is provided, each supplier has its own test and data reporting methods. Many customers will then choose to evaluate each candidate sensor or camera in their lab. This approach is time consuming and requires appropriate lab equipment. EMVA1288 is a measurement and reporting standard developed to

Darmont, Arnaud

2010-01-01

203

10-micron GaAs/AlGaAs multiquantum well scanned array infrared imaging camera  

Microsoft Academic Search

A long-wavelength infrared imaging camera that uses a GaAs/AlxGa1-xAs quantum-well infrared photodetector (QWIP) array is demonstrated. Excellent noise equivalent temperature difference sensitivity (NEΔT < 0.1 °C) has been achieved. The long-wavelength cutoff for the QWIP used in this camera is at λc = 10.7 µm, with the peak response at λp = 9.8 µm. A

C. G. Bethea; B. F. Levine; V. O. Shen; R. R. Abbott; S. J. Hseih

1991-01-01

204

640x512 InGaAs focal plane array camera for visible and SWIR imaging  

Microsoft Academic Search

We report on our 640x512 pixel InGaAs/InP focal plane array camera for visible and short-wavelength infrared imaging. For this camera, we have fabricated a 640x512 element substrate-removed backside-illuminated InGaAs/InP photodiode array (PDA) with a 25 µm pixel pitch. The PDA is indium bump bonded to a silicon readout integrated circuit. Removing the InP substrate from the focal plane array

Tara Martin; Robert Brubaker; Peter Dixon; Mari-Anne Gagliardi; Tom Sudol

2005-01-01

205

Constructing a Database from Multiple 2D Images for Camera Pose Estimation and Robot Localization  

NASA Technical Reports Server (NTRS)

The LMDB (Landmark Database) Builder software identifies persistent image features (landmarks) in a scene viewed multiple times and precisely estimates the landmarks' 3D world positions. The software receives as input multiple 2D images of approximately the same scene, along with an initial guess of the camera pose for each image, and a table of features matched pair-wise in each frame. LMDB Builder aggregates landmarks across an arbitrarily large collection of frames with matched features. Range data from stereo vision processing can also be passed in to improve the initial guess of the 3D point estimates. The LMDB Builder aggregates feature lists across all frames, manages the process to promote selected features to landmarks, iteratively calculates the 3D landmark positions using the current camera pose estimates (via an optimal ray projection method), and then improves the camera pose estimates using the 3D landmark positions. Finally, it extracts image patches for each landmark from auto-selected key frames and constructs the landmark database. The landmark database can then be used to estimate future camera poses (and therefore localize a robotic vehicle that may be carrying the cameras) by matching current imagery to landmark database image patches and using the known 3D landmark positions to estimate the current pose.
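The step of estimating a 3D landmark position from several camera observations can be illustrated with a least-squares ray-intersection (midpoint) triangulation. This is a generic sketch, not the LMDB Builder's "optimal ray projection method" itself; camera positions and the target point are invented:

```python
import numpy as np

def triangulate_midpoint(centers, directions):
    """Least-squares intersection of viewing rays, one per camera.

    centers: (n, 3) camera positions; directions: (n, 3) ray directions.
    Minimizes sum_i ||(I - d_i d_i^T)(x - c_i)||^2 over the 3D point x.
    """
    a = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = d / np.linalg.norm(d)
        p = np.eye(3) - np.outer(d, d)  # projector onto plane normal to ray
        a += p
        b += p @ c
    return np.linalg.solve(a, b)

# Two hypothetical cameras both observing the point (1, 2, 5):
centers = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0]])
target = np.array([1.0, 2.0, 5.0])
dirs = target - centers
x = triangulate_midpoint(centers, dirs)
```

Alternating this triangulation with pose refinement from the recovered 3D points mirrors the iterative loop the abstract describes.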

Wolf, Michael; Ansar, Adnan I.; Brennan, Shane; Clouse, Daniel S.; Padgett, Curtis W.

2012-01-01

206

Imaging of breast cancer with mid- and long-wave infrared camera.  

PubMed

In this novel study the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras - microbolometer (MB), quantum well (QWIP) and photovoltaic (PV) - to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed; the data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and from each image a selectively windowed regional analysis was carried out, based on the angiogenesis and nitric oxide production of cancer tissue causing vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory. PMID:18432466

Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R

2008-01-01

207

Hybrid Compton camera/coded aperture imaging system  

DOEpatents

A system in one embodiment includes an array of radiation detectors; and an array of imagers positioned behind the array of detectors relative to an expected trajectory of incoming radiation. A method in another embodiment includes detecting incoming radiation with an array of radiation detectors; detecting the incoming radiation with an array of imagers positioned behind the array of detectors relative to a trajectory of the incoming radiation; and performing at least one of Compton imaging using at least the imagers and coded aperture imaging using at least the imagers. A method in yet another embodiment includes detecting incoming radiation with an array of imagers positioned behind an array of detectors relative to a trajectory of the incoming radiation; and performing Compton imaging using at least the imagers.

Mihailescu, Lucian (Livermore, CA); Vetter, Kai M. (Alameda, CA)

2012-04-10

208

Periscope-camera system for visible and infrared imaging diagnostics on TFTR  

NASA Astrophysics Data System (ADS)

An optical diagnostic consisting of a periscope which relays images of the torus interior to an array of cameras is used on the Tokamak Fusion Test Reactor (TFTR) to view plasma discharge phenomena and inspect the vacuum vessel internal structures in both the visible and near-infrared wavelength regions. Three periscopes view through 20-cm-diam fused-silica windows which are spaced around the torus midplane to provide a viewing coverage of approximately 75% of the vacuum vessel internal surface area. The periscopes have f/8 optics and motor-driven controls for focusing, magnification selection (5°, 20°, and 60° field of view), elevation and azimuth setting, mast rotation, filter selection, iris aperture, and viewing port selection. The four viewing ports on each periscope are equipped with multiple imaging devices which include: (1) an inspection eyepiece, (2) standard (RCA TC2900) and fast (RETICON) framing rate television cameras, (3) a PtSi CCD infrared imaging camera, (4) a 35-mm Nikon F3 still camera, or (5) a 16-mm Locam II movie camera with variable framing rate up to 500 fps. Operation of the periscope-camera system is controlled either locally or remotely through a computer-CAMAC interface. A description of the equipment and examples of its application are presented.

Medley, S. S.; Dimock, D. L.; Hayes, S.; Long, D.; Lowrance, J. L.; Mastrocola, V.; Renda, G.; Ulrickson, M.; Young, K. M.

1985-10-01

209

Digital video camera for application of particle image velocimetry in high-speed flows  

NASA Astrophysics Data System (ADS)

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter, permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 µs with a trigger response time of less than 1 µs. The asynchronous reset capability permits the camera to trigger directly off a pulsed laser with a repetition rate differing from the standard 25 Hz CCIR video frame rate. Captured images are digitized within and stored in RAM in the camera, which can be read through the parallel port of a computer. The camera is software configurable, with the settings being non-volatile. Technical aspects such as sensor alignment and calibration through software are described. Close-up PIV measurements on a free jet illustrate that, in the future, the camera can be successfully utilized for imaging high-speed flows over a small field of view covering several cm2, such as the flow between turbine blades. Further, the electronic shutter permits its use in luminous environments such as illuminated laboratories, wind tunnels or flames.

Willert, Christian; Stasicki, Boleslaw; Raffel, Markus; Kompenhans, Juergen

1995-09-01

210

Review of the characteristics of 384x288 pixel THz camera for see-through imaging  

NASA Astrophysics Data System (ADS)

Terahertz is a field in constant expansion. Multiple applications are foreseen, including see-through imaging. To develop deployable systems, real-time two-dimensional cameras are needed rather than monopixel detectors or linear arrays that require mechanical scanning systems. INO has recently developed a real-time (video rate) 384x288 THz camera exhibiting excellent sensitivity and low noise levels. The core of the THz imager is the 35 µm pitch detector array that is based on INO's uncooled VOx microbolometer technology and fabricated in INO's clean room. A standard ceramic package is used for final packaging. The detector FPA is sealed with a high-resistivity float-zone silicon (HRFZ-Si) window having an anti-reflective coating of thick Parylene, the thickness of which depends on the required optimization wavelength. The FPA is mounted on an INO IRXCAM core, giving a passive THz camera assembly. The additional THz objective consists of a refractive 44 mm focal length F/1 THz lens. In this paper, a review of the characteristics of the THz camera is presented. The sensitivity of the camera at various THz wavelengths is presented along with examples of the resolution obtained with the IRXCAM-384-THz camera core. See-through imaging results are also presented.

Marchese, Linda; Terroux, Marc; Genereux, Francis; Tremblay, Bruno; Bolduc, Martin; Bergeron, Alain

2013-10-01

211

A digital video camera for application of particle image velocimetry in high-speed flows  

SciTech Connect

A high-speed digital camera based on video technology for application of particle image velocimetry in wind tunnels is described. The camera contains two independently triggerable interline CCD sensors which are mounted on two faces of a cube beam splitter, permitting the use of a single lens. Each of the sensors has a minimal exposure time of 0.8 µs with a trigger response time of less than 1 µs. The asynchronous reset capability permits the camera to trigger directly off a pulsed laser with a repetition rate differing from the standard 25 Hz CCIR video frame rate. Captured images are digitized within and stored in RAM in the camera, which can be read through the parallel port of a computer. The camera is software configurable, with the settings being non-volatile. Technical aspects such as sensor alignment and calibration through software are described. Close-up PIV measurements on a free jet illustrate that, in the future, the camera can be successfully utilized for imaging high-speed flows over a small field of view covering several cm², such as the flow between turbine blades. Further, the electronic shutter permits its use in luminous environments such as illuminated laboratories, wind tunnels or flames.

Willert, C.; Stasicki, B.; Raffel, M.; Kompenhans, J. [Deutsche Forschungsanstalt fuer Luft und Raumfahrt e.V., Goettingen (Germany). Inst. fuer Stroemungsmechanik

1995-12-31

212

Computer image correlation for the determination of camera station parameters  

Microsoft Academic Search

This paper describes the use of scanned images, electronic image display, and computer image correlation as a photogrammetric measuring system. The resulting measurements are used to determine space position and attitude angles which are required by the visual simulator developed by LTV Aerospace and Defense (LTVAD) for the U.S. Navy's A-7E Weapon System Trainer. Also, presented is the methodology for

D. R. Graff; B. K. Mudunuri

1985-01-01

213

Validation of 3D surface imaging in breath-hold radiotherapy for breast cancer: one central camera unit versus three camera units  

NASA Astrophysics Data System (ADS)

In this work we investigated the benefit of using two lateral camera units in addition to a central camera unit for 3D surface imaging for image guidance in deep-inspiration breath-hold (DIBH) radiotherapy, by comparison with cone-beam computed tomography (CBCT). Ten patients who received DIBH radiotherapy after breast-conserving surgery were included. The performance of surface imaging using one and three camera units was compared to using CBCT for setup verification. Breast-surface registrations were performed to the planning CT for CBCT as well as for 3D surfaces captured concurrently with CBCT. The resulting setup errors were compared with linear regression analysis. For the differences between setup errors, an assessment of the group mean, systematic error, random error, and 95% limits of agreement was made. Correlations between derived surface-imaging [one camera unit; three camera units] and CBCT setup errors were R² = [0.67; 0.75], [0.76; 0.87], [0.88; 0.91] in the left-right, cranio-caudal, and anterior-posterior directions, respectively. Group mean, systematic and random errors were slightly smaller (sub-millimeter differences) and the limits of agreement were 0.10 to 0.25 cm tighter when using three camera units compared with one. For the majority of the data, the use of three camera units compared with one resulted in setup errors more similar to the CBCT-derived setup errors for the cranio-caudal and anterior-posterior directions (p < 0.01, Wilcoxon signed-ranks test). This study shows a better correlation and agreement between 3D surface imaging and CBCT when three camera units are used instead of one, and further outlines the conditions under which the benefit of using three camera units is significant.
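The agreement statistics named above (group mean, systematic error, random error, 95% limits of agreement) can be sketched as follows. The decomposition shown, SD of per-patient mean differences for the systematic error and RMS of per-patient SDs for the random error, is a standard radiotherapy convention assumed here, not necessarily the authors' exact computation; the example data are synthetic:

```python
import numpy as np

def agreement_stats(diffs_per_patient):
    """diffs_per_patient: one 1D array of setup-error differences (cm)
    per patient."""
    means = np.array([d.mean() for d in diffs_per_patient])
    sds = np.array([d.std(ddof=1) for d in diffs_per_patient])
    group_mean = means.mean()                # M
    systematic = means.std(ddof=1)           # Sigma: SD of patient means
    random_err = np.sqrt(np.mean(sds ** 2))  # sigma: RMS of patient SDs
    all_d = np.concatenate(diffs_per_patient)
    half = 1.96 * all_d.std(ddof=1)
    loa = (all_d.mean() - half, all_d.mean() + half)  # 95% limits of agreement
    return group_mean, systematic, random_err, loa

# Synthetic example: 10 patients, 20 fractions each (values in cm).
rng = np.random.default_rng(1)
diffs = [rng.normal(0.05, 0.10, 20) for _ in range(10)]
m, sig, rnd, loa = agreement_stats(diffs)
```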

Alderliesten, Tanja; Betgen, Anja; van Vliet-Vroegindeweij, Corine; Remeijer, Peter

2013-03-01

214

A 5-18 micron array camera for high-background astronomical imaging  

NASA Technical Reports Server (NTRS)

A new infrared array camera system using a Hughes/SBRC 58 x 62 pixel hybrid Si:Ga array detector has been successfully applied to high-background 5-18-micron astronomical imaging observations. The off-axis reflective optical system minimizes thermal background loading and produces diffraction-limited images with negligible spatial distortion. The noise equivalent flux density (NEFD) of the camera at 10 microns on the 3.0-m NASA Infrared Telescope Facility, with broadband interference filters and 0.26 arcsec pixels, is NEFD = 0.01 Jy/√min per pixel (1σ), and it operates at a frame rate of 30 Hz with no compromise in observational efficiency. The electronic and optical design of the camera, its photometric characteristics, examples of observational results, and techniques for successful array imaging in a high-background astronomical application are discussed.

Gezari, Daniel Y.; Folz, Walter C.; Woods, Lawrence A.; Varosi, Frank

1992-01-01

215

Correction of Over-Exposed Images Captured by CellPhone Cameras  

Microsoft Academic Search

Due to the limited dynamic range of commonly used cell-phone camera sensors, the entire or part of an auto-exposure (AE) image may appear overexposed. In this paper, we present an over-exposure correction algorithm which utilizes one low-exposure (LE) image in addition to the AE image. It consists of registration, fusion, consistency verification, and merging components. Registration is achieved by using

Q. R. Razlighi; N. Kehtarnavaz

2007-01-01

216

Imaging with depth extension: where are the limits in fixed- focus cameras?  

NASA Astrophysics Data System (ADS)

The integration of novel optics designs, miniature CMOS sensors, and powerful digital processing into a single imaging module package is driving progress in handset camera systems in terms of performance, size (thinness) and cost. Miniature cameras incorporating high-resolution sensors and fixed-focus Extended Depth of Field (EDOF) optics allow close-range reading of printed material (barcode patterns, business cards), while providing high-quality imaging in more traditional applications. These cameras incorporate modified optics and digital processing to recover the soft-focus images and restore sharpness over a wide range of object distances. The effects of a variety of imaging-module parameters on the EDOF range were analyzed for a family of high-resolution CMOS modules. The parameters include various optical properties of the imaging lens and the characteristics of the sensor. The extension factors for the EDOF imaging module were defined in terms of an improved absolute resolution in object space while maintaining focus at infinity. This definition was applied to identify the minimally resolvable object details in mobile cameras with a bar-code reading feature.

Bakin, Dmitry; Keelan, Brian

2008-08-01

217

Determination of visual range during fog and mist using digital camera images  

NASA Astrophysics Data System (ADS)

During the winter of 2008, daily time series of images of five "unit-cell chequerboard" targets were acquired using a digital camera. The camera and targets were located in the Majura Valley approximately 3 km from Canberra airport. We show how the contrast between the black and white sections of the targets is related to the meteorological range (or standard visual range), and compare estimates of this quantity derived from images acquired during fog and mist conditions with those from the Vaisala FD-12 visibility meter operated by the Bureau of Meteorology at Canberra Airport. The two sets of ranges are consistent but show the variability of visibility in the patchy fog conditions that often prevail in the Majura Valley. Significant spatial variations of the light extinction coefficient were found to occur over the longest 570 m optical path sampled by the imaging system. Visual ranges could be estimated out to ten times the distance to the furthest target, or approximately 6 km, in these experiments. Image saturation of the white sections of the targets was the major limitation on the quantitative interpretation of the images. In the future, the camera images will be processed in real time so that the camera exposure can be adjusted to avoid saturation.
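The contrast-to-range relation underlying the method above is Koschmieder's law: the apparent contrast of a black/white target decays exponentially with distance, and the meteorological range uses a 5% contrast threshold. A minimal sketch; the 570 m path length is taken from the abstract, while the contrast values are invented for illustration:

```python
import numpy as np

# Koschmieder: C(x) = C0 * exp(-sigma * x), and the meteorological
# (standard visual) range V = -ln(0.05) / sigma ≈ 3.912 / sigma.

def extinction_from_contrast(c_apparent, c_inherent, distance_m):
    """Extinction coefficient sigma (1/m) from one target observation."""
    return -np.log(c_apparent / c_inherent) / distance_m

def meteorological_range(sigma):
    return 3.912 / sigma

# Illustrative numbers: inherent contrast 0.9 observed as 0.3 over 570 m.
sigma = extinction_from_contrast(0.3, 0.9, 570.0)
v = meteorological_range(sigma)
```

Image saturation of the white target sections, as the abstract notes, corrupts `c_apparent` and is the main practical limitation of this retrieval.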

Taylor, John R.; Moogan, Jamie C.

2010-08-01

218

Driving micro-optical imaging systems towards miniature camera applications  

NASA Astrophysics Data System (ADS)

Up to now, multi-channel imaging systems have been increasingly studied and approached from various directions in the academic domain due to their promising large field of view at small system thickness. However, specific drawbacks of each of the solutions have so far prevented their diffusion into the corresponding markets. The most severe problems are low image resolution and low sensitivity compared to a conventional single-aperture lens, besides the lack of a cost-efficient method of fabrication and assembly. We propose a micro-optical approach to ultra-compact optics for real-time vision systems that is inspired by the compound eyes of insects. The demonstrated modules achieve a VGA resolution with 700x550 pixels within an optical package of 6.8 mm x 5.2 mm and a total track length of 1.4 mm. The partial images that are separately recorded within different optical channels are stitched together to form a final image of the whole field of view by means of image processing. These software tools allow correction of the distortion of the individual partial images so that the final image is also free of distortion. The so-called electronic cluster eyes are realized by state-of-the-art micro-optical fabrication techniques and offer a resolution and sensitivity potential that makes them suitable for consumer, machine vision and medical imaging applications.

Brückner, Andreas; Duparré, Jacques; Dannberg, Peter; Leitel, Robert; Bräuer, Andreas

2010-05-01

219

Imaging performance of a multiwire proportional-chamber positron camera  

SciTech Connect

A new, fully three-dimensional Positron Camera design is presented, made of six MultiWire Proportional Chamber modules arranged to form the lateral surface of a hexagonal prism. A true coincidence rate of 56000 c/s is expected, with an equal accidental rate, for a 400 µCi activity uniformly distributed in a ~3 l water phantom. A detailed Monte Carlo program has been used to investigate the dependence of the spatial resolution on the geometrical and physical parameters. A spatial resolution of 4.8 mm FWHM has been obtained for a ¹⁸F point-like source in a 10 cm radius water phantom. The main properties of the limited-angle reconstruction algorithms are described in relation to the proposed detector geometry.

Perez-Mandez, V.; Del Guerra, A.; Nelson, W.R.; Tam, K.C.

1982-08-01

220

Binarization method based on evolution equation for document images produced by cameras  

NASA Astrophysics Data System (ADS)

We present an evolution equation-based binarization method for document images produced by cameras. Unlike existing thresholding techniques, the idea behind our method is that a family of gradually binarized images is obtained as the solution of an evolution partial differential equation, starting from the original image. In our formulation, the evolution is controlled by a global force and a local force, both of which have opposite sign inside and outside the objects of interest in the original image. A simple finite difference scheme with a significantly larger time step is used to solve the evolution equation numerically; the desired binarization is typically obtained after only one or two iterations. Experimental results on 122 camera document images show that our method yields good visual quality and OCR performance. PMID not listed.

Wang, Yan; He, Chuanjiang

2012-04-01

221

Camera sensor arrangement for crop/weed detection accuracy in agronomic images.  

PubMed

In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects. PMID:23549361

Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

2013-01-01

222

A mobile phone-based retinal camera for portable wide field imaging.  

PubMed

Digital fundus imaging is used extensively in the diagnosis, monitoring and management of many retinal diseases. Access to fundus photography is often limited by patient morbidity, high equipment cost and shortage of trained personnel. Advancements in telemedicine methods and the development of portable fundus cameras have increased the accessibility of retinal imaging, but most of these approaches rely on separate computers for viewing and transmission of fundus images. We describe a novel portable handheld smartphone-based retinal camera capable of capturing high-quality, wide field fundus images. The use of the mobile phone platform creates a fully embedded system capable of acquisition, storage and analysis of fundus images that can be directly transmitted from the phone via the wireless telecommunication system for remote evaluation. PMID:24344230

Maamari, Robi N; Keenan, Jeremy D; Fletcher, Daniel A; Margolis, Todd P

2014-04-01

223

Camera Sensor Arrangement for Crop/Weed Detection Accuracy in Agronomic Images  

PubMed Central

In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning in the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination, existing in outdoor environments, is also an important factor affecting the image accuracy. This paper is exclusively focused on two main issues, always with the goal to achieve the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters and (b) design of strategies for controlling the adverse illumination effects. PMID:23549361

Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

2013-01-01

224

A Fast Visible Camera Divertor-Imaging Diagnostic on DIII-D  

SciTech Connect

In recent campaigns, the Photron Ultima SE fast framing camera has proven to be a powerful diagnostic when applied to imaging divertor phenomena on the National Spherical Torus Experiment (NSTX). Active areas of NSTX divertor research addressed with the fast camera include identification of types of Edge Localized Modes (ELMs) [1], dust migration, impurity behavior and a number of phenomena related to turbulence. To compare such edge and divertor phenomena in low and high aspect ratio plasmas, a multi-institutional collaboration was developed for fast visible imaging on NSTX and DIII-D. More specifically, the collaboration was proposed to compare the NSTX small Type V ELM regime [2] and the residual ELMs observed during Type I ELM suppression with external magnetic perturbations on DIII-D [3]. As part of the collaboration effort, the Photron camera was recently installed on DIII-D with a tangential view similar to the view implemented on NSTX, enabling a direct comparison between the two machines. The rapid implementation was facilitated by utilization of the existing optics that coupled the visible spectral output from the divertor vacuum ultraviolet UVTV system, which has a view similar to the view developed for the divertor tangential TV camera [4]. A remote-controlled filter wheel was implemented, as was the radiation shield required for the DIII-D installation. The installation and initial operation of the camera are described in this paper, and the first images from the DIII-D divertor are presented.

Roquemore, A; Maingi, R; Lasnier, C; Nishino, N; Evans, T; Fenstermacher, M; Nagy, A

2007-06-19

225

In vitro near-infrared imaging of occlusal dental caries using germanium enhanced CMOS camera  

PubMed Central

The high transparency of dental enamel in the near-infrared (NIR) at 1310-nm can be exploited for imaging dental caries without the use of ionizing radiation. The objective of this study was to determine whether the lesion contrast derived from NIR transillumination can be used to estimate lesion severity. Another aim was to compare the performance of a new Ge enhanced complementary metal-oxide-semiconductor (CMOS) based NIR imaging camera with the InGaAs focal plane array (FPA). Extracted human teeth (n=52) with natural occlusal caries were imaged with both cameras at 1310-nm and the image contrast between sound and carious regions was calculated. After NIR imaging, teeth were sectioned and examined using more established methods, namely polarized light microscopy (PLM) and transverse microradiography (TMR) to calculate lesion severity. Lesions were then classified into 4 categories according to the lesion severity. Lesion contrast increased significantly with lesion severity for both cameras (p<0.05). The Ge enhanced CMOS camera equipped with the larger array and smaller pixels yielded higher contrast values compared with the smaller InGaAs FPA (p<0.01). Results demonstrate that NIR lesion contrast can be used to estimate lesion severity. PMID:22162916
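Under the common definition of transillumination contrast, C = (I_sound - I_lesion) / I_sound (an assumption here, since the record does not quote the study's exact formula), the calculation between the two regions is a few lines:

```python
import numpy as np

def lesion_contrast(img, sound_mask, lesion_mask):
    """Contrast between sound and carious regions of a transillumination
    image; the masks are boolean arrays selecting the two regions."""
    i_sound = img[sound_mask].mean()
    i_lesion = img[lesion_mask].mean()
    return (i_sound - i_lesion) / i_sound
```

Contrast rises from 0 toward 1 as the lesion darkens relative to sound enamel, consistent with the severity trend reported above.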

Lee, Chulsung; Darling, Cynthia L.; Fried, Daniel

2011-01-01

226

The image pretreatment based on the FPGA inside digital CCD camera  

NASA Astrophysics Data System (ADS)

In a space project, a digital CCD camera was required that could image clearly in a 1 lux light environment. The CCD sensor ICX285AL produced by SONY Co., Ltd. is used in the camera, and the FPGA (Field Programmable Gate Array) chip XQR2V1000 serves as the timing generator and signal processor inside it. In the low-light environment, however, two kinds of random noise become apparent as the camera's variable gain is raised: dark current noise in the image background, and vertical transfer noise. This paper introduces a real-time method for eliminating this noise, based on the FPGA inside the CCD camera. The causes and characteristics of the random noise are analyzed. First, several approaches to eliminating dark current noise were proposed and then simulated in VC++ to compare their speed and effect; a Gaussian filter was chosen for its filtering performance. The vertical transfer noise has the characteristic that the noise points fall at regular ordinates in the image's two-dimensional coordinates; its appearance is fixed, with the gray value of the noise points 16-20 below that of the surrounding pixels. Based on these characteristics, a local median filter is used to remove the vertical noise. Finally, these algorithms were ported into the FPGA chip inside the CCD camera. Extensive experiments have shown that the pretreatment has good real-time performance and improves the digital CCD camera's signal-to-noise ratio by 3-5 dB in the low-light environment.
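A plain-Python sketch of the two-step pretreatment described above (the window sizes, the 5-column neighbourhood, and the detection threshold are assumptions; the real implementation runs inside the FPGA):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def pretreat(frame, sigma=1.0, delta=16.0):
    """Remove vertical-transfer noise, then smooth dark-current noise.

    Columns whose mean gray value sits well below that of their
    neighbours (the 16-20 gray-level deficit noted above) are treated
    as vertical-transfer noise and replaced by a horizontal median;
    a Gaussian filter then suppresses the background dark-current noise.
    """
    f = frame.astype(float)
    col_mean = f.mean(axis=0)
    neigh = np.convolve(col_mean, np.ones(5) / 5.0, mode='same')
    bad = np.where(col_mean < neigh - delta / 2.0)[0]  # suspect columns
    med = median_filter(f, size=(1, 5))                # row-wise median
    f[:, bad] = med[:, bad]
    return gaussian_filter(f, sigma)
```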

Tian, Rui; Liu, Yan-ying

2009-07-01

227

First Light with RATIR: An Automated 6-band Optical/NIR Imaging Camera  

NASA Astrophysics Data System (ADS)

The Reionization and Transients InfraRed camera (RATIR) is a simultaneous optical/NIR multi-band imaging camera which is 100% time-dedicated to the follow-up of Gamma-Ray Bursts. The camera is mounted on the 1.5-meter Johnson telescope of the Mexican Observatorio Astronomico Nacional on Sierra San Pedro Martir in Baja California. With rapid slew and autonomous interrupt capabilities, the system images GRBs in 6 bands (i, r, Z, Y, J, and H) within minutes of receiving a satellite position, detecting optically faint afterglows in the NIR and quickly alerting the community to potential GRBs at high redshift (z>6-10). We report here on this Spring's first-light observing campaign with RATIR. We summarize the instrument's characteristics, capabilities, and observing modes.

Butler, Nat; Klein, Chris; Fox, Ori; Lotkin, Gennadiy; Bloom, Josh; Prochaska, J. Xavier; Ramirez-Ruiz, Enrico; de Diego, José A.; Georgiev, Leonid; González, Jesús; Lee, William H.; Richer, Michael G.; Román, Carlos; Watson, Alan M.; Gehrels, Neil; Kutyrev, Alexander; Bernstein, Rebecca; Alvarez, Luis Carlos; Ceseña, Urania; Clark, David; Colorado, Enrique; Córdova, Antolín.; Farah, Alejandro; García, Benjamín.; Guisa, Gerardo; Herrera, Joel; Lazo, Francisco; López, Eduardo; Luna, Esteban; Martínez, Benjamín.; Murillo, Francisco; Murillo, José Manuel; Núñez, Juan Manuel; Pedrayes, M. Herlinda; Quirós, Fernando; Ochoa, José Luis; Sierra, Gerardo; Moseley, Harvey; Rapchun, David; Robinson, Frederick D.; Samuel, Mathew V.; Sparr, Leroy M.

2012-09-01

228

Camera correlation focus: an image-based focusing technique  

Microsoft Academic Search

Determining the focal position of an overlay target with respect to an objective lens is an important prerequisite of overlay metrology. At best, an out-of-focus image will provide less than optimal information for metrology; focal depth for a high-NA imaging system at the required magnification is of the order of 5 microns. In most cases poor focus will lead to

Greg Reynolds; Mike Hammond; Lewis A. Binns

2005-01-01

229

The application of camera calibration in range-gated 3D imaging technology  

NASA Astrophysics Data System (ADS)

Range-gated laser imaging technology was proposed in 1966 by L. F. Gillespie of the U.S. Army Night Vision Laboratory (NVL). Using a pulsed laser and an intensified charge-coupled device (ICCD) as light source and detector respectively, range-gated laser imaging can realize space-slice imaging while suppressing atmospheric backscatter, and in turn detect the target effectively, by controlling the delay between the laser pulse and the strobe. Owing to constraints on the development of key components such as narrow-pulse lasers and gated imaging devices, research progressed slowly over the following decades. Since the beginning of this century, as the hardware technology has matured, the technique has developed rapidly in fields such as night vision, underwater imaging, biomedical imaging and three-dimensional imaging, especially range-gated three-dimensional (3-D) laser imaging aimed at acquiring target spatial information. 3-D reconstruction is the process of restoring the visible surface geometric structure of 3-D objects from two-dimensional (2-D) images. Range-gated laser imaging can achieve gated imaging of a slice of space to form a slice image, and in turn provide the distance information corresponding to that slice image. But to invert the information of 3-D space, we need to obtain the imaging field of view of the system, that is, the focal length of the system. Then, based on the distance information of the space slice, the spatial information of each unit of space corresponding to each pixel can be inverted. Camera calibration is an indispensable step in 3-D reconstruction, covering both the internal camera parameters and the external parameters. In order to meet the technical requirements of range-gated 3-D imaging, this paper studies the calibration of a zoom lens system.
After a comprehensive summary of camera calibration techniques, a classic line-based calibration method is selected. A one-to-one correspondence between the field of view and the focal length of the system is obtained, offering effective field-of-view information for matching the imaging field and the illumination field in range-gated 3-D imaging. On the basis of the experimental results, combined with depth-of-field theory, the application of camera calibration in range-gated 3-D imaging technology is further studied.

Liu, Xiao-quan; Wang, Xian-wei; Zhou, Yan

2013-09-01

230

Interactive alignment and image reconstruction for wafer-level multi-aperture camera systems  

NASA Astrophysics Data System (ADS)

Assembly of miniaturized high-resolution cameras is typically carried out by active alignment. The sensor image is constantly monitored while the lens stack is adjusted. When sharpness is acceptable in all regions of the image, the lens position over the sensor is fixed. For multi-aperture cameras, this approach is not sufficient. During prototyping, it is beneficial to see the complete reconstructed image, assembled from all optical channels. However, typical reconstruction algorithms are high-quality offline methods that require calibration. As the geometric setup of the camera repeatedly changes during assembly, this would require frequent re-calibration. We present a real-time algorithm for an interactive preview of the reconstructed image during camera alignment. With this algorithm, systematic alignment errors can be tracked and corrected during assembly. Known imperfections of optical components can also be included in the reconstruction. Finally, the algorithm easily maps to very simple GPU operations, making it ideal for applications in mobile devices where power consumption is critical.

Oberdörster, Alexander; Brückner, Andreas; Lensch, Hendrik P. A.

2014-09-01

231

Status of an Atmospheric Cherenkov Imaging Camera for the CANGAROOIII Experiment  

E-print Network

Status report on the atmospheric Cherenkov imaging camera for the CANGAROO-III experiment, an array of ground-based imaging atmospheric Cherenkov telescopes in Australia, covering observations with the first telescope and the perspective for the full array.

Enomoto, Ryoji

232

RECONSTRUCTION OF ISOMETRICALLY DEFORMABLE FLAT SURFACES IN 3D FROM MULTIPLE CAMERA IMAGES  

E-print Network

A reconstruction of a sheet of paper is shown. Index Terms: machine vision, isometric non-rigid reconstruction, manifold learning. The motivating application is to infer the structure of a non-rigid isometric surface from multiple camera images.

233

Strong-Lensing Analysis of A1689 from Deep Advanced Camera Images  

Microsoft Academic Search

We analyze deep multicolor Advanced Camera images of the largest known gravitational lens, A1689. Radial and tangential arcs delineate the critical curves in unprecedented detail, and many small counterimages are found near the center of mass. We construct a flexible light deflection field to predict the appearance and positions of counterimages. The model is refined as new counterimages are identified

Tom Broadhurst; Narciso Benítez; Dan Coe; Keren Sharon; Kerry Zekser; Rick White; Holland Ford; Rychard Bouwens; John Blakeslee; Marc Clampin; Nick Cross; Marijn Franx; Brenda Frye; George Hartig; Garth Illingworth; Leopoldo Infante; Felipe Menanteau; Gerhardt Meurer; Marc Postman; D. R. Ardila; F. Bartko; R. A. Brown; C. J. Burrows; E. S. Cheng; P. D. Feldman; D. A. Golimowski; T. Goto; C. Gronwall; D. Herranz; B. Holden; N. Homeier; J. E. Krist; M. P. Lesser; A. R. Martel; G. K. Miley; P. Rosati; M. Sirianni; W. B. Sparks; S. Steindling; H. D. Tran; Z. I. Tsvetanov; W. Zheng

2005-01-01

234

Planning and Control of Mobile Robots in Image Space from Overhead Cameras  

E-print Network

Planning and Control of Mobile Robots in Image Space from Overhead Cameras. Rahul S. Rao, Vijay Kumar, Camillo J. Taylor, General Robotics, Automation, Sensing and Perception (GRASP) Laboratory. Abstract: In this work, we present a framework for the development of a planar mobile robot controller.

Taylor, Camillo J.

235

A multiple-plate, multiple-pinhole camera for X-ray gamma-ray imaging  

NASA Technical Reports Server (NTRS)

Plates with identical patterns of precisely aligned pinholes constitute a lens system which, when rotated about its optical axis, produces a continuous high-resolution image of a low-energy X-ray or gamma-ray source. The camera has applications in radiation treatment and nuclear medicine.

Hoover, R. B.

1971-01-01

236

A survey of Martian dust devil activity using Mars Global Surveyor Mars Orbiter Camera images  

Microsoft Academic Search

A survey of dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images has been undertaken. The survey comprises two parts: (1) sampling of nine broad regions from September 1997 to July 2001 and (2) a focused seasonal monitoring of variability in the Amazonis region, an active dust devil site, from

Jenny A. Fisher; Mark I. Richardson; Claire E. Newman; Mark A. Szwast; Chelsea Graf; Shabari Basu; Shawn P. Ewald; Anthony D. Toigo; R. John Wilson

2005-01-01

237

THz imaging and radiometric measurements using a microbolometer-based camera  

Microsoft Academic Search

A THz VOx-based 160×120 microbolometer FPA camera with 52 μm pixel pitch has been developed at INO. Radiometric NEP values have been measured at different wavelengths. Real-time video-rate reflectance imaging using a large beam area from a high-power FIR optically pumped THz laser has also been performed, demonstrating high sensitivity for stand-alone applications.

M. Bolduc; M. Terroux; L. Marchese; B. Tremblay; E. Savard; M. Doucet; H. Oulachgar; C. Alain; H. Jeronimek; A. Bergeron

2011-01-01

238

Future Planetary Surface Imager Development by the Beagle 2 Stereo Camera System Team  

NASA Astrophysics Data System (ADS)

The Stereo Camera System provided Beagle 2 with wide-angle multi-spectral stereo imaging (IFOV=0.043°). The SCS team plans to build on this design heritage to provide improved stereo capabilities to the Pasteur payload of the Aurora ExoMars rover.

Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.

2004-03-01

239

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide field and planetary camera 2\\/Hubble Space Telescope (WFPC2\\/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

240

Geologic Analysis of the Surface Thermal Emission Images Taken by the VMC Camera, Venus Express  

NASA Astrophysics Data System (ADS)

Analysis of Venus Monitoring Camera 1-µm images and surface emission modeling showed that the apparent emissivity of Chimon-mana tessera and Tuulikki volcano is higher than that of the adjacent plains; Maat Mons did not show any signature of ongoing volcanism.

Basilevsky, A. T.; Shalygin, E. V.; Titov, D. V.; Markiewicz, W. J.; Scholten, F.; Roatsch, Th.; Fiethe, B.; Osterloh, B.; Michalik, H.; Kreslavsky, M. A.; Moroz, L. V.

2010-03-01

241

Creating a Composite Virtual Birds-eye View at a Vehicle Intersection by Synthesizing Camera Images from Multiple Vehicles  

Microsoft Academic Search

We propose generating a virtual bird's-eye view by integrating images seen from vehicles that enter an intersection for supporting traffic safety in ITS. Since more and more vehicles are equipped with video cameras today, it is becoming possible for these vehicles to support each other on an individual level using camera images. In this paper, we integrate them by projecting

Daisuke OTA; Shintaro ONO; Katsuhi IKEUCHI

2007-01-01

242

A fine image motion compensation method for the panoramic TDI CCD camera in remote sensing applications  

NASA Astrophysics Data System (ADS)

The image motion caused by camera housing rotation is inherent in a panoramic TDI CCD camera, and it cannot be eliminated completely by traditional motion compensation schemes. After studying the operation of a typical panoramic TDI CCD imaging system, we present a fine, rolling-rate-independent motion compensation method. Employing a correction factor k, a TDI CCD line transfer synchronization signal is generated by a high-resolution optical encoder whenever the ground scene moves one pixel with respect to the detector. The effect of the motion compensation method is then evaluated using the Monte Carlo method. The simulation results indicate that the magnitude of the modulation transfer function at the Nyquist frequency increases more than three times when the number of subdivision steps is increased from 100 to 500, and the imaging experiments likewise show that an improvement in image quality can be achieved.
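The synchronization logic described above can be sketched in a few lines; the function name and the counts-per-pixel scaling are illustrative assumptions, not the paper's implementation:

```python
def line_transfer_events(encoder_counts, k, counts_per_pixel):
    """Return the sample indices at which a TDI line-transfer pulse fires.

    A pulse is emitted each time the corrected encoder count k * c
    advances by one pixel's worth of scene motion, so line transfers
    stay locked to the ground scene regardless of the rolling rate.
    """
    events = []
    next_edge = counts_per_pixel
    for i, c in enumerate(encoder_counts):
        while k * c >= next_edge:
            events.append(i)
            next_edge += counts_per_pixel
    return events
```

For example, with k = 1.0 and 25 encoder counts per pixel, a steadily increasing count stream yields one pulse for every 25 counts of corrected motion.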

Wang, Dejiang; Li, Wenming; Yao, Yuan; Huang, Houtian; Wang, Yutang

2013-07-01

243

Chandra High Resolution Camera Imaging of GRS 1758-258  

E-print Network

We observed the "micro-quasar" GRS 1758-258 four times with Chandra. Two HRC-I observations were made in 2000 September-October spanning an intermediate-to-hard spectral transition (identified with RXTE). Another HRC-I and an ACIS/HETG observation were made in 2001 March following a hard-to-soft transition to a very low flux state. Based on the three HRC images and the HETG zero order image, the accurate position (J2000) of the X-ray source is RA = 18h 01m 12.39s, Dec = -25d 44m 36.1s (90% confidence radius = 0".45), consistent with the purported variable radio counterpart. All three HRC images are consistent with GRS 1758-258 being a point source, indicating that any bright jet is less than ~1 light-month in projected length, assuming a distance of 8.5 kpc.

W. A. Heindl; D. M. Smith

2002-08-19

244

Film cameras or digital sensors? The challenge ahead for aerial imaging  

USGS Publications Warehouse

Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at 11 µm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid state charge coupled device linear and area arrays can yield quality resolution (7 to 12 µm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and put into a suitable homogeneous scene that is acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems show that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system for the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
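The quoted pixel count can be sanity-checked: assuming the standard 23 cm × 23 cm aerial film frame (an assumption; the record does not state the format), scanning at an 11 µm spot size gives a figure on the order of the stated 432 million pixels.

```python
frame_mm = 230.0        # standard 23 cm x 23 cm aerial frame (assumed)
spot_mm = 0.011         # 11 micrometre scanning spot size
pixels = (frame_mm / spot_mm) ** 2
print(f"{pixels / 1e6:.0f} million pixels")  # about 437 million
```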

Light, D.L.

1996-01-01

245

Development of an imaging hyperspectral camera using the ultraviolet and visible wavelength AOTF  

NASA Astrophysics Data System (ADS)

A spectroscopic camera has been developed which has a spectral resolution of better than 1.5 nm in the ultraviolet (UV) and visible wavelength bands (320-580 nm). Its main components are a specially coated UV objective lens, a UV Acousto-Optic Tunable Filter (AOTF) with a thermo-electric cooling system, and an imaging system based on a high-gain avalanche rushing amorphous photoconductor (HARP) developed by NHK Science and Technical Research Laboratories. Research is currently under way to develop the hyperspectral camera into a sensor package for airborne and ultimately space-based remote sensing applications. This paper presents the basic principle and configuration of the hyperspectral camera, and gives details of tests to measure its performance. The results of spectral resolution tests, resolving two closely spaced lines from a helium-discharge lamp, demonstrate the camera's high spectral resolution. Full-color and spectral images obtained in a spectrometry experiment are also presented to demonstrate the camera's hyperspectral capabilities.

Kurosaki, Hirohisa; Shingu, Hirokimi; Enkyo, Shigeharu; Suzuki, Takao; Tanioka, Kenkichi; Takefuji, Yoshiyasu

2003-04-01

246

Research on the affect of differential-images technique to the resolution of infrared spatial camera  

NASA Astrophysics Data System (ADS)

The optical system of an infrared spatial camera adopts a large relative aperture and a large pixel size on the focal plane element. This makes the system bulky and limits its resolution, so the potential of the optical system cannot be exploited fully. This paper therefore introduces a method for improving the resolution of an infrared spatial camera based on multi-frame difference images. The method uses more than one detector to acquire several difference images, and then reconstructs a new high-resolution image from these images through the relationship between pixel gray values. The difference-image technique using more than two detectors is investigated; it can improve the resolution 2.5 times in theory. The relationship between the pixel gray values of the low-resolution difference images and the high-resolution image is found by analyzing the energy of CCD sampling, and a general relationship is given between the achievable resolution gain of the differential method and the minimum number of CCDs required for detection. Based on this theory, the use of the difference-image technique to improve image resolution was simulated in Matlab, taking a representative image as the object, with the software outputting the result as an image. The results obtained prove that the technique is viable for high-resolution image reconstruction. The resolution of an infrared spatial camera can be improved markedly, while holding the optical structure size constant or using a large detector, by applying the difference-image technique, so the technique is of high value in optical remote sensing.
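As an illustration of how sub-pixel-shifted low-resolution frames combine onto a denser grid, here is a classic shift-and-add interleave for four half-pixel-offset frames. This is a simplification under assumed ideal shifts, not the gray-value reconstruction developed in the paper:

```python
import numpy as np

def interleave_2x(f00, f01, f10, f11):
    """Interleave four low-resolution frames, sampled at half-pixel
    offsets (0,0), (0,1/2), (1/2,0) and (1/2,1/2), onto a grid with
    twice the sampling density in each axis."""
    H, W = f00.shape
    hi = np.empty((2 * H, 2 * W), dtype=float)
    hi[0::2, 0::2] = f00   # even rows, even cols
    hi[0::2, 1::2] = f01   # even rows, odd cols
    hi[1::2, 0::2] = f10   # odd rows, even cols
    hi[1::2, 1::2] = f11   # odd rows, odd cols
    return hi
```

Real systems must also deblur and register the frames; this sketch only shows why multiple offset detectors carry more spatial information than any single one.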

Jin, Guang; An, Yuan; Qi, Yingchun; Hu, Fusheng

2007-12-01

247

A Survey of Martian Dust Devil Activity Using Mars Global Surveyor Mars Orbiter Camera Images  

Microsoft Academic Search

We present results from an orbital survey of Martian dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images. The survey includes all available imaging data (mapping and pre-mapping orbit), through to mission phase E06. Due to the large volume of data, we have concentrated on surveying limited regions, selected variously

J. Fisher; M. I. Richardson; S. P. Ewald; A. D. Toigo; R. J. Wilson

2002-01-01

248

The iQID camera: An ionizing-radiation quantum imaging detector  

NASA Astrophysics Data System (ADS)

We have developed and tested a novel, ionizing-radiation Quantum Imaging Detector (iQID). This scintillation-based detector was originally developed as a high-resolution gamma-ray imager, called BazookaSPECT, for use in single-photon emission computed tomography (SPECT). Recently, we have investigated the detector's response and imaging potential with other forms of ionizing radiation including alpha, neutron, beta, and fission fragment particles. The confirmed response to this broad range of ionizing radiation has prompted its new title. The principle of operation of the iQID camera involves coupling a scintillator to an image intensifier. The scintillation light generated by particle interactions is optically amplified by the intensifier and then re-imaged onto a CCD/CMOS camera sensor. The intensifier provides sufficient optical gain that practically any CCD/CMOS camera can be used to image ionizing radiation. The spatial location and energy of individual particles are estimated on an event-by-event basis in real time using image analysis algorithms on high-performance graphics processing hardware. Distinguishing features of the iQID camera include portability, large active areas, excellent detection efficiency for charged particles, and high spatial resolution (tens of microns). Although modest, the iQID's energy resolution is sufficient to discriminate between particles. Additionally, spatial features of individual events can be used for particle discrimination. An important iQID imaging application that has recently been developed is real-time, single-particle digital autoradiography. We present the latest results and discuss potential applications.
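The per-event position estimate can be illustrated with a simple intensity-weighted centroid over above-threshold pixels. This is one common estimator and an assumption here; the iQID pipeline's actual GPU algorithms are more elaborate:

```python
import numpy as np

def event_centroid(frame, thresh):
    """Estimate the (row, col) position and summed energy of a single
    scintillation flash from one camera frame."""
    w = np.where(frame > thresh, frame.astype(float), 0.0)
    total = w.sum()
    if total == 0.0:
        return None                     # no event in this frame
    rows, cols = np.indices(frame.shape)
    r = (rows * w).sum() / total
    c = (cols * w).sum() / total
    return (r, c), total
```

The summed intensity serves as a crude energy estimate, which is how a modest energy resolution can still separate particle types.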

Miller, Brian W.; Gregory, Stephanie J.; Fuller, Erin S.; Barrett, Harrison H.; Bradford Barber, H.; Furenlid, Lars R.

2014-12-01

249

CMOS image sensors: electronic camera-on-a-chip  

Microsoft Academic Search

CMOS active pixel sensors (APS) have performance competitive with charge-coupled device (CCD) technology, and offer advantages in on-chip functionality, system power reduction, cost, and miniaturization. This paper discusses the requirements for CMOS image sensors and their historical development; CMOS devices and circuits for pixels, the analog signal chain, and on-chip analog-to-digital conversion are reviewed and discussed.

Eric R. Fossum

1997-01-01

250

Artificially reproduced image of earth photographed by UV camera  

NASA Technical Reports Server (NTRS)

A reproduction of a color enhancement of a picture photographed in far-ultraviolet light by Astronaut John W. Young, Apollo 16 commander, showing the Earth. Note this is an artificially reproduced image. The three auroral belts, the sunlit atmosphere and background stars are visible.

1972-01-01

251

Preliminary experience with small animal SPECT imaging on clinical gamma cameras.  

PubMed

The traditional lack of techniques suitable for in vivo imaging has generated great interest in molecular imaging for preclinical research. Nevertheless, its use is spreading slowly due to the difficulty of justifying the high cost of current dedicated preclinical scanners. An alternative for lowering the costs is to repurpose old clinical gamma cameras for preclinical imaging. In this paper we assess the performance of a portable device working coupled to a single-head clinical gamma camera, and we present our preliminary experience in several small animal applications. Our findings, based on phantom experiments and animal studies, show an image quality, in terms of the contrast-noise trade-off, comparable to dedicated preclinical pinhole-based scanners. We feel that our portable device offers an opportunity to recycle the widely available clinical gamma cameras in nuclear medicine departments for small animal SPECT imaging, and we hope that it can contribute to spreading the use of preclinical imaging within institutions on tight budgets. PMID:24963478

Aguiar, P; Silva-Rodríguez, J; Herranz, M; Ruibal, A

2014-01-01

252

High-speed camera with real time processing for frequency domain imaging.  

PubMed

We describe a high-speed camera system for frequency domain imaging suitable for applications such as in vivo diffuse optical imaging and fluorescence lifetime imaging. 14-bit images are acquired at 2 gigapixels per second and analyzed with real-time pipeline processing using field programmable gate arrays (FPGAs). Performance of the camera system has been tested both for RF-modulated laser imaging in combination with a gain-modulated image intensifier and a simpler system based upon an LED light source. System amplitude and phase noise are measured and compared against theoretical expressions in the shot noise limit presented for different frequency domain configurations. We show the camera itself is capable of shot noise limited performance for amplitude and phase in as little as 3 ms, and when used in combination with the intensifier the noise levels are nearly shot noise limited. The best phase noise in a single pixel is 0.04 degrees for a 1 s integration time. PMID:21750770

Shia, Victor; Watt, David; Faris, Gregory W

2011-07-01

253

High-speed camera with real time processing for frequency domain imaging  

PubMed Central

We describe a high-speed camera system for frequency domain imaging suitable for applications such as in vivo diffuse optical imaging and fluorescence lifetime imaging. 14-bit images are acquired at 2 gigapixels per second and analyzed with real-time pipeline processing using field programmable gate arrays (FPGAs). Performance of the camera system has been tested both for RF-modulated laser imaging in combination with a gain-modulated image intensifier and a simpler system based upon an LED light source. System amplitude and phase noise are measured and compared against theoretical expressions in the shot noise limit presented for different frequency domain configurations. We show the camera itself is capable of shot noise limited performance for amplitude and phase in as little as 3 ms, and when used in combination with the intensifier the noise levels are nearly shot noise limited. The best phase noise in a single pixel is 0.04 degrees for a 1 s integration time. PMID:21750770

Shia, Victor; Watt, David; Faris, Gregory W.

2011-01-01

254

Digital Camera Calibration Using Images Taken from AN Unmanned Aerial Vehicle  

NASA Astrophysics Data System (ADS)

For calibrating the camera, an accurate determination of the interior orientation parameters is needed. For more accurate results, the calibration images should be taken under conditions that are similar to the field samples. The aim of this work is the establishment of an efficient and accurate digital camera calibration method to be used in particular working conditions, as can be found in our UAV (Unmanned Aerial Vehicle) photogrammetric projects. The UAV used in this work was the md4-200 made by Microdrones. The microdrone is also equipped with a standard non-metric digital camera, the Pentax Optio A40. To find the interior orientation parameters of the digital camera, two calibration methods were carried out: a lab calibration based on a flat pattern and a field calibration. In both cases, Photomodeler Scanner software was used. The lab calibration process was completely automatic using a calibration grid. The focal length was fixed at the widest angle and the network included a total of twelve images with ±90° roll angles. For the field calibration, a flight plan was programmed including a total of twelve images. In the same way as in the lab calibration, the focal length was fixed at the widest angle. The field test used in the study was a flat surface located on the University of Almería campus where a set of 67 target points were placed. The calibration field area was approximately 25 × 25 m and the flight altitude above ground was 50 m. After the software processing, the camera calibration parameter values were obtained. The paper presents the process, the results and the accuracy of these calibration methods. The field calibration method reduced the final total error obtained in the previous lab calibration, and the overall RMSs obtained from both methods are similar. Therefore we will apply the field calibration results to all our photogrammetric projects in which the flight height will be close to 50 m.
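The link between flight height and achievable image scale in such a project can be sketched with the standard ground sample distance relation. The pixel pitch and focal length below are illustrative values, not the Pentax Optio A40's specifications:

```python
def ground_sample_distance(pixel_pitch_mm: float, focal_mm: float, altitude_m: float) -> float:
    """Ground sample distance (metres per pixel) for a nadir-pointing camera."""
    return pixel_pitch_mm / focal_mm * altitude_m

# e.g. 2 micron pixel pitch, 8 mm focal length, 50 m flight height
print(ground_sample_distance(0.002, 8.0, 50.0))  # 0.0125 m/pixel
```

The same relation explains why the field calibration was flown at the working altitude: the imaging geometry, and hence the distortion pattern across the frame, matches the production flights.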

Pérez, M.; Agüera, F.; Carvajal, F.

2011-09-01

255

Wide Field Camera 3: A Powerful New Imager for the Hubble Space Telescope  

NASA Technical Reports Server (NTRS)

Wide Field Camera 3 (WFC3) is a powerful UV/visible/near-infrared camera in development for installation into the Hubble Space Telescope during upcoming Servicing Mission 4. WFC3 provides two imaging channels. The UVIS channel incorporates a 4096 x 4096 pixel CCD focal plane with sensitivity from 200 to 1000 nm. The IR channel features a 1024 x 1024 pixel HgCdTe focal plane covering 850 to 1700 nm. We report here on the design of the instrument, the performance of its flight detectors, results of the ground test and calibration program, and the plans for the Servicing Mission installation and checkout.

Kimble, Randy

2008-01-01

256

Multi-camera: interactive rendering of abstract digital images  

E-print Network

Rendering of Abstract Digital Images. (December 2003) Jeffrey Statler Smith, B.E.D., Texas A&M University Chair of Advisory Committee: Dr. Ergun Akleman The purpose of this thesis is the development of an interactive computer-generated rendering system... techniques used by many cubist and futurist artists of the early twentieth century.

Smith, Jeffrey Statler

2004-09-30

257

Single Leaf Area Measurement Using Digital Camera Image  

Microsoft Academic Search

Leaf area index is one of the most important parameters in ecological and environmental studies. This paper presents a method for single leaf area measurement based on counting the leaf pixels in a digital leaf image. Initially, the target leaf is put on a piece of white paper on which four endpoints of an equal-length and orthogonal cross are

Baisong Chen; Zhuo Fu; Yuchun Pan; Jihua Wang; Zhixuan Zeng

2010-01-01

258

The trustworthy digital camera: Restoring credibility to the photographic image  

NASA Technical Reports Server (NTRS)

The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continues to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
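The core authentication idea can be sketched in a few lines. Note that a real trustworthy camera would use an asymmetric (public-key) signature so that anyone can verify without holding a secret; the keyed HMAC below is a simplified symmetric stand-in, and the key and data values are hypothetical:

```python
import hashlib
import hmac

def sign_image(image_bytes: bytes, key: bytes) -> str:
    """Digest the image bytes and produce an authentication tag."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, key: bytes, tag: str) -> bool:
    """True only if the image is bit-for-bit unaltered."""
    return hmac.compare_digest(sign_image(image_bytes, key), tag)

key = b"camera-embedded-secret"      # hypothetical in-camera key
original = b"...raw sensor data..."  # stands in for the image file
tag = sign_image(original, key)
print(verify_image(original, key, tag))          # True
print(verify_image(original + b"x", key, tag))   # False: any edit breaks the tag
```

The essential property is the same as in the paper: any post-capture alteration, however small, invalidates the tag computed at capture time.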

Friedman, Gary L.

1994-01-01

259

MONICA: A Compact, Portable Dual Gamma Camera System for Mouse Whole-Body Imaging  

PubMed Central

Introduction We describe a compact, portable dual-gamma camera system (named “MONICA” for MObile Nuclear Imaging CAmeras) for visualizing and analyzing the whole-body biodistribution of putative diagnostic and therapeutic single photon emitting radiotracers in animals the size of mice. Methods Two identical, miniature pixelated NaI(Tl) gamma cameras were fabricated and installed “looking up” through the tabletop of a compact portable cart. Mice are placed directly on the tabletop for imaging. Camera imaging performance was evaluated with phantoms and field performance was evaluated in a weeklong In-111 imaging study performed in a mouse tumor xenograft model. Results Tc-99m performance measurements, using a photopeak energy window of 140 keV ± 10%, yielded the following results: spatial resolution (FWHM at 1 cm), 2.2 mm; sensitivity, 149 cps/MBq (5.5 cps/μCi); energy resolution (FWHM), 10.8%; count rate linearity (count rate vs. activity), r² = 0.99 for 0–185 MBq (0–5 mCi) in the field-of-view (FOV); spatial uniformity, < 3% count rate variation across the FOV. Tumor and whole-body distributions of the In-111 agent were well visualized in all animals in 5-minute images acquired throughout the 168-hour study period. Conclusion Performance measurements indicate that MONICA is well suited to whole-body single photon mouse imaging. The field study suggests that inter-device communications and user-oriented interfaces included in the MONICA design facilitate use of the system in practice. We believe that MONICA may be particularly useful early in the (cancer) drug development cycle where basic whole-body biodistribution data can direct future development of the agent under study and where logistical factors, e.g. limited imaging space, portability, and, potentially, cost are important. PMID:20346864

Xi, Wenze; Seidel, Jurgen; Karkareka, John W.; Pohida, Thomas J.; Milenic, Diane E.; Proffitt, James; Majewski, Stan; Weisenberger, Andrew G.; Green, Michael V.; Choyke, Peter L.

2009-01-01

260

Method of improving the accuracy of image motion measurement for panoramic aerial cameras  

NASA Astrophysics Data System (ADS)

To improve the accuracy of image motion measurement for panoramic aerial cameras, this paper proposes a method of real-time image motion measurement based on 2D spatial correlation. An auxiliary area CCD is positioned parallel to the main linear TDI CCD in the focal plane of the camera. For every frame of the image sequence taken by the auxiliary CCD, a prediction of the displacement of the current image in the flight direction is made based on the forward image motion velocity. The current and the reference images are combined together for 2D correlation, which produces an image motion vector. By decomposing the vector, both the forward and pendulum-swing image motions can be determined. The demanding computational requirements of the real-time 2D correlation are met by a joint transform optical correlator. Simulation test results show that the accuracy is improved and the measurement error is within 0.2 pixels for input images with SNR = 1 dB.
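The 2D-correlation step can be sketched digitally with a plain FFT cross-correlation that recovers the integer-pixel displacement between a reference and a current frame. This is a minimal illustration of the principle, not the optical joint transform correlator the paper uses, and it omits the sub-pixel refinement needed for 0.2-pixel accuracy:

```python
import numpy as np

def shift_by_correlation(ref, cur):
    """Integer-pixel shift of `cur` relative to `ref` via FFT cross-correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(cur)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # shifts beyond half the frame wrap around; map them to negative values
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

Decomposing the returned vector into its along-track and cross-track components then separates the forward motion from the pendulum-swing motion, as the abstract describes.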

Li, Gang; Jia, Ping

2010-10-01

261

Advanced High-Speed Framing Camera Development for Fast, Visible Imaging Experiments  

SciTech Connect

The advances in high-voltage switching developed in this project allow a camera user to rapidly vary the number of output frames from 1 to 25. A high-voltage, variable-amplitude pulse train shifts the deflection location to the new frame location during the interlude between frames, making multiple frame counts and locations possible. The final deflection circuit deflects to five different frame positions per axis, including the center position, making for a total of 25 frames. To create the preset voltages, electronically adjustable ±500 V power supplies were chosen. Digital-to-analog converters provide digital control of the supplies. The power supplies are clamped to ±400 V so as not to exceed the voltage ratings of the transistors. A field-programmable gate array (FPGA) receives the trigger signal and calculates the combination of plate voltages for each frame. The interframe time and number of frames are specified by the user, but are limited by the camera electronics. The variable-frame circuit shifts the plate voltages of the first frame to those of the second frame during the user-specified interframe time. Designed around an electrostatic image tube, a framing camera images the light present during each frame (at the photocathode) onto the tube’s phosphor. The phosphor persistence allows the camera to display multiple frames on the phosphor at one time. During this persistence, a CCD camera is triggered and the analog image is collected digitally. The tube functions by converting photons to electrons at the negatively charged photocathode. The electrons move quickly toward the more positive charge of the phosphor. Two sets of deflection plates skew the electron’s path in horizontal and vertical (x axis and y axis, respectively) directions. Hence, each frame’s electrons bombard the phosphor surface at a controlled location defined by the voltages on the deflection plates. 
To prevent the phosphor from being exposed between frames, the image tube is gated off between exposures.
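The mapping from a frame number to a pair of deflection-plate voltages, five preset levels per axis giving 5 × 5 = 25 frame positions, can be sketched as follows. The voltage levels are illustrative values within the ±400 V clamp described above, not the instrument's actual presets:

```python
# Hypothetical preset deflection levels per axis (within the +/-400 V clamp).
LEVELS = (-400, -200, 0, 200, 400)

def plate_voltages(frame_index: int):
    """Map a frame number 0..24 to an (x_plate_volts, y_plate_volts) pair."""
    if not 0 <= frame_index < 25:
        raise ValueError("frame_index must be in 0..24")
    # walk the 5 x 5 grid row by row
    return LEVELS[frame_index % 5], LEVELS[frame_index // 5]
```

In the real system an FPGA performs this lookup and the variable-amplitude pulse train slews the plates to the next pair during the interframe interlude.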

Amy Lewis, Stuart Baker, Brian Cox, Abel Diaz, David Glass, Matthew Martin

2011-05-11

262

A portable device for small animal SPECT imaging in clinical gamma-cameras  

NASA Astrophysics Data System (ADS)

Molecular imaging is reshaping clinical practice in the last decades, providing practitioners with non-invasive ways to obtain functional in-vivo information on a diversity of relevant biological processes. The use of molecular imaging techniques in preclinical research is equally beneficial, but spreads more slowly due to the difficulties to justify a costly investment dedicated only to animal scanning. An alternative for lowering the costs is to repurpose parts of old clinical scanners to build new preclinical ones. Following this trend, we have designed, built, and characterized the performance of a portable system that can be attached to a clinical gamma-camera to make a preclinical single photon emission computed tomography scanner. Our system offers an image quality comparable to commercial systems at a fraction of their cost, and can be used with any existing gamma-camera with just an adaptation of the reconstruction software.

Aguiar, P.; Silva-Rodríguez, J.; González-Castaño, D. M.; Pino, F.; Sánchez, M.; Herranz, M.; Iglesias, A.; Lois, C.; Ruibal, A.

2014-07-01

263

Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras  

NASA Technical Reports Server (NTRS)

Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
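The geometric intuition behind the orthogonal two-camera arrangement can be sketched under a highly simplified model in which each ideal camera reports two of the three world coordinates; the real system's 3D camera calibration phase handles lens distortion and alignment. All names and the averaging of the shared coordinate are illustrative:

```python
# Idealised orthogonal geometry: the front camera (optical axis along Z)
# reports (x, y); the side camera (optical axis along X) reports (z, y).
def combine_orthogonal_views(front_xy, side_zy):
    """Combine two orthogonal 2-D observations into one 3-D point."""
    x, y_front = front_xy
    z, y_side = side_zy
    y = 0.5 * (y_front + y_side)  # y is observed by both cameras; average
    return x, y, z
```

Each tracked tracer particle, once matched between the two views (the stereo matching phase), yields one such 3-D point per frame, and differencing points across frames gives the velocity field.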

McDowell, Mark; Gray, Elizabeth

2004-01-01

264

Note: In vivo pH imaging system using luminescent indicator and color camera  

NASA Astrophysics Data System (ADS)

A microscopic in vivo pH imaging system is developed that can capture both luminescent and color images. The former gives a quantitative measurement of the pH distribution in vivo. The latter captures structural information that can be overlaid on the pH distribution to correlate the structure of a specimen with its pH distribution. By using a digital color camera, a luminescent image as well as a color image is obtained. The system uses HPTS (8-hydroxypyrene-1,3,6-trisulfonate) as a luminescent pH indicator for the luminescent imaging. Filter units mounted in the microscope extract two luminescent images for the excitation-ratio method. The ratio of the two images is converted to a pH distribution through an a priori pH calibration. An application of the system to epidermal cells of Lactuca sativa L. is shown.
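The excitation-ratio step can be sketched per pixel: the ratio of the two luminescent intensities cancels out dye concentration and illumination non-uniformity, and the a priori calibration maps the ratio to pH. The linear calibration below is a placeholder, not the actual HPTS response, which is not linear over the full range:

```python
def ratio_to_ph(i_ex1: float, i_ex2: float, slope: float = 1.0, offset: float = 5.0) -> float:
    """Map the two-excitation intensity ratio at one pixel to a pH value.

    slope/offset are placeholder calibration coefficients; real values
    come from imaging buffer solutions of known pH beforehand.
    """
    ratio = i_ex1 / max(i_ex2, 1e-9)  # guard against dark pixels
    return slope * ratio + offset
```

Applying this mapping over the full image pair yields the pH distribution that is then overlaid on the color image.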

Sakaue, Hirotaka; Dan, Risako; Shimizu, Megumi; Kazama, Haruko

2012-07-01

265

Image restoration of hybrid time delay and integration camera system with residual motion  

NASA Astrophysics Data System (ADS)

In this paper, we present a hybrid system for time delay and integration (TDI) image restoration. Images are degraded by residual motion, which distorts and blurs the TDI images during exposures and excludes the along-track scanning motion. The motion trajectory is estimated from the image sequence captured by an auxiliary high-speed camera. In order to make the estimated results less sensitive to the imaging conditions and noise, a new method based on cross-correlation is introduced for motion estimation. Then geometric distortion of the TDI image is removed by choosing the correct blurred block according to the central of the corresponding motion trajectory and the final image is restored row by row with the Richardson-Lucy algorithm. Simulated and experimental results are given to prove the effectiveness of our system.
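The Richardson-Lucy step can be sketched with a dense FFT-based implementation with circular boundary handling. The paper applies the algorithm row by row after distortion removal, so this whole-image 2-D version is only an illustration of the iteration itself:

```python
import numpy as np

def richardson_lucy(blurred, psf, iterations=30):
    """Richardson-Lucy deconvolution (circular boundaries, FFT convolution)."""
    # embed the PSF in a frame-sized array and centre it at the origin
    kernel = np.zeros_like(blurred)
    h, w = psf.shape
    kernel[:h, :w] = psf
    kernel = np.roll(kernel, (-(h // 2), -(w // 2)), axis=(0, 1))
    H = np.fft.fft2(kernel)
    estimate = np.full_like(blurred, 0.5)  # flat initial guess
    for _ in range(iterations):
        conv = np.fft.ifft2(H * np.fft.fft2(estimate)).real
        ratio = blurred / np.maximum(conv, 1e-12)      # avoid divide-by-zero
        # multiplicative update: correlate the ratio with the PSF
        estimate = estimate * np.fft.ifft2(np.conj(H) * np.fft.fft2(ratio)).real
    return estimate
```

In the hybrid system the PSF for each row is derived from the motion trajectory estimated by the auxiliary high-speed camera, so the blur kernel is known rather than estimated blindly.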

Zheng, Zhenzhen; Wu, Jiagu; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

2011-06-01

266

Engineering performance of IRIS2 infrared imaging camera and spectrograph  

NASA Astrophysics Data System (ADS)

IRIS2, the infrared imager and spectrograph for the Cassegrain focus of the Anglo Australian Telescope, has been in service since October 2001. IRIS2 incorporated many novel features, including multiple cryogenic multislit masks, a dual chambered vacuum vessel (the smaller chamber used to reduce thermal cycle time required to change sets of multislit masks), encoded cryogenic wheel drives with controlled backlash, a deflection compensating structure, and use of teflon impregnated hard anodizing for gear lubrication at low temperatures. Other noteworthy features were: swaged foil thermal link terminations, the pupil imager, the detector focus mechanism, phased getter cycling to prevent detector contamination, and a flow-through LN2 precooling system. The instrument control electronics was designed to allow accurate positioning of the internal mechanisms with minimal generation of heat. The detector controller was based on the AAO2 CCD controller, adapted for use on the HAWAII1 detector (1024 x 1024 pixels) and is achieving low noise and high performance. We describe features of the instrument design, the problems encountered and the development work required to bring them into operation, and their performance in service.

Churilov, Vladimir; Dawson, John; Smith, Greg A.; Waller, Lew; Whittard, John D.; Haynes, Roger; Lankshear, Allan; Ryder, Stuart D.; Tinney, Chris G.

2004-09-01

267

[The hyperspectral camera side-scan geometric imaging in any direction considering the spectral mixing].  

PubMed

In order to correct the image distortion in hyperspectral camera side-scan geometric imaging, an image pixel georeferencing algorithm is derived in detail in the present paper, suitable for a linear push-broom camera side-scanning on the ground in any direction. It takes the orientation of objects in the navigation coordinate system into account. Combined with the ground sampling distance of the georeferenced image and the area of push-broom imaging, the general process of dividing the georeferenced image into grids is also presented: the new image rows and columns are obtained by dividing the georeferenced image area by the ground sampling distance. Considering the error produced by the rounding rule when the pixel grids are generated, and the spectral mixing caused by the traditional direct spectral sampling method in image correction, an improved spectral sampling method based on weighted fusion is proposed. It takes the area proportions of the adjacent pixels within a newly generated pixel as coefficients, and the coefficients are then normalized to avoid spectral overflow, so the new pixel's spectrum is combined from the spectra of the adjacent georeferenced pixels. Finally, a number of push-broom imaging experiments were carried out on the ground, and the distorted images were corrected according to the algorithm proposed above. The results show that the linear image distortion correction algorithm is valid and robust. At the same time, multiple samples were selected in the corrected images to verify the spectral data. The results indicate that the improved spectral sampling method is better than the direct spectral sampling algorithm. It provides a reference for the application of similar products on the ground. PMID:25269321
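The weighted-fusion resampling idea can be sketched as follows: a new georeferenced pixel's spectrum is the area-weighted mix of the source pixels that overlap it, with the weights normalized so the spectrum cannot overflow. The overlap areas are assumed inputs here; computing them is part of the gridding step:

```python
def fuse_spectra(spectra, areas):
    """Area-weighted mix of overlapping source spectra.

    spectra: list of per-band intensity lists, one per overlapping source pixel.
    areas:   overlap area each source pixel contributes to the new pixel.
    """
    total = sum(areas)
    weights = [a / total for a in areas]  # normalised coefficients, sum to 1
    n_bands = len(spectra[0])
    return [sum(w * s[b] for w, s in zip(weights, spectra))
            for b in range(n_bands)]
```

Because the weights sum to one, each output band stays within the range of its inputs, which is the overflow guarantee the abstract mentions.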

Wang, Shu-Min; Zhang, Ai-Wu; Hu, Shao-Xing; Sun, Wei-Dong

2014-07-01

268

Miniature indium gallium arsenide short-wave infrared camera for unattended imaging applications  

NASA Astrophysics Data System (ADS)

Indium gallium arsenide (InGaAs) focal plane arrays and cameras have demonstrated significant potential in battlefield applications. Room temperature detectivities, D*, in excess of 10^14 cm·√Hz/W have enabled night vision imaging under low light level conditions. The 0.9 μm to 1.7 μm wavelength band allows the use of eye-safe lasers for target designation and covert active illumination. We report here a miniature InGaAs camera designed for unattended ground sensor and robot-mounted applications. The camera is approximately the size of a D-cell battery, weighs less than 200 g, has a 320 × 240 pixel spatial resolution and maintains D* > 10^14 cm·√Hz/W. The miniature camera is fully self contained. The only input is DC power (3.6 V). The camera has both analog (RS170) and 12-bit digital (LVDS) video outputs. It is intended as a demonstration vehicle for battlefield distributed robotic vision but will find use in other applications as an unattended sensor or rifle sight.

Cohen, Marshall J.; O'Grady, Matthew T.; Vermaak, Jacobus S.; Groppe, Joseph V.; Olsen, Gregory H.

2000-07-01

269

Optimal design of Anger camera for bremsstrahlung imaging: Monte Carlo evaluation.  

PubMed

A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent <15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy. Monte Carlo (MC) simulations of energy spectra showed that a camera based on a 30-mm-thick BGO crystal and equipped with a high energy pinhole collimator is well-adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor 10 versus a conventional NaI camera equipped with a high energy parallel hole collimator enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, shorter acquisition time and reduced orbit range will be usable allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast single photon emission computed tomography could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected. Further long running time MC simulations of realistic acquisitions will allow assessing the quantification capability of such system. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT-BGO blocks coming from a retired PET system is currently under design for further evaluation. PMID:24982849

Walrand, Stephan; Hesse, Michel; Wojcik, Randy; Lhommel, Renaud; Jamar, François

2014-01-01

270

Optimal Design of Anger Camera for Bremsstrahlung Imaging: Monte Carlo Evaluation  

PubMed Central

A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent <15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy. Monte Carlo (MC) simulations of energy spectra showed that a camera based on a 30-mm-thick BGO crystal and equipped with a high energy pinhole collimator is well-adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor 10 versus a conventional NaI camera equipped with a high energy parallel hole collimator enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, shorter acquisition time and reduced orbit range will be usable allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast single photon emission computed tomography could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected. Further long running time MC simulations of realistic acquisitions will allow assessing the quantification capability of such system. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT–BGO blocks coming from a retired PET system is currently under design for further evaluation. PMID:24982849

Walrand, Stephan; Hesse, Michel; Wojcik, Randy; Lhommel, Renaud; Jamar, François

2014-01-01

271

Simultaneous Images of the Northern and Southern Auroras with the Earth Camera on the Polar Spacecraft  

NASA Astrophysics Data System (ADS)

During the period 22 October through late November, 2001, the Polar spacecraft was fortuitously positioned near Earth's equatorial plane such that the wide field-of-view of its far-ultraviolet camera was able to capture simultaneous images of the auroras in the Northern and Southern Hemispheres. This Earth Camera provided sufficient resolution of auroral features that the luminosities could be mapped into corrected geomagnetic coordinates for the first time. A few such images were previously available from the Dynamics Explorer-1 spacecraft, but the viewing aspect and the spatial resolution of the camera were insufficient to map the luminosities onto the Earth's surface. The viewing period for simultaneous imaging of the two auroras with the Polar spacecraft was about one hour during the spacecraft orbital period of 18.3 hours. The results of such imaging were exciting in two important areas of auroral research. These areas were the behavior of the auroral luminosities during substorm activity and the mapping of magnetic field lines from auroral altitudes into the equatorial magnetosphere. One of the principal results concerning substorms was the unequal auroral intensities between the two hemispheres during the onset and the expansive phase of substorms which imply strong coupling between the ionosphere and magnetosphere during these events. With regard to the mapping of field lines it is found that further work on the global magnetic models is required in order to provide accuracies of 1 RE earthward of radial distances of 20 RE.

Frank, L. A.; Sigwarth, J. B.

2002-05-01

272

New design of a gamma camera detector with reduced edge effect for breast imaging  

NASA Astrophysics Data System (ADS)

In recent years, there has been a growing interest in developing small gamma cameras dedicated to breast imaging. We designed a new detector with a trapezoidal shape to expand the field of view (FOV) of the camera without increasing its dimensions. To find optimal parameters, images of point sources at the edge area, as functions of the trapezoidal angle and the optical treatment of the crystal side surface, were simulated using DETECT2000. Our detector employs monolithic CsI(Tl) with dimensions of 48.0×48.0×6.0 mm³ coupled to an array of photo-sensors. The side surfaces of the crystal were treated with three different surface finishes: black absorber, metal reflector and white reflector. The trapezoidal angle varied from 45° to 90° in steps of 15°. Gamma events were generated at 15 evenly spaced points with 1.0 mm spacing along the X-axis, starting 1.0 mm away from the side surface. Ten thousand gamma events were simulated at each location and images were formed by applying the Anger logic. The results demonstrated that all 15 points could be identified only for the crystal with a trapezoidal shape having a 45° angle and a white reflector on the side surface. In conclusion, our new detector proved to be a reliable design to expand the FOV of a small gamma camera for breast imaging.
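The Anger logic used to form the simulated images is simply a signal-weighted centroid over the photosensor array. A minimal sketch, in which the sensor coordinates are assumed inputs rather than the paper's actual layout:

```python
def anger_position(signals, xs, ys):
    """Estimate the scintillation position as the signal-weighted centroid.

    signals: light amplitude seen by each photosensor.
    xs, ys:  centre coordinates of each photosensor (assumed known).
    """
    total = sum(signals)
    x = sum(s * xi for s, xi in zip(signals, xs)) / total
    y = sum(s * yi for s, yi in zip(signals, ys)) / total
    return x, y
```

Events near the crystal edge are where this centroid estimate breaks down (light is truncated on one side), which is exactly the edge effect the trapezoidal crystal and white side reflector are designed to mitigate.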

Yeon Hwang, Ji; Lee, Seung-Jae; Baek, Cheol-Ha; Hyun Kim, Kwang; Hyun Chung, Yong

2011-05-01

273

Influence of electron dose rate on electron counting images recorded with the K2 camera.  

PubMed

A recent technological breakthrough in electron cryomicroscopy (cryoEM) is the development of direct electron detection cameras for data acquisition. By bypassing the traditional phosphor scintillator and fiber optic coupling, these cameras have greatly enhanced sensitivity and detective quantum efficiency (DQE). Of the three currently available commercial cameras, the Gatan K2 Summit was designed specifically for counting individual electron events. Counting further enhances the DQE, allows for practical doubling of detector resolution and eliminates noise arising from the variable deposition of energy by each primary electron. While counting has many advantages, undercounting of electrons happens when more than one electron strikes the same area of the detector within the analog readout period (coincidence loss), which influences image quality. In this work, we characterized the K2 Summit in electron counting mode, and studied the relationship of dose rate and coincidence loss and its influence on the quality of counted images. We found that coincidence loss reduces low frequency amplitudes but has no significant influence on the signal-to-noise ratio of the recorded image. It also has little influence on high frequency signals. Images of frozen hydrated archaeal 20S proteasome (~700 kDa, D7 symmetry) recorded at the optimal dose rate retained both high-resolution signal and low-resolution contrast and enabled calculating a 3.6 Å three-dimensional reconstruction from only 10,000 particles. PMID:23968652
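The dose-rate dependence of coincidence loss can be sketched with a simple Poisson model (an illustrative model, not Gatan's calibration): a counting pixel reports at most one event per readout frame, so for a mean dose of `lam` electrons per pixel per frame the expected measured rate is 1 - exp(-lam) counts per frame:

```python
import math

def counted_fraction(lam: float) -> float:
    """Fraction of incident electrons actually counted under Poisson arrivals.

    lam: mean dose in electrons per pixel per readout frame.
    """
    return (1.0 - math.exp(-lam)) / lam
```

At 0.5 electrons/pixel/frame this model predicts roughly a fifth of the electrons are lost to coincidence, while at 0.01 the loss is well under one percent, which is the basic trade-off behind choosing an optimal dose rate per frame.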

Li, Xueming; Zheng, Shawn Q; Egami, Kiyoshi; Agard, David A; Cheng, Yifan

2013-11-01

274

High-precision investigations of the fast range imaging camera SwissRanger  

NASA Astrophysics Data System (ADS)

Many applications need fast measurement systems that capture their environment in three dimensions. Adequate measurement sensors are required that provide fast, accurate, and reliable 3-D data. Automotive applications demand real-time, reliable data, not only for driving assistance systems but also for safety. Until now, most solutions, like multi-image photogrammetry, radar sensors or laser scanners, fall short in at least one of these aspects. With the upcoming range imaging cameras, new sensors with a performance never seen before are to be taken into consideration. Range imaging has already proved to be an emerging technology for automotive applications. These cameras provide a distance measurement in each pixel and therefore produce 3-D data at up to video frame rates with a single sensor. But because of their new measurement concept, classical calibration approaches cannot be used. This paper presents results of research on the calibration of the SwissRanger™, a range imaging camera introduced by CSEM Switzerland. Special emphasis is given to determining the influence of diverse parameters on the distance measurement accuracy, for example the temperature, the reflectivity, and the distance itself. The influences are represented as functional dependencies in order to reach high accuracy of the system. Temperature compensation by means of a specialized setup is addressed, and a successful implementation of temperature drift compensation by means of a differential setup is presented.

Kahlmann, T.; Ingensand, H.

2007-09-01

275

Influence of electron dose rate on electron counting images recorded with the K2 camera  

PubMed Central

A recent technological breakthrough in electron cryomicroscopy (cryoEM) is the development of direct electron detection cameras for data acquisition. By bypassing the traditional phosphor scintillator and fiber optic coupling, these cameras have greatly enhanced sensitivity and detective quantum efficiency (DQE). Of the three currently available commercial cameras, the Gatan K2 Summit was designed specifically for counting individual electron events. Counting further enhances the DQE, allows for practical doubling of detector resolution and eliminates noise arising from the variable deposition of energy by each primary electron. While counting has many advantages, undercounting of electrons happens when more than one electron strikes the same area of the detector within the analog readout period (coincidence loss), which influences image quality. In this work, we characterized the K2 Summit in electron counting mode, and studied the relationship of dose rate and coincidence loss and its influence on the quality of counted images. We found that coincidence loss reduces low frequency amplitudes but has no significant influence on the signal-to-noise ratio of the recorded image. It also has little influence on high frequency signals. Images of frozen hydrated archaeal 20S proteasome (~700 kDa, D7 symmetry) recorded at the optimal dose rate retained both high-resolution signal and low-resolution contrast and enabled calculating a 3.6 Å three-dimensional reconstruction from only 10,000 particles. PMID:23968652

Li, Xueming; Zheng, Shawn Q.; Egami, Kiyoshi; Agard, David A.; Cheng, Yifan

2013-01-01
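The dose-rate trade-off described in the record above can be illustrated with a toy Poisson arrival model: if at most one electron event can be registered per pixel per internal readout frame, the counted fraction of incident electrons is (1 − e^−λ)/λ for a mean of λ electrons per pixel per frame. The 400 fps frame rate and single-pixel assumption below are simplifications for illustration, not the K2's actual counting pipeline:

```python
import math

def counted_fraction(dose_rate_eps, frame_rate_fps=400.0):
    """Fraction of incident electrons actually counted when at most one
    event per pixel can be registered per readout frame (coincidence loss).

    dose_rate_eps: incident electrons per pixel per second.
    Assumes Poisson arrivals; lam is the mean electrons per pixel per frame.
    """
    lam = dose_rate_eps / frame_rate_fps
    if lam == 0:
        return 1.0
    # P(at least one arrival) / expected arrivals
    return (1.0 - math.exp(-lam)) / lam

for rate in (1, 5, 10, 20):
    print(f"{rate:>2} e-/pixel/s -> {counted_fraction(rate):.3f} counted")
```

The counted fraction approaches 1 at low dose rates and falls off as the dose rate grows, which is the motivation for the "optimal dose rate" in the abstract.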

276

Real Time Speed Estimation of Moving Vehicles from Side View Images from an Uncalibrated Video Camera  

PubMed Central

To estimate the speed of a moving vehicle from side-view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found from frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle is selected, and these points must be tracked accurately across at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system, with displacements measured in pixel units. The magnitudes of these vectors must then be transformed from image space to object space to obtain their absolute values. This transformation requires an image-to-object-space mapping, in the mathematical sense, which is obtained from the calibration and orientation parameters of the video frame images. This paper presents proposed solutions to these problems of using side-view camera images. PMID:22399909

Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

2010-01-01
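The two-step procedure in the record above (track points, then scale pixel displacements by elapsed time) can be sketched as follows; the single `metres_per_pixel` scale factor is a stand-in for the full calibration and orientation parameters the paper uses:

```python
import numpy as np

def estimate_speed(pts_frame1, pts_frame2, dt, metres_per_pixel):
    """Estimate vehicle speed from points tracked across two frames.

    pts_frame1, pts_frame2: (N, 2) pixel coordinates of the same points.
    dt: time between frames (s). metres_per_pixel: assumed scalar
    image-to-object scale for a fronto-parallel side view.
    """
    p1 = np.asarray(pts_frame1, dtype=float)
    p2 = np.asarray(pts_frame2, dtype=float)
    disp_px = p2 - p1                      # displacement vectors in pixels
    v_px = disp_px / dt                    # pixel-space velocity vectors
    speeds = np.linalg.norm(v_px, axis=1) * metres_per_pixel
    return speeds.mean()                   # average over tracked points

# 25 fps video, 2 cm/pixel ground scale, 10 px displacement per frame
p1 = [(100, 200), (140, 205), (180, 210)]
p2 = [(110, 200), (150, 205), (190, 210)]
print(estimate_speed(p1, p2, 1 / 25, 0.02))  # 10 px * 25 fps * 0.02 m/px
```

Averaging over several tracked points, as the abstract suggests, damps individual tracking errors.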

277

Striping Noise Removal of Images Acquired by Cbers 2 CCD Camera Sensor  

NASA Astrophysics Data System (ADS)

The CCD Camera is a multispectral sensor carried by the CBERS 2 satellite that images in push-broom mode. Images acquired by the CCD Camera exhibit vertical striping noise caused by detector mismatch, inter-detector variability, improper detector calibration, and low signal-to-noise ratio. These stripes are more pronounced in images of homogeneous surfaces processed at level 2, and their presence makes interpreting the data and extracting information from these images difficult. In this work, a spatial moment matching method is proposed to correct these images. In this method, statistical moments such as the mean and standard deviation of the columns in each band are used to balance the statistical characteristics of the detector array against reference values. After this correction, some periodic diagonal stripes remain in the image that the method cannot remove; to eliminate them, a frequency-domain Butterworth notch filter was applied. Finally, the results were evaluated using image statistical moments such as the mean and standard deviation. The study demonstrates the effectiveness of the method in noise removal.

Amraei, E.; Mobasheri, M. R.

2014-10-01
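The spatial moment-matching step described in the record above can be sketched as below (the Butterworth notch filtering stage is omitted); using the whole-band mean and standard deviation as the reference values is an assumption for illustration:

```python
import numpy as np

def moment_match_columns(band, ref_mean=None, ref_std=None):
    """Remove vertical striping by matching each column's mean and standard
    deviation (one detector per column in a push-broom image) to reference
    values. By default the reference is the whole-band mean and std."""
    band = band.astype(float)
    col_mean = band.mean(axis=0)
    col_std = band.std(axis=0)
    col_std[col_std == 0] = 1.0            # guard against flat columns
    if ref_mean is None:
        ref_mean = band.mean()
    if ref_std is None:
        ref_std = band.std()
    # standardize each column, then rescale to the reference moments
    return (band - col_mean) / col_std * ref_std + ref_mean
```

After the transform every column shares the same first two moments, so constant per-detector gain and offset errors (the vertical stripes) cancel.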

278

Estimating information from image colors: an application to digital cameras and natural scenes.  

PubMed

The colors present in an image of a scene provide information about its constituent elements. But the amount of information depends on the imaging conditions and on how information is calculated. This work had two aims. The first was to derive explicit estimators of the information available and the information retrieved from the color values at each point in images of a scene under different illuminations. The second was to apply these estimators to simulations of images obtained with five sets of sensors used in digital cameras and with the cone photoreceptors of the human eye. Estimates were obtained for 50 hyperspectral images of natural scenes under daylight illuminants with correlated color temperatures of 4,000, 6,500, and 25,000 K. Depending on the sensor set, the mean estimated information available across images with the largest illumination difference varied from 15.5 to 18.0 bits, and the mean estimated information retrieved after optimal linear processing varied from 13.2 to 15.5 bits (each about 85 percent of the corresponding information available). With the best sensor set, 390 percent more points could be identified per scene than with the worst. Capturing scene information from image colors depends crucially on the choice of camera sensors. PMID:22450817

Marín-Franch, Iván; Foster, David H

2013-01-01

279

Real time speed estimation of moving vehicles from side view images from an uncalibrated video camera.  

PubMed

To estimate the speed of a moving vehicle from side-view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found from frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle is selected, and these points must be tracked accurately across at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system, with displacements measured in pixel units. The magnitudes of these vectors must then be transformed from image space to object space to obtain their absolute values. This transformation requires an image-to-object-space mapping, in the mathematical sense, which is obtained from the calibration and orientation parameters of the video frame images. This paper presents proposed solutions to these problems of using side-view camera images. PMID:22399909

Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

2010-01-01

280

Imaging Observations of Thermal Emissions from Augustine Volcano Using a Small Astronomical Camera  

USGS Publications Warehouse

Long-exposure visible-light images of Augustine Volcano were obtained using a charge-coupled device (CCD) camera during several nights of the 2006 eruption. The camera was located 105 km away, at Homer, Alaska, yet showed persistent bright emissions from the north flank of the volcano corresponding to steam releases, pyroclastic flows, and rockfalls originating near the summit. The apparent brightness of the emissions substantially exceeded that of the background nighttime scene. The bright signatures in the images are shown to probably be thermal emissions detected near the long-wavelength limit (~1 μm) of the CCD. Modeling of the emissions as a black-body brightness yields an apparent temperature of 400 to 450 degrees C that likely reflects an unresolved combination of emissions from hot ejecta and cooler material.

Sentman, Davis D.; McNutt, Stephen R.; Stenbaek-Nielsen, Hans C.; Tytgat, Guy; DeRoin, Nicole

2010-01-01
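A quick Planck's-law calculation illustrates why 400 to 450 degree C ejecta can dominate the ambient nighttime scene near the CCD's ~1 μm long-wavelength limit; the 425 and 20 degree C temperatures below are assumed round numbers for illustration, not values from the paper:

```python
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance (W sr^-1 m^-3) from Planck's law."""
    a = 2.0 * H * C**2 / wavelength_m**5
    x = H * C / (wavelength_m * K * temp_k)
    return a / math.expm1(x)

lam = 1.0e-6  # ~1 um, near the CCD long-wavelength cutoff
hot = planck_radiance(lam, 425 + 273.15)   # mid-range ejecta temperature
cool = planck_radiance(lam, 20 + 273.15)   # ambient terrain
print(f"hot/ambient radiance ratio at 1 um: {hot / cool:.3g}")
```

The exponential in Planck's law makes the 1 μm radiance of the hot material many orders of magnitude brighter than ambient-temperature terrain, consistent with the bright signatures reported.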

281

Crack propagation imaging by the ISIS camera and a video trigger system  

Microsoft Academic Search

An ultra-high-speed camera operating at 1 Mfps was applied to visualize crack propagation. The change of the stress field around the propagating crack tip was captured as a change of the fringe pattern by means of the photo-elastic imaging technique. A newly developed video trigger system is employed to detect the occurrence of crack propagation as a trigger in the experiment. The

Tomoo Okinaka; Pavel Karimov; Takeharu Etoh; Kenji Oguni

2007-01-01

282

Experiences Supporting the Lunar Reconnaissance Orbiter Camera: the Devops Model  

NASA Astrophysics Data System (ADS)

Introduction: The Lunar Reconnaissance Orbiter Camera (LROC) Science Operations Center (SOC) is responsible for instrument targeting, product processing, and archiving [1]. The LROC SOC maintains over 1,000,000 observations with over 300 TB of released data. Processing challenges compound with the acquisition of over 400 Gbits of observations daily, creating the need for a robust, efficient, and reliable suite of specialized software. Development Environment: The LROC SOC's software development methodology has evolved over time. Today, the development team operates in close cooperation with the systems administration team in a model known in the IT industry as DevOps. The DevOps model enables a highly productive development environment that facilitates accomplishment of key goals within tight schedules [2]. The LROC SOC DevOps model incorporates industry best practices including prototyping, continuous integration, unit testing, code coverage analysis, version control, and utilizing existing open source software. Scientists and researchers at LROC often prototype algorithms and scripts in a high-level language such as MATLAB or IDL. After the prototype is functionally complete, the solution is implemented as production-ready software by the developers. Following this process ensures that all controls and requirements set by the LROC SOC DevOps team are met. The LROC SOC also strives to enhance the efficiency of the operations staff by way of weekly presentations and informal mentoring. Many small scripting tasks are assigned to the cognizant operations personnel (end users), allowing the DevOps team to focus on more complex and mission-critical tasks. In addition to leveraging open source software, the LROC SOC has also contributed to the open source community by releasing Lunaserv [3]. Findings: The DevOps software model very efficiently provides smooth software releases and maintains team momentum.
Having scientists prototype their own work has proven very efficient, as developers do not need to spend time iterating over small changes. Instead, these changes are realized in early prototypes and implemented before the task reaches the developers. The development practices followed by the LROC SOC DevOps team help facilitate the high level of software quality that is necessary for LROC SOC operations. Application to the Scientific Community: There is no replacement for having software developed by professional developers. While it is beneficial for scientists to write software, this activity should be seen as prototyping, which is then made production-ready by professional developers. When constructed properly, even a small development team can increase the rate of software development for a research group while creating more efficient, reliable, and maintainable products. This strategy allows scientists to accomplish more, focusing on teamwork rather than software development, which may not be their primary focus. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124. [2] DeGrandis (2011) Cutter IT Journal, Vol. 24, No. 8, 34-39. [3] Estes, N. M.; Hanger, C. D.; Licht, A. A.; Bowman-Cisneros, E., Lunaserv Web Map Service: History, Implementation Details, Development, and Uses, http://adsabs.harvard.edu/abs/2013LPICo1719.2609E.

Licht, A.; Estes, N. M.; Bowman-Cisneros, E.; Hanger, C. D.

2013-12-01

283

The Simultaneous Quad-Color Infrared Imaging Device (SQIID) - A leap forward in infrared cameras for astronomy  

Microsoft Academic Search

The Simultaneous Quad-Color Infrared Imaging Device (SQIID) is the first of a new generation of infrared instruments to be put into service at the Kitt Peak National Observatory (KPNO). The camera has been configured to be modular in design and to accept new innovations in detector format as they become available. Currently the camera is equipped with four 256 x

Timothy Ellis; Raleigh Drake; A. M. Fowler; Ian Gatley; Jerry Heim; Roger Luce; K. M. Merrill; Ron Probst; Nick Buchholz

1993-01-01

284

Image and Depth from a Conventional Camera with a Coded Aperture  

E-print Network

[Excerpt] A layered depth map is extracted from images captured with the modified camera, requiring user-drawn strokes to clarify layer assignments in some cases. The resulting sharp image and layered depth map can be combined for various applications. [Figure caption fragment: Recovered depth map with color indicating depth from camera (cm) (in this case, without user ...)]

Levin, Anat; Fergus, Rob; Durand, Frédo; Freeman, William T.

285

Feasibility of monitoring patient motion with opposed stereo infrared cameras during supine medical imaging  

NASA Astrophysics Data System (ADS)

Patient motion during single photon emission computed tomographic (SPECT) acquisition causes inconsistent projection data and reconstruction artifacts which can significantly affect diagnostic accuracy. We have investigated use of the Polaris stereo infrared motion-tracking system to track the 6-degrees-of-freedom (6-DOF) motion of spherical reflectors (markers) mounted on stretchy bands about the patient's chest and abdomen during cardiac SPECT imaging. The marker position information, obtained by opposed stereo infrared-camera systems, requires processing to correctly record tracked markers and to map the Polaris co-ordinate data into the SPECT co-ordinate system. One stereo camera views the markers from the patient's head direction, and the other from the patient's foot direction. Opposed cameras are needed to overcome anatomical and geometrical limitations which sometimes prevent all markers from being seen by a single stereo camera. Both sets of marker data are required to compute the rotational and translational 6-DOF motion of the patient, which ultimately will be used for SPECT patient-motion correction. The processing utilizes an algorithm that least-squares fits two 3-D point sets to each other using singular value decomposition (SVD), yielding the rotation matrix and the translation of the rigid-body centroid. We have previously demonstrated the ability to monitor multiple markers for twelve patients viewing from the foot end, and employed a neural network to separate the periodic respiratory component of marker motion from aperiodic body motion. We plan to initiate routine 6-DOF tracking of patient motion during SPECT imaging in the future, and herein evaluate the feasibility of employing opposed stereo cameras.

Beach, Richard D.; McNamara, Joseph E.; Terlecki, George; King, Michael A.

2006-10-01
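The SVD-based least-squares fit of two 3-D point sets mentioned in the record above is the classic centroid-plus-SVD (Kabsch-style) procedure; a minimal sketch, with the standard reflection guard included:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid-body fit of point set P onto Q via SVD.

    P, Q: (N, 3) corresponding marker positions.
    Returns R (3x3) and t (3,) such that Q ~= P @ R.T + t.
    """
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)       # centroids
    H = (P - cp).T @ (Q - cq)                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                            # optimal rotation
    t = cq - R @ cp                               # centroid translation
    return R, t
```

Given marker positions before and after patient motion, the returned rotation and translation are exactly the 6-DOF correction described in the abstract.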

286

Digital image georeferencing from a multiple camera system by GPS/INS  

NASA Astrophysics Data System (ADS)

In this paper, the development and testing of an airborne fully digital multi-sensor system for digital mapping data acquisition is presented. The system acquires two streams of data, namely navigation (georeferencing) data and imaging data. The navigation data are obtained by integrating an accurate strapdown inertial navigation system with a differential GPS (DGPS) system. The imaging data are acquired by two low-cost digital cameras, configured so as to reduce their geometric limitations. The two cameras capture strips of overlapping nadir and oblique images. The GPS/INS-derived trajectory contains the full translational and rotational motion of the carrier aircraft; thus, image exterior orientation information is extracted from the trajectory during post-processing. This approach eliminates the need for ground control points (GCPs) when computing the 3D positions of objects that appear in the field of view of the system's imaging component. Two approaches for calibrating the system are presented, namely terrestrial calibration and in-flight calibration. Test flights were conducted over the campus of The University of Calgary. Testing showed that the best ground point positioning accuracy at 1:12,000 average image scale is 0.2 m (RMS) in easting and northing and 0.3 m (RMS) in height. Preliminary results indicate that the major future applications of such a system are in digital mapping at scales of 1:5000 and smaller, and in the generation of digital elevation models for engineering applications.

Mostafa, Mohamed M. R.; Schwarz, Klaus-Peter

287

Digital optical microscope as a sampled imaging system: mathematical description, camera selection, and focusing  

NASA Astrophysics Data System (ADS)

The paper presents a mathematical description of an optical microscope with a digital camera and image processing as an analog-digital-analog imaging system. This description treats a channel of the microscope as a sequence of linear spatial filters acting on two-dimensional signals. The channel contains an optical system acting as a low-frequency analog filter; a digital camera acting as a low-frequency analog filter with spatial and amplitude discretization and noise generation; a digital linear filter that amplifies the high-frequency harmonics; and a restoration unit that plays the role of a two-dimensional interpolator. This mathematical apparatus is useful for the proper selection of a digital camera that guarantees the maximal field of view without image distortion. Terms such as optimal, insufficient, and useless (void) linear magnification of a microscope optical system are extended from visual microscopy to digital microscopy. The description is also applied to the selection of a digital filter for focusing and digital focus extension. The modulation transfer function of this filter should match the spatial spectrum of the observed objects in the zone of spatial harmonics most sensitive to defocusing; in this case, maximal sensitivity to defocusing can be reached while minimizing the influence of noise.

Borovytsky, Volodymyr N.

2012-10-01
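The distinction between optimal and void magnification in the record above can be illustrated with a simple Nyquist-matching calculation. This back-of-envelope formula (Rayleigh criterion, assumed 0.55 μm wavelength and 0.95 NA) is an illustration, not the paper's exact formulation:

```python
def optimal_magnification(pixel_pitch_um, wavelength_um=0.55, na=0.95):
    """Magnification at which the camera samples the optical resolution
    limit at the Nyquist rate (two pixels per resolved period).

    Rayleigh resolution r = 0.61 * lambda / NA in the object plane;
    magnification well beyond this value is 'void' (empty) magnification,
    while much less undersamples the optics.
    """
    r = 0.61 * wavelength_um / na          # object-plane resolution (um)
    return 2.0 * pixel_pitch_um / r        # project pixel pitch to object

print(f"optimal M for 3.45 um pixels: {optimal_magnification(3.45):.1f}x")
```

Larger pixels demand proportionally higher magnification before the sensor, rather than the optics, limits resolution.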

288

Gamma camera imaging for studying intestinal absorption and whole-body distribution of selenomethionine.  

PubMed

Se metabolism in humans is not well characterised. Currently, the estimates of Se absorption, whole-body retention and excretion are being obtained from balance and tracer studies. In the present study, we used gamma camera imaging to evaluate the whole-body retention and distribution of radiolabelled selenomethionine (SeMet), the predominant form of Se present in foods. A total of eight healthy young men participated in the study. After consumption of a meal containing 4 MBq [75Se]L-SeMet ([75Se]SeMet), whole-body gamma camera scanning was performed for 45 min every hour over a 6 h period, every second hour for the next 18 h and once on each of the subsequent 6 d. Blood, urine and faecal samples were collected to determine the plasma content of [75Se]SeMet as well as its excretion in urine and faeces. Imaging showed that 87·9 (sd 3·3)% of the administered activity of [75Se]SeMet was retained within the body after 7 d. In contrast, the measured excretion in urine and faeces for the 7 d period was 8·2 (sd 1·1)% of the activity. Time-activity curves were generated for the whole body, stomach, liver, abdomen (other than the stomach and the liver), brain and femoral muscles. Gamma camera imaging allows for the assessment of the postprandial absorption of SeMet. This technique may also permit concurrent studies of organ turnover of SeMet. PMID:23930999

Madsen, Jan L; Sjögreen-Gleisner, Katarina; Elema, Dennis R; Søndergaard, Lasse R; Rasmussen, Palle; Fuglsang, Stefan; Ljungberg, Michael; Damgaard, Morten

2014-02-01

289

Compact CdZnTe-based gamma camera for prostate cancer imaging  

NASA Astrophysics Data System (ADS)

In this paper, we discuss the design of a compact gamma camera for high-resolution prostate cancer imaging using Cadmium Zinc Telluride (CdZnTe or CZT) radiation detectors. Prostate cancer is a common disease in men. Nowadays, a blood test measuring the level of prostate specific antigen (PSA) is widely used for screening for the disease in males over 50, followed by (ultrasound) imaging-guided biopsy. However, PSA tests have a high false-positive rate, and ultrasound-guided biopsy has a high likelihood of missing small cancerous tissues. Established methods of nuclear medical imaging, e.g. PET and SPECT, can functionally image the organs and potentially find cancerous tissue at early stages, but their application to diagnosing prostate cancer has been limited by the smallness of the prostate gland and the long working distance between the organ and the detectors comprising these imaging systems. CZT is a semiconductor material with a wide band gap and relatively high electron mobility, and thus can operate at room temperature without additional cooling. CZT detectors are photon-electron direct-conversion devices, thus offering high energy resolution in detecting gamma rays, enabling energy-resolved imaging, and reducing the background of Compton-scattering events. In addition, CZT material has high stopping power for gamma rays; for medical imaging, a few millimetres of CZT provide adequate detection efficiency for many SPECT radiotracers. Because of these advantages, CZT detectors are becoming popular for several SPECT medical-imaging applications. Most recently, we designed a compact gamma camera using CZT detectors coupled to an application-specific integrated circuit (ASIC). This camera functions as a trans-rectal probe to image the prostate gland from a distance of only 1-5 cm, thus offering higher detection efficiency and higher spatial resolution. Hence, it potentially can detect prostate cancers at their early stages.
The performance tests of this camera have been completed. The results show better than 6-mm resolution at a distance of 1 cm. Details of the test results are discussed in this paper.

Cui, Yonggang; Lall, Terry; Tsui, Benjamin; Yu, Jianhua; Mahler, George; Bolotnikov, Aleksey; Vaska, Paul; De Geronimo, Gianluigi; O'Connor, Paul; Meinken, George; Joyal, John; Barrett, John; Camarda, Giuseppe; Hossain, Anwar; Kim, Ki Hyun; Yang, Ge; Pomper, Marty; Cho, Steve; Weisman, Ken; Seo, Youngho; Babich, John; LaFrance, Norman; James, Ralph B.

2011-06-01

290

Diffuse reflection imaging of sub-epidermal tissue haematocrit using a simple RGB camera  

NASA Astrophysics Data System (ADS)

This paper describes the design and evaluation of a novel, easy-to-use tissue viability imaging system (TiVi). The system is based on the methods of diffuse reflectance spectroscopy and polarization spectroscopy. The technique has been developed as an alternative to current imaging technologies in microcirculation imaging, most notably optical coherence tomography (OCT) and laser Doppler perfusion imaging (LDPI). The system is based on standard digital camera technology and is sensitive to red blood cells (RBCs) in the microcirculation. The lack of clinical acceptance of both OCT and LDPI fuels the need for an objective, simple, reproducible, and portable imaging method that can provide accurate measurements related to stimulus vasoactivity in the microvasculature; the limitations of these technologies are discussed in this paper. Uses of the tissue viability system include skin-care product and drug development, and the assessment of spatial and temporal aspects of vasodilation (erythema) and vasoconstriction (blanching).

Leahy, Martin J.; O'Doherty, Jim; McNamara, Paul; Henricson, Joakim; Nilsson, Gert E.; Anderson, Chris; Sjoberg, Folke

2007-05-01

291

Advances In The Image Sensor: The Critical Element In The Performance Of Cameras  

NASA Astrophysics Data System (ADS)

Digital imaging technology and digital imaging products are advancing at a rapid pace, and the progress of digital cameras has been particularly impressive. Image sensors now have smaller pixel size, a greater number of pixels, higher sensitivity, lower noise, and a higher frame rate. Picture resolution is a function of the number of pixels of the image sensor: the more pixels there are, the smaller each pixel becomes, but the sensitivity and the charge-handling capability of each pixel can be maintained or even increased by raising the quantum efficiency and the saturation capacity of the pixel per unit area. Many of Sony's technologies can be successfully applied to CMOS image sensor manufacturing toward sub-2.0 µm pixel pitch and beyond.

Narabu, Tadakuni

2011-01-01

292

Formulation of image quality prediction criteria for the Viking lander camera  

NASA Technical Reports Server (NTRS)

Image quality criteria are defined and mathematically formulated for the prediction computer program to be developed for the Viking lander imaging experiment. The general objective of broad-band (black and white) imagery, to resolve small spatial details and slopes, is formulated as the detectability of a right-circular cone with the surface properties of the surrounding terrain. The general objective of narrow-band (color and near-infrared) imagery, to observe spectral characteristics, is formulated as the minimum detectable albedo variation. The general goal of encompassing, but not exceeding, the range of the scene radiance distribution within a single, commandable camera dynamic-range setting is also considered.

Huck, F. O.; Jobson, D. J.; Taylor, E. J.; Wall, S. D.

1973-01-01

293

Linearisation of RGB Camera Responses for Quantitative Image Analysis of Visible and UV Photography: A Comparison of Two Techniques  

PubMed Central

Linear camera responses are required for recovering the total amount of incident irradiance, quantitative image analysis, spectral reconstruction from camera responses and characterisation of spectral sensitivity curves. Two commercially-available digital cameras equipped with Bayer filter arrays and sensitive to visible and near-UV radiation were characterised using biexponential and Bézier curves. Both methods successfully fitted the entire characteristic curve of the tested devices, allowing for an accurate recovery of linear camera responses, particularly those corresponding to the middle of the exposure range. Nevertheless the two methods differ in the nature of the required input parameters and the uncertainty associated with the recovered linear camera responses obtained at the extreme ends of the exposure range. Here we demonstrate the use of both methods for retrieving information about scene irradiance, describing and quantifying the uncertainty involved in the estimation of linear camera responses. PMID:24260244

Garcia, Jair E.; Dyer, Adrian G.; Greentree, Andrew D.; Spring, Gale; Wilksch, Philip A.

2013-01-01
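The recovery of linear responses from a fitted characteristic curve, as in the record above, can be sketched as below. The biexponential form follows the abstract, but the parameter values are invented for illustration and the curve-fitting step itself is omitted; only the numerical inversion of the (monotonic) fitted curve is shown:

```python
import numpy as np

def biexponential(x, a1, b1, a2, b2):
    """Biexponential characteristic curve: pixel response vs relative exposure."""
    return a1 * (1.0 - np.exp(-b1 * x)) + a2 * (1.0 - np.exp(-b2 * x))

# parameters as would be obtained by fitting the curve to a measured
# exposure series (these values are made up for illustration)
params = (180.0, 3.0, 75.0, 12.0)

def linearise(responses, params, max_exposure=1.5, n=4000):
    """Numerically invert the monotonic characteristic curve to map raw
    pixel responses back to linear relative-exposure values."""
    grid = np.linspace(0.0, max_exposure, n)
    curve = biexponential(grid, *params)   # monotonic for positive params
    return np.interp(responses, curve, grid)

exposure = np.linspace(0.05, 1.0, 10)
raw = biexponential(exposure, *params)     # simulated nonlinear responses
recovered = linearise(raw, params)
print(np.allclose(recovered, exposure, atol=1e-3))
```

Because the characteristic curve is strictly increasing, table inversion with `np.interp` recovers the linear values to well within the grid resolution.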

294

Grooves of Phobos as Seen on Rectified Images Taken by the Mars Express High Resolution Stereo Camera  

NASA Astrophysics Data System (ADS)

In this work, we analyze images taken by the HRSC camera onboard Mars Express to revisit the problem of the origin of Phobos' grooves, numerous linear features, often turning to chains of small craters, which criss-cross its surface.

Basilevsky, A. T.; Oberst, J.; Willner, K.; Waehlisch, M.; Neukum, G.

2011-03-01

295

Development of a portable 3CCD camera system for multispectral imaging of biological samples.  

PubMed

Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

2014-01-01

296

Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples  

PubMed Central

Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

2014-01-01

297

EPR-based ghost imaging using a single-photon-sensitive camera  

E-print Network

Correlated-photon imaging, popularly known as ghost imaging, is a technique whereby an image is formed from light that has never interacted with the object. In ghost imaging experiments two correlated light fields are produced. One of these fields illuminates the object, and the other field is measured by a spatially resolving detector. In the quantum regime, these correlated light fields are produced by entangled photons created by spontaneous parametric down-conversion. To date, all correlated-photon ghost-imaging experiments have scanned a single-pixel detector through the field of view to obtain the spatial information. However, scanning leads to a poor sampling efficiency, which scales inversely with the number of pixels, N, in the image. In this work we overcome this limitation by using a time-gated camera to record the single-photon events across the full scene. We obtain high-contrast images (contrast of 90%) in either the image plane or the far field of the photon-pair source, taking advantage of the EPR-like correlations in the position and momentum of the photon pairs. Our images contain a large number of modes, more than 500, creating opportunities in low-light-level imaging and in quantum information processing.

Reuben S. Aspden; Daniel S. Tasca; Robert W. Boyd; Miles J. Padgett

2013-08-05

298

Research on optimization of imaging parameters of optical remote sensing camera based on ground objects BRDF  

NASA Astrophysics Data System (ADS)

With the development of high resolution remote sensing satellites in recent years, the study of typical ground objects has become ever more closely connected with remote sensing applications. During on-orbit imaging with a TDI CCD camera, the solar angles change considerably at different times, causing a corresponding change in the BRDF of most of the earth's surface objects and ultimately affecting the remote sensing radiances and even the imaging quality. To address this problem, in-orbit imaging parameters must be optimized on the basis of ground object BRDF. This paper presents a detailed investigation of ground object characteristics over the global imaging area. We invert the BRDF at different times based on a kernel-driven BRDF model, establish a database of ground object BRDFs, classify ground object characteristics, simulate the imaging results with a radiative transfer model and a degradation model of the remote sensor, and then optimize the imaging parameters according to the imaging quality requirements. The simulation results show that the contrast, definition, and dynamic range of the images are improved; the proposed method can set imaging parameters appropriately for different imaging conditions and improve the imaging quality of high resolution remote sensing satellites.
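The kernel-driven BRDF inversion mentioned above is a linear least-squares problem. A minimal sketch with synthetic kernel values (placeholders standing in for, e.g., Ross-Thick/Li-Sparse-style kernels — an assumption, since the abstract does not name its kernels):

```python
import numpy as np

# Kernel-driven BRDF model: R = f_iso + f_vol * K_vol + f_geo * K_geo.
# K_vol and K_geo depend on sun/view geometry; here they are synthetic
# placeholder values (assumption) rather than computed kernels.
rng = np.random.default_rng(0)
n_obs = 12
K_vol = rng.uniform(-0.3, 0.6, n_obs)   # volumetric kernel per observation
K_geo = rng.uniform(-1.2, 0.0, n_obs)   # geometric kernel per observation

f_true = np.array([0.25, 0.10, 0.05])   # [f_iso, f_vol, f_geo]
A = np.column_stack([np.ones(n_obs), K_vol, K_geo])
R = A @ f_true + rng.normal(0, 1e-3, n_obs)   # simulated noisy reflectances

# Invert the linear model in the least-squares sense
f_hat, *_ = np.linalg.lstsq(A, R, rcond=None)
print(f_hat)  # close to f_true
```

With the kernel weights recovered, the BRDF can be evaluated for any solar geometry, which is what makes the database-plus-simulation workflow in the abstract possible.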

Li, Fangqi; He, Hongyan; Bao, Yunfei; Xing, Kun; Zhang, Zhi

2013-10-01

299

Determining 3D Flow Fields via Multi-camera Light Field Imaging  

PubMed Central

In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture [1]. Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3D PIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet. PMID:23486112
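The synthetic aperture refocusing step can be sketched as a shift-and-average over the camera array; the integer pixel shifts below are a toy stand-in (assumption) for the calibrated per-camera homographies used in practice:

```python
import numpy as np

def sa_refocus(images, offsets, depth_shift):
    """Synthetic-aperture refocusing: shift each camera's image in
    proportion to its baseline offset, then average. Features at the
    chosen depth realign and stay sharp; occluders at other depths blur
    out. A toy pixel-shift stand-in (assumption) for the calibrated
    homography-based refocusing used in practice."""
    stack = []
    for img, (oy, ox) in zip(images, offsets):
        dy = int(round(oy * depth_shift))
        dx = int(round(ox * depth_shift))
        stack.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    return np.mean(stack, axis=0)

# Tiny demo: a bright point viewed by 3 cameras in a horizontal array.
offsets = [(0, -1), (0, 0), (0, 1)]
images = []
for oy, ox in offsets:
    img = np.zeros((9, 9))
    img[4, 4 - ox] = 1.0   # parallax: displacement opposite camera offset
    images.append(img)

refocused = sa_refocus(images, offsets, depth_shift=1.0)
print(refocused[4, 4])  # shifts realign the point: prints 1.0
```

Sweeping `depth_shift` over a range of values produces the 3D focal stack described in the abstract, one refocused slice per candidate depth.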

Truscott, Tadd T.; Belden, Jesse; Nielson, Joseph R.; Daily, David J.; Thomson, Scott L.

2013-01-01

300

200 ps FWHM and 100 MHz repetition rate ultrafast gated camera for optical medical functional imaging  

NASA Astrophysics Data System (ADS)

The paper describes the realization of a complete optical imaging device to clinical applications like brain functional imaging by time-resolved, spectroscopic diffuse optical tomography. The entire instrument is assembled in a unique setup that includes a light source, an ultrafast time-gated intensified camera and all the electronic control units. The light source is composed of four near infrared laser diodes driven by a nanosecond electrical pulse generator working in a sequential mode at a repetition rate of 100 MHz. The resulting light pulses, at four wavelengths, are less than 80 ps FWHM. They are injected in a four-furcated optical fiber ended with a frontal light distributor to obtain a uniform illumination spot directed towards the head of the patient. Photons back-scattered by the subject are detected by the intensified CCD camera; there are resolved according to their time of flight inside the head. The very core of the intensified camera system is the image intensifier tube and its associated electrical pulse generator. The ultrafast generator produces 50 V pulses, at a repetition rate of 100 MHz and a width corresponding to the 200 ps requested gate. The photocathode and the Micro-Channel-Plate of the intensifier have been specially designed to enhance the electromagnetic wave propagation and reduce the power loss and heat that are prejudicial to the quality of the image. The whole instrumentation system is controlled by an FPGA based module. The timing of the light pulses and the photocathode gating is precisely adjustable with a step of 9 ps. All the acquisition parameters are configurable via software through an USB plug and the image data are transferred to a PC via an Ethernet link. The compactness of the device makes it a perfect device for bedside clinical applications.

Uhring, Wilfried; Poulet, Patrick; Hanselmann, Walter; Glazenborg, René; Zint, Virginie; Nouizi, Farouk; Dubois, Benoit; Hirschi, Werner

2012-04-01

301

An energy-subtraction Compton scatter camera design for in vivo medical imaging of radiopharmaceuticals  

SciTech Connect

A Compton scatter camera (CSC) design is proposed for imaging radioisotopes used as biotracers. A clinical version may increase sensitivity by a factor of over 100, while maintaining or improving spatial resolution, as compared with existing Anger cameras that use lead collimators. This novel approach is based on using energy subtraction (ΔE = E0 − ESC, where E0, ΔE, and ESC are the energy of the emitted gamma ray, the energy deposited by the initial Compton scatter, and the energy of the Compton scattered photon) to determine the amount of energy deposited in the primary system. The energy subtraction approach allows the requirement of high energy resolution to be placed on a secondary detector system instead of the primary detector system. Requiring primary system high energy resolution has significantly limited previous CSC designs for medical imaging applications. Furthermore, this approach is dependent on optimizing the camera design for data acquisition of gamma rays that undergo only one Compton scatter in a low-Z primary detector system followed by a total absorption of the Compton scattered photon in a high-Z secondary detector system. The proposed approach allows for a more compact primary detector system, a more simplified pulse processing interface, and a much less complicated detector cooling scheme as compared with previous CSC designs. Analytical calculations and Monte Carlo simulation results for some specific detector materials and geometries are presented.
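The energy-subtraction idea is a one-line computation, and it feeds directly into standard Compton kinematics; the isotope energy below is an illustrative choice (assumption), not taken from the paper:

```python
import math

ME_C2 = 511.0  # electron rest energy in keV

def scatter_angle_deg(e0_kev, esc_kev):
    """Compton scattering angle from the emitted energy E0 and the
    scattered-photon energy ESC measured by the high-energy-resolution
    secondary detector: cos(theta) = 1 - me*c^2 * (1/ESC - 1/E0)."""
    cos_theta = 1.0 - ME_C2 * (1.0 / esc_kev - 1.0 / e0_kev)
    return math.degrees(math.acos(cos_theta))

# Energy subtraction: the primary-detector deposit is inferred, not measured.
e0 = 140.5           # keV, 99mTc emission (example isotope; assumption)
esc = 110.0          # keV, measured in the secondary detector
delta_e = e0 - esc   # energy deposited in the low-Z primary detector
print(delta_e, round(scatter_angle_deg(e0, esc), 1))  # prints 30.5 90.5
```

Because E0 is known for a given radiopharmaceutical, only the secondary detector needs fine energy resolution, which is the design point the abstract emphasizes.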

Rohe, R.C.; Valentine, J.D. [Univ. of Cincinnati, OH (United States)]

1996-12-01

302

The computation of cloud base height from paired whole-sky imaging cameras  

SciTech Connect

A major goal for global change studies is to improve the accuracy of general circulation models (GCMs) capable of predicting the timing and magnitude of greenhouse gas-induced global warming. Research has shown that cloud radiative feedback is the single most important effect determining the magnitude of possible climate responses to human activity. Of particular value to reducing the uncertainties associated with cloud-radiation interactions is the measurement of cloud base height (CBH), both because it is a dominant factor in determining the infrared radiative properties of clouds with respect to the earth's surface and lower atmosphere and because CBHs are essential to measuring cloud cover fraction. We have developed a novel approach to the extraction of cloud base height from pairs of whole sky imaging (WSI) cameras. The core problem is to spatially register cloud fields from widely separated WSI cameras; once this registration is complete, triangulation provides the CBH measurements. The wide camera separation (necessary to cover the desired observation area) and the self-similarity of clouds defeat all standard matching algorithms when applied to static views of the sky. To address this, our approach is based on optical flow methods that exploit the fact that modern WSIs provide sequences of images. We will describe the algorithm and present its performance as evaluated both on real data validated by ceilometer measurements and on a variety of simulated cases.
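Once two whole-sky views are registered, the triangulation step reduces to elementary geometry. A minimal sketch under the simplifying assumption that the matched cloud feature lies in the vertical plane through both cameras and between them (the paper's optical-flow matching handles the general registration):

```python
import math

def cloud_base_height(baseline_m, elev1_deg, elev2_deg):
    """Triangulated height of a cloud feature matched in two whole-sky
    images. Assumes the feature lies in the vertical plane containing
    both cameras and between them, so the horizontal distances satisfy
    d1 + d2 = baseline and h = d1*tan(e1) = d2*tan(e2), giving
    h = baseline / (cot(e1) + cot(e2))."""
    e1 = math.radians(elev1_deg)
    e2 = math.radians(elev2_deg)
    return baseline_m / (1.0 / math.tan(e1) + 1.0 / math.tan(e2))

# Feature seen at 45 deg elevation from both sites 2 km apart:
print(cloud_base_height(2000.0, 45.0, 45.0))  # ≈ 1000 m
```

Lower elevation angles at the same baseline yield lower heights, which is why wide camera separation is needed to keep useful geometry over the whole observation area.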

Allmen, M.C.; Kegelmeyer, W.P. Jr.

1994-03-01

303

Nonlinear camera response functions and image deblurring: theoretical analysis and practice.  

PubMed

This paper investigates the role that nonlinear camera response functions (CRFs) have on image deblurring. We present a comprehensive study to analyze the effects of CRFs on motion deblurring. In particular, we show how nonlinear CRFs can cause a spatially invariant blur to behave as a spatially varying blur. We prove that such nonlinearity can cause large errors around edges when directly applying deconvolution to a motion blurred image without CRF correction. These errors are inevitable even with a known point spread function (PSF) and with state-of-the-art regularization-based deconvolution algorithms. In addition, we show how CRFs can adversely affect PSF estimation algorithms in the case of blind deconvolution. To help counter these effects, we introduce two methods to estimate the CRF directly from one or more blurred images when the PSF is known or unknown. Our experimental results on synthetic and real images validate our analysis and demonstrate the robustness and accuracy of our approaches. PMID:23969392
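The core pitfall the abstract identifies, deconvolving without CRF correction, can be illustrated with a toy gamma curve standing in for a measured CRF (an assumption; the paper estimates the CRF from the blurred images themselves):

```python
import numpy as np

# A blur that is spatially invariant in irradiance becomes object-dependent
# after a nonlinear CRF. Linearizing with the inverse CRF before
# deconvolution avoids the edge errors analyzed in the paper.
# Gamma curve used as a stand-in CRF (assumption; real CRFs are measured).
gamma = 2.2
crf = lambda x: np.clip(x, 0, 1) ** (1.0 / gamma)   # irradiance -> intensity
inv_crf = lambda y: np.clip(y, 0, 1) ** gamma       # intensity -> irradiance

irradiance = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0])  # step edge
kernel = np.array([0.25, 0.5, 0.25])                    # motion-blur PSF

blurred = np.convolve(irradiance, kernel, mode="same")  # physical blur
observed = crf(blurred)                                 # camera applies CRF

# Deconvolving `observed` directly treats a nonlinear signal as linear;
# applying inv_crf first recovers the linearly blurred irradiance:
linearized = inv_crf(observed)
print(np.allclose(linearized, blurred))  # True
```

Note that `observed` differs from `blurred` most strongly around the edge, which is exactly where the paper shows uncorrected deconvolution produces large errors.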

Tai, Yu-Wing; Chen, Xiaogang; Kim, Sunyeong; Kim, Seon Joo; Li, Feng; Yang, Jie; Yu, Jingyi; Matsushita, Yasuyuki; Brown, Michael S

2013-10-01

304

Nonlinear Camera Response Functions and Image Deblurring: Theoretical Analysis and Practice.  

PubMed

This paper investigates the role that nonlinear camera response functions (CRFs) have on image deblurring. We present a comprehensive study to analyze the effects of CRFs on motion deblurring. In particular, we show how nonlinear CRFs can cause a spatially invariant blur to behave as a spatially varying blur. We prove that such nonlinearity can cause large errors around edges when directly applying deconvolution to a motion blurred image without CRF correction. These errors are inevitable even with a known point spread function (PSF) and with state-of-the-art regularization based deconvolution algorithms. In addition, we show how CRFs can adversely affect PSF estimation algorithms in the case of blind deconvolution. To help counter these effects, we introduce two methods to estimate the CRF directly from one or more blurred images when the PSF is known or unknown. Our experimental results on synthetic and real images validate our analysis and demonstrate the robustness and accuracy of our approaches. PMID:23420212

Tai, Yu-Wing; Chen, Xiaogang; Kim, Sunyeong; Kim, Seon Joo; Li, Feng; Yang, Jie; Yu, Jingyi; Matsushita, Yasuyuki; Brown, Michael S

2013-02-13

305

Dual-mode laparoscopic fluorescence image-guided surgery using a single camera  

PubMed Central

Iatrogenic nerve damage is a leading cause of morbidity associated with many common surgical procedures. Complications arising from these injuries may result in loss of function and/or sensation, muscle atrophy, and chronic neuropathy. Fluorescence image-guided surgery offers a potential solution for avoiding intraoperative nerve damage by highlighting nerves that are otherwise difficult to visualize. In this work we present the development of a single camera, dual-mode laparoscope that provides near simultaneous display of white-light and fluorescence images of nerves. The capability of the instrumentation is demonstrated through imaging several types of in situ rat nerves via a nerve specific contrast agent. Full color white light and high brightness fluorescence images and video of nerves as small as 100 µm in diameter are presented. PMID:22876351

Gray, Daniel C.; Kim, Evgenia M.; Cotero, Victoria E.; Bajaj, Anshika; Staudinger, V. Paul; Hehir, Cristina A. Tan; Yazdanfar, Siavash

2012-01-01

306

WFC3RED: A HST Wide Field Camera 3 Image Processing Pipeline  

NASA Astrophysics Data System (ADS)

WFC3RED is a pipeline for automatically processing imaging data taken with the Wide Field Camera 3 instrument on the Hubble Space Telescope (HST). The pipeline currently supports processing of imaging data from both the IR and UVIS channels and is written in Python and C. The automated processing steps include cosmic-ray removal (UVIS), super-sky subtraction, user defined artifact masking, robust alignment and registration for large mosaics, weight map generation, and drizzling onto a final image mosaic. WFC3RED can combine data across different HST observations, visits and proposals without the need for any pre-defined associations. WFC3RED can create imaging products with a signal-to-noise ratio that matches the most careful step-by-step manual WFC3 reductions.

Magee, D. K.; Bouwens, R. J.; Illingworth, G. D.

2011-07-01

307

Quantitative Evaluation of Scintillation Camera Imaging Characteristics of Isotopes Used in Liver Radioembolization  

PubMed Central

Background Scintillation camera imaging is used for treatment planning and post-treatment dosimetry in liver radioembolization (RE). In yttrium-90 (90Y) RE, scintigraphic images of technetium-99m (99mTc) are used for treatment planning, while 90Y Bremsstrahlung images are used for post-treatment dosimetry. In holmium-166 (166Ho) RE, scintigraphic images of 166Ho can be used for both treatment planning and post-treatment dosimetry. The aim of this study is to quantitatively evaluate and compare the imaging characteristics of these three isotopes, in order that imaging protocols can be optimized and RE studies with varying isotopes can be compared. Methodology/Principal Findings Phantom experiments were performed in line with NEMA guidelines to assess the spatial resolution, sensitivity, count rate linearity, and contrast recovery of 99mTc, 90Y and 166Ho. In addition, Monte Carlo simulations were performed to obtain detailed information about the history of detected photons. The results showed that the use of a broad energy window and the high-energy collimator gave optimal combination of sensitivity, spatial resolution, and primary photon fraction for 90Y Bremsstrahlung imaging, although differences with the medium-energy collimator were small. For 166Ho, the high-energy collimator also slightly outperformed the medium-energy collimator. In comparison with 99mTc, the image quality of both 90Y and 166Ho is degraded by a lower spatial resolution, a lower sensitivity, and larger scatter and collimator penetration fractions. Conclusions/Significance The quantitative evaluation of the scintillation camera characteristics presented in this study helps to optimize acquisition parameters and supports future analysis of clinical comparisons between RE studies. PMID:22073149

Elschot, Mattijs; Nijsen, Johannes Franciscus Wilhelmus; Dam, Alida Johanna; de Jong, Hugo Wilhelmus Antonius Maria

2011-01-01

308

A Method to Solve Interior and Exterior Camera Calibration Parameters for Image Resection  

NASA Technical Reports Server (NTRS)

An iterative method is presented to solve the internal and external camera calibration parameters, given model target points and their images from one or more camera locations. The direct linear transform formulation was used to obtain a guess for the iterative method, and herein lies one of the strengths of the present method. In all test cases, the method converged to the correct solution. In general, an overdetermined system of nonlinear equations is solved in the least-squares sense. The iterative method presented is based on Newton-Raphson for solving systems of nonlinear algebraic equations. The Jacobian is analytically derived and the pseudo-inverse of the Jacobian is obtained by singular value decomposition.
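The solver structure described, Newton-Raphson on an overdetermined nonlinear system with an SVD-based pseudo-inverse of the Jacobian, can be sketched on a toy problem (the exponential model and starting guess below are illustrative assumptions, not the paper's camera model):

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Newton-Raphson-style iteration for an overdetermined nonlinear
    system solved in the least-squares sense: each update applies the
    pseudo-inverse of the Jacobian (computed via SVD, as in the paper)
    to the current residual vector."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        J = jacobian(x)
        r = residual(x)
        x = x - np.linalg.pinv(J) @ r   # pinv is SVD-based
    return x

# Toy overdetermined problem: fit (a, b) to y = a*exp(b*t) at 5 samples
t = np.linspace(0.0, 1.0, 5)
y = 2.0 * np.exp(-1.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p_hat = gauss_newton(res, jac, x0=[1.0, 0.0])
print(p_hat)  # ≈ [2.0, -1.5]
```

In the paper the unknown vector holds the interior and exterior calibration parameters, the residuals are reprojection errors, and the DLT solution supplies `x0`; a good initial guess is what makes this iteration reliable.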

Samtaney, Ravi

1999-01-01

309

Modeling of three-dimensional camera imaging in a tokamak torus  

NASA Astrophysics Data System (ADS)

A procedure is described for precision modeling of the views for imaging diagnostics monitoring tokamak internal components, particularly high heat flux divertor components. Such modeling is required to enable predictions of resolution and viewing angle for the available viewing locations. Since oblique views are typically expected for tokamak divertors, fully three-dimensional (3D) perspective imaging is required. A suite of matched 3D CAD, graphics and animation applications are used to provide a fast and flexible technique for reproducing these views. An analytic calculation of the resolution and viewing incidence angle is developed to validate the results of the modeling procedures. The tokamak physics experiment (TPX) diagnostics [1] for infrared viewing are used as an example to demonstrate the implementation of the tools. As is generally the case in tokamak experiments, the available diagnostic locations for TPX are severely constrained by access limitations and the resulting images can be marginal in both resolution and viewing incidence angle. The procedures described here provide a complete design tool for in-vessel viewing, both for camera location and for identification of viewed surfaces. Additionally, these same tools can be used for the interpretation of the actual images obtained by the diagnostic cameras.

Edmonds, P. H.; Medley, S. S.

1997-01-01

310

Multi-frame image processing with panning cameras and moving subjects  

NASA Astrophysics Data System (ADS)

Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition, we evaluated algorithm efficacy on field-test video processed with our commercially available surveillance product, demonstrating the benefits of each technique. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.

Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

2014-06-01

311

Real-Time On-Board Processing Validation of MSPI Ground Camera Images  

NASA Technical Reports Server (NTRS)

The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA, which includes PowerPC440 processors, we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real-time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
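The kind of least-squares polarimetric fit described can be sketched with a generic rotating-analyzer model in place of MSPI's actual photoelastic-modulator signal model (an assumption; only the fit-then-extract structure is meant to carry over):

```python
import numpy as np

# Generic least-squares extraction of Stokes parameters (I, Q, U) from
# intensity samples behind an analyzer at several angles -- a simplified
# stand-in (assumption) for MSPI's on-board fit to its modulated signal.
theta = np.radians(np.arange(0, 180, 22.5))          # analyzer angles
A = 0.5 * np.column_stack([np.ones_like(theta),
                           np.cos(2 * theta),
                           np.sin(2 * theta)])

stokes_true = np.array([1.0, 0.3, -0.1])             # I, Q, U
samples = A @ stokes_true                            # ideal detector reads

# Fit the model to the samples; this is the data-volume-reducing step:
# many raw samples collapse to three physical parameters per pixel.
stokes_hat, *_ = np.linalg.lstsq(A, samples, rcond=None)
dolp = np.hypot(stokes_hat[1], stokes_hat[2]) / stokes_hat[0]
print(stokes_hat, round(float(dolp), 4))  # recovers [1.0, 0.3, -0.1]; DoLP ≈ 0.3162
```

The same collapse-to-parameters idea is what turns 95 Mbytes/sec of raw samples into 0.45 Mbytes/sec of intensity and polarization products on the FPGA.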

Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

2010-01-01

312

Optimization of camera exposure durations for multi-exposure speckle imaging of the microcirculation  

PubMed Central

Improved Laser Speckle Contrast Imaging (LSCI) blood flow analyses that incorporate inverse models of the underlying laser-tissue interaction have been used to develop more quantitative implementations of speckle flowmetry such as Multi-Exposure Speckle Imaging (MESI). In this paper, we determine the optimal camera exposure durations required for obtaining flow information with accuracy comparable to the prevailing MESI implementation utilized in recent in vivo rodent studies. A looping leave-one-out (LOO) algorithm was used to identify exposure subsets which were analyzed for accuracy against flows obtained from analysis with the original full exposure set over 9 animals comprising n = 314 regional flow measurements. From the 15 original exposures, 6 exposures were found using the LOO process to provide comparable accuracy, defined as deviating by no more than 10% from the original flow measurements. The optimal subset of exposures provides a basis set of camera durations for speckle flowmetry studies of the microcirculation and confers a two-fold faster acquisition rate and a 28% reduction in processing time without sacrificing accuracy. Additionally, the optimization process can be used to identify further reductions in the exposure subsets for tailoring imaging over less expansive flow distributions to enable even faster imaging. PMID:25071956
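A minimal sketch of a looping leave-one-out search over exposure subsets. The 10% criterion mirrors the abstract; the scalar estimator and the exposure durations are hypothetical stand-ins (assumptions) for the paper's MESI inverse-model fit and its 15 exposures:

```python
import numpy as np

def loo_prune(exposures, estimate, max_dev=0.10):
    """Greedy leave-one-out pruning: repeatedly drop the exposure whose
    removal keeps the estimate closest to the full-set reference, while
    staying within `max_dev` relative deviation. `estimate` maps a tuple
    of exposures to a scalar flow index; here it is a stand-in for the
    paper's MESI inverse-model fit (assumption)."""
    full = estimate(tuple(exposures))
    subset = list(exposures)
    while len(subset) > 1:
        candidates = []
        for i in range(len(subset)):
            trial = subset[:i] + subset[i + 1:]
            dev = abs(estimate(tuple(trial)) - full) / abs(full)
            candidates.append((dev, i))
        dev, i = min(candidates)
        if dev > max_dev:
            break
        subset.pop(i)
    return subset

# Stand-in estimator and hypothetical exposure durations (seconds):
est = lambda exps: float(np.mean(np.log10(exps)))
exps = [10e-6, 50e-6, 100e-6, 500e-6, 1e-3, 5e-3]
subset = loo_prune(exps, est)
print(subset)
```

Every exposure dropped shortens acquisition and processing proportionally, which is where the two-fold speedup reported in the abstract comes from.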

Kazmi, S. M. Shams; Balial, Satyajit; Dunn, Andrew K.

2014-01-01

313

Modeling of three-dimensional camera imaging in a tokamak torus  

SciTech Connect

A procedure is described for precision modeling of the views for imaging diagnostics monitoring tokamak internal components, particularly high heat flux divertor components. Such modeling is required to enable predictions of resolution and viewing angle for the available viewing locations. Since oblique views are typically expected for tokamak divertors, fully three-dimensional (3D) perspective imaging is required. A suite of matched 3D CAD, graphics and animation applications are used to provide a fast and flexible technique for reproducing these views. An analytic calculation of the resolution and viewing incidence angle is developed to validate the results of the modeling procedures. The tokamak physics experiment (TPX) diagnostics [1] for infrared viewing are used as an example to demonstrate the implementation of the tools. As is generally the case in tokamak experiments, the available diagnostic locations for TPX are severely constrained by access limitations and the resulting images can be marginal in both resolution and viewing incidence angle. The procedures described here provide a complete design tool for in-vessel viewing, both for camera location and for identification of viewed surfaces. Additionally, these same tools can be used for the interpretation of the actual images obtained by the diagnostic cameras. © 1997 American Institute of Physics.

Edmonds, P.H. [Fusion Research Center, University of Texas at Austin, Austin, Texas (United States)]; Medley, S.S. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543 (United States)]

1997-01-01

314

Loop closure detection by algorithmic information theory: implemented on range and camera image data.  

PubMed

In this paper the problem of loop closing from depth or camera image information in an unknown environment is investigated. A sparse model is constructed from a parametric dictionary for every range or camera image as mobile robot observations. In contrast to high-dimensional feature-based representations, in this model the dimension of the sensor measurements' representations is reduced. Although loop closure detection can be cast as a clustering problem in a high-dimensional space, little attention has been paid to the curse of dimensionality in the existing state-of-the-art algorithms. In this paper, a representation is developed from a sparse model of images, with a lower dimension than the original sensor observations. Exploiting algorithmic information theory, the representation is developed such that it is invariant to geometric transformations in the sense of Kolmogorov complexity. A universal normalized metric is used for comparison of complexity-based representations of image models. Finally, a distinctive property of the normalized compression distance is exploited for detecting similar places and rejecting incorrect loop closure candidates. Experimental results show the efficiency and accuracy of the proposed method in comparison to the state-of-the-art algorithms and some recently proposed methods. PMID:24968363
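The normalized compression distance at the heart of this approach can be approximated with any real compressor, e.g. zlib, in a few lines (a generic sketch; the paper applies the metric to sparse image-model representations rather than raw bytes):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable approximation of
    the Kolmogorov-complexity-based normalized information distance,
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(.) is the compressed length. Near 0 for similar inputs."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog" * 20
b_ = b"the quick brown fox jumps over the lazy dog" * 19 + b"cat!" * 5
c = bytes(range(256)) * 4

print(ncd(a, b_) < ncd(a, c))  # similar inputs score closer to 0: True
```

Thresholding such distances between place representations is what lets the method accept genuine loop closures and reject incorrect candidates.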

Ravari, Alireza Norouzzadeh; Taghirad, Hamid D

2014-10-01

315

Introducing sub-wavelength pixel THz camera for the understanding of close pixel-to-wavelength imaging challenges  

NASA Astrophysics Data System (ADS)

Conventional guidelines and approximations useful in macro-scale system design can become invalidated when applied to the smaller scales. An illustration of this is when camera pixel size becomes smaller than the diffraction-limited resolution of the incident light. It is sometimes believed that there is no benefit in having a pixel width smaller than the resolving limit defined by the Rayleigh criterion, 1.22 λ F/#. Though this rarely occurs in today's imaging technology, terahertz (THz) imaging is one emerging area where the pixel dimensions can be made smaller than the imaging wavelength. With terahertz camera technology, we are able to achieve sub-wavelength pixel sampling pitch, and are therefore capable of directly measuring whether there are image quality benefits to be derived from sub-wavelength sampling. Interest in terahertz imaging is high due to potential uses in security applications because of the greater penetration depth of terahertz radiation compared to the infrared and the visible. This paper discusses the modification by INO of its infrared MEMS microbolometer detector technology toward a THz imaging platform yielding a sub-wavelength pixel THz camera. Images obtained with this camera are reviewed in this paper. Measurements were also obtained using microscanning to increase sampling resolution. Parameters such as imaging resolution and sampling are addressed. A comparison is also made with results obtained with an 8–12 µm band camera having a pixel pitch close to the diffraction limit.
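The resolving limit quoted in the abstract is easy to evaluate; the sketch below compares it against a hypothetical pixel pitch at 1 THz (the 52 µm pitch is an illustrative assumption, not a specification from the paper):

```python
def rayleigh_spot_um(wavelength_um: float, f_number: float) -> float:
    """Diffraction-limited resolution per the Rayleigh criterion,
    1.22 * lambda * F/# (same units as the wavelength)."""
    return 1.22 * wavelength_um * f_number

# At 1 THz the free-space wavelength is ~300 um. With an f/1 optic the
# diffraction-limited spot is ~366 um, so a hypothetical 52 um pixel
# pitch samples that spot roughly 7 times per axis -- sub-wavelength
# sampling, the regime the paper investigates.
spot = rayleigh_spot_um(300.0, 1.0)
print(spot, spot / 52.0)
```

For a visible-light camera the same formula gives a sub-micron spot, far below typical pixel pitches, which is why this regime is rarely reached outside the THz band.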

Bergeron, A.; Marchese, L.; Bolduc, M.; Terroux, M.; Dufour, D.; Savard, E.; Tremblay, B.; Oulachgar, H.; Doucet, M.; Le Noc, L.; Alain, C.; Jerominek, H.

2012-06-01

316

Temperature dependent operation of PSAPD-based compact gamma camera for SPECT imaging  

PubMed Central

We investigated the dependence of image quality on the temperature of a position sensitive avalanche photodiode (PSAPD)-based small animal single photon emission computed tomography (SPECT) gamma camera with a CsI:Tl scintillator. Currently, nitrogen gas cooling is preferred to operate PSAPDs in order to minimize the dark current shot noise. Being able to operate a PSAPD at a relatively high temperature (e.g., 5 °C) would allow a more compact and simple cooling system for the PSAPD. In our investigation, the temperature of the PSAPD was controlled by varying the flow of cold nitrogen gas through the PSAPD module and varied from −40 °C to 20 °C. Three experiments were performed to demonstrate the performance variation over this temperature range. The point spread function (PSF) of the gamma camera was measured at various temperatures, showing variation of the full-width-half-maximum (FWHM) of the PSF. In addition, a 99mTc-pertechnetate (140 keV) flood source was imaged and the visibility of the scintillator segmentation (16×16 array, 8 mm × 8 mm area, 400 µm pixel size) at different temperatures was evaluated. Comparison of image quality was made at −25 °C and 5 °C using a mouse heart phantom filled with an aqueous solution of 99mTc-pertechnetate and imaged using a 0.5 mm pinhole collimator made of tungsten. The reconstructed image quality of the mouse heart phantom at 5 °C degraded in comparison to the reconstructed image quality at −25 °C. However, the defect and structure of the mouse heart phantom were clearly observed, showing the feasibility of operating PSAPDs for SPECT imaging at 5 °C, a temperature that would not need the nitrogen cooling. All PSAPD evaluations were conducted with an applied bias voltage that allowed the highest gain at a given temperature. PMID:24465051

Kim, Sangtaek; McClish, Mickel; Alhassen, Fares; Seo, Youngho; Shah, Kanai S.; Gould, Robert G.

2011-01-01

317

MR-i: high-speed dual-cameras hyperspectral imaging FTS  

NASA Astrophysics Data System (ADS)

From scientific research to deployable operational solutions, Fourier-Transform Infrared (FT-IR) spectroradiometry is widely used for the development and enhancement of military and research applications. These techniques include targets IR signature characterization, development of advanced camouflage techniques, aircraft engine's plumes monitoring, meteorological sounding and atmospheric composition analysis such as detection and identification of chemical threats. Imaging FT-IR spectrometers have the capability of generating 3D images composed of multiple spectra associated with every pixel of the mapped scene. That data allow for accurate spatial characterization of target's signature by resolving spatially the spectral characteristics of the observed scenes. MR-i is the most recent addition to the MR product line series and generates spectral data cubes in the MWIR and LWIR. The instrument is designed to acquire the spectral signature of various scenes with high temporal, spatial and spectral resolution. The four port architecture of the interferometer brings modularity and upgradeability since the two output ports of the instrument can be populated with different combinations of detectors (imaging or not). For instance to measure over a broad spectral range from 1.3 to 13 ?m, one output port can be equipped with a LWIR camera while the other port is equipped with a MWIR camera. Both ports can be equipped with cameras serving the same spectral range but set at different sensitivity levels in order to increase the measurement dynamic range and avoid saturation of bright parts of the scene while simultaneously obtaining good measurement of the faintest parts of the scene. Various telescope options are available for the input port. Overview of the instrument capabilities will be presented as well as test results and results from field trials for a configuration with two MWIR cameras. That specific system is dedicated to the characterization of airborne targets. 
The expanded dynamic range provided by the two MWIR cameras makes it possible to measure simultaneously the spectral signature of the cold background and of the warmest elements of the scene (flares, jet engine exhausts, etc.).
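The dual-sensitivity scheme described above can be sketched as a simple merge rule: keep the high-sensitivity channel wherever it is unsaturated, and fall back to the low-sensitivity channel scaled by the gain ratio elsewhere. The function below is a minimal illustration under assumed names and parameters, not the MR-i processing chain.

```python
import numpy as np

def merge_dual_gain(hi, lo, gain_ratio, sat_level):
    """Merge co-registered frames from two cameras set at different
    sensitivity levels (hypothetical sketch, not the MR-i pipeline).

    hi         -- frame from the high-sensitivity camera
    lo         -- frame from the low-sensitivity camera
    gain_ratio -- sensitivity ratio between the two channels
    sat_level  -- saturation threshold of the high-sensitivity camera
    """
    hi = np.asarray(hi, dtype=float)
    lo = np.asarray(lo, dtype=float)
    # use the low-noise channel where valid, the rescaled other elsewhere
    return np.where(hi < sat_level, hi, lo * gain_ratio)
```

Pixels below the saturation threshold pass through unchanged; saturated pixels are replaced by the radiometrically rescaled low-gain measurement, extending the usable dynamic range.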

Prel, Florent; Moreau, Louis; Lantagne, Stéphane; Roy, Claude; Vallières, Christian; Lévesque, Luc

2011-11-01

318

Improved Digitization of Lunar Mare Ridges with LROC Derived Products  

NASA Astrophysics Data System (ADS)

Lunar wrinkle ridges (mare ridges) are positive-relief structures formed from compressional stress in basin-filling flood basalt deposits [1]. Previous workers have measured wrinkle ridge orientations and lengths to investigate their spatial distribution and infer basin-localized stress fields [2,3]. Although these plots include the most prominent mare ridges and their general trends, they may not have fully captured all of the ridges, particularly the smaller-scale ridges. Using Lunar Reconnaissance Orbiter Wide Angle Camera (WAC) global mosaics and derived topography (100 m pixel scale) [4], we systematically remapped wrinkle ridges in Mare Serenitatis. By comparing two WAC mosaics with different lighting geometries, and shaded relief maps made from a WAC digital elevation model (DEM) [5], we observed that some ridge segments and some smaller ridges are not present in previous structure maps [2,3]. In the past, mapping efforts were limited by a fixed Sun direction [6,7]. For systematic mapping we created three shaded relief maps from the WAC DEM with solar azimuth angles of 0°, 45°, and 90°, and a fourth map was created by combining the three shaded reliefs into one using a simple averaging scheme. Along with the original WAC mosaic and the WAC DEM, these four datasets were imported into ArcGIS, and the mare ridges of Imbrium, Serenitatis, and Tranquillitatis were digitized from each of the six maps. Since mare ridges are often divided into many ridge segments [8], each major component was digitized separately, as opposed to the ridge as a whole. This strategy enhanced our ability to analyze the lengths, orientations, and abundances of these ridges. After the initial mapping was completed, the six products were viewed together to identify and resolve discrepancies in order to produce a final wrinkle ridge map. Comparing this new mare ridge map with past lunar tectonic maps, we found that many mare ridges were not recorded in the previous works.
In some cases, the lengths and orientations of previously digitized ridges differed from those of the ridges digitized in this study. This multi-map digitizing method allows for greater accuracy in the spatial characterization of mare ridges than previous methods. We intend to extend the mapping to a global scale, producing a more comprehensive ridge map from these higher-resolution data. References Cited: [1] Schultz P.H. (1976) Moon Morphology, 308. [2] Wilhelms D.E. (1987) USGS Prof. Paper 1348, 5A-B. [3] Carr, M.H. (1966) USGS Geologic Atlas of the Moon, I-498. [4] Robinson M.S. (2010) Space Sci. Rev., 150:82. [5] Scholten F. et al. (2011) LPSC XLII, 2046. [6] Fielder G. and Kiang T. (1962) The Observatory, No. 926, 8. [7] Watters T.R. and Konopliv A.S. (2001) Planetary and Space Sci., 49, 743-748. [8] Aubele J.C. (1988) LPSC XIX, 19.
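The multi-azimuth shaded-relief step lends itself to a short numerical sketch. The hillshade formula below is the standard Lambertian one used by most GIS packages; the DEM, cell size, and sun altitude are placeholders, and the averaging mirrors the "simple averaging scheme" mentioned in the abstract.

```python
import numpy as np

def hillshade(dem, cell, azimuth_deg, altitude_deg=45.0):
    """Standard Lambertian hillshade of a DEM (values clipped to [0, 1])."""
    dy, dx = np.gradient(dem, cell)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
    shade = (np.sin(alt) * np.cos(slope)
             + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade, 0.0, 1.0)

def combined_hillshade(dem, cell, azimuths=(0.0, 45.0, 90.0)):
    """Average hillshades computed under several solar azimuths, so that
    ridges of any orientation cast visible shading in a single map."""
    return np.mean([hillshade(dem, cell, a) for a in azimuths], axis=0)
```

Averaging the three illumination directions suppresses the azimuth bias of any single-sun map, which is why ridges aligned with one sun direction no longer vanish.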

Crowell, J. M.; Robinson, M. S.; Watters, T. R.; Bowman-Cisneros, E.; Enns, A. C.; Lawrence, S.

2011-12-01

319

Toward real-time quantum imaging with a single pixel camera  

E-print Network

We present a workbench for the study of real-time quantum imaging by measuring the frame-by-frame quantum noise reduction of multi-spatial-mode twin beams generated by four wave mixing in Rb vapor. Exploiting the multiple spatial modes of this squeezed light source, we utilize spatial light modulators to selectively pass macropixels of quantum correlated modes from each of the twin beams to a high quantum efficiency balanced detector. In low-light-level imaging applications, the ability to measure the quantum correlations between individual spatial modes and macropixels of spatial modes with a single pixel camera will facilitate compressive quantum imaging with sensitivity below the photon shot noise limit.

B. J. Lawrie; R. C. Pooser

2013-03-15

320

First experience DaTSCAN imaging using cadmium-zinc-telluride gamma camera SPECT.  

PubMed

We report our first experience of brain DaTSCAN SPECT imaging using a cadmium-zinc-telluride gamma camera (CZT-GC) in 2 cases: a 64-year-old patient suffering from essential tremor and a 73-year-old patient presenting with an atypical bilateral extrapyramidal syndrome. In both cases, 2 different acquisitions were performed and compared, one using a double-head Anger-GC, followed immediately by a second acquisition on the CZT-GC. There were no significant visual differences between the images generated by the different GCs. Our first results suggest that DaTSCAN SPECT is feasible on a CZT-GC, allowing reductions in both injected dose and acquisition time without compromising image quality. This experience needs to be evaluated in larger series. PMID:22785531

Farid, Karim; Queneau, Mathieu; Guernou, Mohamed; Lussato, David; Poullias, Xavier; Petras, Slavomir; Caillat-Vigneron, Nadine; Songy, Bernard

2012-08-01

321

Airborne imaging for heritage documentation using the Fotokite tethered flying camera  

NASA Astrophysics Data System (ADS)

Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs of excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft.
As such, the safe operation of these devices is still an issue, certainly when flying in locations which can be crowded (such as students on excavations or tourists walking around historic places). As the future of UAS regulation remains unclear, this talk presents an alternative approach to aerial imaging: the Fotokite. Developed at ETH Zürich, the Fotokite is a tethered flying camera, essentially a multi-copter connected to the ground with a taut tether to achieve controlled flight. Crucially, it relies solely on onboard IMU (Inertial Measurement Unit) measurements to fly, launches in seconds, and is not classified as a UAS (Unmanned Aerial System), e.g. in the latest FAA (Federal Aviation Administration) UAS proposal. As a result it may be used for imaging cultural heritage in a variety of environments and settings with minimal training by non-experienced pilots. Furthermore, it is subject to less extensive certification, regulation and import/export restrictions, making it a viable solution for use at a greater range of sites than traditional methods. Unlike a balloon or a kite, it is not subject to particular weather conditions and, thanks to active stabilization, is capable of a variety of intelligent flight modes. Finally, it is compact and lightweight, making it easy to transport and deploy, and its lack of reliance on GNSS (Global Navigation Satellite System) makes it possible to use in urban, overbuilt areas. After outlining its operating principles, the talk will present some archaeological case studies in which the Fotokite was used, thereby assessing its capabilities compared to the conventional UASs on the market.

Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

2014-05-01

322

ELOp EO/IR LOROP camera: image stabilization for dual-band whiskbroom scanning photography  

NASA Astrophysics Data System (ADS)

The ELOP dual-band LOROP camera was designed as the payload of a 300 gal reconnaissance pod capable of being carried by a single-engined fighter aircraft such as the F-16. The optical arrangement provides coincidence of the IR and EO optical axes, as well as equality of the fields of view. These features allow the same scan coverage to be achieved, and the same gimbal control software to be used, for visible-light-only, IR-only and simultaneous dual-band photography. Because of the intense, broadband vibration existing in the pod environment, special attention was given to the image stabilization system. Nevertheless, residual vibration still exists in a wide frequency range spreading from zero frequency to the detector integration rate and beyond. Hence, evaluation of the camera performance could not rely on the well-known analytical solutions for MTF_motion. The image motion is defined in terms of its power spectral density throughout the whole frequency range of interest, and the expected MTF_motion is calculated numerically using a statistical approach. Aspects of applying a staggered-structure IR detector to oblique photography are discussed. In particular, the ground footprint of the IR detector is much wider along-scan than that of the EO detector, requiring measures to prevent IR image deformation.
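The abstract states that MTF_motion is computed numerically from the motion power spectral density, without giving the statistical approach. A common textbook approximation, shown below purely as an illustration and not necessarily the paper's method, treats random residual jitter as a Gaussian blur whose variance is the integral of the PSD over frequency.

```python
import numpy as np

def mtf_from_motion_psd(freqs_hz, psd, spatial_freqs):
    """Approximate the motion MTF from a line-of-sight jitter PSD.

    Gaussian-jitter approximation (an assumption for illustration):
    the total motion variance sigma^2 is the integral of the PSD, and
    MTF(f) = exp(-2 * pi^2 * sigma^2 * f^2).  `psd` is in units of
    image-plane displacement squared per Hz; `spatial_freqs` is in
    cycles per unit of that displacement.
    """
    sigma2 = np.trapz(psd, freqs_hz)        # total jitter variance
    f = np.asarray(spatial_freqs, dtype=float)
    return np.exp(-2.0 * np.pi**2 * sigma2 * f**2)
```

With zero residual vibration the MTF is unity at all spatial frequencies; any jitter power degrades it monotonically with increasing frequency.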

Petrushevsky, Vladimir; Karklinsky, Yehoshua; Chernobrov, Arie

2003-01-01

323

Mars Orbiter Camera High Resolution Images: Some Results From The First 6 Weeks In Orbit  

NASA Technical Reports Server (NTRS)

The Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) images acquired shortly after orbit insertion were relatively poor in both resolution and image quality. This poor performance was solely the result of low sunlight conditions and the relative distance to the planet, both of which have been progressively improving over the past six weeks. Some of the better images are used here (see PIA01021 through PIA01029) to illustrate how the MOC images provide substantially better views of the martian surface than have ever been recorded previously from orbit.

This U.S. Geological Survey shaded relief map provides an overall context for the MGS MOC images of the Tithonium/Ius Chasma, Ganges Chasma, and Schiaparelli Crater. Closeup images of the Tithonium/Ius Chasma area are visible in PIA01021 through PIA01023. Closeups of Ganges Chasma are available as PIA01027 through PIA01029, and Schiaparelli Crater is shown in PIA01024 through PIA01026. The Mars Pathfinder landing site is shown to the north of the sites of the MGS images.

Launched on November 7, 1996, Mars Global Surveyor entered Mars orbit on Thursday, September 11, 1997. The original mission plan called for using friction with the planet's atmosphere to reduce the orbital energy, leading to a two-year mapping mission from a close, circular orbit (beginning in March 1998). Owing to difficulties with one of the two solar panels, aerobraking was suspended in mid-October and resumed on November 8. Many of the original objectives of the mission, and in particular those of the camera, are likely to be accomplished as the mission progresses.

Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

1997-01-01

324

Synchroscan streak camera imaging at a 15-MeV photoinjector with emittance exchange  

SciTech Connect

At the Fermilab A0 photoinjector facility, bunch-length measurements of the laser micropulse and the e-beam micropulse have been done in the past with a fast single-sweep module of the Hamamatsu C5680 streak camera with an intrinsic shot-to-shot trigger jitter of 10 to 20 ps. We have upgraded the camera system with the synchroscan module tuned to 81.25 MHz to provide synchronous summing capability with less than 1.5-ps FWHM trigger jitter, and a phase-locked delay box to provide phase stability of approximately 1 ps over tens of minutes. These steps allowed us to measure both the UV laser pulse train at 263 nm and the e-beam via optical transition radiation (OTR). Due to the low electron beam energies and OTR signals, we typically summed over 50 micropulses with 0.25-1 nC per micropulse. The phase-locked delay box allowed us to assess chromatic temporal effects and instigated another upgrade to an all-mirror input optics barrel. In addition, we added a slow sweep horizontal deflection plug-in unit to provide dual-sweep capability for the streak camera. We report on a series of measurements made during the commissioning of these upgrades including bunch-length and phase effects using the emittance exchange beamline and simultaneous imaging of a UV drive laser component, OTR, and the 800-nm diagnostics laser.

Lumpkin, A.H.; Ruan, J.; Thurman-Keup, R.; /Fermilab

2012-03-01

325

Intensified array camera imaging of solid surface combustion aboard the NASA Learjet  

NASA Technical Reports Server (NTRS)

An intensified array camera was used to image weakly luminous flames spreading over thermally thin paper samples in a low gravity environment aboard the NASA-Lewis Learjet. The aircraft offers 10 to 20 sec of reduced gravity during execution of a Keplerian trajectory and allows the use of instrumentation that is delicate or requires higher electrical power than is available in drop towers. The intensified array camera is a charge intensified device type that responds to light between 400 and 900 nm and has a minimum sensitivity of 10(exp 6) footcandles. The paper sample, either ashless filter paper or a lab wiper, burns inside a sealed chamber which is filled with 21, 18, or 15 pct. oxygen in nitrogen at one atmosphere. The camera views the edge of the paper and its output is recorded on videotape. Flame positions are measured every 0.1 sec to calculate flame spread rates. Comparisons with drop tower data indicate that the flame shapes and spread rates are affected by the residual g level in the aircraft.
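Computing a flame spread rate from edge positions sampled every 0.1 s reduces to a least-squares slope of position versus time. The short sketch below is illustrative only; the function name, units, and sample values are assumptions, not taken from the experiment.

```python
import numpy as np

def flame_spread_rate(times_s, positions_cm):
    """Least-squares slope of flame-front position vs. time.

    times_s      -- sample times in seconds (e.g. every 0.1 s)
    positions_cm -- measured flame-front positions in cm
    Returns the spread rate in cm/s.
    """
    rate, _intercept = np.polyfit(times_s, positions_cm, 1)
    return rate
```

Fitting a line to the full position history, rather than differencing adjacent frames, averages out the pixel-quantization noise of individual video measurements.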

Weiland, Karen J.

1992-01-01

326

Fast high-resolution characterization of powders using an imaging plate Guinier camera  

NASA Astrophysics Data System (ADS)

A new Huber Guinier camera G670 was installed on an Ultrax18-Rigaku X-ray rotating Cu anode source, with a monochromator (focal length B=360 mm) providing pure Kα1 radiation. The camera is used for powder diffraction in transmission geometry. An imaging plate (IP) provides position-sensitive detection. In order to evaluate this new instrumental setup, quality data were collected for some classical reference materials such as silicon, quartz, some standards supplied by NIST USA, and ceramic oxides synthesized in our laboratory. Each sample was measured at 4 kW for 1-2 min over 2θ from 0 to 100°. The results were compared with published references. The following desirable features are noted for the instrumental combination studied: production of high quality X-ray data at a very fast rate, very accurate intensity measurements, sharp diffraction patterns due to small instrumental broadening and a pure monochromatic beam, and small position errors for 2θ from 4 to 80°. There is no evidence for extra line broadening by the IP camera detector setup. It was found that the relatively high instrumental background can be easily dealt with and does not pose difficulty in the analysis of the data. However, fluorescence cannot be filtered.

Gal, Joseph; Mogilanski, Dmitry; Nippus, Michael; Zabicky, Jacob; Kimmel, Giora

2005-10-01

327

Using different interpolation techniques in unwrapping the distorted images from panoramic annular lens camera  

NASA Astrophysics Data System (ADS)

The camera using a panoramic annular lens (PAL) can capture the surrounding scene in a 360° view without any scanning component. Due to severe distortions, the image formed by the PAL must be unwrapped into a perspective-view image to be consistent with human visual convention. However, unfilled pixels may remain after unwrapping as a result of the non-uniform resolution of the PAL image; hence interpolation should be employed during forward-projection unwrapping. We evaluated the performance of several interpolation techniques for unwrapping the PAL image on a series of frequency-patterned images as a simulation, using three image quality indexes: MSE, SSIM and S-CIELAB. The experimental results revealed that all of the interpolation methods performed better on low-frequency PAL images. The Bicubic, Ferguson and Newton interpolations performed relatively better at higher frequencies, while Bilinear and Bezier achieved better results at lower frequencies. The Nearest method had the poorest performance overall, and the Ferguson interpolation was excellent at both high and low frequencies.
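The unwrapping described above maps each pixel of the output panorama back to polar coordinates in the annular image and interpolates there. A minimal bilinear version is sketched below; the annulus center, ring radii, and output size are hypothetical parameters, and real PAL calibration would replace the ideal polar model.

```python
import numpy as np

def unwrap_pal(img, center, r_in, r_out, out_w=720, out_h=128):
    """Unwrap a panoramic annular (PAL) image into a rectangular
    panorama using bilinear interpolation.

    center       -- (row, col) of the annulus center in `img`
    r_in, r_out  -- inner and outer ring radii in pixels
    out_w, out_h -- panorama width (azimuth) and height (radius)
    """
    cy, cx = center
    theta = np.arange(out_w) * 2.0 * np.pi / out_w
    r = r_in + np.arange(out_h) * (r_out - r_in) / (out_h - 1)
    rr, tt = np.meshgrid(r, theta, indexing="ij")   # (out_h, out_w)
    xs = cx + rr * np.cos(tt)                        # source columns
    ys = cy + rr * np.sin(tt)                        # source rows
    x0 = np.clip(np.floor(xs).astype(int), 0, img.shape[1] - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, img.shape[0] - 2)
    fx, fy = xs - x0, ys - y0
    # bilinear blend of the four neighboring source pixels
    return (img[y0, x0] * (1 - fx) * (1 - fy)
            + img[y0, x0 + 1] * fx * (1 - fy)
            + img[y0 + 1, x0] * (1 - fx) * fy
            + img[y0 + 1, x0 + 1] * fx * fy)
```

Swapping the bilinear blend for a bicubic or other kernel at the same sample coordinates is how the interpolation comparison in the abstract would be carried out.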

Yu, Guo; Fu, Lingjin; Bai, Jian

2010-11-01

328

A 3D HIDAC-PET camera with sub-millimetre resolution for imaging small animals  

Microsoft Academic Search

A HIDAC-PET camera consisting essentially of 5 million 0.5 mm gas avalanching detectors has been constructed for small-animal imaging. The particular HIDAC advantage, a high 3D spatial resolution, has been improved to 0.95 mm FWHM, and to 0.7 mm FWHM when reconstructing with 3D-OSEM methods incorporating resolution recovery. A depth-of-interaction resolution of 2.5 mm is implicit, due to the laminar construction. Scatter-corrected

A. P. Jeavons; R. A. Chandler; C. A. R. Dettmar

1999-01-01

329

New Mars Camera's First Image of Mars from Mapping Orbit (Full Frame)  

NASA Technical Reports Server (NTRS)

The high resolution camera on NASA's Mars Reconnaissance Orbiter captured its first image of Mars in the mapping orbit, demonstrating the full resolution capability, on Sept. 29, 2006. The High Resolution Imaging Science Experiment (HiRISE) acquired this first image at 8:16 AM (Pacific Time). With the spacecraft at an altitude of 280 kilometers (174 miles), the image scale is 25 centimeters per pixel (10 inches per pixel). If a person were located on this part of Mars, he or she would just barely be visible in this image.

The image covers a small portion of the floor of Ius Chasma, one branch of the giant Valles Marineris system of canyons. The image illustrates a variety of processes that have shaped the Martian surface. There are bedrock exposures of layered materials, which could be sedimentary rocks deposited in water or from the air. Some of the bedrock has been faulted and folded, perhaps as the result of large-scale forces in the crust or of a giant landslide. The image resolves rocks as small as 90 centimeters (3 feet) in diameter. It includes many dunes or ridges of windblown sand.

This image (TRA_000823_1720) was taken by the High Resolution Imaging Science Experiment camera onboard the Mars Reconnaissance Orbiter spacecraft on Sept. 29, 2006. Shown here is the full image, centered at minus 7.8 degrees latitude, 279.5 degrees east longitude. The image is oriented such that north is to the top. The range to the target site was 297 kilometers (185.6 miles). At this distance the image scale is 25 centimeters (10 inches) per pixel (with one-by-one binning) so objects about 75 centimeters (30 inches) across are resolved. The image was taken at a local Mars time of 3:30 PM and the scene is illuminated from the west with a solar incidence angle of 59.7 degrees, thus the sun was about 30.3 degrees above the horizon. The season on Mars is northern winter, southern summer.

[Photojournal note: Due to the large sizes of the high-resolution TIFF and JPEG files, some systems may experience extremely slow downlink time while viewing or downloading these images; some systems may be incapable of handling the download entirely.]

NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, is the prime contractor for the project and built the spacecraft. The HiRISE camera was built by Ball Aerospace & Technologies Corporation, Boulder, Colo., and is operated by the University of Arizona, Tucson.

2006-01-01

330

High-resolution imaging of the Pluto-Charon system with the Faint Object Camera of the Hubble Space Telescope  

NASA Technical Reports Server (NTRS)

Images of the Pluto-Charon system were obtained with the Faint Object Camera (FOC) of the Hubble Space Telescope (HST) after the refurbishment of the telescope. The images are of superb quality, allowing the determination of radii, fluxes, and albedos. Attempts were made to improve the resolution of the already diffraction limited images by image restoration. These yielded indications of surface albedo distributions qualitatively consistent with models derived from observations of Pluto-Charon mutual eclipses.

Albrecht, R.; Barbieri, C.; Adorf, H.-M.; Corrain, G.; Gemmo, A.; Greenfield, P.; Hainaut, O.; Hook, R. N.; Tholen, D. J.; Blades, J. C.

1994-01-01

331

Real-time image processing and fusion for a new high-speed dual-band infrared camera  

NASA Astrophysics Data System (ADS)

A dual-band infrared camera system based on a dual-band quantum well infrared photodetector (QWIP) has been developed for acquiring images in both the mid-wavelength (MWIR) and long-wavelength (LWIR) infrared spectral bands. The system delivers exactly pixel-registered, simultaneously acquired images. It has the advantage that appropriate signal and image processing can exploit differences in the characteristics of those bands. Thus, the camera reveals more information than a single-band camera. It helps distinguish between targets and decoys and has the ability to defeat many IR countermeasures such as smoke, camouflage and flares. Furthermore, the system makes it possible to identify materials (e.g. glass, asphalt, slate, etc.), to distinguish sun reflections from hot objects, and to visualize hot exhaust gases. Dedicated software for processing and exploitation in real time extends the application domain of the camera system. One component corrects the images and allows for overlays with complementary colors such that differences become apparent. Another software component provides robust estimation of the transformation parameters between consecutive images in the image stream for registration purposes. This feature stabilizes the images even under rugged conditions, and it allows for the automatic stitching of the image stream to construct large mosaic images. Mosaic images facilitate the inspection of large objects and scenarios and create a better overview for human observers. In addition, image-based MTI (moving target indication), including the case of a moving camera, is under development. This component aims at surveillance applications and could also be used for camouflage assessment of moving targets.
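The complementary-color overlay mentioned above can be sketched as a simple false-color mapping: one band drives the red channel and the other the green and blue channels, so pixels where the bands agree appear gray and band differences take on red or cyan tints. This is an illustrative assumption, not the system's actual fusion algorithm.

```python
import numpy as np

def complementary_overlay(mwir, lwir):
    """Fuse pixel-registered MWIR/LWIR frames into an RGB overlay:
    MWIR -> red, LWIR -> cyan (green + blue).  Equal normalized
    signals render gray; differences render as red or cyan."""
    def norm(a):
        a = np.asarray(a, dtype=float)
        lo, hi = a.min(), a.max()
        return (a - lo) / (hi - lo) if hi > lo else np.zeros_like(a)
    m, c = norm(mwir), norm(lwir)
    return np.dstack([m, c, c])   # shape (H, W, 3), values in [0, 1]
```

Because the two channels are exactly pixel-registered, no warping step is needed before the overlay; with unregistered sensors a registration transform would have to be applied first.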

Müller, Markus; Schreer, Oliver; López Sáenz, Monica

2007-04-01

332

Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects  

PubMed Central

3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture and processing time when documenting large and complex sites. Moreover, such a minimal camera network design is desirable when imaging non-digitally documented artifacts in museums and other archeological sites, to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue “Lamassu”. Lamassu is a human-headed winged bull over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task, where a dense ordered imaging network of 45 high-resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling while preserving a pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718

Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

2014-01-01

333

Efficient Smart CMOS Camera Based on FPGAs Oriented to Embedded Image Processing  

PubMed Central

This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families make it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal Power PC (PPC) microprocessor. In turn, this contains a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID:22163739

Bravo, Ignacio; Baliñas, Javier; Gardel, Alfredo; Lázaro, José L.; Espinosa, Felipe; García, Jorge

2011-01-01

334

Efficient smart CMOS camera based on FPGAs oriented to embedded image processing.  

PubMed

This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families make it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal Power PC (PPC) microprocessor. In turn, this contains a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID:22163739

Bravo, Ignacio; Baliñas, Javier; Gardel, Alfredo; Lázaro, José L; Espinosa, Felipe; García, Jorge

2011-01-01

335

Spectral imaging with a CID camera. Final report, 4 February 1982-14 September 1984  

SciTech Connect

This report describes a program of spectral imaging observations of the solar atmosphere using the Sacramento Peak Vacuum Tower Telescope. The observations were obtained with Lockheed instruments including: an active tilt mirror for image motion compensation; a polarization analyzer; a narrowband tunable birefringent filter; a photoelectric CID array camera; a digital video image processor; and a microcomputer controller. Five observing runs were made, three of them with the entire system in operation. The images obtained were processed to measure magnetic and velocity fields in the solar photosphere with very high spatial resolution (0.5 arcseconds on the best frames). Sets of these images have been studied to address three scientific problems: (1) the relationship among small magnetic flux tubes, downdrafts and granulation; (2) the puzzling flux changes in isolated magnetic features in a decaying active region; (3) the temporal power spectrum of the magnetogram signal in isolated flux tubes. Examples of the narrowband images are included in the report, along with abstracts and manuscripts of papers resulting from this research.

Tarbell, T.D.

1985-03-22

336

A camera for imaging hard x-rays from suprathermal electrons during lower hybrid current drive on PBX-M  

SciTech Connect

During lower hybrid current drive (LHCD), suprathermal electrons are generated that emit hard X-ray bremsstrahlung. A pinhole camera has been installed on the PBX-M tokamak that records 128 × 128 pixel images of the bremsstrahlung with a 3 ms time resolution. This camera has identified hollow radiation profiles on PBX-M, indicating off-axis current drive. The detector is a 9 in. dia. intensifier. A detailed account of the construction of the Hard X-ray Camera, its operation, and its performance is given.

von Goeler, S.; Kaita, R.; Bernabei, S.; Davis, W.; Fishman, H.; Gettelfinger, G.; Ignat, D.; Roney, P.; Stevens, J.; Stodiek, W. [Princeton Univ., NJ (United States). Plasma Physics Lab.; Jones, S.; Paoletti, F. [Massachusetts Inst. of Tech., Cambridge, MA (United States). Plasma Fusion Center; Petravich, G. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics; Rimini, F. [JET Joint Undertaking, Abingdon (United Kingdom)

1993-05-01

337

A camera for imaging hard x-rays from suprathermal electrons during lower hybrid current drive on PBX-M  

SciTech Connect

During lower hybrid current drive (LHCD), suprathermal electrons are generated that emit hard X-ray bremsstrahlung. A pinhole camera has been installed on the PBX-M tokamak that records 128 × 128 pixel images of the bremsstrahlung with a 3 ms time resolution. This camera has identified hollow radiation profiles on PBX-M, indicating off-axis current drive. The detector is a 9 in. dia. intensifier. A detailed account of the construction of the Hard X-ray Camera, its operation, and its performance is given.

von Goeler, S.; Kaita, R.; Bernabei, S.; Davis, W.; Fishman, H.; Gettelfinger, G.; Ignat, D.; Roney, P.; Stevens, J.; Stodiek, W. (Princeton Univ., NJ (United States). Plasma Physics Lab.); Jones, S.; Paoletti, F. (Massachusetts Inst. of Tech., Cambridge, MA (United States). Plasma Fusion Center); Petravich, G. (Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics); Rimini,

1993-05-01

338

Monitoring Phenology and Transpiration. Since Spring 2008, three time-lapse digital cameras have been recording daily images within the  

E-print Network

Since Spring 2008, three time-lapse digital cameras have been recording daily images within the footprint of the Santa Rita Creosote (SRC) FLUXNET Site eddy covariance tower. The SRC site (UTM: 0515177, 3530284) was established in February 2008 and is situated just south of Tucson, AZ, within the Santa Rita

Papuga, Shirley A.

339

Image Set-based Hand Shape Recognition Using Camera Selection Driven by Multi-class AdaBoosting  

E-print Network

We propose a method for image set-based hand shape recognition that uses camera selection driven by the multi-class AdaBoost framework. The recognition of hand shape is a difficult problem, as a hand's appearance depends greatly

Fukui, Kazuhiro

340

Real time plume and laser spot recognition in IR camera images  

SciTech Connect

It is desirable to automatically guide the laser spot onto the effluent plume for maximum IR DIAL system sensitivity. This requires the use of a 2D focal plane array. The authors have demonstrated that a wavelength-filtered IR camera is capable of 2D imaging of both the plume and the laser spot. In order to identify the centers of the plume and the laser spot, it is first necessary to segment these features from the background. They report a demonstration of real time plume segmentation based on velocity estimation. They also present results of laser spot segmentation using simple thresholding. Finally, they describe current research on both advanced segmentation and recognition algorithms and on reconfigurable real time image processing hardware based on field programmable gate array technology.
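The laser-spot step described above uses simple thresholding. As a minimal illustration (hypothetical function names, pure Python, not the authors' implementation), threshold segmentation followed by an intensity-weighted centroid is one common way to locate a spot center:

```python
def segment_and_centroid(image, threshold):
    """Segment pixels at or above `threshold` and return the
    intensity-weighted centroid (row, col), or None if nothing
    exceeds the threshold."""
    total = 0.0
    r_sum = 0.0
    c_sum = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= threshold:
                total += v
                r_sum += r * v
                c_sum += c * v
    if total == 0:
        return None
    return (r_sum / total, c_sum / total)

# A synthetic 5x5 frame with a single bright "spot" at (2, 3):
frame = [[0] * 5 for _ in range(5)]
frame[2][3] = 100
print(segment_and_centroid(frame, 50))  # -> (2.0, 3.0)
```

In a real system the threshold would be chosen from the background statistics of the wavelength-filtered IR frames.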

Moore, K.R.; Caffrey, M.P.; Nemzek, R.J.; Salazar, A.A.; Jeffs, J. [Los Alamos National Lab., NM (United States); Andes, D.K.; Witham, J.C. [Naval Air Weapons Station, China Lake, CA (United States)

1997-08-01

341

High resolution Vesta High Altitude Mapping Orbit (HAMO) Atlas derived from Dawn framing camera images  

NASA Astrophysics Data System (ADS)

The Dawn framing camera (FC) acquired about 2500 clear-filter images of Vesta at a resolution of about 70 m/pixel during the High Altitude Mapping Orbit (HAMO) in fall 2011. We ortho-rectified these images and produced a global high-resolution controlled mosaic of Vesta. This global mosaic is the baseline for a high-resolution Vesta atlas that consists of 15 tiles mapped at a scale of 1:500,000. The nomenclature used in this atlas was proposed by the Dawn team and approved by the International Astronomical Union (IAU). The whole atlas is available to the public through the Dawn GIS web page [http://dawn_gis.dlr.de/atlas].

Roatsch, Th.; Kersten, E.; Matz, K.-D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

2012-12-01

342

Efficient high-dimensional entanglement imaging with a compressive sensing, double-pixel camera  

E-print Network

We implement a double-pixel, compressive sensing camera to efficiently characterize, at high resolution, the spatially entangled fields produced by spontaneous parametric downconversion. This technique leverages sparsity in spatial correlations between entangled photons to improve acquisition times over raster-scanning by a scaling factor up to n^2/log(n) for n-dimensional images. We image at resolutions up to 1024 dimensions per detector and demonstrate a channel capacity of 8.4 bits per photon. By comparing the classical mutual information in conjugate bases, we violate an entropic Einstein-Podolsky-Rosen separability criterion for all measured resolutions. More broadly, our result indicates compressive sensing can be especially effective for higher-order measurements on correlated systems.
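The quoted acquisition-time advantage of up to n^2/log(n) over raster scanning can be made concrete with a toy calculation (illustrative only; constants and the exact measurement model are omitted):

```python
import math

def cs_speedup(n):
    """Illustrative scaling factor n^2 / log(n) quoted in the abstract
    for the acquisition-time advantage of compressive sensing over
    raster scanning of n-dimensional images."""
    return n * n / math.log(n)

# The advantage grows rapidly with resolution:
for n in (256, 1024):
    print(n, round(cs_speedup(n)))
```

At the paper's maximum resolution of 1024 dimensions per detector the nominal factor is on the order of 10^5, which is why sparsity makes high-dimensional correlation measurements tractable.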

Gregory A. Howland; John C. Howell

2012-12-21

343

Automatic camera-based identification and 3-D reconstruction of electrode positions in electrocardiographic imaging.  

PubMed

Electrocardiographic imaging (ECG imaging) is a method to depict electrophysiological processes in the heart. It is an emerging technology with the potential of making the therapy of cardiac arrhythmia less invasive, less expensive, and more precise. A major challenge for integrating the method into clinical workflow is the seamless and correct identification and localization of electrodes on the thorax and their assignment to recorded channels. This work proposes a camera-based system which can localize all electrode positions at once, to an accuracy of approximately 1 ± 1 mm. A system for automatic identification of individual electrodes is implemented that overcomes the need for manual annotation. For this purpose, a system of markers is suggested which facilitates precise localization to subpixel accuracy and robust identification using an error-correcting code. The accuracy of the presented system in identifying and localizing electrodes is validated in a phantom study. Its overall capability is demonstrated in a clinical scenario. PMID:25229412

Schulze, Walther H W; Mackens, Patrick; Potyagaylo, Danila; Rhode, Kawal; Tülümen, Erol; Schimpf, Rainer; Papavassiliu, Theano; Borggrefe, Martin; Dössel, Olaf

2014-12-01

344

Crack propagation imaging by the ISIS camera and a video trigger system  

NASA Astrophysics Data System (ADS)

An ultra-high-speed camera operating at 1 Mfps was applied to visualize crack propagation. The change of the stress field around the propagating crack tip was captured as a change in the fringe pattern by means of the photo-elastic imaging technique. A newly developed video trigger system was employed to detect the occurrence of crack propagation and trigger the experiment. The trigger reliably perceived the initiation of crack propagation, and its response time was fast enough even for image capture at 1 Mfps. As a result, it is revealed that the elastic wave propagating in the continuous body has a significant effect on the velocity and kinking behavior of the propagating crack.

Okinaka, Tomoo; Karimov, Pavel; Etoh, Takeharu; Oguni, Kenji

2007-01-01

345

Geometric calibration of multi-sensor image fusion system with thermal infrared and low-light camera  

NASA Astrophysics Data System (ADS)

A calibration platform for the geometric calibration of a multi-sensor image fusion system is presented in this paper. Accurate geometric calibration of the extrinsic parameters of the cameras is performed using a planar calibration pattern, and dedicated software was developed for the procedure. The patterns used in the geometric calibration are designed to give maximum contrast in both the visible and infrared spectral ranges: chessboards whose fields are made of materials of different emissivity. Experiments were held in both indoor and outdoor scenarios. The key results of the geometric calibration are the extrinsic parameters, in the form of homography matrices that transform the object plane to the image plane. A corresponding homography matrix is calculated for each camera; these matrices can be used for registration of images from the thermal and low-light cameras. We implemented such an image registration algorithm to confirm the accuracy of the geometric calibration procedure, and results are given for the selected patterns (chessboards with fields of different emissivity materials). For the final image registration algorithm in a surveillance system for object tracking, we chose a multi-resolution registration algorithm, which combines naturally with a pyramidal fusion scheme: the image pyramids generated at each time step of the registration algorithm can be reused at the fusion stage, so the overall number of calculations is greatly reduced.
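The homography transformation mentioned above maps object-plane points to image-plane points with a 3×3 matrix. A minimal sketch (hypothetical function name, pure Python, homogeneous coordinates):

```python
def apply_homography(H, x, y):
    """Map an object-plane point (x, y) to the image plane using a
    3x3 homography matrix H (row-major nested lists)."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xs / w, ys / w)  # perspective divide

# The identity homography leaves points unchanged:
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(I, 3.0, 4.0))  # -> (3.0, 4.0)

# A pure translation by (10, 5):
T = [[1, 0, 10], [0, 1, 5], [0, 0, 1]]
print(apply_homography(T, 3.0, 4.0))  # -> (13.0, 9.0)
```

In practice the per-camera matrices estimated during calibration would be applied to every pixel (or via an inverse warp) to register the thermal and low-light images.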

Peric, Dragana; Lukic, Vojislav; Spanovic, Milana; Sekulic, Radmila; Kocic, Jelena

2014-10-01

346

Integrated camera motion compensation by real-time image motion tracking and image deconvolution  

Microsoft Academic Search

This paper presents the concept of a smart satellite pushbroom imaging system with internal compensation of attitude instability effects. The compensation is performed within the optical path by an active opto-mechatronic stabilization of the focal plane image motion in a closed loop system with visual feedback. Residual distortions are corrected by image deblurring through deconvolution. Both corrective actions are derived

Klaus Janschek; Valerij Tchernykh; Serguei Dyblenko

2005-01-01

347

Accidental Pinhole and Pinspeck Cameras  

E-print Network

We identify and study two types of “accidental” images that can be formed in scenes. The first is an accidental pinhole camera image. The second class of accidental images are “inverse” pinhole camera images, formed by ...

Torralba, Antonio

348

A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.  

PubMed

The generation of photo-realistic 3D models is an important task for the digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline that takes advantage of a multi-camera configuration and a multi-image matching technique and does not require any markers on or around the object. Multiple digital single-lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters remain the same as the calibrated results, even when the target has changed. Based on this invariant character, the whole 3D modeling pipeline can be performed completely automatically once the system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. Imaged objects included a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to the large quantity of antiques stored in museums. PMID:23112656

Rau, Jiann-Yeou; Yeh, Po-Chia

2012-01-01

349

Experimental comparison of the high-speed imaging performance of an EM-CCD and sCMOS camera in a dynamic live-cell imaging test case.  

PubMed

The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178

Beier, Hope T; Ibey, Bennett L

2014-01-01

350

Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor  

NASA Astrophysics Data System (ADS)

We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera with fast auto-exposure control and colour filter array (CFA) demosaicing. The proposed hardware architecture includes the design of the charge-coupled device (CCD) drive circuits, the image processing circuits, and the power supply circuits. The CCD drive circuits convert the TTL (transistor-transistor logic) level timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor outputs analog image signals. The image processing circuits convert the analog signals to digital signals for subsequent processing; TTL timing, auto-exposure control, CFA demosaicing, and gamma correction are accomplished in this module. The power supply circuits provide power for the whole system, which is very important for image quality: power supply noise affects image quality directly, and we reduce it in hardware, which is very effective. The CCD in this system is a KAI-0340, which can output 210 full-resolution frames per second, and our camera works well in this mode. Because traditional auto-exposure control algorithms are slow to reach a proper exposure level, a fast auto-exposure control method is necessary, and we present a new auto-exposure algorithm suited to a high-speed camera. Colour demosaicing is critical for digital cameras because it converts the Bayer sensor mosaic output to a full-colour image, which determines the output image quality of the camera. Complex algorithms achieve high quality but cannot be implemented in hardware, so a low-complexity demosaicing method is presented that can be implemented in hardware while satisfying the quality requirements. Experimental results are given at the end of the paper.
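The abstract does not specify its fast auto-exposure algorithm. As a hedged sketch of the general idea, a simple multiplicative controller that scales exposure toward a target mean level converges in a few frames; all names, parameters, and the linear sensor model below are hypothetical:

```python
def auto_exposure_step(mean_level, target, exposure, gain=0.8,
                       min_exp=1e-5, max_exp=1e-1):
    """One iteration of a multiplicative auto-exposure update:
    scale exposure by (target / measured mean), damped by `gain`
    and clamped to the exposure range. Illustrative sketch only."""
    if mean_level <= 0:
        return max_exp
    ratio = target / mean_level
    new_exp = exposure * (ratio ** gain)
    return max(min_exp, min(max_exp, new_exp))

# Toy linear sensor: mean grey level = scene * exposure * 1e3
exposure = 1e-3
scene = 200.0
for _ in range(6):
    mean = scene * exposure * 1e3
    exposure = auto_exposure_step(mean, target=128.0, exposure=exposure)
print(round(scene * exposure * 1e3, 2))  # close to the target of 128
```

A multiplicative update is attractive in hardware because it needs only one measured statistic per frame and no iterative search over exposure settings.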

Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

2013-08-01

351

Performance assessment of a slat gamma camera collimator for 511 keV imaging  

NASA Astrophysics Data System (ADS)

The physical performance of a prototype slat collimator is described for gamma camera planar imaging at 511 keV. Measurements were made of sensitivity, spatial resolution and a septal penetration index at 511 keV. These measurements were repeated with a commercial parallel hole collimator designed for 511 keV imaging. The slat collimator sensitivity was 22.9 times that of the parallel hole collimator with 10 cm tissue equivalent scatter material, and 16.8 times the parallel hole collimator sensitivity in air. Spatial resolution was also better for the slat collimator than the parallel hole collimator (FWHM at 10 cm in air 17.9 mm and 21.2 mm respectively). Septal penetration was compared by a single value for the counts at 120 mm from the point source profile peak, expressed as a percentage of the peak counts, showing less penetration for the slat collimator than the parallel hole collimator (1.9% versus 3.6% respectively). In conclusion, these results show that the slat collimator may have advantages over the parallel hole collimator for 511 keV imaging, though the greater complexity of operation of the slat collimator and potential sources of artefact in slat collimator imaging are recognized.

Britten, Alan J.; Klie, Robert

1999-07-01

352

Imaging the Jovian Magnetosphere in Energetic Neutral Atoms with the Cassini/Huygens Magnetospheric Imaging Instrument Ion and Neutral Camera  

NASA Astrophysics Data System (ADS)

During the flyby of Jupiter on its cruise to Saturn, the Cassini/Huygens spacecraft collected unique data with a broad assortment of instruments. The Magnetospheric Imaging Instrument (MIMI) measured in-situ energetic ions and electrons, performed charge and mass analysis of both interstellar pickup ions and Jovian source ions, and remotely imaged the Jovian magnetosphere in energetic neutral atom (ENA) emissions. The ENA images were measured with the Ion and Neutral Camera (INCA) [Mitchell et al., 1998]. INCA images ENA in the velocity range from about 1200 km/s to 8000 km/s, which for hydrogen corresponds to about 8 keV to about 350 keV. Jupiter, as expected based on Voyager measurements [Kirsch et al., 1981; Cheng 1986], is a weak ENA source. The expected INCA response was modeled by Mauk et al., 1998. The INCA sensor was run in a mode for which images were obtained only below about 80 keV/nucleon. The images obtained are quite close to the predictions of Mauk et al., both in conformation and in intensity. They are consistent with the source strength of the Jovian magnetosphere remaining essentially constant between 1979 and 2001. Rough inferences as to the composition of the ENA show that there is a mix of hydrogen as well as heavier atoms, most probably oxygen and sulfur. These first-ever images of Jupiter's magnetosphere provide a taste of what we can look forward to at Saturn, although the distance of this flyby (140 Rj) allows only hints of the magnetospheric structure to be resolved with this instrument, which is designed for a closer vantage point. Mitchell, D. G., A. F. Cheng, K. C. Hsieh, S. M. Krimigis, E. P. Keath, S. E. Jaskulek, B. H. Mauk, R. W. McEntire, E. C. Roelof, C. E. Schlemm, B. E. Tossman, and D. J. Williams, The ion neutral camera for the Cassini Mission to Saturn and Titan, Measurement Techniques in Space Plasmas, Fields, R. F. Pfaff, J. E. Borovsky, and D. T. Young, ed., Geophysical Monograph 103, p281, AGU, 1998. Kirsch, E., S. M. 
Krimigis, J. Johl, and E. Keath, Upper limits for X-ray and energetic neutral particle emission from Jupiter: Voyager 1 results, Geophys. Res. Lett., 8, 169, 1981. Mauk, B.H., S.M. Krimigis, D.G. Mitchell, and E.C. Roelof, Energetic neutral atom imaging of Jupiter's magnetosphere using the Cassini MIMI instrument, Adv. Space Res., 21, 1483, 1998.

Mitchell, D. G.; Krimigis, S. M.; Mauk, B. H.

2001-05-01

353

A Novel Method of Object Detection from a Moving Camera Based on Image Matching and Frame Coupling  

PubMed Central

A new method based on image matching and frame coupling to handle the problems of object detection caused by a moving camera and object motion is presented in this paper. First, feature points are extracted from each frame. Then, motion parameters can be obtained. Sub-images are extracted from the corresponding frame via these motion parameters. Furthermore, a novel searching method for potential orientations improves efficiency and accuracy. Finally, a method based on frame coupling is adopted, which improves the accuracy of object detection. The results demonstrate the effectiveness and feasibility of our proposed method for a moving object with changing posture and with a moving camera. PMID:25354301

Chen, Yong; Zhang, Rong hua; Shang, Lei

2014-01-01

354

Space-bandwidth extension in parallel phase-shifting digital holography using a four-channel polarization-imaging camera.  

PubMed

We propose a method for extending the space bandwidth (SBW) available for recording an object wave in parallel phase-shifting digital holography using a four-channel polarization-imaging camera. A linear spatial carrier of the reference wave is introduced to an optical setup of parallel four-step phase-shifting interferometry using a commercially available polarization-imaging camera that has four polarization-detection channels. Then a hologram required for parallel two-step phase shifting, which is a technique capable of recording the widest SBW in parallel phase shifting, can be obtained. The effectiveness of the proposed method was numerically and experimentally verified. PMID:23939081

Tahara, Tatsuki; Ito, Yasunori; Xia, Peng; Awatsuji, Yasuhiro; Nishio, Kenzo; Ura, Shogo; Kubota, Toshihiro; Matoba, Osamu

2013-07-15

355

TRIDENT: an Infrared Differential Imaging Camera Optimized for the Detection of Methanated Substellar Companions  

E-print Network

A near-infrared camera in use at the Canada-France-Hawaii Telescope (CFHT) and at the 1.6-m telescope of the Observatoire du Mont-Megantic is described. The camera is based on a Hawaii-1 1024x1024 HgCdTe array detector. Its main feature is to acquire three simultaneous images at three wavelengths across the methane absorption bandhead at 1.6 microns, enabling, in theory, an accurate subtraction of the stellar point spread function (PSF) and the detection of faint close methanated companions. The instrument has no coronagraph and features fast data acquisition, yielding high observing efficiency on bright stars. The performance of the instrument is described, and it is illustrated by laboratory tests and CFHT observations of the nearby stars GL526, Ups And and Chi And. TRIDENT can detect (6 sigma) a methanated companion with delta H = 9.5 at 0.5" separation from the star in one hour of observing time. Non-common path aberrations and amplitude modulation differences between the three optical paths are likely to be the limiting factors preventing further PSF attenuation. Instrument rotation and reference star subtraction improve the detection limit by a factor of 2 and 4 respectively. A PSF noise attenuation model is presented to estimate the non-common path wavefront difference effect on PSF subtraction performance.

C. Marois; R. Doyon; D. Nadeau; R. Racine; M. Riopel; P. Vallee; D. Lafreniere; .

2005-04-28

356

Experimental analysis of scattered photons in Tc-99m imaging with a gamma camera.  

PubMed

The amount of scattered photons in a clinical imaging window of Tc-99m was experimentally measured by means of a line source with scattering materials and a gamma camera. A symmetrical photopeak energy window centered at 140 keV with a width of 20% (126-154 keV) was partitioned into several small windows. Energy spectra were analyzed to determine the scatter fraction and the attenuation coefficient for each window. Line spread functions (LSF) were also obtained to characterize the spatial scatter distribution. The analysis of the energy spectra shows that scattered photons are present across the symmetric 20% window (SW) and that scatter fractions increase linearly with the thickness of the scattering material in all energy windows investigated. In addition, the results for the LSF show that the scatter distribution within the SW is represented by a mono-exponential function. Experimental measurements with a phantom and a gamma camera thus allow accurate quantification of scattered photons. Such quantitative analysis of scattered photons is important in developing and evaluating a scatter correction technique. PMID:1797068
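The mono-exponential scatter model reported for the LSF can be sketched as follows; the amplitude, slope, and primary-count normalization are hypothetical values for illustration, not the paper's fitted parameters:

```python
import math

def scatter_lsf(x, amplitude, slope):
    """Mono-exponential model of the scatter component of the line
    spread function: s(x) = A * exp(-b * |x|)."""
    return amplitude * math.exp(-slope * abs(x))

def scatter_fraction(amplitude, slope, half_width, primary=100.0):
    """Scatter fraction = scatter / (scatter + primary) over the
    window [-half_width, half_width], using the closed-form integral
    of the mono-exponential: 2A/b * (1 - exp(-b*w))."""
    scatter = 2 * amplitude / slope * (1 - math.exp(-slope * half_width))
    return scatter / (scatter + primary)

print(scatter_lsf(0.0, 1.0, 0.2))                    # peak value -> 1.0
print(round(scatter_fraction(1.0, 0.2, 20.0), 3))    # -> 0.089
```

Fitting `amplitude` and `slope` to measured tails for each thickness of scattering material would reproduce the reported linear growth of the scatter fraction.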

Kojima, A; Matsumoto, M; Takahashi, M

1991-11-01

357

Accuracy of the quantification of organ activity from planar gamma camera images.  

PubMed

The accuracy in determination of organ activity of (99m)Tc was investigated, with activity estimated from gamma camera images of phantoms using the conjugate view method. The accuracy depends on several parameters, such as the choice of background correction method, the accuracy in determination of the effective attenuation coefficient and of the thickness of the body and organs, and the determination of the gamma camera sensitivity. The background correction method has a major influence on the quantification of the activity; methods which take the volume of the source organ into consideration are recommended. The discrepancy in the determined organ activity varied between an underestimation of 26% and an overestimation of 16% in the MIRD phantom, depending on which organ was studied and on the correction method used. To correct for absorption and scattering, an effective attenuation coefficient was used. A theoretical analysis showed that a change in the effective attenuation coefficient of 0.01 cm(-1) resulted in a 15% change in the calculated activity. The thickness of the body and of the organ of interest also influences the calculated activity: a 2 cm deviation in the body thickness causes a deviation of approximately 10% in the calculated activity. The quantification is improved if the attenuation coefficient is determined by transmission measurements. PMID:12667315
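The conjugate view method referred to above is commonly written as the geometric mean of anterior and posterior counts with an attenuation correction. A sketch of the standard formulation (not necessarily the paper's exact variant; the numbers are made up, and the quoted ~15% sensitivity corresponds to a thicker body than the 20 cm used here):

```python
import math

def conjugate_view_activity(i_ant, i_post, mu_eff, thickness, sensitivity):
    """Geometric-mean (conjugate view) estimate of organ activity:
        A = sqrt(I_ant * I_post) * exp(mu_eff * T / 2) / sensitivity
    with counts/s, cm^-1, cm, and cps/MBq giving MBq."""
    return (math.sqrt(i_ant * i_post)
            * math.exp(mu_eff * thickness / 2) / sensitivity)

# Sensitivity of the estimate to the effective attenuation coefficient:
a1 = conjugate_view_activity(800, 600, 0.12, 20.0, 100.0)
a2 = conjugate_view_activity(800, 600, 0.13, 20.0, 100.0)
print(round((a2 / a1 - 1) * 100, 1))  # -> 10.5 (% change for 0.01 cm^-1)
```

The exponential correction is why the method is so sensitive to mu_eff and body thickness: the relative error scales as exp(Δmu·T/2).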

Norrgren, Kristina; Svegborn, Sigrid Leide; Areberg, Johan; Mattsson, Sören

2003-02-01

358

An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction  

SciTech Connect

Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. 
The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly parallel to the image plane. This effect decreases the sum of the image, thereby also affecting the mean, standard deviation, and SNR of the image. All back-projected events associated with a simulated point source intersected the voxel containing the source and the FWHM of the back-projected image was similar to that obtained from the marching method. Conclusions: The slight deficit to image quality observed with the threshold-based back-projection algorithm described here is outweighed by the 75% reduction in computation time. The implementation of this method requires the development of an optimum threshold function, which determines the overall accuracy of the method. This makes the algorithm well-suited to applications involving the reconstruction of many large images, where the time invested in threshold development is offset by the decreased image reconstruction time. Implemented in a parallel-computing environment, the threshold-based algorithm has the potential to provide real-time dose verification for radiation therapy.
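The back-projection step described above (intersecting each event's cone with an image plane, then thresholding) can be sketched for a single detector event and a single slice. This is a simplified illustration, not the authors' implementation: an angular distance from the cone surface stands in for their solution-matrix distance, and the threshold is a fixed constant rather than an optimized function:

```python
import math

def backproject_cone_slice(apex, axis, half_angle, grid, z, threshold):
    """Binary back-projection of one Compton cone onto the image plane
    at height z: mark voxels whose angular distance from the cone
    surface is below `threshold` (radians)."""
    ax, ay, az = axis
    n = math.sqrt(ax * ax + ay * ay + az * az)
    ax, ay, az = ax / n, ay / n, az / n
    image = []
    for y in grid:
        row = []
        for x in grid:
            vx, vy, vz = x - apex[0], y - apex[1], z - apex[2]
            vnorm = math.sqrt(vx * vx + vy * vy + vz * vz)
            if vnorm == 0:
                row.append(0)
                continue
            cos_t = max(-1.0, min(1.0, (vx * ax + vy * ay + vz * az) / vnorm))
            # angular distance of this voxel from the cone surface
            row.append(1 if abs(math.acos(cos_t) - half_angle) < threshold
                       else 0)
        image.append(row)
    return image

grid = [i * 0.5 - 5.0 for i in range(21)]      # 21x21 slice, -5..5
img = backproject_cone_slice((0, 0, 0), (0, 0, 1),
                             math.radians(30), grid, 5.0, 0.05)
print(sum(sum(r) for r in img))  # voxels marked on the cone's circle
```

Repeating this for every slice and every event, then summing the binary curves, yields the three-dimensional back-projection image used as the starting point for filtered or iterative reconstruction.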

Mundy, Daniel W.; Herman, Michael G. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota 55905 (United States)

2011-01-15

359

Computer-vision-based weed identification of images acquired by 3CCD camera  

NASA Astrophysics Data System (ADS)

Selective application of herbicide to weeds at an early stage of crop growth is an important aspect of site-specific management of field crops. To develop more adaptive on-line weed detection, researchers have studied image processing techniques for the computationally intensive feature extraction needed to distinguish weeds from crops and the soil background. This paper investigated the potential of digital images acquired by the MegaPlus™ MS3100 3-CCD camera, processed in the Matlab script language, to segment the soil background from the plants in question and further to recognize weeds among the crops. The near-infrared band image (center 800 nm; width 65 nm) was selected principally for segmenting soil, and cotton was distinguished from thistle on the basis of their respective relative areas (pixel counts) in the whole image. The results show adequate recognition: the pixel proportions of soil, cotton leaves, and thistle leaves were 78.24% (-0.20% deviation), 16.66% (+2.71% SD), and 4.68% (-4.19% SD), respectively. However, problems remain in separating and localizing single plants, because they cluster in the images. The information in the images acquired via the other two channels, the green and red bands, needs to be extracted to aid crop/weed discrimination, and more optical specimens should be acquired for calibration and validation to establish a weed-detection model that can be applied effectively in the field.
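The relative-area criterion used above reduces to counting class-labelled pixels after segmentation. A toy sketch (class labels and the label image are hypothetical):

```python
def class_proportions(label_image, classes=("soil", "cotton", "thistle")):
    """Proportion of pixels assigned to each class label in a
    segmented image, mirroring the relative-area criterion."""
    counts = {c: 0 for c in classes}
    total = 0
    for row in label_image:
        for lab in row:
            counts[lab] += 1
            total += 1
    return {c: counts[c] / total for c in classes}

# Toy 4x5 label image: mostly soil, a cotton patch, one thistle pixel.
labels = [
    ["soil"] * 5,
    ["soil"] * 5,
    ["soil", "cotton", "cotton", "cotton", "soil"],
    ["soil", "thistle", "soil", "soil", "soil"],
]
p = class_proportions(labels)
print(p["soil"], p["cotton"], p["thistle"])  # 0.8 0.15 0.05
```

In the paper the labels would come from thresholding the near-infrared band, with the green and red bands available to refine ambiguous pixels.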

Zhang, Yun; He, Yong; Fang, Hui

2006-09-01

360

Evaluation of a CdTe semiconductor based compact gamma camera for sentinel lymph node imaging  

SciTech Connect

Purpose: The authors assembled a prototype compact gamma-ray imaging probe (MediPROBE) for sentinel lymph node (SLN) localization. This probe is based on a semiconductor pixel detector. Its basic performance was assessed in the laboratory and clinically in comparison with a conventional gamma camera. Methods: The room-temperature CdTe pixel detector (1 mm thick) has 256 × 256 square pixels arranged with a 55 μm pitch (sensitive area 14.08 × 14.08 mm²), coupled pixel-by-pixel via bump-bonding to the Medipix2 photon-counting readout CMOS integrated circuit. The imaging probe is equipped with a set of three interchangeable knife-edge pinhole collimators (0.94, 1.2, or 2.1 mm effective diameter at 140 keV) and its focal distance can be regulated in order to set a given field of view (FOV). A typical FOV of 70 mm at 50 mm skin-to-collimator distance corresponds to a minification factor of 1:5. The detector is operated at a single low-energy threshold of about 20 keV. Results: For 99mTc, at 50 mm distance, a background-subtracted sensitivity of 6.5 × 10⁻³ cps/kBq and a system spatial resolution of 5.5 mm FWHM were obtained for the 0.94 mm pinhole; corresponding values for the 2.1 mm pinhole were 3.3 × 10⁻² cps/kBq and 12.6 mm. The dark count rate was 0.71 cps. Clinical images in three patients with melanoma indicate detection of the SLNs with acquisition times between 60 and 410 s with an injected activity of 26 MBq 99mTc and prior localization with standard gamma camera lymphoscintigraphy. Conclusions: The laboratory performance of this imaging probe is limited by the pinhole collimator performance and the necessity of working in minification due to the limited detector size. However, in clinical operative conditions, the CdTe imaging probe was effective in detecting SLNs with adequate resolution and an acceptable sensitivity.
Sensitivity is expected to improve with the future availability of a larger CdTe detector permitting operation at shorter distances from the patient skin.
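The quoted FOV and minification are consistent with simple pinhole geometry. In the sketch below, the 10 mm focal distance is inferred from the stated 1:5 minification at 50 mm; it is not given in the abstract:

```python
def pinhole_fov(detector_side_mm, focal_mm, object_distance_mm):
    """Field of view of a pinhole imager:
    FOV = detector_side * (object_distance / focal_distance).
    The minification factor is focal_distance / object_distance."""
    return detector_side_mm * object_distance_mm / focal_mm

# 14.08 mm detector, 1:5 minification at 50 mm implies ~10 mm focal
# distance and a FOV near the quoted 70 mm:
print(round(pinhole_fov(14.08, 10.0, 50.0), 1))  # -> 70.4
```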

Russo, Paolo; Curion, Assunta S.; Mettivier, Giovanni; Esposito, Michela; Aurilio, Michela; Caraco, Corradina; Aloj, Luigi; Lastoria, Secondo [Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, I-80126 Napoli (Italy) and Istituto Nazionale di Fisica Nucleare, Sezione di Napoli, I-80126 Napoli (Italy); Medicina Nucleare, Istituto Nazionale per lo Studio e la Cura dei Tumori, Fondazione G. Pascale, I-80131 Napoli (Italy)

2011-03-15

361

A Powerful New Imager for HST: Performance and Early Science Results from Wide Field Camera 3  

NASA Technical Reports Server (NTRS)

Wide Field Camera 3 (WFC3) was installed into the Hubble Space Telescope during the highly successful Servicing Mission 4 in May, 2009. WFC3 offers sensitive, high resolution imaging over a broad wavelength range from the near UV through the visible to the near IR (200nm - 1700nm). Its capabilities in the near UV and near IR ends of that range represent particularly large advances vs. those of previous HST instruments. In this talk, I will review the purpose and design of the instrument, describe its performance in flight, and highlight some of the initial scientific results from the instrument, including its use in deep infrared surveys in search of galaxies at very high redshift, in investigations of the global processes of star formation in nearby galaxies, and in the study of the recent impact on Jupiter.

Kimble, Randy

2009-01-01

362

Results of shuttle EMU thermal vacuum tests incorporating an infrared imaging camera data acquisition system  

NASA Technical Reports Server (NTRS)

Manned tests in Chamber B at NASA JSC were conducted in May and June of 1990 to better quantify the Space Shuttle Extravehicular Mobility Unit's (EMU) thermal performance in the cold environmental extremes of space. Use of an infrared imaging camera with real-time video monitoring of the output significantly added to the scope, quality and interpretation of the test conduct and data acquisition. Results of this test program have been effective in the thermal certification of a new insulation configuration and the '5000 Series' glove. In addition, the acceptable thermal performance of flight garments with visually deteriorated insulation was successfully demonstrated, thereby saving significant inspection and garment replacement cost. This test program also established a new method for collecting data vital to improving crew thermal comfort in a cold environment.

Anderson, James E.; Tepper, Edward H.; Trevino, Louis A.

1991-01-01

363

First light observations with TIFR Near Infrared Imaging Camera (TIRCAM-II)  

NASA Astrophysics Data System (ADS)

TIFR near infrared imaging camera (TIRCAM-II) is based on the Aladdin III Quadrant InSb focal plane array (512×512 pixels; 27.6 μm pixel size; sensitive between 1 and 5.5 μm). TIRCAM-II had its first engineering run with the 2 m IUCAA telescope at Girawali during February - March 2011. The first light observations with TIRCAM-II were quite successful. Several infrared standard stars, the Trapezium Cluster in the Orion region, McNeil's nebula, etc., were observed in the J and K bands and in a narrow band at 3.6 μm (nbL). In the nbL band, some bright stars could be detected from the Girawali site. The performance of TIRCAM-II is discussed in the light of these preliminary near infrared observations.

Ojha, D. K.; Ghosh, S. K.; D'Costa, S. L. A.; Naik, M. B.; Sandimani, P. R.; Poojary, S. S.; Bhagat, S. B.; Jadhav, R. B.; Meshram, G. S.; Bakalkar, C. B.; Ramaprakash, A. N.; Mohan, V.; Joshi, J.

364

Estimating the camera direction of a geotagged image using reference images  

E-print Network

Motivated by recent research results showing that additional global positioning system (GPS) information helps visual matching, this work estimates the camera direction of a geotagged image through the fusion of user photos and satellite images obtained using the GPS information.

365

HERSCHEL/SCORE, imaging the solar corona in visible and EUV light: CCD camera characterization.  

PubMed

The HERSCHEL (helium resonant scattering in the corona and heliosphere) experiment is a rocket mission that was successfully launched last September from White Sands Missile Range, New Mexico, USA. HERSCHEL was conceived to investigate the solar corona in the extreme UV (EUV) and in the visible broadband polarized brightness and provided, for the first time, a global map of helium in the solar environment. The HERSCHEL payload consisted of a telescope, the HERSCHEL EUV Imaging Telescope (HEIT), and two coronagraphs, HECOR (helium coronagraph) and SCORE (sounding coronagraph experiment). The SCORE instrument, designed and developed mainly by Italian research institutes, is an imaging coronagraph to observe the solar corona from 1.4 to 4 solar radii. SCORE has two detectors for the EUV lines at 121.6 nm (H I) and 30.4 nm (He II) and for the visible broadband polarized brightness. The SCORE UV detector is an intensified CCD with a microchannel plate coupled to a CCD through a fiber-optic bundle. The SCORE visible light detector is a frame-transfer CCD coupled to a polarimeter based on a liquid crystal variable retarder plate. The SCORE coronagraph is described together with the performance of the cameras for imaging the solar corona. PMID:20428852

Pancrazzi, M; Focardi, M; Landini, F; Romoli, M; Fineschi, S; Gherardi, A; Pace, E; Massone, G; Antonucci, E; Moses, D; Newmark, J; Wang, D; Rossi, G

2010-07-01

366

Detection of pulmonary emboli in dogs: comparison of single photon emission computed tomography, gamma camera imaging, and angiography  

SciTech Connect

Single photon emission computed tomography (SPECT) allows separation of radionuclide activity in front of and behind the area of interest and results in more contrast than can be achieved using conventional gamma camera imaging. To determine if this improved contrast is of value in the detection of perfusion abnormalities in the lung, the lower lobe segmental pulmonary arteries of six dogs were embolized and pulmonary perfusion was evaluated using SPECT, conventional gamma camera imaging, and angiography. Although selective segmental angiography was the most sensitive method of detecting emboli, SPECT was much more sensitive than the gamma camera examination in evaluating the effects of the emboli at two hours and one, two, and eight weeks after embolization.

Osborne, D.R.; Jaszczak, R.J.; Greer, K.; Roggli, V.; Lischko, M.; Coleman, R.E.

1983-02-01

367

Tower Camera Handbook  

SciTech Connect

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

Moudry, D

2005-01-01

368

ISIS Camera Model Design  

NASA Astrophysics Data System (ADS)

The Integrated Software for Imagers and Spectrometers (ISIS) provides camera model support for a variety of past and current NASA missions. Adding new camera models to ISIS has become easier due to its object-oriented design.

Anderson, J. A.

2008-03-01

369

Recovery of the 3-D location and motion of a rigid object through camera image (An Extended Kalman Filter Approach)  

Microsoft Academic Search

This paper deals with the problem of locating a rigid object and estimating its motion in three dimensions. This involves determining the position and orientation of the object at each instant when an image is captured by a camera, and recovering the motion of the object between consecutive frames.

J. J. Wu; R. E. Rink; T. M. Caelli; V. G. Gourishankar

1989-01-01

370

Low cost camera modules using integration of wafer-scale optics and wafer-level packaging of image sensors  

NASA Astrophysics Data System (ADS)

Using wafer-scale optics, wafer-scale integration, and wafer-level packaging of image sensors, we developed small-form-factor (3.3 mm × 3.3 mm × 2.5 mm), low-manufacturing-cost, Pb-free solder-reflow-compatible digital camera modules suitable for many applications, including mobile electronic devices, automotive, security, and medical applications.

Han, Hongtao; Main, Keith

2009-11-01

371

Educational Applications for Digital Cameras.  

ERIC Educational Resources Information Center

Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

Cavanaugh, Terence; Cavanaugh, Catherine

1997-01-01

372

Catadioptric Omnidirectional Camera  

Microsoft Academic Search

Conventional video cameras have limited fields of view that make them restrictive in a variety of vision applications. There are several ways to enhance the field of view of an imaging system. However, the entire imaging system must have a single effective viewpoint to enable the generation of pure perspective images from a sensed image. A new camera with a

Shree K. Nayar

1997-01-01

373

Gamma camera-based PET inverse treatment planning for head and neck cancer using hybrid imaging instrumentation and IMRT  

Microsoft Academic Search

Purpose: To demonstrate the feasibility of incorporating gamma camera-based positron emission tomography (GC-PET) nuclear medicine molecular imaging into inverse conformal radiotherapy treatment planning using commercially available hardware and software. Materials and methods: Anatomical X-ray computed tomography (X-ray CT) and GC-PET imaging of the base of the tongue region were performed on a hybrid nuclear medicine—X-ray CT scanner (General Electric Millennium

Christopher Scarfone; James A Patton; Anthony J Cmelak; Martin P Sandler

2002-01-01

374

On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo-image-based terrain models require that the position of each pixel in a given image be tied to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends at different magnitudes as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4].
This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation in the effective focal length for each WAC band. In addition, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. This approach obviated the need for additional tangential terms in the distortion model, thus reducing the number of computations during image orthorectification and therefore expediting the orthorectification process. We undertook a similar process for refining the geometry of the UV bands (321 and 360 nm), except we registered each UV band to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013), Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
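The refined model described above assigns each band its own effective focal length (to absorb the lateral chromatic aberration) plus radial distortion coefficients. A minimal sketch of such a per-band projection, assuming a Brown-style polynomial radial term; the model form and all numeric values here are illustrative assumptions, not the LROC team's published parameters:

```python
def distort(x, y, k1, k2):
    """Brown-style radial distortion of normalized focal-plane coordinates."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def pixel_from_ray(tan_ax, tan_ay, focal_px, cx, cy, k1, k2):
    """Project a look direction (tangents of field angles) to a CCD pixel."""
    xd, yd = distort(tan_ax, tan_ay, k1, k2)
    return cx + focal_px * xd, cy + focal_px * yd

# Lateral chromatic aberration ~ slightly different effective focal length
# per band (hypothetical numbers, solved independently for each band):
band_focal_px = {"415nm": 702.1, "566nm": 701.6, "604nm": 701.4}
```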

Speyerer, E. J.; Wagner, R.; Robinson, M. S.

2013-12-01

375

The JANUS camera onboard JUICE mission for Jupiter system optical imaging  

NASA Astrophysics Data System (ADS)

JANUS (Jovis, Amorum ac Natorum Undique Scrutator) is the visible camera selected for the ESA JUICE mission to the Jupiter system. Resource constraints, S/C characteristics, mission design, environment and the great variability of observing conditions for several targets put stringent constraints on the instrument architecture. In addition to the usual requirements for a planetary mission, the problem of mass and power consumption is particularly stringent due to the long cruise and to operations at large distance from the Sun. The JANUS design shall cope with a wide range of targets, from Jupiter's atmosphere to solid satellite surfaces, exospheres, rings, and lightning, all to be observed in several color and narrow-band filters. All targets shall be tracked during the mission, and in some specific cases DTMs will be derived from stereo imaging. The mission design allows a quite long time range for observations in the Jupiter system, with orbits around Jupiter and multiple fly-bys of satellites for 2.5 years, followed by about 6 months in orbit around Ganymede at distances from the surface varying from about 10,000 km down to a few hundred km. Our concept is based on a single optical channel, fine-tuned to cover all scientific objectives based on low- to high-resolution imaging. A catoptric telescope with excellent optical quality is coupled with a rectangular detector, avoiding any scanning mechanism. In this paper the present JANUS design and its foreseen scientific capabilities are discussed.

Della Corte, Vincenzo; Schmitz, Nicole; Zusi, Michele; Castro, José Maria; Leese, Mark; Debei, Stefano; Magrin, Demetrio; Michalik, Harald; Palumbo, Pasquale; Jaumann, Ralf; Cremonese, Gabriele; Hoffmann, Harald; Holland, Andrew; Lara, Luisa Maria; Fiethe, Björn; Friso, Enrico; Greggio, Davide; Herranz, Miguel; Koncz, Alexander; Lichopoj, Alexander; Martinez-Navajas, Ignacio; Mazzotta Epifani, Elena; Michaelis, Harald; Ragazzoni, Roberto; Roatsch, Thomas; Rodrigo, Julio; Rodriguez, Emilio; Schipani, Pietro; Soman, Matthew; Zaccariotto, Mirco

2014-08-01

376

First results from the Faint Object Camera - Imaging the core of R Aquarii  

NASA Technical Reports Server (NTRS)

The Faint Object Camera on the HST was pointed toward the symbiotic long-period M7e Mira variable R Aquarii, and very high resolution images of the inner core, mainly in the ionized oxygen emission lines in the optical, are reported. Both images show bright arcs, knots, and filaments superposed on a fainter, diffuse nebulosity extending in a general SW-NE direction from the variable to the edge of the field at 10 arcsec distance. The core is resolved in forbidden O III 5007 A and forbidden O II 3727 A into at least two bright knots of emission whose positions and structures are aligned with PA = 50 deg. The central knots appear to be the source of a continuous, well-collimated, stream of material extending out to 3-4 arcsec in the northern sector corresponding to a linear distance of about 1000 AU. The northern stream seems to bend around an opaque obstacle and form a spiral before breaking up into wisps and knots. The southern stream is composed of smaller, discrete parcels of emitting gas curving to the SE.

Paresce, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.

1991-01-01

377

4 Vesta in Color: High Resolution Mapping from Dawn Framing Camera Images  

NASA Technical Reports Server (NTRS)

Rotational surface variations on asteroid 4 Vesta have been known from ground-based and HST observations, and they have been interpreted as evidence of compositional diversity. NASA's Dawn mission entered orbit around Vesta on July 16, 2011 for a year-long global characterization. The framing cameras (FC) onboard the Dawn spacecraft will image the asteroid in one clear (broad) and seven narrow band filters covering the wavelength range between 0.4-1.0 microns. We present color mapping results from the Dawn FC observations of Vesta obtained during Survey orbit (approx. 3000 km) and High-Altitude Mapping Orbit (HAMO) (approx. 950 km). Our aim is to create global color maps of Vesta using multispectral FC images to identify the spatial extent of compositional units and link them with other available data sets to extract the basic mineralogy. While the VIR spectrometer onboard Dawn has higher spectral resolution (864 channels) allowing precise mineralogical assessment of Vesta's surface, the FC has three times higher spatial resolution in any given orbital phase. In an effort to extract maximum information from FC data we have developed algorithms using laboratory spectra of pyroxenes and HED meteorites to derive parameters associated with the 1-micron absorption band wing. These parameters will help map the global distribution of compositionally related units on Vesta's surface. Interpretation of these units will involve the integration of FC and VIR data.

Reddy, V.; LeCorre, L.; Nathues, A.; Sierks, H.; Christensen, U.; Hoffmann, M.; Schroeder, S. E.; Vincent, J. B.; McSween, H. Y.; Denevi, B. W.; Li, J.-Y.; Pieters, C. M.; Gaffey, M.; Mittlefehldt, D.; Buratti, B.; Hicks, M.; McCord, T.; Combe, J.-P.; DeSantis, M. C.; Russell, C. T.; Raymond, C. A.; Marques, P. Gutierrez; Maue, T.; Hall, I.

2011-01-01

378

Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source.  

PubMed

Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1-2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum noise. 
PMID:25592130
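A calibration factor of this kind converts a background-subtracted count rate into activity units; a minimal sketch of the computation and of a repeatability metric across repeated acquisitions (our naming and numbers, not the authors' code):

```python
from statistics import mean, stdev

def calibration_factor(net_counts: float, duration_s: float,
                       activity_kbq: float) -> float:
    """Background-subtracted cps per kBq for one planar acquisition."""
    return (net_counts / duration_s) / activity_kbq

def repeatability_cv(factors) -> float:
    """Coefficient of variation across repeated calibration measurements."""
    return stdev(factors) / mean(factors)

cf = calibration_factor(net_counts=6000, duration_s=60, activity_kbq=100)  # 1.0 cps/kBq
cv = repeatability_cv([1.00, 1.02, 0.98])                                  # 0.02, i.e. 2%
```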

Anizan, N; Wang, H; Zhou, X C; Wahl, R L; Frey, E C

2015-02-01

379

Radioisotope guided surgery with imaging probe, a hand-held high-resolution gamma camera  

NASA Astrophysics Data System (ADS)

Since 1997, our physics group, together with nuclear physicians, has studied imaging probes (IP): hand-held, high-resolution gamma cameras for radio-guided surgery (RGS). The present work aims to verify the usefulness of two updated IPs in different surgical operations. Forty patients scheduled for breast cancer sentinel node (SN) biopsy, five patients with nodal recurrence of thyroid cancer, seven patients with parathyroid adenomas, and five patients with neuroendocrine tumours (NET) were operated on under the guidance of the IP. We used two different IPs, with fields of view of 1 and 4 in.², respectively, and an intrinsic spatial resolution of about 2 mm. Radioisotopes were 99mTc, 123I and 111In. The 1 in.² IP detected the SN in all 40 patients and more than one node in 24, whereas the Anger camera (AC) failed to locate the SN in four patients and detected true-positive second nodes in only nine patients. The 4 in.² IP was used for RGS of thyroid, parathyroid and NETs. It detected eight latero-cervical nodes; in the same patients, the AC detected five invaded nodes. Parathyroid adenomas detected by the IP were 10 in 7 patients, and NETs five in five patients. Both the 1 and 4 in.² IPs proved useful in all operations. Initial studies on SN biopsy were carried out on small series of patients to validate the IP and to demonstrate its effectiveness and usefulness alone or against conventional probes. We propose the use of the IP as a control method for legal documentation and as a surgeon strategy guide before and after removal of the lesion(s).

Soluri, A.; Trotta, C.; Scopinaro, F.; Tofani, A.; D'Alessandria, C.; Pasta, V.; Stella, S.; Massari, R.

2007-12-01

380

Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source  

NASA Astrophysics Data System (ADS)

Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1–2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum noise.

Anizan, N.; Wang, H.; Zhou, X. C.; Wahl, R. L.; Frey, E. C.

2015-02-01

381

Prompt gamma imaging with a slit camera for real-time range control in proton therapy.  

PubMed

Treatments delivered by proton therapy are affected by uncertainties on the range of the beam within the patient, requiring medical physicists to add safety margins on the penetration depth of the beam. To reduce these margins and deliver safer treatments, different projects are currently investigating real-time range control by imaging prompt gammas emitted along the proton tracks in the patient. This study reports on the feasibility, development and test of a new concept of prompt gamma camera using a slit collimator to obtain a one-dimensional projection of the beam path on a scintillation detector. This concept was optimized, using the Monte Carlo code MCNPX version 2.5.0, to select high energy photons correlated with the beam range and detect them with both high statistics and sufficient spatial resolution. To validate the Monte Carlo model, spectrometry measurements of secondary particles emitted by a PMMA target during proton irradiation at 160 MeV were realized. An excellent agreement with the simulations was observed when using subtraction methods to isolate the gammas in direct incidence. A first prototype slit camera using the HiCam gamma detector was consequently prepared and tested successfully at 100 and 160 MeV beam energies. Results confirmed the potential of this concept for real-time range monitoring with millimetre accuracy in pencil beam scanning mode for typical clinical conditions. If we neglect electronic dead times and rejection of detected events, the current solution with its collimator at 15 cm from the beam axis can achieve a 1-2 mm standard deviation on range estimation in a homogeneous PMMA target for numbers of protons that correspond to doses in water at the Bragg peak as low as 15 cGy at 100 MeV and 25 cGy at 160 MeV assuming pencil beams with a Gaussian profile of 5 mm sigma at target entrance. PMID:22572603

Smeets, J; Roellinghoff, F; Prieels, D; Stichelbaut, F; Benilov, A; Busca, P; Fiorini, C; Peloso, R; Basilavecchia, M; Frizzi, T; Dehaes, J C; Dubus, A

2012-06-01

382

Prompt gamma imaging with a slit camera for real-time range control in proton therapy  

NASA Astrophysics Data System (ADS)

Treatments delivered by proton therapy are affected by uncertainties on the range of the beam within the patient, requiring medical physicists to add safety margins on the penetration depth of the beam. To reduce these margins and deliver safer treatments, different projects are currently investigating real-time range control by imaging prompt gammas emitted along the proton tracks in the patient. This study reports on the feasibility, development and test of a new concept of prompt gamma camera using a slit collimator to obtain a one-dimensional projection of the beam path on a scintillation detector. This concept was optimized, using the Monte Carlo code MCNPX version 2.5.0, to select high energy photons correlated with the beam range and detect them with both high statistics and sufficient spatial resolution. To validate the Monte Carlo model, spectrometry measurements of secondary particles emitted by a PMMA target during proton irradiation at 160 MeV were realized. An excellent agreement with the simulations was observed when using subtraction methods to isolate the gammas in direct incidence. A first prototype slit camera using the HiCam gamma detector was consequently prepared and tested successfully at 100 and 160 MeV beam energies. Results confirmed the potential of this concept for real-time range monitoring with millimetre accuracy in pencil beam scanning mode for typical clinical conditions. If we neglect electronic dead times and rejection of detected events, the current solution with its collimator at 15 cm from the beam axis can achieve a 1-2 mm standard deviation on range estimation in a homogeneous PMMA target for numbers of protons that correspond to doses in water at the Bragg peak as low as 15 cGy at 100 MeV and 25 cGy at 160 MeV assuming pencil beams with a Gaussian profile of 5 mm sigma at target entrance.

Smeets, J.; Roellinghoff, F.; Prieels, D.; Stichelbaut, F.; Benilov, A.; Busca, P.; Fiorini, C.; Peloso, R.; Basilavecchia, M.; Frizzi, T.; Dehaes, J. C.; Dubus, A.

2012-06-01

383

A new solution for camera calibration and real-time image distortion correction in medical endoscopy-initial technical evaluation.  

PubMed

Medical endoscopy is used in a wide variety of diagnostic and surgical procedures. These procedures are renowned for the difficulty of orienting the camera and instruments inside the human body cavities. The small size of the lens causes radial distortion of the image, which hinders the navigation process and leads to errors in depth perception and object morphology. This article presents a complete software-based system to calibrate and correct the radial distortion in clinical endoscopy in real time. Our system can be used with any type of medical endoscopic technology, including oblique-viewing endoscopes and HD image acquisition. The initial camera calibration is performed in an unsupervised manner from a single checkerboard pattern image. For oblique-viewing endoscopes, the changes in calibration during operation are handled by a new adaptive camera projection model and an algorithm that infers the rotation of the probe lens using only image information. The workload is distributed across the CPU and GPU through an optimized CPU+GPU hybrid solution. This enables real-time performance, even for HD video inputs. The system is evaluated for different technical aspects, including accuracy of modeling and calibration, overall robustness, and runtime profile. The contributions are highly relevant for applications in computer-aided surgery and image-guided intervention such as improved visualization by image warping, 3-D modeling, and visual SLAM. PMID:22127990
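Correcting radial distortion in real time amounts to inverting the distortion model for every pixel. A common approach, sketched here with a one-parameter polynomial model (a simplification; the paper's actual adaptive projection model differs), is fixed-point iteration:

```python
def distort(xu, yu, k1):
    """Forward one-parameter radial model: r_d = r_u * (1 + k1 * r_u^2)."""
    s = 1.0 + k1 * (xu * xu + yu * yu)
    return xu * s, yu * s

def undistort(xd, yd, k1, iters=10):
    """Invert the model by fixed-point iteration from the distorted point."""
    xu, yu = xd, yd
    for _ in range(iters):
        s = 1.0 + k1 * (xu * xu + yu * yu)
        xu, yu = xd / s, yd / s
    return xu, yu
```

In practice the inverse mapping is evaluated once per calibration into a lookup table, which is what makes GPU-side warping cheap enough for HD video.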

Melo, Rui; Barreto, João P; Falcão, Gabriel

2012-03-01

384

A novel Compton camera design featuring a rear-panel shield for substantial noise reduction in gamma-ray images  

NASA Astrophysics Data System (ADS)

After the Japanese nuclear disaster in 2011, large amounts of radioactive isotopes were released and still remain a serious problem in Japan. Consequently, various gamma cameras are being developed to help identify radiation hotspots and ensure effective decontamination operations. The Compton camera utilizes the kinematics of Compton scattering to construct images without using a mechanical collimator, and features a wide field of view. For instance, we have developed a novel Compton camera that features a small size (13 × 14 × 15 cm³) and light weight (1.9 kg), but which also achieves high sensitivity thanks to Ce:GAGG scintillators optically coupled with MPPC arrays. By definition, in such a Compton camera, gamma rays are expected to scatter in the "scatterer" and then be fully absorbed in the "absorber" (in what is called a forward-scattered event). However, high-energy gamma rays often interact with the detector in the opposite direction - initially scattered in the absorber and then absorbed in the scatterer - in what is called a "back-scattered" event. Any contamination of such back-scattered events is known to substantially degrade the quality of gamma-ray images, but determining the order of gamma-ray interaction based solely on energy deposits in the scatterer and absorber is quite difficult. For this reason, we propose a novel yet simple Compton camera design that includes a rear-panel shield (a few mm thick) consisting of W or Pb located just behind the scatterer. Since the energy of scattered gamma rays in back-scattered events is much lower than that in forward-scattered events, we can effectively discriminate and reduce back-scattered events to improve the signal-to-noise ratio in the images. This paper presents our detailed optimization of the rear-panel shield using Geant4 simulation, and describes a demonstration test using our Compton camera.
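The rear-panel shield works because Compton kinematics caps how much energy a photon retains after large-angle scattering, so back-scattered photons reach the scatterer with much lower energy than forward-scattered ones. A quick check with the standard Compton formula (the 662 keV line of Cs-137 is our example choice, not specified in the abstract):

```python
import math

ELECTRON_REST_KEV = 511.0  # electron rest energy, m_e * c^2

def scattered_energy_kev(e_kev: float, theta_rad: float) -> float:
    """Compton formula: photon energy after scattering through angle theta."""
    return e_kev / (1.0 + (e_kev / ELECTRON_REST_KEV)
                    * (1.0 - math.cos(theta_rad)))

forward = scattered_energy_kev(662.0, math.radians(30))   # ~564 keV retained
backward = scattered_energy_kev(662.0, math.pi)           # ~184 keV only
```

The large gap between the forward- and back-scattered energies is what lets a thin absorbing panel behind the scatterer suppress back-scattered events preferentially.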

Nishiyama, T.; Kataoka, J.; Kishimoto, A.; Fujita, T.; Iwamoto, Y.; Taya, T.; Ohsuka, S.; Nakamura, S.; Hirayanagi, M.; Sakurai, N.; Adachi, S.; Uchiyama, T.

2014-12-01

385

Development of a pixelated GSO gamma camera system with tungsten parallel hole collimator for single photon imaging  

SciTech Connect

Purpose: In small animal imaging using a single photon emitting radionuclide, a high resolution gamma camera is required. Recently, position sensitive photomultiplier tubes (PSPMTs) with high quantum efficiency have been developed. By combining these with nonhygroscopic scintillators with a relatively low light output, a high resolution gamma camera can become useful for low energy gamma photons. Therefore, the authors developed a gamma camera by combining a pixelated Ce-doped Gd2SiO5 (GSO) block with a high quantum efficiency PSPMT. Methods: GSO was selected for the scintillator because it is not hygroscopic and does not contain any natural radioactivity. An array of 1.9 mm x 1.9 mm x 7 mm individual GSO crystal elements was constructed. These GSOs were combined with a 0.1-mm thick reflector to form a 22 x 22 matrix and optically coupled to a high quantum efficiency PSPMT (H8500C-100 MOD8). The GSO gamma camera was encased in a tungsten gamma-ray shield with a tungsten pixelated parallel hole collimator, and the basic performance was measured for Co-57 gamma photons (122 keV). Results: In a two-dimensional position histogram, all pixels were clearly resolved. The energy resolution was ~15% FWHM. With the 20-mm thick tungsten pixelated collimator, the spatial resolution was 4.4-mm FWHM at 40 mm from the collimator surface, and the sensitivity was ~0.05%. Phantom and small animal images were successfully obtained with our developed gamma camera. Conclusions: These results confirmed that the developed pixelated GSO gamma camera has potential as an effective instrument for low energy gamma photon imaging.
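
A quoted figure like ~15% FWHM is conventionally derived from a Gaussian fit to the photopeak. A minimal sketch of that conversion (not the authors' analysis code; the sigma value below is illustrative):

```python
import math

def energy_resolution_percent(sigma_kev, peak_kev):
    """Energy resolution as % FWHM from a Gaussian photopeak fit.

    FWHM = 2 * sqrt(2 ln 2) * sigma (~2.355 * sigma).
    """
    fwhm = 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_kev
    return 100.0 * fwhm / peak_kev

# A fitted sigma of ~7.77 keV at the 122 keV Co-57 peak corresponds to ~15% FWHM.
resolution = energy_resolution_percent(7.77, 122.0)
```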

Yamamoto, S.; Watabe, H.; Kanai, Y.; Shimosegawa, E.; Hatazawa, J. [Kobe City College of Technology, 8-3 Gakuen-Higashi-machi, Nishi-ku, Kobe 651-2194 (Japan); Department of Molecular Imaging in Medicine, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan); Department of Nuclear Medicine and Tracer Kinetics, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan); Department of Molecular Imaging in Medicine, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan) and Department of Nuclear Medicine and Tracer Kinetics, Osaka University Graduate School of Medicine, Osaka 565-0871 (Japan)

2012-02-15

386

Statistical performance evaluation and comparison of a Compton medical imaging system and a collimated Anger camera for higher energy photon imaging  

NASA Astrophysics Data System (ADS)

In radionuclide treatment, tumor cells are primarily destroyed by charged particles emitted by the compound, while associated higher energy photons are used to image the tumor in order to determine radiation dose and monitor shrinkage. However, the higher energy photons are difficult to image with conventional collimated Anger cameras, since a tradeoff exists between resolution and sensitivity, and collimator septal penetration and scattering are increased by the high energy photons. This research compares imaging performance of the conventional Anger camera to a Compton imaging system that can have improved spatial resolution and sensitivity for high energy photons, because this tradeoff is decoupled and the effect of Doppler broadening at higher gamma energies is decreased. System performance is analyzed by the modified uniform Cramer-Rao bound (M-UCRB) algorithms based on the developed system modeling. The bound shows that the effect of Doppler broadening is the limiting factor for Compton camera performance for imaging 364.4 keV photons emitted from 131I. According to the bound, the Compton camera outperforms the collimated system for an equal number of detected events when the desired spatial resolution for a 26 cm diameter uniform disk object is better than 12 mm FWHM. For a 3D cylindrical phantom, the lower bound on variance for the collimated camera is greater than for the Compton imager over the resolution range from 0.5 to 2 cm FWHM. Furthermore, the detection sensitivity of the proposed Compton imaging system is about 15-20 times higher than that of the collimated Anger camera.

Han, Li; Rogers, W. Leslie; Huh, Sam S.; Clinthorne, Neal

2008-12-01

387

CCD Camera  

DOEpatents

A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

Roth, Roger R. (Minnetonka, MN)

1983-01-01

388

Epipolar Geometry of Panoramic Cameras  

Microsoft Academic Search

This paper presents fundamental theory and design of central panoramic cameras. Panoramic cameras combine a convex hyperbolic or parabolic mirror with a perspective camera to obtain a large field of view. We show how to design a panoramic camera with a tractable geometry and we propose a simple calibration method. We derive the image formation function for such a camera. The main contribution of

Tomás Svoboda; Tomás Pajdla; Václav Hlavác

1998-01-01

389

A Survey of Martian Dust Devil Activity Using Mars Global Surveyor Mars Orbiter Camera Images  

NASA Astrophysics Data System (ADS)

We present results from an orbital survey of Martian dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images. The survey includes all available imaging data (mapping and pre-mapping orbit), through to mission phase E06. Due to the large volume of data, we have concentrated on surveying limited regions, selected variously on the basis of where dust devils or other dust storm activity has previously been reported, on the basis of where lander observations have been or will be obtained, and on the basis of predictions from numerical atmospheric models. Our study regions to date include: Amazonis Planitia (25-45N, 145-165W), Sinus Meridiani (10S-10N, 10E-10W), Chryse Planitia (10-30N, 30-60W), Solis Planum (15-45S, 75-105W), Hellas Planitia (15-60S, 265-315W), Casius (45-65N, 255-285W), Utopia Planitia (25-45N, 225-255W), Sinai Planum (10-20S, 60-100W), Mare Cimmerium (10-45S, 180-220W). We have compiled statistics on dust devil activity in three categories: dust devils observed in NA images, dust devils observed in WA images, and dust devil tracks observed in NA images. For each region and each category, we have compiled statistics for four seasonal date bins, centered on the equinoxes and solstices: Ls=45-135 (northern summer solstice), Ls=135-225 (northern autumn equinox), Ls=225-315 (northern winter solstice), and Ls=315-45 (northern spring equinox). Our survey has highlighted great spatial variability in dust devil activity, with the Amazonis Planitia region being by far the dominant location for activity. This region is additionally characterized by a large size range of dust devils, including individual devils up to several km in height. Other regions in which dust devils have been frequently imaged include Utopia, Solis, and Sinai. Numerous dust devil tracks were observed in Casius and Cimmerium, but with very few accompanying dust devils. 
This suggests dust devils occur at local times other than that of the MGS orbit (~2 pm). Our seasonal statistics suggest a very strong preference for Amazonis and Solis dust devil activity to occur in the northern autumn season. Conversely, Utopia shows dust devil activity that is relatively constant, except in the northern spring period. The observations will be presented and compared with numerical model predictions. Initial results from this survey have already been used to define target regions for very high resolution simulations of dust devil development using the Caltech/Cornell Mars MM5 model.
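
The four seasonal bins used in the survey are straightforward to implement; the wrap-around bin (Ls=315-45) is the only subtlety. A minimal sketch of the binning (not the survey's actual tooling):

```python
def ls_season_bin(ls):
    """Assign a solar longitude Ls (degrees) to one of the survey's four
    seasonal bins, each centered on an equinox or solstice."""
    ls = ls % 360.0  # normalize, so the Ls=315-45 bin wraps correctly
    if 45.0 <= ls < 135.0:
        return "northern summer solstice (Ls=45-135)"
    elif 135.0 <= ls < 225.0:
        return "northern autumn equinox (Ls=135-225)"
    elif 225.0 <= ls < 315.0:
        return "northern winter solstice (Ls=225-315)"
    else:
        return "northern spring equinox (Ls=315-45)"
```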

Fisher, J.; Richardson, M. I.; Ewald, S. P.; Toigo, A. D.; Wilson, R. J.

2002-12-01

390

Multi-temporal database of High Resolution Stereo Camera (HRSC) images  

NASA Astrophysics Data System (ADS)

Image data transmitted to Earth by Martian spacecraft since the 1970s, for example by Mariner and Viking, Mars Global Surveyor (MGS), Mars Express (MEx) and the Mars Reconnaissance Orbiter (MRO), showed that the surface of Mars has changed dramatically and is continually changing [e.g., 1-8]. The changes are attributed to a large variety of atmospheric, geological and morphological processes, including eolian processes [9,10], mass wasting processes [11], changes of the polar caps [12] and impact cratering processes [13]. In addition, comparisons between Mariner, Viking and Mars Global Surveyor images suggest that more than one third of the Martian surface has brightened or darkened by at least 10% [6]. Albedo changes can have effects on the global heat balance and the circulation of winds, which can result in further surface changes [14-15]. In particular, the High Resolution Stereo Camera (HRSC) [16,17] on board Mars Express (MEx) covers large areas at high resolution and is therefore suited to detect the frequency, extent and origin of Martian surface changes. Since 2003, HRSC has acquired high-resolution images of the Martian surface and contributed to Martian research, with focus on the surface morphology, the geology and mineralogy, the role of liquid water on the surface and in the atmosphere, on volcanism, as well as on the proposed climate change throughout Martian history, and has significantly improved our understanding of the evolution of Mars [18-21]. The HRSC data are available at ESA's Planetary Science Archive (PSA) as well as through the NASA Planetary Data System (PDS). Both data platforms are frequently used by the scientific community and provide additional software and environments to further generate map-projected and geometrically calibrated HRSC data. 
However, while previews of the images are available, there is currently no way to quickly and conveniently see the spatial and temporal availability of HRSC images in a specific region, which is important for detecting the surface changes that occurred between two or more images.

Erkeling, G.; Luesebrink, D.; Hiesinger, H.; Reiss, D.

2013-09-01

391

Biophysical control of intertidal benthic macroalgae revealed by high-frequency multispectral camera images  

NASA Astrophysics Data System (ADS)

Intertidal benthic macroalgae are a biological quality indicator in estuaries and coasts. While remote sensing has been applied to quantify the spatial distribution of such macroalgae, it is generally not used for their monitoring. We examined the day-to-day and seasonal dynamics of macroalgal cover on a sandy intertidal flat using visible and near-infrared images from a time-lapse camera mounted on a tower. Benthic algae were identified using supervised, semi-supervised and unsupervised classification techniques, validated with monthly ground-truthing over one year. A supervised classification (based on maximum likelihood, using training areas identified in the field) performed best in discriminating between sediment, benthic diatom films and macroalgae, with highest spectral separability between macroalgae and diatoms in spring/summer. An automated unsupervised classification (based on the Normalised Differential Vegetation Index NDVI) allowed detection of daily changes in macroalgal coverage without the need for calibration. This method showed a bloom of macroalgae (filamentous green algae, Ulva sp.) in summer with > 60% cover, but with pronounced superimposed day-to-day variation in cover. Waves were a major factor in regulating macroalgal cover, but regrowth of the thalli after a summer storm was fast (2 weeks). Images and in situ data demonstrated that the protruding tubes of the polychaete Lanice conchilega facilitated both settlement (anchorage) and survival (resistance to waves) of the macroalgae. Thus, high-frequency, high resolution images revealed the mechanisms for regulating the dynamics in cover of the macroalgae and for their spatial structuring. Ramifications for the mode, timing, frequency and evaluation of monitoring macroalgae by field and remote sensing surveys are discussed.
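
The unsupervised NDVI-based classification described above is simple to reproduce in outline. A minimal sketch, assuming per-pixel red and near-infrared reflectance arrays; the two threshold values are hypothetical, not the paper's calibrated ones:

```python
import numpy as np

def classify_ndvi(red, nir, veg_threshold=0.3, diatom_threshold=0.1):
    """Threshold the Normalised Differential Vegetation Index into three
    cover classes: 0 = sediment, 1 = diatom film, 2 = macroalgae.

    Thresholds are illustrative placeholders, not the study's values.
    """
    ndvi = (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero
    classes = np.zeros(ndvi.shape, dtype=np.uint8)
    classes[ndvi > diatom_threshold] = 1
    classes[ndvi > veg_threshold] = 2  # overwrites class 1 where NDVI is high
    return ndvi, classes
```

Because NDVI is a ratio of band differences, this style of classification needs no radiometric calibration, which is what makes it suitable for fully automated daily monitoring.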

van der Wal, Daphne; van Dalen, Jeroen; Wielemaker-van den Dool, Annette; Dijkstra, Jasper T.; Ysebaert, Tom

2014-07-01

392

Test of Compton camera components for prompt gamma imaging at the ELBE bremsstrahlung beam  

NASA Astrophysics Data System (ADS)

In the context of ion beam therapy, particle range verification is a major challenge for the quality assurance of the treatment. One approach is the measurement of the prompt gamma rays resulting from the tissue irradiation. A Compton camera based on several position sensitive gamma ray detectors, together with an imaging algorithm, is expected to reconstruct the prompt gamma ray emission density map, which is correlated with the dose distribution. At OncoRay and Helmholtz-Zentrum Dresden-Rossendorf (HZDR), a Compton camera setup is being developed consisting of two scatter planes: two CdZnTe (CZT) cross strip detectors, and an absorber consisting of one Lu2SiO5 (LSO) block detector. The data acquisition is based on VME electronics and handled by software developed on the ROOT framework. The setup has been tested at the linear electron accelerator ELBE at HZDR, which is used in this experiment to produce bunched bremsstrahlung photons with up to 12.5 MeV energy and a repetition rate of 13 MHz. Their spectrum has similarities with the shape expected from prompt gamma rays in the clinical environment, and the flux is also bunched with the accelerator frequency. The charge sharing effect of the CZT detector is studied qualitatively for different energy ranges. The LSO detector pixel discrimination resolution is analyzed and it shows a trend to improve for high energy depositions. The time correlation between the pulsed prompt photons and the measured detector signals, to be used for background suppression, exhibits a time resolution of 3 ns FWHM for the CZT detector and of 2 ns for the LSO detector. A time walk correction and pixel-wise calibration is applied for the LSO detector, whose resolution improves up to 630 ps. In conclusion, the detector setup is suitable for time-resolved background suppression in pulsed clinical particle accelerators. Ongoing tasks are the quantitative comparison with simulations and the test of imaging algorithms. 
Experiments at proton accelerators have also been performed and are currently under analysis.
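
The time-resolved background suppression relies on the beam's bunch structure: prompt photons arrive in phase with the 13 MHz accelerator RF, while uncorrelated background does not. A minimal sketch of such a phase gate (illustrative only; the window width and phase offset are assumptions, not the setup's calibrated values):

```python
def in_prompt_window(t_event_ns, rf_period_ns=1000.0 / 13.0,
                     window_ns=3.0, phase_ns=0.0):
    """Return True if an event time folds into a narrow window around the
    prompt-photon phase of a bunched beam (13 MHz -> ~76.9 ns period)."""
    phase = (t_event_ns - phase_ns) % rf_period_ns
    # distance to the nearest bunch arrival, accounting for wrap-around
    return min(phase, rf_period_ns - phase) <= window_ns / 2.0
```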

Hueso-González, F.; Golnik, C.; Berthel, M.; Dreyer, A.; Enghardt, W.; Fiedler, F.; Heidel, K.; Kormoll, T.; Rohling, H.; Schöne, S.; Schwengner, R.; Wagner, A.; Pausch, G.

2014-05-01

393

A high resolution Small Field Of View (SFOV) gamma camera: a columnar scintillator coated CCD imager for medical applications  

NASA Astrophysics Data System (ADS)

We describe a high resolution, small field of view (SFOV), Charge Coupled Device (CCD) based camera for imaging small volumes of radionuclide uptake in tissues. The Mini Gamma Ray Camera (MGRC) is a collimated, scintillator-coated, low cost, high performance imager using low noise CCDs. The prototype MGRC has a 600 μm thick layer of columnar CsI(Tl) and operates in photon counting mode using a thermoelectric cooler to achieve an operating temperature of -10 °C. Collimation was performed using a pinhole collimator. We have measured the spatial resolution, energy resolution and efficiency using a number of radioisotope sources including 140 keV gamma-rays from 99mTc in a specially designed phantom. We also describe our first imaging of a volunteer patient.

Lees, J. E.; Bassford, D. J.; Blake, O. E.; Blackshaw, P. E.; Perkins, A. C.

2011-12-01

394

Active hyperspectral imaging using a quantum cascade laser (QCL) array and digital-pixel focal plane array (DFPA) camera.  

PubMed

We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s. PMID:24977536
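
The reflectance measurement described above amounts to background subtraction followed by normalization against a diffuse reference. A hypothetical post-processing sketch (the function name and two-frame normalization scheme are assumptions, not the instrument's on-chip pipeline):

```python
import numpy as np

def differential_reflectance(sample_on, sample_off, ref_on, ref_off):
    """Estimate reflectance at one laser wavelength: subtract the laser-off
    (passive thermal) frames, then ratio the sample to a diffuse reference
    (e.g. a gold surface) processed the same way."""
    return (np.asarray(sample_on, float) - sample_off) / \
           (np.asarray(ref_on, float) - ref_off)
```

Repeating this for each of the 15 addressable QCL wavelengths yields the hyperspectral reflectance cube.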

Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico

2014-06-16

395

Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis  

NASA Astrophysics Data System (ADS)

Quantitative and speedy characterization of volcanic ash particles is needed for petrologic monitoring of an ongoing eruption. We develop a new simple system using CCD camera images for quantitatively characterizing ash properties, and apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the shape of particles. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles that are set on a glass plate under which white colored paper or a polarizing plate is set. The images of 1390 x 1080 pixels are taken through three kinds of color filters (Red, Green and Blue) under incident light and light transmitted through the polarizing plate. Brightness of the light sources is set to be constant, and luminance is calibrated by white and black colored papers. About fifteen ash particles are set on the plate at the same time, and their images are saved in bitmap format. We first extract the outlines of particles from the image taken under light transmitted through the polarizing plate. Then, luminances for each color are represented by 256 tones at each pixel in the particles, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of ash particles. We perform box counting, counting the numbers of 1×1-pixel and 128×128-pixel boxes that catch the area of the ash particle. The qfd is estimated by taking the ratio of the former number to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 09, 2009, and Jan 13, 2010), and apply cluster analyses. 
Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf=R/(R+G+B), Gf=G/(R+G+B), Bf=B/(R+G+B), and total luminance=(R+G+B)/665. We classify the volcanic ash particles from the dendrograms into three groups based on the Euclidean distance. The groups are named Group A, B and C in order of increasing average value of total luminance. The classification shows that the numbers of particles belonging to Groups A, B and C are 77, 25 and 6 in the Feb 09, 2009 sample, and 102, 19 and 6 in the Jan 13, 2010 sample, respectively. Examination under the stereoscopic microscope suggests that Groups A, B and C mainly correspond to juvenile, altered and free-crystal particles, respectively. The classification result of the present method thus demonstrates a difference in the contribution of juvenile material between the two days. To evaluate the reliability of our classification, we classify pseudo-samples in which errors of 10% are added to the measured parameters. We apply our method to one thousand pseudo-samples, and the result shows that the numbers of particles classified into the three groups vary by less than 20% of the total number of 235 particles. Our system can classify 120 particles within 6 minutes, which allows us to easily increase the number of ash particles, improving the reliability and resolution of the classification and speedily capturing temporal changes in the properties of ash particles from active volcanoes.
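
The two particle descriptors — the two-scale box-counting dimension and the color ratios of the form Rf = R/(R+G+B) — can be sketched as follows (the abstract's analysis was done in R; this Python version is an illustrative reimplementation, and the treatment of the coarse boxes is an assumption):

```python
import numpy as np

def quasi_fractal_dimension(mask, coarse=128):
    """Two-scale box-counting estimate of a particle outline's dimension.

    mask: 2D boolean array marking pixels inside the particle.
    Counts 1x1-pixel boxes (the particle area) and coarse x coarse boxes
    touching the particle, then solves N(s) ~ s^(-d) between the two scales.
    """
    n_fine = int(mask.sum())
    h, w = mask.shape
    n_coarse = 0
    for i in range(0, h, coarse):
        for j in range(0, w, coarse):
            if mask[i:i + coarse, j:j + coarse].any():
                n_coarse += 1
    return float(np.log(n_fine / n_coarse) / np.log(coarse))

def luminance_ratios(r, g, b):
    """Color ratios and total luminance used as clustering parameters."""
    total = r + g + b
    return r / total, g / total, b / total, total / 665.0
```

For a completely filled region the box-counting estimate recovers dimension 2, as expected for a plane-filling set.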

Miwa, T.; Shimano, T.; Nishimura, T.

2012-12-01

396

Far ultraviolet wide field imaging and photometry - Spartan-202 Mark II Far Ultraviolet Camera  

NASA Technical Reports Server (NTRS)

The U.S. Naval Research Laboratory's Mark II Far Ultraviolet Camera, which is expected to be a primary scientific instrument aboard the Spartan-202 Space Shuttle mission, is described. This camera is intended to obtain FUV wide-field imagery of stars and extended celestial objects, including diffuse nebulae and nearby galaxies. The observations will support the HST by providing FUV photometry of calibration objects. The Mark II camera is an electrographic Schmidt camera with an aperture of 15 cm, a focal length of 30.5 cm, and sensitivity in the 1230-1600 A wavelength range.

Carruthers, George R.; Heckathorn, Harry M.; Opal, Chet B.; Witt, Adolf N.; Henize, Karl G.

1988-01-01

397

Removing cosmic-ray hits from multiorbit HST Wide Field Camera images  

NASA Technical Reports Server (NTRS)

We present an optimized algorithm that removes cosmic rays ('CRs') from multiorbit Hubble Space Telescope (HST) Wide Field/Planetary Camera ('WF/PC') images. It computes the image noise in every iteration from the WF/PC CCD equation. This includes all known sources of random and systematic calibration errors. We test this algorithm on WF/PC stacks of 2-12 orbits as a function of the number of available orbits and the formal Poissonian sigma-clipping level. We find that the algorithm needs ≥ 4 WF/PC exposures to locate the minimal sky signal (which is noticeably affected by CRs), with an optimal clipping level at 2-2.5 × σ(Poisson). We analyze the CR flux detected on multiorbit 'CR stacks,' which are constructed by subtracting the best CR filtered images from the unfiltered 8-12 orbit average. We use an automated object finder to determine the surface density of CRs as a function of the apparent magnitude (or ADU flux) they would have generated in the images had they not been removed. The power law slope of the CR 'counts' (γ ≈ 0.6 for N(m) ∝ m^γ) is steeper than that of the faint galaxy counts down to V ≈ 28 mag. The CR counts show a drop off between 28 ≲ V ≲ 30 mag (the latter is our formal 2σ point source sensitivity without spherical aberration). This prevents the CR sky integral from diverging, and is likely due to a real cutoff in the CR energy distribution below ~11 ADU per orbit. The integral CR surface density is ≲ 10^8 per sq. deg, and their sky signal is V ≈ 25.5-27.0 mag/sq. arcsec, or 3%-13% of our NEP sky background (V = 23.3 mag/sq. arcsec), and well above the EBL integral of the deepest galaxy counts (B_J ≈ 28.0 mag/sq. arcsec). We conclude that faint CRs will always contribute to the sky signal in the deepest WF/PC images. 
Since WFPC2 has ~2.7× lower read noise and a thicker CCD, this will result in more CR detections than in WF/PC, potentially affecting ~10%-20% of the pixels in multiorbit WFPC2 data cubes.
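
The core of any such algorithm is an iterative sigma-clipped combine across the exposure stack. A simplified sketch follows; note the paper computes the per-pixel noise from the full WF/PC CCD equation, whereas this sketch substitutes the sample standard deviation of the stack, and the clipping level of 2.0σ is simply a value inside the paper's optimal 2-2.5σ range:

```python
import numpy as np

def clip_cosmic_rays(stack, kappa=2.0, iterations=3):
    """Iterative sigma-clipped stack combine for cosmic-ray rejection.

    stack: (n_exposures, ny, nx) array of registered exposures.
    Returns the clipped mean image and the boolean mask of rejected samples.
    Only positive outliers are clipped, since CRs always add signal.
    """
    data = np.ma.masked_invalid(np.asarray(stack, dtype=float))
    for _ in range(iterations):
        mean = data.mean(axis=0)
        sigma = data.std(axis=0)
        data = np.ma.masked_where(data - mean > kappa * sigma, data)
    return data.mean(axis=0), np.ma.getmaskarray(data)
```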

Windhorst, Rogier A.; Franklin, Barbara E.; Neuschaefer, Lyman W.

1994-01-01

398

A double photomultiplier Compton camera and its readout system for mice imaging  

SciTech Connect

We have designed a Compton Camera (CC) to image the bio-distribution of gamma-emitting radiopharmaceuticals in mice. A CC employs 'electronic collimation', i.e. a technique that traces the gamma-rays instead of selecting them with physical lead or tungsten collimators. To perform such a task, a CC measures the parameters of the Compton interaction that occurs in the device itself. At least two detectors are required: one (tracker), where the primary gamma undergoes a Compton interaction, and a second one (calorimeter), in which the scattered gamma is completely absorbed. Eventually the polar angle and hence a 'cone' of possible incident directions are obtained (an event with 'incomplete geometry'). Different solutions for the two detectors are proposed in the literature: our design foresees two similar Position Sensitive Photomultipliers (PMT, Hamamatsu H8500). Each PMT has 64 output channels that are reduced to 4 using a charge multiplexed readout system, i.e. a series charge multiplexing net of resistors. Triggering of the system is provided by the coincidence of fast signals extracted at the last dynodes of the PMTs. Assets are the low cost and the simplicity of design and operation, having just one type of device; among the drawbacks is a lower resolution with respect to more sophisticated trackers and a full 64-channel readout. This paper compares our two-Hamamatsu CC design to other solutions and shows that the spatial and energy accuracy is suitable for the inspection of radioactivity in mice.
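
With the 64 anodes multiplexed down to 4 charge channels, the interaction position is recovered by conventional Anger-style arithmetic on the corner charges. A minimal sketch (which multiplexed channel corresponds to which corner of the H8500 is an assumption here, not taken from the paper):

```python
def anger_position(a, b, c, d):
    """Normalized (x, y) position from four corner charges of a resistive
    charge-division readout, via standard Anger arithmetic.

    Corner assignment (a = top-left, b = top-right, c = bottom-left,
    d = bottom-right) is an illustrative assumption.
    """
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total  # right minus left, charge-weighted
    y = ((a + b) - (c + d)) / total  # top minus bottom, charge-weighted
    return x, y
```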

Fontana, Cristiano Lino [Physics Department Galileo Galilei, University of Padua, Via Marzolo 8, Padova 35131 (Italy) and INFN Padova, Via Marzolo 8, Padova 35131 (Italy); Atroshchenko, Kostiantyn [Physics Department Galileo Galilei, University of Padua, Via Marzolo 8, Padova 35131 (Italy) and INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Baldazzi, Giuseppe [Physics Department, University of Bologna, Viale Berti Pichat 6/2, Bologna 40127, Italy and INFN Bologna, Viale Berti Pichat 6/2, Bologna 40127 (Italy); Bello, Michele [INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Uzunov, Nikolay [Department of Natural Sciences, Shumen University, 115 Universitetska str., Shumen 9712, Bulgaria and INFN Legnaro, Viale dell'Universita 2, Legnaro PD 35020 (Italy); Di Domenico, Giovanni [Physics Department, University of Ferrara, Via Saragat 1, Ferrara 44122 (Italy) and INFN Ferrara, Via Saragat 1, Ferrara 44122 (Italy)

2013-04-19

399

High-contrast imaging in polychromatic light with the self-coherent camera  

NASA Astrophysics Data System (ADS)

Context. In the context of direct imaging of exoplanets, coronagraphs are commonly proposed to reach the required very high contrast levels. However, wavefront aberrations induce speckles in their focal plane and limit their performance. Aims: An active correction of these wavefront aberrations using a deformable mirror upstream of the coronagraph is mandatory. These aberrations need to be calibrated and focal-plane wavefront-sensing techniques in the science channel are being developed. One of these is the self-coherent camera, of which we present the latest laboratory results. Methods: We present here an enhancement of the method: we directly minimized the complex amplitude of the speckle field in the focal plane. Laboratory tests using a four-quadrant phase-mask coronagraph and a 32 × 32 actuator deformable mirror were conducted in monochromatic light and in polychromatic light for different bandwidths. Results: We obtain contrast levels in the focal plane in monochromatic light better than 3 × 10^-8 (RMS) in the 5-12 λ/D region for a correction of both phase and amplitude aberrations. In narrow bands (10 nm) the contrast level is 4 × 10^-8 (RMS) in the same region. Conclusions: The contrast level is currently limited by the amplitude aberrations on the bench. We identified several improvements that can be implemented to enhance the performance of our optical bench in monochromatic as well as in polychromatic light.

Mazoyer, J.; Baudoz, P.; Galicher, R.; Rousset, G.

2014-04-01

400

Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging  

NASA Astrophysics Data System (ADS)

A large-angle gamma camera was developed for imaging small animal models used in medical and biological research. The simulation study shows that a large field of view (FOV) system provides higher sensitivity with respect to typical pinhole gamma cameras by reducing the distance between the pinhole and the object. However, this gamma camera suffers from degradation of the spatial resolution in the periphery region due to parallax errors from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm3, a Hamamatsu H8500 PSPMT and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by the maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) of a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We proved that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera: this system can be used as a small animal imaging tool.
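
MLPE with a pre-generated LUT reduces, per event, to searching the candidate 3D positions for the one whose expected light distribution best explains the measured anode signals. A generic Poisson-likelihood sketch (the paper's LUT contents and exact likelihood model are not specified here):

```python
import numpy as np

def mlpe(measured, lut):
    """Maximum-likelihood position estimation over a pre-generated LUT.

    measured: (n_channels,) detected photon counts for one event.
    lut: (n_positions, n_channels) expected mean counts per candidate
         3D interaction position.
    Returns the index of the candidate maximizing the Poisson
    log-likelihood (the log k! terms are constant per event and dropped).
    """
    eps = 1e-12  # guards against log(0) for dark channels
    loglike = (measured * np.log(lut + eps) - lut).sum(axis=1)
    return int(np.argmax(loglike))
```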

Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

2012-01-01

401

Infrared Camera  

NASA Technical Reports Server (NTRS)

A sensitive infrared camera that observes the blazing plumes from Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays of infrared photodetectors known as quantum well infrared photodetectors (QWIPs). QWIPs were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

1997-01-01

402

Modelling image profiles produced with a small field of view gamma camera with a single pinhole collimator  

NASA Astrophysics Data System (ADS)

Gamma cameras making use of parallel-hole collimators have a long history in medical imaging. Pinhole collimators were used in the original gamma camera instruments and have been used more recently in dedicated organ-specific systems, intraoperative instruments and for small animal imaging, providing higher resolution over a smaller field of view than the traditional large field of view systems. With the resurgence of interest in the use of pinhole collimators for small field of view (SFOV) medical gamma cameras, it is important to be able to accurately determine their response under various conditions. Several analytical approaches to pinhole response have been reported in the literature, including models of 3D pinhole imaging systems. Success has also been reported in the use of Monte Carlo simulations; however this approach can require significant time and computing power. This report describes a 2D model that was used to investigate some common problems in pinhole imaging: the variation in resolution over the field of view and the use of `point' sources for quantifying pinhole response.
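
The starting point for any such analytical model is the textbook pinhole resolution formula, combining the geometric penumbra of the aperture with the detector's intrinsic resolution. A simple on-axis sketch (not the report's 2D model, which additionally handles off-axis variation):

```python
import math

def pinhole_system_resolution(d_eff, z, f, r_int):
    """On-axis system resolution (FWHM, same units throughout) of a pinhole camera.

    d_eff: effective pinhole diameter, z: source-to-pinhole distance,
    f: pinhole-to-detector distance, r_int: intrinsic detector resolution.
    """
    r_geo = d_eff * (z + f) / f  # geometric component, projected to the object
    r_det = r_int * z / f        # intrinsic component, demagnified to the object
    return math.hypot(r_geo, r_det)  # quadrature sum of the two contributions
```

The z/f dependence is what the abstract alludes to: resolution improves as the object approaches the pinhole, at the cost of field of view.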

Bugby, S. L.; Lees, J. E.; Perkins, A. C.

2012-11-01

403

Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions  

USGS Publications Warehouse

This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ~10³ features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ~3 × 10⁵ closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used. Copyright 1999 by the American Geophysical Union.
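The topographic-modeling step (computing 3-D coordinates of matched points from known camera geometry) boils down to forward intersection: triangulating the point that best fits two or more camera rays. This is a minimal least-squares sketch of that generic technique, not the commercial system's or ISIS's actual implementation; the function name and ray parameterization are assumptions.

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of N rays (forward intersection).

    Each ray is origin p_i plus a direction d_i. Minimizing the sum of
    squared perpendicular distances to the rays gives the linear system
        sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i
    where each d_i is unit length.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        # projector onto the plane perpendicular to the ray direction
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ np.asarray(p, float)
    return np.linalg.solve(A, b)
```

With perfectly consistent rays the solution is the exact intersection; with noisy image measurements it is the point of closest approach, which is what a DTM pipeline accumulates over its matched points.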

Kirk, R.L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E.M.; Gaddis, L.R.; Johnson, J.R.; Soderblom, L.A.; Ward, A.W.; Smith, P.H.; Britt, D.T.

1999-01-01