Sample records for dawn framing camera

  1. Dawn LAMO Image 83

    NASA Image and Video Library

    2016-05-06

    Ceres' densely cratered landscape is revealed in this image taken by the framing camera aboard NASA's Dawn spacecraft. The craters show various degrees of degradation. The youngest craters have sharp rims.

  2. Dawn LAMO Image 84

    NASA Image and Video Library

    2016-05-09

    Ceres' densely cratered landscape is revealed in this image taken by the framing camera aboard NASA's Dawn spacecraft. The craters show various degrees of degradation. The youngest craters have sharp rims.

  3. Ceres From Dawn, Processed

    NASA Image and Video Library

    2015-01-19

    This processed image, taken Jan. 13, 2015, shows the dwarf planet Ceres as seen from the Dawn spacecraft. The image hints at craters on the surface of Ceres. Dawn's framing camera took this image at a distance of 238,000 miles (383,000 kilometers) from Ceres. http://photojournal.jpl.nasa.gov/catalog/PIA19167

  4. Automated Spectral System for Terrain Classification, Mineralogy of Vesta from the Dawn Framing Cameras

    NASA Astrophysics Data System (ADS)

    Reddy, V.; Le Corre, L.; Nathues, A.; Hall, I.; Gutierrez-Marques, P.; Hoffmann, M.

    2011-10-01

    The Dawn mission will rendezvous with asteroid (4) Vesta in July 2011. We have developed a set of equations for extracting mean pyroxene chemistry (ferrosilite and wollastonite) for classifying terrains on Vesta using the Dawn Framing Camera (FC) multi-color bands. The Automated Spectral System (ASS) utilizes pseudo-Band I minima to estimate the mean pyroxene chemistry of diogenites and basaltic eucrites. The mean pyroxene chemistries of cumulate eucrites and howardites overlap on the pyroxene quadrilateral and hence are harder to distinguish. We expect our ASS to carry the bulk of the terrain classification and mineralogy workload utilizing these equations and to complement the work of DawnKey (Le Corre et al., 2011, DPS/EPSC 2011). The system will also provide surface mineral chemistry layers that can be used for mapping Vesta's surface.
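
    A minimal sketch of the kind of pseudo-Band I fit and chemistry mapping this record describes is shown below; the FC band-center wavelengths and the linear Fs/Wo coefficients are illustrative placeholders, not the calibrated values used by the Dawn team.

      # Sketch: estimate a pseudo-Band I minimum from Dawn FC color bands and map it
      # to mean pyroxene chemistry with Burbine-style linear relations.
      # Band centers and coefficients are illustrative placeholders, NOT calibrated values.
      import numpy as np

      FC_BANDS_UM = np.array([0.75, 0.83, 0.92, 0.96])   # assumed long-wavelength FC filters (um)

      def pseudo_band1_minimum(reflectance, bands_um=FC_BANDS_UM, order=2):
          """Fit a low-order polynomial through the long-wavelength filters and
          return the wavelength of its minimum (the 'pseudo' Band I center)."""
          coeffs = np.polyfit(bands_um, reflectance, order)
          grid = np.linspace(bands_um.min(), bands_um.max(), 500)
          fit = np.polyval(coeffs, grid)
          return grid[np.argmin(fit)]

      def pyroxene_chemistry(band1_center_um, fs_coeff=(1000.0, -900.0), wo_coeff=(400.0, -360.0)):
          """Placeholder linear relations Fs = a*BIC + b, Wo = c*BIC + d (mol%)."""
          fs = fs_coeff[0] * band1_center_um + fs_coeff[1]
          wo = wo_coeff[0] * band1_center_um + wo_coeff[1]
          return fs, wo

      # Example with made-up filter reflectances showing a long-wavelength minimum
      refl = np.array([0.22, 0.19, 0.17, 0.18])
      bic = pseudo_band1_minimum(refl)
      print(bic, pyroxene_chemistry(bic))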

  5. High-Resolution Global Geologic Map of Ceres from NASA Dawn Mission

    NASA Astrophysics Data System (ADS)

    Williams, D. A.; Buczkowski, D. L.; Crown, D. A.; Frigeri, A.; Hughson, K.; Kneissl, T.; Krohn, K.; Mest, S. C.; Pasckert, J. H.; Platz, T.; Ruesch, O.; Schulzeck, F.; Scully, J. E. C.; Sizemore, H. G.; Nass, A.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

    2018-06-01

    This presentation will discuss the completed 1:4,000,000 global geologic map of dwarf planet Ceres derived from Dawn Framing Camera Low Altitude Mapping Orbit (LAMO) images, combining 15 quadrangle maps.

  6. High-resolution Ceres Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images

    NASA Astrophysics Data System (ADS)

    Roatsch, Th.; Kersten, E.; Matz, K.-D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

    2017-06-01

    The Dawn spacecraft Framing Camera (FC) acquired over 31,300 clear filter images of Ceres with a resolution of about 35 m/pixel during the eleven cycles of the Low Altitude Mapping Orbit (LAMO) phase between December 16, 2015 and August 8, 2016. We ortho-rectified the images from the first four cycles and produced a global, high-resolution, uncontrolled photomosaic of Ceres. This global mosaic is the basis for a high-resolution Ceres atlas that consists of 62 tiles mapped at a scale of 1:250,000. The nomenclature used in this atlas was proposed by the Dawn team and approved by the International Astronomical Union (IAU). The full atlas is available to the public through the Dawn Geographical Information System (GIS) web page [http://dawngis.dlr.de/atlas] and will become available through the NASA Planetary Data System (PDS) (http://pdssbn.astro.umd.edu/).
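
    The quoted pixel scale follows from the camera's angular pixel size and the orbit altitude; a back-of-the-envelope check, assuming an FC instantaneous field of view of roughly 93.7 microradians per pixel (a commonly quoted figure, not a number given in this record):

      # Back-of-the-envelope ground sample distance (GSD): GSD = altitude * IFOV.
      # The IFOV value is an assumption based on commonly quoted Dawn FC specifications.
      FC_IFOV_RAD = 93.7e-6

      def ground_sample_distance(altitude_km, ifov_rad=FC_IFOV_RAD):
          """Approximate pixel footprint on the surface in meters."""
          return altitude_km * 1000.0 * ifov_rad

      print(f"LAMO (~385 km): ~{ground_sample_distance(385.0):.0f} m/pixel")
      # Gives ~36 m/pixel, consistent with the ~35 m/pixel quoted above.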

  7. The Atlases of Vesta derived from Dawn Framing Camera images

    NASA Astrophysics Data System (ADS)

    Roatsch, T.; Kersten, E.; Matz, K.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

    2013-12-01

    During its two HAMO (High Altitude Mapping Orbit) phases in 2011 and 2012, the Dawn Framing Camera acquired about 6,000 clear filter images with a resolution of about 60 m/pixel. We combined these images into a global ortho-rectified mosaic of Vesta (60 m/pixel resolution). Only very small areas near the north pole were still in darkness and are missing from the mosaic. The Dawn Framing Camera also acquired about 10,000 high-resolution clear filter images (about 20 m/pixel) of Vesta during its Low Altitude Mapping Orbit (LAMO). Unfortunately, the northern part of Vesta was still in darkness during this phase; good illumination (incidence angle < 70°) was only available for 66.8% of the surface [1]. We used the LAMO images to calculate another global mosaic of Vesta, this time with 20 m/pixel resolution. Both global mosaics were used to produce atlases of Vesta: a HAMO atlas with 15 tiles at a scale of 1:500,000 and a LAMO atlas with 30 tiles at scales between 1:200,000 and 1:225,180. The nomenclature used in these atlases is based on names and places historically associated with the Roman goddess Vesta and is compliant with the rules of the IAU. Sixty-five names for geological features have already been approved by the IAU; 39 additional names are currently under review. Selected examples from both atlases will be shown in this presentation. Reference: [1] Roatsch, Th., et al., High-resolution Vesta Low Altitude Mapping Orbit Atlas derived from Dawn Framing Camera images. Planetary and Space Science (2013), http://dx.doi.org/10.1016/j.pss.2013.06.024

  8. Dawn First Glimpse of Vesta -- Processed

    NASA Image and Video Library

    2011-05-11

    This image, processed to show the true size of the giant asteroid Vesta, shows Vesta in front of a spectacular background of stars. It was obtained by the framing camera aboard NASA's Dawn spacecraft on May 3, 2011, from a distance of about 750,000 miles.

  9. Dense Region of Impact Craters

    NASA Image and Video Library

    2011-09-23

    NASA's Dawn spacecraft obtained this image of the giant asteroid Vesta with its framing camera on Aug. 14, 2011. This image was taken through the camera's clear filter. The image has a resolution of about 260 meters per pixel.

  10. Reevaluating Surface Composition of Asteroid (4) Vesta by Comparing HED Spectral Data with Dawn Framing Camera (FC) Observations

    NASA Astrophysics Data System (ADS)

    Giebner, T.; Jaumann, R.; Schröder, S.

    2016-08-01

    This master's thesis project aims to reevaluate previous findings on asteroid (4) Vesta's surface composition by using Dawn FC filter image ratios in a new way to identify HED (howardite, eucrite, diogenite) lithologies on the surface.

  11. Juling Crater's Floor

    NASA Image and Video Library

    2018-03-14

    This view from NASA's Dawn mission shows the floor of Ceres' Juling Crater. The crater floor shows evidence of the flow of ice and rock, similar to rock glaciers in Earth's polar regions. Dawn acquired the picture with its framing camera on Aug. 30, 2016. https://photojournal.jpl.nasa.gov/catalog/PIA21920

  12. False-Color Image of an Impact Crater on Vesta

    NASA Image and Video Library

    2011-08-24

    NASA's Dawn spacecraft obtained this false-color image (right) of an impact crater in asteroid Vesta's equatorial region with its framing camera on July 25, 2011. The view on the left is from the camera's clear filter.

  13. Enhanced Early View of Ceres from Dawn

    NASA Image and Video Library

    2014-12-05

    As the Dawn spacecraft flies through space toward the dwarf planet Ceres, the unexplored world appears to its camera as a bright light in the distance, full of possibility for scientific discovery. This view was acquired as part of a final calibration of the science camera before Dawn's arrival at Ceres. To accomplish this, the camera needed to take pictures of a target that appears just a few pixels across. On Dec. 1, 2014, Ceres was about nine pixels in diameter, nearly perfect for this calibration. The images provide data on very subtle optical properties of the camera that scientists will use when they analyze and interpret the details of some of the pictures returned from orbit. Ceres is the bright spot in the center of the image. Because the dwarf planet is much brighter than the stars in the background, the camera team selected a long exposure time to make the stars visible. The long exposure made Ceres appear overexposed, and exaggerated its size; this was corrected by superimposing a shorter exposure of the dwarf planet in the center of the image. A cropped, magnified view of Ceres appears in the inset image at lower left. The image was taken on Dec. 1, 2014 with the Dawn spacecraft's framing camera, using a clear spectral filter. Dawn was about 740,000 miles (1.2 million kilometers) from Ceres at the time. Ceres is 590 miles (950 kilometers) across and was discovered in 1801. http://photojournal.jpl.nasa.gov/catalog/PIA19050
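
    The caption describes compositing a short exposure of Ceres onto a long exposure in which the disk saturates; a minimal sketch of that kind of exposure blending follows (the saturation threshold, exposure times, and array contents are illustrative assumptions):

      # Sketch: replace saturated pixels of a long exposure with a scaled short
      # exposure, in the spirit of the Dec. 1, 2014 calibration composite described above.
      import numpy as np

      def blend_exposures(long_img, short_img, t_long, t_short, full_well=16000):
          """Return a composite in the units of the long exposure: saturated pixels
          are filled with the short exposure scaled by the exposure-time ratio."""
          saturated = long_img >= full_well
          composite = long_img.astype(float).copy()
          composite[saturated] = short_img[saturated] * (t_long / t_short)
          return composite

      # Toy example: a bright disk saturates the 10 s frame but not the 0.1 s frame.
      rng = np.random.default_rng(0)
      long_img = rng.poisson(50, (64, 64)).astype(float)
      short_img = rng.poisson(0.5, (64, 64)).astype(float)
      long_img[28:36, 28:36] = 16000          # saturated disk in the long exposure
      short_img[28:36, 28:36] = 120           # unsaturated in the short exposure
      hdr = blend_exposures(long_img, short_img, t_long=10.0, t_short=0.1)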

  14. Juling Crater

    NASA Image and Video Library

    2018-03-14

    This view from NASA's Dawn mission shows where ice has been detected in the northern wall of Ceres' Juling Crater, which is in almost permanent shadow. Dawn acquired the picture with its framing camera on Aug. 30, 2016, and it was processed with the help of the NASA Ames Stereo Pipeline (ASP) to estimate the slope of the cliff. https://photojournal.jpl.nasa.gov/catalog/PIA21918

  15. 4 Vesta in Color: High Resolution Mapping from Dawn Framing Camera Images

    NASA Technical Reports Server (NTRS)

    Reddy, V.; LeCorre, L.; Nathues, A.; Sierks, H.; Christensen, U.; Hoffmann, M.; Schroeder, S. E.; Vincent, J. B.; McSween, H. Y.; Denevi, B. W.

    2011-01-01

    Rotational surface variations on asteroid 4 Vesta have been known from ground-based and HST observations, and they have been interpreted as evidence of compositional diversity. NASA's Dawn mission entered orbit around Vesta on July 16, 2011 for a year-long global characterization. The framing cameras (FC) onboard the Dawn spacecraft will image the asteroid in one clear (broad) and seven narrow-band filters covering the wavelength range between 0.4 and 1.0 microns. We present color mapping results from the Dawn FC observations of Vesta obtained during Survey orbit (approx. 3,000 km) and High-Altitude Mapping Orbit (HAMO) (approx. 950 km). Our aim is to create global color maps of Vesta using multispectral FC images to identify the spatial extent of compositional units and link them with other available data sets to extract the basic mineralogy. While the VIR spectrometer onboard Dawn has higher spectral resolution (864 channels), allowing precise mineralogical assessment of Vesta's surface, the FC has three times higher spatial resolution in any given orbital phase. In an effort to extract maximum information from FC data, we have developed algorithms using laboratory spectra of pyroxenes and HED meteorites to derive parameters associated with the 1-micron absorption band wing. These parameters will help map the global distribution of compositionally related units on Vesta's surface. Interpretation of these units will involve the integration of FC and VIR data.

  16. Densely Cratered Terrain Near the Terminator

    NASA Image and Video Library

    2011-08-16

    NASA's Dawn spacecraft shows densely cratered terrain near Vesta's terminator on August 6, 2011. This image was taken through the clear filter of the framing camera aboard the spacecraft. North points toward the two o'clock position.

  17. A Valentine from Vesta

    NASA Image and Video Library

    2012-02-14

    This image from NASA's Dawn spacecraft is based on a framing camera image overlain by a color-coded height representation of topography. This heart-shaped hollow is roughly 10 kilometers (6 miles) across at its widest point.

  18. Topography of Vesta Surface

    NASA Image and Video Library

    2011-08-26

    This view of the topography of asteroid Vesta's surface is composed of several images obtained with the framing camera on NASA's Dawn spacecraft on August 6, 2011. The image mosaic is shown superimposed on a digital terrain model.

  19. A New Spin on Vesta

    NASA Image and Video Library

    2010-10-08

    Hubble's Wide Field Camera 3 observed the potato-shaped asteroid in preparation for the visit by NASA's Dawn spacecraft in 2011. This is one frame from a movie showing differences in brightness and color on the asteroid's surface.

  20. Ceres Persistent Shadow

    NASA Image and Video Library

    2016-12-15

    This frame from an animation made of images from NASA's Dawn spacecraft shows a crater in the northern polar region of Ceres that is partly in shadow year-round. In several craters like this one, bright water ice deposits have been observed by Dawn's framing camera. This finding suggests that water ice can be stored for significant amounts of time in cold, dark craters on Ceres. Such reservoirs are called "cold traps." At less than minus 260 degrees Fahrenheit (110 Kelvin), they are so chilly that very little of the ice turns into vapor in the course of a billion years. A movie is available at http://photojournal.jpl.nasa.gov/catalog/PIA21082

  1. Vesta Surface Comes into View

    NASA Image and Video Library

    2011-06-13

    This image from the framing camera aboard NASA's Dawn spacecraft shows surface details beginning to resolve as the spacecraft closes in on the giant asteroid Vesta on June 1, 2011, from a distance of about 300,000 miles (483,000 kilometers).

  2. Side by Side Views of a Dark Hill

    NASA Image and Video Library

    2011-09-02

    NASA's Dawn spacecraft obtained these side-by-side views of a dark hill on the surface of asteroid Vesta with its framing camera on August 19, 2011. The images have a resolution of about 260 meters per pixel.

  3. Topography of Troughs on Vesta

    NASA Image and Video Library

    2011-08-23

    This view of the topography of asteroid Vesta's surface is composed of several images obtained with the clear filter in the framing camera on NASA's Dawn spacecraft on August 6, 2011. The image has a resolution of about 260 meters per pixel.

  4. Dawn Orbit Determination Team: Modeling and Fitting of Optical Data at Vesta

    NASA Technical Reports Server (NTRS)

    Kennedy, Brian; Abrahamson, Matt; Ardito, Alessandro; Haw, Robert; Mastrodemos, Nicholas; Nandi, Sumita; Park, Ryan; Rush, Brian; Vaughan, Andrew

    2013-01-01

    The Dawn spacecraft was launched on September 27th, 2007. Its mission is to consecutively rendezvous with and observe the two largest bodies in the main asteroid belt, Vesta and Ceres. It has already completed over a year's worth of direct observations of Vesta (spanning from early 2011 through late 2012) and is currently on a cruise trajectory to Ceres, where it will begin scientific observations in mid-2015. Achieving this data collection required careful planning and execution from all Dawn operations teams. Dawn's Orbit Determination (OD) team was tasked with reconstruction of the as-flown trajectory as well as determination of the Vesta rotational rate, pole orientation and ephemeris, among other Vesta parameters. Improved knowledge of the Vesta pole orientation, specifically, was needed to target the final maneuvers that inserted Dawn into the first science orbit at Vesta. To solve for these parameters, the OD team used radiometric data from the Deep Space Network (DSN) along with optical data reduced from Dawn's Framing Camera (FC) images. This paper will describe the initial determination of the Vesta ephemeris and pole using a combination of radiometric and optical data, and also the progress the OD team has made since then to further refine the knowledge of Vesta's body frame orientation and rate with these data.
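
    The core estimation task described here combines two measurement types (radiometric Doppler and optical landmark residuals) in one fit; a toy weighted least-squares sketch of that idea follows, with made-up partials matrices and noise levels that only stand in for the real OD filter.

      # Toy sketch: combine two data types in one weighted least-squares solution
      # for a small parameter vector x (e.g., pole and spin-rate corrections).
      # All matrices, weights, and parameter meanings are illustrative assumptions.
      import numpy as np

      def weighted_lsq(H_list, y_list, sigma_list):
          """Solve (H^T W H) x = H^T W y with per-data-type weights 1/sigma^2."""
          H = np.vstack(H_list)
          y = np.concatenate(y_list)
          w = np.concatenate([np.full(len(yi), 1.0 / s**2) for yi, s in zip(y_list, sigma_list)])
          HtW = H.T * w
          return np.linalg.solve(HtW @ H, HtW @ y)

      rng = np.random.default_rng(1)
      x_true = np.array([0.3, -1.2, 2.0])
      H_doppler = rng.normal(size=(50, 3))     # partials of Doppler residuals w.r.t. x
      H_optical = rng.normal(size=(30, 3))     # partials of landmark pixel residuals w.r.t. x
      y_doppler = H_doppler @ x_true + rng.normal(0, 0.05, 50)
      y_optical = H_optical @ x_true + rng.normal(0, 0.20, 30)
      x_est = weighted_lsq([H_doppler, H_optical], [y_doppler, y_optical], [0.05, 0.20])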

  5. How to characterize terrains on 4 Vesta using Dawn Framing Camera color bands?

    NASA Astrophysics Data System (ADS)

    Le Corre, Lucille; Reddy, Vishnu; Nathues, Andreas; Cloutis, Edward A.

    2011-12-01

    We present methods for terrain classification on 4 Vesta using Dawn Framing Camera (FC) color information derived from laboratory spectra of HED meteorites and other Vesta-related assemblages. Color and spectral parameters have been derived using publicly available spectra of these analog materials to identify the best criteria for distinguishing various terrains. We list the relevant parameters for identifying eucrites, diogenites, mesosiderites, pallasites, clinopyroxenes and olivine + orthopyroxene mixtures using Dawn FC color cubes. Pseudo Band I minima derived by fitting a low order polynomial to the color data are found to be useful for extracting the pyroxene chemistry. Our investigation suggests a good correlation (R2 = 0.88) between laboratory measured ferrosilite (Fs) pyroxene chemistry vs. those from pseudo Band I minima using equations from Burbine et al. (Burbine, T.H., Buchanan, P.C., Dolkar, T., Binzel, R.P. [2009]. Planetary Science 44, 1331-1341). The pyroxene chemistry information is a complementary terrain classification capability alongside the color ratios. We also investigated the effects of exogenous material (i.e., CM2 carbonaceous chondrites) on the spectra of HEDs using laboratory mixtures of these materials. Our results are the basis for an automated software pipeline that will allow us to classify terrains on 4 Vesta efficiently.
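
    A schematic per-pixel threshold classifier of the kind such a pipeline might use is sketched below; the parameter names and cutoff values are invented placeholders standing in for the published criteria, not the Le Corre et al. values.

      # Schematic terrain classifier driven by FC color parameters.
      # Thresholds are invented placeholders, NOT the published criteria.
      import numpy as np

      def classify_terrain(r_098_075, band1_center_um):
          """Return an integer class map: 0 = other, 1 = diogenite-like, 2 = eucrite-like."""
          classes = np.zeros_like(r_098_075, dtype=np.uint8)
          diogenite_like = (band1_center_um < 0.925) & (r_098_075 > 1.05)   # placeholder cutoffs
          eucrite_like = (band1_center_um >= 0.935) & (r_098_075 > 1.00)    # placeholder cutoffs
          classes[diogenite_like] = 1
          classes[eucrite_like] = 2
          return classes

      # Example on a tiny synthetic color cube
      r_ratio = np.array([[1.10, 0.98], [1.02, 1.08]])
      bic = np.array([[0.920, 0.940], [0.938, 0.921]])
      print(classify_terrain(r_ratio, bic))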

  6. Ahuna Mons

    NASA Image and Video Library

    2018-03-14

    This view from NASA's Dawn mission shows Ceres' tallest mountain, Ahuna Mons, 2.5 miles (4 kilometers) high and 11 miles (17 kilometers) wide. This is one of the few sites on Ceres at which a significant amount of sodium carbonate has been found, shown in green and red colors in the lower right image. The top and lower left images were collected by Dawn's framing camera. The top image is a 3D view reconstructed with the help of topography data. A non-annotated version is available at https://photojournal.jpl.nasa.gov/catalog/PIA21919

  7. Dawn Gateway View of Ceres

    NASA Image and Video Library

    2014-12-05

    From about three times the distance from Earth to the moon, NASA's Dawn spacecraft spies its final destination -- the dwarf planet Ceres. The resolution of this image does not yet exceed the best views of Ceres, which were obtained by the Hubble Space Telescope (see PIA10235). Nonetheless, Ceres' spherical shape is clearly revealed here. Sunlight illuminates the dwarf planet from the right, leaving a sliver of the surface in shadow at left. A zoomed-in view is provided in Figure 1, along with the original unmagnified, uncropped view. The image was taken on Dec. 1, 2014 with the Dawn spacecraft's framing camera, using a clear spectral filter. Dawn was about 740,000 miles (1.2 million kilometers) from Ceres at the time. Ceres is 590 miles (950 kilometers) across and was discovered in 1801. http://photojournal.jpl.nasa.gov/catalog/PIA19049

  8. Ceres Topographic Globe Animation

    NASA Image and Video Library

    2015-07-28

    This frame from an animation shows a color-coded map from NASA's Dawn mission revealing the highs and lows of topography on the surface of dwarf planet Ceres. The color scale extends from 3.7 miles (6 kilometers) below the surface in purple to 3.7 miles (6 kilometers) above the surface in brown. The brightest features (those appearing nearly white) -- including the well-known bright spots within a crater in the northern hemisphere -- are simply reflective areas and do not represent elevation. The topographic map was constructed by analyzing images from Dawn's framing camera taken from varying sun and viewing angles. The map was combined with an image mosaic of Ceres and projected onto a 3-D shape model of the dwarf planet to create the animation. http://photojournal.jpl.nasa.gov/catalog/PIA19605

  9. The Dawn Gravity Investigation at Vesta and Ceres

    NASA Technical Reports Server (NTRS)

    Konopliv, A. S.; Asmar, S.W.; Bills, B. G.; Mastrodemos, N.; Park, R. S.; Raymond, C. A.; Smith, D. E.; Zuber, M. T.

    2011-01-01

    The objective of the Dawn gravity investigation is to use high precision X-band Doppler tracking and landmark tracking from optical images to measure the gravity fields of Vesta and Ceres to a half-wavelength surface resolution better than 90-km and 300-km, respectively. Depending on the Doppler tracking assumptions, the gravity field will be determined to somewhere between harmonic degrees 15 and 25 for Vesta and about degree 10 for Ceres. The gravity fields together with shape models determined from Dawn's framing camera constrain models of the interior from the core to the crust. The gravity field is determined jointly with the spin pole location. The second degree harmonics together with assumptions on obliquity or hydrostatic equilibrium may determine the moments of inertia.
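
    The quoted half-wavelength resolutions map onto maximum harmonic degree through the usual rule of thumb λ/2 ≈ πR/n; a quick numerical check follows, where the mean radii are approximate assumed values rather than numbers from this record.

      # Rule-of-thumb half-wavelength spatial resolution of a degree-n gravity field:
      # lambda/2 ~ pi * R / n. Mean radii below are approximate assumed values.
      import math

      def half_wavelength_km(radius_km, degree):
          return math.pi * radius_km / degree

      R_VESTA_KM, R_CERES_KM = 263.0, 470.0      # assumed mean radii
      for body, R, n in [("Vesta", R_VESTA_KM, 15), ("Vesta", R_VESTA_KM, 25), ("Ceres", R_CERES_KM, 10)]:
          print(f"{body} degree {n}: ~{half_wavelength_km(R, n):.0f} km")
      # Degrees 15-25 at Vesta give roughly 33-55 km and degree 10 at Ceres ~150 km,
      # comfortably inside the 90 km and 300 km requirements quoted above.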

  10. Origin of Dark Material on Vesta from Dawn FC Data: Remnant Carbonaceous Chondrite Impactors

    NASA Technical Reports Server (NTRS)

    Reddy, V.; LeCorre, L.; Nathues, A.; Mittlefehldt, David W.; Cloutis, E. A.; O'Brien, D. P.; Durda, D. D.; Bottke, W. F.; Buczkowski, D.; Scully, J. E. C.

    2012-01-01

    NASA's Dawn spacecraft entered orbit around asteroid (4) Vesta in July 2011 for a yearlong mapping orbit. Imaging by the Dawn Framing Camera (FC) revealed a surface that is unlike any asteroid we have visited so far with a spacecraft. Albedo and color variations on Vesta are the most diverse in the asteroid belt, with a majority of these linked to distinct compositional units on the asteroid's surface. FC discovered dark material on Vesta. These low albedo surface features were first observed during the Rotational Characterization 3 phase at a resolution of approx. 487 m/pixel. Here we explore the composition and possible meteoritical analogs for the dark material on Vesta.

  11. Ahuna Mons: Side View

    NASA Image and Video Library

    2016-09-01

    Ceres' lonely mountain, Ahuna Mons, is seen in this simulated perspective view. The elevation has been exaggerated by a factor of two. The view was made using enhanced-color images from NASA's Dawn mission. Images taken using blue (440 nanometers), green (750 nanometers) and infrared (960 nanometers) spectral filters were combined to create the view. The spacecraft's framing camera took the images from Dawn's low-altitude mapping orbit, from an altitude of 240 miles (385 kilometers) in August 2016. The resolution of the component images is 120 feet (35 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20915
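
    A minimal sketch of how three filter frames can be stretched and stacked into the kind of enhanced-color view described here follows; the percentile stretch limits and the random stand-in arrays are assumptions.

      # Sketch: build an enhanced-color composite by assigning 960 nm, 750 nm, and
      # 440 nm filter frames to the R, G, B channels with a percentile stretch.
      import numpy as np

      def stretch(img, lo_pct=1.0, hi_pct=99.0):
          """Linearly rescale an image so the given percentiles map to 0..1."""
          lo, hi = np.percentile(img, [lo_pct, hi_pct])
          return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

      def enhanced_color(ir_960, vis_750, blue_440):
          """Stack IR/green/blue filter frames into an HxWx3 enhanced-color array."""
          return np.dstack([stretch(ir_960), stretch(vis_750), stretch(blue_440)])

      # Toy example with random frames standing in for calibrated FC images
      rng = np.random.default_rng(2)
      rgb = enhanced_color(*(rng.random((128, 128)) for _ in range(3)))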

  12. Oxo Crater: Side View

    NASA Image and Video Library

    2016-09-01

    Ceres' lonely mountain, Ahuna Mons, is seen in this simulated perspective view. The elevation has been exaggerated by a factor of two. The view was made using enhanced-color images from NASA's Dawn mission. Images taken using blue (440 nanometers), green (750 nanometers) and infrared (960 nanometers) spectral filters were combined to create the view. The spacecraft's framing camera took the images from Dawn's low-altitude mapping orbit, from an altitude of 240 miles (385 kilometers) in August 2016. The resolution of the component images is 120 feet (35 meters) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA20915

  13. Dawn Maps the Surface Composition of Vesta

    NASA Technical Reports Server (NTRS)

    Prettyman, T.; Palmer, E.; Reedy, R.; Sykes, M.; Yingst, R.; McSween, H.; DeSanctis, M. C.; Capaccioni, F.; Capria, M. T.; Filacchione, G.

    2011-01-01

    By 7 October 2011, the Dawn mission will have completed Survey orbit and commenced high altitude mapping of 4 Vesta. We present a preliminary analysis of data acquired by Dawn's Framing Camera (FC) and the Visual and InfraRed Spectrometer (VIR) to map mineralogy and surface temperature, and to detect and quantify surficial OH. The radiometric calibration of VIR and FC is described. Background counting data acquired by GRaND are used to determine elemental detection limits from measurements at low altitude, which will commence in November. Geochemical models used in the interpretation of the data are described. Thermal properties, mineral data, and geochemical data are combined to provide constraints on Vesta's formation and thermal evolution, the delivery of exogenic materials, space weathering processes, and the origin of the howardite, eucrite, and diogenite (HED) meteorites.

  14. High-resolution Ceres HAMO Atlas derived from Dawn FC Images

    NASA Astrophysics Data System (ADS)

    Roatsch, T.; Kersten, E.; Matz, K. D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C. T.

    2015-12-01

    Introduction: NASA's Dawn spacecraft will orbit the dwarf planet Ceres in August and September 2015 in HAMO (High Altitude Mapping Orbit) at an altitude of about 1,500 km to characterize, among other things, the geology, topography, and shape of Ceres before it is transferred to the lowest orbit. One of the major goals of this mission phase is the global mapping of Ceres. Data: The Dawn mission is equipped with a framing camera (FC). The framing camera will take about 2,600 clear filter images with a resolution of about 120 m/pixel at different viewing angles and under different illumination conditions. Data Processing: The first step of the processing chain towards the cartographic products is to ortho-rectify the images to the proper scale and map projection type. This process requires detailed information about the Dawn orbit and attitude data and the topography of the target. Both improved orientation and high-resolution shape models are provided by stereo processing of the HAMO dataset. Ceres' HAMO shape model is used for the calculation of the ray intersection points, while the map projection itself will be done onto a reference sphere for Ceres. The final step is the controlled mosaicking of all nadir images into a global mosaic of Ceres, the so-called basemap. Ceres map tiles: The Ceres atlas will be produced at a scale of 1:750,000 and will consist of 15 tiles that conform to the quadrangle schema for small planets and medium-sized icy satellites. A map scale of 1:750,000 guarantees mapping at the highest available Dawn resolution in HAMO. Nomenclature: The Dawn team proposed to the International Astronomical Union (IAU) to use the names of gods and goddesses of agriculture and vegetation from world mythology as names for the craters. This proposal was accepted by the IAU, and the team proposed names for geological features to the IAU based on the HAMO mosaic. These feature names will be applied to the map tiles.
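
    The core of the ortho-rectification step is intersecting each pixel's look ray with the target shape and converting that point to latitude/longitude; a geometry-only sketch follows, in which the spacecraft state, boresight, and radius are placeholder inputs and a sphere stands in for the stereo-derived HAMO shape model used by the real pipeline.

      # Geometry sketch: intersect a camera look ray with a reference sphere and
      # return planetocentric lat/lon of the surface point. Inputs are placeholders;
      # the real pipeline uses reconstructed SPICE data and the HAMO shape model.
      import numpy as np

      def ray_sphere_latlon(sc_pos_km, look_dir, radius_km):
          """Return (lat_deg, lon_deg) of the first intersection, or None if the ray misses."""
          d = look_dir / np.linalg.norm(look_dir)
          b = 2.0 * np.dot(sc_pos_km, d)
          c = np.dot(sc_pos_km, sc_pos_km) - radius_km**2
          disc = b * b - 4.0 * c
          if disc < 0.0:
              return None
          t = (-b - np.sqrt(disc)) / 2.0          # nearer intersection along the ray
          if t < 0.0:
              return None
          p = sc_pos_km + t * d
          lat = np.degrees(np.arcsin(p[2] / radius_km))
          lon = np.degrees(np.arctan2(p[1], p[0])) % 360.0
          return lat, lon

      # Example: spacecraft 1475 km above a 470 km reference sphere, looking nadir
      print(ray_sphere_latlon(np.array([0.0, 0.0, 1945.0]), np.array([0.0, 0.0, -1.0]), 470.0))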

  15. Ernutet Crater - Enhanced Color

    NASA Image and Video Library

    2017-02-16

    This enhanced color composite image, made with data from the framing camera aboard NASA's Dawn spacecraft, shows the area around Ernutet crater. The bright red portions appear redder with respect to the rest of Ceres. In a 2017 study in the journal Science, researchers from the Dawn science team found that these red areas around Ernutet are associated with evidence of organic material. Images taken using blue (440 nanometers), green (750 nanometers) and infrared (960 nanometers) spectral filters were combined to create the view. Ernutet Crater measures about 32 miles (52 kilometers) in diameter and is located in the northern hemisphere. http://photojournal.jpl.nasa.gov/catalog/PIA21419

  16. A Perfect View of Vesta: Creating Pointing Observations for the Dawn Spacecraft on Asteroid 4 Vesta

    NASA Technical Reports Server (NTRS)

    Hay, Katrina M.

    2005-01-01

    The Dawn spacecraft has a timely and clever assignment in store. It will take a close look at two intact survivors from the dawn of the solar system (asteroids 4 Vesta and 1 Ceres) to understand more about solar system origin and evolution. To optimize science return, Dawn must make carefully designed observations on approach and in survey orbit, high altitude mapping orbit, and low altitude mapping orbit at each body. In this report, observations outlined in the science plan are modeled using the science opportunity analyzer program for the Vesta encounter. Specifically, I encoded Dawn's flight rules into the program, modeled pointing profiles of the optical instruments (framing camera, visible infrared spectrometer) and mapped their fields of view onto Vesta's surface. Visualization of coverage will provide the science team with information necessary to assess feasibility of alternative observation plans. Dawn launches in summer 2006 and ends its journey in 2016. Instrument observations on Vesta in 2011 will supply detailed information about Vesta's surface and internal structure. These data will be used to analyze the formation and history of the protoplanet and, therefore, complete an important step in understanding the development of our solar system.

  17. High-resolution Ceres LAMO atlas derived from Dawn FC images

    NASA Astrophysics Data System (ADS)

    Roatsch, T.; Kersten, E.; Matz, K. D.; Preusker, F.; Scholten, F.; Jaumann, R.; Raymond, C. A.; Russell, C.

    2016-12-01

    Introduction: NASA's Dawn spacecraft has been orbiting the dwarf planet Ceres since December 2015 in LAMO (Low Altitude Mapping Orbit) at an altitude of about 400 km to characterize, among other things, the geology, topography, and shape of Ceres. One of the major goals of this mission phase is the global high-resolution mapping of Ceres. Data: The Dawn mission is equipped with a framing camera (FC). At the time of writing, the framing camera had taken about 27,500 clear filter images in LAMO with a resolution of about 30 m/pixel at different viewing angles and under different illumination conditions. Data Processing: The first step of the processing chain towards the cartographic products is to ortho-rectify the images to the proper scale and map projection type. This process requires detailed information about the Dawn orbit and attitude data and the topography of the target. A high-resolution shape model was provided by stereo processing of the HAMO dataset; orbit and attitude data are available as reconstructed SPICE data. Ceres' HAMO shape model is used for the calculation of the ray intersection points, while the map projection itself was done onto a reference sphere of Ceres. The final step is the controlled mosaicking of all nadir images into a global mosaic of Ceres, the so-called basemap. Ceres map tiles: The Ceres atlas will be produced at a scale of 1:250,000 and will consist of 62 tiles that conform to the quadrangle schema for Venus at 1:5,000,000. A map scale of 1:250,000 is a compromise between the very high resolution in LAMO and a proper map sheet size for the single tiles. Nomenclature: The Dawn team proposed to the International Astronomical Union (IAU) to use the names of gods and goddesses of agriculture and vegetation from world mythology as names for the craters and to use names of agricultural festivals of the world for other geological features. This proposal was accepted by the IAU, and the team proposed 92 names for geological features to the IAU based on the LAMO mosaic. These feature names will be applied to the map tiles.

  18. Preliminary results on the various U.V. straylight sources for the VWFC onboard SL 1 [Very Wide Field Camera]

    NASA Technical Reports Server (NTRS)

    Viton, M.; Courtes, G.; Sivan, J. P.; Decher, R.; Gary, A.

    1985-01-01

    Technical difficulties encountered using the Very Wide Field Camera (VWFC) during the Spacelab 1 Shuttle mission are reported. The VWFC is a wide-field, low-resolution (5 arcmin half-width) photographic camera, capable of operating in both spectrometric and photometric modes. The bandpasses of the photometric mode of the VWFC are defined by three Al + MgF2 interference filters. A piggy-back spectrograph attached to the VWFC was used for observations in the spectrometric mode. A total of 48 astronomical frames were obtained using the VWFC, of which only 20 were considered to be of adequate quality for astronomical data processing. Preliminary analysis of the 28 poor-quality images revealed the following possible defects in the VWFC: darkness in the spacing frames, twilight/dawn UV straylight, and internal UV straylight. Improvements in the VWFC astronomical data processing scheme are expected to help identify and eliminate UV straylight sources in the future.

  19. Dawn: An Ion-Propelled Journey to the Beginning of the Solar System

    NASA Technical Reports Server (NTRS)

    Brophy, John R.; Rayman, Marc D.; Pavri, Betina

    2008-01-01

    The Dawn mission is designed to perform a scientific investigation of the two heaviest main-belt asteroids, Vesta and Ceres. These bodies are believed to preserve records of the physical and chemical conditions present during the formation of the solar system. The mission uses an ion propulsion system to enable the single Dawn spacecraft and its complement of scientific instruments to orbit both of these asteroids. Dawn's three science instruments - the gamma ray and neutron detector, the visible and infrared mapping spectrometer, and the primary framing camera - were successfully tested after launch and are functioning normally. The ion propulsion system includes three ion thrusters of the type flown previously on NASA's Deep Space 1 mission. A minimum of two ion thrusters is necessary to accomplish the Dawn mission. Checkout of two of the ion thrusters was completed as planned within 30 days after launch. This activity confirmed that the spacecraft has two healthy ion thrusters. While further checkout activities are still in progress, the activities completed as of the end of October indicate that the spacecraft is well on its way toward being ready for the start of the thrusting-cruise phase of the mission beginning December 15th.

  20. Ceres in Color

    NASA Image and Video Library

    2016-11-18

    This image of Ceres approximates how the dwarf planet's colors would appear to the eye. This view of Ceres, produced by the German Aerospace Center in Berlin, combines images taken during Dawn's first science orbit in 2015 using the framing camera's red, green and blue spectral filters. The color was calculated using a reflectance spectrum, which is based on the way that Ceres reflects different wavelengths of light and the solar wavelengths that illuminate Ceres. http://photojournal.jpl.nasa.gov/catalog/PIA21079

  1. How the Distribution of Impact Ejecta may explain Surface Features on Ceres and Saturnian Satellites

    NASA Astrophysics Data System (ADS)

    Schmedemann, N.; Neesemann, A.; Schulzeck, F.; Krohn, K.; von der Gathen, I.; Otto, K. A.; Jaumann, R.; Wagner, R.; Michael, G.; Raymond, C. A.; Russell, C. T.

    2017-09-01

    The high rate of Ceres' rotation, in addition to its low surface gravity, results in strong Coriolis forces affecting significant amounts of far-flying impact ejecta. Dawn Framing Camera observations of specific orientations of secondary crater chains and global-scale color ratio anomalies can be explained by application of our crater ejecta distribution model. The model is also applied to Saturnian satellites to understand their patterns of secondary crater chains and clusters.
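
    A minimal sketch of the kind of rotating-frame ballistic integration that underlies such an ejecta-distribution model is given below; the gravity, spin period, and launch state are approximate assumed values, and a real model would track ejecta over Ceres' curved, rotating surface rather than a flat local tangent plane.

      # Sketch: ballistic ejecta in a rotating local frame, showing Coriolis deflection.
      # Gravity, spin rate, and launch state are assumed values; centrifugal terms and
      # surface curvature are neglected for simplicity.
      import numpy as np

      G_CERES = 0.28                       # m/s^2, assumed surface gravity
      OMEGA = 2.0 * np.pi / (9.07 * 3600)  # rad/s, assumed 9.07 h rotation period
      omega_vec = np.array([0.0, 0.0, OMEGA])   # spin axis along local vertical (polar launch site)

      def fly(v0, dt=0.5):
          """Integrate r'' = -g z_hat - 2 omega x v until the particle returns to z = 0."""
          r = np.zeros(3)
          v = np.array(v0, dtype=float)
          while True:
              a = np.array([0.0, 0.0, -G_CERES]) - 2.0 * np.cross(omega_vec, v)
              v = v + a * dt
              r = r + v * dt
              if r[2] <= 0.0 and v[2] < 0.0:
                  return r

      # Launch 100 m/s at 45 degrees toward +x; the y-component of the landing point
      # is the Coriolis deflection relative to a non-rotating body.
      landing = fly([100.0 / np.sqrt(2), 0.0, 100.0 / np.sqrt(2)])
      print(landing[:2] / 1000.0, "km downrange / deflected")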

  2. Ceres' Global Cryosphere

    NASA Astrophysics Data System (ADS)

    Sizemore, H. G.; Prettyman, T. H.; De Sanctis, M. C.; Schmidt, B. E.; Hughson, K.; Chilton, H.; Castillo, J. C.; Platz, T.; Schorghofer, N.; Bland, M. T.; Sori, M.; Buczkowski, D.; Byrne, S.; Landis, M. E.; Fu, R.; Ermakov, A.; Raymond, C. A.; Schwartz, S. J.

    2017-12-01

    Prior to the arrival of the Dawn spacecraft at Ceres, the dwarf planet was anticipated to have a deep global cryosphere protected by a thin silicate lag. Gravity science along with data collected by Dawn's Framing Camera (FC), Gamma Ray and Neutron Detector (GRaND), and Visible and Infrared Mapping Spectrometer (VIR-MS) during the primary mission at Ceres have confirmed the existence of a global, silicate-rich cryosphere, and suggest the existence of deeper ice, brine, or mud layers. As such, Ceres' surface morphology has characteristics in common with both Mars and the small icy bodies of the outer solar system. We will summarize the evidence for the existence and global extent of the Cerean cryosphere. We will also discuss the range of morphological features that have been linked to subsurface ice, and highlight outstanding science questions.

  3. Colors and Photometry of Bright Materials on Vesta as Seen by the Dawn Framing Camera

    NASA Technical Reports Server (NTRS)

    Schroeder, S. E.; Li, J.-Y.; Mittlefehldt, D. W.; Pieters, C. M.; De Sanctis, M. C.; Hiesinger, H.; Blewett, D. T.; Russell, C. T.; Raymond, C. A.; Keller, H. U.

    2012-01-01

    The Dawn spacecraft has been in orbit around the asteroid Vesta since July 2011. The on-board Framing Camera has acquired thousands of high-resolution images of the regolith-covered surface through one clear and seven narrow-band filters in the visible and near-IR wavelength range. It has observed bright and dark materials that have a range of reflectance that is unusually wide for an asteroid. Material brighter than average is predominantly found on crater walls and in ejecta surrounding craters in the southern hemisphere. Most likely, the brightest material identified on Vesta's surface so far is located on the inside of a crater at 64.27 deg S, 1.54 deg. The apparent brightness of a regolith is influenced by factors such as particle size, mineralogical composition, and viewing geometry. As such, the presence of bright material can indicate differences in lithology and/or degree of space weathering. We retrieve the spectral and photometric properties of various bright terrains from false-color images acquired in the High Altitude Mapping Orbit (HAMO). We find that most bright material has a deeper 1-μm pyroxene band than average. However, the aforementioned brightest material appears to have a 1-μm band that is actually less deep, a result that awaits confirmation by the on-board VIR spectrometer. This site may harbor a class of material unique for Vesta. We discuss the implications of our spectral findings for the origin of bright materials.

  4. Photometric properties of Ceres from telescopic observations using Dawn Framing Camera color filters

    NASA Astrophysics Data System (ADS)

    Reddy, Vishnu; Li, Jian-Yang; Gary, Bruce L.; Sanchez, Juan A.; Stephens, Robert D.; Megna, Ralph; Coley, Daniel; Nathues, Andreas; Le Corre, Lucille; Hoffmann, Martin

    2015-11-01

    The dwarf planet Ceres is likely differentiated similar to the terrestrial planets but with a water/ice dominated mantle and an aqueously altered crust. Detailed modeling of Ceres' phase function has never been performed to understand its surface properties. The Dawn spacecraft began orbital science operations at the dwarf planet in April 2015. We observed Ceres with flight spares of the seven Dawn Framing Camera color filters mounted on ground-based telescopes over the course of three years to model its phase function versus wavelength. Our analysis shows that the modeled geometric albedos derived from both the IAU HG model and the Hapke model are consistent with a flat and featureless spectrum of Ceres, although the values are ∼10% higher than previous measurements. Our models also suggest a wavelength dependence of Ceres' phase function. The IAU G-parameter and the Hapke single-particle phase function parameter, g, are both consistent with decreasing (shallower) phase slope with increasing wavelength. Such a wavelength dependence of phase function is consistent with reddening of spectral slope with increasing phase angle, or phase-reddening. This phase reddening is consistent with previous spectra of Ceres obtained at various phase angles archived in the literature, and consistent with the fact that the modeled geometric albedo spectrum of Ceres is the bluest of all spectra because it represents the spectrum at 0° phase angle. Ground-based FC color filter lightcurve data are consistent with HST albedo maps confirming that Ceres' lightcurve is dominated by albedo and not shape. We detected a positive correlation between 1.1-μm absorption band depth and geometric albedo suggesting brighter areas on Ceres have absorption bands that are deeper. We did not see the "extreme" slope values measured by Perna et al. (Perna, D., et al. [2015]. Astron. Astrophys. 575 (L1-6)), which they have attributed to "resurfacing episodes" on Ceres.
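
    The IAU (H, G) magnitude system referenced here has a standard closed-form approximation for the two phase functions; a small implementation follows, using the commonly quoted Bowell et al. constants, with sample H and G values that are placeholders rather than the Ceres fits from this paper.

      # IAU two-parameter (H, G) phase function in the common closed-form approximation
      # Phi_i(alpha) = exp(-A_i * tan(alpha/2)**B_i), A1=3.33, B1=0.63, A2=1.87, B2=1.22.
      import numpy as np

      def reduced_magnitude(h_mag, g_param, phase_deg):
          """V(1,1,alpha): magnitude at unit helio/geocentric distance vs. phase angle."""
          a = np.radians(np.asarray(phase_deg, dtype=float))
          phi1 = np.exp(-3.33 * np.tan(a / 2.0) ** 0.63)
          phi2 = np.exp(-1.87 * np.tan(a / 2.0) ** 1.22)
          return h_mag - 2.5 * np.log10((1.0 - g_param) * phi1 + g_param * phi2)

      # A larger G gives a shallower phase slope, the behavior the paper links to
      # phase reddening when G increases with wavelength; placeholder H and G values.
      for g in (0.05, 0.15):
          print(g, reduced_magnitude(3.3, g, [5, 10, 20]))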

  5. Dawn Framing Camera: Morphology and morphometry of impact craters on Ceres

    NASA Astrophysics Data System (ADS)

    Platz, T.; Nathues, A.; Schäfer, M.; Hoffmann, M.; Kneissl, T.; Schmedemann, N.; Vincent, J.-B.; Büttner, I.; Gutierrez-Marques, P.; Ripken, J.; Russell, C. T.; Schäfer, T.; Thangjam, G. S.

    2015-10-01

    In the first approach images of Ceres, we tried to discern the simple-to-complex transition diameter of impact craters. Limited by spatial resolution, we found the smallest complex crater without central peak development to be around 21.4 km in diameter. Hence, the transition diameter is expected to be between 21.4 km and 10.6 km, the predicted transition diameter for an icy target. It appears likely that either Ceres' surface material contains a rocky component or has a laterally inhomogeneous composition ranging from icy to ice-rocky.

  6. The Dawn Topography Investigation

    NASA Technical Reports Server (NTRS)

    Raymond, C. A.; Jaumann, R.; Nathues, A.; Sierks, H.; Roatsch, T.; Preusker, F.; Scholten, F.; Gaskell, R. W.; Jorda, L.; Keller, H.-U.

    2011-01-01

    The objective of the Dawn topography investigation is to derive the detailed shapes of 4 Vesta and 1 Ceres in order to create orthorectified image mosaics for geologic interpretation, as well as to study the asteroids' landforms, interior structure, and the processes that have modified their surfaces over geologic time. In this paper we describe our approaches for producing shape models, plans for acquiring the needed image data for Vesta, and the results of a numerical simulation of the Vesta mapping campaign that quantify the expected accuracy of our results. Multi-angle images obtained by Dawn's framing camera will be used to create topographic models with 100 m/pixel horizontal resolution and 10 m height accuracy at Vesta, and 200 m/pixel horizontal resolution and 20 m height accuracy at Ceres. Two different techniques, stereophotogrammetry and stereophotoclinometry, are employed to model the shape; these models will be merged with the asteroidal gravity fields obtained by Dawn to produce geodetically controlled topographic models for each body. The resulting digital topography models, together with the gravity data, will reveal the tectonic, volcanic and impact history of Vesta, and enable co-registration of data sets to determine Vesta's geologic history. At Ceres, the topography will likely reveal much about processes of surface modification as well as the internal structure and evolution of this dwarf planet.

  7. Topographic Ceres Map With Crater Names

    NASA Image and Video Library

    2015-07-28

    This color-coded map from NASA's Dawn mission shows the highs and lows of topography on the surface of dwarf planet Ceres. It is labeled with names of features approved by the International Astronomical Union. Occator, the crater containing Ceres' mysterious bright spots, is named after the Roman agriculture deity of harrowing, a method of leveling soil. The bright spots retain their bright appearance in this map, although they are color-coded with the same green elevation as the crater floor in which they sit. The color scale extends about 5 miles (7.5 kilometers) below the surface in indigo to 5 miles (7.5 kilometers) above the surface in white. The topographic map was constructed by analyzing images from Dawn's framing camera taken at varying sun and viewing angles. The map was combined with an image mosaic of Ceres and projected in a simple cylindrical projection. http://photojournal.jpl.nasa.gov/catalog/PIA19606

  8. Regolith Depth, Mobility, and Variability on Vesta from Dawn's Low Altitude Mapping Orbit

    NASA Technical Reports Server (NTRS)

    Denevi, B. W.; Coman, E. I.; Blewett, D. T.; Mittlefehldt, D. W.; Buczkowski, D. L.; Combe, J.-P.; De Sanctis, M. C.; Jaumann, R.; Li, J.-Y.; Marchi, S.

    2012-01-01

    Regolith, the fragmental debris layer formed from impact events of all sizes, covers the surface of all asteroids imaged by spacecraft to date. Here we use Framing Camera (FC) images [1] acquired by the Dawn spacecraft [2] from its low-altitude mapping orbit (LAMO) of 210 km (pixel scales of 20 m) to characterize regolith depth, variability, and mobility on Vesta, and to locate areas of especially thin regolith and exposures of competent material. These results will help to evaluate how the surface of this differentiated asteroid has evolved over time, and provide key contextual information for understanding the origin and degree of mixing of the surficial materials for which compositions are estimated [3,4] and the causes of the relative spectral immaturity of the surface [5]. Vestan regolith samples, in the form of howardite meteorites, can be studied in the laboratory to provide complementary constraints on the regolith process [6].

  9. Stray light calibration of the Dawn Framing Camera

    NASA Astrophysics Data System (ADS)

    Kovacs, Gabor; Sierks, Holger; Nathues, Andreas; Richards, Michael; Gutierrez-Marques, Pablo

    2013-10-01

    Sensitive imaging systems with high dynamic range onboard spacecraft are susceptible to ghost and stray-light effects. During the design phase, the Dawn Framing Camera was laid out and optimized to minimize those unwanted, parasitic effects. However, the low-distortion requirement on the optical design and the use of a front-lit focal plane array introduced an additional stray light component. This paper presents the ground-based and in-flight procedures characterizing the stray-light artifacts. The in-flight test used the Sun as the stray light source at different angles of incidence. The spacecraft was commanded to point at predefined solar elongation positions, and long exposure images were recorded. The PSNIT function was calculated from the known illumination and the ground-based calibration information. In the ground-based calibration, several extended and point sources were used with long exposure times in dedicated imaging setups. The tests revealed that the major contribution to the stray light comes from ghost reflections between the focal plane array and the bandpass interference filters. Various laboratory experiments and computer modeling simulations were carried out to quantify the amount of this effect, including the analysis of the diffractive reflection pattern generated by the imaging sensor. Accurate characterization of the detector reflection pattern is the key to successfully predicting the intensity distribution of the ghost image. Based on the results and the properties of the optical system, a novel correction method is applied in the image processing pipeline. The effect of this correction procedure is also demonstrated with the first images of asteroid Vesta.
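
    A highly simplified sketch of a ghost-subtraction step of the general kind described here follows: a predicted ghost pattern, built by smoothing the in-band image with a kernel standing in for the detector-filter reflection pattern, is scaled and subtracted. The kernel shape and ghost fraction are assumptions, not the calibrated FC correction.

      # Simplified ghost subtraction: predict the ghost as a smoothed, scaled copy
      # of the image and remove it. Kernel and scale are illustrative assumptions.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def remove_ghost(image, ghost_fraction=0.02, kernel_sigma_px=15.0):
          """Subtract a smoothed, scaled copy of the image standing in for the ghost
          reflection between the sensor and the bandpass filter."""
          ghost = ghost_fraction * gaussian_filter(image, sigma=kernel_sigma_px)
          return image - ghost

      rng = np.random.default_rng(3)
      frame = rng.random((256, 256)) * 100.0
      corrected = remove_ghost(frame)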

  10. Nature of the "Orange" Material on Vesta From Dawn

    NASA Technical Reports Server (NTRS)

    LeCorre, L.; Reddy, V.; Schmedemann, N.; Becker, K. J.; O'Brien, D. P.; Yamashita, N.; Peplowski, P. N.; Prettyman, T. H.; Li, J.-Y.; Cloutis, E. A.

    2014-01-01

    From ground-based observations of Vesta, it is well known that the Vestan surface has a large variation in albedo. Analysis of images acquired by the Hubble Space Telescope allowed production of the first color maps of Vesta and showed a diverse surface in terms of reflectance. Thanks to images collected by the Dawn spacecraft at Vesta, it became obvious that these specific units observed previously can be linked to geological features. The presence of the darkest material, mostly around impact craters and scattered in the western hemisphere, has been associated with carbonaceous chondrite contamination [4], whereas the brightest materials are believed to result from exposure of unaltered material from the subsurface of Vesta (in fresh-looking impact crater rims and in Rheasilvia's ejecta and rim remnants). Here we focus on a distinct material characterized by a steep slope in the near-IR relative to all other kinds of materials found on Vesta. It was first detected when combining Dawn Framing Camera (FC) color images in Clementine false-color composites [5] during the Approach phase of the mission (100,000 to 5,200 km from Vesta). We investigate the mineralogical and elemental composition of this material and its relationship with the HEDs (Howardite-Eucrite-Diogenite group of meteorites).

  11. Dawn Mission: A Journey in Space and Time

    NASA Technical Reports Server (NTRS)

    Russell, C. T.; Coradini, A.; DeSanctis, M. C.; Feldman, W. C.; Jaumann, R.; Konopliv, A. S.; McCord, T. B.; McFadden, L. A.; McSween, H. Y.; Mottola, S.

    2003-01-01

    By successively orbiting both 4 Vesta and 1 Ceres the Dawn mission directly addresses the longstanding goals of NASA and the planetary community to understand the origin and evolution of the solar system by obtaining geophysical and geochemical data on diverse main belt asteroids. Ceres and Vesta are two complementary terrestrial protoplanets (one apparently "wet" and one "dry"), whose accretion was terminated by the formation of Jupiter. Ceres is little changed since it formed in the early solar system, while Vesta has experienced significant heating and differentiation. Both have remained intact over the age of the solar system, thereby retaining a record of events and processes from the time of planet formation. Detailed study of the geophysics and geochemistry of these two bodies provides critical benchmarks for the early solar system conditions and processes that shaped its subsequent evolution. Dawn provides the missing context for both primitive and evolved meteoritic data, thus playing a central role in understanding terrestrial planet formation and the evolution of the asteroid belt. Dawn is to be launched in May 2006 arriving at Vesta in 2010 and Ceres in 2014, stopping at each to make 11 months of orbital measurements. The spacecraft uses solar electric propulsion both in cruise and in orbit to make most efficient use of its xenon propellant. The spacecraft carries a framing camera, visible and infrared mapping spectrometer, gamma ray/neutron spectrometer, a laser altimeter, magnetometer, and radio science.

  12. Vesta's Elemental Composition

    NASA Technical Reports Server (NTRS)

    Prettyman, T. H.; Beck, A. W.; Feldman, W. C.; Lawrence, D. J.; McCoy, T. J.; McSween, H. Y.; Mittlefehldt, D. W.; Peplowski, P. N.; Raymond, C. A.; Reedy, R. C.

    2014-01-01

    Many lines of evidence (e.g. common geochemistry, chronology, O-isotope trends, and the presence of different HED rock types in polymict breccias) indicate that the howardite, eucrite, and diogenite (HED) meteorites originated from a single parent body. Meteorite studies show that this protoplanet underwent igneous differentiation to form a metallic core, an ultramafic mantle, and a basaltic crust. A spectroscopic match between the HEDs and 4 Vesta along with a plausible mechanism for their transfer to Earth, perhaps as chips off V-type asteroids ejected from Vesta's southern impact basin, supports the consensus view that many of these achondritic meteorites are samples of Vesta's crust and upper mantle. The HED-Vesta connection was put to the test by the NASA Dawn mission, which spent a year in close proximity to Vesta. Measurements by Dawn's three instruments, redundant Framing Cameras (FC), a Visible-InfraRed (VIR) spectrometer, and a Gamma Ray and Neutron Detector (GRaND), along with radio science have strengthened the link. Gravity measurements by Dawn are consistent with a differentiated, silicate body, with a dense Fe-rich core. The range of pyroxene compositions determined by VIR overlaps that of the howardites. Elemental abundances determined by nuclear spectroscopy are also consistent with HED-compositions. Observations by GRaND provided a new view of Vesta inaccessible by telescopic observations. Here, we summarize the results of Dawn's geochemical investigation of Vesta and their implications.

  13. Boulders on Ceres

    NASA Technical Reports Server (NTRS)

    Schroder, S. E.; Carsenty, U.; Neesemann, A.; Jaumann, R.; Marchi, S.; McFadden, L. A.; Otto, K.; Schenk, P.; Schulzeck, F.; Raymond, C. A.

    2017-01-01

    Introduction: In December 2015 the Dawn spacecraft moved into the Low Altitude Mapping Orbit (LAMO) around Ceres, encircling the dwarf planet at a distance of 400 km to the surface below. At this altitude, images from the on-board framing camera have a resolution of 36 meters per pixel, high enough to distinguish large boulders on the surface. Indeed, LAMO images show a multitude of boulders around what seem to be fresh craters. The average lifetime of boulders on Dawn's previous target, Vesta, was estimated to be similar to that of lunar boulders, as may be expected from the basaltic surface composition. The bulk composition of Ceres may be carbonaceous chondrite-like with significant contributions of clays, salt, and water ice. As such, the abundance and distribution of boulders on Ceres may be different from that on Vesta. We mapped, counted, and measured the diameter of boulders over the entire surface of Ceres. Our analysis of the data in combination with crater age estimates may provide clues to the physical nature and composition of the surface.
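
    Once boulder diameters are mapped and counted, the standard product is a cumulative size-frequency distribution; a small sketch follows, where the diameters and the counting area are made-up numbers.

      # Sketch: cumulative size-frequency distribution N(>=D) per km^2 from a list of
      # mapped boulder diameters. Diameters and mapped area are made-up numbers.
      import numpy as np

      def cumulative_sfd(diameters_m, area_km2):
          """Return diameters sorted large-to-small and the cumulative density N(>=D) per km^2."""
          d = np.sort(np.asarray(diameters_m, dtype=float))[::-1]
          n_cum = np.arange(1, d.size + 1) / area_km2
          return d, n_cum

      diam = [45, 60, 38, 80, 52, 41, 36, 120, 66, 39]    # meters, illustrative
      d, n = cumulative_sfd(diam, area_km2=25.0)
      for di, ni in zip(d, n):
          print(f"N(>= {di:.0f} m) = {ni:.2f} per km^2")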

  14. NASA's Dawn Mission to Asteroid 4 Vesta

    NASA Technical Reports Server (NTRS)

    McFadden, Lucyann A.

    2011-01-01

    NASA's Dawn Mission to asteroid 4 Vesta is part of a 13-year robotic space project designed to reveal the nature of two of the largest asteroids in the Main Asteroid Belt of our Solar System. Ceres and Vesta are two complementary terrestrial protoplanets whose accretion was probably terminated by the formation of Jupiter. They provide a bridge in our understanding between the rocky bodies of the inner solar system and the icy bodies of the outer solar system. Ceres appears to be undifferentiated, while Vesta has experienced significant heating and likely differentiation. Both formed very early in the history of the solar system and, while suffering many impacts, have remained intact, thereby retaining a record of events and processes from the time of planet formation. Detailed study of the geophysics and geochemistry of these two bodies provides critical benchmarks for early solar system conditions and processes that shaped its subsequent evolution. Dawn provides the missing context for both primitive and evolved meteoritic data, thus playing a central role in understanding terrestrial planet formation and the evolution of the asteroid belt. Dawn is to be launched in 2006, arriving at Vesta in 2010 and Ceres in 2014, stopping at each to make 11 months of orbital measurements. The spacecraft uses solar electric propulsion, both in cruise and in orbit, to make most efficient use of its xenon propellant. The spacecraft carries a framing camera, visible and infrared mapping spectrometer, gamma ray/neutron spectrometer, magnetometer, and radio science.

  15. Evidence for ground-ice occurrence on asteroid Vesta using Dawn bistatic radar observations

    NASA Astrophysics Data System (ADS)

    Palmer, E. M.; Heggy, E.; Kofman, W. W.

    2017-12-01

    From 2011 to 2012, the Dawn spacecraft orbited asteroid Vesta, the first of its two targets in the asteroid belt, and conducted the first bistatic radar (BSR) experiment at a small body, during which Dawn's high-gain communications antenna was used to transmit radar waves that scatter from Vesta's surface toward Earth at high incidence angles just before and after occultation of the spacecraft behind the asteroid. Among the 14 observed mid-latitude forward-scatter reflections, the radar cross section ranges from 84 ± 8 km2 (near Saturnalia Fossae) to 3,588 ± 200 km2 (northwest of Caparronia crater), implying substantial spatial variation in centimeter- to decimeter-scale surface roughness. The compared distributions of surface roughness and subsurface hydrogen concentration [H]—measured using data from Dawn's BSR experiment and Gamma Ray and Neutron Spectrometer (GRaND), respectively—reveal the occurrence of heightened subsurface [H] with smoother terrains that cover tens of square kilometers. Furthermore, unlike on the Moon, we observe no correlation between surface roughness and surface ages on Vesta—whether the latter is derived from lunar or asteroid-flux chronology [Williams et al., 2014]—suggesting that cratering processes alone are insufficient to explain Vesta's surface texture at centimeter-to-decimeter scales. Dawn's BSR observations support the hypothesis of transient melting, runoff and recrystallization of potential ground-ice deposits, which are postulated to flow along fractures after an impact, and provide a mechanism for the smoothing of otherwise rough, fragmented impact ejecta. Potential ground-ice presence within Vesta's subsurface was first proposed by Scully et al. [2014], who identified geomorphological evidence for transient water flow along several of Vesta's crater walls using Dawn Framing Camera images. While airless, differentiated bodies such as Vesta and the Moon are thought to have depleted their initial volatile content during the process of differentiation, evidence to the contrary is continuing to change our understanding of the distribution and preservation of volatiles during planetary formation in the early solar system.

  16. Preliminary Iron Distribution on Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David W.

    2013-01-01

    The distribution of iron on the surface of the asteroid Vesta was investigated using Dawn's Gamma Ray and Neutron Detector (GRaND) [1,2]. Iron varies predictably with rock type for the howardite, eucrite, and diogenite (HED) meteorites, thought to be representative of Vesta. The abundance of Fe in howardites ranges from about 12 to 15 wt.%. Basaltic eucrites have the highest abundance, whereas lower crustal and upper mantle materials (cumulate eucrites and diogenites) have the lowest, and howardites are intermediate [3]. We have completed a mapping study of 7.6 MeV gamma rays produced by neutron capture by Fe as measured by the bismuth germanate (BGO) detector of GRaND [1]. The procedures to determine Fe counting rates are presented in detail here, along with a preliminary distribution map, constituting the necessary initial step toward quantification of Fe abundances. We find that the global distribution of Fe counting rates is generally consistent with independent mineralogical and compositional inferences obtained by other instruments on Dawn, such as measurements of pyroxene absorption bands by the Visual and Infrared Spectrometer (VIR) [4] and Framing Camera (FC) [5] and neutron absorption measurements by GRaND [6].

  17. Geologic Mapping of Ejecta Deposits in Oppia Quadrangle, Asteroid (4) Vesta

    NASA Technical Reports Server (NTRS)

    Garry, W. Brent; Williams, David A.; Yingst, R. Aileen; Mest, Scott C.; Buczkowski, Debra L.; Tosi, Federico; Schafer, Michael; LeCorre, Lucille; Reddy, Vishnu; Jaumann, Ralf

    2014-01-01

    Oppia Quadrangle Av-10 (288-360 deg E, +/- 22 deg) is a junction of key geologic features that preserve a rough history of Asteroid (4) Vesta and serves as a case study of using geologic mapping to define a relative geologic timescale. Clear filter images, stereo-derived topography, slope maps, and multispectral color-ratio images from the Framing Camera on NASA's Dawn spacecraft served as basemaps to create a geologic map and investigate the spatial and temporal relationships of the local stratigraphy. Geologic mapping reveals that the oldest map unit within Av-10 is the cratered highlands terrain, which possibly represents original crustal material on Vesta that was then excavated by one or more impacts to form the basin Feralia Planitia. The Saturnalia Fossae and Divalia Fossae ridge and trough terrains intersect the wall of Feralia Planitia, indicating that this impact basin is older than both the Veneneia and Rheasilvia impact structures and represents pre-Veneneian crustal material. Two of the youngest geologic features in Av-10 are the impact craters Lepida (approximately 45 km diameter) and Oppia (approximately 40 km diameter), which formed on the northern and southern walls of Feralia Planitia and each cross-cut a trough terrain. The ejecta blanket of Oppia is mapped as 'dark mantle' material because it appears dark orange in the Framing Camera 'Clementine-type' color-ratio image and has a diffuse, gradational contact distributed to the south across the rim of Rheasilvia. Mapping of surface material that appears light orange in the Framing Camera 'Clementine-type' color-ratio image as 'light mantle' material supports previous interpretations of an impact ejecta origin. Some light mantle deposits are easily traced to nearby source craters, but other deposits may represent distal ejecta (emplaced greater than 5 crater radii away) in a microgravity environment.

  18. Hanami Planum on Ceres

    NASA Image and Video Library

    2018-03-22

    This image from NASA's Dawn spacecraft, showing the northern part of Hanami Planum on Ceres, honors the Japanese cherry blossom festival, or "Hanami," a long-standing Japanese tradition of welcoming spring. Hanami Planum is the third largest geological feature on Ceres, after Vendimia Planitia and the Samhain Catenae. It extends over 345 miles (555 kilometers). This image shows familiar features, such as Occator Crater, characterized both by bright material inside the crater and dark ejecta material outside. Several parallel linear features, called Junina Catenae, can be seen departing from Occator and extending toward the top of the image. These catenae are chains of small craters formed by the impact and scouring of material ejected when large craters are formed. Scientists were able to relate these crater chains to the Urvara and Yalode impacts. Even though those craters are located in the southern hemisphere, some of their ejecta could reach the northern hemisphere, thanks to Ceres' fast rotation and small size. This image was obtained by Dawn on June 15, 2015. The spacecraft was then in its survey orbit (2,700 miles, or 4,400 kilometers, high), when the footprint of Dawn's framing camera on Ceres' surface was about 260 miles (420 kilometers) across. The resolution is 1,400 feet (410 meters) per pixel. The central coordinates of the picture are 14 degrees north latitude, 213 degrees east longitude. https://photojournal.jpl.nasa.gov/catalog/PIA21921
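    The quoted footprint and pixel scale follow from simple geometry. Below is a minimal Python sketch that reproduces them, assuming the Framing Camera's roughly 5.5-degree square field of view and 1024 x 1024 detector; those instrument values come from the general Dawn FC literature, not from this caption.

      import math

      # Pixel scale from orbit altitude, assuming a ~5.5 deg x 5.5 deg field of
      # view and a 1024 x 1024 detector (assumed instrument values).
      altitude_km = 4400.0                     # survey-orbit altitude quoted above
      fov_rad = math.radians(5.5)              # full field of view (assumed)
      n_pixels = 1024                          # pixels across the detector (assumed)

      footprint_km = 2 * altitude_km * math.tan(fov_rad / 2)    # ~423 km vs "about 420 km"
      pixel_scale_m = footprint_km * 1000 / n_pixels            # ~413 m vs "410 meters per pixel"
      print(f"footprint ~ {footprint_km:.0f} km, pixel scale ~ {pixel_scale_m:.0f} m")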

  19. Cryovolcanism on Ceres

    NASA Astrophysics Data System (ADS)

    Ruesch, O.; Platz, T.; Schenk, P.; McFadden, L. A.; Castillo-Rogez, J. C.; Quick, L. C.; Byrne, S.; Preusker, F.; O'Brien, D. P.; Schmedemann, N.; Williams, D. A.; Li, J.-Y.; Bland, M. T.; Hiesinger, H.; Kneissl, T.; Neesemann, A.; Schaefer, M.; Pasckert, J. H.; Schmidt, B. E.; Buczkowski, D. L.; Sykes, M. V.; Nathues, A.; Roatsch, T.; Hoffmann, M.; Raymond, C. A.; Russell, C. T.

    2016-09-01

    Volcanic edifices are abundant on rocky bodies of the inner solar system. In the cold outer solar system, volcanism can occur on solid bodies with a water-ice shell, but derived cryovolcanic constructs have proved elusive. We report the discovery, using Dawn Framing Camera images, of a landform on dwarf planet Ceres that we argue represents a viscous cryovolcanic dome. Parent material of the cryomagma is a mixture of secondary minerals, including salts and water ice. Absolute model ages from impact craters reveal that extrusion of the dome has occurred recently. Ceres’ evolution must have been able to sustain recent interior activity and associated surface expressions. We propose salts with low eutectic temperatures and thermal conductivities as key drivers for Ceres’ long-term internal evolution.

  20. Visible Color and Photometry of Bright Materials on Vesta

    NASA Technical Reports Server (NTRS)

    Schroder, S. E.; Li, J. Y.; Mittlefehldt, D. W.; Pieters, C. M.; De Sanctis, M. C.; Hiesinger, H.; Blewett, D. T.; Russell, C. T.; Raymond, C. A.; Keller, H. U.

    2012-01-01

    The Dawn Framing Camera (FC) collected images of the surface of Vesta at a pixel scale of 70 m in the High Altitude Mapping Orbit (HAMO) phase through its clear and seven color filters spanning from 430 nm to 980 nm. The surface of Vesta displays a large diversity in its brightness and colors, evidently related to the diverse geology [1] and mineralogy [2]. Here we report a detailed investigation of the visible colors and photometric properties of the apparently bright materials on Vesta in order to study their origin. The global distribution and the spectroscopy of bright materials are discussed in companion papers [3, 4], and the synthesis results about the origin of Vestan bright materials are reported in [5].

  1. The geomorphology of Ceres

    USGS Publications Warehouse

    Buczkowski, D.L.; Schmidt, B.E.; Williams, D.A.; Mest, S.C.; Scully, J.E.C.; Ermakov, A.; Preusker, F.; Schenk, P.; Otto, K. A.; Hiesinger, H.; O'Brien, D.; Marchi, S.; Sizemore, H.G.; Hughson, K.; Chilton, H.; Bland, M.; Byrne, S.; Schorghofer, N.; Platz, T.; Jaumann, R.; Roatsch, T.; Sykes, M. V.; Nathues, A.; De Sanctis, M.C.; Raymond, C.A.; Russell, C.T.

    2016-01-01

    Analysis of Dawn spacecraft Framing Camera image data allows evaluation of the topography and geomorphology of features on the surface of Ceres. The dwarf planet is dominated by numerous craters, but other features are also common. Linear structures include both those associated with impact craters and those that do not appear to have any correlation to an impact event. Abundant lobate flows are identified, and numerous domical features are found at a range of scales. Features suggestive of near-surface ice, cryomagmatism, and cryovolcanism have been identified. Although spectroscopic analysis has currently detected surface water ice at only one location on Ceres, the identification of these potentially ice-related features suggests that there may be at least some ice in localized regions in the crust.

  2. Cryovolcanism on Ceres

    USGS Publications Warehouse

    Ruesch, O.; Platz, T.; Schenk, P.; McFadden, L.A.; Castillo-Rogez, J. C.; Quick, L. C.; Byrne, S.; Preusker, F.; O'Brien, D. P.; Schmedemann, N.; Williams, D.A.; Li, Jian-Yang; Bland, M. T.; Hiesinger, H.; Kneissl, T.; Neesemann, A.; Schaefer, M.; Pasckert, J. H.; Schmidt, B.E.; Buczkowski, D.L.; Sykes, M. V.; Nathues, A.; Roatsch, T.; Hoffman, M.; Raymond, C.A.; Russell, C.T.

    2016-01-01

    Volcanic edifices are abundant on rocky bodies of the inner solar system. In the cold outer solar system, volcanism can occur on solid bodies with a water-ice shell, but derived cryovolcanic constructs have proved elusive. We report the discovery, using Dawn Framing Camera images, of a landform on dwarf planet Ceres that we argue represents a viscous cryovolcanic dome. Parent material of the cryomagma is a mixture of secondary minerals, including salts and water ice. Absolute model ages from impact craters reveal that extrusion of the dome has occurred recently. Ceres’ evolution must have been able to sustain recent interior activity and associated surface expressions. We propose salts with low eutectic temperatures and thermal conductivities as key drivers for Ceres’ long-term internal evolution.

  3. Urvara and Yalode: Giant Craters on Ceres

    NASA Image and Video Library

    2018-02-22

    This image from NASA's Dawn spacecraft shows the large craters Urvara (top) and Yalode (bottom) on dwarf planet Ceres. These features are so big that they must be observed from high altitudes in order to fit in the frame of a single image. Urvara is 101 miles (163 kilometers) in diameter, while Yalode is 162 miles (260 kilometers) in diameter. The two giant craters were formed at different times. Yalode is almost 1 billion years older than Urvara, which is about 120 million to 140 million years old. Yalode's relatively smooth floor indicates Ceres' crust material became close to -- or even reached -- the melting temperature of ice as a consequence of the heat generated by the impact. On the other hand, the smaller Urvara has rougher terrain. This suggests Urvara had either a lower temperature increase from the impact, or a colder crust temperature at the time of the crater's formation, or a combination of the two. Indeed, Ceres' interior was warmer in the past, and has been slowly cooling as its supply of radioactive isotopes, whose decay represents Ceres' main heat source, has been decreasing over time. This picture also reveals geological details such as the feature Nar Sulcus inside Yalode and a central peak in Urvara. Urvara is named after the Indian and Iranian deity of plants and fields. Yalode is named for the Dahomey goddess worshipped by women at the harvest rites. This image was obtained by NASA's Dawn spacecraft on June 9, 2015. The spacecraft was then in its survey orbit (2,700 miles, or 4,400 kilometers, above the surface), when the footprint of Dawn's framing camera on Ceres' surface was about 260 miles (420 kilometers) across. The resolution is 1,400 feet (410 meters) per pixel. The central coordinates of the picture are 43 degrees south latitude, 278 degrees east longitude. https://photojournal.jpl.nasa.gov/catalog/PIA21917

  4. Composition and structure of the shallow subsurface of Ceres revealed by crater morphology

    NASA Astrophysics Data System (ADS)

    Bland, Michael T.; Raymond, Carol A.; Schenk, Paul M.; Fu, Roger R.; Kneissl, Thomas; Pasckert, Jan Hendrik; Hiesinger, Harry; Preusker, Frank; Park, Ryan S.; Marchi, Simone; King, Scott D.; Castillo-Rogez, Julie C.; Russell, Christopher T.

    2016-07-01

    Before NASA’s Dawn mission, the dwarf planet Ceres was widely believed to contain a substantial ice-rich layer below its rocky surface. The existence of such a layer has significant implications for Ceres’s formation, evolution, and astrobiological potential. Ceres is warmer than icy worlds in the outer Solar System and, if its shallow subsurface is ice-rich, large impact craters are expected to be erased by viscous flow on short geologic timescales. Here we use digital terrain models derived from Dawn Framing Camera images to show that most of Ceres’s largest craters are several kilometres deep, and are therefore inconsistent with the existence of an ice-rich subsurface. We further show from numerical simulations that the absence of viscous relaxation over billion-year timescales implies a subsurface viscosity that is at least one thousand times greater than that of pure water ice. We conclude that Ceres’s shallow subsurface is no more than 30% to 40% ice by volume, with a mixture of rock, salts and/or clathrates accounting for the other 60% to 70%. However, several anomalously shallow craters are consistent with limited viscous relaxation and may indicate spatial variations in subsurface ice content.
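    The link between crater survival and viscosity can be illustrated with the classical result that, on a uniform Newtonian viscous half-space, topography of wavelength lambda decays on a timescale of roughly 4*pi*eta/(rho*g*lambda). The Python sketch below applies that scaling with illustrative numbers (Ceres-like gravity and crustal density, and two assumed viscosity magnitudes); it is a rough aid to the argument above, not the paper's finite-element model.

      import math

      def relaxation_time_yr(eta_pa_s, wavelength_m, rho=1300.0, g=0.28):
          """Slowest-decaying-mode relaxation time for a viscous half-space,
          tau ~ 4*pi*eta/(rho*g*lambda); rho and g are Ceres-like assumptions."""
          tau_s = 4 * math.pi * eta_pa_s / (rho * g * wavelength_m)
          return tau_s / 3.15e7

      wavelength = 200e3                        # m, comparable to a large Cerean crater
      for eta in (1e20, 1e23):                  # assumed magnitudes: ice-like vs ~1000x stiffer
          print(f"eta = {eta:.0e} Pa s -> tau ~ {relaxation_time_yr(eta, wavelength):.1e} yr")
      # An ice-like viscosity relaxes such topography in well under a million years,
      # while a ~1000x stiffer mixture preserves it for hundreds of millions of years,
      # consistent with the deep craters reported above.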

  5. Geomorphological evidence for ground ice on dwarf planet Ceres

    USGS Publications Warehouse

    Schmidt, Britney E.; Hughson, Kynan H.G.; Chilton, Heather T.; Scully, Jennifer E. C.; Platz, Thomas; Nathues, Andreas; Sizemore, Hanna; Bland, Michael T.; Byrne, Shane; Marchi, Simone; O'Brien, David; Schorghofer, Norbert; Hiesinger, Harald; Jaumann, Ralf; Pasckert, Jan Hendrik; Lawrence, Justin D.; Buczkowski, Debra; Castillo-Rogez, Julie C.; Sykes, Mark V.; Schenk, Paul M.; De Sanctis, Maria-Cristina; Mitri, Giuseppe; Formisano, Michelangelo; Li, Jian-Yang; Reddy, Vishnu; Le Corre, Lucille; Russell, Christopher T.; Raymond, Carol A.

    2017-01-01

    Five decades of observations of Ceres suggest that the dwarf planet has a composition similar to carbonaceous meteorites and may have an ice-rich outer shell protected by a silicate layer. NASA’s Dawn spacecraft has detected ubiquitous clays, carbonates and other products of aqueous alteration across the surface of Ceres, but surprisingly it has directly observed water ice in only a few areas. Here we use Dawn Framing Camera observations to analyse lobate morphologies on Ceres’ surface and we infer the presence of ice in the upper few kilometres of Ceres. We identify three distinct lobate morphologies that we interpret as surface flows: thick tongue-shaped, furrowed flows on steep slopes; thin, spatulate flows on shallow slopes; and cuspate sheeted flows that appear fluidized. The shapes and aspect ratios of these flows are different from those of dry landslides—including those on ice-poor Vesta—but are morphologically similar to ice-rich flows on other bodies, indicating the involvement of ice. Based on the geomorphology and poleward increase in prevalence of these flows, we suggest that the shallow subsurface of Ceres is composed of mixtures of silicates and ice, and that ice is most abundant near the poles.

  6. Composition and structure of the shallow subsurface of Ceres revealed by crater morphology

    USGS Publications Warehouse

    Bland, Michael T.; Raymond, Carol A.; Schenk, Paul M.; Fu, Roger R.; Kneissl, Thomas; Pasckert, Jan Hendrik; Hiesinger, Harald; Preusker, Frank; Park, Ryan S.; Marchi, Simone; King, Scott D.; Castillo-Rogez, Julie C.; Russell, Christopher T.

    2016-01-01

    Before NASA’s Dawn mission, the dwarf planet Ceres was widely believed to contain a substantial ice-rich layer below its rocky surface. The existence of such a layer has significant implications for Ceres’s formation, evolution, and astrobiological potential. Ceres is warmer than icy worlds in the outer Solar System and, if its shallow subsurface is ice-rich, large impact craters are expected to be erased by viscous flow on short geologic timescales. Here we use digital terrain models derived from Dawn Framing Camera images to show that most of Ceres’s largest craters are several kilometres deep, and are therefore inconsistent with the existence of an ice-rich subsurface. We further show from numerical simulations that the absence of viscous relaxation over billion-year timescales implies a subsurface viscosity that is at least one thousand times greater than that of pure water ice. We conclude that Ceres’s shallow subsurface is no more than 30% to 40% ice by volume, with a mixture of rock, salts and/or clathrates accounting for the other 60% to 70%. However, several anomalously shallow craters are consistent with limited viscous relaxation and may indicate spatial variations in subsurface ice content.

  7. Occator in False Color

    NASA Image and Video Library

    2015-12-09

    This representation of Ceres' Occator Crater in false colors shows differences in the surface composition. Red corresponds to a wavelength range around 0.97 micrometers (near infrared), green to a wavelength range around 0.75 micrometers (red, visible light) and blue to a wavelength range of around 0.44 micrometers (blue, visible light). Occator measures about 60 miles (90 kilometers) wide. Scientists use false color to examine differences in surface materials. The color blue on Ceres is generally associated with bright material, found in more than 130 locations, and seems to be consistent with salts, such as sulfates. It is likely that silicate materials are also present. The images were obtained by the framing camera on NASA's Dawn spacecraft from a distance of about 2,700 miles (4,400 kilometers). http://photojournal.jpl.nasa.gov/catalog/PIA20180
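    The red/green/blue band assignment described above amounts to stacking three calibrated filter images into one composite. The snippet below is a minimal, generic Python/NumPy sketch of that step; the per-band stretch is a simple normalization chosen for illustration, not the Dawn team's published processing, and the input arrays are hypothetical calibrated FC frames.

      import numpy as np

      def false_color_composite(band_970nm, band_750nm, band_440nm):
          """Stack three co-registered filter images (2-D arrays of equal shape)
          into an RGB false-color composite, mapping ~0.97 um to red,
          ~0.75 um to green and ~0.44 um to blue, as in the caption."""
          def stretch(band):
              band = band.astype(float)
              return (band - band.min()) / (band.max() - band.min() + 1e-12)
          return np.dstack([stretch(band_970nm), stretch(band_750nm), stretch(band_440nm)])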

  8. The geomorphology of Ceres.

    PubMed

    Buczkowski, D L; Schmidt, B E; Williams, D A; Mest, S C; Scully, J E C; Ermakov, A I; Preusker, F; Schenk, P; Otto, K A; Hiesinger, H; O'Brien, D; Marchi, S; Sizemore, H; Hughson, K; Chilton, H; Bland, M; Byrne, S; Schorghofer, N; Platz, T; Jaumann, R; Roatsch, T; Sykes, M V; Nathues, A; De Sanctis, M C; Raymond, C A; Russell, C T

    2016-09-02

    Analysis of Dawn spacecraft Framing Camera image data allows evaluation of the topography and geomorphology of features on the surface of Ceres. The dwarf planet is dominated by numerous craters, but other features are also common. Linear structures include both those associated with impact craters and those that do not appear to have any correlation to an impact event. Abundant lobate flows are identified, and numerous domical features are found at a range of scales. Features suggestive of near-surface ice, cryomagmatism, and cryovolcanism have been identified. Although spectroscopic analysis has currently detected surface water ice at only one location on Ceres, the identification of these potentially ice-related features suggests that there may be at least some ice in localized regions in the crust. Copyright © 2016, American Association for the Advancement of Science.

  9. Imaging Asteroid 4 Vesta Using the Framing Camera

    NASA Technical Reports Server (NTRS)

    Keller, H. Uwe; Nathues, Andreas; Coradini, Angioletta; Jaumann, Ralf; Jorda, Laurent; Li, Jian-Yang; Mittlefehldt, David W.; Mottola, Stefano; Raymond, C. A.; Schroeder, Stefan E.

    2011-01-01

    The Framing Camera (FC) onboard the Dawn spacecraft serves a dual purpose. In addition to its central role as a prime science instrument, it is also used for the complex navigation of the ion-drive spacecraft. The CCD detector, with 1024 by 1024 pixels, provides the stability needed for a multiyear mission and meets the high requirements of photometric accuracy over the wavelength band from 400 to 1000 nm covered by 7 band-pass filters. Vesta will be observed from 3 orbit stages with image scales of 227, 63, and 17 m/px, respectively. The mapping of Vesta's surface with medium resolution will only be completed during the exit phase, when the north pole will be illuminated. A detailed pointing strategy will cover the surface at least twice at similar phase angles to provide stereo views for reconstruction of the topography. During approach, the phase function of Vesta was determined over a range of angles not accessible from Earth. This is the first step in deriving the photometric function of the surface. Combining the topography based on stereo tie points with the photometry in an iterative procedure will disclose details of the surface morphology at considerably smaller scales than the pixel scale. The 7 color filters are well positioned to provide information on the spectral slope in the visible, the depth of the strong pyroxene absorption band, and their variability over the surface. Cross calibration with the VIR spectrometer, which extends into the near IR, will provide detailed maps of Vesta's surface mineralogy and physical properties. Georeferencing all these observations will result in a coherent and unique data set. During Dawn's approach and capture, the FC has already demonstrated its performance. The strong variations observed by the Hubble Space Telescope can now be correlated with surface units and features. We will report on results obtained from images taken during survey mode covering the whole illuminated surface. Vesta is a planet-like differentiated body, but its surface gravity and escape velocity are comparable to those of other asteroids and hence much smaller than those of the inner planets or

  10. Thermal stability of water ice in Ceres' crater Oxo

    NASA Astrophysics Data System (ADS)

    Formisano, Michelangelo; Federico, Costanzo; De Sanctis, Maria Cristina; Frigeri, Alessandro; Magni, Gianfranco; Tosi, Federico

    2016-10-01

    Dwarf planet Ceres, target of the NASA Dawn mission, exhibits evidence of ammoniated phyllosilicates on its surface [1], compatible with a likely outer Solar System origin. Considerable amounts of water ice have recently been detected by the Visible InfraRed mapping spectrometer (VIR) onboard Dawn in some small, fresh craters, such as Oxo, located at about 40° N. The exposure mechanism of the water ice is unknown: cryovolcanism, cometary-type sublimation/recondensation [2], or impacts with other bodies are likely mechanisms. The evaluation of the time stability of the water ice is crucial to understanding the plausible mechanism for its existence. For this purpose, we developed a 3D finite-element model (FEM) using the topography given by the shape model of Ceres derived from images acquired by the Framing Camera in the Survey mission phase. The illumination conditions are provided by the SPICE toolkit. We performed several simulations analyzing the effect of thermal inertia and albedo on the temperature and rate of ice sublimation. The results of the simulations on the stability of water ice will be presented. [1] De Sanctis et al., Nature, doi:10.1038/nature16172; [2] Formisano et al., MNRAS, doi:10.1093/mnras/stv2344
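    The "rate of ice sublimation" mentioned above is, at its simplest, a free-sublimation flux into vacuum that depends very steeply on surface temperature. The Python sketch below evaluates that flux with the Hertz-Knudsen relation and a commonly used vapor-pressure fit for water ice; the fit constants and the example temperatures are assumptions for illustration and are not outputs of the finite-element model described in this record.

      import math

      K_B = 1.380649e-23            # J/K
      M_H2O = 18.015e-3 / 6.022e23  # kg per molecule
      RHO_ICE = 920.0               # kg/m^3

      def sublimation_rate_m_per_yr(T):
          """Free sublimation of exposed water ice into vacuum (Hertz-Knudsen),
          using an assumed vapor-pressure fit P ~ 3.56e12*exp(-6141.7/T) Pa."""
          p_vap = 3.56e12 * math.exp(-6141.7 / T)                    # Pa
          flux = p_vap * math.sqrt(M_H2O / (2 * math.pi * K_B * T))  # kg m^-2 s^-1
          return flux / RHO_ICE * 3.15e7                             # m of ice lost per year

      for T in (130.0, 160.0):      # illustrative dayside surface temperatures
          print(f"T = {T:.0f} K -> ~{sublimation_rate_m_per_yr(T):.1e} m/yr")
      # The roughly four orders of magnitude between 130 K and 160 K is why thermal
      # inertia and albedo, which set the temperature, control the survival of exposed ice.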

  11. High resolution Ceres HAMO atlas derived from Dawn FC images

    NASA Astrophysics Data System (ADS)

    Roatsch, Thomas; Kersten, Elke; Matz, Klaus-Dieter; Preusker, Frank; Scholten, Frank; Jaumann, Ralf; Raymond, Carol A.; Russell, Chris T.

    2016-04-01

    Introduction: NASA's Dawn spacecraft entered orbit around dwarf planet Ceres in March 2015 and is characterizing the geology, elemental and mineralogical composition, topography, shape, and internal structure of Ceres. One of the major goals of the mission is a global mapping of Ceres. Data: The Dawn mission mapped Ceres in HAMO (High Altitude Mapping Orbit, 1475 km altitude) between August and October 2015. The framing camera took about 2,600 clear filter images with a resolution of about 140 m/pixel during these cycles. The images were taken with different viewing angles and different illumination conditions. We selected images from one cycle (cycle #1) for the mosaicking process to have similar viewing and illumination conditions. Very minor gaps in the coverage were filled with a few images from cycle #2. Data Processing: The first step of the processing chain towards the cartographic products is to ortho-rectify the images to the proper scale and map projection type. This process requires detailed information on the Dawn orbit and attitude data and on the topography of the targets. Both an improved orientation and a high-resolution shape model are provided by stereo processing (bundle block adjustment) of the HAMO stereo image dataset [3]. Ceres's HAMO shape model was used for the calculation of the ray intersection points, while the map projection itself was done onto the reference sphere of Ceres with a radius of 470 km. The final step is the controlled mosaicking of all images into a global mosaic of Ceres, the so-called basemap. Ceres map tiles: The Ceres atlas was produced at a scale of 1:750,000 and consists of 15 tiles that conform to the quadrangle scheme proposed by Greeley and Batson [4]. A map scale of 1:750,000 guarantees a mapping at the highest available Dawn resolution in HAMO. The individual tiles were extracted from the global mosaic and reprojected. Nomenclature: The Dawn team proposed 81 names for geological features. By international agreement, craters must be named after gods and goddesses of agriculture and vegetation from world mythology, whereas other geological features must be named after agricultural festivals of the world. The nomenclature proposed by the Dawn team was approved by the IAU [http://planetarynames.wr.usgs.gov/] and is shown in Fig. 1. The entire Ceres HAMO atlas will be available to the public through the Dawn GIS web page [http://dawngis.dlr.de/atlas]. References: [1] Russell, C.T. and Raymond, C.A., Space Sci. Rev., 163, DOI 10.1007/s11214-011-9836-2; [2] Sierks, H., et al., 2011, Space Sci. Rev., 163, DOI 10.1007/s11214-011-9745-4; [3] Preusker, F. et al., this session; [4] Greeley, R. and Batson, G., 1990, Planetary Mapping, Cambridge University Press.
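    As a quick sanity check on the statement that 1:750,000 preserves the full HAMO resolution, one map millimetre at that scale corresponds to 750 m on Ceres, i.e. about five 140 m/pixel HAMO pixels. The short Python sketch below does that arithmetic and also shows a bare-bones forward projection onto the 470 km reference sphere; the equirectangular projection here is purely illustrative and is not necessarily the projection used for the published quadrangles.

      import math

      R_CERES_M = 470e3        # reference sphere radius stated above
      SCALE = 750_000          # map scale 1:750,000
      HAMO_RES_M = 140.0       # HAMO pixel scale stated above

      ground_per_map_mm = SCALE * 1e-3                      # 750 m of terrain per map millimetre
      pixels_per_map_mm = ground_per_map_mm / HAMO_RES_M    # ~5.4 HAMO pixels per millimetre

      def equirect_xy(lat_deg, lon_deg, lat0_deg=0.0):
          """Toy forward equirectangular projection onto the reference sphere (metres)."""
          x = R_CERES_M * math.radians(lon_deg) * math.cos(math.radians(lat0_deg))
          y = R_CERES_M * math.radians(lat_deg)
          return x, y

      print(f"{ground_per_map_mm:.0f} m per map mm, {pixels_per_map_mm:.1f} HAMO px per map mm")
      print(equirect_xy(22.0, 144.0))   # an arbitrary example point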

  12. Solid-state framing camera with multiple time frames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, K. L.; Stewart, R. E.; Steele, P. T.

    2013-10-07

    A high speed solid-state framing camera has been developed which can operate over a wide range of photon energies. This camera measures the two-dimensional spatial profile of the flux incident on a cadmium selenide semiconductor at multiple times. This multi-frame camera has been tested at 3.1 eV and 4.5 keV. The framing camera currently records two frames with a temporal separation between the frames of 5 ps but this separation can be varied between hundreds of femtoseconds up to nanoseconds and the number of frames can be increased by angularly multiplexing the probe beam onto the cadmium selenide semiconductor.

  13. High-frame-rate infrared and visible cameras for test range instrumentation

    NASA Astrophysics Data System (ADS)

    Ambrose, Joseph G.; King, B.; Tower, John R.; Hughes, Gary W.; Levine, Peter A.; Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; O'Mara, K.; Sjursen, W.; McCaffrey, Nathaniel J.; Pantuso, Francis P.

    1995-09-01

    Field deployable, high frame rate camera systems have been developed to support the test and evaluation activities at the White Sands Missile Range. The infrared cameras employ a 640 by 480 format PtSi focal plane array (FPA). The visible cameras employ a 1024 by 1024 format backside illuminated CCD. The monolithic, MOS architecture of the PtSi FPA supports commandable frame rate, frame size, and integration time. The infrared cameras provide 3 - 5 micron thermal imaging in selectable modes from 30 Hz frame rate, 640 by 480 frame size, 33 ms integration time to 300 Hz frame rate, 133 by 142 frame size, 1 ms integration time. The infrared cameras employ a 500 mm, f/1.7 lens. Video outputs are 12-bit digital video and RS170 analog video with histogram-based contrast enhancement. The 1024 by 1024 format CCD has a 32-port, split-frame transfer architecture. The visible cameras exploit this architecture to provide selectable modes from 30 Hz frame rate, 1024 by 1024 frame size, 32 ms integration time to 300 Hz frame rate, 1024 by 1024 frame size (with 2:1 vertical binning), 0.5 ms integration time. The visible cameras employ a 500 mm, f/4 lens, with integration time controlled by an electro-optical shutter. Video outputs are RS170 analog video (512 by 480 pixels), and 12-bit digital video.
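    The mode trade-off described above (larger frames at lower rates, smaller or binned frames at higher rates) can be seen in the raw readout each mode implies. The short sketch below simply multiplies the quoted frame sizes, frame rates, and 12-bit sample depth; the resulting Mbit/s figures are illustrative products, not published interface specifications.

      # Readout implied by the camera modes quoted above (12-bit samples as stated).
      modes = [
          ("IR  640 x 480  @  30 Hz",            640, 480, 30),
          ("IR  133 x 142  @ 300 Hz",            133, 142, 300),
          ("VIS 1024 x 1024 @  30 Hz",          1024, 1024, 30),
          ("VIS 1024 x 512 (binned) @ 300 Hz",  1024, 512, 300),
      ]
      for name, width, height, fps in modes:
          pixels_per_s = width * height * fps
          print(f"{name}: {pixels_per_s / 1e6:5.1f} Mpixel/s, {pixels_per_s * 12 / 1e6:6.0f} Mbit/s")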

  14. Constraints on Ceres' Internal Structure and Evolution From Its Shape and Gravity Measured by the Dawn Spacecraft

    NASA Astrophysics Data System (ADS)

    Ermakov, A. I.; Fu, R. R.; Castillo-Rogez, J. C.; Raymond, C. A.; Park, R. S.; Preusker, F.; Russell, C. T.; Smith, D. E.; Zuber, M. T.

    2017-11-01

    Ceres is the largest body in the asteroid belt, with a radius of approximately 470 km. In part due to its large mass, Ceres more closely approaches hydrostatic equilibrium than major asteroids. Pre-Dawn shape observations of Ceres revealed a shape consistent with a hydrostatic ellipsoid of revolution. The Dawn spacecraft Framing Camera has been imaging Ceres since March 2015, which has led to high-resolution shape models of the dwarf planet, while the gravity field has been globally determined to spherical harmonic degree 14 (equivalent to a spatial wavelength of 211 km) and locally to degree 18 (a wavelength of 164 km). We use these shape and gravity models to constrain Ceres' internal structure. We find a negative correlation and admittance between topography and gravity at degree 2 and order 2. Low admittances between spherical harmonic degrees 3 and 16 are well explained by an Airy isostatic compensation mechanism. Different models of isostasy give crustal densities between 1,200 and 1,400 kg/m3, with our preferred model giving a crustal density of 1,287 +70/-87 kg/m3. The mantle density is constrained to be 2,434 +5/-8 kg/m3. We compute the isostatic gravity anomaly and find evidence for mascon-like structures in the two biggest basins. The topographic power spectrum of Ceres and its latitude dependence suggest that viscous relaxation occurred at long wavelengths (>246 km). Our density constraints combined with finite-element modeling of viscous relaxation suggest that the rheology and density of the shallow subsurface are most consistent with a rock, ice, salt and clathrate mixture.
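    The degree-to-wavelength conversions quoted above follow the usual rule of thumb that spherical harmonic degree l on a body of radius R corresponds to a wavelength of about 2*pi*R/l. The short Python check below reproduces the 211 km and 164 km figures from the 470 km radius stated in the abstract.

      import math

      R_KM = 470.0                      # Ceres radius stated above
      for degree in (14, 18):
          print(f"degree {degree}: ~{2 * math.pi * R_KM / degree:.0f} km")
      # -> ~211 km and ~164 km, matching the wavelengths quoted in the abstract.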

  15. Preliminary Geological Map of the Ac-H-8 Nawish Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Frigeri, A.; De Sanctis, M. C.; Carrozzo, F. G.; Ammannito, E.; Williams, D. A.; Mest, S. C.; Buczkowski, D.; Preusker, F.; Jaumann, R.; Roatsch, T.; Scully, J. E. C.; Raymond, C. A.; Russell, C. T.

    2015-12-01

    Herein we present the geologic mapping of the Ac-H-8 Nawish Quadrangle of dwarf planet Ceres, produced on the basis of Dawn spacecraft data. The Ac-H-08 Nawish quadrangle is located between 22°S and 22°N and between 144°E and 216°E. At the north-east border, a polygonal, 75 km-wide crater named Nawish gives its name to the whole quadrangle. An unnamed, partially degraded, 100 km-diameter crater is evident in the lower central sector of the quadrangle. Bright materials have been mapped and are associated with craters. For example, bright materials occur in the central peak region of Nawish crater and in the ejecta of an unnamed crater, which is located in the nearby quadrangle Ac-H-09. The topography of the area, obtained from stereo-processing of imagery, shows a highland in the middle of the quadrangle. Topography is lower toward the northern and southern borders, with an altitude span of about 9500 meters. At the time of this writing, geologic mapping has been performed on Framing Camera (FC) mosaics from the Approach (1.3 km/px) and Survey (415 m/px) orbits, including grayscale and color images and digital terrain models derived from stereo images. In Fall 2015, images from the High Altitude Mapping Orbit (140 m/px) will be used to refine the mapping, followed by Low Altitude Mapping Orbit (35 m/px) images in January 2016. Support of the Dawn Instrument, Operations, and Science Teams is acknowledged. This work is supported by grants from NASA, and from the German and Italian Space Agencies.

  16. KSC-07pd2429

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is moved toward the opening above the Delta II rocket in the mobile service tower. Dawn will be mated with the Delta in preparation for launch. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  17. KSC-07pd2430

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is lowered toward the awaiting Delta II rocket in the mobile service tower. Dawn will be mated with the Delta in preparation for launch. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  18. KSC-07pd2427

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is lifted alongside the mobile service tower. At the top, Dawn will be prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  19. KSC-07pd2431

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, workers in the mobile service tower keep watch as the Dawn spacecraft is lowered toward the awaiting Delta II rocket. Dawn will be mated with the Delta in preparation for launch. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  20. KSC-07pd2438

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- In the mobile service tower on Launch Pad 17-B at Cape Canaveral Air Force Station, workers remove the transportation canister from around the Dawn spacecraft. After removal of the canister, Dawn will be mated with the waiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. EDT Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jim Grossmann

  1. KSC-07pd2426

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is lifted alongside the mobile service tower. At the top, Dawn will be prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  2. KSC-07pd2424

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- The Dawn spacecraft arrives on Launch Pad 17-B at Cape Canaveral Air Force Station. At the pad, Dawn will be lifted into the mobile service tower and prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  3. KSC-07pd2442

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- In the mobile service tower on Launch Pad 17-B at Cape Canaveral Air Force Station, the upper transportation canister is lifted away from the Dawn spacecraft. After removal of the canister, Dawn will be mated with the waiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. EDT Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jim Grossmann

  4. KSC-07pd2425

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is lifted off its transporter. Dawn will be lifted into the mobile service tower and prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  5. Mapping Vesta Mid-Latitude Quadrangle V-12EW: Mapping the Edge of the South Polar Structure

    NASA Astrophysics Data System (ADS)

    Hoogenboom, T.; Schenk, P.; Williams, D. A.; Hiesinger, H.; Garry, W. B.; Yingst, R.; Buczkowski, D.; McCord, T. B.; Jaumann, R.; Pieters, C. M.; Gaskell, R. W.; Neukum, G.; Schmedemann, N.; Marchi, S.; Nathues, A.; Le Corre, L.; Roatsch, T.; Preusker, F.; White, O. L.; DeSanctis, C.; Filacchione, G.; Raymond, C. A.; Russell, C. T.

    2011-12-01

    NASA's Dawn spacecraft arrived at the asteroid 4 Vesta on July 15, 2011, and is now collecting imaging, spectroscopic, and elemental abundance data during its one-year orbital mission. As part of the geological analysis of the surface, a series of 15 quadrangle maps are being produced based on Framing Camera images (FC: spatial resolution ~65 m/pixel) along with Visible & Infrared Spectrometer data (VIR: spatial resolution ~180 m/pixel) obtained during the High-Altitude Mapping Orbit (HAMO). This poster presentation concentrates on our geologic analysis and mapping of quadrangle V-12EW. This quadrangle is dominated by the arcuate edge of the large 460+ km diameter south polar topographic feature first observed by HST (Thomas et al., 1997). Sparsely cratered, the portion of this feature covered in V-12EW is characterized by arcuate ridges and troughs forming a generalized arcuate pattern. Mapping of this terrain and the transition to areas to the north will be used to test whether this feature has an impact or other (e.g., internal) origin. We are also using FC stereo and VIR images to assess whether there are any compositional differences between this terrain and areas further to the north, and image data to evaluate the distribution and age of young impact craters within the map area. The authors acknowledge the support of the Dawn Science, Instrument and Operations Teams.

  6. Gravity field and shape of Ceres from Dawn

    NASA Astrophysics Data System (ADS)

    Park, Ryan; Konopliv, Alexander; Vaughan, Andrew; Bills, Bruce; Castillo-Rogez, Julie; Ermakov, Anton; Fu, Roger; Raymond, Carol; Russell, Chris; Zuber, Maria

    2017-04-01

    The Dawn gravity science investigation utilizes DSN radio tracking of the spacecraft and on-board framing camera images to determine the gravity field and global shape of Ceres. The gravity science data collected during the Approach, Survey, High-Altitude Mapping Orbit, and Low-Altitude Mapping Orbit phases were processed. The final gravity science solution yielded a degree and order 18 gravity field, called CERES18C, which is globally accurate to degree and order 14. Also, the final Ceres shape derived using the stereo-photoclinometry method is available, with a height uncertainty better than 30 meters. The degree-2 gravity harmonics show that the rotation of Ceres is very nearly about a principal axis. Combining the gravity field and topography gives a bulk density of 2162.6±2.0 kg/m3. The estimated spin pole vector yields RA=(291.42744±0.00022)° and Dec=(66.76065±0.00022)°, with a prime meridian and rotation rate of (170.374±0.012)° and (952.1532638±0.0000019)°/day, respectively. The low Bouguer gravity over high topographic areas, and vice versa, indicates that the topography of Ceres is compensated, which can be explained by a low-viscosity layer at depth. Further studies of Ceres' interior show that the low gravity-topography admittances are consistent with Airy isostasy, and finite-element modeling requires a decrease of viscosity with depth.
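    As a rough consistency check on the quoted bulk density, dividing Ceres' mass by a spherical volume of the quoted size gives a value close to 2162.6 kg/m3. The sketch below does that arithmetic; the GM value is an approximate Dawn-era figure taken from the general literature (not from this record), and a 470 km sphere is only a crude stand-in for the actual triaxial shape.

      import math

      G = 6.674e-11      # m^3 kg^-1 s^-2
      GM = 62.63e9       # m^3 s^-2, approximate GM of Ceres (assumed, not from this record)
      R = 470e3          # m, mean-radius stand-in for the real shape

      mass = GM / G                               # ~9.4e20 kg
      volume = 4.0 / 3.0 * math.pi * R**3
      print(f"rho ~ {mass / volume:.0f} kg/m^3")  # ~2160, close to the 2162.6 +/- 2.0 quoted above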

  7. Constraints on Ceres' internal structure from the Dawn gravity and shape data

    NASA Astrophysics Data System (ADS)

    Ermakov, A.; Zuber, M. T.; Smith, D. E.; Fu, R. R.; Raymond, C. A.; Russell, C. T.; Park, R. S.

    2015-12-01

    Ceres is the largest body in the asteroid belt, with a radius of approximately 470 km. It is large enough to attain a shape much closer to hydrostatic equilibrium than major asteroids. Pre-Dawn shape models of Ceres (e.g. Thomas et al., 2005; Carry et al., 2008) revealed that its shape is consistent with a hydrostatic ellipsoid. After the arrival of the Dawn spacecraft in Ceres orbit in March 2015, Framing Camera images were used to construct shape models of Ceres. Meanwhile, radio-tracking data are being used to develop gravity models. We use the Dawn-derived shape and gravity models to constrain Ceres' internal structure. These data for the first time allow estimation of the degree to which Ceres is hydrostatic. Observed non-hydrostatic effects include a 2.1 km triaxiality (difference between the two equatorial axes) as well as a 660 m center-of-mass/center-of-figure offset. The Dawn gravity data from the Survey orbit show that Ceres has a central density concentration. Second-degree sectorial gravity coefficients are negatively correlated with topography, indicating a peculiar interior structure. We compute the relative crustal thickness based on the observed Bouguer anomaly. Hydrostatic models show that Ceres appears more differentiated based on its gravity than on its shape. We expand the Ceres shape in spherical harmonics, observing that the power spectrum of topography deviates from the power law at low degrees (Fig. 1). We interpret the decrease of power at low degrees to be due to viscous relaxation. We suggest that relaxation happens on Ceres but, unlike the modeling in Bland (2013), it is important only at the lowest degrees, which correspond to scales of several hundreds of km. There are only a few features on Ceres of that size, and at least one of them (an impact basin provisionally named Kerwan) appears relaxed. The simplest explanation is that Ceres's outer shell is not pure ice or pure rock but an ice-rock mixture that allows some relaxation at the longest wavelengths. We use the deal.ii finite-element library (Bangerth 2007) to compute relaxed topography spectra. In our future work, we plan to model viscous relaxation to constrain the viscosity profile and thermal evolution.

  8. Lessons from Vesta and Ceres

    NASA Astrophysics Data System (ADS)

    Russell, C. T.; Raymond, C. A.; McSween, H. Y.; Jaumann, R.; DeSanctis, M. C.; Nathues, A.; Prettyman, T.; Capria, M. T.; Pieters, C.; McFadden, L.; Ammannito, E.; Sykes, M. V.; McCord, T. B.; Zuber, M.; Smith, D.; Hoffman, M.; Scully, J. E. C.; Buczkowski, D.

    2014-04-01

    When first discovered, the bodies in the asteroid belt were considered the missing planet(s) between Mars and Jupiter. When their small size and large number became apparent, they were deemed to be minor planets and then asteroids. They soon were considered to be simply airless bodies, consisting mostly of rocky material, some having iron cores. When Dawn reached Vesta, this picture was initially largely reinforced by the extensive southern basin and the battered northern hemisphere. A more accurate picture arises, using the color filters of the Framing Camera in coordination with the near-infrared spectrometer, revealing a diverse surface with different minerals and processes affecting regions on the surface in various ways. The variegated light and dark material and varying thermal properties indicate a complex surface. The water (OH) content of the surface is far from uniform. Examinations of the floors of Marcia and Cornelia revealed pits, and their crater walls have possibly water-carved gullies. The parent craters appear to have been formed in a wet surface, possibly ice melted in the crater-forming event. Figure 1 shows the latest mosaic of the vestan surface with the currently approved names for the surface features. It had been expected that olivine would be excavated in the southern basin, but it was not to be found there. Surprisingly, patches of olivine-rich material were discovered in the north. Doubts arose as to whether a magma ocean hypothesis applies to Vesta, in spite of quantifying the mass of its core, and new ways to explain Vesta's petrogenesis were developed. Closer examination of the surface suggested more interesting scenarios: possible excavation of early volcanic materials, odd craters that seemed impossible to form with simple impacts, and a long ribbon of material stretching diagonally across the surface, possibly originating in the Marcia ejecta blanket. The relative youth of some of these features (ca. 50 Ma) suggests Vesta has had planetary processes acting over much of its history and is very much a small terrestrial planet worthy of participating in the comparative planetology that aids our ability to understand these diverse family members. Ceres has yet to be visited by our spacecraft, but it too tells a story of active planetary processes. Ceres does not have meteorites or a family of small ceroids accompanying it in space, so we know little about its origins with any certainty. However, because it is large and has a low density, we believe it accreted late, after the short-lived radionuclides had time to decay. It also seems to have continued to devolatilize until the present. There were early reports, from observations at 1 AU, of activity that have continued through the recent Herschel plume report. Dawn followed a simple mapping scenario at Vesta, with initial low-resolution measurements in a Survey orbit followed by a High-Altitude Mapping Orbit, which gave complete stereo imagery and extensive moderate-resolution VIR IR and Framing Camera color data. A later Low-Altitude Mapping Orbit provided GRaND elemental composition data, gravity, and localized high-resolution imagery and spectra. A second HAMO orbit completed the needed stereo data and other data over the northern quadrangles. The same mapping philosophy is planned for Ceres. There will be Survey, HAMO, and LAMO orbits, but once in Ceres orbit, Dawn is not expected to leave. Dawn has sufficient resources to achieve its science objectives but does not carry a large reserve for extended exploration.

  9. KSC-07pd2423

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- The Dawn spacecraft is moved out of the Astrotech Space Operations facility, on its way to Launch Pad 17-B at Cape Canaveral Air Force Station. At the pad, Dawn will be lifted into the mobile service tower and prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  10. KSC-07pd2444

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- In the mobile service tower on Launch Pad 17-B at Cape Canaveral Air Force Station, workers remove the lower segments of the transportation canister away from the Dawn spacecraft. After removal of the canister, Dawn will be mated with the waiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. EDT Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jim Grossmann

  11. Multiple Sensor Camera for Enhanced Video Capturing

    NASA Astrophysics Data System (ADS)

    Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

    The resolution of cameras has improved drastically in response to the demand for high-quality digital images. For example, digital still cameras now have several megapixels. Although video cameras have higher frame rates, their resolution is lower than that of still cameras. Thus, high resolution and high frame rate are incompatible in ordinary cameras on the market. It is difficult to solve this problem with a single sensor, since it stems from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera body to capture high-resolution and high-frame-rate information separately. We built a prototype camera that can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500, 90 fps) videos. We also propose a calibration method for the camera. As one application of the camera, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.
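    The pixel-transfer-rate argument above can be made concrete: the enhanced output would require far more pixels per second than either physical sensor actually reads out. The sketch below simply multiplies the frame sizes and rates quoted in the abstract.

      # Pixel rates implied by the modes quoted above.
      def pixel_rate(width, height, fps):
          return width * height * fps

      high_res = pixel_rate(2588, 1958, 3.75)   # ~19 Mpixel/s from the high-resolution sensor
      high_fps = pixel_rate(500, 500, 90)       # ~23 Mpixel/s from the high-frame-rate sensor
      enhanced = pixel_rate(2128, 1952, 90)     # ~374 Mpixel/s for the enhanced video
      print(f"enhanced output needs ~{enhanced / max(high_res, high_fps):.0f}x "
            f"the pixel rate of the fastest single sensor")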

  12. KSC-07pd2401

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers move the platform with the Dawn spacecraft. They are preparing to install the transportation canister around Dawn for transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  13. KSC-07pd2445

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- In the mobile service tower on Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft is ready for mating with the waiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. EDT Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jim Grossmann

  14. KSC-07pd2428

    NASA Image and Video Library

    2007-09-11

    KENNEDY SPACE CENTER, FLA. -- On Launch Pad 17-B at Cape Canaveral Air Force Station, the Dawn spacecraft arrives at the upper level of the mobile service tower. It will be moved inside and prepared for mating with the awaiting Delta II rocket. Dawn is scheduled for launch in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Photo credit: NASA/Jack Pfaller

  15. South Malea Planum, By The Dawn's Early Light

    NASA Technical Reports Server (NTRS)

    1999-01-01

    MOC 'sees' by the dawn's early light! This picture was taken over the high southern polar latitudes during the first week of May 1999. The area shown is currently in southern winter darkness. Because sunlight is scattered over the horizon by aerosols--dust and ice particles--suspended in the atmosphere, sufficient light reaches regions within a few degrees of the terminator (the line dividing night and day) to be visible to the Mars Global Surveyor Mars Orbiter Camera (MOC) when the maximum exposure settings are used.

    This image shows a bright, wispy cloud hanging over southern Malea Planum. This cloud would not normally be visible, since it is currently in darkness. At the time this picture was taken, the sun was more than 5.7° below the northern horizon. The scene covers an area 3 kilometers (1.9 miles) wide. Again, the illumination is from the top.

    In this frame, the surface appears a relatively uniform gray. At the time the picture was acquired, the surface was covered with south polar wintertime frost. The highly reflective frost, in fact, may have contributed to the increased visibility of this surface.

    This 'twilight imaging' technique for viewing Mars can only work near the terminator; thus in early May only regions between about 67°S and 74°S were visible in twilight images in the southern hemisphere, and a similar narrow latitude range could be imaged in the northern hemisphere. MOC cannot 'see' in the total darkness of full-borne night.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  16. South Malea Planum, By The Dawn's Early Light

    NASA Technical Reports Server (NTRS)

    1999-01-01

    MOC 'sees' by the dawn's early light! This picture was taken over the high southern polar latitudes during the first week of May 1999. The area shown is currently in southern winter darkness. Because sunlight is scattered over the horizon by aerosols--dust and ice particles--suspended in the atmosphere, sufficient light reaches regions within a few degrees of the terminator (the line dividing night and day) to be visible to the Mars Global Surveyor Mars Orbiter Camera (MOC) when the maximum exposure settings are used.

    This picture shows a polygonally-patterned surface on southern Malea Planum. At the time the picture was taken, the sun was more than 4.5° below the northern horizon. The scene covers an area 3 kilometers (1.9 miles) wide, with the illumination from the top of the picture.

    In this frame, the surface appears a relatively uniform gray. At the time the picture was acquired, the surface was covered with south polar wintertime frost. The highly reflective frost, in fact, may have contributed to the increased visibility of this surface.

    This 'twilight imaging' technique for viewing Mars can only work near the terminator; thus in early May only regions between about 67°S and 74°S were visible in twilight images in the southern hemisphere, and a similar narrow latitude range could be imaged in the northern hemisphere. MOC cannot 'see' in the total darkness of full-borne night.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  17. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  18. Solid state replacement of rotating mirror cameras

    NASA Astrophysics Data System (ADS)

    Frank, Alan M.; Bartolick, Joseph M.

    2007-01-01

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. Rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture by Prof. Goji Etoh, dubbed the 'In-situ Storage Image Sensor' or 'ISIS', has made its first appearance on the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluating the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  19. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
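
    The coded-aperture idea lends itself to a compact numerical sketch: several sub-frames are multiplied by per-sub-frame masks and summed into a single detector frame, and the sub-frames are then estimated from that one measurement. The toy reconstruction below uses a plain least-squares data term with a temporal-smoothness prior solved by gradient descent; it stands in for (and is much simpler than) the statistical CS inversion described in the paper, and the random per-sub-frame masks, array sizes, and step sizes are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "video": T sub-frames of a bright patch drifting across a small field of view.
    T, H, W = 8, 32, 32
    frames = np.zeros((T, H, W))
    for t in range(T):
        frames[t, 12:20, 2 + 3 * t: 10 + 3 * t] = 1.0

    # Random binary coded-aperture masks, one per sub-frame (a simplification of a shifted mask).
    masks = (rng.random((T, H, W)) > 0.5).astype(float)

    # Forward model: the detector integrates the coded sub-frames into ONE camera frame.
    y = np.sum(masks * frames, axis=0)

    def A(f):            # forward operator: code and integrate
        return np.sum(masks * f, axis=0)

    def At(r):           # adjoint: spread a residual frame back through the masks
        return masks * r[None, :, :]

    # Reconstruction: gradient descent on a least-squares data term plus a
    # temporal-smoothness (Tikhonov) prior -- a stand-in for statistical CS inversion.
    lam, step, f_hat = 0.1, 0.2, np.zeros_like(frames)
    for _ in range(300):
        grad_data = At(A(f_hat) - y)
        grad_smooth = np.zeros_like(f_hat)
        grad_smooth[1:-1] = 2 * f_hat[1:-1] - f_hat[:-2] - f_hat[2:]
        f_hat -= step * (grad_data + lam * grad_smooth)

    print("per-sub-frame reconstruction error:",
          np.round(np.abs(f_hat - frames).mean(axis=(1, 2)), 3))
    ```

    A real inversion would replace the quadratic smoothness prior with the sparsity-promoting statistical model the authors describe, but the forward model (mask, multiply, integrate) is the same.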

  20. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    DOE PAGES

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; ...

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.

  1. Coincidence ion imaging with a fast frame camera

    NASA Astrophysics Data System (ADS)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen

    2014-12-01

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductors) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragments pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.
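
    The per-shot processing chain described above (spot centroiding on a camera frame, peak finding on the digitizer trace, and intensity-to-peak-height correlation for multi-hit events) can be illustrated with standard SciPy tools. This is a hedged sketch, not the authors' real-time code; the thresholds, the synthetic data, and the simple "sort by brightness / sort by peak height" pairing are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage, signal

    def process_shot(frame, trace, spot_thresh=0.3, peak_thresh=0.3):
        """Return (x, y, time-of-flight) triples for one laser shot."""
        # 1) Centroid the ion spots on the camera frame.
        labels, n = ndimage.label(frame > spot_thresh)
        idx = list(range(1, n + 1))
        centroids = ndimage.center_of_mass(frame, labels, idx)      # (row, col) per spot
        intensities = ndimage.sum(frame, labels, idx)                # integrated spot intensity

        # 2) Find arrival-time peaks on the PMT/digitizer trace.
        peaks, props = signal.find_peaks(trace, height=peak_thresh)
        heights = props["peak_heights"]

        # 3) Multi-hit correlation: pair the brightest spot with the tallest peak, and so on.
        spot_order = np.argsort(intensities)[::-1]
        peak_order = np.argsort(heights)[::-1]
        hits = []
        for s, p in zip(spot_order, peak_order):
            row, col = centroids[s]
            hits.append((col, row, peaks[p]))                        # (x, y, TOF sample index)
        return hits

    # Synthetic example: two ion spots and two TOF peaks of matching rank.
    frame = np.zeros((64, 64))
    frame[10:13, 20:23] = 1.0          # bright spot
    frame[40:42, 50:52] = 0.5          # dimmer spot
    trace = np.zeros(500)
    trace[120] = 0.9                   # tall peak  -> bright spot
    trace[300] = 0.45                  # short peak -> dim spot
    print(process_shot(frame, trace))
    ```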

  2. Global variations in regolith properties on asteroid Vesta from Dawn's low-altitude mapping orbit

    NASA Astrophysics Data System (ADS)

    Denevi, Brett W.; Beck, Andrew W.; Coman, Ecaterina I.; Thomson, Bradley J.; Ammannito, Eleonora; Blewett, David T.; Sunshine, Jessica M.; de Sanctis, Maria Cristina; Li, Jian-Yang; Marchi, Simone; Mittlefehldt, David W.; Petro, Noah E.; Raymond, Carol A.; Russell, Christopher T.

    2016-12-01

    We investigate the depth, variability, and history of regolith on asteroid Vesta using data from the Dawn spacecraft. High-resolution (15-20 m pixel-1) Framing Camera images are used to assess the presence of morphologic indicators of a shallow regolith, including the presence of blocks in crater ejecta, spur-and-gully-type features in crater walls, and the retention of small (<300 m) impact craters. Such features reveal that the broad, regional heterogeneities observed on Vesta in terms of albedo and surface composition extend to the physical properties of the upper 1 km of the surface. Regions of thin regolith are found within the Rheasilvia basin and at equatorial latitudes from 0-90°E and 260-360°E. Craters in these areas that appear to excavate material from beneath the regolith have more diogenitic (Rheasilvia, 0-90°E) and cumulate eucrite (260-360°E) compositions. A region of especially thick regolith, where depths generally exceed 1 km, is found from 100-240°E and corresponds to heavily cratered, low-albedo surface with a basaltic eucrite composition enriched in carbonaceous chondrite material. The presence of a thick regolith in this area supports the idea that this is an ancient terrain that has accumulated a larger component of exogenic debris. We find evidence for the gardening of crater ejecta toward more howarditic compositions, consistent with regolith mixing being the dominant form of "weathering" on Vesta.

  3. Detection of serpentine in exogenic carbonaceous chondrite material on Vesta from Dawn FC data

    NASA Astrophysics Data System (ADS)

    Nathues, Andreas; Hoffmann, Martin; Cloutis, Edward A.; Schäfer, Michael; Reddy, Vishnu; Christensen, Ulrich; Sierks, Holger; Thangjam, Guneshwar Singh; Le Corre, Lucille; Mengel, Kurt; Vincent, Jean-Baptist; Russell, Christopher T.; Prettyman, Tom; Schmedemann, Nico; Kneissl, Thomas; Raymond, Carol; Gutierrez-Marques, Pablo; Hall, Ian; Büttner, Irene

    2014-09-01

    The Dawn mission’s Framing Camera (FC) observed Asteroid (4) Vesta in 2011 and 2012 using seven color filters and one clear filter from different orbits. In the present paper we analyze recalibrated HAMO color cubes (spatial resolution ∼60 m/pixel) with a focus on dark material (DM). We present a definition of highly concentrated DM based on spectral parameters, subsequently map the DM across the Vestan surface, geologically classify DM, study its spectral properties on global and local scales, and finally, compare the FC in-flight color data with laboratory spectra. We have discovered an absorption band centered at 0.72 μm in localities of DM that show the lowest albedo values by using FC data as well as spectral information from Dawn’s imaging spectrometer VIR. Such localities are contained within impact-exposed outcrops on inner crater walls and ejecta material. Comparisons between spectral FC in-flight data, and laboratory spectra of meteorites and mineral mixtures in the wavelength range 0.4-1.0 μm, revealed that the absorption band can be attributed to the mineral serpentine, which is typically present in CM chondrites. Dark material in its purest form is rare on Vesta’s surface and is distributed globally in a non-uniform manner. Our findings confirm the hypothesis of an exogenic origin of the DM by the infall of carbonaceous chondritic material, likely of CM type. It further confirms the hypothesis that most of the DM was deposited by the Veneneia impact.

  4. Does Vesta Have Moons?: Dawn's Search for Satellites

    NASA Technical Reports Server (NTRS)

    McFadden, L. A.; Sykes, M. V.; Tricarico, P.; Carsenty, U.; Gutierrez-Marques, P.; Jacobson, R. A.; Joy, S.; Keller, H. U.; Li, J.-Y.; McLean, B.; hide

    2011-01-01

    Upon approach to asteroid 4 Vesta, the Dawn mission included a dedicated satellite search observation of the operational sphere of the spacecraft around Vesta. Discovery of moons of Vesta would constrain theories of satellite formation. The sequence using the framing camera and clear filter includes three mosaics of six stations acquired on July 9-10, 2011. Each station consists of four sets with three different exposures, 1.5, 20 and 270 s. We also processed and scanned the optical navigation sequences until Vesta filled the field of view. Analysis of images involves looking for moving objects in the mosaics and identifying catalogued stars, subtracting them from the image and examining residual objects for evidence of bodies in orbit around Vesta. Celestial coordinates were determined using Astrometry.net, an astrometry calibration service (http://astrometry.net/use.html). We processed the images by subtracting dark and bias fields and dividing by a flatfield. Images were further filtered by subtracting a box-car filter (9x9 average) to remove effects of scattered light from Vesta itself. Images were scanned by eye for evidence of motion in directions different from the background stars. All objects were compared with Hubble Space Telescope's Guide Star Catalogue and US Naval Observatory's UCAC3 catalog. We report findings from these observations and analysis, including limits of magnitude, size and motion of objects in orbit around Vesta. We gratefully acknowledge modifications made to Astrometrica http://www.astrometrica.at/ for purposes of this effort.
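
    The image-reduction steps listed above (dark/bias subtraction, flatfield division, and subtraction of a 9x9 boxcar average to suppress scattered light from Vesta) map directly onto a few NumPy/SciPy calls. The sketch below is illustrative only; the array names, synthetic data, and divide-by-zero guard are assumptions, not the team's pipeline.

    ```python
    import numpy as np
    from scipy import ndimage

    def reduce_frame(raw, dark, bias, flat, box=9):
        """Basic reduction of a satellite-search frame.

        raw, dark, bias, flat : 2-D arrays of identical shape.
        Returns the scattered-light-filtered image used for visual scanning.
        """
        flat = np.where(flat > 0, flat, 1.0)          # guard against divide-by-zero
        calibrated = (raw - bias - dark) / flat       # dark/bias subtraction, flatfield division
        background = ndimage.uniform_filter(calibrated, size=box)  # 9x9 boxcar average
        return calibrated - background                # remove large-scale scattered light

    # Example with synthetic data: a faint point source on a scattered-light gradient.
    rng = np.random.default_rng(0)
    shape = (256, 256)
    bias = np.full(shape, 100.0)
    dark = np.full(shape, 5.0)
    flat = np.ones(shape)
    gradient = np.linspace(0, 50, shape[1])[None, :]  # scattered light from Vesta off-frame
    raw = bias + dark + gradient + rng.normal(0, 1, shape)
    raw[128, 64] += 30.0                              # candidate moving object
    clean = reduce_frame(raw, dark, bias, flat)
    print(np.unravel_index(np.argmax(clean), clean.shape))  # brightest residual pixel ~ (128, 64)
    ```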

  5. Spectral parameters for Dawn FC color data: Carbonaceous chondrites and aqueous alteration products as potential cerean analog materials

    NASA Astrophysics Data System (ADS)

    Schäfer, Tanja; Nathues, Andreas; Mengel, Kurt; Izawa, Matthew R. M.; Cloutis, Edward A.; Schäfer, Michael; Hoffmann, Martin

    2016-02-01

    We identified a set of spectral parameters based on Dawn Framing Camera (FC) bandpasses, covering the wavelength range 0.4-1.0 μm, for mineralogical mapping of potential chondritic material and aqueous alteration products on dwarf planet Ceres. Our parameters are inferred from laboratory spectra of well-described and clearly classified carbonaceous chondrites representative for a dark component. We additionally investigated the FC signatures of candidate bright materials including carbonates, sulfates and hydroxide (brucite), which can possibly be exposed on the cerean surface by impact craters or plume activity. Several materials mineralogically related to carbonaceous chondrites, including pure ferromagnesian phyllosilicates, and serpentinites were also investigated. We tested the potential of the derived FC parameters for distinguishing between different carbonaceous chondritic materials, and between other plausible cerean surface materials. We found that the major carbonaceous chondrite groups (CM, CO, CV, CK, and CR) are distinguishable using the FC filter ratios 0.56/0.44 μm and 0.83/0.97 μm. The absorption bands of Fe-bearing phyllosilicates at 0.7 and 0.9 μm in terrestrial samples and CM carbonaceous chondrites can be detected by a combination of FC band parameters using the filters at 0.65, 0.75, 0.83, 0.92 and 0.97 μm. This set of parameters serves as a basis to identify and distinguish different lithologies on the cerean surface by FC multispectral data.
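
    As a worked illustration of how such filter parameters are applied, the snippet below computes the two discriminating ratios named in the abstract (0.56/0.44 μm and 0.83/0.97 μm) and a simplified 0.70-μm band-depth proxy from reflectances sampled at FC-like band centers. The reflectance values, the linear-continuum band-depth definition, and the dictionary layout are assumptions for demonstration, not calibrated FC data.

    ```python
    import numpy as np

    # Reflectances at Framing Camera-like band centers (values are invented for illustration).
    bands = {"0.44": 0.030, "0.56": 0.034, "0.65": 0.036,
             "0.75": 0.035, "0.83": 0.036, "0.92": 0.035, "0.97": 0.037}

    # Filter ratios used in the abstract to separate carbonaceous-chondrite groups.
    ratio_vis = bands["0.56"] / bands["0.44"]
    ratio_nir = bands["0.83"] / bands["0.97"]

    # Simplified 0.70-um band-depth proxy: linear continuum between 0.65 and 0.83 um,
    # compared with the in-band reflectance at 0.75 um (the paper combines more filters).
    continuum = np.interp(0.75, [0.65, 0.83], [bands["0.65"], bands["0.83"]])
    band_depth_070 = 1.0 - bands["0.75"] / continuum

    print(f"0.56/0.44 um ratio:       {ratio_vis:.3f}")
    print(f"0.83/0.97 um ratio:       {ratio_nir:.3f}")
    print(f"0.70-um band depth proxy: {band_depth_070:.3f}")
    ```

    With calibrated FC color cubes the same quantities would be computed per pixel to produce the parameter maps the abstract describes.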

  6. The application of high-speed photography in z-pinch high-temperature plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Wang, Kui-lu; Qiu, Meng-tong; Hei, Dong-wei

    2007-01-01

    This invited paper reviews the application of high-speed photography to z-pinch high-temperature plasma diagnostics carried out in recent years at the Northwest Institute of Nuclear Technology. The developments and applications of the soft x-ray framing camera, soft x-ray curved-crystal spectrometer, optical framing camera, ultraviolet four-frame framing camera and ultraviolet-visible spectrometer are introduced.

  7. Types and Distribution of Bright Materials in 4 Vesta

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, D. W.; Li, Jian-Yang; Pieters, C. M.; De Sanctis, M. C.; Schroder, S. E.; Hiesinger, H.; Blewett, D. T.; Russell, C. T.; Raymond, C. A.; Yingst, R. A.

    2012-01-01

    A strong case can be made that Vesta is the parent asteroid of the howardite, eucrite and diogenite (HED) meteorites [1]. As such, we have over a century of detailed sample analysis experience to call upon when formulating hypotheses regarding plausible lithologic diversity on Vesta. It thus came as a surprise when Dawn's Framing Camera (FC) first revealed distinctly localized materials of exceptionally low and high albedos, often closely associated. To understand the nature and origin of these materials, and how they inform us of the geological evolution of Vesta, task forces began their study. An initial step of the scientific endeavor is to develop a descriptive, non-genetic classification of objects to use as a basis for developing hypotheses and observational campaigns. Here we present a catalog of the types of light-toned deposits and their distribution across Vesta. A companion abstract [2] discusses possible origins of bright materials and the constraints they suggest for vestan geology.

  8. Preliminary Geological Map of the Ac-H-5 Fejokoo Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Hughson, K.; Russell, C.; Williams, D. A.; Buczkowski, D.; Mest, S. C.; Scully, J. E. C.; Hiesinger, H.; Platz, T.; Ruesch, O.; Schenk, P.; Frigeri, A.; Jaumann, R.; Roatsch, T.; Preusker, F.; Nathues, A.; Hoffmann, M.; Schäfer, M.; Park, R. S.; Marchi, S.; De Sanctis, M. C.; Raymond, C. A.

    2015-12-01

    In order to enable methodical geologic mapping of the surface of Ceres, the Dawn Science Team divided the surface into fifteen quadrangles. A preliminary map of the Fejokoo quadrangle is presented here. This region, located between 21°-66°N and 270°-0°E, hosts four primary features: (1) the centrally located, 90 km diameter, distinctly hexagonal impact crater Fejokoo; (2) a small unnamed crater midway up the eastern boundary of the quadrangle which contains and is surrounded by bright material; (3) an unnamed degraded crater NW of Fejokoo that contains lobate material deposits on both sides of the crater's S rim; and (4) a heavily cratered unit in the NW portion of the quadrangle. Key objectives for the ongoing mapping of this quadrangle are to assess the types of processes that may be responsible for the creation of the hexagonal Fejokoo crater, to identify the source and nature of the bright material on the eastern boundary, to establish possible mechanisms for the emplacement of lobate material deposits in Fejokoo and the unnamed crater to its NW, and to establish a detailed geological history of the quadrangle. The Fejokoo region is not associated with any major albedo feature identified by the Hubble Space Telescope (Li et al., 2006). At the time of this writing, geologic mapping was performed using Framing Camera (FC) mosaics from the Approach (1.3 km/px) and Survey (415 m/px) orbits, including grayscale and color images and digital terrain models derived from stereo images. Future images from the High Altitude Mapping Orbit (140 m/px) and Low Altitude Mapping Orbit (35 m/px) will be used to refine the maps. Support of the Dawn Instrument, Operations, and Science Teams is acknowledged. This work is supported by grants from NASA, and from the German and Italian Space Agencies.

  9. American Carrier Air Power at the Dawn of a New Century

    DTIC Science & Technology

    2005-01-01

    [Fragmentary record excerpt. The recoverable portions comprise acknowledgments (Office of the Secretary of Defense, Operational Test and Evaluation; then-Commander Calvin Craig, OPNAV N81; Captain Kenneth Neubauer), part of an abbreviation list (TACP: Tactical Air Control Party; TARPS: Tactical Air Reconnaissance Pod System; TCS: Television Camera System; TLAM: Tomahawk Land-Attack Missile; TST: Time...), and a fragment noting that the aircraft's systems can store video imagery acquired by the TARPS pod, the pilot's head-up display (HUD), and the Television Camera System (TCS).]

  10. Clues to Ceres' Internal Structure

    NASA Image and Video Library

    2017-10-26

    This frame from an animation shows Ceres as seen by NASA's Dawn spacecraft from its high-altitude mapping orbit at 913 miles (1,470 kilometers) above the surface. The colorful map overlaid at right shows variations in Ceres' gravity field measured by Dawn, and gives scientists hints about the dwarf planet's internal structure. Red colors indicate more positive values, corresponding to a stronger gravitational pull than expected, compared to scientists' pre-Dawn model of Ceres' internal structure; blue colors indicate more negative values, corresponding to a weaker gravitational pull. The animation was created by projecting a map of Ceres onto a rotating sphere. The image scale is about 450 feet (140 meters) per pixel. The animations are available at https://photojournal.jpl.nasa.gov/catalog/PIA22083

  11. Mineralogical Mapping of the Av-5 Floronia Quadrangle of Asteroid 4 Vesta

    NASA Astrophysics Data System (ADS)

    Combe, J.-Ph.; Fulchinioni, M.; McCord, T. B.; Ammannito, E.; De Sanctis, M. C.; Nathues, A.; Capaccioni, F.; Frigeri, A.; Jaumann, R.; Le Corre, L.; Palomba, E.; Preusker, F.; Reddy, V.; Stephan, K.; Tosi, F.; Zambon, F.; Raymond, C. A.; Russell, C. T.

    2012-04-01

    Asteroid 4 Vesta is currently under investigation by NASA's Dawn orbiter. The Dawn Science Team is conducting mineralogical mapping of Vesta's surface in the form of 15 quadrangle maps, and here we report results from the mapping of Floronia quadrangle Av-5. The maps are based on the data acquired by the Visible and Infrared Mapping Spectrometer (VIR-MS) and the Framing Camera (FC) (De Sanctis et al., this meeting). This abstract is focused on the analysis of band ratios, as well as the depth and position of the 2-µm absorption band of pyroxenes, but additional information will be presented. Absorption band depth is sensitive to abundance, texture and multiple scattering effects. Absorption band position is controlled by composition; shorter wavelength positions indicate less calcium (and more magnesium) in pyroxenes. The inferred composition is compared with that of Howardite, Eucrite and Diogenite meteorites (HEDs). Diogenites are Mg-rich with large orthopyroxene crystals suggesting formation at depth; Eucrites contain Ca-poor pyroxene with smaller crystals. The Av-5 Floronia quadrangle is located between ~20°-66°N and 270°-360°E. It covers a portion of the heavily cratered northern hemisphere of Vesta, and part of it is in permanent night until August 2012. Long shadows make the visualization of albedo variations difficult because of the limited effectiveness of photometric corrections. Most of the variations of the band depth at 2 µm are partly affected by illumination geometry in this area. Only regional tendencies are meaningful at this stage of the analysis. The 2-µm absorption band depth seems to be deeper towards the south of the quadrangle, in particular to the south of Floronia crater. It is not possible to interpret the value of the band depth in the floors of the craters because of the absence of direct sunlight. However, the illuminated rims seem to have a deeper 2-µm absorption band, as does the ejecta from an unnamed crater located further south, within quadrangle Av-10 (Tosi et al., 2010, this meeting). The absorption band seems slightly shifted towards shorter wavelengths in the neighborhood of the same crater, which may indicate a more diogenitic composition, consistent with materials of the deeper crust. Relationships between craters, ejecta and composition will be investigated further. The authors acknowledge the support of the Dawn Science, Instrument and Operations Teams. This work was supported by the NASA Dawn Project under contract from UCLA and by the NASA Dawn at Vesta Participating Scientist program.
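
    For readers unfamiliar with the band parameters used above, the sketch below shows one common way to derive a 2-μm band depth and band-center position from a reflectance spectrum: fit a straight-line continuum across the band shoulders, divide it out, and locate the minimum. The synthetic spectrum, shoulder wavelengths, and the parabola fit around the minimum are assumptions for illustration; they are not the VIR-MS processing used by the authors.

    ```python
    import numpy as np

    # Synthetic pyroxene-like reflectance spectrum around the 2-um absorption band.
    wl = np.linspace(1.4, 2.6, 121)                       # micrometers
    true_center, true_depth = 1.95, 0.25
    refl = 0.35 * (1 + 0.05 * (wl - 1.4)) \
           - true_depth * 0.35 * np.exp(-0.5 * ((wl - true_center) / 0.18) ** 2)

    # 1) Linear continuum across the band shoulders (assumed here at 1.45 and 2.55 um).
    shoulders = [1.45, 2.55]
    cont_refl = np.interp(shoulders, wl, refl)
    continuum = np.interp(wl, shoulders, cont_refl)

    # 2) Continuum removal and band depth at the deepest point.
    removed = refl / continuum
    i_min = np.argmin(removed)
    band_depth = 1.0 - removed[i_min]

    # 3) Band center: parabola fit through the three points around the minimum.
    coeffs = np.polyfit(wl[i_min - 1:i_min + 2], removed[i_min - 1:i_min + 2], 2)
    band_center = -coeffs[1] / (2.0 * coeffs[0])

    print(f"band depth  ~ {band_depth:.3f}")
    print(f"band center ~ {band_center:.3f} um (shorter = more Mg-rich, diogenite-like)")
    ```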

  12. Coincidence ion imaging with a fast frame camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei

    2014-12-15

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide semiconductors) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of a PMT processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of a PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragments pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide.

  13. Solar array at sunrise

    NASA Image and Video Library

    2001-04-05

    ISS002-E-5121 (5 April 2001) --- The solar panel supporting the Zvezda Service Module on the International Space Station (ISS) is backdropped against Earth's horizon at dawn. The image was made by one of the Expedition Two crew members using a digital still camera.

  14. Noise and sensitivity of x-ray framing cameras at Nike (abstract)

    NASA Astrophysics Data System (ADS)

    Pawley, C. J.; Deniz, A. V.; Lehecka, T.

    1999-01-01

    X-ray framing cameras are the most widely used tool for radiographing density distributions in laser and Z-pinch driven experiments. The x-ray framing cameras that were developed specifically for experiments on the Nike laser system are described. One of these cameras has been coupled to a CCD camera and was tested for resolution and image noise using both electrons and x rays. The largest source of noise in the images was found to be due to low quantum detection efficiency of x-ray photons.

  15. Stratigraphy and Surface Ages of Dwarf Planet (1) Ceres: Results from Geologic and Topographic Mapping in Survey, HAMO and LAMO Data of the Dawn Framing Camera Images

    NASA Astrophysics Data System (ADS)

    Wagner, R. J.; Schmedemann, N.; Stephan, K.; Jaumann, R.; Neesemann, A.; Preusker, F.; Kersten, E.; Roatsch, T.; Hiesinger, H.; Williams, D. A.; Yingst, R. A.; Crown, D. A.; Mest, S. C.; Raymond, C. A.; Russell, C. T.

    2017-12-01

    Since March 6, 2015, the surface of dwarf planet (1) Ceres is being imaged by the FC framing camera aboard the Dawn spacecraft from orbit at various altitudes [1]. For this study we focus on images from the Survey orbit phase (4424 km altitude) with spatial resolutions of 400 m/pxl and use images and topographic data from DTMs (digital terrain models) for global geologic mapping. On Ceres' surface cratered plains are ubiquitous, with variations in superimposed crater frequency indicating different ages and processes. Here, we take the topography into account for geologic mapping and discriminate cratered plains units according to their topographic level - high-standing, medium, or low-lying - in order to examine a possible correlation between topography and surface age. Absolute model ages (AMAs) are derived from two impact cratering chronology models discussed in detail by [2] (henceforth termed LDM: lunar-derived model, and ADM: asteroid-derived model). We also apply an improved method to obtain relative ages and AMAs from crater frequency measurements termed Poisson timing analysis [3]. Our ongoing analysis shows no trend that the topographic level has an influence on the age of the geologic units. Both high-standing and low-lying cratered plains have AMAs ranging from 3.5 to 1.5 Ga (LDM), versus 4.2 to 0.5 Ga (ADM). Some areas of measurement within these units, however, show effects of resurfacing processes in their crater distributions and feature an older and a younger age. We use LAMO data (altitude: 375 km; resolution 30 m/pxl) and/or HAMO data (altitude: 1475 km; resolution 140 m/pxl) to study local geologic units and their ages, e.g., smaller impact craters, especially those not dated so far with crater measurements and/or those with specific spectral properties [4], deposits of mass wasting (e.g., landslides), and mountains, such as Ahuna Mons. Crater frequencies are used to set these geologic units into the context of Ceres' time-stratigraphic system and chronologic periods [5]. References: [1] Russell C. T., et al. (2016), Science 353, doi:10.1126/science.aaf4219. [2] Hiesinger H. H. et al. (2016), Science 353, doi:10.1126/science.aaf4759. [3] Michael G. G. et al. (2016), Icarus 277, 279-285. [4] Stephan K. et al. (2017), submitted to Icarus. [5] Mest S. C. et al. (2017), LPSC XLVIII, abstr. No. 2512.

  16. Simultaneous tracking and regulation visual servoing of wheeled mobile robots with uncalibrated extrinsic parameters

    NASA Astrophysics Data System (ADS)

    Lu, Qun; Yu, Li; Zhang, Dan; Zhang, Xuebo

    2018-01-01

    This paper presents a global adaptive controller that simultaneously solves tracking and regulation for wheeled mobile robots with unknown depth and uncalibrated camera-to-robot extrinsic parameters. The rotational angle and the scaled translation between the current camera frame and the reference camera frame, as well as those between the desired camera frame and the reference camera frame, can be calculated in real time using pose estimation techniques. A transformed system is first obtained, for which an adaptive controller is then designed to accomplish both tracking and regulation tasks; the controller synthesis is based on Lyapunov's direct method. Finally, the effectiveness of the proposed method is illustrated by a simulation study.

  17. Hardware accelerator design for tracking in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Vohra, Anil

    2011-10-01

    Smart cameras are important components in video analysis. For video analysis, a smart camera needs to detect interesting moving objects, track such objects from frame to frame, and analyze the object tracks in real time. Therefore, real-time tracking is prominent in smart cameras. A software implementation of a tracking algorithm on a general-purpose processor (such as a PowerPC) achieves only a low frame rate, far from real-time requirements. This paper presents a SIMD-based hardware accelerator designed for real-time tracking of objects in a scene. The system is designed and simulated using VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA. The resulting frame rate is 30 frames per second for 250x200-pixel grayscale video.
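
    As context for what the accelerator parallelizes, the snippet below sketches the generic detect-and-track loop in plain Python: background differencing to find moving objects, centroid extraction, and nearest-neighbor association from frame to frame. This is an illustrative software baseline only (the kind of processing that runs slowly on a general-purpose processor), not the paper's SIMD/VHDL design; thresholds and the synthetic frames are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def detect(frame, background, thresh=30):
        """Return centroids of moving blobs via background differencing."""
        moving = np.abs(frame.astype(float) - background) > thresh
        labels, n = ndimage.label(moving)
        return np.array(ndimage.center_of_mass(moving, labels, list(range(1, n + 1))))

    def track(frames, background, max_jump=20.0):
        """Greedy nearest-neighbor association of detections into tracks."""
        tracks = []                                   # each track is a list of (row, col)
        for frame in frames:
            for det in detect(frame, background):
                dists = [np.linalg.norm(det - np.array(t[-1])) for t in tracks]
                if dists and min(dists) < max_jump:
                    tracks[int(np.argmin(dists))].append(tuple(det))
                else:
                    tracks.append([tuple(det)])       # start a new track
        return tracks

    # Synthetic 250x200 grayscale sequence with one object drifting to the right.
    background = np.zeros((200, 250))
    frames = []
    for t in range(10):
        f = background.copy()
        f[100:105, 10 + 8 * t: 15 + 8 * t] = 255
        frames.append(f)
    print([len(trk) for trk in track(frames, background)])   # one track with 10 positions
    ```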

  18. Animation of Ceres

    NASA Image and Video Library

    2015-01-19

    This frame from an animation comes from NASA Dawn spacecraft as it observed Ceres for an hour on Jan. 13, 2015, from a distance of 238,000 miles 383,000 kilometers. http://photojournal.jpl.nasa.gov/catalog/PIA19168

  19. Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit

    NASA Technical Reports Server (NTRS)

    Ansar, Adnan I.; Clouse, Daniel S.; McHenry, Michael C.; Zarzhitsky, Dimitri V.; Pagdett, Curtis W.

    2013-01-01

    This software automatically calibrates a camera or an imaging array to an inertial navigation system (INS) that is rigidly mounted to the array or imager. In effect, it recovers the coordinate frame transformation between the reference frame of the imager and the reference frame of the INS. This innovation can automatically derive the camera-to-INS alignment using image data only. The assumption is that the camera fixates on an area while the aircraft flies on orbit. The system then, fully automatically, solves for the camera orientation in the INS frame. No manual intervention or ground tie point data is required.
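
    A common way to recover a camera-to-INS rotation from motion alone, in the spirit of the alignment problem described above, is a hand-eye-style solution: relative rotations measured by the camera and by the INS over the same intervals have rotation axes related by the unknown mounting rotation, which can be estimated by aligning those axes with an SVD (Kabsch) fit. The sketch below is a generic illustration of that idea, not the software described in this record; the axis-extraction approach, synthetic motions, and noise-free assumption are mine.

    ```python
    import numpy as np

    def rot(axis, angle):
        """Rotation matrix from an axis-angle pair (Rodrigues formula)."""
        a = np.asarray(axis, float) / np.linalg.norm(axis)
        K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
        return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

    def rotation_axis(R):
        """Unit rotation axis of a rotation matrix (from its skew-symmetric part)."""
        v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
        return v / np.linalg.norm(v)

    def calibrate(cam_rel, ins_rel):
        """Estimate the camera-from-INS rotation X such that A_i = X B_i X^T."""
        A = np.array([rotation_axis(R) for R in cam_rel])   # axes seen by the camera
        B = np.array([rotation_axis(R) for R in ins_rel])   # same motions, INS frame
        U, _, Vt = np.linalg.svd(A.T @ B)                   # Kabsch alignment of the axis sets
        D = np.diag([1, 1, np.sign(np.linalg.det(U @ Vt))]) # keep a proper rotation
        return U @ D @ Vt

    # Synthetic check: pick a ground-truth mounting rotation and several INS motions.
    X_true = rot([0.2, 0.9, 0.1], 0.4)
    ins_rel = [rot([1, 0, 0], 0.3), rot([0, 1, 0], 0.5), rot([1, 1, 1], 0.2)]
    cam_rel = [X_true @ B @ X_true.T for B in ins_rel]
    X_est = calibrate(cam_rel, ins_rel)
    print(np.allclose(X_est, X_true, atol=1e-6))            # True
    ```

    With noisy real motions, many interval pairs would be accumulated, and a second least-squares step would recover any translation component of the mounting.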

  20. Coincidence electron/ion imaging with a fast frame camera

    NASA Astrophysics Data System (ADS)

    Li, Wen; Lee, Suk Kyoung; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander; Fan, Lin

    2015-05-01

    A new time- and position- sensitive particle detection system based on a fast frame CMOS camera is developed for coincidence electron/ion imaging. The system is composed of three major components: a conventional microchannel plate (MCP)/phosphor screen electron/ion imager, a fast frame CMOS camera and a high-speed digitizer. The system collects the positional information of ions/electrons from a fast frame camera through real-time centroiding while the arrival times are obtained from the timing signal of MCPs processed by a high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of electron/ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum. Efficient computer algorithms are developed to process camera frames and digitizer traces in real-time at 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragments pair (methyl and iodine cations) produced from strong field dissociative double ionization of methyl iodide. We further show that a time resolution of 30 ps can be achieved when measuring electron TOF spectrum and this enables the new system to achieve a good energy resolution along the TOF axis.

  1. Ice under cover: Using bulk spatial and physical properties of probable ground ice driven mass wasting features on Ceres to better understand its surface

    NASA Astrophysics Data System (ADS)

    Hughson, K.; Russell, C.; Schmidt, B. E.; Chilton, H.; Scully, J. E. C.; Castillo, J. C.; Combe, J. P.; Ammannito, E.; Sizemore, H.; Platz, T.; Byrne, S.; Nathues, A.; Raymond, C. A.

    2016-12-01

    NASA's Dawn spacecraft arrived at Ceres on March 6, 2015, and has been studying the dwarf planet through a series of successively lower orbits, obtaining morphological and topographical image, mineralogical, elemental composition, and gravity data (Russell et al., 2016). Images taken by Dawn's Framing Camera show a multitude of flow features that were broadly interpreted as ground ice related structures either similar to ice cored/ice cemented flows (as seen on Earth and Mars), long run-out landslides, or fluidized ejecta (as seen on Mars) by Schmidt et al. (2016a and 2016b) and Buczkowski et al. (2016). The aforementioned ice cored/ice cemented-like flows are present only at high latitudes. Results from Dawn's Gamma Ray and Neutron Detector (GRaND) indicate a shallow ice table on Ceres above 45-50°N/S, which supports the interpretation that these flows are ice-rich (Prettyman et al., 2016). A near coincident spectral detection of H2O ice with one of these ice cored/ice cemented-like flows in Oxo crater by Dawn's Visual and Infrared spectrometer (VIR) further bolsters this claim (Combe et al., 2016). We use aggregate spatial and physical properties of these ice attributed cerean flows, such as flow orientation, inclination, preference for north or south facing slopes, drop height to run-out length ratio, geographical location, and areal number density to better understand the rheology and distribution of ground ice in Ceres' uppermost layer. By combining these data with local spectroscopic, global elemental abundance, experimentally derived physical properties of cerean analogue material, and other morphological information (such as the morphologies of flow hosting craters) we intend to further test the ground ice hypothesis for the formation of these flows and constrain the global distribution of near surface ground ice on Ceres to a higher fidelity than what would be possible using GRaND and VIR observations alone. References: Buczkowski et al. (2016) Science, accepted; Combe, J.-P., et al. (2016) Science, accepted; Prettyman, T. H., et al. (2016) LPSC XVII, Abstract #2228; Russell et al. (2016) Science, accepted; Schmidt, B. E., et al. (2016a) LPSC XVII, Abstract #2677; Schmidt, B. E., et al. (2016b) Nature Geoscience, in review.

  2. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

  3. A novel simultaneous streak and framing camera without principle errors

    NASA Astrophysics Data System (ADS)

    Jingzhen, L.; Fengshan, S.; Ningwen, L.; Xiangdong, G.; Bin, H.; Qingyang, W.; Hongyi, C.; Yi, C.; Xiaowei, L.

    2018-02-01

    A novel simultaneous streak and framing camera with continuous access has been developed; the complete information it records is far more important for the exact interpretation and precise evaluation of many detonation events and shock-wave phenomena. The camera has a maximum imaging frequency of 2 × 10⁶ fps and a maximum scanning velocity of 16.3 mm/μs. For the framing record it achieves an intrinsic resolution of over 40 lp/mm in the temporal direction and over 60 lp/mm in the spatial direction with zero framing-frequency principle error; for the streak record it achieves a maximum time resolving power of 8 ns and a scanning-velocity nonuniformity of 0.136% to -0.277%. The test data have verified the performance of the camera quantitatively. The camera, which acquires frames and streak simultaneously, parallax-free and on an identical time base, is characterized by a plane optical system at oblique incidence (as opposed to a spatial system), an innovative camera obscura without principle errors, and a high-speed, motor-driven, beryllium-like rotating mirror made of high-strength aluminum alloy with a cellular lateral structure. Experiments demonstrate that the camera is useful and reliable for taking high-quality pictures of detonation events.

  4. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a realtime, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in High-Frame-Rate CCD Camera Having Subwindow Capability (NPO- 30564) NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/ transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card. These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).

  5. Initial Demonstration of 9-MHz Framing Camera Rates on the FAST UV Drive Laser Pulse Trains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumpkin, A. H.; Edstrom Jr., D.; Ruan, J.

    2016-10-09

    We report the configuration of a Hamamatsu C5680 streak camera as a framing camera to record transverse spatial information of green-component laser micropulses at 3- and 9-MHz rates for the first time. The latter is near the time scale of the ~7.5-MHz revolution frequency of the Integrable Optics Test Accelerator (IOTA) ring and its expected synchrotron radiation source temporal structure. The 2-D images are recorded with a Gig-E readout CCD camera. We also report a first proof of principle with an OTR source using the linac streak camera in a semi-framing mode.

  6. Advances in x-ray framing cameras at the National Ignition Facility to improve quantitative precision in x-ray imaging

    DOE PAGES

    Benedetti, L. R.; Holder, J. P.; Perkins, M.; ...

    2016-02-26

    We describe an experimental method to measure the gate profile of an x-ray framing camera and to determine several important functional parameters: relative gain (between strips), relative gain droop (within each strip), gate propagation velocity, gate width, and actual inter-strip timing. Several of these parameters cannot be measured accurately by any other technique. This method is then used to document cross talk-induced gain variations and artifacts created by radiation that arrives before the framing camera is actively amplifying x-rays. Electromagnetic cross talk can cause relative gains to vary significantly as inter-strip timing is varied. This imposes a stringent requirement for gain calibration. If radiation arrives before a framing camera is triggered, it can cause an artifact that manifests as a high-intensity, spatially varying background signal. Furthermore, we have developed a device that can be added to the framing camera head to prevent these artifacts.

  7. Advances in x-ray framing cameras at the National Ignition Facility to improve quantitative precision in x-ray imaging.

    PubMed

    Benedetti, L R; Holder, J P; Perkins, M; Brown, C G; Anderson, C S; Allen, F V; Petre, R B; Hargrove, D; Glenn, S M; Simanovskaia, N; Bradley, D K; Bell, P

    2016-02-01

    We describe an experimental method to measure the gate profile of an x-ray framing camera and to determine several important functional parameters: relative gain (between strips), relative gain droop (within each strip), gate propagation velocity, gate width, and actual inter-strip timing. Several of these parameters cannot be measured accurately by any other technique. This method is then used to document cross talk-induced gain variations and artifacts created by radiation that arrives before the framing camera is actively amplifying x-rays. Electromagnetic cross talk can cause relative gains to vary significantly as inter-strip timing is varied. This imposes a stringent requirement for gain calibration. If radiation arrives before a framing camera is triggered, it can cause an artifact that manifests as a high-intensity, spatially varying background signal. We have developed a device that can be added to the framing camera head to prevent these artifacts.

  8. Image synchronization for 3D application using the NanEye sensor

    NASA Astrophysics Data System (ADS)

    Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Dias, Morgado

    2015-03-01

    Based on Awaiba's NanEye CMOS image sensor family and an FPGA platform with a USB3 interface, the aim of this paper is to demonstrate a novel technique to perfectly synchronize up to 8 individual self-timed cameras. Minimal-form-factor self-timed camera modules of 1 mm x 1 mm or smaller do not generally allow external synchronization. However, for stereo vision or 3D reconstruction with multiple cameras, as well as for applications requiring pulsed illumination, it is necessary to synchronize multiple cameras. In this work, the challenge of synchronizing multiple self-timed cameras with only a 4-wire interface has been solved by adaptively regulating the power supply of each camera to synchronize their frame rate and frame phase. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame based on a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames of multiple cameras, a Master-Slave interface was implemented. A single camera is defined as the Master entity, with its operating frequency controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera control module. This enables the Slave cameras to monitor the Master's line and frame period and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the realization of 3D stereo vision equipment smaller than 3 mm in diameter for medical endoscopic contexts, such as endoscopic surgical robotics or micro-invasive surgery.
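
    The control idea described above, regulating each sensor's supply voltage so that its measured line period converges on the Master's, can be sketched as a simple discrete-time proportional controller. The snippet below is a behavioral illustration only, not Awaiba's FPGA control core; the linear voltage-to-line-period model, the gains, and the units are assumptions.

    ```python
    # Behavioral sketch of frame-rate locking by supply-voltage regulation.
    # Assumption: the sensor's line period shortens roughly linearly as voltage rises.

    def line_period_us(voltage):
        return 40.0 - 8.0 * (voltage - 1.8)      # toy sensor model (microseconds per line)

    def lock_to_master(target_period_us, kp=0.02, steps=50, v0=1.75):
        """Proportional control of the supply voltage until the measured line
        period matches the Master's line period (frequency lock); a small extra
        trim on the same error could then pull the frame phase in as well."""
        v = v0
        for _ in range(steps):
            error = line_period_us(v) - target_period_us   # measured vs. desired
            v += kp * error                                # raise the voltage if too slow
        return v, line_period_us(v)

    master_period = 40.0                                   # us, measured on the Master camera
    for cam in range(3):                                   # three Slave cameras
        v, period = lock_to_master(master_period, v0=1.70 + 0.03 * cam)
        print(f"slave {cam}: supply = {v:.3f} V, line period = {period:.3f} us")
    ```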

  9. Development of two-framing camera with large format and ultrahigh speed

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaoguo; Wang, Yuan; Wang, Yi

    2012-10-01

    A high-speed imaging facility is important and necessary for building a time-resolved measurement system with multi-framing capability. A framing camera that satisfies the demands of both high speed and large format needs to be specially developed for the ultrahigh-speed research field. A two-framing camera system with high sensitivity and time resolution has been developed and used for the diagnosis of electron-beam parameters of the Dragon-I linear induction accelerator (LIA). The camera system, which adopts the principle of light-beam splitting in the image space behind a long-focal-length lens, mainly consists of a lens-coupled gated image intensifier, a CCD camera and a high-speed shutter trigger device based on a programmable integrated circuit. The fastest gating time is about 3 ns, and the interval time between the two frames can be adjusted discretely in steps of 0.5 ns. Both the gating time and the interval time can be tuned independently up to a maximum of about 1 s. Two images, each 1024×1024 pixels, can be captured simultaneously with the developed camera. In addition, the camera system possesses good linearity, a uniform spatial response and an equivalent background illumination as low as 5 electrons/pixel/sec, which fully meets the measurement requirements of the Dragon-I LIA.

  10. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. The evaluation of possible applications for frame cameras, for SEOS, are studied. The computed lens characteristics for each camera are listed.

  11. Cheetah: A high frame rate, high resolution SWIR image camera

    NASA Astrophysics Data System (ADS)

    Neys, Joel; Bentell, Jonas; O'Grady, Matt; Vermeiren, Jan; Colin, Thierry; Hooylaerts, Peter; Grietens, Bob

    2008-10-01

    A high-resolution, high-frame-rate InGaAs-based image sensor and associated camera have been developed. The sensor and the camera are capable of recording and delivering more than 1700 full 640x512-pixel frames per second. The FPA utilizes a low-lag CTIA current integrator in each pixel, enabling integration times shorter than one microsecond. On-chip logic allows four different sub-windows to be read out simultaneously at even higher rates. The spectral sensitivity of the FPA is situated in the SWIR range [0.9-1.7 μm] and can be further extended into the visible and NIR range. The Cheetah camera has up to 16 GB of on-board memory to store the acquired images and transfers the data over a Gigabit Ethernet connection to the PC. The camera is also equipped with a full CameraLink(TM) interface to directly stream the data to a frame grabber or dedicated image-processing unit. The Cheetah camera is completely under software control.

  12. High-frame rate multiport CCD imager and camera

    NASA Astrophysics Data System (ADS)

    Levine, Peter A.; Patterson, David R.; Esposito, Benjamin J.; Tower, John R.; Lawler, William B.

    1993-01-01

    A high frame rate visible CCD camera capable of operation up to 200 frames per second is described. The camera produces a 256 X 256 pixel image by using one quadrant of a 512 X 512 16-port, back illuminated CCD imager. Four contiguous outputs are digitally reformatted into a correct, 256 X 256 image. This paper details the architecture and timing used for the CCD drive circuits, analog processing, and the digital reformatter.

  13. Smooth pond-like deposits on asteroid 4 Vesta: First results from the Dawn mission.

    NASA Astrophysics Data System (ADS)

    Hiesinger, H.; Ruesch, O.; Jaumann, R.; Nathues, A.; Raymond, C. A.; Russell, C. T.

    2012-04-01

    The Dawn spacecraft arrived at Vesta on July 16, 2011 to study the asteroid with a Framing Camera (FC), a Visible & Infrared Spectrometer (VIR), and a Gamma Ray and Neutron Detector (GRaND) [1]. Dawn provides the first high-resolution data from its survey orbit, high-altitude mapping orbit (HAMO), and low-altitude mapping orbit (LAMO). FC data revealed smooth pond-like deposits of ambiguous origin, similar to deposits on other asteroids, including Eros and Itokawa [2,3]. Several scenarios for the origin of these deposits can be tested with Dawn data, including volcanism, impact sedimentation, impact melt deposition, dust levitation and transport, seismic shaking, or landslides. We measured 83 small (~7 km2 average size) smooth deposits distributed across the surface of Vesta. Most ponds on Vesta occur on the floors of impact craters and in irregular depressions. We did not observe inflow of material into the depressions. Most of these deposits have well-defined geological contacts, indicating that they are younger than the surrounding terrain. However, lunar impact melt pools that formed contemporaneously with surrounding ejecta blankets show similar stratigraphic relationships. Sometimes the albedo of these ponds is lower than the surrounding terrain, in other cases the ponds are indistinguishable from the adjacent terrain. The ponds preferentially occur in a band between -10 and 30 degrees latitude with fewer ponds north of ~30 degrees and even fewer ponds in the southern hemisphere, i.e., the Rheasilvia region. The largest cluster of ponds occurs in the vicinity of the Marcia impact crater, which is part of the so-called snowman craters. Similar, but smaller (<230 m diameter) smooth ponds were also reported from the surface of asteroid Eros [2]. Robinson et al. [2] found that most smooth ponds on Eros occur in equatorial regions and concluded that the most likely process for their formation is electrostatic levitation and redistribution of the finest regolith components (<100 µm). Sierks et al. [4] argued that along the terminator, particularly strong electric fields can develop between the sun-lit and shaded areas, e.g., within craters, resulting in particle motion from sun-lit to dark regions. Dust levitation and transport was also discussed for asteroid 25143 Itokawa [3]. [1] Russell et al., (2007), Earth Moon Planets, 101; [2] Robinson et al., (2002), Met. Planet. Sci., 37; [3] Yano et al., (2006), Science, 312; [4] Sierks et al., (2011), Space Sci. Rev., doi:10.1007/s11214-011-9745-4. This research has been supported by the German Space Agency (DLR) and NASA. We would like to thank the Dawn Operations Team for their success-ful planning and acquisition of high-quality Vesta data.

  14. KSC-07pd2405

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers guide the upper transportation canister toward the Dawn spacecraft in the background. The canister will be lowered onto the lower segments and attached. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  15. KSC-07pd2407

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers watch as the upper transportation canister is lowered over the Dawn spacecraft. The canister will be attached to the bottom segments already in place. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  16. KSC-07pd2403

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers place another segment of the transportation canister around the upper stage booster beneath the Dawn spacecraft. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  17. KSC-07pd2404

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers check the fitting on the lower transportation canister segments in place around the upper stage booster beneath the Dawn spacecraft. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  18. KSC-07pd2402

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers place the lower segments of the transportation canister around the upper stage booster beneath the Dawn spacecraft. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  19. KSC-07pd2406

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers guide the upper transportation canister as it is lowered onto the Dawn spacecraft. The canister will be attached to the bottom segments already in place. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  20. Hardware accelerator design for change detection in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Chaudhury, Santanu; Vohra, Anil

    2011-10-01

    Smart cameras are important components in human-computer interaction. In any remote surveillance scenario, smart cameras have to make intelligent decisions, selecting frames with significant changes in order to minimize communication and processing overhead. Among the many algorithms for change detection, one based on a clustering scheme was proposed for smart camera systems. However, such an algorithm achieves only a low frame rate, far from real-time requirements, on the general-purpose processors (such as PowerPC) available on FPGAs. This paper proposes a hardware accelerator, based on the clustering change-detection scheme, that is capable of detecting changes in a scene in real time. The system was designed and simulated in VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA board. The resulting frame rate is 30 frames per second for QVGA resolution in grayscale.
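
    As a software reference for the kind of processing such an accelerator performs, the sketch below maintains a small set of intensity clusters per pixel and flags pixels that do not match a well-supported cluster. The number of clusters, the thresholds, and the update rules are illustrative assumptions; they do not reproduce the paper's VHDL design.

        import numpy as np

        # Per-pixel clustering model: a few intensity clusters with weights.
        N_CLUSTERS = 3     # clusters maintained per pixel
        MATCH_TH   = 15    # grey-level distance counted as a cluster match
        ALPHA      = 0.05  # centroid / weight learning rate

        def init_model(first_frame):
            h, w = first_frame.shape
            centroids = np.repeat(first_frame[..., None], N_CLUSTERS, axis=2).astype(np.float32)
            weights = np.zeros((h, w, N_CLUSTERS), dtype=np.float32)
            weights[..., 0] = 1.0  # seed the first cluster from the first frame
            return centroids, weights

        def detect_changes(frame, centroids, weights):
            """Return a boolean change mask and update the model in place."""
            dist = np.abs(centroids - frame[..., None])   # distance to each cluster
            best = np.argmin(dist, axis=2)                # nearest cluster per pixel
            idx = best[..., None]
            best_dist = np.take_along_axis(dist, idx, axis=2)[..., 0]
            matched = best_dist < MATCH_TH

            # A pixel is "changed" if it does not match a well-supported cluster.
            best_w = np.take_along_axis(weights, idx, axis=2)[..., 0]
            change_mask = ~matched | (best_w < 0.3)

            # Blend matched clusters toward the new value; overwrite the nearest
            # cluster on a miss. Weights decay, and the selected cluster is boosted.
            cur = np.take_along_axis(centroids, idx, axis=2)[..., 0]
            upd = np.where(matched, (1 - ALPHA) * cur + ALPHA * frame, frame)
            np.put_along_axis(centroids, idx, upd[..., None], axis=2)
            weights *= (1 - ALPHA)
            np.put_along_axis(weights, idx,
                              np.take_along_axis(weights, idx, axis=2) + ALPHA, axis=2)
            return change_mask

        # Toy example: a uniform +30 grey-level shift flags every pixel as changed.
        rng = np.random.default_rng(1)
        f0 = rng.integers(0, 256, (120, 160)).astype(np.float32)
        centroids, weights = init_model(f0)
        print(detect_changes(f0 + 30, centroids, weights).mean())  # 1.0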

  1. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    This paper outlines the R&D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently being continued as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been conducted (1) on users' requirements for high-speed multi-framing and video cameras, through questionnaires and hearings, and (2) on the current availability of cameras of this sort, through searches of journals and websites. Both support the need to develop a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps based on an ISIS, an In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Design of a prototype ISIS is currently under way and, hopefully, it will be fabricated in the near future. Epoch-making cameras developed by others in the history of high-speed video cameras are also briefly reviewed.

  2. Preliminary Geological Map of the Ac-H-3 Dantu Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Kneissl, T.; Schmedemann, N.; Neesemann, A.; Williams, D. A.; Crown, D. A.; Mest, S. C.; Buczkowski, D.; Scully, J. E. C.; Frigeri, A.; Ruesch, O.; Hiesinger, H.; Walter, S. H. G.; Jaumann, R.; Roatsch, T.; Preusker, F.; Nathues, A.; Platz, T.; Hoffmann, M.; Schäfer, M.; De Sanctis, M. C.; Raymond, C. A.; Russell, C. T.; Kersten, E.; Naß, A.

    2015-12-01

    We are using Dawn spacecraft data to create a geologic map of the Ac-H-3 Dantu Quadrangle of dwarf planet Ceres. The quadrangle is located between 21-66˚N and 90-180˚E and includes the following dominant features: 1) the central and northern portion of the 124.6 km diameter impact crater Dantu; 2) crater chains and/or grooves oriented in an east-west direction; 3) a portion of the 84 km diameter impact crater Gaue, whose ejecta blanket covers the SW corner of the quadrangle. Dantu is a complex impact crater showing terraces, a central pit structure, concentric fractures, and smooth deposits on the crater floor. The materials interpreted to be ejecta deposits of Dantu show low crater frequencies and dominate the southern half of the quadrangle. These deposits appear to be relatively bright and correspond to parts of the #2 high albedo region observed by (1) with the HST indicating different composition and/or material properties than the surroundings. The east-west striking crater chains and grooves are mainly found in the southern half of the quadrangle. They seem to be connected to the crater chains found in Ac-H-4 Ezinu, the neighboring quadrangle to the east, and are potentially related to ballistic ejecta emplacement (see 2). Further work will be focused on Dantu crater and its complex interior and exterior. The current geologic map is based on Framing Camera (FC) image mosaics derived from Approach (~1.3 km/px) and Survey (~400 m/px) data as well as digital terrain models (DTMs) derived from stereo imagery. In the course of the mission, we will incorporate mosaics from the High Altitude Mapping Orbit (~140 m/px, Fall 2015) and Low Altitude Mapping Orbit (~35 m/px, Spring 2016) phases. We acknowledge the support of the Dawn Instrument, Operations, and Science Teams. This work is partly supported by the German Space Agency (DLR), grant 50 OW 1101. (1) Li, J-Y. et al. (2006), Icarus, 182, 143-160. (2) Scully, J.E.C. et al. (2015), this conference.

  3. Preliminary Geological Map of the Ac-H-2 Coniraya Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Hiesinger, H.; Pasckert, J. H.; Williams, D. A.; Crown, D. A.; Mest, S. C.; Buczkowski, D.; Schenk, P.; Scully, J. E. C.; Jaumann, R.; Roatsch, T.; Preusker, F.; Platz, T.; Nathues, A.; Hoffmann, M.; Marchi, S.; De Sanctis, M. C.; Russell, C. T.; Raymond, C. A.

    2015-12-01

    To better understand the geologic history of dwarf planet Ceres, the surface has been divided into 15 quadrangles that are systematically mapped on the basis of images obtained by NASA's Dawn spacecraft, which began orbiting Ceres in April 2015. We will report on preliminary mapping results for the Ac-H-2 Coniraya Quadrangle based on Framing Camera (FC) mosaics from the Dawn Approach (1.3 km/px) and Survey (415 m/px) orbits. This quadrangle is located between 21-66°N and 0-90°E and is dominated by mostly highly degraded impact craters with diameters between 50 and 200 km and by clusters of small- to midsize impact craters. Color data show that this quadrangle is generally darker than most regions of the southern hemisphere. Two prominent impact craters in this quadrangle have been named Coniraya and Gaue. Coniraya, centered at 65.8°N/40.5°E, is the largest more-or-less intact impact crater, with a diameter of 136 km. It appears shallow and its crater rim is heavily degraded but still continuous. At the current image resolution, textural differences between the interior and exterior of the crater are not visible. With a diameter of 84 km, Gaue appears to be the freshest large impact crater in this quadrangle. It is located at the eastern border of the Coniraya Quadrangle, with a small central peak at 30°N/85.7°E. The crater rim is quite sharp and the ejecta blanket can be traced around the crater to a distance of ~200 km from the crater center. Most of the crater floor around the central peak is covered by a smooth uniform unit with a lower impact crater population than the surrounding surfaces. Color data show that this smooth unit is darker than the surrounding surfaces. A similar unit can be found on the floor of a complex cluster of 10-56 km diameter craters at 32°N/40°E. With upcoming higher resolution data we will refine our geologic map and will specifically investigate possible formation processes of these smooth units.

  4. Preliminary Geological Map of the Ac-H-7 Kerwan Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Williams, D. A.; Crown, D. A.; Mest, S. C.; Buczkowski, D.; Schenk, P.; Scully, J. E. C.; Jaumann, R.; Roatsch, T.; Preusker, F.; Platz, T.; Nathues, A.; Hoffmann, M.; Schäfer, M.; Marchi, S.; De Sanctis, M. C.; Russell, C. T.; Raymond, C. A.

    2015-12-01

    We used geologic mapping applied to Dawn spacecraft data as a tool to understand the geologic history of the Ac-H-7 Kerwan Quadrangle of dwarf planet Ceres. This region, located between 22˚S-22˚N and 72-144˚E, hosts four primary features: 1) the northern part of the 284 km diameter impact basin Kerwan in the center and SE corner of the quadrangle, whose rim is degraded and whose interior has been filled with a 'smooth material' that hosts a significantly lower impact crater density than most of the rest of Ceres' surface; 2) a portion of the 125 km diameter crater Dantu, whose ejecta field covers the NE corner of the quadrangle and where color data show both bright and dark materials, suggesting excavation of terrains of different compositions; 3) an unnamed double crater in the NW corner of the quadrangle surrounded by an ejecta field; and 4) a heavily cratered plains unit in the SW corner of the quadrangle that appears to be part of the dominant unit across Ceres' surface. Key goals of the ongoing mapping are to assess the types of processes that might be responsible for resurfacing by the smooth unit and to understand the nature of the variably-colored Dantu ejecta. The Dantu region is one of two longitudinally distinct regions on Ceres where ESA Herschel space telescope data suggested a release of water vapor (1). At the time of this writing geologic mapping was performed on Framing Camera (FC) mosaics from the Approach (1.3 km/px) and Survey (415 m/px) orbits, including grayscale and color images and digital terrain models derived from stereo images. In Fall 2015 images from the High Altitude Mapping Orbit (140 m/px) will be used to refine the mapping, followed by Low Altitude Mapping Orbit (35 m/px) images in January 2016. Support of the Dawn Instrument, Operations, and Science Teams is acknowledged. This work is supported by grants from NASA, and from the German and Italian Space Agencies. Reference: (1) Küppers, M., et al. (2014). Nature, v. 505, 525-527.

  5. Preliminary Geological Map of the Ac-H-12 Toharu Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Mest, S. C.; Williams, D. A.; Crown, D. A.; Yingst, R. A.; Buczkowski, D.; Schenk, P.; Scully, J. E. C.; Jaumann, R.; Roatsch, T.; Preusker, F.; Platz, T.; Nathues, A.; Hoffmann, M.; Schäfer, M.; Marchi, S.; De Sanctis, M. C.; Russell, C. T.; Raymond, C. A.

    2015-12-01

    We are using recent data from the Dawn spacecraft to map the geology of the Ac-H-12 Toharu Quadrangle (21-66°S, 90-180°E) of the dwarf planet Ceres in order to examine its surface geology and understand its geologic history. At the time of this writing, mapping was performed on Framing Camera (FC) mosaics from late Approach (1.3 km/px) and Survey (415 m/px) orbits, including clear filter and color images and digital terrain models derived from stereo images. Images from the High Altitude Mapping Orbit (140 m/px) will be used to refine the map in Fall 2015, followed by the Low Altitude Mapping Orbit (35 m/px) starting in December 2015. The quad is named after crater Toharu (87 km diameter; 49°S, 155°E). The southern rim of Kerwan basin (284 km diameter) is visible along the northern edge of the quad, where it is preserved as a low-relief scarp. The quad exhibits smooth terrain in the north and more heavily cratered terrain in the south. The smooth terrain forms nearly flat-lying plains in some areas, such as on the floor and to the southeast of Kerwan, and overlies hummocky materials in other areas. These smooth materials extend over a much broader area outside of the quad, and appear to contain some of the lowest crater densities on Ceres. Impact craters exhibit a range of sizes and preservation states. Smaller craters (<40 km) generally appear morphologically "fresh", and their rims are nearly circular and raised above the surrounding terrain. Larger craters, such as Toharu, appear more degraded, exhibiting irregularly shaped, sometimes scalloped, rim structures, and debris lobes on their floors. Numerous craters (>20 km) contain central mounds; at current FC resolution, it is difficult to discern if these are primary structures (i.e., central peaks) or secondary features. Support of the Dawn Instrument, Operations, & Science Teams is acknowledged. This work is supported by grants from NASA, DLR and MPG.

  6. Preliminary Geological Map of the Ac-H-14 Yalode Quadrangle of Ceres: An Integrated Mapping Study Using Dawn Spacecraft Data

    NASA Astrophysics Data System (ADS)

    Crown, D. A.; Yingst, R. A.; Mest, S. C.; Platz, T.; Williams, D. A.; Buczkowski, D.; Schenk, P.; Scully, J. E. C.; Jaumann, R.; Roatsch, T.; Preusker, F.; Nathues, A.; Hoffmann, M.; Schäfer, M.; Marchi, S.; De Sanctis, M. C.; Russell, C.; Raymond, C. A.

    2015-12-01

    We are conducting a geologic mapping investigation of the Ac-H-14 Yalode Quadrangle (21-66°S, 270-360°E) of Ceres to examine its surface geology and geologic history. At the time of this writing, geologic mapping has been performed on Dawn Framing Camera (FC) mosaics from the late Approach phase (up to 1.3 km/px) and Survey orbit (415 m/px), including clear filter and color images and digital terrain models derived from stereo images. In Fall 2015 images from the High Altitude Mapping Orbit (140 m/px) will be used to refine the mapping, followed by the Low Altitude Mapping Orbit (35 m/px) starting in December 2015. The Yalode Quadrangle is dominated by the ~300-km diameter impact basin Yalode and includes rugged and smooth terrains to the east. Yalode basin has a variably preserved rim, which is continuous and sharply defined to the north/northwest and irregular or degraded elsewhere; the basin may have an interior ring structure. The basin floor includes hummocky and smooth areas (some bounded by scarps), crater chains, and a lineated zone. High-resolution images will be used to search for volcanic features on the basin floor and in association with basin structures. Yalode basin and its floor deposits appear to have been strongly affected by the Urvara impact to the west. Impact craters in Yalode Quadrangle display a range of preservation states. Degraded features, including Yalode basin and numerous smaller craters, exhibit subdued rims, lack discrete ejecta deposits, and have infilled interiors. More pristine features (including the large unnamed basin in the SE corner of the quadrangle and craters on the Yalode basin floor) have well-defined, quasi-circular forms with prominent rims and, in some cases, discernible ejecta. Some of these craters have bowl-shaped interiors and others contain hills or mounds on their floors. Support of the Dawn Instrument, Operations, and Science Teams is acknowledged. This work is supported by grants from NASA, MPG, and DLR.

  7. Exogenic olivine on Vesta from Dawn Framing Camera color data

    NASA Astrophysics Data System (ADS)

    Nathues, Andreas; Hoffmann, Martin; Schäfer, Michael; Thangjam, Guneshwar; Le Corre, Lucille; Reddy, Vishnu; Christensen, Ulrich; Mengel, Kurt; Sierks, Holger; Vincent, Jean-Baptist; Cloutis, Edward A.; Russell, Christopher T.; Schäfer, Tanja; Gutierrez-Marques, Pablo; Hall, Ian; Ripken, Joachim; Büttner, Irene

    2015-09-01

    In this paper we present the results of a global survey of olivine-rich lithologies on (4) Vesta. We investigated Dawn Framing Camera (FC) High Altitude Mapping Orbit (HAMO) color cubes (∼60 m/pixel resolution) by using a method described in Thangjam et al. (Thangjam, G., Nathues, A., Mengel, K., Hoffmann, M., Schäfer, M., Reddy, V., Cloutis, E.A., Christensen, U., Sierks, H., Le Corre, L., Vincent, J.-B, Russell, C.T. [2014b]. Meteorit. Planet. Sci. arXiv:1408.4687 [astro-ph.EP]). In total we identified 15 impact craters exhibiting olivine-rich (>40 wt.% ol) outcrops on their inner walls, some showing olivine-rich material also in their ejecta and floors. Olivine-rich sites are concentrated in the Bellicia, Arruntia and Pomponia region on Vesta's northern hemisphere. From our multi-color and stratigraphic analysis, we conclude that most, if not all, of the olivine-rich material identified is of exogenic origin, i.e. remnants of A- or/and S-type projectiles. The olivine-rich lithologies in the north are possibly ejecta of the ∼90 km diameter Albana crater. We cannot draw a final conclusion on their relative stratigraphic succession, but it seems that the dark material (Nathues, A., Hoffmann, M., Cloutis, E.A., Schäfer, M., Reddy, V., Christensen, U., Sierks, H., Thangjam, G.S., Le Corre, L., Mengel, K., Vincent, J.-B., Russell, C.T., Prettyman, T., Schmedemann, N., Kneissl, T., Raymond, C., Gutierrez-Marques, P., Hall, I. Büttner, I. [2014b]. Icarus (239, 222-237)) and the olivine-rich lithologies are of a similar age. The origin of some potential olivine-rich sites in the Rheasilvia basin and at crater Portia are ambiguous, i.e. these are either of endogenic or exogenic origin. However, the small number and size of these sites led us to conclude that olivine-rich mantle material, containing more than 40 wt.% of olivine, is basically absent on the present surface of Vesta. In combination with recent impact models of Veneneia and Rheasilvia (Clenet, H., Jutzi, M., Barrat, J.-A., Gillet, Ph. [2014]. Lunar Planet Sci. 45, #1349; Jutzi, M., Asphaug, E., Gillet, P., Barrat, J.-A., Benz, W. [2013]. Nature 494, 207-210), which predict an excavation depth of up to 80 km, we are confident that the crust-mantle depth is significantly deeper than predicted by most evolution models (30 km; Mittlefehldt, D.W. [2014]. Asteroid 4 Vesta: A Fully Differentiated Dwarf Planet. NASA Technical Reports Server (20140004857.pdf)) or, alternatively, the olivine-content of the (upper) mantle is lower than our detection limit, which would lead to the conclusion that Vesta's parent material was already depleted in olivine compared to CI meteorites.

  8. Resolved spectrophotometric properties of the Ceres surface from Dawn Framing Camera images

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Mottola, S.; Carsenty, U.; Ciarniello, M.; Jaumann, R.; Li, J.-Y.; Longobardo, A.; Palmer, E.; Pieters, C.; Preusker, F.; Raymond, C. A.; Russell, C. T.

    2017-05-01

    We present a global spectrophotometric characterization of the Ceres surface using Dawn Framing Camera (FC) images. We identify the photometric model that yields the best results for photometrically correcting images. Corrected FC images acquired on approach to Ceres were assembled into global maps of albedo and color. Generally, albedo and color variations on Ceres are muted. The albedo map is dominated by a large, circular feature in Vendimia Planitia, known from HST images (Li et al., 2006), and dotted by smaller bright features mostly associated with fresh-looking craters. The dominant color variation over the surface is represented by the presence of "blue" material in and around such craters, which has a negative spectral slope over the visible wavelength range when compared to average terrain. We also mapped variations of the phase curve by employing an exponential photometric model, a technique previously applied to asteroid Vesta (Schröder et al., 2013b). The surface of Ceres scatters light differently from Vesta in the sense that the ejecta of several fresh-looking craters may be physically smooth rather than rough. High albedo, blue color, and physical smoothness all appear to be indicators of youth. The blue color may result from the desiccation of ejected material that is similar to the phyllosilicates/water ice mixtures in the experiments of Poch et al. (2016). The physical smoothness of some blue terrains would be consistent with an initially liquid condition, perhaps as a consequence of impact melting of subsurface water ice. We find red terrain (positive spectral slope) near Ernutet crater, where De Sanctis et al. (2017) detected organic material. The spectrophotometric properties of the large Vendimia Planitia feature suggest it is a palimpsest, consistent with the Marchi et al. (2016) impact basin hypothesis. The central bright area in Occator crater, Cerealia Facula, is the brightest on Ceres with an average visual normal albedo of about 0.6 at a resolution of 1.3 km per pixel (six times Ceres average). The albedo of fresh, bright material seen inside this area in the highest resolution images (35 m per pixel) is probably around unity. Cerealia Facula has an unusually steep phase function, which may be due to unresolved topography, high surface roughness, or large average particle size. It has a strongly red spectrum whereas the neighboring, less-bright, Vinalia Faculae are neutral in color. We find no evidence for a diurnal ground fog-type haze in Occator as described by Nathues et al. (2015). We can neither reproduce their findings using the same images, nor confirm them using higher resolution images. FC images have not yet offered direct evidence for present sublimation in Occator.
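
    The core of the photometric correction described above is to divide each observed reflectance by a model evaluated at the observed illumination geometry and rescale it to a standard geometry. The sketch below uses a Lommel-Seeliger disk function and a single-exponential phase curve as stand-ins; the disk function choice, the phase-curve width, and the reference phase angle are assumptions for illustration, not the model fitted to the FC data in the paper.

        import numpy as np

        def lommel_seeliger(inc, emi):
            """Lommel-Seeliger disk function mu0 / (mu0 + mu); angles in degrees."""
            mu0, mu = np.cos(np.radians(inc)), np.cos(np.radians(emi))
            return mu0 / (mu0 + mu)

        def exp_phase(alpha_deg, nu=60.0):
            """Exponential phase curve exp(-alpha / nu); nu is an assumed width in degrees."""
            return np.exp(-alpha_deg / nu)

        def photometric_correction(refl, inc, emi, alpha, alpha_ref=30.0):
            """Reduce an observed reflectance to a standard geometry.

            The disk function, phase-curve width, and reference phase angle are
            placeholders; they are not the parameters fitted to the FC images.
            """
            model = lommel_seeliger(inc, emi) * exp_phase(alpha)
            model_ref = 0.5 * exp_phase(alpha_ref)  # disk function equals 0.5 at i = e = 0
            return refl * model_ref / model

        # Example: one pixel observed at i = 40 deg, e = 20 deg, phase angle 45 deg.
        print(photometric_correction(0.032, 40.0, 20.0, 45.0))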

  9. Hypervelocity impact studies using a rotating mirror framing laser shadowgraph camera

    NASA Technical Reports Server (NTRS)

    Parker, Vance C.; Crews, Jeanne Lee

    1988-01-01

    The need to study the effects of the impact of micrometeorites and orbital debris on various space-based systems has brought together the technologies of several companies and individuals in order to provide a successful instrumentation package. A light gas gun was employed to accelerate small projectiles to speeds in excess of 7 km/sec. Their impact on various targets is being studied with the help of a specially designed continuous-access rotating-mirror framing camera. The camera provides 80 frames of data at up to 1 x 10^6 frames/sec with exposure times of 20 nsec.

  10. Animation of Ceres on Approach

    NASA Image and Video Library

    2015-02-04

    This is a frame from an animation showcasing a series of images NASA Dawn spacecraft took on approach to Ceres on Feb. 4, 2015 at a distance of about 90,000 miles (145,000 kilometers) from the dwarf planet. http://photojournal.jpl.nasa.gov/catalog/PIA19182

  11. Multiple-frame IR photo-recorder KIT-3M

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E; Wilkins, P; Nebeker, N

    2006-05-15

    This paper reports the experimental results of a high-speed multi-frame infrared camera developed in Sarov at VNIIEF. Earlier [1] we discussed the possibility of creating a multi-frame infrared photo-recorder with a framing frequency of about 1 MHz. The basis of the photo-recorder is a semiconductor ionization camera [2, 3], which converts IR radiation in the 1-10 micrometer spectral range into a visible image. Several sequential thermal images are registered by using the IR converter in conjunction with a multi-frame electron-optical camera. In the present report we discuss the performance characteristics of a prototype commercial 9-frame high-speed IR photo-recorder. The image converter records infrared images of thermal fields corresponding to temperatures ranging from 300 °C to 2000 °C with an exposure time of 1-20 µs at a frame frequency of up to 500 kHz. The IR photo-recorder camera is useful for recording the time evolution of thermal fields in fast processes such as gas dynamics, ballistics, pulsed welding, thermal processing, the automotive industry, aircraft construction, and pulsed-power electric experiments, and for the measurement of spatial mode characteristics of IR-laser radiation.

  12. Dog and Cat Interactions in a Remote Aboriginal Community.

    PubMed

    Kennedy, Brooke; Brown, Wendy Y; Vernes, Karl; Körtner, Gerhard; Butler, James R A

    2018-04-26

    This study examined dog and cat demographics, roaming behaviours, and interspecific interactions in a remote Aboriginal island community using multiple methods. Our results revealed temporal differences between the roaming behaviours of dogs, cats, and wildlife. Dogs showed crepuscular behaviour, being active around dawn (5:30 a.m. to 9:30 a.m.) and dusk (6:00 p.m. to 11:35 p.m.). The majority of cats were active between dawn (6:30 a.m.) and dusk (7:30 p.m.) and travelled shorter distances than dogs. However, some cats were also observed roaming between dusk and dawn, and were likely to be hunting since flightless wildlife were also recorded on our remote-sensing cameras during this time. These baseline data provide evidence to suggest that new management programs are needed to reduce the number of roaming cats and therefore their potential impacts on native wildlife. Collaboration between Aboriginal owners and other stakeholders is necessary to design innovative and effective animal management and policy on the island.

  13. Object tracking using multiple camera video streams

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions, which can leave an object in partial or full view in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in the direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked over long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.
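
    A minimal software sketch of such a pipeline is given below: frame differencing finds moving blobs in each synchronized stream, camera-B detections are mapped into camera-A coordinates with a perspective transform, and the pooled detections are associated to existing tracks by nearest neighbor. The thresholds and the homography are placeholders, not values from the paper.

        import numpy as np
        from scipy import ndimage

        def moving_blobs(prev, curr, thresh=25, min_area=50):
            """Centroids (x, y) of regions that changed between two grayscale frames."""
            mask = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > thresh
            labels, n = ndimage.label(mask)
            centroids = []
            for lbl in range(1, n + 1):
                ys, xs = np.nonzero(labels == lbl)
                if ys.size >= min_area:
                    centroids.append((xs.mean(), ys.mean()))
            return centroids

        def to_reference_view(points, H):
            """Map camera-B detections into camera-A coordinates with a 3x3 homography H."""
            mapped = []
            for x, y in points:
                u, v, w = H @ np.array([x, y, 1.0])
                mapped.append((u / w, v / w))
            return mapped

        def update_tracks(tracks, detections, max_jump=40.0):
            """Greedy nearest-neighbor association in the reference view.

            `tracks` is a list of paths (lists of (x, y) positions). `detections`
            pools camera-A blobs with perspective-corrected camera-B blobs, so a
            track can survive occlusion in one view while the other still sees it.
            """
            for path in tracks:
                px, py = path[-1]
                if detections:
                    d, best = min((np.hypot(x - px, y - py), (x, y)) for x, y in detections)
                    if d < max_jump:
                        path.append(best)
            return tracks

        # Example wiring per time step (frames and the homography H_ba are placeholders):
        # dets_a = moving_blobs(prev_a, curr_a)
        # dets_b = to_reference_view(moving_blobs(prev_b, curr_b), H_ba)
        # tracks = update_tracks(tracks, dets_a + dets_b)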

  14. A Reconfigurable Real-Time Compressive-Sampling Camera for Biological Applications

    PubMed Central

    Fu, Bo; Pitter, Mark C.; Russell, Noah A.

    2011-01-01

    Many applications in biology, such as long-term functional imaging of neural and cardiac systems, require continuous high-speed imaging. This is typically not possible, however, using commercially available systems. The frame rate and the recording time of high-speed cameras are limited by the digitization rate and the capacity of on-camera memory. Further restrictions are often imposed by the limited bandwidth of the data link to the host computer. Even if the system bandwidth is not a limiting factor, continuous high-speed acquisition results in very large volumes of data that are difficult to handle, particularly when real-time analysis is required. In response to this issue many cameras allow a predetermined, rectangular region of interest (ROI) to be sampled, however this approach lacks flexibility and is blind to the image region outside of the ROI. We have addressed this problem by building a camera system using a randomly-addressable CMOS sensor. The camera has a low bandwidth, but is able to capture continuous high-speed images of an arbitrarily defined ROI, using most of the available bandwidth, while simultaneously acquiring low-speed, full frame images using the remaining bandwidth. In addition, the camera is able to use the full-frame information to recalculate the positions of targets and update the high-speed ROIs without interrupting acquisition. In this way the camera is capable of imaging moving targets at high-speed while simultaneously imaging the whole frame at a lower speed. We have used this camera system to monitor the heartbeat and blood cell flow of a water flea (Daphnia) at frame rates in excess of 1500 fps. PMID:22028852
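
    The bandwidth trade described above, continuous low-speed full frames plus a high-speed ROI sharing one fixed pixel-rate budget, can be made concrete with a small calculation. The numbers in the example are arbitrary and are not the specifications of the camera in the paper.

        def roi_frame_rate(pixel_budget, full_shape, full_rate, roi_shape):
            """Achievable ROI frame rate when a fixed pixel-rate budget is split
            between continuous full-frame readout and a high-speed ROI.

            All numbers are illustrative; a real budget also depends on sensor
            addressing overhead and the camera-to-host link.
            """
            full_pixels = full_shape[0] * full_shape[1]
            roi_pixels = roi_shape[0] * roi_shape[1]
            remaining = pixel_budget - full_pixels * full_rate
            if remaining <= 0:
                raise ValueError("full-frame readout alone exceeds the pixel budget")
            return remaining / roi_pixels

        # Example: 40 Mpixel/s budget, 640x480 full frames at 10 fps, 64x64 ROI.
        print(roi_frame_rate(40e6, (480, 640), 10, (64, 64)))  # about 9,000 fps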

  15. High speed photography, videography, and photonics III; Proceedings of the Meeting, San Diego, CA, August 22, 23, 1985

    NASA Technical Reports Server (NTRS)

    Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)

    1985-01-01

    Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.

  16. Neutron Absorption Measurements Constrain Eucrite-Diogenite Mixing in Vesta's Regolith

    NASA Technical Reports Server (NTRS)

    Prettyman, T. H.; Mittlefehldt, D. W.; Feldman, W. C.; Hendricks, J. S.; Lawrence, D. J.; Peplowski, P. N.; Toplis, M. J.; Yamashita, N.; Beck, A.; LeCorre, L.; hide

    2013-01-01

    The NASA Dawn Mission's Gamma Ray and Neutron Detector (GRaND) [1] acquired mapping data during 5 months in a polar, low altitude mapping orbit (LAMO) with an approx. 460-km mean radius around main-belt asteroid Vesta (264-km mean radius) [2]. Neutrons and gamma rays are produced by galactic cosmic ray interactions and by the decay of natural radioelements (K, Th, U), providing information about the elemental composition of Vesta's regolith to depths of a few decimeters beneath the surface. From the data acquired in LAMO, maps of vestan neutron and gamma ray signatures were determined with a spatial resolution of approx. 300 km full-width-at-half-maximum (FWHM), comparable in scale to the Rheasilvia impact basin (approx. 500 km diameter). The data from the Vesta encounter are available from the NASA Planetary Data System. Based on an analysis of gamma-ray spectra, Vesta's global-average regolith composition was found to be consistent with the Howardite, Eucrite, and Diogenite (HED) meteorites, reinforcing the HED-Vesta connection [2-7]. Further, an analysis of epithermal neutrons revealed variations in the abundance of hydrogen on Vesta's surface, reaching values up to 400 µg/g [2]. The association of high concentrations of hydrogen with equatorial, low-albedo surface regions indicated exogenic delivery of hydrogen by the infall of carbonaceous chondrite (CC) materials. This finding was buttressed by the presence of minimally-altered CC clasts in howardites, with inferred bulk hydrogen abundances similar to that found by GRaND, and by studies using data from Dawn's Framing Camera (FC) and VIR instruments [8-10]. In addition, from an analysis of neutron absorption, spatial variations in the abundance of elements other than hydrogen were detected [2].

  17. a Spatio-Spectral Camera for High Resolution Hyperspectral Imaging

    NASA Astrophysics Data System (ADS)

    Livens, S.; Pauly, K.; Baeck, P.; Blommaert, J.; Nuyts, D.; Zender, J.; Delauré, B.

    2017-08-01

    Imaging with a conventional frame camera from a moving remotely piloted aircraft system (RPAS) is by design very inefficient. Less than 1 % of the flying time is used for collecting light. This unused potential can be utilized by an innovative imaging concept, the spatio-spectral camera. The core of the camera is a frame sensor with a large number of hyperspectral filters arranged on the sensor in stepwise lines. It combines the advantages of frame cameras with those of pushbroom cameras. By acquiring images in rapid succession, such a camera can collect detailed hyperspectral information, while retaining the high spatial resolution offered by the sensor. We have developed two versions of a spatio-spectral camera and used them in a variety of conditions. In this paper, we present a summary of three missions with the in-house developed COSI prototype camera (600-900 nm) in the domains of precision agriculture (fungus infection monitoring in experimental wheat plots), horticulture (crop status monitoring to evaluate irrigation management in strawberry fields) and geology (meteorite detection on a grassland field). Additionally, we describe the characteristics of the 2nd generation, commercially available ButterflEYE camera offering extended spectral range (475-925 nm), and we discuss future work.
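
    To illustrate how stepwise filter lines on a frame sensor accumulate into a hyperspectral cube as the platform moves, the sketch below assumes filter lines that are a fixed number of rows tall and a scene that advances by a known number of rows per frame; real processing must first co-register the frames geometrically, which is not modelled here.

        import numpy as np

        def build_cube(frames, step, frame_shift):
            """Accumulate a hyperspectral cube from a spatio-spectral frame sequence.

            Assumes filter lines `step` rows tall and a scene that advances by
            `frame_shift` rows between consecutive frames (ideal forward motion,
            no registration errors). `frames` has shape (n_frames, height, width).
            """
            n_frames, height, width = frames.shape
            n_bands = height // step
            n_lines = height + (n_frames - 1) * frame_shift  # ground lines covered
            cube = np.full((n_lines, width, n_bands), np.nan)
            for k, frame in enumerate(frames):
                for row in range(height):
                    band = row // step                   # which filter line covers this row
                    ground_line = row + k * frame_shift  # fixed along-track coordinate
                    cube[ground_line, :, band] = frame[row]
            return cube

        # Example: 10 frames from a 600-row sensor with 20-row filter lines.
        rng = np.random.default_rng(0)
        frames = rng.random((10, 600, 100))
        print(build_cube(frames, step=20, frame_shift=20).shape)  # (780, 100, 30)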

  18. Ceres Sharper Than Ever Animation

    NASA Image and Video Library

    2015-01-27

    This frame from an animation of the dwarf planet Ceres was made by combining images taken by the Dawn spacecraft on January 25, 2015. These images represent the highest-resolution views of the dwarf planet to date. http://photojournal.jpl.nasa.gov/catalog/PIA19171

  19. Ceres RC3 Animation

    NASA Image and Video Library

    2015-05-11

    In this closest-yet view of Ceres, the brightest spots within a crater in the northern hemisphere are revealed to be composed of many smaller spots. This frame is from an animation of sequences taken by NASA Dawn spacecraft on May 4, 2015. http://photojournal.jpl.nasa.gov/catalog/PIA19547

  20. Integration of image capture and processing: beyond single-chip digital camera

    NASA Astrophysics Data System (ADS)

    Lim, SukHwan; El Gamal, Abbas

    2001-05-01

    An important trend in the design of digital cameras is the integration of capture and processing onto a single CMOS chip. Although integrating the components of a digital camera system onto a single chip significantly reduces system size and power, it does not fully exploit the potential advantages of integration. We argue that a key advantage of integration is the ability to exploit the high-speed imaging capability of the CMOS image sensor to enable new applications such as multiple capture for enhancing dynamic range, and to improve the performance of existing applications such as optical flow estimation. Conventional digital cameras operate at low frame rates, and it would be too costly, if not infeasible, to operate their chips at high frame rates. Integration solves this problem. The idea is to capture images at much higher frame rates than the standard frame rate, process the high frame rate data on chip, and output the video sequence and the application-specific data at the standard frame rate. This idea is applied to optical flow estimation, where significant performance improvements are demonstrated over methods using standard frame rate sequences. We then investigate the constraints on memory size and processing power that can be integrated with a CMOS image sensor in a 0.18 micrometer process and below. We show that enough memory and processing power can be integrated to be able to not only perform the functions of a conventional camera system but also to perform applications such as real-time optical flow estimation.
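
    One of the applications mentioned above, multiple capture for enhanced dynamic range, can be sketched in a few lines: several exposures taken within one standard frame period are merged by keeping, for each pixel, the longest exposure that has not saturated. The merge rule and numbers below are generic assumptions, not the chip architecture analyzed in the paper.

        import numpy as np

        def multiple_capture_hdr(captures, exposure_times, full_scale=4095):
            """Merge several captures taken within one standard frame period.

            For each pixel the longest exposure that has not saturated is kept
            and rescaled to a common exposure, extending dynamic range. This is
            a generic sketch; an on-chip version would work on rows or blocks
            as they are read out.
            """
            captures = np.asarray(captures, dtype=np.float64)
            times = np.asarray(exposure_times, dtype=np.float64)
            result = np.full(captures.shape[1:], np.nan)
            for idx in np.argsort(times)[::-1]:               # longest exposure first
                usable = (captures[idx] < 0.95 * full_scale) & np.isnan(result)
                result[usable] = captures[idx][usable] * (times.max() / times[idx])
            # Pixels saturated even in the shortest exposure are clipped.
            result[np.isnan(result)] = full_scale * (times.max() / times.min())
            return result

        # Example: three captures at 1, 1/4 and 1/16 of the frame time.
        rng = np.random.default_rng(0)
        scene = rng.uniform(0, 120000, (4, 4))                # "true" radiance
        times = np.array([1.0, 0.25, 0.0625])
        caps = [np.clip(scene * t / 16.0, 0, 4095) for t in times]
        print(multiple_capture_hdr(caps, times))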

  1. Visible camera imaging of plasmas in Proto-MPEX

    NASA Astrophysics Data System (ADS)

    Mosby, R.; Skeen, C.; Biewer, T. M.; Renfro, R.; Ray, H.; Shaw, G. C.

    2015-11-01

    The prototype Material Plasma Exposure eXperiment (Proto-MPEX) is a linear plasma device being developed at Oak Ridge National Laboratory (ORNL). This machine plans to study plasma-material interaction (PMI) physics relevant to future fusion reactors. Measurements of plasma light emission will be made on Proto-MPEX using fast, visible framing cameras. The cameras utilize a global shutter, which allows a full frame image of the plasma to be captured and compared at multiple times during the plasma discharge. Typical exposure times are ~10-100 microseconds. The cameras are capable of capturing images at up to 18,000 frames per second (fps). However, the frame rate is strongly dependent on the size of the ``region of interest'' that is sampled. The maximum ROI corresponds to the full detector area of ~1000x1000 pixels. The cameras have an internal gain, which controls the sensitivity of the 10-bit detector. The detector includes a Bayer filter for ``true-color'' imaging of the plasma emission. This presentation will examine the optimized camera settings for use on Proto-MPEX. This work was supported by the U.S. D.O.E. under contract DE-AC05-00OR22725.

  2. Real time heart rate variability assessment from Android smartphone camera photoplethysmography: Postural and device influences.

    PubMed

    Guede-Fernandez, F; Ferrer-Mileo, V; Ramos-Castro, J; Fernandez-Chimeno, M; Garcia-Gonzalez, M A

    2015-01-01

    The aim of this paper is to present a smartphone-based system for real-time pulse-to-pulse (PP) interval time-series acquisition by frame-to-frame camera image processing. The developed smartphone application acquires image frames from the built-in rear camera at the maximum available rate (30 Hz), and the smartphone GPU is used via the Renderscript API for high-performance frame-by-frame image acquisition and computation in order to obtain the PPG signal and the PP interval time series. The relative error of the mean heart rate is negligible. In addition, the influences of measurement posture and smartphone model on the beat-to-beat error of heart rate and HRV indices have been analyzed. The standard deviation of the beat-to-beat error (SDE) was 7.81 ± 3.81 ms in the worst case. Furthermore, in the supine posture a significant device influence on the SDE was found, the SDE being lower with the Samsung S5 than with the Motorola X. This approach can be used to analyze the reliability of different smartphone models for HRV assessment from real-time processing of Android camera frames.
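
    The frame-to-frame processing chain described above, one PPG sample per camera frame, beat detection, then PP intervals and HRV indices, can be sketched as follows. The channel choice, detrending, and peak-detection settings are illustrative assumptions rather than the paper's GPU implementation.

        import numpy as np
        from scipy.signal import find_peaks

        FS = 30.0  # camera frame rate in Hz

        def frame_to_sample(frame_rgb):
            """One PPG sample per frame: mean of the red channel, which is
            modulated by blood volume under the flash-lit fingertip."""
            return frame_rgb[..., 0].mean()

        def pp_intervals(ppg, fs=FS):
            """Pulse-to-pulse intervals (ms) from a PPG trace via peak detection."""
            baseline = np.convolve(ppg, np.ones(15) / 15, mode="same")
            peaks, _ = find_peaks(ppg - baseline, distance=int(0.4 * fs))  # beats >= 0.4 s apart
            return np.diff(peaks) / fs * 1000.0

        def hrv_indices(pp_ms):
            """Two common time-domain HRV indices."""
            return {"SDNN_ms": np.std(pp_ms, ddof=1),
                    "RMSSD_ms": np.sqrt(np.mean(np.diff(pp_ms) ** 2))}

        # Synthetic 75-bpm pulse standing in for samples from frame_to_sample().
        t = np.arange(0, 60, 1 / FS)
        ppg = np.sin(2 * np.pi * 1.25 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
        print(hrv_indices(pp_intervals(ppg)))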

  3. Ambient-Light-Canceling Camera Using Subtraction of Frames

    NASA Technical Reports Server (NTRS)

    Morookian, John Michael

    2004-01-01

    The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period. Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and would be illuminated with only ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values. To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient to both (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
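
    The frame-subtraction step at the heart of the ALCC concept is simple to illustrate: with the LED pulsed on during alternate frame periods, subtracting each background-only frame from its signal-plus-background partner leaves only the LED-illuminated reflections. The sketch below shows that arithmetic only; ROI readout and motion handling from the proposed camera are not modelled.

        import numpy as np

        def cancel_ambient(frames, led_on_first=True):
            """Subtract consecutive frame pairs to remove the ambient background.

            `frames` is a sequence of 2-D arrays captured with the LED pulsed on
            during alternate frame periods. Each output frame keeps only the
            LED-illuminated signal.
            """
            out = []
            for i in range(0, len(frames) - 1, 2):
                on, off = (frames[i], frames[i + 1]) if led_on_first else (frames[i + 1], frames[i])
                diff = on.astype(np.int32) - off.astype(np.int32)
                out.append(np.clip(diff, 0, None).astype(np.uint16))
            return out

        # Example: constant ambient background plus an LED glint in alternate frames.
        ambient = np.full((8, 8), 200, dtype=np.uint16)
        glint = np.zeros_like(ambient)
        glint[3, 4] = 900
        frames = [ambient + glint, ambient, ambient + glint, ambient]
        print(cancel_ambient(frames)[0][3, 4])  # 900, background removed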

  4. Apollo 12 photography 70 mm, 16 mm, and 35 mm frame index

    NASA Technical Reports Server (NTRS)

    1970-01-01

    For each 70-mm frame, the index presents information on: (1) the focal length of the camera, (2) the photo scale at the principal point of the frame, (3) the selenographic coordinates at the principal point of the frame, (4) the percentage of forward overlap of the frame, (5) the sun angle (medium, low, high), (6) the quality of the photography, (7) the approximate tilt (minimum and maximum) of the camera, and (8) the direction of tilt. A brief description of each frame is also included. The index to the 16-mm sequence photography includes information concerning the approximate surface coverage of the photographic sequence and a brief description of the principal features shown. A column of remarks is included to indicate: (1) if the sequence is plotted on the photographic index map and (2) the quality of the photography. The pictures taken using the lunar surface closeup stereoscopic camera (35 mm) are also described in this same index format.

  5. Motion-Blur-Free High-Speed Video Shooting Using a Resonant Mirror

    PubMed Central

    Inoue, Michiaki; Gu, Qingyi; Takaki, Takeshi; Ishii, Idaku; Tajima, Kenji

    2017-01-01

    This study proposes a novel concept of actuator-driven frame-by-frame intermittent tracking for motion-blur-free video shooting of fast-moving objects. The camera frame and shutter timings are controlled for motion blur reduction in synchronization with a free-vibration-type actuator vibrating with a large amplitude at hundreds of hertz so that motion blur can be significantly reduced in free-viewpoint high-frame-rate video shooting for fast-moving objects by deriving the maximum performance of the actuator. We develop a prototype of a motion-blur-free video shooting system by implementing our frame-by-frame intermittent tracking algorithm on a high-speed video camera system with a resonant mirror vibrating at 750 Hz. It can capture 1024 × 1024 images of fast-moving objects at 750 fps with an exposure time of 0.33 ms without motion blur. Several experimental results for fast-moving objects verify that our proposed method can reduce image degradation from motion blur without decreasing the camera exposure time. PMID:29109385

  6. Constructing a Database from Multiple 2D Images for Camera Pose Estimation and Robot Localization

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Ansar, Adnan I.; Brennan, Shane; Clouse, Daniel S.; Padgett, Curtis W.

    2012-01-01

    The LMDB (Landmark Database) Builder software identifies persistent image features (landmarks) in a scene viewed multiple times and precisely estimates the landmarks' 3D world positions. The software receives as input multiple 2D images of approximately the same scene, along with an initial guess of the camera poses for each image, and a table of features matched pair-wise in each frame. LMDB Builder aggregates landmarks across an arbitrarily large collection of frames with matched features. Range data from stereo vision processing can also be passed in to improve the initial guess of the 3D point estimates. The LMDB Builder aggregates feature lists across all frames, manages the process to promote selected features to landmarks, and iteratively calculates the 3D landmark positions using the current camera pose estimations (via an optimal ray projection method), and then improves the camera pose estimates using the 3D landmark positions. Finally, it extracts image patches for each landmark from auto-selected key frames and constructs the landmark database. The landmark database can then be used to estimate future camera poses (and therefore localize a robotic vehicle that may be carrying the cameras) by matching current imagery to landmark database image patches and using the known 3D landmark positions to estimate the current pose.
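
    The iterative 3D calculation described above alternates between estimating landmark positions from the current camera poses and refining the poses. The sketch below shows only the first half of that loop, a least-squares intersection of viewing rays standing in for the ray projection step; it is a generic formulation, not the tool's code.

        import numpy as np

        def triangulate_from_rays(centers, directions):
            """Least-squares 3D point closest to a set of viewing rays.

            `centers` are camera positions and `directions` point toward the
            feature in each frame (from the current pose estimates).
            """
            A = np.zeros((3, 3))
            b = np.zeros(3)
            for c, d in zip(centers, directions):
                d = d / np.linalg.norm(d)
                P = np.eye(3) - np.outer(d, d)  # projector onto the plane normal to d
                A += P
                b += P @ c
            return np.linalg.solve(A, b)

        # Example: two cameras on the x-axis, both looking at the point (0, 0, 10).
        centers = [np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
        target = np.array([0.0, 0.0, 10.0])
        dirs = [target - c for c in centers]
        print(triangulate_from_rays(centers, dirs))  # approximately [0, 0, 10]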

  7. KSC-07pd2578

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- The Delta II rocket is revealed as the mobile service tower, or gantry (at right), is retracted on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission during its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field, and thus, bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  8. KSC-07pd2582

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- Rollback of the mobile service tower, or gantry, from the Delta II rocket is complete on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission during its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field, and thus, bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  9. KSC-07pd2408

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers ensure the upper transportation canister is securely attached to the lower segments. The canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn mission will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide data relating to the gravity field and thus bulk properties and internal structure of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  10. KSC-07pd2579

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- A worker monitors the progress of the retraction of the mobile service tower, or gantry, from the Delta II rocket on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific investigations over its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide information on the gravity field, and thus the bulk properties and internal structure, of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  11. KSC-07pd2581

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- The Delta II rocket stands ready for launch following rollback of the mobile service tower, or gantry, on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific investigations over its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide information on the gravity field, and thus the bulk properties and internal structure, of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  12. KSC-07pd2580

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- The Delta II rocket stands ready for launch following rollback of the mobile service tower, or gantry, on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific investigations over its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide information on the gravity field, and thus the bulk properties and internal structure, of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  13. KSC-07pd2577

    NASA Image and Video Library

    2007-09-26

    KENNEDY SPACE CENTER, FLA. -- The Delta II rocket is revealed as the mobile service tower, or gantry (at left), is retracted on Launch Pad 17B at Cape Canaveral Air Force Station. Starting with a boost from this higher thrust version of the Delta II rocket, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific investigations over its nearly decade-long mission, Dawn will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide information on the gravity field, and thus the bulk properties and internal structure, of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch is targeted for Sept. 27 during a window that extends from 7:20 to 7:49 a.m. EDT. Photo credit: NASA/Kim Shiflett

  14. Oxo Crater on (1) Ceres: Geological History and the Role of Water-ice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nathues, A.; Platz, T.; Hoffmann, M.

    Dwarf planet Ceres (∅ ∼ 940 km) is the largest object in the main asteroid belt. Investigations suggest that Ceres is a thermally evolved, volatile-rich body with potential geological activity, a body that was never completely molten, but one that possibly partially differentiated into a rocky core and an ice-rich mantle, and may contain remnant internal liquid water. Thermal alteration and the infall of exogenic material contribute to producing a (dark) carbonaceous chondritic-like surface containing ammoniated phyllosilicates. Here we report imaging and spectroscopic analyses of data on the bright Oxo crater derived from the Framing Camera and the Visible and Infrared Spectrometer on board the Dawn spacecraft. We confirm that the transitional complex crater Oxo (∅ ∼ 9 km) exhibits exposed surface water-ice. We show that this water-ice-rich material is associated exclusively with two lobate deposits at pole-facing scarps, deposits that also contain carbonates and admixed phyllosilicates. Due to Oxo's location at −4802 m below the cerean reference ellipsoid and its very young age of only 190 ka (1σ: +100 ka, −70 ka), Oxo is predestined for ongoing water-ice sublimation.

  15. Encrypting Digital Camera with Automatic Encryption Key Deletion

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2007-01-01

    A digital video camera includes an image sensor capable of producing a frame of video data representing an image viewed by the sensor, an image memory for storing video data such as previously recorded frame data in a video frame location of the image memory, a read circuit for fetching the previously recorded frame data, an encryption circuit having an encryption key input connected to receive the previously recorded frame data from the read circuit as an encryption key, an un-encrypted data input connected to receive the frame of video data from the image sensor and an encrypted data output port, and a write circuit for writing a frame of encrypted video data received from the encrypted data output port of the encryption circuit to the memory and overwriting the video frame location storing the previously recorded frame data.
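
    To illustrate the key-chaining idea summarized above (each new frame is encrypted with the previously recorded frame acting as the encryption key, and the ciphertext then overwrites that key frame), the following minimal Python sketch shows one possible software analogue. It is not the patented circuit: the SHA-256 key derivation, the XOR keystream, and all function names are illustrative assumptions.

        import hashlib
        import numpy as np

        def keystream(key: bytes, length: int) -> bytes:
            """Expand a key into a pseudo-random keystream by counter hashing
            (illustrative stand-in for the patent's encryption circuit)."""
            out = bytearray()
            counter = 0
            while len(out) < length:
                out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
                counter += 1
            return bytes(out[:length])

        def encrypt_frame(new_frame: np.ndarray, stored_frame: np.ndarray) -> np.ndarray:
            """Encrypt new_frame using the previously recorded frame as the key."""
            key = hashlib.sha256(stored_frame.tobytes()).digest()
            ks = np.frombuffer(keystream(key, new_frame.size), dtype=np.uint8)
            return np.bitwise_xor(new_frame.ravel(), ks).reshape(new_frame.shape)

        # Simulated image memory: the encrypted frame overwrites the frame that
        # served as the key, so the key is automatically destroyed.
        rng = np.random.default_rng(0)
        image_memory = rng.integers(0, 256, (480, 640), dtype=np.uint8)  # previous frame
        sensor_frame = rng.integers(0, 256, (480, 640), dtype=np.uint8)  # new exposure
        image_memory = encrypt_frame(sensor_frame, image_memory)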

  16. An ultrahigh-speed color video camera operating at 1,000,000 fps with 288 frame memories

    NASA Astrophysics Data System (ADS)

    Kitamura, K.; Arai, T.; Yonai, J.; Hayashida, T.; Kurita, T.; Maruyama, H.; Namiki, J.; Yanagi, T.; Yoshida, T.; van Kuijk, H.; Bosiers, Jan T.; Saita, A.; Kanayama, S.; Hatade, K.; Kitagawa, S.; Etoh, T. Goji

    2008-11-01

    We developed an ultrahigh-speed color video camera that operates at 1,000,000 fps (frames per second) and has the capacity to store 288 frame memories. In 2005, we developed an ultrahigh-speed, high-sensitivity portable color camera with a 300,000-pixel single CCD (ISIS-V4: In-situ Storage Image Sensor, Version 4). Its ultrahigh-speed shooting capability of 1,000,000 fps was made possible by directly connecting CCD storages, which record video images, to the photodiodes of individual pixels. The number of consecutive frames was 144. However, longer capture times were demanded when the camera was used during imaging experiments and for some television programs. To increase ultrahigh-speed capture times, we used a beam splitter and two ultrahigh-speed 300,000-pixel CCDs. The beam splitter was placed behind the pickup lens, with one CCD located at each of its two outputs. The CCD driving unit was developed to drive the two CCDs separately, and the recording period of the two CCDs was switched sequentially. This increased the recording capacity to 288 images, a factor-of-two increase over that of the conventional ultrahigh-speed camera. A drawback was that the beam splitter reduced the incident light on each CCD by a factor of two. To improve the light sensitivity, we developed a microlens array for use with the ultrahigh-speed CCDs. We simulated the operation of the microlens array in order to optimize its shape and then fabricated it using stamping technology. Using this microlens array increased the light sensitivity of the CCDs by approximately a factor of two. By using a beam splitter in conjunction with the microlens array, it was possible to make an ultrahigh-speed color video camera that has 288 frame memories without decreasing the camera's light sensitivity.

  17. Investigations of Ceres's Craters with Straightened Rim

    NASA Astrophysics Data System (ADS)

    Frigeri, A.; De Sanctis, M. C.; Ammannito, E.; Raponi, A.; Formisano, M.; Ciarniello, M.; Magni, G.; Combe, J. P.; Marchi, S.; Raymond, C. A.; Schwartz, S. J.

    2017-12-01

    Dwarf planet Ceres hosts some geological features that are unique in the solar system because its composition, rich in aqueously-altered silicates, is usually found on full-size planets, whereas its mean radius is smaller than that of most natural satellites in the solar system. For example, the local high-albedo, carbonate-rich areas, or faculae, are specific to Ceres; also, the absence of large impact crater structures is key to understanding the overall mechanical behaviour of the cerean crust. After the first findings of water ice occurring in the shadowed areas of craters on Ceres by the NASA/Dawn mission (1, 2), we analyzed the morphology of craters, looking for features similar to the ones where the water-ice composition has been detected in data from the VIR spectrometer (3). These craters fall outside of the family of polygonal craters, which are mainly related to regional- or global-scale tectonics (4). We analyzed the morphology on the basis of the global mosaic, the digital terrain model derived using the stereo photogrammetry method, and the single data frames of the Framing Camera. Our investigation started from crater Juling, which is characterized by a portion of the rim that forms a straight segment instead of an arc of a circle. This linear crater wall is also steep enough that it forms a cliff that lies in shadow in all images acquired by Dawn. Very smooth and bright deposits lie at the foot of this crater-wall cliff. We then identified several other relatively fresh craters, with radii of 2 to 10 kilometers, showing one or two sectors of the crater rim truncated by a mass-wasting process, probably a rockfall. Our first analysis shows that, in the selected craters, the truncated sectors are always in the north-eastern sector of the rim for craters in the southern hemisphere. Conversely, the craters in the northern hemisphere exhibit a truncated rim in their south-eastern sector. Although a more detailed analysis is mandatory, these first observations are particularly intriguing, as they would correlate the mechanical behaviour of the cerean crust with the presence of ground ice and the illumination conditions. (1) Platz et al., 2016, Nature Communications. (2) Raponi et al., submitted to Science Advances. (3) Combe et al., submitted to Icarus. (4) Otto et al., LPSC 2017

  18. The appearance of Carbonaceous Chondrites on (1) Ceres from observations by the Dawn Framing Camera

    NASA Astrophysics Data System (ADS)

    Schäfer, Tanja; Schäfer, Michael; Mengel, Kurt; Cloutis, Edward A.; Izawa, Matthew R. M.; Thangjam, Guneshwar; Hoffmann, Martin; Platz, Thomas; Nathues, Andreas; Kallisch, Jan; Ripken, Joachim; Russell, Christopher T.

    2016-04-01

    NASA's Dawn spacecraft reached dwarf planet Ceres in March 2015 and started data acquisition using three different instruments. These are the Framing Camera (FC; [1]), the Visible & Infrared Spectrometer (VIR; [2]), and the Gamma Ray and Neutron Detector (GRaND; [3]). In our work we focus on the potential appearance of carbonaceous chondritic (CC) material on the cerean surface using Dawn FC color mosaics covering the VIS/NIR wavelength region. In preparation for the Dawn arrival at Ceres, a discrimination scheme for CC groups using FC color ratios was developed by [4]; it is based on 121 CC laboratory spectra compiled from RELAB. As the cerean surface material mainly differs by its spectral slope over the whole FC wavelength range (0.44-0.97 μm), we classified the color mosaics by this parameter. We applied the CC discrimination scheme only to those regions on the cerean surface (more than 90 %) which exhibit spectral slopes ≥ -1 % reflectance per μm, to exclude the strongly negative-sloped regions of large young craters such as Occator, Haulani, and Oxo. These are not likely to be similar to pure CC material, as can be seen from their brightness and their bluish spectral slope [5]. We found that the surface material of Ceres is, among the suite of CCs, most similar to Ivuna samples artificially heated to 200 and 300°C [6] and to unusual CCs, which naturally experienced heating. The latter comprise Dhofar 225, Y-86789 and Y-82162, which have been determined to have undergone aqueous alteration and subsequent thermal metamorphism (e.g. [7,8]). Our comparison with VIR data shows that the spectra of Ivuna heated to 200°C and 300°C match well the OH absorption at 2.7 μm but do not show the smaller 3.05-3.1 μm absorption observed on Ceres [9,10,11]. Nevertheless, the remarkably flat UV drop-off detected on the cerean surface may, at least spectrally, correspond to highly aqueously altered and subsequently thermally metamorphosed CC material. Further alteration of this material on a parent body like Ceres may produce spectral changes affecting the 3 μm region, while showing no additional modification in the VIS/NIR region. Scenarios of thermal and geophysical evolution models allow Ceres' differentiation into a core of dehydrated silicates and a shell of hydrated silicates overlain by an icy shell [12,13]. The widespread occurrence of material on the cerean surface spectrally similar to thermally altered CC material suggests that we possibly see the mineralogy of the hydrated-dehydrated boundary of Ceres exposed by impact gardening and simultaneous loss of the icy shell. Recent models of a convecting mud ocean on Ceres, introduced by [14] and enhanced by [15], also allow a lag deposit of aqueously altered fine material on the surface, spectrally corresponding to mildly heated Ivuna samples. References: [1] Sierks, H. et al. 2011. Space Sci. Rev., 163, 1-4, 263-327. [2] De Sanctis, C.M. et al. 2011. Space Sci. Rev., 163, 1-4, 329-369. [3] Prettyman, T.H. et al. 2011. Space Sci. Rev., 163, 1-4, 371-459. [4] Schäfer, T. et al., 2015. Icarus 265, 149-160. [5] Nathues, A. et al., 2015. Nature 528 (7581), 237-240. [6] Hiroi, T. et al., 1996. Lunar Planet. Sci. 27, 551. [7] Brearley, A.J., Jones, R.H., 1998. Chondritic meteorites. In: Planetary Materials, Papike, J.J. (Ed.). Rev. in Mineralogy and Geochem. 36 (1), ch. 3, 1-398. [8] Ivanova, M.A. et al., 2010. Meteoritics & Planet. Sci. 45 (7), 1108-1123. [9] King, T.V.V., et al., 1992. Science 255, 1551-1553. [10] De Sanctis, M.C. et al., 2015. Nature 528 (7581), 241-244. [11] Milliken, R.E., Rivkin, A.S., 2009. Nature Geosci. 2 (4), 258-261. [12] Castillo-Rogez, J.C., McCord, T.B., 2010. Icarus 205 (2), 443-459. [13] Neveu, M., Desch, S.J., Castillo-Rogez, J.C., 2015. J. Geophys. Res. Planets 120 (2), 123-154. [14] Travis, B.J. et al., 2015. Lunar Planet. Sci., #2360. [15] Neveu, M., Desch, S.J., 2015. Geophys. Res. Lett. 42 (23), 10197-10206.
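
    As a rough illustration of the slope screening described above, the sketch below computes a per-pixel VIS/NIR spectral slope from two FC reflectance mosaics and masks the regions to which the CC discrimination scheme would be applied. The array names, the normalization to the 0.44 μm reflectance, and the synthetic input are assumptions for illustration; only the 0.44/0.97 μm band positions and the -1 %/μm cutoff come from the abstract.

        import numpy as np

        def spectral_slope_percent_per_um(r_044: np.ndarray, r_097: np.ndarray) -> np.ndarray:
            """Per-pixel spectral slope in % reflectance per micron, normalised
            to the 0.44 um reflectance (an assumed convention)."""
            d_lambda = 0.97 - 0.44  # micron
            return 100.0 * (r_097 - r_044) / (r_044 * d_lambda)

        def cc_candidate_mask(r_044: np.ndarray, r_097: np.ndarray, cutoff: float = -1.0) -> np.ndarray:
            """True where the slope is >= cutoff, i.e. where the CC comparison is
            applied; strongly blue-sloped fresh-crater material is excluded."""
            return spectral_slope_percent_per_um(r_044, r_097) >= cutoff

        # Synthetic reflectance mosaics standing in for photometrically corrected FC data.
        rng = np.random.default_rng(1)
        r_044 = np.full((512, 512), 0.09)
        r_097 = r_044 * (1.0 + 0.02 * rng.standard_normal((512, 512)))
        mask = cc_candidate_mask(r_044, r_097)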

  19. Ceres' Yellow Spots - Observations with Dawn Framing Camera

    NASA Astrophysics Data System (ADS)

    Schäfer, Michael; Schäfer, Tanja; Cloutis, Edward A.; Izawa, Matthew R. M.; Platz, Thomas; Castillo-Rogez, Julie C.; Hoffmann, Martin; Thangjam, Guneshwar S.; Kneissl, Thomas; Nathues, Andreas; Mengel, Kurt; Williams, David A.; Kallisch, Jan; Ripken, Joachim; Russell, Christopher T.

    2016-04-01

    The Framing Camera (FC) onboard the Dawn spacecraft acquired several spectral data sets of (1) Ceres with increasing spatial resolution (up to 135 m/pixel with nearly global coverage). The FC is equipped with seven color filters (0.4-1.0 μm) plus one panchromatic ('clear') filter [1]. We produced spectral mosaics using photometrically corrected FC color filter images as described in [2]. Even early FC color mosaics obtained during Dawn's approach unexpectedly exhibited quite a diversity of surface materials on Ceres. Besides the ordinary cerean surface material, potentially composed of ammoniated phyllosilicates [3] or some other alteration product of carbonaceous chondrites [4], a large number of bright spots were found on Ceres [5]. These spots are substantially brighter than the average surface (exceeding its triple standard deviation), with the spots within Occator crater being the brightest and most prominent examples (reflectance more than 10 times the average of Ceres). We also observed bright spots which stand out by their obvious yellow color. This yellow color appears both in a 'true color' RGB display (R=0.65, G=0.55, B=0.44 μm) as well as in a false color display (R=0.97, G=0.75, B=0.44 μm) using a linear 2% stretch. Their spectra show a steep red slope between 0.44 and 0.55 μm (UV drop-off). In contrast to these yellow spots, the vast majority of bright spots appear white in the aforementioned color displays and exhibit blue-sloped spectra, except for a shallow UV drop-off. Thus, yellow spots are easily distinguishable from white spots and the remaining cerean surface by their high values in the ratio 0.55/0.44 μm. We found 8 occurrences of yellow spots on Ceres. Most of them (>70 individual spots) occur both inside and outside crater Dantu, where white spots are also found in the immediate vicinity. Besides Dantu, further occurrences with only a few yellow spots were found at craters Ikapati and Gaue. Less definite occurrences are found at 97°E/24°N, 205°E/22°S, 244°E/31°S, 213°E/37.5°S, and at Azacca crater. Often, the yellow spots exhibit well-defined boundaries, but sometimes we found a fainter diffuse yellow tinge around them, enclosing several individual yellow spots. Rarely, they are associated with mass wasting on steep slopes, most notably on the SE crater wall of Dantu. Recently acquired clear filter images with 35 m/pixel resolution indicate that only a small number of yellow spots are situated near craters. These craters could also be interpreted as pits probably formed by exhalation vents. More frequently, we found yellow spots linked to small positive landforms. Only a few of the yellow spots seem to be interrelated with crater floor fractures. As with the white bright spots, which were interpreted as evaporite deposits of magnesium-sulfate salts [5], the yellow spots appear to emerge from the sub-surface as a result of material transport, possibly driven by sublimation of ice [5], where vents or cracks penetrate the insulating lag deposits. However, in contrast to the white spots, a different mineralogy seems to have emerged at the yellow spots. First comparisons of FC spectra with laboratory spectra indicate pyrite/marcasite as a possible component. The relatively strong UV drop-off may at least indicate some kind of sulfide- or sulfur-bearing mixture. As identifications of minerals based on FC spectra are often ambiguous, further investigations using high-resolution data yet to come from Dawn's VIR spectrometer may shed light on the compositional differences between yellow and white bright spots. References: [1] Sierks, H. et al., Space Sci. Rev., 163, 263-327, 2011. [2] Schäfer, M. et al., EPSC, Vol. 10, #488, 2015. [3] De Sanctis, M. C. et al., Nature 528, 241-244, 2015. [4] Schäfer, T. et al., EGU, #12370, 2016. [5] Nathues, A. et al., Nature 528, 237-240, 2015.
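
    The band-ratio screening mentioned above (yellow spots stand out through a high 0.55/0.44 μm ratio combined with anomalous brightness) can be sketched as follows. The triple-standard-deviation brightness criterion is taken from the abstract, whereas the ratio threshold, array names, and synthetic data are hypothetical placeholders rather than values from the study.

        import numpy as np

        def yellow_spot_mask(r_044, r_055, r_broadband, ratio_threshold=1.25, brightness_sigma=3.0):
            """Flag pixels that are both anomalously bright and strongly red-sloped
            between 0.44 and 0.55 um (a steep UV drop-off)."""
            bright = r_broadband > (r_broadband.mean() + brightness_sigma * r_broadband.std())
            red_uv = (r_055 / r_044) > ratio_threshold
            return bright & red_uv

        # Synthetic stand-ins for photometrically corrected FC filter mosaics.
        rng = np.random.default_rng(2)
        r_044 = 0.08 + 0.002 * rng.standard_normal((256, 256))
        r_055 = 0.09 + 0.002 * rng.standard_normal((256, 256))
        r_broadband = 0.5 * (r_044 + r_055)
        spots = yellow_spot_mask(r_044, r_055, r_broadband)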

  20. Flexible nuclear medicine camera and method of using

    DOEpatents

    Dilmanian, F.A.; Packer, S.; Slatkin, D.N.

    1996-12-10

    A nuclear medicine camera and method of use photographically record radioactive decay particles emitted from a source, for example a small, previously undetectable breast cancer, inside a patient. The camera includes a flexible frame containing a window, a photographic film, and a scintillation screen, with or without a gamma-ray collimator. The frame flexes for following the contour of the examination site on the patient, with the window being disposed in substantially abutting contact with the skin of the patient for reducing the distance between the film and the radiation source inside the patient. The frame is removably affixed to the patient at the examination site for allowing the patient mobility to wear the frame for a predetermined exposure time period. The exposure time may be several days for obtaining early qualitative detection of small malignant neoplasms. 11 figs.

  1. Computational Studies of X-ray Framing Cameras for the National Ignition Facility

    DTIC Science & Technology

    2013-06-01

    The NIF is the world's most powerful laser facility and is ... a phosphor screen where the output is recorded. The x-ray framing cameras have provided excellent information. As the yields at NIF have increased ... experiments on the NIF. The basic operation of these cameras is shown in Fig. 1. Incident photons generate photoelectrons both in the pores of the MCP and ...

  2. Earth Observations taken by Expedition 41 crewmember

    NASA Image and Video Library

    2014-09-13

    ISS041-E-013683 (13 Sept. 2014) --- Photographed with a mounted automated camera, this is one of a number of images featuring the European Space Agency's Automated Transfer Vehicle (ATV-5 or Georges Lemaitre) docked with the International Space Station. Except for color changes, the images are almost identical. The variation in color from frame to frame is due to the camera's response to the motion of the orbital outpost, relative to the illumination from the sun.

  3. Earth Observations taken by Expedition 41 crewmember

    NASA Image and Video Library

    2014-09-13

    ISS041-E-013687 (13 Sept. 2014) --- Photographed with a mounted automated camera, this is one of a number of images featuring the European Space Agency's Automated Transfer Vehicle (ATV-5 or Georges Lemaitre) docked with the International Space Station. Except for color changes, the images are almost identical. The variation in color from frame to frame is due to the camera's response to the motion of the orbital outpost, relative to the illumination from the sun.

  4. Earth Observations taken by Expedition 41 crewmember

    NASA Image and Video Library

    2014-09-13

    ISS041-E-013693 (13 Sept. 2014) --- Photographed with a mounted automated camera, this is one of a number of images featuring the European Space Agency's Automated Transfer Vehicle (ATV-5 or Georges Lemaitre) docked with the International Space Station. Except for color changes, the images are almost identical. The variation in color from frame to frame is due to the camera's response to the motion of the orbital outpost, relative to the illumination from the sun.

  5. A combined microphone and camera calibration technique with application to acoustic imaging.

    PubMed

    Legg, Mathew; Bradley, Stuart

    2013-10-01

    We present a calibration technique for an acoustic imaging microphone array, combined with a digital camera. Computer vision and acoustic time of arrival data are used to obtain microphone coordinates in the camera reference frame. Our new method allows acoustic maps to be plotted onto the camera images without the need for additional camera alignment or calibration. Microphones and cameras may be placed in an ad-hoc arrangement and, after calibration, the coordinates of the microphones are known in the reference frame of a camera in the array. No prior knowledge of microphone positions, inter-microphone spacings, or air temperature is required. This technique is applied to a spherical microphone array and a mean difference of 3 mm was obtained between the coordinates obtained with this calibration technique and those measured using a precision mechanical method.
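
    The time-of-arrival part of this calibration can be illustrated with a small sketch: assuming the sound-source positions are already known in the camera reference frame (from the computer-vision step) and each time of arrival has been converted to a range with an assumed speed of sound, one microphone's coordinates follow from a nonlinear least-squares fit. The solver choice, function names, and synthetic geometry below are illustrative, not the paper's algorithm.

        import numpy as np
        from scipy.optimize import least_squares

        def locate_microphone(source_xyz: np.ndarray, ranges: np.ndarray) -> np.ndarray:
            """Estimate a microphone position (camera frame) from ranges to known
            source positions by nonlinear least squares."""
            def residuals(p):
                return np.linalg.norm(source_xyz - p, axis=1) - ranges
            p0 = source_xyz.mean(axis=0)  # crude initial guess
            return least_squares(residuals, p0).x

        # Synthetic check: six sources around a microphone at (0.10, -0.05, 0.30) m.
        rng = np.random.default_rng(3)
        sources = rng.uniform(-1.0, 1.0, (6, 3))
        mic_true = np.array([0.10, -0.05, 0.30])
        ranges = np.linalg.norm(sources - mic_true, axis=1) + 0.001 * rng.standard_normal(6)
        mic_est = locate_microphone(sources, ranges)  # agrees with mic_true to a few mm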

  6. Driving techniques for high frame rate CCD camera

    NASA Astrophysics Data System (ADS)

    Guo, Weiqiang; Jin, Longxu; Xiong, Jingwu

    2008-03-01

    This paper describes a high-frame-rate CCD camera capable of operating at 100 frames/s. The camera utilizes the Kodak KAI-0340, an interline transfer CCD with 640 × 480 pixels. Two output ports are used to read out the CCD data at pixel rates approaching 30 MHz. Because of the reduced effective opacity of its vertical charge transfer registers, an interline transfer CCD can produce undesired image artifacts, such as random white spots and smear generated in the registers. To increase the frame rate, a speed-up structure has been incorporated inside the KAI-0340, which makes it vulnerable to a vertical stripe effect. The phenomena mentioned above may severely impair the image quality. To solve these problems, some electronic methods of eliminating these artifacts are adopted. A special clocking mode dumps the unwanted charge quickly, and the fast readout of the images, cleared of smear, follows immediately. An amplifier is used to sense and correct the delay mismatch between the dual-phase vertical clock pulses so that the transition edges become nearly coincident and the vertical stripes disappear. Results obtained with the CCD camera are shown.

  7. Sequential detection of web defects

    DOEpatents

    Eichel, Paul H.; Sleefe, Gerard E.; Stalker, K. Terry; Yee, Amy A.

    2001-01-01

    A system for detecting defects on a moving web having a sequential series of identical frames uses an imaging device to form a real-time camera image of a frame and a comparitor to comparing elements of the camera image with corresponding elements of an image of an exemplar frame. The comparitor provides an acceptable indication if the pair of elements are determined to be statistically identical; and a defective indication if the pair of elements are determined to be statistically not identical. If the pair of elements is neither acceptable nor defective, the comparitor recursively compares the element of said exemplar frame with corresponding elements of other frames on said web until one of the acceptable or defective indications occur.
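
    The three-way comparison described in this patent abstract (acceptable, defective, or neither, with further frames consulted in the undecided case) can be sketched as below. The normalised-difference statistic, the two thresholds, and the conservative fallback are assumptions for illustration; the patent does not specify them.

        import numpy as np

        ACCEPT, DEFECT, UNDECIDED = "accept", "defect", "undecided"

        def compare_elements(elem: np.ndarray, ref: np.ndarray, t_accept: float = 1.0, t_defect: float = 4.0) -> str:
            """Compare one image element against a reference element using a
            noise-normalised mean absolute difference and two thresholds."""
            noise = ref.std() + 1e-6
            score = np.abs(elem.astype(float) - ref.astype(float)).mean() / noise
            if score < t_accept:
                return ACCEPT
            if score > t_defect:
                return DEFECT
            return UNDECIDED

        def classify_element(elem, exemplar_elem, elements_from_other_frames):
            """If the comparison with the exemplar is undecided, keep comparing
            against the corresponding element of further frames until a decision."""
            verdict = compare_elements(elem, exemplar_elem)
            for other in elements_from_other_frames:
                if verdict != UNDECIDED:
                    break
                verdict = compare_elements(elem, other)
            return verdict if verdict != UNDECIDED else ACCEPT  # conservative fallback

        # Example: a patch statistically identical to the exemplar is accepted.
        rng = np.random.default_rng(8)
        exemplar = rng.normal(100.0, 5.0, (32, 32))
        patch = exemplar + rng.normal(0.0, 5.0, (32, 32))
        print(classify_element(patch, exemplar, []))  # typically 'accept'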

  8. High-speed imaging using 3CCD camera and multi-color LED flashes

    NASA Astrophysics Data System (ADS)

    Hijazi, Ala; Friedl, Alexander; Cierpka, Christian; Kähler, Christian; Madhavan, Vis

    2017-11-01

    This paper demonstrates the possibility of capturing full-resolution, high-speed image sequences using a regular 3CCD color camera in conjunction with high-power light emitting diodes of three different colors. This is achieved using a novel approach, referred to as spectral-shuttering, where a high-speed image sequence is captured using short duration light pulses of different colors that are sent consecutively in very close succession. The work presented in this paper demonstrates the feasibility of configuring a high-speed camera system using low cost and readily available off-the-shelf components. This camera can be used for recording six-frame sequences at frame rates up to 20 kHz or three-frame sequences at even higher frame rates. Both color crosstalk and spatial matching between the different channels of the camera are found to be within acceptable limits. A small amount of magnification difference between the different channels is found and a simple calibration procedure for correcting the images is introduced. The images captured using the approach described here are of good quality to be used for obtaining full-field quantitative information using techniques such as digital image correlation and particle image velocimetry. A sequence of six high-speed images of a bubble splash recorded at 400 Hz is presented as a demonstration.

  9. Students' framing of laboratory exercises using infrared cameras

    NASA Astrophysics Data System (ADS)

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-12-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the predict-observe-explain approach of White and Gunstone. The activities involved central thermal concepts that focused on heat conduction and dissipative processes such as friction and collisions. Students' interactions within each activity were videotaped and the analysis focuses on how a purposefully selected group of three students engaged with the exercises. As the basis for an interpretative study, a "thick" narrative description of the students' epistemological and conceptual framing of the exercises and how they took advantage of the disciplinary affordance of IR cameras in the thermal domain is provided. Findings include that the students largely shared their conceptual framing of the four activities, but differed among themselves in their epistemological framing, for instance, in the extent to which they found it relevant to digress from the laboratory instructions when inquiring into thermal phenomena. In conclusion, the study unveils the disciplinary affordances of infrared cameras, in the sense of their use in providing access to knowledge about macroscopic thermal science.

  10. Flexible nuclear medicine camera and method of using

    DOEpatents

    Dilmanian, F. Avraham; Packer, Samuel; Slatkin, Daniel N.

    1996-12-10

    A nuclear medicine camera 10 and method of use photographically record radioactive decay particles emitted from a source, for example a small, previously undetectable breast cancer, inside a patient. The camera 10 includes a flexible frame 20 containing a window 22, a photographic film 24, and a scintillation screen 26, with or without a gamma-ray collimator 34. The frame 20 flexes for following the contour of the examination site on the patient, with the window 22 being disposed in substantially abutting contact with the skin of the patient for reducing the distance between the film 24 and the radiation source inside the patient. The frame 20 is removably affixed to the patient at the examination site for allowing the patient mobility to wear the frame 20 for a predetermined exposure time period. The exposure time may be several days for obtaining early qualitative detection of small malignant neoplasms.

  11. Full-frame, high-speed 3D shape and deformation measurements using stereo-digital image correlation and a single color high-speed camera

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2017-08-01

    Full-frame, high-speed 3D shape and deformation measurement using the stereo-digital image correlation (stereo-DIC) technique and a single high-speed color camera is proposed. With the aid of a skillfully designed pseudo stereo-imaging apparatus, color images of a test object surface, composed of blue and red channel images from two different optical paths, are recorded by a high-speed color CMOS camera. The recorded color images can be separated into red and blue channel sub-images using a simple but effective color crosstalk correction method. These separated blue and red channel sub-images are processed by the regular stereo-DIC method to retrieve the full-field 3D shape and deformation of the test object surface. Compared with existing two-camera high-speed stereo-DIC or four-mirror-adapter-assisted single-camera high-speed stereo-DIC, the proposed single-camera high-speed stereo-DIC technique offers the prominent advantage of full-frame measurements using a single high-speed camera without sacrificing its spatial resolution. Two real experiments, including shape measurement of a curved surface and vibration measurement of a Chinese double-sided drum, demonstrated the effectiveness and accuracy of the proposed technique.
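
    The channel-separation step described above (recovering the two optical paths from the red and blue channels of the colour sensor) amounts to inverting a small crosstalk matrix per pixel. The sketch below shows the idea; the crosstalk coefficients are hypothetical and would in practice be measured by imaging each illumination path on its own.

        import numpy as np

        # Assumed per-pixel model: measured = C @ true, where "true" holds the two views.
        C = np.array([[1.00, 0.08],   # red-channel response to [red path, blue path]
                      [0.05, 1.00]])  # blue-channel response (values are illustrative)

        def separate_channels(red_raw: np.ndarray, blue_raw: np.ndarray):
            """Invert the 2x2 crosstalk matrix per pixel to recover the two views."""
            C_inv = np.linalg.inv(C)
            stacked = np.stack([red_raw, blue_raw], axis=-1).astype(float)
            corrected = stacked @ C_inv.T
            return corrected[..., 0], corrected[..., 1]  # red-path view, blue-path view

        # Round-trip check on synthetic data.
        rng = np.random.default_rng(4)
        true_red = rng.uniform(0.0, 1.0, (480, 640))
        true_blue = rng.uniform(0.0, 1.0, (480, 640))
        raw = np.stack([true_red, true_blue], axis=-1) @ C.T
        red_view, blue_view = separate_channels(raw[..., 0], raw[..., 1])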

  12. 640x480 PtSi Stirling-cooled camera system

    NASA Astrophysics Data System (ADS)

    Villani, Thomas S.; Esposito, Benjamin J.; Davis, Timothy J.; Coyle, Peter J.; Feder, Howard L.; Gilmartin, Harvey R.; Levine, Peter A.; Sauer, Donald J.; Shallcross, Frank V.; Demers, P. L.; Smalser, P. J.; Tower, John R.

    1992-09-01

    A Stirling-cooled 3-5 micron camera system has been developed. The camera employs a monolithic 640 x 480 PtSi-MOS focal plane array. The camera system achieves an NEDT = 0.10 K at a 30 Hz frame rate with f/1.5 optics (300 K background). At a spatial frequency of 0.02 cycles/mRAD the vertical and horizontal Minimum Resolvable Temperature are in the range of MRT = 0.03 K (f/1.5 optics, 300 K background). The MOS focal plane array achieves a resolution of 480 TV lines per picture height independent of background level and position within the frame.

  13. Multi-camera synchronization core implemented on USB3 based FPGA platform

    NASA Astrophysics Data System (ADS)

    Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Dias, Morgado

    2015-03-01

    Centered on Awaiba's NanEye CMOS image sensor family and an FPGA platform with a USB3 interface, the aim of this paper is to demonstrate a new technique to synchronize up to 8 individual self-timed cameras with minimal error. Small form factor self-timed camera modules of 1 mm x 1 mm or smaller do not normally allow external synchronization. However, for stereo vision or 3D reconstruction with multiple cameras, as well as for applications requiring pulsed illumination, it is necessary to synchronize multiple cameras. In this work, the challenge of synchronizing multiple self-timed cameras with only a 4-wire interface has been solved by adaptively regulating the power supply of each camera. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame based on a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames, a Master-Slave interface was implemented. A single camera is defined as the Master, with its operating frequency being controlled directly through a PC based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera control module. This enables the remaining cameras to monitor the Master's line and frame period and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the implementation of 3D stereo vision equipment smaller than 3 mm in diameter in medical endoscopic contexts, such as endoscopic surgical robotics or micro-invasive surgery.
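
    The regulation loop described above (each slave camera's supply voltage is adjusted so that its measured line period tracks the master's) can be sketched as a simple proportional update. The gain, voltage limits, and the sign of the correction are illustrative assumptions; the actual relation between supply voltage and the sensor's operating frequency depends on the sensor and is not given in the abstract.

        def update_supply_voltage(v_now: float, t_line_measured: float, t_line_target: float,
                                  gain: float = 0.05, v_min: float = 1.6, v_max: float = 2.1) -> float:
            """Proportional correction of a slave camera's supply voltage from the
            relative line-period error (positive error assumed to mean 'too slow')."""
            error = (t_line_measured - t_line_target) / t_line_target
            v_new = v_now + gain * error  # sign and gain depend on the sensor's V-f characteristic
            return min(max(v_new, v_min), v_max)  # clamp to a safe supply range

        # Example: the slave measured 20.10 us per line while the master runs at 20.00 us.
        v_next = update_supply_voltage(v_now=1.80, t_line_measured=20.10e-6, t_line_target=20.00e-6)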

  14. Studies on the formation, temporal evolution and forensic applications of camera "fingerprints".

    PubMed

    Kuppuswamy, R

    2006-06-02

    A series of experiments was conducted by exposing negative film in brand new cameras of different makes and models. The exposures were repeated at regular time intervals spread over a period of 2 years. The processed film negatives were studied under a stereomicroscope (10-40x) in transmitted illumination for the presence of characterizing features on their four frame edges. These features were then related to those present on the masking frame of the cameras by examining the latter in reflected light stereomicroscopy (10-40x). The purpose of the study was to determine the origin and permanence of the frame-edge marks, and also the processes by which the marks may alter with time. The investigations arrived at the following conclusions: (i) the edge marks originate principally from imperfections imparted to the film mask during manufacture and also occasionally from dirt, dust and fiber accumulated on the film mask over an extended time period. (ii) The edge profiles of the cameras remain fixed over a considerable period of time, so that they constitute a valuable identification medium. (iii) The marks are found to vary in nature even between cameras manufactured at a similar time. (iv) The f/number and object distance have a great effect on the recording of the frame-edge marks during exposure of the film. The above findings would serve as a useful addition to the technique of camera edge-mark comparisons.

  15. Laser-Camera Vision Sensing for Spacecraft Mobile Robot Navigation

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Khalil, Ahmad S.; Dorais, Gregory A.; Gawdiak, Yuri

    2002-01-01

    The advent of spacecraft mobile robots (free-flying sensor platforms and communications devices intended to accompany astronauts or operate remotely on space missions, both inside and outside of a spacecraft) has demanded the development of a simple and effective navigation schema. One such system under exploration involves the use of a laser-camera arrangement to predict the relative positioning of the mobile robot. By projecting laser beams from the robot, a 3D reference frame can be introduced. Thus, as the robot shifts in position, the position reference frame produced by the laser images is correspondingly altered. Using the normalization and camera registration techniques presented in this paper, the relative translation and rotation of the robot in 3D are determined from these reference frame transformations.
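
    The last step mentioned above (recovering the robot's relative translation and rotation from how the laser-defined reference frame transforms) can be illustrated with the standard SVD-based rigid-alignment solution for two sets of corresponding 3D points. This is only a geometric sketch under that assumption; it is not the normalization and registration procedure of the paper.

        import numpy as np

        def rigid_transform(p_before: np.ndarray, p_after: np.ndarray):
            """Find R, t such that p_after ~= R @ p_before + t (points are Nx3),
            using the SVD (Kabsch) solution."""
            c0, c1 = p_before.mean(axis=0), p_after.mean(axis=0)
            H = (p_before - c0).T @ (p_after - c1)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
            R = Vt.T @ D @ U.T
            t = c1 - R @ c0
            return R, t

        # Synthetic check: laser reference points rotated 10 degrees about z and shifted.
        theta = np.deg2rad(10.0)
        R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                           [np.sin(theta),  np.cos(theta), 0.0],
                           [0.0,            0.0,           1.0]])
        t_true = np.array([0.05, -0.02, 0.10])
        pts = np.random.default_rng(5).uniform(-0.5, 0.5, (4, 3))
        R_est, t_est = rigid_transform(pts, pts @ R_true.T + t_true)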

  16. KSC-07pd2409

    NASA Image and Video Library

    2007-09-10

    KENNEDY SPACE CENTER, FLA. -- At Astrotech Space Operations in Titusville, Fla., workers ensure the upper transportation canister is securely attached to the lower segments. The transportation canister will protect the spacecraft and booster during transfer to Launch Pad 17-B at Cape Canaveral Air Force Station (CCAFS). During its nearly decade-long mission, the Dawn spacecraft will study the asteroid Vesta and dwarf planet Ceres, celestial bodies believed to have accreted early in the history of the solar system. To carry out its scientific mission, the Dawn spacecraft will carry a visible camera, a visible and infrared mapping spectrometer, and a gamma ray and neutron spectrometer, whose data will be used in combination to characterize these bodies. In addition to the three instruments, radiometric and optical navigation data will provide information on the gravity field, and thus the bulk properties and internal structure, of the two bodies. Data returned from the Dawn spacecraft could provide opportunities for significant breakthroughs in our knowledge of how the solar system formed. Launch via a Delta II rocket is scheduled in a window from 7:25 to 7:54 a.m. Sept. 26 from CCAFS. Photo credit: NASA/Jim Grossmann

  17. Vision Based SLAM in Dynamic Scenes

    DTIC Science & Technology

    2012-12-20

    the correct relative poses between cameras at frame F. For this purpose, we detect and match SURF features between cameras in different groups, and ... all cameras in such a challenging case. For a comparison, we disabled the 'inter-camera pose estimation' and applied the 'intra-camera pose estimation' ...

  18. Accurate estimation of camera shot noise in the real-time

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology and other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components. Temporal noise comprises the random component, while spatial noise comprises the pattern component. Temporal noise can be divided into signal-dependent shot noise and signal-independent dark temporal noise. For measurement of camera noise characteristics, the most widely used methods follow standards (for example, EMVA Standard 1288), which allow precise shot and dark temporal noise measurement but are difficult to implement and time-consuming. Earlier we proposed a method for measurement of the temporal noise of photo- and video cameras. It is based on the automatic segmentation of nonuniform targets (ASNT). Only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated the shot and dark temporal noises of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12 bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12 bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10 bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8 bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured. Using a standard computer, frames were registered and processed within a fraction of a second to several seconds. The accuracy of the obtained temporal noise values was also estimated.
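
    The two-frame idea referred to above can be illustrated with a generic photon-transfer-style sketch: for two registered frames of the same static, non-uniform scene, half the variance of their difference estimates the temporal noise at each signal level, and a linear fit of variance against mean separates the signal-dependent (shot) slope from the signal-independent (dark) offset. The binning, the fit, and the synthetic data below are illustrative; this is not the ASNT algorithm itself.

        import numpy as np

        def temporal_noise_curve(frame1: np.ndarray, frame2: np.ndarray, n_bins: int = 32):
            """Bin pixels by mean signal and estimate temporal variance per bin from
            the frame difference; fit variance vs. mean with a straight line."""
            mean_signal = (frame1.astype(float) + frame2.astype(float)) / 2.0
            diff = frame1.astype(float) - frame2.astype(float)
            bins = np.linspace(mean_signal.min(), mean_signal.max(), n_bins + 1)
            idx = np.digitize(mean_signal.ravel(), bins) - 1
            means, variances = [], []
            for b in range(n_bins):
                sel = idx == b
                if sel.sum() > 100:  # require enough samples per bin
                    means.append(mean_signal.ravel()[sel].mean())
                    variances.append(diff.ravel()[sel].var() / 2.0)
            means, variances = np.array(means), np.array(variances)
            slope, offset = np.polyfit(means, variances, 1)  # shot-noise slope, dark variance
            return means, variances, slope, offset

        # Synthetic camera: Poisson shot noise plus Gaussian dark/read noise (sigma = 4 DN).
        rng = np.random.default_rng(6)
        scene = rng.uniform(50.0, 3000.0, (512, 512))
        f1 = rng.poisson(scene) + rng.normal(0.0, 4.0, scene.shape)
        f2 = rng.poisson(scene) + rng.normal(0.0, 4.0, scene.shape)
        _, _, slope, offset = temporal_noise_curve(f1, f2)  # slope ~ 1, offset ~ 16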

  19. Optical flow estimation on image sequences with differently exposed frames

    NASA Astrophysics Data System (ADS)

    Bengtsson, Tomas; McKelvey, Tomas; Lindström, Konstantin

    2015-09-01

    Optical flow (OF) methods are used to estimate dense motion information between consecutive frames in image sequences. In addition to the specific OF estimation method itself, the quality of the input image sequence is of crucial importance to the quality of the resulting flow estimates. For instance, lack of texture in image frames caused by saturation of the camera sensor during exposure can significantly deteriorate the performance. An approach to avoid this negative effect is to use different camera settings when capturing the individual frames. We provide a framework for OF estimation on such sequences that contain differently exposed frames. Information from multiple frames is combined into a total cost functional such that the lack of an active data term for saturated image areas is avoided. Experimental results demonstrate that using alternate camera settings to capture the full dynamic range of an underlying scene can clearly improve the quality of flow estimates. When saturation of image data is significant, the proposed methods show superior performance in terms of lower endpoint errors of the flow vectors compared to a set of baseline methods. Furthermore, we provide some qualitative examples of how and when our method should be used.
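
    The core idea above (frames that are saturated at a pixel should not contribute a data term there, so that the combined cost functional always has usable data somewhere) can be sketched with a simple per-frame weighting. The weight function and threshold are illustrative assumptions; the paper's full variational formulation, including the regularisation term, is not reproduced here.

        import numpy as np

        def data_term_weights(frame: np.ndarray, saturation_level: float = 0.98) -> np.ndarray:
            """1 for usable pixels, 0 where the sensor is (nearly) saturated."""
            return (frame < saturation_level).astype(float)

        def combined_data_cost(residuals, frames, saturation_level: float = 0.98) -> np.ndarray:
            """Sum of weighted per-frame data costs; residuals[k] is the brightness-
            constancy residual of frame k for a candidate flow field."""
            total = np.zeros_like(residuals[0], dtype=float)
            for r, f in zip(residuals, frames):
                total += data_term_weights(f, saturation_level) * np.abs(r)
            return total

        # Example with two differently exposed frames of the same scene.
        rng = np.random.default_rng(9)
        short_exp = rng.uniform(0.0, 0.6, (64, 64))    # no saturation
        long_exp = np.clip(short_exp * 2.0, 0.0, 1.0)  # saturates in bright areas
        residuals = [rng.normal(0.0, 0.01, (64, 64)) for _ in range(2)]
        cost = combined_data_cost(residuals, [short_exp, long_exp])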

  20. Mineralogical Analysis of the Oppia Quadrangle of Asteroid (4) Vesta: Evidence for Occurrence of Moderate-Reflectance Hydrated Minerals

    NASA Technical Reports Server (NTRS)

    Tosi, F.; Frigeri, A.; Combe, J.-Ph.; Zambon, F.; De Sanctis, M. C.; Ammannito, E.; Longobardo, A.; Hoffmann, M.; Nathues, A.; Garry, W. B.; et al.

    2015-01-01

    Quadrangle Av-10 'Oppia' is one of five quadrangles that cover the equatorial region of asteroid (4) Vesta. This quadrangle is notable for the broad, spectrally distinct ejecta that extend south of the Oppia crater. These ejecta exhibit the steepest ('reddest') visible spectral slope observed across the asteroid and have distinct color properties as seen in multispectral composite images. Compared to previous works that focused on the composition and nature of unusual ('orange') ejecta found on Vesta, here we take into account a broader area that includes several features of interest, with an emphasis on mineralogy as inferred from data obtained by Dawn's Visible InfraRed mapping spectrometer (VIR). Our analysis shows that the older northern and northeastern part of Av-10 is dominated by howardite-like material, while the younger southwestern part, including Oppia and its ejecta blanket, has a markedly eucritic mineralogy. The association of the mineralogical information with the geologic and topographic contexts allows for the establishment of relationships between the age of the main formations observed in this quadrangle and their composition. A major point of interest in the Oppia quadrangle is the spectral signature of hydrous material seen at the local scale. This material can be mapped by using high-resolution VIR data, combined with multispectral image products from the Dawn Framing Camera (FC) so as to enable a clear correlation with specific geologic features. Hydrated mineral phases studied previously on Vesta generally correlate with low-albedo material delivered by carbonaceous asteroids. However, our analysis shows that the strongest OH signature in Av-10 is found in a unit west of Oppia, previously mapped as 'light mantle material' and showing moderate reflectance and a red visible slope. With the available data we cannot yet assess the presence of water in this material. However, we offer a possible explanation for its origin.

  1. Robust Video Stabilization Using Particle Keypoint Update and l1-Optimized Camera Path

    PubMed Central

    Jeon, Semi; Yoon, Inhye; Jang, Jinbeum; Yang, Seungji; Kim, Jisung; Paik, Joonki

    2017-01-01

    Acquisition of stabilized video is an important issue for various types of digital cameras. This paper presents an adaptive camera path estimation method using robust feature detection to remove shaky artifacts in a video. The proposed algorithm consists of three steps: (i) robust feature detection using particle keypoints between adjacent frames; (ii) camera path estimation and smoothing; and (iii) rendering to reconstruct a stabilized video. As a result, the proposed algorithm can estimate the optimal homography by redefining important feature points in the flat region using particle keypoints. In addition, stabilized frames with fewer holes can be generated from the optimal, adaptive camera path that minimizes a temporal total variation (TV). The proposed video stabilization method is suitable for enhancing the visual quality of various portable cameras and can be applied to robot vision, driving assistant systems, and visual surveillance systems. PMID:28208622
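
    The path-smoothing step mentioned above (an optimal camera path minimising a temporal total variation) can be sketched for a 1-D trajectory as a small convex problem: stay close to the raw path while penalising the L1 norm of its frame-to-frame increments. The use of cvxpy, the weighting, and the synthetic jittery path are assumptions for illustration; the paper's method additionally handles homographies and hole constraints, which are not shown.

        import numpy as np
        import cvxpy as cp

        def smooth_camera_path(raw_path: np.ndarray, lam: float = 10.0) -> np.ndarray:
            """L1/total-variation smoothing of a 1-D camera trajectory."""
            p = cp.Variable(raw_path.shape[0])
            cost = cp.sum_squares(p - raw_path) + lam * cp.norm1(cp.diff(p))
            cp.Problem(cp.Minimize(cost)).solve()
            return p.value

        # Example: a slow pan corrupted by high-frequency jitter.
        t = np.arange(200)
        raw = 0.5 * t + 5.0 * np.random.default_rng(7).standard_normal(200)
        smoothed = smooth_camera_path(raw)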

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaffney, Kelly

    Movies have transformed our perception of the world. With slow motion photography, we can see a hummingbird flap its wings, and a bullet pierce an apple. The remarkably small and extremely fast molecular world that determines how your body functions cannot be captured with even the most sophisticated movie camera today. To see chemistry in real time requires a camera capable of seeing molecules that are one ten billionth of a foot with a frame rate of 10 trillion frames per second! SLAC has embarked on the construction of just such a camera. Please join me as I discuss how this molecular movie camera will work and how it will change our perception of the molecular world.

  3. Geometrical calibration television measuring systems with solid state photodetectors

    NASA Astrophysics Data System (ADS)

    Matiouchenko, V. G.; Strakhov, V. V.; Zhirkov, A. O.

    2000-11-01

    Various optical measuring methods for deriving information about the size and form of objects are now used in different branches: mechanical engineering, medicine, art, and criminalistics. Measurement by means of digital television systems is one of these methods. The development of this direction is promoted by the appearance on the market of small-sized television cameras and frame grabbers of various types and costs. There are many television measuring systems that use expensive cameras, but the accuracy performance of low-cost cameras is also of interest to system developers. For this reason, the inexpensive mountingless camera SK1004CP (1/3" format, cost up to $40) and the Aver2000 frame grabber were used in the experiments.

  4. Earth Observation taken during the 41G mission

    NASA Image and Video Library

    2009-06-25

    41G-120-056 (October 1984) --- Parts of Israel, Lebanon, Palestine, Syria and Jordan and part of the Mediterranean Sea are seen in this nearly-vertical, large format camera's view from the Earth-orbiting Space Shuttle Challenger. The Sea of Galilee is at center frame and the Dead Sea at bottom center. The frame's center coordinates are 32.5 degrees north latitude and 35.5 degrees east longitude. A Linhof camera, using 4" x 5" film, was used to expose the frame through one of the windows on Challenger's aft flight deck.

  5. Sensor fusion of cameras and a laser for city-scale 3D reconstruction.

    PubMed

    Bok, Yunsu; Choi, Dong-Geol; Kweon, In So

    2014-11-04

    This paper presents a sensor fusion system of cameras and a 2D laser sensor for large-scale 3D reconstruction. The proposed system is designed to capture data on a fast-moving ground vehicle. The system consists of six cameras and one 2D laser sensor, and they are synchronized by a hardware trigger. Reconstruction of 3D structures is done by estimating frame-by-frame motion and accumulating vertical laser scans, as in previous works. However, our approach does not assume near 2D motion, but estimates free motion (including absolute scale) in 3D space using both laser data and image features. In order to avoid the degeneration associated with typical three-point algorithms, we present a new algorithm that selects 3D points from two frames captured by multiple cameras. The problem of error accumulation is solved by loop closing, not by GPS. The experimental results show that the estimated path is successfully overlaid on the satellite images, such that the reconstruction result is very accurate.

  6. Mapping Vesta Equatorial Quadrangle V-8EDL: Various Craters and Giant Grooves

    NASA Astrophysics Data System (ADS)

    Le Corre, L.; Nathues, A.; Reddy, V.; Buczkowski, D.; Denevi, B. W.; Gaffey, M.; Williams, D. A.; Garry, W. B.; Yingst, R.; Jaumann, R.; Pieters, C. M.; Russell, C. T.; Raymond, C. A.

    2011-12-01

    NASA's Dawn spacecraft arrived at the asteroid (4) Vesta on July 15, 2011, and is now collecting imaging, spectroscopic, and elemental abundance data during its one-year orbital mission. As part of the geological analysis of the surface, a series of 15 quadrangle maps are being produced based on Framing Camera images (FC; spatial resolution: ~65 m/pixel) along with Visible & Infrared Spectrometer data (VIR; spatial resolution: ~180 m/pixel) obtained during the High-Altitude Mapping Orbit (HAMO). This poster presentation concentrates on our geologic analysis and mapping of quadrangle V-8EDL, located between -22 and 22 degrees latitude and 144 and 216 degrees East longitude. This quadrangle is dominated by old craters (without any ejecta visible in the clear and color bands), but one small recent crater can be seen with a bright ejecta blanket and rays. The latter has some small, dark units outside and inside the crater rim that could be indicative of impact melt. This quadrangle also contains a set of giant linear grooves running almost parallel to the equator that might have formed subsequent to a large impact. We will use FC mosaics with clear images and false color composites as well as VIR spectroscopy data in order to constrain the geology and identify the nature of each unit present in this quadrangle.

  7. Mesosiderites on Vesta: A Hyperspectral VIS-NIR Investigation

    NASA Technical Reports Server (NTRS)

    Palomba, E.; Longobardo, A.; DeSanctis, M. C.; Mittlefehldt, D. W.; Ammannito, E.; Capaccioni, F.; Capria, M. T.; Frigeri, A.; Tosi, F.; Zambon, F.; et al.

    2013-01-01

    The origin of mesosiderites has been an open issue for several years. Mesosiderites are mixtures of silicate mineral fragments or clasts, embedded in a FeNi metal matrix. The silicates are very similar in mineralogy and texture to howardites [1]. This led some scientists to conclude that mesosiderites could come from the same parent asteroid as the howardite, eucrite and diogenite (HED) meteorites [2, 3]. Other studies found a number of differences between HEDs and mesosiderite silicates that could be explained only by separate parent asteroids [4]. Recently, high-precision oxygen isotope measurements of the mesosiderite silicate fraction were found to be isotopically identical to the HEDs, requiring a common parent body, i.e. (4) Vesta [5]. Another important element in favor of a common origin was given by the identification of a centimeter-sized mesosiderite clast in a howardite (Dar al Gani 779): a metal-rich inclusion with fragments of olivine, anorthite, and orthopyroxene plus minor amounts of chromite, tridymite, and troilite [6]. The Dawn mission, with its instruments the Visible and Infrared Mapping Spectrometer (VIR) [7], the Framing Camera [8] and the Gamma-Ray and Neutron Detector (GRaND) [9], confirmed that Vesta has a composition fully compatible with HED meteorites [10]. We investigate here the possibility of discerning mesosiderite-rich locations on the surface of Vesta by means of hyperspectral IR images.

  8. Polygonal Craters on Dwarf-Planet Ceres

    NASA Astrophysics Data System (ADS)

    Otto, K. A.; Jaumann, R.; Krohn, K.; Buczkowski, D. L.; von der Gathen, I.; Kersten, E.; Mest, S. C.; Preusker, F.; Roatsch, T.; Schenk, P. M.; Schröder, S.; Schulzeck, F.; Scully, J. E. C.; Stephan, K.; Wagner, R.; Williams, D. A.; Raymond, C. A.; Russell, C. T.

    2015-10-01

    With a diameter of approximately 950 km and a mass of about one third of the total mass of the asteroid belt, (1) Ceres is the largest and most massive object in the Main Asteroid Belt. As an intact proto-planet, Ceres is key to understanding the origin and evolution of the terrestrial planets [1]. In particular, the role of water during planet formation is of interest, because the differentiated dwarf planet is thought to possess a water-rich mantle overlying a rocky core [2]. The Dawn spacecraft arrived at Ceres in March 2015 after completing its mission at (4) Vesta. At Ceres, the on-board Framing Camera (FC) collected image data which revealed a large variety of impact crater morphologies, including polygonal craters (Figure 1). Polygonal craters show straight rim sections aligned to form an angular shape. They are commonly associated with fractures in the target material. Simple polygonal craters develop during the excavation stage, when the excavation flow propagates faster along preexisting fractures [3, 5]. Complex polygonal craters adopt their shape during the modification stage, when slumping along fractures is favoured [3]. Polygonal craters are known from a variety of planetary bodies including Earth [e.g. 4], the Moon [e.g. 5], Mars [e.g. 6], Mercury [e.g. 7], Venus [e.g. 8] and outer Solar System icy satellites [e.g. 9].

  9. Investigating the Origin of Bright Materials on Vesta: Synthesis, Conclusions, and Implications

    NASA Technical Reports Server (NTRS)

    Li, Jian-Yang; Mittlefehldt, D. W.; Pieters, C. M.; De Sanctis, M. C.; Schroder, S. E.; Hiesinger, H.; Blewett, D. T.; Russell, C. T.; Raymond, C. A.; Keller, H. U.

    2012-01-01

    The Dawn spacecraft started orbiting the second largest asteroid (4) Vesta in August 2011, revealing the details of its surface at an unprecedented pixel scale as small as approx. 70 m in Framing Camera (FC) clear and color filter images and approx. 180 m in the Visible and Infrared Spectrometer (VIR) data in its first two science orbits, the Survey Orbit and the High Altitude Mapping Orbit (HAMO) [1]. The surface of Vesta displays the greatest diversity in terms of geology and mineralogy of all asteroids studied in detail [2, 3]. While the albedo of Vesta of approx. 0.38 in the visible wavelengths [4, 5] is one of the highest among all asteroids, the surface of Vesta shows the largest variation of albedos found on a single asteroid, with geometric albedos ranging at least from approx. 0.10 to approx. 0.67 in HAMO images [5]. There are many distinctively bright and dark areas observed on Vesta, associated with various geological features and showing remarkably different forms. Here we report our initial attempt to understand the origin of the areas that are distinctively brighter than their surroundings. The dark materials on Vesta clearly are different in origin from bright materials and are reported in a companion paper [6].

  10. Universal ICT Picosecond Camera

    NASA Astrophysics Data System (ADS)

    Lebedev, Vitaly B.; Syrtzev, V. N.; Tolmachyov, A. M.; Feldman, Gregory G.; Chernyshov, N. A.

    1989-06-01

    The paper reports on the design of an ICT camera operating in the mode of linear or three-frame image scan. The camera incorporates two tubes: the time-analyzing ICT PIM-107 [1] with an S-11 cathode, and the brightness amplifier PMU-2V (gain about 10^4) for the image shaped by the first tube. The camera is designed on the basis of the streak camera AGAT-SF3 [2], with almost the same power sources but substantially modified pulse electronics. Schematically, the design of tube PIM-107 is depicted in the figure. The tube consists of cermet housing 1 and photocathode 2, made in a separate vacuum volume and introduced into the housing by means of a manipulator. In the direct vicinity of the photocathode, an accelerating electrode made of a fine-structure grid is located. An electrostatic lens formed by focusing electrode 4 and anode diaphragm 5 produces a beam of electrons with a "remote crossover". The authors have suggested this term for an electron beam whose crossover is 40 to 60 mm away from the anode diaphragm plane, which guarantees high sensitivity of scan plates 6 with respect to multiaperture framing diaphragm 7. Behind every diaphragm aperture there is a pair of deflecting plates 8, shielded from compensation plates 10 by diaphragm 9. The electronic image produced by the photocathode is focused on luminescent screen 11. The tube is controlled with the help of two saw-tooth voltages applied in antiphase across plates 6 and 10. Plates 6 serve for sweeping the electron beam over the surface of diaphragm 7; the beam is either allowed through toward the screen or stopped by the diaphragm walls. In such a manner, three frames are obtained, the number corresponding to that of the diaphragm apertures. Plates 10 serve to compensate the sweep so that the image streak is stopped on the screen. To avoid overlapping of frames, plates 8 receive static potentials responsible for shifting the frames on the screen. By changing the potentials applied to plates 8, one can control the spacing between frames and partially or fully overlap them. This sort of control is independent of the frame repetition rate and frame duration, and only determines frame positioning on the screen. Since diaphragm 7 is located in the area of the crossover, and the electron trajectories cross in the crossover, the frame is not decomposed into separate elements during its formation. The image is transferred onto the screen practically within the entire frame duration, increasing the aperture ratio of the tube compared to that in Ref. [3].

  11. A higher-speed compressive sensing camera through multi-diode design

    NASA Astrophysics Data System (ADS)

    Herman, Matthew A.; Tidman, James; Hewitt, Donna; Weston, Tyler; McMackin, Lenore

    2013-05-01

    Obtaining high frame rates is a challenge with compressive sensing (CS) systems that gather measurements in a sequential manner, such as the single-pixel CS camera. One strategy for increasing the frame rate is to divide the FOV into smaller areas that are sampled and reconstructed in parallel. Following this strategy, InView has developed a multi-aperture CS camera using an 8×4 array of photodiodes that essentially act as 32 individual simultaneously operating single-pixel cameras. Images reconstructed from each of the photodiode measurements are stitched together to form the full FOV. To account for crosstalk between the sub-apertures, novel modulation patterns have been developed to allow neighboring sub-apertures to share energy. Regions of overlap not only account for crosstalk energy that would otherwise be reconstructed as noise, but they also allow for tolerance in the alignment of the DMD to the lenslet array. Currently, the multi-aperture camera is built into a computational imaging workstation configuration useful for research and development purposes. In this configuration, modulation patterns are generated in a CPU and sent to the DMD via PCI express, which allows the operator to develop and change the patterns used in the data acquisition step. The sensor data is collected and then streamed to the workstation via an Ethernet or USB connection for the reconstruction step. Depending on the amount of data taken and the amount of overlap between sub-apertures, frame rates of 2-5 frames per second can be achieved. In a stand-alone camera platform, currently in development, pattern generation and reconstruction will be implemented on-board.
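
    A minimal sketch of the single-pixel compressive-sensing idea described above, assuming random ±1 DMD patterns and a plain ISTA solver. This is illustrative only, not InView's modulation patterns or reconstruction code, and all sizes and parameters are placeholders. In the multi-aperture case, 32 such reconstructions would then be stitched, sharing energy across the overlap regions.

```python
# Simulate one sub-aperture: a DMD applies M random +/-1 patterns to an
# N-pixel scene and a single photodiode records one sum per pattern.
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 96, 8                          # pixels, measurements, nonzeros

x_true = np.zeros(N)                          # sparse "scene" (a few bright spots)
x_true[rng.choice(N, K, replace=False)] = rng.uniform(0.5, 1.0, K)

Phi = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)   # modulation patterns
y = Phi @ x_true                              # photodiode measurements

def ista(y, Phi, lam=0.01, n_iter=500):
    """Iterative shrinkage-thresholding for 0.5*||y - Phi x||^2 + lam*||x||_1."""
    L = np.linalg.norm(Phi, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = x + (Phi.T @ (y - Phi @ x)) / L   # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft threshold
    return x

x_hat = ista(y, Phi)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```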

  12. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared-camera and visible-CCD-camera dual-band imaging systems are widely used in many kinds of equipment and applications. If such a system is tested with the traditional infrared-camera test system and visible-CCD test system separately, installation and alignment must be performed twice during the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber and computer, which reduces the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator focal position with changing environmental temperature, and the image quality of the large-field-of-view collimator and the test accuracy are thereby also improved. Its performance matches that of comparable foreign systems at a much lower cost, so it should do well on the market.
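
    The multiple-frame averaging step mentioned above can be illustrated with a short sketch; the frame count, image size and noise level below are arbitrary placeholders, not the paper's values. Averaging n independent frames reduces the random-noise standard deviation by roughly sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(1)
scene = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))         # hypothetical target image
frames = scene + rng.normal(0.0, 0.05, size=(32, 64, 64))   # 32 noisy frame grabs

avg = frames.mean(axis=0)                                   # multiple-frame average
print("single-frame noise:", (frames[0] - scene).std())
print("averaged noise    :", (avg - scene).std())           # ~0.05 / sqrt(32)
```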

  13. An Acoustic Charge Transport Imager for High Definition Television

    NASA Technical Reports Server (NTRS)

    Hunt, William D.; Brennan, Kevin; May, Gary; Glenn, William E.; Richardson, Mike; Solomon, Richard

    1999-01-01

    This project, over its term, included funding to a variety of companies and organizations. In addition to Georgia Tech, these included Florida Atlantic University with Dr. William E. Glenn as the P.I., Kodak with Mr. Mike Richardson as the P.I., and M.I.T./Polaroid with Dr. Richard Solomon as the P.I. The focus of the work conducted by these organizations was the development of camera hardware for High Definition Television (HDTV). The focus of the research at Georgia Tech was the development of new semiconductor technology to achieve a next-generation solid-state imager chip that would operate at a high frame rate (170 frames per second), operate at low light levels (via the use of avalanche photodiodes as the detector element) and contain 2 million pixels. The actual cost required to create this new semiconductor technology was probably at least 5 or 6 times the investment made under this program, and hence we fell short of achieving this rather grand goal. We did, however, produce a number of spin-off technologies as a result of our efforts. These include, among others, improved avalanche photodiode structures, significant advancement of the state of understanding of ZnO/GaAs structures, and significant contributions to the analysis of general GaAs semiconductor devices and the design of Surface Acoustic Wave resonator filters for wireless communication. More of these will be described in the report. The work conducted at the partner sites resulted in the development of 4 prototype HDTV cameras. The HDTV camera developed by Kodak uses the Kodak KAI-2091M high-definition monochrome image sensor. This progressively-scanned charge-coupled device (CCD) can operate at video frame rates and has 9 µm square pixels. The photosensitive area has a 16:9 aspect ratio and is consistent with the "Common Image Format" (CIF). It features an active image area of 1928 horizontal by 1084 vertical pixels and has a 55% fill factor. The camera is designed to operate in continuous mode with an output data rate of 5 MHz, which gives a maximum frame rate of 4 frames per second. The MIT/Polaroid group developed two cameras under this program. The cameras have effectively four times the current video spatial resolution and, at 60 frames per second, double the normal video frame rate.

  14. Enhancement Strategies for Frame-To-Frame UAS Stereo Visual Odometry

    NASA Astrophysics Data System (ADS)

    Kersten, J.; Rodehorst, V.

    2016-06-01

    Autonomous navigation of indoor unmanned aircraft systems (UAS) requires accurate pose estimations, usually obtained from indirect measurements. Navigation based on inertial measurement units (IMU) is known to be affected by high drift rates. The incorporation of cameras provides complementary information due to the different underlying measurement principle. The scale ambiguity problem for monocular cameras is avoided when a light-weight stereo camera setup is used. However, frame-to-frame stereo visual odometry (VO) approaches are also known to accumulate pose estimation errors over time. Several valuable real-time capable techniques for outlier detection and drift reduction in frame-to-frame VO are available, for example robust relative orientation estimation using random sample consensus (RANSAC) and bundle adjustment. This study addresses the problem of choosing appropriate VO components. We propose a frame-to-frame stereo VO method based on carefully selected components and parameters. This method is evaluated regarding the impact and value of different outlier detection and drift-reduction strategies, for example keyframe selection and sparse bundle adjustment (SBA), using reference benchmark data as well as our own real stereo data. The experimental results demonstrate that our VO method is able to estimate quite accurate trajectories. Feature bucketing and keyframe selection are simple but effective strategies which further improve the VO results. Furthermore, introducing the stereo baseline constraint in pose graph optimization (PGO) leads to significant improvements.
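
    As a minimal sketch of one drift-reduction strategy named above, the following assumed keyframe-selection rule declares a new keyframe only when the motion accumulated from frame-to-frame pose increments exceeds a translation or rotation threshold. The thresholds and pose representation are illustrative, not the parameters selected in the study.

```python
import numpy as np

def rotation_angle(R):
    """Rotation angle (rad) of a 3x3 rotation matrix."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def select_keyframes(rel_poses, t_thresh=0.25, r_thresh=np.deg2rad(5.0)):
    """rel_poses: iterable of (R, t) frame-to-frame increments; returns keyframe indices."""
    keyframes = [0]
    R_acc, t_acc = np.eye(3), np.zeros(3)
    for i, (R, t) in enumerate(rel_poses, start=1):
        t_acc = R_acc @ t + t_acc            # compose motion since the last keyframe
        R_acc = R_acc @ R
        if np.linalg.norm(t_acc) > t_thresh or rotation_angle(R_acc) > r_thresh:
            keyframes.append(i)              # enough motion: start a new keyframe
            R_acc, t_acc = np.eye(3), np.zeros(3)
    return keyframes
```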

  15. Final Report for the Advanced Camera for Surveys (ACS)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    ACS was launched aboard the Space Shuttle Columbia just before dawn on March 1, 2002. At the time of liftoff, the Hubble Space Telescope (HST) was reflecting the early morning sun as it moved across the sky. After successfully docking with HST, several components were replaced. One of the components was the Advanced Camera for Surveys built by Ball Aerospace & Technologies Corp. (BATC) in Boulder, Colorado. Over the life of the HST contract at BATC, hundreds of employees had the pleasure of working on the concept, design, fabrication, assembly, and test of ACS. Those employees thank NASA - Goddard Space Flight Center and the science team at Johns Hopkins University (JHU) for the opportunity to participate in building a great science instrument for HST.

  16. Geiger-mode APD camera system for single-photon 3D LADAR imaging

    NASA Astrophysics Data System (ADS)

    Entwistle, Mark; Itzler, Mark A.; Chen, Jim; Owens, Mark; Patel, Ketan; Jiang, Xudong; Slomkowski, Krystyna; Rangwala, Sabbir

    2012-06-01

    The unparalleled sensitivity of 3D LADAR imaging sensors based on single photon detection provides substantial benefits for imaging at long stand-off distances and minimizing laser pulse energy requirements. To obtain 3D LADAR images with single photon sensitivity, we have demonstrated focal plane arrays (FPAs) based on InGaAsP Geiger-mode avalanche photodiodes (GmAPDs) optimized for use at either 1.06 μm or 1.55 μm. These state-of-the-art FPAs exhibit excellent pixel-level performance and the capability for 100% pixel yield on a 32 x 32 format. To realize the full potential of these FPAs, we have recently developed an integrated camera system providing turnkey operation based on FPGA control. This system implementation enables the extremely high frame-rate capability of the GmAPD FPA, and frame rates in excess of 250 kHz (for 0.4 μs range gates) can be accommodated using an industry-standard CameraLink interface in full configuration. Real-time data streaming for continuous acquisition of 2 μs range gate point cloud data with 13-bit time-stamp resolution at 186 kHz frame rates has been established using multiple solid-state storage drives. Range gate durations spanning 4 ns to 10 μs provide broad operational flexibility. The camera also provides real-time signal processing in the form of multi-frame gray-scale contrast images and single-frame time-stamp histograms, and automated bias control has been implemented to maintain a constant photon detection efficiency in the presence of ambient temperature changes. A comprehensive graphical user interface has been developed to provide complete camera control using a simple serial command set, and this command set supports highly flexible end-user customization.
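
    The time-stamp-to-range conversion implied above follows directly from the round-trip time of flight; a brief sketch, with an assumed 0.5 ns bin width rather than the camera's actual register settings:

```python
C = 299_792_458.0              # speed of light, m/s

def timestamp_to_range(time_bin, bin_width_s=0.5e-9, gate_start_s=0.0):
    """One-way range (m) from a photon time-stamp bin within the range gate."""
    tof = gate_start_s + time_bin * bin_width_s   # round-trip time of flight
    return 0.5 * C * tof

# A 13-bit time stamp with 0.5 ns bins spans roughly 614 m of range in one gate.
print(timestamp_to_range(2**13 - 1))
```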

  17. Selection of optical model of stereophotography experiment for determination the cloud base height as a problem of testing of statistical hypotheses

    NASA Astrophysics Data System (ADS)

    Chulichkov, Alexey I.; Nikitin, Stanislav V.; Emilenko, Alexander S.; Medvedev, Andrey P.; Postylyakov, Oleg V.

    2017-10-01

    Earlier, we developed a method for estimating the height and speed of clouds from cloud images obtained by a pair of digital cameras. The shift of a fragment of the cloud in the right frame relative to its position in the left frame is used to estimate the height of the cloud and its velocity. This shift is estimated by the method of morphological image analysis. However, this method requires that the axes of the cameras be parallel. Instead of physically adjusting the axes, we use virtual camera adjustment, namely, a transformation of a real frame whose result is what would be obtained if all the axes were perfectly adjusted. For such adjustment, images of stars as infinitely distant objects were used: with perfectly aligned cameras, the star images in the right and left frames should be identical. In this paper, we investigate in more detail possible mathematical models of the cloud image deformations caused by the misalignment of the axes of the two cameras, as well as by their lens aberrations. The simplest model follows from the paraxial approximation of the lens (without lens aberrations) and reduces to an affine transformation of the coordinates of one of the frames. The other two models additionally take into account lens distortion of the 3rd order, and of the 3rd and 5th orders, respectively. It is shown that the models differ significantly when converting coordinates near the edges of the frame. Strict statistical criteria allow choosing the most reliable model, the one most consistent with the measurement data. Further, each of these three models was used to determine the parameters of the image deformations. These parameters are used to transform the cloud images into what they would be if measured with an ideal setup, and the distance to the cloud is then calculated. The results were compared with data from a laser range finder.
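
    The three deformation models can be written down compactly; the sketch below assumes normalized frame coordinates centred on the principal point and a simple radial-distortion parameterization, so it illustrates the model hierarchy rather than the authors' exact formulation.

```python
import numpy as np

def model_affine(xy, A, b):
    """Model 1: paraxial lens -> pure affine transform of one frame's coordinates."""
    return xy @ A.T + b

def model_r3(xy, A, b, k1):
    """Model 2: affine transform plus 3rd-order radial distortion."""
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return model_affine(xy * (1.0 + k1 * r2), A, b)

def model_r3_r5(xy, A, b, k1, k2):
    """Model 3: affine transform plus 3rd- and 5th-order radial distortion."""
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return model_affine(xy * (1.0 + k1 * r2 + k2 * r2**2), A, b)

# The models agree near the frame centre and diverge toward the edges,
# which is why edge points discriminate between them.
xy_edge = np.array([[0.45, 0.45]])
A, b = np.eye(2), np.zeros(2)
print(model_affine(xy_edge, A, b),
      model_r3(xy_edge, A, b, k1=-0.1),
      model_r3_r5(xy_edge, A, b, k1=-0.1, k2=0.02))
```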

  18. Frames of Reference in the Classroom

    ERIC Educational Resources Information Center

    Grossman, Joshua

    2012-01-01

    The classic film "Frames of Reference" effectively illustrates concepts involved with inertial and non-inertial reference frames. In it, Donald G. Ivey and Patterson Hume use the camera's perspective to allow the viewer to see motion in reference frames translating with a constant velocity, translating while accelerating, and rotating--all with…

  19. Observation and theory of Pc 5 waves with harmonically related transverse and compressional components

    NASA Astrophysics Data System (ADS)

    Takahashi, K.; Cheng, C. Z.; McEntire, R. W.; Kistler, L. M.

    1990-02-01

    The properties of 23 magnetic pulsation events observed by the AMPTE CCE spacecraft are studied. These events are selected on the basis that the field magnitude oscillated at the second harmonic of a simultaneously present transverse oscillation. The events have a second harmonic period of 80-600 s (roughly the Pc 5 range), are observed in clusters in the dawn (0300-0800 magnetic local time, MLT) and dusk (1600-2100 MLT) sectors, and are localized near the magnetic equator. Although the azimuthal wave number, estimated from an ion finite Larmor radius effect, is generally large (about 50), there is a marked difference between the events observed in the dawn and dusk sectors. In the dawn sector the waves have low frequencies (1-5 mHz), indicate left-hand polarization with respect to the ambient magnetic field, and propagate eastward with respect to the spacecraft. In the dusk sector the waves have high frequencies (5-15 mHz), indicate right-hand polarization, and propagate westward. It is suggested that the waves are all westward propagating in the plasma rest frame and that a local-time-dependent Doppler shift is the reason for the local time dependence of the wave properties.

  20. Observation and theory of Pc 5 waves with harmonically related transverse and compressional components

    NASA Technical Reports Server (NTRS)

    Takahashi, K.; Mcentire, R. W.; Cheng, C. Z.; Kistler, L. M.

    1990-01-01

    The properties of 23 magnetic pulsation events observed by the AMPTE CCE spacecraft are studied. These events are selected on the basis that the field magnitude oscillated at the second harmonic of a simultaneously present transverse oscillation. The events have a second harmonic period of 80-600 s (roughly the Pc 5 range), are observed in clusters in the dawn (0300-0800 magnetic local time, MLT) and dusk (1600-2100 MLT) sectors, and are localized near the magnetic equator. Although the azimuthal wave number, estimated from an ion finite Larmor radius effect, is generally large (about 50), there is a marked difference between the events observed in the dawn and dusk sectors. In the dawn sector the waves have low frequencies (1-5 mHz), indicate left-hand polarization with respect to the ambient magnetic field, and propagate eastward with respect to the spacecraft. In the dusk sector the waves have high frequencies (5-15 mHz), indicate right-hand polarization, and propagate westward. It is suggested that the waves are all westward propagating in the plasma rest frame and that a local-time-dependent Doppler shift is the reason for the local time dependence of the wave properties.

  1. Development and use of an L3CCD high-cadence imaging system for Optical Astronomy

    NASA Astrophysics Data System (ADS)

    Sheehan, Brendan J.; Butler, Raymond F.

    2008-02-01

    A high-cadence imaging system, based on a Low Light Level CCD (L3CCD) camera, has been developed for photometric and polarimetric applications. The camera system is an iXon DV-887 from Andor Technology, which uses a CCD97 L3CCD detector from E2V Technologies. This is a back-illuminated device, giving it an extended blue response, and it has an active area of 512×512 pixels. The camera system allows frame rates ranging from 30 fps (full frame) to 425 fps (windowed and binned frame). We outline the system design, concentrating on the calibration and control of the L3CCD camera. The L3CCD detector can be either triggered directly by a GPS timeserver/frequency generator or internally triggered. A central PC remotely controls the camera computer system and timeserver. The data are saved as standard 'FITS' files. The large data loads associated with high frame rates lead to issues with gathering and storing the data effectively. To overcome such problems, a specific data management approach is used, and a Python/PYRAF data reduction pipeline was written for the Linux environment. This uses calibration data collected either on-site or from lab-based measurements, and enables a fast and reliable method for reducing images. To date, the system has been used twice on the 1.5 m Cassini Telescope in Loiano (Italy); we present the reduction methods and the observations made.
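
    A generic sketch of the calibration steps such a reduction pipeline performs (bias subtraction and flat-field division), written here with astropy rather than the authors' PYRAF code; the file names are placeholders.

```python
import numpy as np
from astropy.io import fits

def reduce_frames(science_files, bias_file, flat_file, out_prefix="red_"):
    bias = fits.getdata(bias_file).astype(float)
    flat = fits.getdata(flat_file).astype(float) - bias
    flat /= np.median(flat)                        # normalise the flat field
    for name in science_files:
        data = fits.getdata(name).astype(float)
        reduced = (data - bias) / flat             # standard CCD calibration
        fits.writeto(out_prefix + name, reduced, overwrite=True)

# reduce_frames(["target_0001.fits"], "master_bias.fits", "master_flat.fits")
```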

  2. High-speed plasma imaging: A lightning bolt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurden, G.A.; Whiteson, D.O.

    Using a gated intensified digital Kodak Ektapro camera system, the authors captured a lightning bolt at 1,000 frames per second, with 100-µs exposure time on each consecutive frame. As a thunderstorm approached while darkness descended (7:50 pm) on July 21, 1994, they photographed lightning bolts with an f/22 105-mm lens and 100% gain on the intensified camera. This 15-frame sequence shows a cloud-to-ground stroke at a distance of about 1.5 km, which has a series of stepped leaders propagating downwards, followed by the upward-propagating main return stroke.

  3. Deep-UV-sensitive high-frame-rate backside-illuminated CCD camera developments

    NASA Astrophysics Data System (ADS)

    Dawson, Robin M.; Andreas, Robert; Andrews, James T.; Bhaskaran, Mahalingham; Farkas, Robert; Furst, David; Gershstein, Sergey; Grygon, Mark S.; Levine, Peter A.; Meray, Grazyna M.; O'Neal, Michael; Perna, Steve N.; Proefrock, Donald; Reale, Michael; Soydan, Ramazan; Sudol, Thomas M.; Swain, Pradyumna K.; Tower, John R.; Zanzucchi, Pete

    2002-04-01

    New applications for ultraviolet imaging are emerging in the fields of drug discovery and industrial inspection. High throughput is critical for these applications, where millions of drug combinations are analyzed in secondary screenings or high-rate inspection of small feature sizes over large areas is required. Sarnoff demonstrated in 1990 a back-illuminated, 1024 X 1024, 18 um pixel, split-frame-transfer device running at > 150 frames per second with high sensitivity in the visible spectrum. Sarnoff designed, fabricated and delivered cameras based on these CCDs and is now extending this technology to devices with higher pixel counts and higher frame rates through CCD architectural enhancements. The high sensitivities obtained in the visible spectrum are being pushed into the deep UV to support these new medical and industrial inspection applications. Sarnoff has achieved measured quantum efficiencies > 55% at 193 nm, rising to 65% at 300 nm, and remaining almost constant out to 750 nm. Optimization of the sensitivity is being pursued to tailor the quantum efficiency for particular wavelengths. Characteristics of these high-frame-rate CCDs and cameras will be described, and results will be presented demonstrating high UV sensitivity down to 150 nm.

  4. Multiport backside-illuminated CCD imagers for high-frame-rate camera applications

    NASA Astrophysics Data System (ADS)

    Levine, Peter A.; Sauer, Donald J.; Hseuh, Fu-Lung; Shallcross, Frank V.; Taylor, Gordon C.; Meray, Grazyna M.; Tower, John R.; Harrison, Lorna J.; Lawler, William B.

    1994-05-01

    Two multiport, second-generation CCD imager designs have been fabricated and successfully tested: a 16-port 512 X 512 array and a 32-port 1024 X 1024 array. Both designs are back illuminated, have on-chip CDS, lateral blooming control, and use a split vertical frame-transfer architecture with full frame storage. The 512 X 512 device has been operated at rates over 800 frames per second; the 1024 X 1024 device has been operated at rates over 300 frames per second. The major changes incorporated in the second-generation design are a reduction in gate length in the output area for improved high-clock-rate performance, modified on-chip CDS circuitry for reduced noise, and optimized implants that improve blooming control at lower clock amplitude. This paper discusses the imager design improvements and presents measured performance results at high and moderate frame rates. The design and performance of three moderate-frame-rate cameras are discussed.

  5. Evaluation of Eye Metrics as a Detector of Fatigue

    DTIC Science & Technology

    2010-03-01

    eyeglass frames. The cameras are angled upward toward the eyes and extract real-time pupil diameter, eye-lid movement, and eye-ball movement. The ... because the cameras were mounted on eyeglass-like frames, the system was able to continuously monitor the eye throughout all sessions. Overall, the ... of "fitness for duty" testing and "real-time monitoring" of operator performance has been slow (Institute of Medicine, 2004). Oculometric-based

  6. Utilizing ISS Camera Systems for Scientific Analysis of Lightning Characteristics and comparison with ISS-LIS and GLM

    NASA Astrophysics Data System (ADS)

    Schultz, C. J.; Lang, T. J.; Leake, S.; Runco, M.; Blakeslee, R. J.

    2017-12-01

    Video and still-frame images from cameras aboard the International Space Station (ISS) are used to inspire, educate, and provide a unique vantage point from low-Earth orbit that is second to none; however, these cameras have overlooked capabilities for contributing to scientific analysis of the Earth and near-space environment. The goal of this project is to study how georeferenced video/images from available ISS camera systems can be useful for scientific analysis, using lightning properties as a demonstration. Camera images from the crew cameras and high-definition video from the Chiba University Meteor Camera were combined with lightning data from the National Lightning Detection Network (NLDN), the ISS Lightning Imaging Sensor (ISS-LIS), the Geostationary Lightning Mapper (GLM) and lightning mapping arrays. These cameras provide significant spatial resolution advantages (~10 times or better) over ISS-LIS and GLM, but with lower temporal resolution. Therefore, they can serve as a complementary analysis tool for studying lightning and thunderstorm processes from space. Lightning sensor data, Visible Infrared Imaging Radiometer Suite (VIIRS) derived city-light maps, and other geographic databases were combined with the ISS attitude and position data to reverse-geolocate each image or frame. An open-source Python toolkit has been developed to assist with this effort. Next, the locations and sizes of all flashes in each frame or image were computed and compared with flash characteristics from all available lightning datasets. This allowed for characterization of cloud features that are below the 4-km and 8-km resolution of ISS-LIS and GLM, which may reduce the light that reaches the ISS-LIS or GLM sensor. In the case of video, consecutive frames were overlaid to determine the rate of change of the light escaping cloud top. Characterization of the rate of change in geometry, more generally the radius, of light escaping cloud top was integrated with the NLDN, ISS-LIS and GLM to understand how the peak rate of change and the peak area of each flash aligned with each lightning system in time. Flash features like leaders could be inferred from the video frames as well. Testing is being done to see whether leader speeds may be accurately calculated under certain circumstances.
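
    A minimal sketch of the flash location-and-size step, assuming a simple brightness threshold and connected-component labelling on a night-time grayscale frame; this is illustrative and not the project's actual Python toolkit.

```python
import numpy as np
from scipy import ndimage

def flash_areas(frame, threshold=0.8):
    """frame: 2-D array scaled 0-1. Returns (centroid, pixel area) per flash."""
    mask = frame > threshold                               # bright pixels only
    labels, n = ndimage.label(mask)                        # connected components
    idx = range(1, n + 1)
    areas = ndimage.sum(mask, labels, index=idx)           # pixels per flash
    centroids = ndimage.center_of_mass(mask, labels, index=idx)
    return list(zip(centroids, areas))

# Applying this per video frame and differencing the areas gives the rate of
# change of light escaping cloud top discussed above.
```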

  7. Passive ranging redundancy reduction in diurnal weather conditions

    NASA Astrophysics Data System (ADS)

    Cha, Jae H.; Abbott, A. Lynn; Szu, Harold H.

    2013-05-01

    Ambiguity in binocular ranging (David Marr's paradox) may be resolved by using two eyes moving from side to side behind an optical bench while integrating multiple views. Moving one's head from left to right with one eye closed can also help resolve the foreground and background range uncertainty. That empirical experiment implies redundancy in image data, which may be reduced by adopting a 3-D camera imaging model to perform compressive sensing. Here, the compressive sensing concept is examined from the perspective of redundancy reduction in images subject to diurnal and weather variations, for the purpose of resolving range uncertainty under all conditions, such as dawn or dusk, daytime at different light levels, or nighttime in different spectral bands. As an example, a scenario at an intersection of a country road at dawn/dusk is discussed, where the location of a traffic sign must be resolved by passive ranging to determine whether it stands on the near side of the road or the opposite side, under the influence of temporal light/color level variation. A spectral-band extrapolation via the Lagrange Constrained Neural Network (LCNN) learning algorithm is discussed to restore color lost at dawn/dusk. A numerical simulation is illustrated along with a code example.
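
    The passive-ranging geometry itself reduces to the standard rectified-stereo relation Z = f·B/d; a brief sketch with illustrative focal length and baseline values, not the paper's setup:

```python
def range_from_disparity(disparity_px, focal_px=1200.0, baseline_m=0.6):
    """Range Z = f * B / d for a rectified stereo pair (disparity in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

# A 6-pixel disparity with a 0.6 m baseline and a 1200 px focal length gives
# 120 m, the scale at which a roadside sign must be assigned to one side.
print(range_from_disparity(6.0))
```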

  8. 3-D Velocimetry of Strombolian Explosions

    NASA Astrophysics Data System (ADS)

    Taddeucci, J.; Gaudin, D.; Orr, T. R.; Scarlato, P.; Houghton, B. F.; Del Bello, E.

    2014-12-01

    Using two synchronized high-speed cameras we were able to reconstruct the three-dimensional displacement and velocity field of bomb-sized pyroclasts in Strombolian explosions at Stromboli Volcano. Relatively low-intensity Strombolian-style activity offers a rare opportunity to observe volcanic processes that remain hidden from view during more violent explosive activity. Such processes include the ejection and emplacement of bomb-sized clasts along pure or drag-modified ballistic trajectories, in-flight bomb collision, and gas liberation dynamics. High-speed imaging of Strombolian activity has already opened new windows for the study of the abovementioned processes, but to date has only utilized two-dimensional analysis with limited motion detection and ability to record motion towards or away from the observer. To overcome this limitation, we deployed two synchronized high-speed video cameras at Stromboli. The two cameras, located sixty meters apart, filmed Strombolian explosions at 500 and 1000 frames per second and with different resolutions. Frames from the two cameras were pre-processed and combined into a single video showing frames alternating from one to the other camera. Bomb-sized pyroclasts were then manually identified and tracked in the combined video, together with fixed reference points located as close as possible to the vent. The results from manual tracking were fed to a custom software routine that, knowing the relative position of the vent and cameras, and the field of view of the latter, provided the position of each bomb relative to the reference points. By tracking tens of bombs over five to ten frames at different intervals during one explosion, we were able to reconstruct the three-dimensional evolution of the displacement and velocity fields of bomb-sized pyroclasts during individual Strombolian explosions. Shifting jet directivity and dispersal angle clearly appear from the three-dimensional analysis.
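
    A minimal sketch of the two-camera triangulation underlying such a reconstruction, using standard linear (DLT) triangulation; the 3x4 projection matrices P1 and P2 are assumed known from calibration, and this is not the authors' custom routine.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """3-D position of a pyroclast seen at pixel uv1 in camera 1 and uv2 in camera 2."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([u1 * P1[2] - P1[0],
                   v1 * P1[2] - P1[1],
                   u2 * P2[2] - P2[0],
                   v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)                # homogeneous least squares
    X = Vt[-1]
    return X[:3] / X[3]                        # inhomogeneous 3-D coordinates

# Differencing triangulated positions across synchronized frames and dividing
# by the inter-frame time gives the 3-D velocity field of the bombs.
```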

  9. Standard design for National Ignition Facility x-ray streak and framing cameras.

    PubMed

    Kimbrough, J R; Bell, P M; Bradley, D K; Holder, J P; Kalantar, D K; MacPhee, A G; Telford, S

    2010-10-01

    The x-ray streak camera and x-ray framing camera for the National Ignition Facility were redesigned to improve electromagnetic pulse hardening, protect high voltage circuits from pressure transients, and maximize the use of common parts and operational software. Both instruments use the same PC104 based controller, interface, power supply, charge coupled device camera, protective hermetically sealed housing, and mechanical interfaces. Communication is over fiber optics with identical facility hardware for both instruments. Each has three triggers that can be either fiber optic or coax. High voltage protection consists of a vacuum sensor to enable the high voltage and pulsed microchannel plate phosphor voltage. In the streak camera, the high voltage is removed after the sweep. Both rely on the hardened aluminum box and a custom power supply to reduce electromagnetic pulse/electromagnetic interference (EMP/EMI) getting into the electronics. In addition, the streak camera has an EMP/EMI shield enclosing the front of the streak tube.

  10. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    PubMed Central

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-01-01

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can increase the temporal resolution several or even hundreds of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution using a 25 fps camera. PMID:26959023

  11. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera.

    PubMed

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-03-04

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can increase the temporal resolution several or even hundreds of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution using a 25 fps camera.
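
    A forward-model-only sketch of per-pixel coded exposure, to make the sampling idea concrete: within one slow sensor frame of T sub-frames, each pixel integrates the scene only where its binary code is 1. The reconstruction step is not shown, and the sizes and codes below are illustrative, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(2)
H, W, T = 32, 32, 8                                # image size, sub-frames per frame

scene = rng.random((T, H, W))                      # hypothetical fast-changing scene
codes = rng.integers(0, 2, size=(T, H, W))         # per-pixel binary exposure codes

coded_frame = (codes * scene).sum(axis=0)          # what the slow sensor reads out
plain_frame = scene.sum(axis=0)                    # conventional full exposure
print(coded_frame.shape, plain_frame.shape)
```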

  12. A photoelastic modulator-based birefringence imaging microscope for measuring biological specimens

    NASA Astrophysics Data System (ADS)

    Freudenthal, John; Leadbetter, Andy; Wolf, Jacob; Wang, Baoliang; Segal, Solomon

    2014-11-01

    The photoelastic modulator (PEM) has been applied to a variety of polarimetric measurements. However, nearly all such applications use point measurements, where each point (spot) on the sample is measured one at a time. The main challenge in employing the PEM in a camera-based imaging instrument is that the PEM modulates too fast for typical cameras. The PEM modulates at tens of kHz. To capture the specific polarization information that is carried on the modulation frequency of the PEM, the camera needs to be at least ten times faster. However, the typical frame rates of common cameras are only in the tens or hundreds of frames per second. In this paper, we report a PEM-camera birefringence imaging microscope. We use the so-called stroboscopic illumination method to overcome the incompatibility between the high modulation frequency of the PEM and the relatively slow frame rate of a camera. We trigger the LED light source using a field-programmable gate array (FPGA) in synchrony with the modulation of the PEM. We show the measurement results of several standard birefringent samples as part of the instrument calibration. Furthermore, we show results observed in two birefringent biological specimens, a human skin tissue that contains collagen and a slice of mouse brain that contains bundles of myelinated axonal fibers. Novel applications of this PEM-based birefringence imaging microscope, for both research communities and industrial users, are being tested.

  13. Development of a driving method suitable for ultrahigh-speed shooting in a 2M-fps 300k-pixel single-chip color camera

    NASA Astrophysics Data System (ADS)

    Yonai, J.; Arai, T.; Hayashida, T.; Ohtake, H.; Namiki, J.; Yoshida, T.; Etoh, T. Goji

    2012-03-01

    We have developed an ultrahigh-speed CCD camera that can capture instantaneous phenomena not visible to the human eye and impossible to capture with a regular video camera. The ultrahigh-speed CCD was specially constructed so that the CCD memory between the photodiode and the vertical transfer path of each pixel can store 144 frames each. For every one-frame shot, the electric charges generated from the photodiodes are transferred in one step to the memory of all the parallel pixels, making ultrahigh-speed shooting possible. Earlier, we experimentally manufactured a 1M-fps ultrahigh-speed camera and tested it for broadcasting applications. Through those tests, we learned that there are cases that require shooting speeds (frame rate) of more than 1M fps; hence we aimed to develop a new ultrahigh-speed camera that will enable much faster shooting speeds than what is currently possible. Since shooting at speeds of more than 200,000 fps results in decreased image quality and abrupt heating of the image sensor and drive circuit board, faster speeds cannot be achieved merely by increasing the drive frequency. We therefore had to improve the image sensor wiring layout and the driving method to develop a new 2M-fps, 300k-pixel ultrahigh-speed single-chip color camera for broadcasting purposes.

  14. Navigation accuracy comparing non-covered frame and use of plastic sterile drapes to cover the reference frame in 3D acquisition.

    PubMed

    Corenman, Donald S; Strauch, Eric L; Dornan, Grant J; Otterstrom, Eric; Zalepa King, Lisa

    2017-09-01

    Advancements in surgical navigation technology coupled with 3-dimensional (3D) radiographic data have significantly enhanced the accuracy and efficiency of spinal fusion implant placement. Increased usage of such technology has led to rising concerns regarding maintenance of the sterile field, as makeshift drape systems are fraught with breaches, thus presenting increased risk of surgical site infections (SSIs). A clinical need exists for a sterile draping solution with these techniques. Our objective was to quantify the expected accuracy error associated with the 2MM and 4MM thickness Sterile-Z Patient Drape ® using the Medtronic O-Arm ® Surgical Imaging with StealthStation ® S7 ® Navigation System. Camera distance to the reference frame was also investigated for its contribution to accuracy error. A testing jig was placed on the radiolucent table and the Medtronic passive reference frame was attached to the jig. The StealthStation ® S7 ® navigation camera was placed at various distances from the testing jig, and the geometry error of the reference frame was captured for three different drape configurations: no drape, 2MM drape and 4MM drape. The O-Arm ® gantry location and StealthStation ® S7 ® camera position were maintained, and seven 3D acquisitions were measured for each drape configuration. Data were analyzed by a two-factor analysis of variance (ANOVA), and Bonferroni comparisons were used to assess the independent effects of camera angle and drape on accuracy error. Median (and maximum) measurement accuracy error was higher for the 2MM than for the 4MM drape at each camera distance. The most extreme error observed (4.6 mm) occurred when using the 2MM drape at the 'far' camera distance. The 4MM drape was found to induce an accuracy error of 0.11 mm (95% confidence interval, 0.06-0.15; P<0.001) relative to the no-drape testing, regardless of camera distance. The medium camera distance produced lower accuracy error than either the close (additional 0.08 mm error; 95% CI, 0-0.15; P=0.035) or far (additional 0.21 mm error; 95% CI, 0.13-0.28; P<0.001) camera distances, regardless of whether a drape was used. In comparison to the 'no drape' condition, the accuracy error of 0.11 mm when using a 4MM film drape is minimal and clinically insignificant.
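
    A generic sketch of a two-factor ANOVA of this kind (drape configuration and camera distance as factors, accuracy error as the response), using statsmodels with placeholder data rather than the study's measurements:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
drape = np.repeat(["none", "2MM", "4MM"], 21)
distance = np.tile(np.repeat(["close", "medium", "far"], 7), 3)
error_mm = rng.normal(0.3, 0.05, drape.size)             # placeholder measurements

df = pd.DataFrame({"drape": drape, "distance": distance, "error_mm": error_mm})
model = smf.ols("error_mm ~ C(drape) + C(distance)", data=df).fit()
print(anova_lm(model, typ=2))                            # two-factor ANOVA table
# Bonferroni: multiply each follow-up pairwise p-value by the number of comparisons.
```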

  15. The Television Framing Methods of the National Basketball Association: An Agenda-Setting Application.

    ERIC Educational Resources Information Center

    Fortunato, John A.

    2001-01-01

    Identifies and analyzes the exposure and portrayal framing methods that are utilized by the National Basketball Association (NBA). Notes that key informant interviews provide insight into the exposure framing method and reveal two portrayal instruments: cameras and announcers; and three framing strategies: depicting the NBA as a team game,…

  16. Optical space weathering on Vesta: Radiative-transfer models and Dawn observations

    NASA Astrophysics Data System (ADS)

    Blewett, David T.; Denevi, Brett W.; Le Corre, Lucille; Reddy, Vishnu; Schröder, Stefan E.; Pieters, Carle M.; Tosi, Federico; Zambon, Francesca; De Sanctis, Maria Cristina; Ammannito, Eleonora; Roatsch, Thomas; Raymond, Carol A.; Russell, Christopher T.

    2016-02-01

    Exposure to ion and micrometeoroid bombardment in the space environment causes physical and chemical changes in the surface of an airless planetary body. These changes, called space weathering, can strongly influence a surface's optical characteristics, and hence complicate interpretation of composition from reflectance spectroscopy. Prior work using data from the Dawn spacecraft (Pieters, C.M. et al. [2012]. Nature 491, 79-82) found that accumulation of nanophase metallic iron (npFe0), which is a key space-weathering product on the Moon, does not appear to be important on Vesta, and instead regolith evolution is dominated by mixing with carbonaceous chondrite (CC) material delivered by impacts. In order to gain further insight into the nature of space weathering on Vesta, we constructed model reflectance spectra using Hapke's radiative-transfer theory and used them as an aid to understanding multispectral observations obtained by Dawn's Framing Cameras (FC). The model spectra, for a howardite mineral assemblage, include both the effects of npFe0 and that of a mixed CC component. We found that a plot of the 438-nm/555-nm ratio vs. the 555-nm reflectance for the model spectra helps to separate the effects of lunar-style space weathering (LSSW) from those of CC-mixing. We then constructed ratio-reflectance pixel scatterplots using FC images for four areas of contrasting composition: a eucritic area at Vibidia crater, a diogenitic area near Antonia crater, olivine-bearing material within Bellicia crater, and a light mantle unit (referred to as an "orange patch" in some previous studies, based on its steep spectral slope in the visible) northeast of Oppia crater. In these four cases the observed spectral trends are those expected from CC-mixing, with no evidence for weathering dominated by production of npFe0. In order to survey a wider range of surfaces, we also defined a spectral parameter that is a function of the change in the 438-nm/555-nm ratio and the 555-nm reflectance between fresh and mature surfaces, permitting the spectral change to be classified as LSSW-like or CC-mixing-like. When applied to 21 fresh and mature FC spectral pairs, it was found that none have changes consistent with LSSW. We discuss Vesta's lack of LSSW in relation to the possible agents of space weathering, the effects of physical and compositional differences among asteroid surfaces, and the possible role of magnetic shielding from the solar wind.
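
    A possible form of such a discriminant, written as a sketch under the stated assumption that LSSW darkens and reddens the surface (lowering the 438/555 ratio) while CC mixing darkens it with a flat or opposite ratio change; this is not the paper's exact spectral parameter.

```python
def classify_weathering(fresh_438, fresh_555, mature_438, mature_555):
    """Classify a fresh->mature spectral change from FC 438 nm and 555 nm reflectances."""
    d_refl = mature_555 - fresh_555                        # change in 555 nm reflectance
    d_ratio = mature_438 / mature_555 - fresh_438 / fresh_555
    if d_refl >= 0:
        return "no darkening / indeterminate"
    return "LSSW-like" if d_ratio < 0 else "CC-mixing-like"

print(classify_weathering(0.18, 0.40, 0.17, 0.32))         # hypothetical reflectances
```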

  17. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  18. Investigating plasma viscosity with fast framing photography in the ZaP-HD Flow Z-Pinch experiment

    NASA Astrophysics Data System (ADS)

    Weed, Jonathan Robert

    The ZaP-HD Flow Z-Pinch experiment investigates the stabilizing effect of sheared axial flows while scaling toward a high-energy-density laboratory plasma (HEDLP > 100 GPa). Stabilizing flows may persist until viscous forces dissipate a sheared flow profile. Plasma viscosity is investigated by measuring scale lengths in turbulence intentionally introduced in the plasma flow. A boron nitride turbulence-tripping probe excites small scale length turbulence in the plasma, and fast framing optical cameras are used to study time-evolved turbulent structures and viscous dissipation. A Hadland Imacon 790 fast framing camera is modified for digital image capture, but features insufficient resolution to study turbulent structures. A Shimadzu HPV-X camera captures the evolution of turbulent structures with great spatial and temporal resolution, but is unable to resolve the anticipated Kolmogorov scale in ZaP-HD as predicted by a simplified pinch model.

  19. The Last Meter: Blind Visual Guidance to a Target.

    PubMed

    Manduchi, Roberto; Coughlan, James M

    2014-01-01

    Smartphone apps can use object recognition software to provide information to blind or low vision users about objects in the visual environment. A crucial challenge for these users is aiming the camera properly to take a well-framed picture of the desired target object. We investigate the effects of two fundamental constraints of object recognition - frame rate and camera field of view - on a blind person's ability to use an object recognition smartphone app. The app was used by 18 blind participants to find visual targets beyond arm's reach and approach them to within 30 cm. While we expected that a faster frame rate or wider camera field of view should always improve search performance, our experimental results show that in many cases increasing the field of view does not help, and may even hurt, performance. These results have important implications for the design of object recognition systems for blind users.

  20. An Innovative Procedure for Calibration of Strapdown Electro-Optical Sensors Onboard Unmanned Air Vehicles

    PubMed Central

    Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio; Rispoli, Attilio

    2010-01-01

    This paper presents an innovative method for estimating the attitude of airborne electro-optical cameras with respect to the onboard autonomous navigation unit. The procedure is based on the use of attitude measurements under static conditions taken by an inertial unit and carrier-phase differential Global Positioning System to obtain accurate camera position estimates in the aircraft body reference frame, while image analysis allows line-of-sight unit vectors in the camera-based reference frame to be computed. The method has been applied to the alignment of the visible and infrared cameras installed onboard the experimental aircraft of the Italian Aerospace Research Center and adopted for in-flight obstacle detection and collision avoidance. Results show an angular uncertainty on the order of 0.1° (rms). PMID:22315559
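
    The camera-to-body attitude itself can be recovered from matched line-of-sight unit vectors by solving Wahba's problem; a minimal SVD-based (Kabsch) sketch, which is a generic method and not necessarily the paper's estimator:

```python
import numpy as np

def camera_to_body_rotation(v_cam, v_body):
    """v_cam, v_body: (N, 3) matched unit vectors. Returns R with v_body ~= R @ v_cam."""
    B = v_body.T @ v_cam                        # attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))          # enforce a proper rotation (det = +1)
    return U @ np.diag([1.0, 1.0, d]) @ Vt
```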

  1. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.
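
    One standard way to compensate panning before multi-frame processing is phase correlation between consecutive frames; the sketch below estimates the integer-pixel translation and is a generic method, not the commercial product's algorithm.

```python
import numpy as np

def phase_correlation_shift(ref, frame):
    """Returns (dy, dx) such that np.roll(frame, (dy, dx), axis=(0, 1)) aligns it with ref."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(frame)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12         # keep phase only
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:                          # map peak to signed shifts
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```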

  2. Saturnian Dawn

    NASA Image and Video Library

    2017-06-26

    NASA's Cassini spacecraft peers toward a sliver of Saturn's sunlit atmosphere while the icy rings stretch across the foreground as a dark band. This view looks toward the unilluminated side of the rings from about 7 degrees below the ring plane. The image was taken in green light with the Cassini spacecraft wide-angle camera on March 31, 2017. The view was obtained at a distance of approximately 620,000 miles (1 million kilometers) from Saturn. Image scale is 38 miles (61 kilometers) per pixel. https://photojournal.jpl.nasa.gov/catalog/PIA21334

  3. Large format geiger-mode avalanche photodiode LADAR camera

    NASA Astrophysics Data System (ADS)

    Yuan, Ping; Sudharsanan, Rengarajan; Bai, Xiaogang; Labios, Eduardo; Morris, Bryan; Nicholson, John P.; Stuart, Gary M.; Danny, Harrison

    2013-05-01

    Recently, Spectrolab has successfully demonstrated a compact 32x32 Laser Detection and Ranging (LADAR) camera with single-photon-level sensitivity and a small size, weight, and power (SWaP) budget for three-dimensional (3D) topographic imaging at 1064 nm on various platforms. With a 20-kHz frame rate and 500-ps timing uncertainty, this LADAR system provides coverage down to inch-level fidelity and allows for effective wide-area terrain mapping. At a 10 mph forward speed and 1000 feet above ground level (AGL), it covers 0.5 square mile per hour with a resolution of 25 in²/pixel after data averaging. In order to increase the forward speed to suit more platforms and survey a large area more effectively, Spectrolab is developing a 32x128 Geiger-mode LADAR camera with a 43-kHz frame rate. With the increase in both frame rate and array size, the data collection rate is improved by 10 times. With a programmable bin size from 0.3 ps to 0.5 ns and 14-bit timing dynamic range, LADAR developers will have more freedom in system integration for various applications. Most of the special features of the Spectrolab 32x32 LADAR camera, such as non-uniform bias correction, variable range gate width, windowing for smaller arrays, and short pixel protection, are implemented in this camera.

  4. Software for Acquiring Image Data for PIV

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Cheung, H. M.; Kressler, Brian

    2003-01-01

    PIV Acquisition (PIVACQ) is a computer program for acquisition of data for particle-image velocimetry (PIV). In the PIV system for which PIVACQ was developed, small particles entrained in a flow are illuminated with a sheet of light from a pulsed laser. The illuminated region is monitored by a charge-coupled-device camera that operates in conjunction with a data-acquisition system that includes a frame grabber and a counter-timer board, both installed in a single computer. The camera operates in "frame-straddle" mode, where a pair of images can be obtained closely spaced in time (on the order of microseconds). The frame grabber acquires image data from the camera and stores the data in the computer memory. The counter-timer board triggers the camera and synchronizes the pulsing of the laser with acquisition of data from the camera. PIVACQ coordinates all of these functions and provides a graphical user interface, through which the user can control the PIV data-acquisition system. PIVACQ enables the user to acquire a sequence of single-exposure images, display the images, process the images, and then save the images to the computer hard drive. PIVACQ works in conjunction with the PIVPROC program, which processes the images of particles into the velocity field in the illuminated plane.
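
    The downstream PIV processing boils down to locating the cross-correlation peak between the two frames of a straddled pair; a generic FFT-based sketch for one interrogation window (not the PIVPROC code):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """win_a, win_b: 2-D interrogation windows from frames A and B.
    Returns (dy, dx), the most probable particle displacement in pixels."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:                       # wrap to signed shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

# Velocity = displacement_px * metres_per_pixel / laser_pulse_separation.
```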

  5. Ground volume assessment using 'Structure from Motion' photogrammetry with a smartphone and a compact camera

    NASA Astrophysics Data System (ADS)

    Wróżyński, Rafał; Pyszny, Krzysztof; Sojka, Mariusz; Przybyła, Czesław; Murat-Błażejewska, Sadżide

    2017-06-01

    The article describes how the Structure-from-Motion (SfM) method can be used to calculate the volume of anthropogenic microtopography. In the proposed workflow, data are obtained using mass-market devices such as a compact camera (Canon G9) and a smartphone (iPhone 5). The volume is computed using free open-source software (VisualSFM v0.5.23, CMPMVS v0.6.0, MeshLab) on a PC-class computer. The input data are acquired from video frames. To verify the method, laboratory tests on an embankment of known volume were carried out. Models of the test embankment were built using two independent measurements made with those two devices. No significant differences were found between the models in a comparative analysis; the volumes of the models differed from the actual volume by just 0.7‰ and 2‰. After a successful laboratory verification, field measurements were carried out in the same way. While building the model from the data acquired with the smartphone, it was observed that a series of frames, approximately 14% of all the frames, was rejected. The missing frames caused the point cloud to be less dense in the place where they had been rejected, and as a result the model's volume differed from the volume obtained with the camera by 7%. In order to improve the homogeneity, the frame extraction frequency was increased in the place where frames had previously been missing. A uniform model was thereby obtained, with the point-cloud density evenly distributed. There was a 1.5% difference between the embankment's volume and the volume calculated from the camera-recorded video. The presented method permits the number of input frames to be increased and the model's accuracy to be enhanced without making an additional measurement, which may not be possible in the case of temporary features.
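
    A generic sketch of the frame-extraction step (pulling every n-th frame from the video for the SfM pipeline), using OpenCV; the paths and sampling step are placeholders, not the authors' workflow.

```python
import cv2

def extract_frames(video_path, out_dir, every_nth=10):
    """Write every n-th frame of a video as a still image; returns the count saved."""
    cap = cv2.VideoCapture(video_path)
    index = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:                     # end of video
            break
        if index % every_nth == 0:
            cv2.imwrite(f"{out_dir}/frame_{saved:05d}.jpg", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

# extract_frames("embankment.mp4", "frames", every_nth=5)  # denser sampling where needed
```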

  10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  7. Variable-Interval Sequenced-Action Camera (VINSAC). Dissemination Document No. 1.

    ERIC Educational Resources Information Center

    Ward, Ted

    The 16 millimeter (mm) Variable-Interval Sequenced-Action Camera (VINSAC) is designed for inexpensive photographic recording of effective teacher instruction and use of instructional materials for teacher education and research purposes. The camera photographs single frames at preselected time intervals (.5 second to 20 seconds) which are…

  8. Students' Framing of Laboratory Exercises Using Infrared Cameras

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal science is challenging for students due to its largely imperceptible nature. Handheld infrared cameras offer a pedagogical opportunity for students to see otherwise invisible thermal phenomena. In the present study, a class of upper secondary technology students (N = 30) partook in four IR-camera laboratory activities, designed around the…

  9. Dawn at Vesta: An overview after the Dawn mission

    NASA Astrophysics Data System (ADS)

    Zambon, Francesca

    2016-07-01

    Vesta, the second largest object in the main asteroid belt of our Solar System, was explored by the Dawn mission for over a year [1, 2]. Dawn is equipped with the Framing Camera (FC) [3], which provides geological and compositional analysis, the Visible and InfraRed (VIR) mapping spectrometer [4], which allowed a comprehensive mineralogical mapping of the surface, and the Gamma Ray and Neutron Detector (GRaND) [5], which reveals the elemental composition. The wealth of data acquired by these three instruments improved our knowledge of the surface and near-surface properties of Vesta. Dawn covered a large fraction of Vesta's surface. Dawn's mission at Vesta was divided into four different phases based on the spacecraft altitude [1], which resulted in a variety of pixel resolutions, reaching down to ~70 m/pixel for VIR and ~25 m/pixel for the FC. Pyroxene absorptions are the most prominent visible-to-near infrared spectral features of Vesta [6]. The overall mineralogy is consistent with howardite-eucrite-diogenite (HED) meteorites [7, 8]. More specifically, VIR spectra, acquired in the overall range 0.25-5.1 μm at spatial scales ranging from tens of meters to tens of kilometers, are consistent with a surface covered by a howardite-like regolith containing various proportions of eucrite and diogenite at different locations [9, 10]. Diogenite shows up in localized regions and mostly occurs in the southern polar region within the Rheasilvia impact basin [10]. Lithologies other than HEDs were also revealed by VIR spectra at the local scale. Olivine-rich deposits have been detected in the Bellicia and Arruntia craters as well as in a limited number of other sites [11, 12, 13], while a large number of bright [14] and dark units [15, 16, 17] overlie Vesta's surface. Spectrally distinct, eucrite-rich ejecta have been observed at Oppia and Octavia, interpreted to be glassy impact melt [18, 19]. VIR spectral analysis highlights a shallow 2.8-μm band diagnostic of OH-bearing phases [20]. This feature is mostly associated with dark material, which is believed to be exogenous material delivered by carbonaceous asteroids over time [16]. However, OH-rich material has also been found to correlate with moderate-reflectance material, such as small spectrally distinct patches in the Oppia region [18] and a large circular area including the Bellicia and Arruntia craters [21]. Compared to dark units, bright material units have a much weaker 2.8-μm band [14]. These units are interpreted to have an endogenous origin, therefore representing the relatively uncontaminated indigenous basaltic soil of Vesta [16]. [1] Russell, C.T., Raymond, C.A., Space Sci. Rev. 163, 3-23, 2011. [2] Russell, C.T. et al., Science, 336, 684, 2012. [3] Sierks, H. et al., Space Sci. Rev. 163, 263-327, 2011. [4] De Sanctis et al., Space Sci. Rev. 163, 329-369, 2011. [5] Prettyman, T.H. et al., Space Sci. Rev. 163, 371-459, 2011. [6] McCord, Science 168, 1445-1447, 1970. [7] Drake, University of Arizona Press, 1979. [8] Feierberg and Drake, Science 209, 805-807, 1980. [9] De Sanctis et al., Science 336, 697-700, 2012. [10] Ammannito, E. et al., Maps 48, 2185-2198, 2013. [11] Ammannito, E. et al., Nature 504, 112-125, 2013. [12] Ruesch, O. et al., JGR 119-9, 2078-2108, 2014. [13] Palomba et al., Icarus 258, 120-134, 2015. [14] Zambon et al., Icarus 240, 73-85, 2014. [15] Jaumann et al., Science, 336 (6082), 687, 2012. [16] McCord et al., Nature 491, 83-86, 2012. [17] Palomba et al., Icarus 240, 58-72, 2014. 
[18] Le Corre, et al., Icarus, 226-2, 1568-1594, 2013. [19] Tosi et al., Icarus 259, 129-149, 2015. [20] De Sanctis et al., Astrophys. J. Lett. 758, L36, 2012. [21] Combe et al., Icarus 259, 53-71, 2015

  10. Performance characterization of UV science cameras developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    NASA Astrophysics Data System (ADS)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.

    2014-07-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-α and to detect the Hanle effect in the line core. Due to the nature of Lyman-α polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5 and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera: dark current, read noise, camera gain and residual non-linearity.
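
    As an illustration of how such camera parameters are typically estimated, the sketch below computes gain and read noise from pairs of flat-field and bias frames using the standard mean-variance (photon transfer) relations. It is a generic recipe under the assumption of NumPy arrays in ADU, not the CLASP team's actual characterization pipeline.

      # Generic photon-transfer estimate of gain (e-/ADU) and read noise (e- rms)
      # from two flat-field frames and two bias frames of identical exposure.
      import numpy as np

      def gain_and_read_noise(flat1, flat2, bias1, bias2):
          flat_diff = flat1.astype(float) - flat2.astype(float)
          bias_diff = bias1.astype(float) - bias2.astype(float)
          # Mean signal above bias, in ADU.
          signal = 0.5 * (flat1.mean() + flat2.mean()) - 0.5 * (bias1.mean() + bias2.mean())
          var_flat = flat_diff.var() / 2.0   # single-frame flat variance, ADU^2
          var_bias = bias_diff.var() / 2.0   # single-frame bias variance, ADU^2
          gain = signal / (var_flat - var_bias)   # e-/ADU from shot-noise statistics
          read_noise = gain * np.sqrt(var_bias)   # electrons rms
          return gain, read_noise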

  11. Effects of frame rate and image resolution on pulse rate measured using multiple camera imaging photoplethysmography

    NASA Astrophysics Data System (ADS)

    Blackford, Ethan B.; Estepp, Justin R.

    2015-03-01

    Non-contact, imaging photoplethysmography uses cameras to facilitate measurements including pulse rate, pulse rate variability, respiration rate, and blood perfusion by measuring characteristic changes in light absorption at the skin's surface resulting from changes in blood volume in the superficial microvasculature. Several factors may affect the accuracy of the physiological measurement, including imager frame rate, resolution, compression, lighting conditions, image background, participant skin tone, and participant motion. Before this method can gain wider use outside basic research settings, its constraints and capabilities must be well understood. Recently, we presented a novel approach utilizing a synchronized, nine-camera, semicircular array backed by measurement of an electrocardiogram and fingertip reflectance photoplethysmogram. Twenty-five individuals participated in six, five-minute, controlled head motion artifact trials in front of a black and dynamic color backdrop. Increasing the input channel space for blind source separation using the camera array was effective in mitigating error from head motion artifact. Herein we present the effects of lower frame rates at 60 and 30 (reduced from 120) frames per second and reduced image resolution at 329x246 pixels (one-quarter of the original 658x492 pixel resolution) using bilinear and zero-order downsampling. This is the first time these factors have been examined for a multiple-imager array, and the results align well with previous findings utilizing a single imager. Examining windowed pulse rates, there is little observable difference in mean absolute error or error distributions resulting from reduced frame rates or image resolution, thus lowering requirements for systems measuring pulse rate over sufficiently long time windows.
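
    The two image-reduction schemes compared in the study can be reproduced with a short sketch, assuming OpenCV; "zero-order" downsampling corresponds to nearest-neighbour interpolation, and frame-rate reduction is simple frame decimation. Parameter values are illustrative.

      # Sketch of the downsampling and frame-rate-reduction operations compared above.
      import cv2

      def downsample(frame, scale=0.5, mode="bilinear"):
          interp = cv2.INTER_LINEAR if mode == "bilinear" else cv2.INTER_NEAREST
          return cv2.resize(frame, None, fx=scale, fy=scale, interpolation=interp)

      def decimate(frames, keep_every=2):
          """Emulate a lower frame rate (e.g. 120 -> 60 fps with keep_every=2)."""
          return frames[::keep_every]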

  12. Optical fringe-reflection deflectometry with bundle adjustment

    NASA Astrophysics Data System (ADS)

    Xiao, Yong-Liang; Li, Sikun; Zhang, Qican; Zhong, Jianxin; Su, Xianyu; You, Zhisheng

    2018-06-01

    Liquid crystal display (LCD) screens are located outside of the camera's field of view in fringe-reflection deflectometry. Therefore, fringes displayed on the LCD screen are captured through specular reflection by a fixed camera. Thus, the pose calibration between the camera and LCD screen is one of the main challenges in fringe-reflection deflectometry. A markerless planar mirror is used to reflect the LCD screen more than three times, and the fringes are mapped into the fixed camera. The geometrical calibration can be accomplished by estimating the pose between the camera and the virtual image of the fringes. Considering the relation between their poses, the incidence and reflection rays can be unified in the camera frame, and a forward triangulation intersection can be performed in the camera frame to measure three-dimensional (3D) coordinates of the specular surface. In the final optimization, a constraint-bundle adjustment is performed to simultaneously refine the camera intrinsic parameters, including distortion coefficients, the estimated geometrical pose between the LCD screen and camera, and the 3D coordinates of the specular surface, with the help of the absolute phase collinear constraint. Simulation and experiment results demonstrate that the pose calibration with planar mirror reflection is simple and feasible, and that the constraint-bundle adjustment can enhance the 3D coordinate measurement accuracy in fringe-reflection deflectometry.
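
    The forward triangulation step mentioned above reduces to intersecting two rays expressed in the camera frame. A generic NumPy sketch of the closest-approach (midpoint) solution is shown below; the paper's pipeline additionally refines the result with the phase-constrained bundle adjustment, which is not reproduced here.

      # Closest-approach intersection of two rays (origin + direction) in one frame.
      import numpy as np

      def triangulate_rays(o1, d1, o2, d2):
          d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
          b = o2 - o1
          a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
          det = a11 * a22 - a12 * a12          # near zero only for parallel rays
          t1 = (a22 * (d1 @ b) - a12 * (d2 @ b)) / det
          t2 = (a12 * (d1 @ b) - a11 * (d2 @ b)) / det
          p1, p2 = o1 + t1 * d1, o2 + t2 * d2
          return 0.5 * (p1 + p2)               # midpoint of the closest approach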

  13. Synchronization of video recording and laser pulses including background light suppression

    NASA Technical Reports Server (NTRS)

    Kalshoven, Jr., James E. (Inventor); Tierney, Jr., Michael (Inventor); Dabney, Philip W. (Inventor)

    2004-01-01

    An apparatus for and a method of triggering a pulsed light source, in particular a laser light source, for predictable capture of the source by video equipment. A frame synchronization signal is derived from the video signal of a camera to trigger the laser and position the resulting laser light pulse in the appropriate field of the video frame and during the opening of the electronic shutter, if such a shutter is included in the camera. Positioning of the laser pulse in the proper video field allows, after recording, for the viewing of the laser light image with a video monitor using the pause mode on a standard cassette-type VCR. This invention also allows for fine positioning of the laser pulse to fall within the electronic shutter opening. For cameras with externally controllable electronic shutters, the invention provides for background light suppression by increasing shutter speed during the frame in which the laser light image is captured. This results in the laser light appearing in one frame in which the background scene is suppressed with the laser light being unaffected, while in all other frames the shutter speed is slower, allowing for the normal recording of the background scene. This invention also allows for arbitrary (manual or external) triggering of the laser with full video synchronization and background light suppression.

  14. A New Hyperspectral Camera Designed for Small UAS Tested in Real World Applications

    NASA Astrophysics Data System (ADS)

    Marcucci, E.; Saiet, E., II; Hatfield, M. C.

    2014-12-01

    The ability to investigate landscape and vegetation from airborne instruments offers many advantages, including high resolution data, the ability to deploy instruments over a specific area, and repeat measurements. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) has recently integrated a hyperspectral imaging camera onto its Ptarmigan hexacopter. The Rikola Hyperspectral Camera, manufactured by VTT and Rikola, Ltd., is capable of obtaining data within the 400-950 nm range with an accuracy of ~1 nm. Using the compact flash on the UAV limits the maximum number of channels to 24 this summer. The camera uses a single frame to sequentially record the spectral bands of interest in a 37° field-of-view. Because the camera collects data as single frames, it takes a finite amount of time to compile the complete spectrum. Although each frame takes only 5 nanoseconds, co-registration of frames is still required. The hovering ability of the hexacopter helps eliminate frame shift. GPS records data for incorporation into a larger dataset. Conservatively, the Ptarmigan can fly at an altitude of 400 feet, for 15 minutes, and 7000 feet away from the operator. The airborne hyperspectral instrument will be extremely useful to scientists as a platform that can provide data on request. Since the spectral range of the camera is ideal for the study of vegetation, this study 1) examines seasonal changes of vegetation in the Fairbanks area, 2) ground-truths satellite measurements, and 3) ties vegetation conditions around a weather tower to the tower readings. Through this proof of concept, ACUASI provides a means for scientists to request the most up-to-date and location-specific data for their field sites. Additionally, the resolution of the airborne instruments is much higher than that of satellite data; they may be readily tasked, and they have the advantage over manned flights in terms of manpower and cost.

  15. A simple demonstration when studying the equivalence principle

    NASA Astrophysics Data System (ADS)

    Mayer, Valery; Varaksina, Ekaterina

    2016-06-01

    The paper proposes a lecture experiment that can be demonstrated when studying the equivalence principle formulated by Albert Einstein. The demonstration consists of creating stroboscopic photographs of a ball moving along a parabola in Earth's gravitational field. In the first experiment, a camera is stationary relative to Earth's surface. In the second, the camera falls freely downwards with the ball, allowing students to see that the ball moves uniformly and rectilinearly relative to the frame of reference of the freely falling camera. The equivalence principle explains this result, as it is always possible to propose an inertial frame of reference for a small region of a gravitational field, where space-time effects of curvature are negligible.

  16. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    USDA-ARS?s Scientific Manuscript database

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  17. Image system for three dimensional, 360 DEGREE, time sequence surface mapping of moving objects

    DOEpatents

    Lu, Shin-Yee

    1998-01-01

    A three-dimensional motion camera system comprises a light projector placed between two synchronous video cameras all focused on an object-of-interest. The light projector shines a sharp pattern of vertical lines (Ronchi ruling) on the object-of-interest that appear to be bent differently to each camera by virtue of the surface shape of the object-of-interest and the relative geometry of the cameras, light projector and object-of-interest. Each video frame is captured in a computer memory and analyzed. Since the relative geometry is known and the system pre-calibrated, the unknown three-dimensional shape of the object-of-interest can be solved for by matching the intersections of the projected light lines with orthogonal epipolar lines corresponding to horizontal rows in the video camera frames. A surface reconstruction is made and displayed on a monitor screen. For 360° all around coverage of the object-of-interest, two additional sets of light projectors and corresponding cameras are distributed about 120° apart from one another.

  18. Image system for three dimensional, 360°, time sequence surface mapping of moving objects

    DOEpatents

    Lu, S.Y.

    1998-12-22

    A three-dimensional motion camera system comprises a light projector placed between two synchronous video cameras all focused on an object-of-interest. The light projector shines a sharp pattern of vertical lines (Ronchi ruling) on the object-of-interest that appear to be bent differently to each camera by virtue of the surface shape of the object-of-interest and the relative geometry of the cameras, light projector and object-of-interest. Each video frame is captured in a computer memory and analyzed. Since the relative geometry is known and the system pre-calibrated, the unknown three-dimensional shape of the object-of-interest can be solved for by matching the intersections of the projected light lines with orthogonal epipolar lines corresponding to horizontal rows in the video camera frames. A surface reconstruction is made and displayed on a monitor screen. For 360° all around coverage of the object-of-interest, two additional sets of light projectors and corresponding cameras are distributed about 120° apart from one another. 20 figs.

  19. 3D kinematic measurement of human movement using low cost fish-eye cameras

    NASA Astrophysics Data System (ADS)

    Islam, Atiqul; Asikuzzaman, Md.; Garratt, Matthew A.; Pickering, Mark R.

    2017-02-01

    3D motion capture is difficult when the capturing is performed in an outdoor environment without controlled surroundings. In this paper, we propose a new approach that uses two ordinary cameras arranged in a special stereoscopic configuration and passive markers on a subject's body to reconstruct the motion of the subject. First, for each frame of the video, an adaptive thresholding algorithm is applied to extract the markers on the subject's body. Once the markers are extracted, an algorithm for matching corresponding markers in each frame is applied. Zhang's planar calibration method is used to calibrate the two cameras. As the cameras use fisheye lenses, they cannot be modeled well by a pinhole camera model, which makes it difficult to estimate depth information. In this work, to restore the 3D coordinates we use a unique calibration method for fisheye lenses. The accuracy of the 3D coordinate reconstruction is evaluated by comparing with results from a commercially available Vicon motion capture system.
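
    A planar (chessboard) fisheye calibration of the kind described can be sketched with OpenCV's cv2.fisheye module as below; the board geometry, square size and flags are assumptions for illustration, and the paper's own fisheye model and marker-based workflow may differ.

      # Fisheye calibration from chessboard images (Zhang-style planar targets).
      import cv2
      import numpy as np

      def calibrate_fisheye(image_paths, board=(9, 6), square=0.025):
          objp = np.zeros((1, board[0] * board[1], 3), np.float64)
          objp[0, :, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square
          obj_pts, img_pts, size = [], [], None
          for path in image_paths:
              gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
              size = gray.shape[::-1]                      # (width, height)
              found, corners = cv2.findChessboardCorners(gray, board)
              if found:
                  obj_pts.append(objp)
                  img_pts.append(corners.reshape(1, -1, 2).astype(np.float64))
          K = np.zeros((3, 3))
          D = np.zeros((4, 1))
          flags = cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW
          rms, K, D, _, _ = cv2.fisheye.calibrate(obj_pts, img_pts, size, K, D,
                                                  None, None, flags)
          return rms, K, D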

  20. Ultrahigh- and high-speed photography, videography, and photonics '91; Proceedings of the Meeting, San Diego, CA, July 24-26, 1991

    NASA Astrophysics Data System (ADS)

    Jaanimagi, Paul A.

    1992-01-01

    This volume presents papers grouped under the topics of advances in streak and framing camera technology, applications of ultrahigh-speed photography, characterizing high-speed instrumentation, high-speed electronic imaging technology and applications, new technology for high-speed photography, high-speed imaging and photonics in detonics, and high-speed velocimetry. The papers presented include those on a subpicosecond X-ray streak camera, photocathodes for the ultrasoft X-ray region, streak tube dynamic range, high-speed TV cameras for streak tube readout, femtosecond light-in-flight holography, and electrooptical systems characterization techniques. Attention is also given to high-speed electronic memory video recording techniques, high-speed IR imaging of repetitive events using a standard RS-170 imager, use of a CCD array as a medium-speed streak camera, the photography of shock waves in explosive crystals, a single-frame camera based on the type LD-S-10 intensifier tube, and jitter diagnosis for pico- and femtosecond sources.

  1. Evolution of Occator Crater on (1) Ceres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nathues, A.; Platz, T.; Thangjam, G.

    2017-03-01

    The dwarf planet Ceres (diameter 939 km) is the largest object in the main asteroid belt. Recent investigations suggest that Ceres is a thermally evolved, volatile-rich body with potential geological activity, a body which was never completely molten but possibly differentiated into a rocky core and an ice-rich mantle, and which may contain remnant internal liquid water. Thermal alteration and exogenic material infall contribute to producing a (dark) carbonaceous chondritic-like surface containing ammoniated phyllosilicates. Here we report imaging and spectroscopic analyses of Occator crater derived from the Framing Camera and the Visible and Infrared Spectrometer onboard Dawn. We found that the central bright spot (Cerealia Facula) of Occator is ∼30 Myr younger than the crater itself. The central spot is located in a central pit which contains a dome that is spectrally homogenous, exhibiting absorption features that are consistent with carbonates. Multiple radial fractures across the dome indicate an extrusive formation process. Our results lead us to conclude that the floor region was subject to past endogenic activity. The dome and the bright material in its vicinity likely formed due to a long-lasting, periodic, or episodic ascent of bright material from a subsurface reservoir rich in carbonates. Originally triggered by an impact event, gases, possibly dissolved from a subsurface water/brine layer, enabled material rich in carbonates to ascend through fractures and be deposited onto the surface.

  2. System selects framing rate for spectrograph camera

    NASA Technical Reports Server (NTRS)

    1965-01-01

    A circuit in the spectrograph monitor uses zero-order light, reflected to a photomultiplier from the incoming radiation, to provide an error signal which controls the advance and drive rate of the film through the camera.

  3. Comet Wild 2 Up Close and Personal

    NASA Technical Reports Server (NTRS)

    2004-01-01

    On January 2, 2004 NASA's Stardust spacecraft made a close flyby of comet Wild 2 (pronounced 'Vilt-2'). Among the equipment the spacecraft carried on board was a navigation camera. This is the 34th of the 72 images taken by Stardust's navigation camera during close encounter. The exposure time was 10 milliseconds. The two frames are actually from a single exposure. The frame on the left depicts the comet as the human eye would see it. The frame on the right depicts the same image but 'stretched' so that the faint jets emanating from Wild 2 can be plainly seen. Comet Wild 2 is about five kilometers (3.1 miles) in diameter.

  4. Data rate enhancement of optical camera communications by compensating inter-frame gaps

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy Thong; Park, Youngil

    2017-07-01

    Optical camera communications (OCC) is a convenient way of transmitting data between LED lamps and the image sensors that are included in most smart devices. Although many schemes have been suggested to increase the data rate of OCC systems, it is still much lower than that of photodiode-based LiFi systems. One major reason for this low data rate is the inter-frame gap (IFG) of the image sensor system, that is, the time gap between consecutive image frames. In this paper, we propose a way to compensate for this IFG efficiently using an interleaved Hamming coding scheme. The proposed scheme is implemented and its performance is measured.
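
    The idea of interleaving forward-error-correction codewords so that symbols lost during an inter-frame gap are spread across many codewords can be illustrated with a toy Hamming(7,4) encoder and block interleaver, as sketched below; the code rate and interleaving depth used in the paper are not reproduced here.

      # Toy Hamming(7,4) encoding plus block interleaving for burst (IFG) protection.
      import numpy as np

      G = np.array([[1, 0, 0, 0, 1, 1, 0],
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])   # systematic Hamming(7,4) generator

      def encode(bits):
          """Encode a bit array (length a multiple of 4) into 7-bit codewords."""
          blocks = np.asarray(bits).reshape(-1, 4)
          return (blocks @ G) % 2              # one codeword per row

      def interleave(codewords, depth=7):
          """Write row-wise into `depth` columns, read column-wise, so a contiguous
          burst of lost symbols is spread across many different codewords."""
          flat = codewords.reshape(-1)
          rows = -(-flat.size // depth)        # ceiling division
          buf = np.zeros(rows * depth, dtype=int)
          buf[:flat.size] = flat
          return buf.reshape(rows, depth).T.reshape(-1)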

  5. An Automatic Portable Telecine Camera.

    DTIC Science & Technology

    1978-08-01

    five television frames to achieve synchronous operation, that is about 0.2 second. 6.3 Video recorder noise immunity The synchronisation pulse separator...display is filmed by a modified 16 mm cine camera driven by a control unit in which the camera supply voltage is derived from the field synchronisation ...pulses of the video signal. Automatic synchronisation of the camera mechanism is achieved over a wide range of television field frequencies and the

  6. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  7. LabVIEW Graphical User Interface for a New High Sensitivity, High Resolution Micro-Angio-Fluoroscopic and ROI-CBCT System

    PubMed Central

    Keleshis, C; Ionita, CN; Yadava, G; Patel, V; Bednarek, DR; Hoffmann, KR; Verevkin, A; Rudin, S

    2008-01-01

    A graphical user interface based on LabVIEW software was developed to enable clinical evaluation of a new High-Sensitivity Micro-Angio-Fluoroscopic (HSMAF) system for real-time acquisition, display and rapid frame transfer of high-resolution region-of-interest images. The HSMAF detector consists of a CsI(Tl) phosphor, a light image intensifier (LII), and a fiber-optic taper coupled to a progressive scan, frame-transfer, charge-coupled device (CCD) camera which provides real-time 12-bit, 1k × 1k images capable of greater than 10 lp/mm resolution. Images can be captured in continuous or triggered mode, and the camera can be programmed by a computer using Camera Link serial communication. A graphical user interface was developed to control the camera modes such as gain and pixel binning as well as to acquire, store, display, and process the images. The program, written in LabVIEW, has the following capabilities: camera initialization, synchronized image acquisition with the x-ray pulses, roadmap and digital subtraction angiography (DSA) acquisition, flat field correction, brightness and contrast control, last frame hold in fluoroscopy, looped playback of the acquired images in angiography, recursive temporal filtering and LII gain control. Frame rates can be up to 30 fps in full-resolution mode. The user-friendly implementation of the interface along with the high frame rate acquisition and display for this unique high-resolution detector should provide angiographers and interventionalists with a new capability for visualizing details of small vessels and endovascular devices such as stents and hence enable more accurate diagnoses and image guided interventions. (Support: NIH Grants R01NS43924, R01EB002873) PMID:18836570

  8. LabVIEW Graphical User Interface for a New High Sensitivity, High Resolution Micro-Angio-Fluoroscopic and ROI-CBCT System.

    PubMed

    Keleshis, C; Ionita, Cn; Yadava, G; Patel, V; Bednarek, Dr; Hoffmann, Kr; Verevkin, A; Rudin, S

    2008-01-01

    A graphical user interface based on LabVIEW software was developed to enable clinical evaluation of a new High-Sensitivity Micro-Angio-Fluoroscopic (HSMAF) system for real-time acquisition, display and rapid frame transfer of high-resolution region-of-interest images. The HSMAF detector consists of a CsI(Tl) phosphor, a light image intensifier (LII), and a fiber-optic taper coupled to a progressive scan, frame-transfer, charge-coupled device (CCD) camera which provides real-time 12-bit, 1k × 1k images capable of greater than 10 lp/mm resolution. Images can be captured in continuous or triggered mode, and the camera can be programmed by a computer using Camera Link serial communication. A graphical user interface was developed to control the camera modes such as gain and pixel binning as well as to acquire, store, display, and process the images. The program, written in LabVIEW, has the following capabilities: camera initialization, synchronized image acquisition with the x-ray pulses, roadmap and digital subtraction angiography (DSA) acquisition, flat field correction, brightness and contrast control, last frame hold in fluoroscopy, looped playback of the acquired images in angiography, recursive temporal filtering and LII gain control. Frame rates can be up to 30 fps in full-resolution mode. The user-friendly implementation of the interface along with the high frame rate acquisition and display for this unique high-resolution detector should provide angiographers and interventionalists with a new capability for visualizing details of small vessels and endovascular devices such as stents and hence enable more accurate diagnoses and image guided interventions. (Support: NIH Grants R01NS43924, R01EB002873).

  9. Proposed patient motion monitoring system using feature point tracking with a web camera.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Matsuura, Takaaki; Yamada, Kiyoshi; Nagata, Yasushi

    2017-12-01

    Patient motion monitoring systems play an important role in providing accurate treatment dose delivery. We propose a system that utilizes a web camera (frame rate up to 30 fps, maximum resolution of 640 × 480 pixels) and in-house image processing software (developed using Microsoft Visual C++ and OpenCV). This system is simple to use and convenient to set up. The pyramidal Lucas-Kanade method was applied to calculate the motion of each feature point by analysing two consecutive frames. The image processing software employs a color scheme in which the defined feature points are blue under stable (no movement) conditions and turn red, along with a warning message and an audio signal (beeping alarm), for large patient movements. The initial position of each marker was used by the program to determine the marker positions in all subsequent frames. The software generates a text file that contains the calculated motion for each frame and saves the video as a compressed audio video interleave (AVI) file. We proposed a patient motion monitoring system using a web camera, which is simple and convenient to set up, to increase the safety of treatment delivery.
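
    A stripped-down version of this tracking loop can be written with OpenCV's pyramidal Lucas-Kanade routine, as sketched below; the window size, pyramid depth and alarm threshold are illustrative values, not the authors' clinical settings.

      # Track user-selected feature points with pyramidal Lucas-Kanade optical flow
      # and warn when any point drifts too far from its initial position.
      import cv2
      import numpy as np

      lk_params = dict(winSize=(21, 21), maxLevel=3,
                       criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

      def monitor(points, threshold_px=10.0):
          cap = cv2.VideoCapture(0)                     # web camera
          ok, prev = cap.read()
          prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
          pts = np.float32(points).reshape(-1, 1, 2)
          initial = pts.copy()
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None, **lk_params)
              motion = np.linalg.norm(pts - initial, axis=2).max()
              if motion > threshold_px:
                  print("WARNING: patient movement of", motion, "px")  # beep / red marker in the GUI
              prev_gray = gray
          cap.release()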

  10. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP)

    NASA Technical Reports Server (NTRS)

    Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1% in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1% polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30%) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 ± 0.5 and ≤ 1.0% residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.

  11. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter

    NASA Technical Reports Server (NTRS)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, D.; Beabout, B.; Stewart, M.

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512x512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with ≤ 10 e-/pixel/second dark current, ≤ 25 e- read noise, a gain of 2.0 +/- 0.5 and ≤ 1.0 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; dark current, read noise, camera gain and residual non-linearity.

  12. Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

    Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as digital binning (similar in concept to on-sensor binning, but done in post-processing), row-wise digital binning of the signal in adjacent pixels, and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 μs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
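
    One plausible reading of the row-wise digital binning step (summing groups of 8 adjacent pixels along each image row in post-processing) is sketched below with NumPy; the bin size is the value quoted in the abstract, while the orientation convention is an assumption.

      # Post-processing "digital binning": sum groups of adjacent pixels along each row.
      import numpy as np

      def bin_along_rows(image, bin_size=8):
          h, w = image.shape
          w_trim = (w // bin_size) * bin_size          # drop any leftover columns
          return image[:, :w_trim].reshape(h, w_trim // bin_size, bin_size).sum(axis=2)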

  13. Inexpensive Neutron Imaging Cameras Using CCDs for Astronomy

    NASA Astrophysics Data System (ADS)

    Hewat, A. W.

    We have developed inexpensive neutron imaging cameras using CCDs originally designed for amateur astronomical observation. The low-light, high-resolution requirements of such CCDs are similar to those for neutron imaging, except that noise as well as cost is reduced by using slower read-out electronics. For example, we use the same 2048x2048 pixel "Kodak" KAI-4022 CCD as used in the high performance PCO-2000 CCD camera, but our electronics requires ∼5 sec for full-frame read-out, ten times slower than the PCO-2000. Since neutron exposures also require several seconds, this is not seen as a serious disadvantage for many applications. If higher frame rates are needed, the CCD unit on our camera can be easily swapped for a faster readout detector with similar chip size and resolution, such as the PCO-2000 or the sCMOS PCO.edge 4.2.

  14. Layered/Pancake-like Ejecta on Ceres: Inferring the Composition and Mechanical Properties of the Cerean Surface through Modeling of Ejecta Emplacement

    NASA Astrophysics Data System (ADS)

    Hughson, K.; Russell, C. T.; Schmidt, B. E.; Chilton, H.; Scully, J. E. C.; Sizemore, H. G.; Byrne, S.; Platz, T.; Raymond, C. A.

    2017-12-01

    During the Survey, High Altitude Mapping Orbit, and Low Altitude Mapping Orbit phases of the primary mission, Dawn's Framing Camera observed a multitude of globally distributed lobate deposits. These flows were broadly interpreted as either similar to ice-cored/ice-cemented flows (Type 1 flows) on Earth and Mars, long run-out terrestrial or martian landslides (Type 2 flows), or highly mobile fluidized ejecta-like deposits (Type 3 flows) (Buczkowski et al., 2016; Schmidt et al., 2017). The Type 3 flows are morphologically similar to layered/pancake ejecta found on Mars and Ganymede, where they are thought to be caused by impacts into ground-ice-rich substrates (Mouginis-Mark, 1979; Boyce et al., 2010). We assess the effects of target material strength, sliding friction, and vapor entrainment on the production of these features by comparing the ejecta mobility (EM: the ratio of the radius of the ejecta blanket to the radius of the parent crater) values for all Type 3 cerean flows to a ballistic/kinematic sliding model similar to the one developed by Weiss et al. (2014) to model EM for impacts into a variety of ground-ice-rich substrates of differing volatile content on Mars. Initial results suggest that, in order for these features to form, the cerean surface requires a large coefficient of sliding friction (>0.1) and that significant amounts of water be vaporized during impact. However, the model does not tightly constrain the strength of the target material (best-fit values range from granite-like to unconsolidated-sand-like). These results are consistent with a largely dry, rough, and thin surface layer underlain by material rich in pore-filling ground ice, even at low latitudes. Additionally, before the Fall Meeting we will attempt to constrain the thickness of the ice-poor surface layer. This will be done through a combined analysis of model results and morphometric parameters of individual Type 3 flows. Future implementation of this model will further incorporate compositional and geophysical knowledge attained from Dawn in order to better constrain the strength of the cerean surface.

  15. Three-dimensional spectral analysis of compositional heterogeneity at Arruntia crater on (4) Vesta using Dawn FC

    NASA Astrophysics Data System (ADS)

    Thangjam, Guneshwar; Nathues, Andreas; Mengel, Kurt; Schäfer, Michael; Hoffmann, Martin; Cloutis, Edward A.; Mann, Paul; Müller, Christian; Platz, Thomas; Schäfer, Tanja

    2016-03-01

    We introduce an innovative three-dimensional spectral approach (a three-band parameter space with polyhedrons) that can be used for both qualitative and quantitative analyses, improving the characterization of the surface compositional heterogeneity of (4) Vesta. It is an advanced and more robust methodology compared to the standard two-dimensional spectral approach (two-band parameter space). The Dawn Framing Camera (FC) color data obtained during the High Altitude Mapping Orbit (resolution ∼ 60 m/pixel) are used. The main focus is on the howardite-eucrite-diogenite (HED) lithologies containing carbonaceous chondritic material, olivine, and impact melt. The archived spectra of HEDs and their mixtures, from the RELAB, HOSERLab and USGS databases, as well as our laboratory-measured spectra, are used for this study. Three-dimensional convex polyhedrons are defined using computed band parameter values of laboratory spectra. Polyhedrons based on the parameters of Band Tilt (R0.92μm/R0.96μm), Mid Ratio ((R0.75μm/R0.83μm)/(R0.83μm/R0.92μm)) and reflectance at 0.55 μm (R0.55μm) are chosen for the present analysis. An algorithm in the IDL programming language is employed to assign FC data points to the respective polyhedrons. The Arruntia region in the northern hemisphere of Vesta is selected for a case study because of its geological and mineralogical importance. We observe that this region is eucrite-dominated howarditic in composition. The extent of olivine-rich exposures within an area of 2.5 crater radii is ∼12% larger than the previous finding (Thangjam, G. et al. [2014]. Meteorit. Planet. Sci. 49, 1831-1850). Lithologies of nearly pure CM2-chondrite, olivine, glass, and diogenite are not found in this region. Although there are no unambiguous spectral features of impact melt, the investigation of morphological features using FC clear filter data from the Low Altitude Mapping Orbit (resolution ∼ 18 m/pixel) suggests potential impact-melt features inside and outside of the crater. Our spectral approach can be extended to the entire Vestan surface to study the heterogeneous surface composition and its geology.
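
    The assignment of FC band-parameter triplets to laboratory-defined convex polyhedrons can be emulated in Python with a Delaunay-based inside test, as sketched below; this stands in for the authors' IDL implementation and assumes SciPy is available.

      # Assign (Band Tilt, Mid Ratio, R0.55um) triplets to convex polyhedrons built
      # from laboratory spectra, using a point-in-convex-hull test via Delaunay.
      import numpy as np
      from scipy.spatial import Delaunay

      def build_polyhedron(lab_points):
          """lab_points: (N, 3) band-parameter values of laboratory spectra."""
          return Delaunay(np.asarray(lab_points, dtype=float))

      def classify(fc_points, polyhedrons):
          """polyhedrons: dict mapping lithology name -> Delaunay object."""
          fc_points = np.asarray(fc_points, dtype=float)
          labels = np.array(["unclassified"] * len(fc_points), dtype=object)
          for name, hull in polyhedrons.items():
              inside = hull.find_simplex(fc_points) >= 0   # -1 means outside
              labels[inside & (labels == "unclassified")] = name
          return labels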

  16. Geochemical Mapping of 4 Vesta Begins

    NASA Technical Reports Server (NTRS)

    Prettyman, Thomas H.; Feldman, William C.; Forni, Olivier; Joy, Steven P.; Lawrence, David J.; LeCorre, Lucille; Mafi, Joseph N.; McCord, Thomas B.; McCoy, Timothy J.; McSween, Harry Y.; hide

    2011-01-01

    By December, the NASA Dawn spacecraft will have descended to a low altitude mapping orbit (LAMO), where the Gamma Ray and Neutron Detector (GRaND) will acquire global mapping data for up to four months. Measurements by GRaND will help answer elusive questions about how Vesta differentiated and the nature of processes that shaped Vesta's surface. The data will be analyzed to determine the abundances of Mg, Si, Fe, K, Th, and H at a spatial resolution of roughly 300 km full-width-at-half-maximum from a 465 km radius orbit. Thermal and fast neutron counting data will be analyzed to determine the neutron macroscopic absorption cross section and average atomic mass, providing constraints on additional elements, such as Ca and Al. GRaND will quantify the elemental composition of coarse spatial units identified by Dawn's Framing Camera (FC) and the Visible & Infrared Spectrometer (VIR). In addition, GRaND will map the mixing ratio of compositional end members selected from the howardite, eucrite and diogenite (HED) meteorites, determine the relative proportions of plagioclase and mafic minerals, and search for compositions that are absent or under-represented in the meteorite collection. While it is generally thought that Vesta's crust on a regional scale should be well-represented by linear mixing of HED whole-rock compositions, there are hints that Vesta may be more diverse than implied by this model. For example, the discovery of K-rich impact glasses in howardites suggests that K-rich rocks may be present on a portion of Vesta's surface, and the analysis of diogenites indicates considerable variability in the magmatic processes that formed them. The chemical composition of materials within Vesta's south polar structure may provide further clues to how it formed. An impact might have exposed mantle and lower crustal materials, which should have a distinctive compositional signature. We present the analysis of data acquired by GRaND from cruise through the descent to LAMO, including GRaND's sensitivity to different elements and geochemical processes.

  17. A passive terahertz video camera based on lumped element kinetic inductance detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Sam, E-mail: sam.rowe@astro.cf.ac.uk; Pascale, Enzo; Doyle, Simon

    We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)—designed originally for far-infrared astronomy—as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ∼0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics.

  18. Rapid orthophoto development system.

    DOT National Transportation Integrated Search

    2013-06-01

    The DMC system procured in the project represented state-of-the-art, large-format digital aerial camera systems at the start of the project. DMC is based on the frame camera model, and to achieve large ground coverage with high spatial resolution, the ...

  19. Geological Mapping of the Ac-H-12 Toharu Quadrangle of Ceres from NASA Dawn Mission

    NASA Astrophysics Data System (ADS)

    Mest, Scott; Williams, David; Crown, David; Yingst, Aileen; Buczkowski, Debra; Scully, Jennifer; Jaumann, Ralf; Roatsch, Thomas; Preusker, Frank; Nathues, Andres; Hoffmann, Martin; Schaefer, Michael; Raymond, Carol; Russell, Christopher

    2016-04-01

    The Dawn Science Team is conducting a geologic mapping campaign for Ceres similar to that done for Vesta [1,2], including production of a Survey- and High Altitude Mapping Orbit (HAMO)-based global map and a series of 15 Low Altitude Mapping Orbit (LAMO)-based quadrangle maps. In this abstract we discuss the surface geology and geologic evolution of the Ac-H-12 Toharu Quadrangle (21-66°S, 90-180°E). At the time of this writing, LAMO images (35 m/pixel) are just becoming available. The current geologic map of Ac-H-12 was produced using ArcGIS software, and is based on HAMO images (140 m/pixel) and Survey (400 m/pixel) digital terrain models (for topographic information). Dawn Framing Camera (FC) color images were also used to provide context for map unit identification. The map (to be presented as a poster) will be updated from analyses of LAMO images. The Toharu Quadrangle is named after the crater Toharu (86 km diameter; 48.3°S, 156°E), and is dominated by smooth terrain in the north and more heavily cratered terrain in the south. The quad exhibits ~9 km of relief, with the highest elevations (~3.5-4.6 km) found among the western plateau and eastern crater rims, and the lowest elevation found on the floor of crater Chaminuka. Preliminary geologic mapping has defined three regional units (smooth material, smooth Kerwan floor material, and cratered terrain) that dominate the quadrangle, as well as a series of impact crater material units. Smooth materials form nearly flat-lying plains in the northwest part of the quad and overlie hummocky materials in some areas. These smooth materials extend over a much broader area outside of the quad, and appear to contain some of the lowest crater densities on Ceres. Cratered terrain forms much of the map area and contains rugged surfaces formed largely by the structures and deposits of impact features. In addition to geologic units, a number of geologic features - including crater rims, furrows, scarps, troughs, and impact crater chains - have been mapped. The Toharu Quadrangle predominantly displays impact craters that exhibit a range of sizes - from the limits of resolution to part of the Kerwan basin (280 km diameter) - and preservation styles. The quad also contains a number of large (>20 km across) depressions that are only observable in the topographic data. Smaller craters (<40 km) generally appear morphologically "fresh", and their rims are nearly circular and raised above the surrounding terrain. Larger craters, such as Toharu, appear more degraded, exhibiting irregularly shaped, sometimes scalloped, rim structures, and debris lobes on their floors. Numerous craters (>20 km) contain central mounds; at current FC resolution, it is difficult to discern if these are primary structures (i.e., central peaks) or secondary features. Support of the Dawn Instrument, Operations, & Science Teams is acknowledged. This work is supported by grants from NASA, DLR and MPG. References: [1] Williams D.A. et al. (2014) Icarus, 244, 1-12. [2] Yingst R.A. et al. (2014) PSS, 103, 2-23.

  20. Dawn: Testing Paradigms by Exploring Dichotomies

    NASA Astrophysics Data System (ADS)

    Russell, C. T.; Schmidt, B. E.; Wise, J.; Ristvey, J.; Raymond, C. A.

    2010-12-01

    NASA’s Dawn mission represents a series of “firsts” for major NASA missions. Dawn is the first major NASA science mission to use ion propulsion engines, allowing Dawn to be the first mission to orbit one target and then leave its gravity well to explore a second destination. Dawn is the first science mission to the main asteroid belt, reaching protoplanet Vesta in summer 2011, and will be the first mission to reach a “dwarf planet” when it arrives at Ceres in 2015. By targeting both Vesta and Ceres, Dawn explores two intriguing dichotomies in the solar system: that between the dry rocky planets and the wet icy bodies (Fire and Ice), and that between planets and asteroids. Is there a clear dividing line here? Vesta, the second most massive asteroid, is a protoplanet: a round, mostly intact asteroid that bears more resemblance to a planet than to smaller asteroids. Vesta is also the likely parent body of the HED meteorites that richly populate Earth’s meteorite collections. It is possible to hold a piece of Vesta in your hands. From the HED meteorites, scientists have learned that Vesta is one of the few differentiated asteroids. And from its spectrum, rich in basaltic minerals, it is known to be much like a mini-version of Earth’s Moon and Mercury. Vesta’s surface once was home to floods of lava not unlike those still found today on Earth. Vesta is very similar to a terrestrial planet. Ceres is the giant of the asteroid belt with a hydrostatic shape that earns it a dwarf planet classification. Like its larger cousins, Ceres’ round shape suggests that the body may be differentiated, but due to its low density, Ceres’ interior is more like that of an icy moon of Jupiter. Beneath a relatively thin clay veneer probably lies an ice-rich mantle and rocky core, and even possibly a liquid ocean. With such enticing questions posed for Vesta and Ceres, Dawn will enable scientists and the public alike to explore how planets were born, how fire and ice have shaped the solar system, and have a chance to push the boundaries of our own classification system. Dawn’s set of instrumentation, with cameras, a visible and infrared spectrometer, a gamma ray and neutron detector and radio science, will produce a wealth of information about two previously unexplored, diverse and yet somehow familiar worlds. Communication of the lessons learned by Dawn from the scientists to the public has occurred and will continue to occur through a range of interfaces, including a series of online activities such as Find a Meteorite, Clickworkers and a simulation of an ion engine. Other activities include Dawn “Science of the Day” archives, fun family activities and games as well as classroom materials and outreach events. Since the two bodies are the brightest sources in the main belt, an integral part of Dawn’s journey has been the integration of amateur and “backyard” astronomers. All these activities allow us to share the science with the public. Dawn arrives at Vesta in the middle of the Year of the Solar System in July 2011 and will depart for Ceres as the YSS ends.

  1. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
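
    The block-matching step used to speed up motion estimation can be illustrated with a minimal exhaustive sum-of-absolute-differences search, as sketched below; block size and search range are illustrative, and the CS recovery itself is not shown.

      # Exhaustive block-match motion estimation: for each block of the current
      # frame, find the best-matching block (minimum SAD) in a small search window
      # of the reference frame.
      import numpy as np

      def block_match(ref, cur, block=8, search=4):
          h, w = cur.shape
          motion = np.zeros((h // block, w // block, 2), dtype=int)
          for by in range(0, h - block + 1, block):
              for bx in range(0, w - block + 1, block):
                  target = cur[by:by + block, bx:bx + block].astype(int)
                  best, best_dv = np.inf, (0, 0)
                  for dy in range(-search, search + 1):
                      for dx in range(-search, search + 1):
                          y, x = by + dy, bx + dx
                          if 0 <= y <= h - block and 0 <= x <= w - block:
                              cand = ref[y:y + block, x:x + block].astype(int)
                              sad = np.abs(cand - target).sum()
                              if sad < best:
                                  best, best_dv = sad, (dy, dx)
                  motion[by // block, bx // block] = best_dv
          return motion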

  2. Multithreaded hybrid feature tracking for markerless augmented reality.

    PubMed

    Lee, Taehee; Höllerer, Tobias

    2009-01-01

    We describe a novel markerless camera tracking approach and user interaction methodology for augmented reality (AR) on unprepared tabletop environments. We propose a real-time system architecture that combines two types of feature tracking. Distinctive image features of the scene are detected and tracked frame-to-frame by computing optical flow. In order to achieve real-time performance, multiple operations are processed in a synchronized multi-threaded manner: capturing a video frame, tracking features using optical flow, detecting distinctive invariant features, and rendering an output frame. We also introduce user interaction methodology for establishing a global coordinate system and for placing virtual objects in the AR environment by tracking a user's outstretched hand and estimating a camera pose relative to it. We evaluate the speed and accuracy of our hybrid feature tracking approach, and demonstrate a proof-of-concept application for enabling AR in unprepared tabletop environments, using bare hands for interaction.

  3. Comet Wild 2 Up Close and Personal

    NASA Image and Video Library

    2004-01-02

    On January 2, 2004 NASA's Stardust spacecraft made a close flyby of comet Wild 2 (pronounced "Vilt-2"). Among the equipment the spacecraft carried on board was a navigation camera. This is the 34th of the 72 images taken by Stardust's navigation camera during close encounter. The exposure time was 10 milliseconds. The two frames are actually from a single exposure. The frame on the left depicts the comet as the human eye would see it. The frame on the right depicts the same image but "stretched" so that the faint jets emanating from Wild 2 can be plainly seen. Comet Wild 2 is about five kilometers (3.1 miles) in diameter. http://photojournal.jpl.nasa.gov/catalog/PIA05571

  4. Ultra-fast high-resolution hybrid and monolithic CMOS imagers in multi-frame radiography

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Kris; Douence, Vincent; Bai, Yibin; Nedrow, Paul; Mariam, Fesseha; Merrill, Frank; Morris, Christopher L.; Saunders, Andy

    2014-09-01

    A new burst-mode, 10-frame, hybrid Si-sensor/CMOS-ROIC FPA chip has been recently fabricated at Teledyne Imaging Sensors. The intended primary use of the sensor is in the multi-frame 800 MeV proton radiography at LANL. The basic part of the hybrid is a large (48×49 mm²) stitched CMOS chip of 1100×1100 pixel count, with a minimum shutter speed of 50 ns. The performance parameters of this chip are compared to the first generation 3-frame 0.5-Mpixel custom hybrid imager. The 3-frame cameras have been in continuous use for many years, in a variety of static and dynamic experiments at LANSCE. The cameras can operate with a per-frame adjustable integration time of ~120 ns to 1 s, and an inter-frame time of 250 ns to 2 s. Given the 80 ms total readout time, the original and the new imagers can be externally synchronized to 0.1-to-5 Hz, 50-ns wide proton beam pulses, and record up to ~1000-frame radiographic movies, typically of 3-to-30 minute duration. The performance of the global electronic shutter is discussed and compared to that of a high-resolution commercial front-illuminated monolithic CMOS imager.

  5. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution addresses a problem that often arises in vision-based control, when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera and some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, and this demands the use of a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower and therefore cheaper camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and make it more robust, some LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
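    The multi-rate structure described above can be sketched as a predict-every-step, correct-on-frame observer. The double-integrator model, 1 kHz loop, 50 fps camera, and hand-tuned gain below are assumptions for illustration, not values from the paper.

```python
import numpy as np

dt = 0.001                                   # 1 kHz control loop
A = np.array([[1.0, dt], [0.0, 1.0]])        # position/velocity (double integrator)
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])                   # the camera measures position only
L = np.array([[0.4], [8.0]])                 # observer gain (hand-tuned for the sketch)
frames_every = 20                            # 50 fps camera vs. 1 kHz loop

x_true = np.array([[0.5], [0.0]])            # simulated plant state
x_hat = np.zeros((2, 1))                     # observer estimate

for k in range(2000):
    u = np.array([[1.0]])                    # known control input at the fast rate
    x_true = A @ x_true + B @ u              # simulated plant
    x_hat = A @ x_hat + B @ u                # observer prediction every control step
    if k % frames_every == 0:                # a new camera frame is available
        y = C @ x_true + 0.001 * np.random.randn(1, 1)  # noisy position from image processing
        x_hat = x_hat + L @ (y - C @ x_hat)  # correction only at the slow frame rate

print("final estimation error:", (x_true - x_hat).ravel())
```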

  6. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to the Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on the high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on the lower-resolution Hazcam for Navcam-to-Hazcam handoff.
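    For orientation only, a highly idealized version of the pan/tilt pointing computation is sketched below, assuming the camera frame is parallel to the masthead base frame and the optical axis passes through the image center; the paper's closed-form solutions exist precisely because neither assumption holds in practice. The axis convention (x forward, y left, z up) is also an assumption.

```python
import numpy as np

def pan_tilt_to_target(target_xyz):
    """Pan/tilt (degrees) to aim the boresight at a target given in the masthead base frame."""
    x, y, z = target_xyz                      # assumed convention: x forward, y left, z up
    pan = np.arctan2(y, x)                    # rotate about the vertical axis
    tilt = np.arctan2(z, np.hypot(x, y))      # then elevate toward the target
    return np.degrees(pan), np.degrees(tilt)

# e.g. a target 4 m ahead, 1 m to the left, 0.5 m below the masthead
print(pan_tilt_to_target((4.0, 1.0, -0.5)))
```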

  7. Evaluation of sequential images for photogrammetric point determination

    NASA Astrophysics Data System (ADS)

    Kowalczyk, M.

    2011-12-01

    Close-range photogrammetry encounters many problems in reconstructing the three-dimensional shape of objects. The relative orientation parameters of the photos usually play the key role in solving this problem. Automating the process is difficult because of the complexity of the recorded scene and the configuration of camera positions, which usually makes it impossible to join the photos into one set automatically. Using a camcorder is a solution widely proposed in the literature to support the creation of 3D models. Its main advantages are the large number of recorded images and camera positions, and the fact that the exterior orientation barely changes between two neighboring frames. These properties of a film sequence make it possible to create models with basic algorithms that work faster and more robustly than with separately taken photos. The first part of this paper presents the results of experiments determining the interior orientation parameters of several sets of frames showing a three-dimensional test field. This section describes the calibration repeatability of film frames taken from a camcorder, which is important for the stability of the camera's interior geometric parameters. A parametric model of systematic errors was applied to correct the images. Afterwards, a short film of the same test field was taken to determine a group of check points, in order to verify the camera's suitability for measurement tasks. Finally, some results of experiments comparing the determination of recorded object points in 3D space are presented. In conventional digital photogrammetry, where separate photos are used, the first levels of the image pyramids are processed with feature-based matching. This complicated process creates many contingencies that can produce false detections of image similarities. With a digital film camera, this risky step can be avoided by going straight to area-based matching, exploiting the high degree of similarity between two corresponding film frames. A first approximation for establishing connections between photos comes from a whole-image distance measure. This image-distance method can work with more than just the two dimensions of the translation vector; scale and angles are also used to improve image matching. This operation creates more similar-looking frames in which corresponding characteristic points lie close to each other, so the procedure searching for pairs of points works faster and more accurately, because the analyzed areas can be reduced. Another proposed solution, based on an image created by adding the differences between particular frames, gives rougher results but works much faster than standard matching.

  8. Final Report for the Advanced Camera for Surveys (ACS) from Ball Aerospace and Technologies Corporation

    NASA Technical Reports Server (NTRS)

    Volmer, Paul; Sullivan, Pam (Technical Monitor)

    2003-01-01

    The Advanced Camera for Surveys (ACS) was launched aboard the Space Shuttle Columbia just before dawn on March 1, 2002. After the shuttle successfully docked with the Hubble Space Telescope (HST), several components were replaced. One of the components was the Advanced Camera for Surveys built by Ball Aerospace & Technologies Corp. (BATC) in Boulder, Colorado. Over the life of the HST contract at BATC, hundreds of employees had the pleasure of working on the concept, design, fabrication, assembly and test of ACS. Those employees thank NASA - Goddard Space Flight Center and the science team at Johns Hopkins University (JHU) for the opportunity to participate in building a great science instrument for HST. After installation in HST, a mini-functional test was performed, followed later by a complete functional test. ACS performed well and has continued performing well since then. One of the greatest rewards for the BATC employees is a satisfied science team. Following is an excerpt from the JHU final report: "The foremost promise of ACS was to increase Hubble's capability for surveys in the near infrared by a factor of 10. That promise was kept."

  9. Behavior of Compact Toroid Injected into C-2U Confinement Vessel

    NASA Astrophysics Data System (ADS)

    Matsumoto, Tadafumi; Roche, T.; Allrey, I.; Sekiguchi, J.; Asai, T.; Conroy, M.; Gota, H.; Granstedt, E.; Hooper, C.; Kinley, J.; Valentine, T.; Waggoner, W.; Binderbauer, M.; Tajima, T.; the TAE Team

    2016-10-01

    The compact toroid (CT) injector system has been developed for particle refueling on the C-2U device. A CT is formed by a magnetized coaxial plasma gun (MCPG), and the typical ejected CT/plasmoid parameters are as follows: average velocity 100 km/s, average electron density 1.9 × 10¹⁵ cm⁻³, electron temperature 30-40 eV, mass 12 μg. To refuel particles into the FRC plasma, the CT must penetrate the transverse magnetic field that surrounds the FRC. The kinetic energy density of the CT should be higher than the magnetic energy density of the axial magnetic field, i.e., ρv²/2 ≥ B²/2μ₀, where ρ, v, and B are the mass density, velocity, and surrounding magnetic field, respectively. Also, the penetrated CT's trajectory is deflected by the transverse magnetic field (Bz ~ 1 kG). Thus, we have to estimate the CT's energy and track the CT trajectory inside the magnetic field, for which we adopted a fast-framing camera on C-2U: the framing rate is up to 1.25 MHz for 120 frames. By employing the camera we clearly captured the CT/plasmoid trajectory. Comparisons between the fast-framing camera and some other diagnostics as well as CT injection results on C-2U will be presented.
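    The quoted penetration condition ρv²/2 ≥ B²/2μ₀ can be checked with the parameters listed above; the ion species (a quasi-neutral deuterium plasma) assumed for the mass density is not stated in the abstract.

```python
import numpy as np

mu0 = 4e-7 * np.pi            # vacuum permeability [H/m]
m_ion = 3.34e-27              # deuteron mass [kg] (assumed species)
n_e = 1.9e15 * 1e6            # electron density [m^-3], from 1.9e15 cm^-3
rho = n_e * m_ion             # mass density, assuming a quasi-neutral deuterium plasma
v = 100e3                     # CT velocity [m/s]
B = 0.1                       # transverse field [T] (~1 kG)

kinetic = 0.5 * rho * v**2    # ~3e4 J/m^3
magnetic = B**2 / (2 * mu0)   # ~4e3 J/m^3
print(kinetic, magnetic, kinetic >= magnetic)
```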

  10. Infrared Imaging Camera Final Report CRADA No. TC02061.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roos, E. V.; Nebeker, S.

    This was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and Cordin Company (Cordin) to enhance the U.S. ability to develop a commercial infrared camera capable of capturing high-resolution images in a 100-nanosecond (ns) time frame. The Department of Energy (DOE), under an Initiative for Proliferation Prevention (IPP) project, funded the Russian Federation Nuclear Center All-Russian Scientific Institute of Experimental Physics (RFNC-VNIIEF) in Sarov. VNIIEF was funded to develop a prototype commercial infrared (IR) framing camera and to deliver a prototype IR camera to LLNL. LLNL and Cordin were partners with VNIIEF on this project. A prototype IR camera was delivered by VNIIEF to LLNL in December 2006. In June of 2007, LLNL and Cordin evaluated the camera and the test results revealed that the camera exceeded presently available commercial IR cameras. Cordin believes that the camera can be sold on the international market. The camera is currently being used as a scientific tool within Russian nuclear centers. This project was originally designated as a two year project. The project was not started on time due to changes in the IPP project funding conditions; the project funding was re-directed through the International Science and Technology Center (ISTC), which delayed the project start by over one year. The project was not completed on schedule due to changes within the Russian government export regulations. These changes were directed by Export Control regulations on the export of high technology items that can be used to develop military weapons. The IR camera was on the list that export controls required. The ISTC and Russian government, after negotiations, allowed the delivery of the camera to LLNL. There were no significant technical or business changes to the original project.

  11. Full-Frame Reference for Test Photo of Moon

    NASA Image and Video Library

    2005-09-10

    This pair of views shows how little of the full image frame was taken up by the Moon in test images taken Sept. 8, 2005, by the High Resolution Imaging Science Experiment HiRISE camera on NASA Mars Reconnaissance Orbiter.

  12. Estimating pixel variances in the scenes of staring sensors

    DOEpatents

    Simonson, Katherine M [Cedar Crest, NM; Ma, Tian J [Albuquerque, NM

    2012-01-24

    A technique for detecting changes in a scene perceived by a staring sensor is disclosed. The technique includes acquiring a reference image frame and a current image frame of a scene with the staring sensor. A raw difference frame is generated based upon differences between the reference image frame and the current image frame. Pixel error estimates are generated for each pixel in the raw difference frame based at least in part upon spatial error estimates related to spatial intensity gradients in the scene. The pixel error estimates are used to mitigate effects of camera jitter in the scene between the current image frame and the reference image frame.
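    A minimal sketch of the idea in this abstract: form the raw difference frame and compare it against a per-pixel error estimate driven by local spatial intensity gradients, so that jitter-induced differences near strong edges are discounted. The specific error model, jitter amplitude, and threshold below are illustrative rather than the patent's exact formulation.

```python
import numpy as np

def change_mask(reference, current, jitter_px=0.5, noise_sigma=2.0, k=4.0):
    ref = reference.astype(np.float64)
    cur = current.astype(np.float64)
    diff = cur - ref                                  # raw difference frame
    gy, gx = np.gradient(ref)                         # spatial intensity gradients
    grad_mag = np.hypot(gx, gy)
    # per-pixel error estimate: sensor noise plus the change a small jitter could cause
    pixel_err = np.sqrt(noise_sigma**2 + (jitter_px * grad_mag)**2)
    return np.abs(diff) > k * pixel_err               # pixels flagged as significant change
```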

  13. Visual Odometry Based on Structural Matching of Local Invariant Features Using Stereo Camera Sensor

    PubMed Central

    Núñez, Pedro; Vázquez-Martín, Ricardo; Bandera, Antonio

    2011-01-01

    This paper describes a novel sensor system to estimate the motion of a stereo camera. Local invariant image features are matched between pairs of frames and linked into image trajectories at video rate, providing the so-called visual odometry, i.e., motion estimates from visual input alone. Our proposal conducts two matching sessions: the first one between sets of features associated with the images of the stereo pairs and the second one between sets of features associated with consecutive frames. With respect to previously proposed approaches, the main novelty of this proposal is that both matching sessions are conducted by means of a fast matching algorithm which combines absolute and relative feature constraints. Finding the largest-valued set of mutually consistent matches is equivalent to finding the maximum-weighted clique on a graph. The stereo matching allows the scene view to be represented as a graph which emerges from the features of the accepted clique. On the other hand, the frame-to-frame matching defines a graph whose vertices are features in 3D space. The efficiency of the approach is increased by minimizing the geometric and algebraic errors to estimate the final displacement of the stereo camera between consecutive acquired frames. The proposed approach has been tested for mobile robotics navigation purposes in real environments and using different features. Experimental results demonstrate the performance of the proposal, which could be applied in both industrial and service robot fields. PMID:22164016
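    The "mutually closest pairs" criterion commonly used to seed this kind of matching can be sketched as follows; the clique-based consistency filtering described in the paper would then run on top of these candidate matches and is not reproduced here.

```python
import numpy as np

def mutual_matches(desc_a, desc_b):
    """desc_a: (N, D), desc_b: (M, D) feature descriptors; returns index pairs (i, j)."""
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)  # (N, M) distances
    a_to_b = d.argmin(axis=1)      # nearest b for each a
    b_to_a = d.argmin(axis=0)      # nearest a for each b
    return [(i, j) for i, j in enumerate(a_to_b) if b_to_a[j] == i]
```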

  14. Slow Speed--Fast Motion: Time-Lapse Recordings in Physics Education

    ERIC Educational Resources Information Center

    Vollmer, Michael; Möllmann, Klaus-Peter

    2018-01-01

    Video analysis with a 30 Hz frame rate is the standard tool in physics education. The development of affordable high-speed cameras has extended the capabilities of the tool to much smaller time scales, down to the 1 ms range, using frame rates of typically up to 1000 frames s⁻¹, allowing us to study transient physics phenomena happening…

  15. Study of atmospheric discharge characteristics with a standard video camera

    NASA Astrophysics Data System (ADS)

    Ferraz, E. C.; Saba, M. M. F.

    This study presents some preliminary statistics on lightning characteristics such as flash multiplicity, number of ground contact points, formation of new and altered channels, and the presence of continuous current in the strokes that form the flash. The analysis is based on the images of a standard video camera (30 frames s⁻¹). The results obtained for some flashes will be compared to the images of a high-speed CCD camera (1000 frames s⁻¹). The camera observing site is located in São José dos Campos (23° S, 46° W) at an altitude of 630 m. This observational site has a nearly 360° field of view at a height of 25 m. It is possible to visualize distant thunderstorms occurring within a radius of 25 km from the site. The room, situated over a metal structure, has water and power supplies, a telephone line and a small crane on the roof. KEY WORDS: Video images, Lightning, Multiplicity, Stroke.

  16. HIGH SPEED KERR CELL FRAMING CAMERA

    DOEpatents

    Goss, W.C.; Gilley, L.F.

    1964-01-01

    The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 × 10⁻⁸ seconds during any such time interval of an occurring event. The invention utilizes particularly an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length in whole multiples of the first channel optical path length, into which optical paths the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)
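    The numbers implied by the patent text can be checked directly: six frames within 9 × 10⁻⁸ s correspond to roughly 1.5 × 10⁻⁸ s between frames, so each delay channel must add about c·Δt ≈ 4.5 m of optical path. Equal spacing of the frames over the window and propagation at the vacuum speed of light are assumptions of this sketch.

```python
c = 2.998e8                         # speed of light [m/s]
n_frames = 6
window = 9e-8                       # total recording window [s]
dt = window / n_frames              # ~1.5e-8 s between frames (equal spacing assumed)
step = c * dt                       # extra optical path per delay channel, ~4.5 m
paths = [i * step for i in range(n_frames)]
print(dt, step, paths)
```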

  17. Frames of Reference in the Classroom

    NASA Astrophysics Data System (ADS)

    Grossman, Joshua

    2012-12-01

    The classic film "Frames of Reference"1,2 effectively illustrates concepts involved with inertial and non-inertial reference frames. In it, Donald G. Ivey and Patterson Hume use the camera's perspective to allow the viewer to see motion in reference frames translating with a constant velocity, translating while accelerating, and rotating, all with respect to the Earth frame. The film is a classic for good reason, but today it does have a couple of drawbacks: 1) The film by nature only accommodates passive learning. It does not give students the opportunity to try any of the experiments themselves. 2) The dated style of the 50-year-old film can distract students from the physics content. I present here a simple setup that can recreate many of the movie's demonstrations in the classroom. The demonstrations can be used to supplement the movie or in its place, if desired. All of the materials except perhaps the inexpensive web camera should likely be available already in most teaching laboratories. Unlike previously described activities, these experiments do not require travel to another location3 or an involved setup.4,5

  18. A Remotely Operated Observatory for Minor Planet Photometry

    NASA Astrophysics Data System (ADS)

    Ditteon, Richard

    2008-05-01

    In October of 2007 Rose-Hulman Institute of Technology in Terre Haute, Indiana began operating the Oakley Southern Sky Observatory (E09) located near Siding Spring Observatory in New South Wales, Australia. The observatory houses a 0.5-m, f/8.4 Ritchey-Chretien telescope mounted on a Paramount ME, German equatorial mount. Attached to the telescope is an STL-1001E CCD camera which has 1024 by 1024, 24 µm pixels, a two-stage thermoelectric cooler, and built in color filter wheel with BVRI and clear filters. Image scale is 1.2 arcseconds per pixel. A cloud sensor is used to monitor sky conditions. The observatory has a roll-off roof with limit switches to detect when the roof is fully open and fully closed. In addition, a switch has been added to the mount to detect when the telescope is parked and that it is safe to open or close the roof. All of the hardware is controlled by a custom program which reads a simple text file containing the sequence of images and targets to be collected each night. The text file is loaded onto the control computer once each day, then the software waits until sunset to determine if the sky is clear. When conditions are favorable, power is turned on, the roof opens, twilight flats, dark and bias frames are recorded, and when it is fully dark data frames are recorded. Images are transferred via the Internet back to Rose-Hulman by another program running in the background. The observatory closes itself before dawn or if it gets cloudy. Currently we are using the observatory for photometry of minor planets. Students are responsible for selecting targets, processing the returned images, determining the period and light curve of each minor planet and writing a paper for publication. Recent results will be presented.

  19. Calibration of asynchronous smart phone cameras from moving objects

    NASA Astrophysics Data System (ADS)

    Hagen, Oksana; Istenič, Klemen; Bharti, Vibhav; Dhali, Maruf Ahmed; Barmaimon, Daniel; Houssineau, Jérémie; Clark, Daniel

    2015-04-01

    Calibrating multiple cameras is a fundamental prerequisite for many Computer Vision applications. Typically this involves using a pair of identical synchronized industrial or high-end consumer cameras. This paper considers an application on a pair of low-cost portable cameras with different parameters that are found in smart phones. This paper addresses the issues of acquisition, detection of moving objects, dynamic camera registration and tracking of arbitrary number of targets. The acquisition of data is performed using two standard smart phone cameras and later processed using detections of moving objects in the scene. The registration of cameras onto the same world reference frame is performed using a recently developed method for camera calibration using a disparity space parameterisation and the single-cluster PHD filter.

  20. Camera Trajectory from Wide Baseline Images

    NASA Astrophysics Data System (ADS)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits real-time operation to low frame rate sequences. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling, as suggested in prior work, to draw 5-tuples from the list of tentative matches ordered ascendingly by the distance of their descriptors, which may help to reduce the number of samples in RANSAC. From each 5-tuple, relative orientation is computed by solving the 5-point minimal relative orientation problem for calibrated cameras. Often, there are several models which are supported by a large number of matches. Thus the chance that the correct model, even if it has the largest support, will be found by running a single RANSAC is small. Prior work suggested generating models by randomized sampling as in RANSAC but using soft (kernel) voting for a parameter instead of looking for the maximal support. The best model is then selected as the one with the parameter closest to the maximum in the accumulator space. In our case, we vote in a two-dimensional accumulator for the estimated camera motion direction. However, unlike that work, we do not cast votes directly by each sampled epipolar geometry but by the best epipolar geometries recovered by ordered sampling of RANSAC. With our technique, we could handle up to 98.5% contamination by mismatches with effort comparable to what simple RANSAC requires for 84% contamination. The relative camera orientation with the motion direction closest to the maximum in the voting space is finally selected. As already mentioned in the first paragraph, the use of camera trajectory estimates is quite wide. In earlier work we introduced a technique for measuring the size of the camera translation relative to the observed scene which uses the dominant apical angle computed at the reconstructed scene points and is robust against mismatches. The experiments demonstrated that the measure can be used to improve the robustness of camera path computation and object recognition for methods which use a geometric constraint, e.g. the ground plane, such as for the detection of pedestrians. Using the camera trajectories, perspective cutouts with stabilized horizon are constructed and an arbitrary object recognition routine designed to work with images acquired by perspective cameras can be used without any further modifications.
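    The two-parameter calibration model quoted above, θ = ar/(1 + br²), can be used to turn an image point into a 3D optical ray in the camera frame, which is the input to the 5-point relative orientation step. A minimal sketch, in which the principal point and the a, b values are placeholders to be replaced by the off-line calibration results:

```python
import numpy as np

def image_point_to_ray(u, v, cx=800.0, cy=800.0, a=1.2e-3, b=-1.0e-7):
    """Map an image point (u, v) to a unit ray in the camera frame; cx, cy, a, b are placeholders."""
    du, dv = u - cx, v - cy
    r = np.hypot(du, dv)                    # radius of the image point
    theta = a * r / (1.0 + b * r**2)        # angle w.r.t. the optical axis
    phi = np.arctan2(dv, du)                # azimuth around the optical axis
    ray = np.array([np.sin(theta) * np.cos(phi),
                    np.sin(theta) * np.sin(phi),
                    np.cos(theta)])
    return ray / np.linalg.norm(ray)
```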

  1. High speed imaging - An important industrial tool

    NASA Technical Reports Server (NTRS)

    Moore, Alton; Pinelli, Thomas E.

    1986-01-01

    High-speed photography, which is a rapid sequence of photographs that allow an event to be analyzed through the stoppage of motion or the production of slow-motion effects, is examined. In high-speed photography 16, 35, and 70 mm film and framing rates between 64-12,000 frames per second are utilized to measure such factors as angles, velocities, failure points, and deflections. The use of dual timing lamps in high-speed photography and the difficulties encountered with exposure and programming the camera and event are discussed. The application of video cameras to the recording of high-speed events is described.

  2. Dynamic characteristics of far-field radiation of current modulated phase-locked diode laser arrays

    NASA Technical Reports Server (NTRS)

    Elliott, R. A.; Hartnett, K.

    1987-01-01

    A versatile and powerful streak camera/frame grabber system for studying the evolution of the near and far field radiation patterns of diode lasers was assembled and tested. Software needed to analyze and display the data acquired with the streak camera/frame grabber system was written, and the total package was used to record and perform preliminary analyses on the behavior of two types of laser, a ten-emitter gain-guided array and a flared waveguide Y-coupled array. Examples of the information which can be gathered with this system are presented.

  3. One-click scanning of large-size documents using mobile phone camera

    NASA Astrophysics Data System (ADS)

    Liu, Sijiang; Jiang, Bo; Yang, Yuanjie

    2016-07-01

    Currently, mobile apps for document scanning do not provide convenient operations for tackling large-size documents. In this paper, we present a one-click scanning approach for large-size documents using a mobile phone camera. After capturing a continuous video of a document, our approach automatically extracts several key frames by optical flow analysis. Then, based on the key frames, a mobile GPU-based image stitching method is adopted to generate a complete document image with high detail. There is no extra manual intervention in the process, and experimental results show that our app performs well, demonstrating convenience and practicability for daily life.
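    The key-frame extraction step can be sketched as accumulating optical-flow magnitude and emitting a new key frame once enough image motion has passed; the Farneback parameters and the motion threshold below are illustrative, and the selected frames would then be handed to the GPU-based stitcher described above.

```python
import cv2
import numpy as np

def select_key_frames(path, motion_threshold=40.0):
    cap = cv2.VideoCapture(path)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keys, travelled = [frame], 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        travelled += np.median(np.hypot(flow[..., 0], flow[..., 1]))
        if travelled > motion_threshold:     # enough new content since the last key frame
            keys.append(frame)
            travelled = 0.0
        prev = gray
    cap.release()
    return keys
```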

  4. Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

    Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored such as digital binning (similar in concept to on-sensor binning, but done in post-processing), row-wise digital binning of the signal in adjacent pixels and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio primarily because it had the largest pixels.
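    Two of the post-processing ideas above, digital binning and precision defined as the standard deviation of single-shot measurements, can be sketched as follows. The synthetic data and one particular reading of "row-wise binning by 8 pixels" (summing groups of adjacent rows) are assumptions of the sketch.

```python
import numpy as np

def bin_rows(image, factor=8):
    """Sum groups of `factor` adjacent rows (done in post-processing, not on-sensor)."""
    h, w = image.shape
    return image[: h - h % factor].reshape(-1, factor, w).sum(axis=1)

rng = np.random.default_rng(0)
img = rng.poisson(5.0, size=(512, 640)).astype(float)            # synthetic low-signal frame
binned = bin_rows(img)                                           # higher signal per binned row

single_shot_velocities = 20.0 + 0.5 * rng.standard_normal(500)   # synthetic m/s values
precision = single_shot_velocities.std(ddof=1)                   # "precision" as defined above
print(binned.shape, f"precision = {precision:.2f} m/s")
```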

  5. Fast-camera imaging on the W7-X stellarator

    NASA Astrophysics Data System (ADS)

    Ballinger, S. B.; Terry, J. L.; Baek, S. G.; Tang, K.; Grulke, O.

    2017-10-01

    Fast cameras recording in the visible range have been used to study filamentary (``blob'') edge turbulence in tokamak plasmas, revealing that emissive filaments aligned with the magnetic field can propagate perpendicular to it at speeds on the order of 1 km/s in the SOL or private flux region. The motion of these filaments has been studied in several tokamaks, including MAST, NSTX, and Alcator C-Mod. Filaments were also observed in the W7-X Stellarator using fast cameras during its initial run campaign. For W7-X's upcoming 2017-18 run campaign, we have installed a Phantom V710 fast camera with a view of the machine cross section and part of a divertor module in order to continue studying edge and divertor filaments. The view is coupled to the camera via a coherent fiber bundle. The Phantom camera is able to record at up to 400,000 frames per second and has a spatial resolution of roughly 2 cm in the view. A beam-splitter is used to share the view with a slower machine-protection camera. Stepping-motor actuators tilt the beam-splitter about two orthogonal axes, making it possible to frame user-defined sub-regions anywhere within the view. The diagnostic has been prepared to be remotely controlled via MDSplus. The MIT portion of this work is supported by US DOE award DE-SC0014251.

  6. Research on inosculation between master of ceremonies or players and virtual scene in virtual studio

    NASA Astrophysics Data System (ADS)

    Li, Zili; Zhu, Guangxi; Zhu, Yaoting

    2003-04-01

    A technical principle for the construction of a virtual studio is proposed, in which an orientation tracker and a telemeter are used to augment a conventional BETACAM pickup camera and connect it with the software module of the host. A virtual camera model named the Camera & Post-camera Coupling Pair is put forward, which differs from the common model in computer graphics and is bound to the real BETACAM pickup camera for shooting. A formula is derived to compute the foreground frame buffer image and the background frame buffer image of the virtual scene, whose boundary is based on the depth information of the target point of the real BETACAM pickup camera's projective ray. Real-time consistency is achieved between the video image sequences of the master of ceremonies or players and the CG image sequences of the virtual scene in spatial position, perspective relationship and image object masking. The experimental results show that the technological scheme for constructing a virtual studio presented in this paper is feasible, and more practical and effective than the existing approach of establishing a virtual studio based on color-keying and background image synthesis using non-linear video editing techniques.

  7. Development of a table tennis robot for ball interception using visual feedback

    NASA Astrophysics Data System (ADS)

    Parnichkun, Manukid; Thalagoda, Janitha A.

    2016-07-01

    This paper presents a concept for intercepting a moving table tennis ball using a robot. The robot has four degrees of freedom (DOF), which are simplified in such a way that the system is able to perform the task within the bounded limit. It employs computer vision to localize the ball. For ball identification, Colour Based Threshold Segmentation (CBTS) and Background Subtraction (BS) methodologies are used. Coordinate Transformation (CT) is employed to transform the data, which is taken in the camera coordinate frame, to the general coordinate frame. The sensory system consists of two HD web cameras. Because the computation time of image processing from the web cameras is long, it is not possible to intercept a table tennis ball using image processing alone. Therefore, a projectile motion model is employed to predict the final destination of the ball.
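    The prediction step can be sketched with a drag-free projectile model: estimate the ball velocity from two consecutive 3D positions and propagate it to an interception plane. Gravity-only dynamics, the plane location, and the sample numbers are simplifying assumptions of this sketch.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])               # gravity [m/s^2], z up

def predict_at_plane(p0, p1, dt, x_plane):
    """p0, p1: consecutive 3D ball positions [m]; dt: time between them [s]."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    v = (p1 - p0) / dt                        # finite-difference velocity estimate
    if v[0] <= 0:
        return None                           # ball is not travelling toward the robot
    t = (x_plane - p1[0]) / v[0]              # time to reach the interception plane
    return p1 + v * t + 0.5 * G * t**2        # predicted interception point

print(predict_at_plane([0.00, 0.00, 0.30], [0.05, 0.01, 0.31], 1 / 30, 2.0))
```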

  8. Satellite markers: a simple method for ground truth car pose on stereo video

    NASA Astrophysics Data System (ADS)

    Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Pierini, Marco

    2018-04-01

    Predicting the future location of other cars is a must for advanced safety systems. The remote estimation of a car's pose, and particularly its heading angle, is key to predicting its future location. Stereo vision systems make it possible to obtain the 3D information of a scene. Ground truth in this specific context is associated with referential information about the depth, shape and orientation of the objects present in the traffic scene. Creating 3D ground truth is a measurement and data fusion task associated with the combination of different kinds of sensors. The novelty of this paper is a method to generate ground truth car pose from video data alone. When the method is applied to stereo video, it also provides the extrinsic camera parameters for each camera at frame level, which are key to quantifying the performance of a stereo vision system while it is moving, because the system is subjected to undesired vibrations and/or leaning. We developed a video post-processing technique which employs a common camera calibration tool for the 3D ground truth generation. In our case study, we focus on accurate heading angle estimation of a moving car under realistic imagery. As outcomes, our satellite marker method provides accurate car pose at frame level, and the instantaneous spatial orientation of each camera at frame level.

  9. Strategic options towards an affordable high-performance infrared camera

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Dutta, Achyut K.; Lewis, Jay; Dhar, Nibir K.

    2016-05-01

    The promise of infrared (IR) imaging attaining low cost akin to the success of CMOS sensors has been hampered by the inability to achieve the cost advantages necessary for crossover from military and industrial applications into the consumer and mass-scale commercial realm, despite well-documented advantages. Banpil Photonics is developing affordable IR cameras by adopting new strategies to speed up the decline of the IR camera cost curve. We present a new short-wave IR (SWIR) camera: a 640x512 pixel InGaAs uncooled system with high sensitivity and low noise (<50 e-), high dynamic range (100 dB), high frame rates (>500 frames per second (FPS)) at full resolution, and low power consumption (<1 W) in a compact system. This camera paves the way towards mass-market adoption by not only demonstrating the high-performance IR imaging capability demanded by military and industrial applications, but also illuminating a path towards the justifiable price points essential for adoption in consumer-facing industries such as automotive, medical, and security imaging. The strategic options presented include new sensor manufacturing technologies that scale favorably towards automation, multi-focal plane array compatible readout electronics, and dense or ultra-small pixel pitch devices.

  10. Low cost thermal camera for use in preclinical detection of diabetic peripheral neuropathy in primary care setting

    NASA Astrophysics Data System (ADS)

    Joshi, V.; Manivannan, N.; Jarry, Z.; Carmichael, J.; Vahtel, M.; Zamora, G.; Calder, C.; Simon, J.; Burge, M.; Soliz, P.

    2018-02-01

    Diabetic peripheral neuropathy (DPN) accounts for around 73,000 lower-limb amputations annually in the US in patients with diabetes. Early detection of DPN is critical. Current clinical methods for diagnosing DPN are subjective and effective only at later stages. Until recently, thermal cameras used for medical imaging have been expensive and hence prohibitive to install in a primary care setting. The objective of this study is to compare results from a low-cost thermal camera with a high-end thermal camera used in screening for DPN. Thermal imaging has demonstrated changes in microvascular function that correlate with the nerve function affected by DPN. The limitations of low-cost cameras for DPN imaging include lower resolution (active pixels), frame rate, thermal sensitivity, etc. We integrated two FLIR Lepton sensors (80x60 active pixels, 50° HFOV, thermal sensitivity < 50 mK) as one unit. The right and left cameras record videos of the right and left foot, respectively. A compatible embedded system (Raspberry Pi 3 Model B v1.2) is used to configure the sensors and to capture and stream the video via Ethernet. The resulting video has 160x120 active pixels (8 frames/second). We compared the temperature measurements of the feet obtained using the low-cost camera against the gold-standard high-end FLIR SC305. Twelve subjects (aged 35-76) were recruited. The difference in the temperature measurements between cameras was calculated for each subject, and the results show that the difference between the temperature measurements of the two cameras (mean difference = 0.4, p-value = 0.2) is not statistically significant. We conclude that the low-cost thermal camera system shows potential for use in detecting early signs of DPN in under-served and rural clinics.
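    The reported comparison amounts to a paired test of per-subject temperature differences between the two cameras; a minimal sketch with synthetic stand-in data (not the study's measurements) is shown below.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
reference = 30.0 + 1.5 * rng.standard_normal(12)             # FLIR SC305 readings [deg C]
low_cost = reference + 0.4 + 0.8 * rng.standard_normal(12)   # Lepton-based readings

diff = low_cost - reference
t_stat, p_value = stats.ttest_rel(low_cost, reference)       # paired comparison
print(f"mean difference = {diff.mean():.2f} C, p = {p_value:.2f}")
```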

  11. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated object as well as incoherently illuminated objects. Our result shows that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.

  12. Accuracy and precision of a custom camera-based system for 2D and 3D motion tracking during speech and nonspeech motor tasks

    PubMed Central

    Feng, Yongqiang; Max, Ludo

    2014-01-01

    Purpose: Studying normal or disordered motor control requires accurate motion tracking of the effectors (e.g., orofacial structures). The cost of electromagnetic, optoelectronic, and ultrasound systems is prohibitive for many laboratories, and limits clinical applications. For external movements (lips, jaw), video-based systems may be a viable alternative, provided that they offer high temporal resolution and sub-millimeter accuracy. Method: We examined the accuracy and precision of 2D and 3D data recorded with a system that combines consumer-grade digital cameras capturing 60, 120, or 240 frames per second (fps), retro-reflective markers, commercially-available computer software (APAS, Ariel Dynamics), and a custom calibration device. Results: Overall mean error (RMSE) across tests was 0.15 mm for static tracking and 0.26 mm for dynamic tracking, with corresponding precision (SD) values of 0.11 and 0.19 mm, respectively. The effect of frame rate varied across conditions, but, generally, accuracy was reduced at 240 fps. The effect of marker size (3 vs. 6 mm diameter) was negligible at all frame rates for both 2D and 3D data. Conclusion: Motion tracking with consumer-grade digital cameras and the APAS software can achieve sub-millimeter accuracy at frame rates that are appropriate for kinematic analyses of lip/jaw movements for both research and clinical purposes. PMID:24686484

  13. Marker-less multi-frame motion tracking and compensation in PET-brain imaging

    NASA Astrophysics Data System (ADS)

    Lindsay, C.; Mukherjee, J. M.; Johnson, K.; Olivier, P.; Song, X.; Shao, L.; King, M. A.

    2015-03-01

    In PET brain imaging, patient motion can contribute significantly to the degradation of image quality, potentially leading to diagnostic and therapeutic problems. To mitigate the image artifacts resulting from patient motion, motion must be detected and tracked, then provided to a motion correction algorithm. Existing techniques to track patient motion fall into one of two categories: 1) image-derived approaches and 2) external motion tracking (EMT). Typical EMT requires patients to have markers in a known pattern on a rigid tool attached to their head, which are then tracked by expensive and bulky motion tracking camera systems or stereo cameras. This has made marker-based EMT unattractive for routine clinical application. Our main contribution is the development of a marker-less motion tracking system that uses low-cost, small depth-sensing cameras which can be installed in the bore of the imaging system. Our motion tracking system does not require anything to be attached to the patient and can track the rigid transformation (6 degrees of freedom) of the patient's head at a rate of 60 Hz. We show that our method can not only be used with Multi-frame Acquisition (MAF) PET motion correction, but that precise timing can be employed to determine only the frames needed for correction. This can speed up reconstruction by eliminating the unnecessary subdivision of frames.
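    At the core of such a system is the recovery of a 6-DOF rigid transform that maps a reference head point cloud onto the current depth-camera frame. A minimal sketch using the standard SVD (Kabsch) solution is given below; the segmentation, correspondence search, and 60 Hz scheduling described above are not reproduced, and the function name is illustrative.

```python
import numpy as np

def rigid_transform(src, dst):
    """src, dst: (N, 3) corresponding 3D points. Returns R (3x3), t (3,) with dst ~ R @ src + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```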

  14. Relativistic Astronomy

    NASA Astrophysics Data System (ADS)

    Zhang, Bing; Li, Kunyang

    2018-02-01

    The “Breakthrough Starshot” aims at sending near-speed-of-light cameras to nearby stellar systems in the future. Due to the relativistic effects, a transrelativistic camera naturally serves as a spectrograph, a lens, and a wide-field camera. We demonstrate this through a simulation of the optical-band image of the nearby galaxy M51 in the rest frame of the transrelativistic camera. We suggest that observing celestial objects using a transrelativistic camera may allow one to study the astronomical objects in a special way, and to perform unique tests on the principles of special relativity. We outline several examples that suggest transrelativistic cameras may make important contributions to astrophysics and suggest that the Breakthrough Starshot cameras may be launched in any direction to serve as a unique astronomical observatory.
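    The lens and spectrograph behavior follows from relativistic aberration and the Doppler factor; a small worked example at an assumed Starshot-like speed of β = 0.2 (not a value taken from the paper) is shown below.

```python
import numpy as np

beta = 0.2                                    # assumed camera speed, in units of c
gamma = 1.0 / np.sqrt(1.0 - beta**2)

theta = np.radians(60.0)                      # incoming direction in the source rest frame
# aberration: the apparent angle in the camera frame is pulled toward the direction of motion
cos_tp = (np.cos(theta) + beta) / (1.0 + beta * np.cos(theta))
# Doppler factor: observed/emitted frequency ratio for that same ray (blueshift for forward rays)
doppler = gamma * (1.0 + beta * np.cos(theta))
print(np.degrees(np.arccos(cos_tp)), doppler)
```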

  15. Utilizing ISS Camera Systems for Scientific Analysis of Lightning Characteristics and Comparison with ISS-LIS and GLM

    NASA Technical Reports Server (NTRS)

    Schultz, Christopher J.; Lang, Timothy J.; Leake, Skye; Runco, Mario, Jr.; Blakeslee, Richard J.

    2017-01-01

    Video and still frame images from cameras aboard the International Space Station (ISS) are used to inspire, educate, and provide a unique vantage point from low-Earth orbit that is second to none; however, these cameras have overlooked capabilities for contributing to scientific analysis of the Earth and near-space environment. The goal of this project is to study how geo referenced video/images from available ISS camera systems can be useful for scientific analysis, using lightning properties as a demonstration.

  16. Collection and Analysis of Crowd Data with Aerial, Rooftop, and Ground Views

    DTIC Science & Technology

    2014-11-10

    These datasets were collected using different aircraft. The Erista 8 HL OctaCopter is a heavy-lift aerial platform capable of carrying high-resolution cinema-grade cameras. The Blackmagic Production Camera 4K is another high-resolution, cinema-grade camera capable of capturing video at 4K resolution at 30 frames per second, and was used for crowd counting with 4K cameras.

  17. Ground-based remote sensing with long lens video camera for upper-stem diameter and other tree crown measurements

    Treesearch

    Neil A. Clark; Sang-Mook Lee

    2004-01-01

    This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...

  18. A framed, 16-image Kirkpatrick–Baez x-ray microscope

    DOE PAGES

    Marshall, F. J.; Bahr, R. E.; Goncharov, V. N.; ...

    2017-09-08

    A 16-image Kirkpatrick–Baez (KB)–type x-ray microscope consisting of compact KB mirrors has been assembled for the first time with mirrors aligned to allow it to be coupled to a high-speed framing camera. The high-speed framing camera has four independently gated strips whose emission sampling interval is ~30 ps. Images are arranged four to a strip with ~60-ps temporal spacing between frames on a strip. By spacing the timing of the strips, a frame spacing of ~15 ps is achieved. A framed resolution of ~6-um is achieved with this combination in a 400-um region of laser–plasma x-ray emission in the 2- to 8-keV energy range. A principal use of the microscope is to measure the evolution of the implosion stagnation region of cryogenic DT target implosions on the University of Rochester’s OMEGA Laser System. The unprecedented time and spatial resolution achieved with this framed, multi-image KB microscope have made it possible to accurately determine the cryogenic implosion core emission size and shape at the peak of stagnation. In conclusion, these core size measurements, taken in combination with those of ion temperature, neutron-production temporal width, and neutron yield allow for inference of core pressures, currently exceeding 50 GBar in OMEGA cryogenic target implosions.

  19. A framed, 16-image Kirkpatrick–Baez x-ray microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, F. J.; Bahr, R. E.; Goncharov, V. N.

    A 16-image Kirkpatrick–Baez (KB)–type x-ray microscope consisting of compact KB mirrors has been assembled for the first time with mirrors aligned to allow it to be coupled to a high-speed framing camera. The high-speed framing camera has four independently gated strips whose emission sampling interval is ~30 ps. Images are arranged four to a strip with ~60-ps temporal spacing between frames on a strip. By spacing the timing of the strips, a frame spacing of ~15 ps is achieved. A framed resolution of ~6-um is achieved with this combination in a 400-um region of laser–plasma x-ray emission in the 2- to 8-keV energy range. A principal use of the microscope is to measure the evolution of the implosion stagnation region of cryogenic DT target implosions on the University of Rochester’s OMEGA Laser System. The unprecedented time and spatial resolution achieved with this framed, multi-image KB microscope have made it possible to accurately determine the cryogenic implosion core emission size and shape at the peak of stagnation. In conclusion, these core size measurements, taken in combination with those of ion temperature, neutron-production temporal width, and neutron yield allow for inference of core pressures, currently exceeding 50 GBar in OMEGA cryogenic target implosions.

  20. Geological Mapping of the Ac-H-5 Fejokoo Quadrangle of Ceres from NASA's Dawn Mission

    NASA Astrophysics Data System (ADS)

    Hughson, Kynan; Russell, Christopher; Williams, David; Buczkowski, Debra; Mest, Scott; Scully, Jennifer; Kneissl, Thomas; Ruesch, Ottaviano; Frigeri, Alessandro; Combe, Jean-Philippe; Jaumann, Ralf; Roatsch, Thomas; Preusker, Frank; Platz, Thomas; Nathues, Andreas; Hoffmann, Martin; Schaefer, Michael; Park, Ryan; Marchi, Simone; Raymond, Carol

    2016-04-01

    NASA's Dawn spacecraft arrived at Ceres on March 6, 2015, and has been studying the dwarf planet through a series of successively lower orbits, obtaining morphological & topographical image, mineralogical, elemental abundance, and gravity data. Ceres is the largest object in the asteroid belt with a mean diameter of ~950 km. The Dawn Science Team is conducting a geologic mapping campaign for Ceres similar to that done for the asteroid Vesta [1, 2], including production of a Survey- and High Altitude Mapping Orbit (HAMO)-based global map, and a series of 15 Low Altitude Mapping Orbit (LAMO)-based quadrangle maps. In this abstract we present the LAMO-based geologic map of the Ac-H-5 Fejokoo quadrangle (21-66 °N and 270-360 °E) and discuss its geologic evolution. At the time of this writing LAMO images (35 m/pixel) are just becoming available. Thus, our geologic maps are based on HAMO images (~140 m/pixel) and Survey (~400 m/pixel) digital terrain models (for topographic information) [3, 4]. Dawn Framing Camera (FC) color images are also used to provide context for map unit identification. The maps to be presented as posters will be updated from analyses of LAMO images (~35 m/pixel). The Fejokoo quadrangle hosts six primary geologic features: (1) the centrally located, ~80 km diameter, distinctly hexagonal impact crater Fejokoo; (2) Victa crater with its large exterior dark lobate flow feature, and interior lobate and furrowed deposits; (3) Abellio crater, which exhibits a well formed ejecta blanket and has an arcuately textured infilled floor whose morphology is similar to those of homologously sized craters on some of the icy Saturnian satellites [5]; (4) Cozobi crater, whose floor is filled with an unusually bulbous and smooth deposit, thin sheeted multi-lobed flow-like features that are reminiscent of fluidized ejecta as seen on Mars are also observed to be emanating outwards from the N and S rims of this crater [6]; (5) the peculiar Oxo crater on the eastern border whose strange bright ejecta and mysterious "missing block" are unlike anything else seen on Ceres; and (6) the numerous tholi that occupy the central region of the quadrangle. Based on our current geologic mapping of the Fejokoo quadrangle, we have developed the following preliminary geologic history: (I) The background cratered terrain was emplaced and represents the oldest geologic unit in the quadrangle, (II) the growth of tholi and emplacement of undegraded craters Takel, Cozobi, Abellio, Victa, Fejokoo, Dada, and Roskva happened subsequent to the development of the cratered terrain, and (III) most recently Oxo crater and its undisturbed ejecta was emplaced. Before EGU, we will: (i) explore the possibility that the Fejokoo tholi are intrusive structures, (ii) examine the different types of mass wasting in this quadrangle and their relationship to ground ice, and (iii) investigate the morphological and compositional nature of Oxo crater. References: [1] Williams et al. (2014) Icarus. [2] Yingst R.A. et al. (2014) PSS. [3] Roatsch et al. (2015) Planetary and Space Science. [4] Preusker F. et al. (2016) LPSC XXXXVII. [5] Schenk P. M. (1989) JGR. [6] Senft & Stewart (2008) Met. & Planet. Sci.

  1. Extreme ultra-violet movie camera for imaging microsecond time scale magnetic reconnection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, Kil-Byoung; Bellan, Paul M.

    2013-12-15

    An ultra-fast extreme ultra-violet (EUV) movie camera has been developed for imaging magnetic reconnection in the Caltech spheromak/astrophysical jet experiment. The camera consists of a broadband Mo:Si multilayer mirror, a fast decaying YAG:Ce scintillator, a visible light block, and a high-speed visible light CCD camera. The camera can capture EUV images as fast as 3.3 × 10⁶ frames per second with 0.5 cm spatial resolution. The spectral range is from 20 eV to 60 eV. EUV images reveal strong, transient, highly localized bursts of EUV radiation when magnetic reconnection occurs.

  2. Center of parcel with picture tube wall along walkway. Leaning ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Center of parcel with picture tube wall along walkway. Leaning Tower of Bottle Village at frame right; oblique view of Rumpus Room, remnants of Little Hut destroyed by Northridge earthquake at frame left. Camera facing northeast. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  3. KENNEDY SPACE CENTER, FLA. - The camera installed on the aft skirt of a solid rocket booster is seen here, framed by the railing. The installation is in preparation for a vibration test of the Mobile Launcher Platform with SRBs and external tank mounted. The MLP will roll from one bay to another in the Vehicle Assembly Building.

    NASA Image and Video Library

    2003-11-06

    KENNEDY SPACE CENTER, FLA. - The camera installed on the aft skirt of a solid rocket booster is seen here, framed by the railing. The installation is in preparation for a vibration test of the Mobile Launcher Platform with SRBs and external tank mounted. The MLP will roll from one bay to another in the Vehicle Assembly Building.

  4. Pulsed x-ray sources for characterization of gated framing cameras

    NASA Astrophysics Data System (ADS)

    Filip, Catalin V.; Koch, Jeffrey A.; Freeman, Richard R.; King, James A.

    2017-08-01

    Gated X-ray framing cameras are used to measure important characteristics of inertial confinement fusion (ICF) implosions such as size and symmetry, with 50 ps time resolution in two dimensions. A pulsed source of hard (>8 keV) X-rays would be a valuable calibration device, for example for gain-droop measurements of the variation in sensitivity of the gated strips. We have explored the requirements for such a source and a variety of options that could meet these requirements. We find that a small-size dense plasma focus machine could be a practical single-shot X-ray source for this application if timing uncertainties can be overcome.

  5. Characterization of x-ray framing cameras for the National Ignition Facility using single photon pulse height analysis.

    PubMed

    Holder, J P; Benedetti, L R; Bradley, D K

    2016-11-01

    Single hit pulse height analysis is applied to National Ignition Facility x-ray framing cameras to quantify gain and gain variation in a single micro-channel plate-based instrument. This method allows the separation of gain from detectability in these photon-detecting devices. While pulse heights measured by standard-DC calibration methods follow the expected exponential distribution at the limit of a compound-Poisson process, gain-gated pulse heights follow a more complex distribution that may be approximated as a weighted sum of a few exponentials. We can reproduce this behavior with a simple statistical-sampling model.

  6. A Digital Video System for Observing and Recording Occultations

    NASA Astrophysics Data System (ADS)

    Barry, M. A. Tony; Gault, Dave; Pavlov, Hristo; Hanna, William; McEwan, Alistair; Filipović, Miroslav D.

    2015-09-01

    Stellar occultations by asteroids and outer solar system bodies can offer ground-based observers with modest telescopes and camera equipment the opportunity to probe the shape, size, atmosphere, and attendant moons or rings of these distant objects. The essential requirements of the camera and recording equipment are: good quantum efficiency and low noise; minimal dead time between images; good horological faithfulness of the image timestamps; robustness of the recording to unexpected failure; and low cost. We describe an occultation observing and recording system which attempts to fulfil these requirements and compare the system with other reported camera and recorder systems. Five systems have been built, deployed, and tested over the past three years, and we report on three representative occultation observations: one being a 9 ± 1.5 s occultation of the trans-Neptunian object 28978 Ixion (m_v = 15.2) at 3 seconds per frame; one being a 1.51 ± 0.017 s occultation of Deimos, the 12 km diameter satellite of Mars, at 30 frames per second; and one being an 11.04 ± 0.4 s occultation, recorded at 7.5 frames per second, of the main belt asteroid 361 Havnia, representing a low magnitude drop (Δm_v ≈ 0.4) occultation.

  7. Particle displacement tracking applied to air flows

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1991-01-01

    Electronic Particle Image Velocimeter (PIV) techniques offer many advantages over conventional photographic PIV methods, such as fast turnaround times and simplified data reduction. A new all-electronic PIV technique was developed which can measure high-speed gas velocities. The Particle Displacement Tracking (PDT) technique employs a single cw laser, small seed particles (1 micron), and a single intensified, gated CCD array frame camera to provide a simple and fast method of obtaining two-dimensional velocity vector maps with unambiguous direction determination. Use of a single CCD camera eliminates registration difficulties encountered when multiple cameras are used to obtain velocity magnitude and direction information. An 80386 PC equipped with a large memory buffer frame-grabber board provides all of the data acquisition and data reduction operations. No array processors or other numerical processing hardware are required. Full video resolution (640 x 480 pixels) is maintained in the acquired images, providing high resolution video frames of the recorded particle images. The time from data acquisition to display of the velocity vector map is less than 40 s. The new electronic PDT technique is demonstrated on an air nozzle flow with velocities less than 150 m/s.
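
    The core step of any such technique is estimating a particle displacement between two exposures and converting it to a velocity. Below is a minimal numpy sketch of that step using FFT cross-correlation; it is an illustration under assumed parameters (frame interval, magnification), not the paper's exact PDT particle-pairing algorithm.

```python
# Minimal sketch (not the paper's PDT algorithm): estimate the dominant particle
# displacement between two exposures by FFT cross-correlation, then convert
# pixels per frame to m/s. Frame interval and magnification are assumed values.
import numpy as np

def displacement_px(frame_a: np.ndarray, frame_b: np.ndarray):
    """Return (dy, dx) such that frame_a ~= np.roll(frame_b, (dy, dx))."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    # Circular cross-correlation via FFT; the peak location gives the shift.
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices above N/2 correspond to negative shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return float(shifts[0]), float(shifts[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((480, 640))                        # synthetic particle field
    shifted = np.roll(img, shift=(0, 12), axis=(0, 1))  # 12-pixel horizontal shift
    dy, dx = displacement_px(shifted, img)
    dt = 1.0 / 30.0     # time between exposures, s (assumed)
    scale = 1.0e-4      # metres per pixel (assumed magnification)
    print(f"shift = ({dy}, {dx}) px, speed ~ {abs(dx) * scale / dt:.3f} m/s")
```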

  8. Single frequency thermal wave radar: A next-generation dynamic thermography for quantitative non-destructive imaging over wide modulation frequency ranges.

    PubMed

    Melnikov, Alexander; Chen, Liangjie; Ramirez Venegas, Diego; Sivagurunathan, Koneswaran; Sun, Qiming; Mandelis, Andreas; Rodriguez, Ignacio Rojas

    2018-04-01

    Single-Frequency Thermal Wave Radar Imaging (SF-TWRI) was introduced and used to obtain quantitative thickness images of coatings on an aluminum block and on polyetherketone, and to image blind subsurface holes in a steel block. In SF-TWR, the starting and ending frequencies of a linear frequency modulation sweep are chosen to coincide. Using the highest available camera frame rate, SF-TWRI leads to a higher number of sampled points along the modulation waveform than conventional lock-in thermography imaging because it is not limited by conventional undersampling at high frequencies due to camera frame-rate limitations. This property leads to a large reduction in measurement time, better image quality, and a higher signal-to-noise ratio across wide frequency ranges. For quantitative thin-coating imaging applications, a two-layer photothermal model with lumped parameters was used to reconstruct the layer thickness from multi-frequency SF-TWR images. SF-TWRI represents a next-generation thermography method with superior features for imaging important classes of thin layers, materials, and components that require high-frequency thermal-wave probing well above today's available infrared camera technology frame rates.
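
    The amplitude and phase images that this family of methods produces come from demodulating each pixel's time series against the modulation waveform. The sketch below shows plain per-pixel quadrature (lock-in) demodulation at a single frequency as an illustration of that underlying step; it is not the authors' SF-TWR processing chain, and the frame rate and modulation frequency are assumed.

```python
# Minimal sketch (not the authors' SF-TWR pipeline): per-pixel quadrature
# demodulation of a thermal frame stack at one modulation frequency, producing
# amplitude and phase images. Frame rate and f_mod below are assumed values.
import numpy as np

def lock_in(frames: np.ndarray, frame_rate: float, f_mod: float):
    """frames: (n_frames, H, W) stack. Returns (amplitude, phase) images."""
    n = frames.shape[0]
    t = np.arange(n) / frame_rate
    ref_cos = np.cos(2 * np.pi * f_mod * t)
    ref_sin = np.sin(2 * np.pi * f_mod * t)
    # Project every pixel's time series onto the in-phase/quadrature references.
    in_phase = np.tensordot(ref_cos, frames, axes=(0, 0)) * 2.0 / n
    quadrature = np.tensordot(ref_sin, frames, axes=(0, 0)) * 2.0 / n
    return np.hypot(in_phase, quadrature), np.arctan2(quadrature, in_phase)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fps, f_mod, n = 200.0, 2.0, 400                    # 2 s of data at 200 frames/s
    t = np.arange(n) / fps
    truth = 0.5 * np.cos(2 * np.pi * f_mod * t + 0.8)  # synthetic thermal response
    frames = truth[:, None, None] + 0.05 * rng.standard_normal((n, 32, 32))
    amp, ph = lock_in(frames, fps, f_mod)
    print(amp.mean(), ph.mean())   # ~0.5 amplitude and ~-0.8 rad phase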

  9. Single frequency thermal wave radar: A next-generation dynamic thermography for quantitative non-destructive imaging over wide modulation frequency ranges

    NASA Astrophysics Data System (ADS)

    Melnikov, Alexander; Chen, Liangjie; Ramirez Venegas, Diego; Sivagurunathan, Koneswaran; Sun, Qiming; Mandelis, Andreas; Rodriguez, Ignacio Rojas

    2018-04-01

    Single-Frequency Thermal Wave Radar Imaging (SF-TWRI) was introduced and used to obtain quantitative thickness images of coatings on an aluminum block and on polyetherketone, and to image blind subsurface holes in a steel block. In SF-TWR, the starting and ending frequencies of a linear frequency modulation sweep are chosen to coincide. Using the highest available camera frame rate, SF-TWRI leads to a higher number of sampled points along the modulation waveform than conventional lock-in thermography imaging because it is not limited by conventional undersampling at high frequencies due to camera frame-rate limitations. This property leads to a large reduction in measurement time, better image quality, and a higher signal-to-noise ratio across wide frequency ranges. For quantitative thin-coating imaging applications, a two-layer photothermal model with lumped parameters was used to reconstruct the layer thickness from multi-frequency SF-TWR images. SF-TWRI represents a next-generation thermography method with superior features for imaging important classes of thin layers, materials, and components that require high-frequency thermal-wave probing well above today's available infrared camera technology frame rates.

  10. Mars Science Laboratory Frame Manager for Centralized Frame Tree Database and Target Pointing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Peters, Stephen; Carsten, Joseph; Diaz-Calderon, Antonio

    2013-01-01

    The FM (Frame Manager) flight software module is responsible for maintaining the frame tree database containing coordinate transforms between frames. The frame tree is a proper tree structure of directed links, consisting of surface and rover subtrees. Actual frame transforms are updated by their owner. FM updates site and saved frames for the surface tree. As the rover drives to a new area, a new site frame with an incremented site index can be created. Several clients including ARM and RSM (Remote Sensing Mast) update their related rover frames that they own. Through the onboard centralized FM frame tree database, client modules can query transforms between any two frames. Important applications include target image pointing for RSM-mounted cameras and frame-referenced arm moves. The use of frame tree eliminates cumbersome, error-prone calculations of coordinate entries for commands and thus simplifies flight operations significantly.
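
    The essential idea is a tree of coordinate frames in which each frame stores only its transform to its parent, and a query between any two frames is answered by composing transforms up through their common ancestor. The sketch below illustrates that data structure with 4x4 homogeneous transforms; it is a simplified, hypothetical illustration, not the MSL flight software, and the frame names are made up.

```python
# Minimal sketch of a centralized frame-tree database in the spirit described
# above (not the MSL flight code): each frame stores a homogeneous transform to
# its parent; the transform between any two frames is obtained by composition.
import numpy as np

class FrameTree:
    def __init__(self):
        self._parent = {"ROOT": None}
        self._to_parent = {"ROOT": np.eye(4)}     # 4x4 homogeneous transforms

    def add(self, name, parent, to_parent):
        """Register `name` with transform `to_parent` (frame -> parent)."""
        self._parent[name] = parent
        self._to_parent[name] = np.asarray(to_parent, dtype=float)

    def update(self, name, to_parent):
        """Owner modules update the transforms of the frames they own."""
        self._to_parent[name] = np.asarray(to_parent, dtype=float)

    def _to_root(self, name):
        m = np.eye(4)
        while name is not None:
            m = self._to_parent[name] @ m
            name = self._parent[name]
        return m

    def transform(self, src, dst):
        """Return the 4x4 transform taking points from `src` to `dst`."""
        return np.linalg.inv(self._to_root(dst)) @ self._to_root(src)

def translation(x, y, z):
    t = np.eye(4); t[:3, 3] = [x, y, z]; return t

if __name__ == "__main__":
    tree = FrameTree()
    tree.add("SITE_3", "ROOT", translation(100.0, 50.0, 0.0))
    tree.add("ROVER", "SITE_3", translation(2.0, 1.0, 0.0))
    tree.add("RSM_CAMERA", "ROVER", translation(0.0, 0.0, 2.0))
    target_in_site = np.array([105.0, 52.0, 0.0, 1.0])
    # Express a target, given in SITE_3 coordinates, in the camera frame.
    print(tree.transform("SITE_3", "RSM_CAMERA") @ target_in_site)
```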

  11. In-vessel visible inspection system on KSTAR

    NASA Astrophysics Data System (ADS)

    Chung, Jinil; Seo, D. C.

    2008-08-01

    To monitor the global formation of the initial plasma and damage to the internal structures of the vacuum vessel, an in-vessel visible inspection system has been installed and operated on the Korean superconducting tokamak advanced research (KSTAR) device. It consists of four inspection illuminators and two visible/H-alpha TV cameras. Each illuminator uses four 150 W metal-halide lamps with separate lamp controllers, and programmable progressive scan charge-coupled device cameras with 1004×1004 resolution at 48 frames/s and 640×480 resolution at 210 frames/s are used to capture images. In order to provide vessel inspection capability under any operation condition, the lamps and cameras are fully controlled from the main control room and protected by shutters from deposits during plasma operation. In this paper, we describe the design and operation results of the visible inspection system with the images of the KSTAR Ohmic discharges during the first plasma campaign.

  12. Body worn camera

    NASA Astrophysics Data System (ADS)

    Aishwariya, A.; Pallavi Sudhir, Gulavani; Garg, Nemesa; Karthikeyan, B.

    2017-11-01

    A body worn camera is a small video camera worn on the body, typically used by police officers to record arrests and evidence from crime scenes. It helps prevent and resolve complaints brought by members of the public and strengthens police transparency, performance, and accountability. The main constraints for this type of system are video format, resolution, frame rate, and audio quality. This system records video in .mp4 format at 1080p resolution and 30 frames per second. Another important aspect to consider while designing this system is the amount of power it requires, as battery management becomes very critical. The main design challenges are the size of the video, audio for the video, combining audio and video and saving them in .mp4 format, the battery size required for 8 hours of continuous recording, and security. For prototyping, the system is implemented using a Raspberry Pi Model B.
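
    As an illustration only (not the authors' code), the capture side of such a Raspberry Pi prototype could be sketched with the `picamera` library as below; file names and clip duration are assumed, and the audio capture and muxing noted above as a design challenge are not shown.

```python
# Minimal sketch (assumed implementation detail, not the authors' code): record
# 1080p/30 fps H.264 video on a Raspberry Pi with the `picamera` library. The
# raw H.264 stream can then be wrapped into an .mp4 container, e.g. with ffmpeg.
import picamera

RESOLUTION = (1920, 1080)   # 1080p
FRAMERATE = 30              # frames per second
DURATION_S = 60             # illustrative clip length

with picamera.PiCamera(resolution=RESOLUTION, framerate=FRAMERATE) as camera:
    camera.start_recording("clip.h264", format="h264")
    camera.wait_recording(DURATION_S)   # raises if an error occurs mid-recording
    camera.stop_recording()

# Afterwards, on the command line (illustrative):
#   ffmpeg -framerate 30 -i clip.h264 -c copy clip.mp4
```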

  13. Inspecting rapidly moving surfaces for small defects using CNN cameras

    NASA Astrophysics Data System (ADS)

    Blug, Andreas; Carl, Daniel; Höfler, Heinrich

    2013-04-01

    A continuous increase in production speed and manufacturing precision raises a demand for the automated detection of small image features on rapidly moving surfaces. An example is wire drawing processes, where kilometers of cylindrical metal surfaces moving at 10 m/s have to be inspected for defects such as scratches, dents, grooves, or chatter marks with a lateral size of 100 μm in real time. Up to now, complex eddy current systems have been used for quality control instead of line cameras, because the ratio between lateral feature size and surface speed is limited by the data transport between camera and computer. This bottleneck is avoided by "cellular neural network" (CNN) cameras which enable image processing directly on the camera chip. This article reports results achieved with a demonstrator based on this novel analogue camera-computer system. The results show that the computational speed and accuracy of the analogue computer system are sufficient to detect and discriminate the different types of defects. Area images with 176 x 144 pixels are acquired and evaluated in real time with frame rates of 4 to 10 kHz, depending on the number of defects to be detected. These frame rates correspond to equivalent line rates on line cameras of between 360 and 880 kHz, far beyond the capabilities of available line cameras. Using the relation between lateral feature size and surface speed as a figure of merit, the CNN-based system outperforms conventional image processing systems by an order of magnitude.

  14. High-Speed Videography Instrumentation And Procedures

    NASA Astrophysics Data System (ADS)

    Miller, C. E.

    1982-02-01

    High-speed videography has been an electronic analog of low-speed film cameras, but with the advantages of instant replay and simplicity of operation. Recent advances have pushed frame rates into the realm of the rotating prism camera. Some characteristics of videography systems are discussed in conjunction with applications in sports analysis and with sports equipment testing.

  15. Automatic treatment of flight test images using modern tools: SAAB and Aeritalia joint approach

    NASA Astrophysics Data System (ADS)

    Kaelldahl, A.; Duranti, P.

    The use of onboard cine cameras, as well as that of on-ground cinetheodolites, is very popular in flight tests. The high resolution of film and the high frame rate of cine cameras are still not exceeded by video technology. Video technology can successfully enter the flight test scenario once the availability of solid-state optical sensors dramatically reduces the dimensions and weight of TV cameras, thus allowing them to be located in positions compatible with space or operational limitations (e.g., HUD cameras). A proper combination of cine and video cameras is the typical solution for a complex flight test program. The output of such devices is very helpful in many flight areas. Several successful applications of this technology are summarized. Analysis of the large amount of data produced (frames of images) requires a very long time. The analysis is normally carried out manually. In order to improve the situation, in the last few years several flight test centers have devoted their attention to possible techniques which allow for quicker and more effective image treatment.

  16. Electronic camera-management system for 35-mm and 70-mm film cameras

    NASA Astrophysics Data System (ADS)

    Nielsen, Allan

    1993-01-01

    Military and commercial test facilities have been tasked with the need for increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high-speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates coupled with exceptionally high resolution make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time-space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object using triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. The feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include: a comprehensive safety management system, full local or remote operation, frame rate accuracy of less than 0.005 percent, and phase locking capability to IRIG-B. In fact, IRIG-B phase lock operation of multiple cameras can reduce the time-distance delta of a test object traveling at Mach 1 to less than one inch during data reduction.
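
    The TSPI triangulation step mentioned above reduces to intersecting sight lines from several tracking mounts. The sketch below shows that geometric core (least-squares intersection of azimuth/elevation rays from known camera positions); it is an illustration with invented coordinates, not Photo-Sonics' reduction software.

```python
# Minimal sketch of the TSPI triangulation step (illustrative, not the vendor's
# software): each tracking mount at a known position reports azimuth/elevation
# to the object; its 3D position is the least-squares intersection of the rays.
import numpy as np

def az_el_to_unit(az_deg, el_deg):
    """Unit line-of-sight vector in a local East-North-Up frame."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.sin(az),    # East
                     np.cos(el) * np.cos(az),    # North
                     np.sin(el)])                # Up

def triangulate(origins, directions):
    """Least-squares point closest to all rays origin_i + t * direction_i."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        p = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += p
        b += p @ o
    return np.linalg.solve(A, b)

if __name__ == "__main__":
    cams = [np.array([0.0, 0.0, 10.0]), np.array([1000.0, 0.0, 12.0])]
    target = np.array([400.0, 800.0, 150.0])
    rays = [(target - c) / np.linalg.norm(target - c) for c in cams]
    # With noise-free rays the target position is recovered exactly.
    print(triangulate(cams, rays))   # -> approximately [400. 800. 150.]
```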

  17. Explosives Instrumentation Group Trial 6/77-Propellant Fire Trials (Series Two).

    DTIC Science & Technology

    1981-10-01

    frames/s. A 19 mm Sony U-Matic video cassette recorder (VCR) and camera were used to view the hearth from a tower 100 m from ground-zero (GZ). Normal...camera started. This procedure permitted increased recording time of the event. A 19 mm Sony U-Matic VCR and camera was used to view the container...

  18. Adaptive-Repetitive Visual-Servo Control of Low-Flying Aerial Robots via Uncalibrated High-Flying Cameras

    NASA Astrophysics Data System (ADS)

    Guo, Dejun; Bourne, Joseph R.; Wang, Hesheng; Yim, Woosoon; Leang, Kam K.

    2017-08-01

    This paper presents the design and implementation of an adaptive-repetitive visual-servo control system for a moving high-flying vehicle (HFV) with an uncalibrated camera to monitor, track, and precisely control the movements of a low-flying vehicle (LFV) or mobile ground robot. Applications of this control strategy include the use of high-flying unmanned aerial vehicles (UAVs) with computer vision for monitoring, controlling, and coordinating the movements of lower altitude agents in areas, for example, where GPS signals may be unreliable or nonexistent. When deployed, a remote operator of the HFV defines the desired trajectory for the LFV in the HFV's camera frame. Due to the circular motion of the HFV, the resulting motion trajectory of the LFV in the image frame can be periodic in time, thus an adaptive-repetitive control system is exploited for regulation and/or trajectory tracking. The adaptive control law is able to handle uncertainties in the camera's intrinsic and extrinsic parameters. The design and stability analysis of the closed-loop control system is presented, where Lyapunov stability is shown. Simulation and experimental results are presented to demonstrate the effectiveness of the method for controlling the movement of a low-flying quadcopter, demonstrating the capabilities of the visual-servo control system for localization (i.e., motion capturing) and trajectory tracking control. In fact, results show that the LFV can be commanded to hover in place as well as track a user-defined flower-shaped closed trajectory, while the HFV and camera system circulates above with constant angular velocity. On average, the proposed adaptive-repetitive visual-servo control system reduces the average RMS tracking error by over 77% in the image plane and over 71% in the world frame compared to using just the adaptive visual-servo control law.

  19. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and a single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single-camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose-fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited.
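
    The reported ICC values come from a two-way random-effects, single-measure model. As an illustration only (not the study's analysis code, and with made-up angle values), ICC(2,1) after Shrout and Fleiss can be computed from an n-subjects by k-raters matrix as follows.

```python
# Illustrative ICC(2,1) computation (two-way random effects, single measure)
# from an n-subjects x k-raters matrix of digitized joint angles. Data invented.
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ratings: shape (n_subjects, k_raters)."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)      # per-subject means
    col_means = ratings.mean(axis=0)      # per-rater means
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    resid = ratings - row_means[:, None] - col_means[None, :] + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

if __name__ == "__main__":
    # Knee angle at mid-stance (degrees) digitized by two raters, six subjects.
    angles = np.array([[12.0, 13.1], [ 8.4,  8.9], [15.2, 14.7],
                       [10.1, 11.0], [ 9.8, 10.2], [13.5, 13.0]])
    print(f"ICC(2,1) = {icc_2_1(angles):.2f}")
```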

  20. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    NASA Astrophysics Data System (ADS)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the DMC first generation was developed by Z/I Imaging. It was the first large format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, which was the first digital aerial mapping camera using a single ultra large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time a 391-megapixel CMOS sensor has been used as the panchromatic (PAN) sensor, which is an industry record. Along with CMOS technology goes a range of technical benefits. The dynamic range of the CMOS sensor is approximately twice the range of a comparable CCD sensor and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights will be presented and compared with other CCD-based aerial sensors.

  1. High-Speed Video Analysis in a Conceptual Physics Class

    NASA Astrophysics Data System (ADS)

    Desbien, Dwain M.

    2011-09-01

    The use of probeware and computers has become quite common in introductory physics classrooms. Video analysis is also becoming more popular and is available to a wide range of students through commercially available and/or free software [2,3]. Video analysis allows for the study of motions that cannot be easily measured in the traditional lab setting and also allows real-world situations to be analyzed. Many motions are too fast to be captured easily at the standard video frame rate of 30 frames per second (fps) employed by most video cameras. This paper will discuss using a consumer camera that can record high-frame-rate video in a college-level conceptual physics class. In particular, this will involve the use of model rockets to determine the acceleration during the boost period right at launch and compare it to a simple model of the expected acceleration.
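
    The analysis itself amounts to differentiating the frame-by-frame position data twice. A tiny sketch of that step is shown below with invented numbers and an assumed frame rate; it illustrates the kind of calculation students would do, not the paper's specific dataset.

```python
# Tiny sketch (invented numbers): given rocket altitude digitized frame by frame
# from high-frame-rate video, estimate boost acceleration by finite differences.
import numpy as np

FPS = 240.0                 # assumed high-speed frame rate
dt = 1.0 / FPS

# Altitude (m) read off successive frames right after launch; synthesized here
# with a constant 60 m/s^2 boost acceleration.
t = np.arange(12) * dt
y = 0.5 * 60.0 * t ** 2

velocity = np.gradient(y, dt)                 # m/s
acceleration = np.gradient(velocity, dt)      # m/s^2
print(acceleration.round(1))                  # ~60 m/s^2 away from the endpoints
```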

  2. Performance Characterization of UV Science Cameras Developed for the Chromospheric Lyman-Alpha Spectro-Polarimeter

    NASA Technical Reports Server (NTRS)

    Champey, Patrick; Kobayashi, Ken; Winebarger, Amy; Cirtain, Jonathan; Hyde, David; Robertson, Bryan; Beabout, Brent; Beabout, Dyana; Stewart, Mike

    2014-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras will be built and tested for flight with the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The goal of the CLASP mission is to observe the scattering polarization in Lyman-alpha and to detect the Hanle effect in the line core. Due to the nature of Lyman-alpha polarization in the chromosphere, strict measurement sensitivity requirements are imposed on the CLASP polarimeter and spectrograph systems; science requirements for polarization measurements of Q/I and U/I are 0.1 percent in the line core. CLASP is a dual-beam spectro-polarimeter, which uses a continuously rotating waveplate as a polarization modulator, while the waveplate motor driver outputs trigger pulses to synchronize the exposures. The CCDs are operated in frame-transfer mode; the trigger pulse initiates the frame transfer, effectively ending the ongoing exposure and starting the next. The strict requirement of 0.1 percent polarization accuracy is met by using frame-transfer cameras to maximize the duty cycle in order to minimize photon noise. Coating the e2v CCD57-10 512 x 512 detectors with Lumogen-E coating allows for a relatively high (30 percent) quantum efficiency at the Lyman-alpha line. The CLASP cameras were designed to operate with a gain of 2.0 +/- 0.5, less than or equal to 25 e- readout noise, less than or equal to 10 e-/second/pixel dark current, and less than 0.1 percent residual non-linearity. We present the results of the performance characterization study performed on the CLASP prototype camera; system gain, dark current, read noise, and residual non-linearity.

  3. High dynamic range adaptive real-time smart camera: an overview of the HDR-ARTiST project

    NASA Astrophysics Data System (ADS)

    Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique

    2015-04-01

    Standard cameras capture only a fraction of the information that is visible to the human visual system. This is specifically true for natural scenes including areas of low and high illumination due to transitions between sunlit and shaded areas. When capturing such a scene, many cameras are unable to store the full Dynamic Range (DR), resulting in low quality video where details are concealed in shadows or washed out by sunlight. The imaging technique that can overcome this problem is called HDR (High Dynamic Range) imaging. This paper describes a complete smart camera built around a standard off-the-shelf LDR (Low Dynamic Range) sensor and a Virtex-6 FPGA board. This smart camera, called HDR-ARtiSt (High Dynamic Range Adaptive Real-time Smart camera), is able to produce a real-time HDR live video color stream by recording and combining multiple acquisitions of the same scene while varying the exposure time. This technique appears as one of the most appropriate and cheapest solutions to enhance the dynamic range of real-life environments. HDR-ARtiSt embeds real-time multiple captures, HDR processing, data display and transfer of an HDR color video at full sensor resolution (1280 × 1024 pixels) at 60 frames per second. The main contributions of this work are: (1) Multiple Exposure Control (MEC) dedicated to the smart image capture with three alternating exposure times that are dynamically evaluated from frame to frame, (2) Multi-streaming Memory Management Unit (MMMU) dedicated to the memory read/write operations of the three parallel video streams, corresponding to the different exposure times, (3) HDR creation by combining the video streams using a specific hardware version of Debevec's technique, and (4) Global Tone Mapping (GTM) of the HDR scene for display on a standard LCD monitor.
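
    The exposure-merging idea can be sketched in a few lines. The simplified example below combines three captures with a hat weighting function, assuming a linear sensor response instead of the Debevec response-curve recovery referenced above; exposure times and image sizes are illustrative, and this is not the HDR-ARtiSt hardware pipeline.

```python
# Simplified sketch of merging multiple exposures into an HDR radiance estimate.
# A linear sensor response is assumed (a stand-in for the Debevec technique
# referenced above); exposure times are illustrative.
import numpy as np

def merge_hdr(frames, exposure_times):
    """frames: list of (H, W) arrays in [0, 1]; returns an HDR radiance map."""
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for img, t in zip(frames, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: trust mid-range pixels
        num += w * img / t                  # per-pixel radiance estimate
        den += w
    return num / np.maximum(den, 1e-6)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    radiance = rng.uniform(0.01, 10.0, size=(1024, 1280))
    times = [1 / 1000.0, 1 / 250.0, 1 / 60.0]            # alternating exposures
    ldr = [np.clip(radiance * t * 60.0, 0.0, 1.0) for t in times]
    hdr = merge_hdr(ldr, times)
    print(hdr.min(), hdr.max())   # recovers a wider range than any single frame
```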

  4. Vision System Measures Motions of Robot and External Objects

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Matthies, Larry

    2008-01-01

    A prototype of an advanced robotic vision system both (1) measures its own motion with respect to a stationary background and (2) detects other moving objects and estimates their motions, all by use of visual cues. Like some prior robotic and other optoelectronic vision systems, this system is based partly on concepts of optical flow and visual odometry. Whereas prior optoelectronic visual-odometry systems have been limited to frame rates of no more than 1 Hz, a visual-odometry subsystem that is part of this system operates at a frame rate of 60 to 200 Hz, given optical-flow estimates. The overall system operates at an effective frame rate of 12 Hz. Moreover, unlike prior machine-vision systems for detecting motions of external objects, this system need not remain stationary: it can detect such motions while it is moving (even vibrating). The system includes a stereoscopic pair of cameras mounted on a moving robot. The outputs of the cameras are digitized, then processed to extract positions and velocities. The initial image-data-processing functions of this system are the same as those of some prior systems: Stereoscopy is used to compute three-dimensional (3D) positions for all pixels in the camera images. For each pixel of each image, optical flow between successive image frames is used to compute the two-dimensional (2D) apparent relative translational motion of the point transverse to the line of sight of the camera. The challenge in designing this system was to provide for utilization of the 3D information from stereoscopy in conjunction with the 2D information from optical flow to distinguish between motion of the camera pair and motions of external objects, compute the motion of the camera pair in all six degrees of translational and rotational freedom, and robustly estimate the motions of external objects, all in real time. To meet this challenge, the system is designed to perform the following image-data-processing functions: The visual-odometry subsystem (the subsystem that estimates the motion of the camera pair relative to the stationary background) utilizes the 3D information from stereoscopy and the 2D information from optical flow. It computes the relationship between the 3D and 2D motions and uses a least-mean-squares technique to estimate motion parameters. The least-mean-squares technique is suitable for real-time implementation when the number of external-moving-object pixels is smaller than the number of stationary-background pixels.
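
    A much-simplified stand-in for the least-squares ego-motion step described above is a rigid-motion fit between the 3D point sets reconstructed by stereo in two successive frames. The sketch below uses the standard SVD (Kabsch) solution with synthetic data; it illustrates the least-squares idea only, not the system's combined 3D plus optical-flow formulation.

```python
# Minimal sketch (a simpler stand-in for the 3D + optical-flow formulation
# described above): least-squares rigid-motion estimation from 3D point
# correspondences between two stereo frames, using the SVD (Kabsch) solution.
import numpy as np

def rigid_motion(p_prev: np.ndarray, p_curr: np.ndarray):
    """Return R, t minimizing sum ||R @ p_prev + t - p_curr||^2 (N x 3 inputs)."""
    c_prev, c_curr = p_prev.mean(axis=0), p_curr.mean(axis=0)
    h = (p_prev - c_prev).T @ (p_curr - c_curr)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))        # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = c_curr - r @ c_prev
    return r, t

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    pts = rng.uniform(-5.0, 5.0, size=(200, 3))          # 3D points from stereo
    yaw = np.radians(2.0)                                # small camera rotation
    R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                       [np.sin(yaw),  np.cos(yaw), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.05, 0.0, 0.30])                 # forward translation
    moved = pts @ R_true.T + t_true
    R_est, t_est = rigid_motion(pts, moved)
    print(np.allclose(R_est, R_true), t_est.round(3))
```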

  5. Estimation of Antenna Pose in the Earth Frame Using Camera and IMU Data from Mobile Phones

    PubMed Central

    Wang, Zhen; Jin, Bingwen; Geng, Weidong

    2017-01-01

    The poses of base station antennas play an important role in cellular network optimization. Existing methods of pose estimation are based on physical measurements performed either by tower climbers or using additional sensors attached to antennas. In this paper, we present a novel non-contact method of antenna pose measurement based on multi-view images of the antenna and inertial measurement unit (IMU) data captured by a mobile phone. Given a known 3D model of the antenna, we first estimate the antenna pose relative to the phone camera from the multi-view images and then employ the corresponding IMU data to transform the pose from the camera coordinate frame into the Earth coordinate frame. To enhance the resulting accuracy, we improve existing camera-IMU calibration models by introducing additional degrees of freedom between the IMU sensors and defining a new error metric based on both the downtilt and azimuth angles, instead of a unified rotational error metric, to refine the calibration. In comparison with existing camera-IMU calibration methods, our method achieves an improvement in azimuth accuracy of approximately 1.0 degree on average while maintaining the same level of downtilt accuracy. For the pose estimation in the camera coordinate frame, we propose an automatic method of initializing the optimization solver and generating bounding constraints on the resulting pose to achieve better accuracy. With this initialization, state-of-the-art visual pose estimation methods yield satisfactory results in more than 75% of cases when plugged into our pipeline, and our solution, which takes advantage of the constraints, achieves even lower estimation errors on the downtilt and azimuth angles, both on average (0.13 and 0.3 degrees lower, respectively) and in the worst case (0.15 and 7.3 degrees lower, respectively), according to an evaluation conducted on a dataset consisting of 65 groups of data. We show that both of our enhancements contribute to the performance improvement offered by the proposed estimation pipeline, which achieves downtilt and azimuth accuracies of respectively 0.47 and 5.6 degrees on average and 1.38 and 12.0 degrees in the worst case, thereby satisfying the accuracy requirements for network optimization in the telecommunication industry. PMID:28397765
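
    The final frame-chaining step described above can be illustrated in a few lines: the antenna boresight estimated in the phone-camera frame is rotated into an East-North-Up Earth frame with the IMU-derived orientation, and azimuth and downtilt are read off the result. The orientation and boresight values below are made up; this is not the authors' pipeline.

```python
# Small sketch of the camera-to-Earth frame chaining (illustrative values only).
import numpy as np

# Assumed IMU-derived orientation: phone held upright, optical axis due North
# (camera x -> East, camera y (image down) -> -Up, camera z (forward) -> North).
R_enu_cam = np.array([[1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0],
                      [0.0, -1.0, 0.0]])

# Assumed boresight direction from the multi-view pose estimate (camera frame).
boresight_cam = np.array([-0.30, 0.08, 0.95])
boresight_cam /= np.linalg.norm(boresight_cam)

v = R_enu_cam @ boresight_cam                            # boresight in ENU
azimuth = np.degrees(np.arctan2(v[0], v[1])) % 360.0     # clockwise from North
downtilt = np.degrees(np.arcsin(-v[2]))                  # positive = below horizon

print(f"azimuth = {azimuth:.1f} deg, downtilt = {downtilt:.1f} deg")
```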

  6. Whirlwind Drama During Spirit's 496th Sol

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This movie clip shows a dust devil growing in size and blowing across the plain inside Mars' Gusev Crater. The clip consists of frames taken by the navigation camera on NASA's Mars Exploration Rover Spirit during the morning of the rover's 496th martian day, or sol (May 26, 2005). Contrast has been enhanced for anything in the images that changes from frame to frame, that is, for the dust moved by wind.

  7. A Summary of the Evaluation of PPG Herculite XP Glass in Punched Window and Storefront Assemblies

    DTIC Science & Technology

    2013-01-01

    frames for all IGU windows extruded from existing dies. The glazing was secured to the frame on all four sides with a 1/2-in bead width of DOW 995...lite and non-laminated IGU debris tests. A wood frame with a 4-in wide slit was placed behind the window to transform the debris cloud into a narrow...

  8. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-03-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor (the size of the standard 35mm frame) with the means to select left and right image information. Even with the added stereoscopic capability the appearance of existing camera bodies will be unaltered.

  9. Polarizing aperture stereoscopic cinema camera

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny

    2012-07-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  10. Underwater image mosaicking and visual odometry

    NASA Astrophysics Data System (ADS)

    Sadjadi, Firooz; Tangirala, Sekhar; Sorber, Scott

    2017-05-01

    This paper summarizes the results of studies in underwater odometry using a video camera for estimating the velocity of an unmanned underwater vehicle (UUV). Underwater vehicles are usually equipped with sonar and an Inertial Measurement Unit (IMU) - an integrated sensor package that combines multiple accelerometers and gyros to produce a three-dimensional measurement of both specific force and angular rate with respect to an inertial reference frame for navigation. In this study, we investigate the use of odometry information obtainable from a video camera mounted on a UUV to extract vehicle velocity relative to the ocean floor. A key challenge with this process is the seemingly bland (i.e. featureless) nature of video data obtained underwater, which could make conventional approaches to image-based motion estimation difficult. To address this problem, we perform image enhancement, followed by frame-to-frame image transformation, registration and mosaicking/stitching. With this approach the velocity components associated with the moving sensor (vehicle) are readily obtained from (i) the components of the transform matrix at each frame; (ii) information about the height of the vehicle above the seabed; and (iii) the sensor resolution. Preliminary results are presented.
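
    The velocity-extraction step (iii quantities above) reduces to simple scaling once a frame-to-frame pixel shift is available. The sketch below shows that arithmetic with illustrative numbers for altitude, focal length and frame rate.

```python
# Compact sketch of the velocity-extraction arithmetic (illustrative values):
# a registration-derived pixel shift is scaled to metres on the seabed using the
# vehicle altitude and the focal length in pixels, then divided by the frame
# interval.
import numpy as np

altitude_m = 3.0                   # camera height above the seabed (from sonar)
focal_px = 1200.0                  # camera focal length expressed in pixels
frame_rate = 10.0                  # frames per second
shift_px = np.array([2.4, -0.6])   # (dx, dy) from registration / mosaicking

ground_sample_m = altitude_m / focal_px        # metres per pixel on the seabed
velocity_mps = shift_px * ground_sample_m * frame_rate
print(velocity_mps, np.linalg.norm(velocity_mps))   # ~0.06 m/s along-track
```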

  11. Heterogeneous CPU-GPU moving targets detection for UAV video

    NASA Astrophysics Data System (ADS)

    Li, Maowen; Tang, Linbo; Han, Yuqi; Yu, Chunlei; Zhang, Chao; Fu, Huiquan

    2017-07-01

    Moving target detection is gaining popularity in civilian and military applications. On some motion-detection monitoring platforms, low-resolution stationary cameras are being replaced by moving HD cameras carried by UAVs. Moving targets occupy only a small fraction of the pixels in HD video taken by a UAV, and the background of the frame is usually moving because of the motion of the UAV. The high computational cost of detection algorithms prevents running them at the full resolution of the frame. Hence, to solve the problem of moving target detection in UAV video, we propose a heterogeneous CPU-GPU moving target detection algorithm. More specifically, we use background registration to eliminate the impact of the moving background and frame differencing to detect small moving targets. In order to achieve real-time processing, we design a heterogeneous CPU-GPU framework for our method. The experimental results show that our method can detect the main moving targets in HD video taken by a UAV, with an average processing time of 52.16 ms per frame, which is fast enough for real-time operation.
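
    The CPU side of such an approach can be sketched with OpenCV: register consecutive frames with a feature-based homography to cancel the moving background, then threshold the frame difference. The snippet below is an illustration under assumed parameters (feature count, threshold, file names), not the authors' heterogeneous CPU-GPU implementation.

```python
# Sketch of background registration + frame differencing (illustrative, without
# the GPU offloading described above): warp the previous UAV frame onto the
# current one with an ORB/RANSAC homography, then threshold the difference.
import cv2
import numpy as np

def moving_target_mask(prev_gray, curr_gray, diff_thresh=25):
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    # Warp the previous frame into the current frame's geometry (background
    # registration), then difference and threshold.
    h, w = curr_gray.shape
    prev_warped = cv2.warpPerspective(prev_gray, H, (w, h))
    diff = cv2.absdiff(curr_gray, prev_warped)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    return mask

# Usage (assumed file names):
#   prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
#   curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
#   cv2.imwrite("moving_mask.png", moving_target_mask(prev, curr))
```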

  12. 3D Position and Velocity Vector Computations of Objects Jettisoned from the International Space Station Using Close-Range Photogrammetry Approach

    NASA Technical Reports Server (NTRS)

    Papanyan, Valeri; Oshle, Edward; Adamo, Daniel

    2008-01-01

    Measurement of the jettisoned object departure trajectory and velocity vector in the International Space Station (ISS) reference frame is vitally important for prompt evaluation of the object's imminent orbit. We report on the first successful application of photogrammetric analysis of the ISS imagery for the prompt computation of the jettisoned object's position and velocity vectors. As examples of post-EVA analyses, we present the Floating Potential Probe (FPP) and the Russian "Orlan" Space Suit jettisons, as well as the near-real-time (provided several hours after separation) computations of the Video Stanchion Support Assembly Flight Support Assembly (VSSA-FSA) and Early Ammonia Servicer (EAS) jettisons during the US astronauts' spacewalk. Standard close-range photogrammetry analysis was used during this EVA to analyze two on-board camera image sequences down-linked from the ISS. In this approach the ISS camera orientations were computed from known coordinates of several reference points on the ISS hardware. Then the position of the jettisoned object for each time-frame was computed from its image in each frame of the video-clips. In another, "quick-look" approach used in near-real time, the orientation of the cameras was computed from their position (from the ISS CAD model) and operational data (pan and tilt), and then the location of the jettisoned object was calculated for only several frames of the two synchronized movies. Keywords: Photogrammetry, International Space Station, jettisons, image analysis.

  13. The Example of Using the Xiaomi Cameras in Inventory of Monumental Objects - First Results

    NASA Astrophysics Data System (ADS)

    Markiewicz, J. S.; Łapiński, S.; Bienkowski, R.; Kaliszewska, A.

    2017-11-01

    At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. Today, photogrammetry is becoming more and more popular and is becoming the standard of documentation in many projects involving the recording of all possible spatial data on landscape, architecture, or even single objects. Low-cost sensors allow for the creation of reliable and accurate three-dimensional models of investigated objects. This paper presents the results of a comparison between the outcomes obtained using three image sources: low-cost Xiaomi cameras, a full-frame camera (Canon 5D Mark II) and a medium-format camera (Hasselblad H4D). In order to check how the results obtained from the different sensors differ, the following parameters were analysed: the accuracy of the orientation of the ground-level photos on the control and check points, the distribution of the distortion determined in the self-calibration process, the flatness of the walls, and the discrepancies between point clouds from the low-cost cameras and reference data. The results presented below are an outcome of co-operation between researchers from three institutions: the Systems Research Institute PAS, the Department of Geodesy and Cartography at the Warsaw University of Technology and the National Museum in Warsaw.

  14. Full-Frame Reference for Test Photo of Moon

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This pair of views shows how little of the full image frame was taken up by the Moon in test images taken Sept. 8, 2005, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. The Mars-bound camera imaged Earth's Moon from a distance of about 10 million kilometers (6 million miles) away -- 26 times the distance between Earth and the Moon -- as part of an activity to test and calibrate the camera. The images are very significant because they show that the Mars Reconnaissance Orbiter spacecraft and this camera can properly operate together to collect very high-resolution images of Mars. The target must move through the camera's telescope view in just the right direction and speed to acquire a proper image. The day's test images also demonstrate that the focus mechanism works properly with the telescope to produce sharp images.

    Out of the 20,000-pixel-by-6,000-pixel full frame, the Moon's diameter would be about 340 pixels if the full Moon could be seen. The illuminated crescent is about 60 pixels wide, and the resolution is about 10 kilometers (6 miles) per pixel. At Mars, the entire image region will be filled with high-resolution information.
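
    A quick arithmetic check reproduces these figures from the geometry alone, using rounded reference values for the Moon's diameter and the camera's roughly 1-microradian pixel scale (values assumed here, not taken from the image data).

```python
# Quick consistency check of the quoted figures (reference values are rounded).
moon_diameter_km = 3474.0
range_km = 10.0e6                 # ~10 million km to the Moon at the time
ifov_rad = 1.0e-6                 # ~1 microradian per pixel (assumed)

angular_size_rad = moon_diameter_km / range_km
print(angular_size_rad / ifov_rad)        # ~347 pixels across the full Moon
print(range_km * ifov_rad)                # ~10 km per pixel at that range
```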

    The Mars Reconnaissance Orbiter, launched on Aug. 12, 2005, is on course to reach Mars on March 10, 2006. After gradually adjusting the shape of its orbit for half a year, it will begin its primary science phase in November 2006. From the mission's planned science orbit about 300 kilometers (186 miles) above the surface of Mars, the high resolution camera will be able to discern features as small as one meter or yard across.

    The Mars Reconnaissance Orbiter mission is managed by NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, for the NASA Science Mission Directorate. Lockheed Martin Space Systems, Denver, prime contractor for the project, built the spacecraft. Ball Aerospace & Technologies Corp., Boulder, Colo., built the High Resolution Imaging Science Experiment instrument for the University of Arizona, Tucson, to provide to the mission. The HiRISE Operations Center at the University of Arizona processes images from the camera.

  15. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  16. "Teacher in Space" Trainees - Arriflex Motion Picture Camera

    NASA Image and Video Library

    1985-09-20

    S85-40670 (18 Sept. 1985) --- The two teachers, Sharon Christa McAuliffe and Barbara R. Morgan (out of frame) have hands-on experience with an Arriflex motion picture camera following a briefing on space photography. The two began training Sept. 10, 1985 with the STS-51L crew and learning basic procedures for space travelers. The second week of training included camera training, aircraft familiarization and other activities. McAuliffe zeroes in on a test subject during a practice session with the Arriflex. Photo credit: NASA

  17. "Teacher in Space" Trainees - Arriflex Motion Picture Camera

    NASA Image and Video Library

    1985-09-20

    S85-40671 (18 Sept. 1985) --- The two teachers, Barbara R. Morgan and Sharon Christa McAuliffe (out of frame) have hands-on experience with an Arriflex motion picture camera following a briefing on space photography. The two began training Sept. 10, 1985 with the STS-51L crew and learning basic procedures for space travelers. The second week of training included camera training, aircraft familiarization and other activities. Morgan zeroes in on a test subject during a practice session with the Arriflex. Photo credit: NASA

  18. Chondritic Models of 4 Vesta: Comparison of Data from the Dawn Mission with Predicted Internal Structure and Surface Composition/Mineralogy

    NASA Technical Reports Server (NTRS)

    Toplis, M. J.; Mizzon, H.; Forni, O.; Monnereau, M.; Barrat, J-A.; Prettyman, T. H.; McSween, H. Y.; McCoy, T. J.; Mittlefehldt, D. W.; De Sanctis, M. C.

    2012-01-01

    While the HEDs provide an extremely useful basis for interpreting data from the Dawn mission, there is no guarantee that they provide a complete vision of all possible crustal (and possibly mantle) lithologies that are exposed at the surface of Vesta. With this in mind, an alternative approach is to identify plausible bulk compositions and use mass-balance and geochemical modelling to predict possible internal structures and crust/mantle compositions and mineralogies. While such models must be consistent with known HED samples, this approach has the potential to extend predictions to thermodynamically plausible rock types that are not necessarily present in the HED collection. Nine chondritic bulk compositions are considered (CI, CV, CO, CM, H, L, LL, EH, EL). For each, relative proportions and densities of the core, mantle, and crust are quantified. This calculation is complicated by the fact that iron may occur in metallic form (in the core) and/or in oxidized form (in the mantle and crust). However, considering that the basaltic crust has the composition of Juvinas and assuming that this crust is in thermodynamic equilibrium with the residual mantle, it is possible to calculate a single solution to this problem for a given bulk composition. Of the nine bulk compositions tested, solutions corresponding to CI and LL groups predicted a negative metal fraction and were not considered further. Solutions for enstatite chondrites imply significant oxidation relative to the starting materials and these solutions too are considered unlikely. For the remaining bulk compositions, the relative proportion of crust to bulk silicate is typically in the range 15 to 20% corresponding to crustal thicknesses of 15 to 20 km for a porosity-free Vesta-sized body. The mantle is predicted to be largely dominated by olivine (greater than 85%) for carbonaceous chondrites, but to be a roughly equal mixture of olivine and pyroxene for ordinary chondrite precursors. All bulk compositions have a significant core, but the relative proportions of metal and sulphide can be widely different. Using these data, total core size (metal+ sulphide) and average core densities can be calculated, providing a useful reference frame within which to consider geophysical/gravity data of the Dawn mission. Further to these mass-balance calculations, the MELTS thermodynamic calculator has been used to assess to what extent chondritic bulk compositions can produce Juvinas-like liquids at relevant degrees of partial melting/crystallization. This work will refine acceptable bulk compositions and predict the mineralogy and composition of the associated solid and liquid products over wide ranges of partial melting and crystallization, providing a useful and self-consistent reference frame for interpretation of the data from the VIR and GRaND instruments onboard the Dawn spacecraft.

  19. Deployment of the RCA Satcom K-2 communications satellite

    NASA Image and Video Library

    1985-11-28

    61B-38-36W (28 Nov 1985) --- The 4,144-pound RCA Satcom K-2 communications satellite is photographed as it spins from the cargo bay of the Earth-orbiting Atlantis. A TV camera at right records the deployment for a later playback to Earth. This frame was photographed with a handheld Hasselblad camera inside the spacecraft.

  20. Validation of Viewing Reports: Exploration of a Photographic Method.

    ERIC Educational Resources Information Center

    Fletcher, James E.; Chen, Charles Chao-Ping

    A time lapse camera loaded with Super 8 film was employed to photographically record the area in front of a conventional television receiver in selected homes. The camera took one picture each minute for three days, including in the same frame the face of the television receiver. Family members kept a conventional viewing diary of their viewing…

  1. A device for synchronizing biomechanical data with cine film.

    PubMed

    Rome, L C

    1995-03-01

    Biomechanists are faced with two problems in synchronizing continuous physiological data to discrete, frame-based kinematic data from films. First, the accuracy of most synchronization techniques is good only to one frame and hence depends on framing rate. Second, even if perfectly correlated at the beginning of a 'take', the film and physiological data may become progressively desynchronized as the 'take' proceeds. A system is described, which provides synchronization between cine film and continuous physiological data with an accuracy of +/- 0.2 ms, independent of framing rate and the duration of the film 'take'. Shutter pulses from the camera were output to a computer recording system where they were recorded and counted, and to a digital device which counted the pulses and illuminated the count on the bank of LEDs which was filmed with the subject. Synchronization was performed by using the rising edge of the shutter pulse and by comparing the frame number imprinted on the film to the frame number recorded by the computer system. In addition to providing highly accurate synchronization over long film 'takes', this system provides several other advantages. First, having frame numbers imprinted both on the film and computer record greatly facilitates analysis. Second, the LEDs were designed to show the 'take number' while the camera is coming up to speed, thereby avoiding the use of cue cards which disturb the animal. Finally, use of this device results in considerable savings in film.
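
    The bookkeeping such a device enables can be sketched simply: the computer stores a timestamp for every shutter-pulse rising edge, so a frame number read off the film's LED counter maps directly to a sample index in the continuous physiological record. The numbers below are invented and the code is only an illustration of that mapping, not the authors' hardware or software.

```python
# Small sketch of frame-to-sample lookup from logged shutter-pulse edges
# (invented numbers, illustrative only).
import numpy as np

FRAME_RATE = 200.0                      # nominal framing rate, frames/s
ADC_RATE = 5000.0                       # physiological sampling rate, samples/s

# Rising-edge times of the camera shutter pulses, as logged by the computer
# (synthesized here with a little jitter to mimic a real camera).
rng = np.random.default_rng(4)
pulse_times = np.arange(1000) / FRAME_RATE + rng.normal(0.0, 1e-4, 1000)

def frame_to_sample(frame_number: int) -> int:
    """Index into the physiological record corresponding to a film frame."""
    return int(round(pulse_times[frame_number] * ADC_RATE))

print(frame_to_sample(0), frame_to_sample(499))   # e.g. 0 and roughly 12475
```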

  2. Reflectance properties and hydrated material distribution on Vesta: Global investigation of variations and their relationship using improved calibration of Dawn VIR mapping spectrometer

    NASA Astrophysics Data System (ADS)

    Combe, Jean-Philippe; Ammannito, Eleonora; Tosi, Federico; De Sanctis, Maria Cristina; McCord, Thomas B.; Raymond, Carol A.; Russell, Christopher T.

    2015-10-01

    Vesta's surface albedo variations and hydrated material content share a similar spatial distribution. This observation is consistent with carbonaceous chondrite meteorites as a likely source material for dark surface units observed by the Dawn spacecraft, as presented by numerous publications. While these deposits have been studied extensively by analysis of data from the Framing Camera (FC) and the Visible and Infrared Spectrometer (VIR), we performed a new analysis based on an improved calibration of VIR. First we identified instrument and calibration artifacts, and we therefore developed corrections of the VIR flat field and response function. Then we developed a photometric correction for Vesta based on the lunar model by Shkuratov et al. (Shkuratov, Yu.G. et al. [1999]. Icarus 141, 132-155. http://dx.doi.org/10.1006/icar.1999.6154), and a semi-analytical inversion of the photometric parameters. This photometric model combines minimization of the scattering effects due to the topography (a disk function) and variations of multiple-scattering with phase angle (the phase function) caused by microscopic physical properties of the regolith. The improved calibration and photometric correction enable more accurate analysis of the spectral properties of Vesta's surface material, especially the reflectance at 1.4 μm and the 2.8 μm hydroxyl absorption band depth. We produced global and quadrangle maps that are used as a common dataset for this Icarus special issue on Vesta's surface composition. The joint interpretation of both the 1.4 μm reflectance and the 2.8 μm absorption band depth reveals unusual spectral properties for a number of impact craters and ejecta compared to the rest of Vesta. An area including the Bellicia, Arruntia and Pomponia craters, where olivine might be present, has relatively high reflectance and a strong hydroxyl absorption band. Another area in the vicinity of Capparonia crater has a high content of hydrated materials, although with moderate reflectance and typical pyroxene-rich composition. Ejecta blankets west of Oppia crater have a spectral behavior similar to Capparonia, except for the wider and more complex shape of the hydroxyl absorption band. On the other hand, some low-hydrated areas associated with crater floors and ejecta have higher reflectance and steeper spectral slope than most low-hydrated terrains on Vesta. A broad lane that extends from the Rheasilvia rim at Matronalia Rupes to the northern regions hosts little to no hydrated materials and exhibits a moderate spectral slope, similar to Rheasilvia's basin floor. These properties reinforce the hypothesis that the lane is composed of ejecta from Rheasilvia, as indicated by the distribution of pyroxene compositions in previous results from Dawn. A few small and fresh craters exhibit an association between low reflectance, little to no hydrated materials and a strong positive spectral slope, suggesting optical effects by opaque coatings, as opposed to carbonaceous chondrite deposits, and possibly coarser grains.
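
    The general shape of such a photometric correction (divide out a disk function and a phase function, then renormalize to a standard viewing geometry) can be illustrated as below. Note that this is a generic example using the simple Lommel-Seeliger disk function and an exponential phase function as stand-ins; it is not the Shkuratov-based model actually used by the authors, and all coefficients are illustrative.

```python
# Generic illustration of a photometric correction of the kind described above
# (NOT the Shkuratov-based model used by the authors): divide measured I/F by a
# disk function and a phase function and renormalize to a standard geometry.
import numpy as np

def disk_lommel_seeliger(inc_deg, emi_deg):
    mu0 = np.cos(np.radians(inc_deg))
    mu = np.cos(np.radians(emi_deg))
    return 2.0 * mu0 / (mu0 + mu)

def phase_function(alpha_deg, slope=0.02):
    return np.exp(-slope * alpha_deg)       # crude monotonic phase darkening

def corrected_reflectance(i_over_f, inc, emi, alpha,
                          inc0=30.0, emi0=0.0, alpha0=30.0):
    """Normalize measured I/F to the standard (30, 0, 30) degree geometry."""
    model = disk_lommel_seeliger(inc, emi) * phase_function(alpha)
    model0 = disk_lommel_seeliger(inc0, emi0) * phase_function(alpha0)
    return i_over_f * model0 / model

print(corrected_reflectance(0.18, inc=55.0, emi=20.0, alpha=42.0))
```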

  3. Context View from 11' on ladder from southeast corner of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Context View from 11' on ladder from southeast corner of Bottle Village parcel, just inside fence. Doll Head Shrine at far left frame, Living Trailer (c.1960 "Spartanette") in center frame. Little Wishing Well at far right frame. Some shrines and small buildings were destroyed in the January 1994 Northridge earthquake, and only their perimeter walls and foundations exist. Camera facing north northwest. - Grandma Prisbrey's Bottle Village, 4595 Cochran Street, Simi Valley, Ventura County, CA

  4. Obstacle Detection in Indoor Environment for Visually Impaired Using Mobile Camera

    NASA Astrophysics Data System (ADS)

    Rahman, Samiur; Ullah, Sana; Ullah, Sehat

    2018-01-01

    Obstacle detection can improve the mobility as well as the safety of visually impaired people. In this paper, we present a system using a mobile camera for visually impaired people. The proposed algorithm works in indoor environments and relies on a very simple technique based on a few pre-stored floor images. All unique floor types in the indoor environment are considered and a single image is stored for each unique floor type; these floor images serve as reference images. The algorithm acquires an input image frame, selects a region of interest and scans it for obstacles using the pre-stored floor images. The algorithm compares the present frame and the next frame and computes the mean square error of the two frames. If the mean square error is less than a threshold value α, there is no obstacle in the next frame. If the mean square error is greater than α, there are two possibilities: either there is an obstacle or the floor type has changed. To check whether the floor has changed, the algorithm computes the mean square error between the next frame and all stored floor types. If the minimum mean square error is less than the threshold α, the floor has changed; otherwise an obstacle exists. The proposed algorithm works in real time and 96% accuracy has been achieved.
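
    The decision logic described above (compare consecutive frames, then check the stored floor references) can be sketched directly; the threshold value and the synthetic frames below are illustrative choices, not values from the paper.

```python
import numpy as np

ALPHA = 500.0  # MSE threshold (illustrative value; tune for the camera and scene)

def mse(a, b):
    """Mean square error between two equally sized grayscale frames."""
    return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

def check_next_frame(current_roi, next_roi, floor_references):
    """Classify the next frame's region of interest following the abstract's logic:
    small change -> clear path; large change -> either the floor type changed
    (it matches one of the stored floor images) or an obstacle is present."""
    if mse(current_roi, next_roi) < ALPHA:
        return "no obstacle"
    if min(mse(next_roi, ref) for ref in floor_references) < ALPHA:
        return "floor type changed"
    return "obstacle detected"

# usage with synthetic 64x64 frames
rng = np.random.default_rng(0)
floor_a = rng.integers(100, 110, (64, 64), dtype=np.uint8)
floor_refs = [floor_a, rng.integers(180, 190, (64, 64), dtype=np.uint8)]
obstacle = floor_a.copy()
obstacle[20:40, 20:40] = 0
print(check_next_frame(floor_a, obstacle, floor_refs))   # -> obstacle detected
```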

  5. CIFAR10-DVS: An Event-Stream Dataset for Object Classification

    PubMed Central

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

    Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded with neuromorphic cameras. Currently, there are limited event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named “CIFAR10-DVS.” The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of the frame-based images. Unlike conversions of frame-based images performed by moving the camera, moving the image itself is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification. PMID:28611582
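
    The abstract contrasts asynchronous events with conventional frames. As a minimal illustration of that relationship (not of the RCLS conversion itself), the sketch below accumulates a DVS-style event stream into signed event-count frames over fixed time windows; the sensor size and window length are arbitrary choices.

```python
import numpy as np

def events_to_frames(x, y, t, polarity, sensor_hw=(128, 128), window_us=10_000):
    """Accumulate DVS events (x, y, timestamp in microseconds, polarity +1/-1)
    into a stack of signed event-count frames, one per fixed time window."""
    h, w = sensor_hw
    n_frames = int(np.ceil((t.max() - t.min() + 1) / window_us))
    frames = np.zeros((n_frames, h, w), dtype=np.int32)
    frame_idx = ((t - t.min()) // window_us).astype(int)
    np.add.at(frames, (frame_idx, y, x), polarity)
    return frames

# usage with a small synthetic event stream
rng = np.random.default_rng(1)
n = 5000
frames = events_to_frames(rng.integers(0, 128, n), rng.integers(0, 128, n),
                          np.sort(rng.integers(0, 50_000, n)),
                          rng.choice([-1, 1], n))
print(frames.shape)   # e.g. (5, 128, 128)
```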

  6. Optimizing low-light microscopy with back-illuminated electron multiplying charge-coupled device: enhanced sensitivity, speed, and resolution.

    PubMed

    Coates, Colin G; Denvir, Donal J; McHale, Noel G; Thornbury, Keith D; Hollywood, Mark A

    2004-01-01

    The back-illuminated electron multiplying charge-coupled device (EMCCD) camera is having a profound influence on the field of low-light dynamic cellular microscopy, combining the highest possible photon collection efficiency with the ability to virtually eliminate the readout noise detection limit. We report here the use of this camera, in 512 x 512 frame-transfer chip format at 10-MHz pixel readout speed, in optimizing a demanding ultra-low-light intracellular calcium flux microscopy setup. The arrangement employed includes a spinning confocal Nipkow disk, which, while meeting the need both to generate images at very rapid frame rates and to minimize background photons, yields very weak signals. The challenge for the camera lies not just in detecting as many of these scarce photons as possible, but also in operating at a frame rate that meets the temporal resolution requirements of many low-light microscopy approaches, a particular demand of smooth muscle calcium flux microscopy. The results presented illustrate the significant sensitivity improvement offered by this technology over the previous standard in ultra-low-light CCD detection, the GenIII+ intensified charge-coupled device (ICCD), and also portray the advanced temporal and spatial resolution capabilities of the EMCCD. Copyright 2004 Society of Photo-Optical Instrumentation Engineers.
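
    The sensitivity advantage described above can be illustrated with a textbook photon-noise model comparing a conventional readout with an electron-multiplied one; the parameter values below are illustrative and are not taken from the paper.

```python
import numpy as np

def snr_conventional_ccd(signal_e, background_e, read_noise_e):
    """Shot-noise plus read-noise SNR model for a conventional CCD readout."""
    return signal_e / np.sqrt(signal_e + background_e + read_noise_e**2)

def snr_emccd(signal_e, background_e, read_noise_e, em_gain, excess_noise=np.sqrt(2)):
    """EMCCD SNR: the EM register divides the effective read noise by the gain
    but multiplies shot noise by an excess-noise factor (~sqrt(2) at high gain)."""
    shot = excess_noise**2 * (signal_e + background_e)
    return signal_e / np.sqrt(shot + (read_noise_e / em_gain)**2)

# illustrative low-light case: 5 signal e-/pixel/frame, 1 background e-, 50 e- read noise
print(snr_conventional_ccd(5, 1, 50))      # read-noise limited, SNR << 1
print(snr_emccd(5, 1, 50, em_gain=1000))   # shot-noise limited, SNR > 1
```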

  7. CIFAR10-DVS: An Event-Stream Dataset for Object Classification.

    PubMed

    Li, Hongmin; Liu, Hanchao; Ji, Xiangyang; Li, Guoqi; Shi, Luping

    2017-01-01

    Neuromorphic vision research requires high-quality and appropriately challenging event-stream datasets to support continuous improvement of algorithms and methods. However, creating event-stream datasets is a time-consuming task, since they need to be recorded with neuromorphic cameras. Currently, there are limited event-stream datasets available. In this work, by utilizing the popular computer vision dataset CIFAR-10, we converted 10,000 frame-based images into 10,000 event streams using a dynamic vision sensor (DVS), providing an event-stream dataset of intermediate difficulty in 10 different classes, named "CIFAR10-DVS." The conversion to an event-stream dataset was implemented by a repeated closed-loop smooth (RCLS) movement of the frame-based images. Unlike conversions of frame-based images performed by moving the camera, moving the image itself is more realistic with respect to practical applications. The repeated closed-loop image movement generates rich local intensity changes in continuous time which are quantized by each pixel of the DVS camera to generate events. Furthermore, a performance benchmark in event-driven object classification is provided based on state-of-the-art classification algorithms. This work provides a large event-stream dataset and an initial benchmark for comparison, which may boost algorithm developments in event-driven pattern recognition and object classification.

  8. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.

    PubMed

    Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart

    2017-01-01

    Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built-in web cameras are a standard feature of most smart devices (e.g., laptops, tablets, smart phones) and can be effectively employed to track eye movements on decisional tasks with high accuracy and minimal cost.
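
    The agreement statistics reported above are ordinary Pearson correlations between the two scoring methods. The sketch below shows that computation on hypothetical novelty-preference scores; the numbers are made up for illustration and do not reproduce the study's values.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two sets of novelty-preference scores."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

# usage: hypothetical mean novelty-preference scores for the same subjects
eye_tracker_60fps = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59]
webcam_3fps = [0.60, 0.57, 0.69, 0.50, 0.63, 0.61]
print(round(pearson_r(eye_tracker_60fps, webcam_3fps), 2))
```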

  9. Mission Specialist (MS) Bluford exercises on middeck treadmill

    NASA Image and Video Library

    1983-09-05

    STS008-13-0361 (30 Aug.-5 Sept. 1983) --- Astronaut Guion S. Bluford, STS-8 mission specialist, assists Dr. William E. Thornton (out of frame) with a medical test that requires use of the treadmill exercising device designed for spaceflight by the STS-8 medical doctor. This frame was shot with a 35mm camera. Photo credit: NASA

  10. Mission Specialist Hawley works with the SWUIS experiment

    NASA Image and Video Library

    2013-11-18

    STS093-350-022 (22-27 July 1999) --- Astronaut Steven A. Hawley, mission specialist, works with the Southwest Ultraviolet Imaging System (SWUIS) experiment onboard the Earth-orbiting Space Shuttle Columbia. The SWUIS is based around a Maksutov-design Ultraviolet (UV) telescope and a UV-sensitive, image-intensified Charge-Coupled Device (CCD) camera that frames at video frame rates.

  11. High frame rate imaging systems developed in Northwest Institute of Nuclear Technology

    NASA Astrophysics Data System (ADS)

    Li, Binkang; Wang, Kuilu; Guo, Mingan; Ruan, Linbo; Zhang, Haibing; Yang, Shaohua; Feng, Bing; Sun, Fengrong; Chen, Yanli

    2007-01-01

    This paper presents high frame rate imaging systems developed at the Northwest Institute of Nuclear Technology in recent years. Three types of imaging systems are included. The first type utilizes the EG&G RETICON Photodiode Array (PDA) RA100A as the image sensor, which can work at up to 1000 frames per second (fps). Besides working continuously, the PDA system is also designed to switch to a flash-event capture mode; a specific timing sequence was designed to satisfy this requirement. The camera image data can be transmitted to a remote area by coaxial or optical fiber cable and then stored. The second type utilizes the PHOTOBIT complementary metal oxide semiconductor (CMOS) PB-MV13 as the image sensor, which has a high resolution of 1280 (H) × 1024 (V) pixels per frame. The CMOS system can operate at up to 500 fps in full frame and 4000 fps with partial readout. The prototype scheme of the system is presented. The third type adopts charge-coupled devices (CCD) as the imagers; MINTRON MTV-1881EX, DALSA CA-D1 and CA-D6 camera heads were used in the system development. A comparison of the features of the RA100A-, PB-MV13- and CA-D6-based systems is given at the end.

  12. Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

    PubMed Central

    Everding, Lukas; Conradt, Jörg

    2018-01-01

    In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling those tasks for traditional frame-based cameras, these have to deal with the fact that conventional cameras sample their environment with a fixed frequency. Most prominently, the same features have to be found in consecutive frames and corresponding features then need to be matched using elaborate techniques as any information between the two frames is lost. We introduce a novel method to detect and track line structures in data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of DVS operate asynchronously without a periodic sampling rate and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment which is observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t-space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer, hence it is suitable for low-latency robotics. The efficacy and performance are evaluated on real-world data sets which show artificial structures in an office building, using event data for tracking and frame data for ground-truth estimation from a DAVIS240C sensor. PMID:29515386
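
    The core geometric idea above is that a moving line traces a plane in x-y-t event space. The sketch below is only a toy least-squares version of that idea (fit a plane to one event cluster and read off the edge's normal direction and apparent speed); it is not the authors' full detection-and-tracking pipeline.

```python
import numpy as np

def fit_event_plane(x, y, t):
    """Least-squares fit of a plane t = a*x + b*y + c to a cluster of DVS events.
    The in-image edge normal is along (a, b); its apparent speed along that
    normal is 1 / sqrt(a^2 + b^2) (pixels per unit time)."""
    A = np.column_stack([x, y, np.ones_like(x, dtype=float)])
    (a, b, c), *_ = np.linalg.lstsq(A, t, rcond=None)
    speed = 1.0 / np.hypot(a, b)
    normal = np.array([a, b]) / np.hypot(a, b)
    return (a, b, c), normal, speed

# usage: synthetic events from a vertical edge sweeping right at 2 px/ms
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 20, 400))            # ms
x = 2.0 * t + rng.normal(0, 0.3, t.size)        # edge position plus pixel noise
y = rng.uniform(0, 128, t.size)                 # events spread along the edge
coeffs, normal, speed = fit_event_plane(x, y, t)
print(round(speed, 2), normal.round(2))         # ~2.0 px/ms, normal ~ [1, 0]
```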

  13. Dwarf planet Ceres: Ellipsoid dimensions and rotational pole from Keck and VLT adaptive optics images

    NASA Astrophysics Data System (ADS)

    Drummond, J. D.; Carry, B.; Merline, W. J.; Dumas, C.; Hammel, H.; Erard, S.; Conrad, A.; Tamblyn, P.; Chapman, C. R.

    2014-07-01

    The dwarf planet (1) Ceres, the largest object between Mars and Jupiter, is the target of the NASA Dawn mission, and we seek a comprehensive description of the spin-axis orientation and dimensions of Ceres in order to support the early science operations at the rendezvous in 2015. We have obtained high-angular resolution images using adaptive optics cameras at the W.M. Keck Observatory and the ESO VLT over ten dates between 2001 and 2010, confirming that the shape of Ceres is well described by an oblate spheroid. We derive equatorial and polar diameters of 967 ± 10 km and 892 ± 10 km, respectively, for a model that includes fading of brightness towards the terminator, presumably linked to limb darkening. These dimensions lie between values derived from a previous analysis of a subset of these images obtained at Keck by Carry et al. (Carry et al. [2008]. Astron. Astrophys. 478 (4), 235-244) and a study of Hubble Space Telescope observations (Thomas et al. [2005]. Nature 437, 224-226). Although the dimensions are 1-2% smaller than those found from HST, the oblateness is similar. We find the spin-vector coordinates of Ceres to lie at (287°, +64°) in the equatorial EQJ2000 reference frame (346°, +82° in ecliptic ECJ2000 coordinates), yielding a small obliquity of 3°. While this is in agreement with the aforementioned studies, we have improved the accuracy of the pole determination, which we set at a 3° radius.
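
    The equatorial-to-ecliptic conversion of the pole quoted above is a single rotation by the obliquity of the ecliptic (the standard J2000 value is assumed below). Because the published coordinates are rounded to whole degrees, the output of this sketch agrees with the quoted ecliptic pole only to a degree or two.

```python
import numpy as np

def equatorial_to_ecliptic(ra_deg, dec_deg, obliquity_deg=23.4393):
    """Rotate a J2000 equatorial direction (RA, Dec) about the x-axis by the
    obliquity of the ecliptic to obtain ecliptic longitude and latitude."""
    ra, dec, eps = np.radians([ra_deg, dec_deg, obliquity_deg])
    x = np.cos(dec) * np.cos(ra)
    y = np.cos(dec) * np.sin(ra)
    z = np.sin(dec)
    y_e = y * np.cos(eps) + z * np.sin(eps)
    z_e = -y * np.sin(eps) + z * np.cos(eps)
    lon = np.degrees(np.arctan2(y_e, x)) % 360.0
    lat = np.degrees(np.arcsin(z_e))
    return lon, lat

# the published pole (287 deg, +64 deg) in EQJ2000
print(equatorial_to_ecliptic(287.0, 64.0))   # close to the quoted (346 deg, +82 deg)
```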

  14. Camera-Based Microswitch Technology to Monitor Mouth, Eyebrow, and Eyelid Responses of Children with Profound Multiple Disabilities

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Lang, Russell; Didden, Robert

    2011-01-01

    A camera-based microswitch technology was recently used to successfully monitor small eyelid and mouth responses of two adults with profound multiple disabilities (Lancioni et al., Res Dev Disab 31:1509-1514, 2010a). This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on…

  15. Camera-Based Microswitch Technology for Eyelid and Mouth Responses of Persons with Profound Multiple Disabilities: Two Case Studies

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff

    2010-01-01

    These two studies assessed camera-based microswitch technology for eyelid and mouth responses of two persons with profound multiple disabilities and minimal motor behavior. This technology, in contrast with the traditional optic microswitches used for those responses, did not require support frames on the participants' face but only small color…

  16. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  17. Passive stand-off terahertz imaging with 1 hertz frame rate

    NASA Astrophysics Data System (ADS)

    May, T.; Zieger, G.; Anders, S.; Zakosarenko, V.; Starkloff, M.; Meyer, H.-G.; Thorwirth, G.; Kreysa, E.

    2008-04-01

    Terahertz (THz) cameras are expected to be a powerful tool for future security applications. If such a technology is to be useful for typical security scenarios (e.g., airport check-in), it has to meet some minimum standards. A THz camera should record images at video rate from a safe distance (stand-off). Although active cameras are conceivable, a passive system has the benefit of concealed operation. Additionally, from an ethical perspective, the lack of exposure to a radiation source is a considerable advantage for public acceptance. Taking all these requirements into account, only cooled detectors are able to achieve the needed sensitivity. A big leap forward in detector performance and scalability was driven by the astrophysics community. Superconducting bolometers and midsized arrays of them have been developed and are in routine use. Although devices with many pixels are foreseeable, at present a device with an additional scanning optic is the most direct route to an imaging system with useful resolution. We demonstrate the capabilities of a concept for a passive terahertz video camera based on superconducting technology. The current prototype utilizes a small Cassegrain telescope with a gyrating secondary mirror to record 2-kilopixel THz images at a 1-second frame rate.

  18. Geometric Integration of Hybrid Correspondences for RGB-D Unidirectional Tracking

    PubMed Central

    Tang, Shengjun; Chen, Wu; Wang, Weixi; Li, Xiaoming; Li, Wenbin; Huang, Zhengdong; Hu, Han; Guo, Renzhong

    2018-01-01

    Traditionally, visual-based RGB-D SLAM systems only use correspondences with valid depth values for camera tracking, thus ignoring the regions without 3D information. Due to the strict limitation on measurement distance and view angle, such systems adopt only short-range constraints which may introduce larger drift errors during long-distance unidirectional tracking. In this paper, we propose a novel geometric integration method that makes use of both 2D and 3D correspondences for RGB-D tracking. Our method handles the problem by exploring visual features both when depth information is available and when it is unknown. The system comprises two parts: coarse pose tracking with 3D correspondences, and geometric integration with hybrid correspondences. First, the coarse pose tracking generates the initial camera pose using 3D correspondences with frame-by-frame registration. The initial camera poses are then used as inputs for the geometric integration model, along with 3D correspondences, 2D-3D correspondences and 2D correspondences identified from frame pairs. The initial 3D location of the correspondence is determined in two ways, from depth image and by using the initial poses to triangulate. The model improves the camera poses and decreases drift error during long-distance RGB-D tracking iteratively. Experiments were conducted using data sequences collected by commercial Structure Sensors. The results verify that the geometric integration of hybrid correspondences effectively decreases the drift error and improves mapping accuracy. Furthermore, the model enables a comparative and synergistic use of datasets, including both 2D and 3D features. PMID:29723974
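
    The coarse frame-by-frame registration step described above can be illustrated with the standard SVD (Kabsch) solution for the rigid transform between two sets of matched 3D points; this is only a generic sketch of that single step, not the paper's full hybrid 2D/3D integration model.

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points
    (both N x 3), via the SVD-based Kabsch/Procrustes solution."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# usage: recover a known 10-degree yaw plus translation from noisy correspondences
rng = np.random.default_rng(3)
pts = rng.uniform(-1, 1, (200, 3))
ang = np.radians(10.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0], [np.sin(ang), np.cos(ang), 0], [0, 0, 1]])
t_true = np.array([0.3, -0.1, 0.05])
obs = pts @ R_true.T + t_true + rng.normal(0, 0.01, pts.shape)
R_est, t_est = rigid_transform_3d(pts, obs)
print(np.allclose(R_est, R_true, atol=0.02), t_est.round(2))
```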

  19. Geometric Integration of Hybrid Correspondences for RGB-D Unidirectional Tracking.

    PubMed

    Tang, Shengjun; Chen, Wu; Wang, Weixi; Li, Xiaoming; Darwish, Walid; Li, Wenbin; Huang, Zhengdong; Hu, Han; Guo, Renzhong

    2018-05-01

    Traditionally, visual-based RGB-D SLAM systems only use correspondences with valid depth values for camera tracking, thus ignoring the regions without 3D information. Due to the strict limitation on measurement distance and view angle, such systems adopt only short-range constraints which may introduce larger drift errors during long-distance unidirectional tracking. In this paper, we propose a novel geometric integration method that makes use of both 2D and 3D correspondences for RGB-D tracking. Our method handles the problem by exploring visual features both when depth information is available and when it is unknown. The system comprises two parts: coarse pose tracking with 3D correspondences, and geometric integration with hybrid correspondences. First, the coarse pose tracking generates the initial camera pose using 3D correspondences with frame-by-frame registration. The initial camera poses are then used as inputs for the geometric integration model, along with 3D correspondences, 2D-3D correspondences and 2D correspondences identified from frame pairs. The initial 3D location of the correspondence is determined in two ways, from depth image and by using the initial poses to triangulate. The model improves the camera poses and decreases drift error during long-distance RGB-D tracking iteratively. Experiments were conducted using data sequences collected by commercial Structure Sensors. The results verify that the geometric integration of hybrid correspondences effectively decreases the drift error and improves mapping accuracy. Furthermore, the model enables a comparative and synergistic use of datasets, including both 2D and 3D features.

  20. Single-camera visual odometry to track a surgical X-ray C-arm base.

    PubMed

    Esfandiari, Hooman; Lichti, Derek; Anglin, Carolyn

    2017-12-01

    This study provides a framework for a single-camera odometry system for localizing a surgical C-arm base. An application-specific monocular visual odometry system (a downward-looking consumer-grade camera rigidly attached to the C-arm base) is proposed in this research. The cumulative dead-reckoning estimation of the base is extracted based on frame-to-frame homography estimation. Optical-flow results are utilized to feed the odometry. Online positional and orientation parameters are then reported. Positional accuracy of better than 2% (of the total traveled distance) for most of the cases and 4% for all the cases studied and angular accuracy of better than 2% (of absolute cumulative changes in orientation) were achieved with this method. This study provides a robust and accurate tracking framework that not only can be integrated with the current C-arm joint-tracking system (i.e. TC-arm) but also is capable of being employed for similar applications in other fields (e.g. robotics).
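
    A minimal version of the frame-to-frame homography chaining described above can be sketched with OpenCV, assuming a downward-looking video of the floor; the video path is a placeholder, and no scale calibration or drift handling is included.

```python
import cv2
import numpy as np

def track_base_displacement(video_path):
    """Dead-reckon the in-plane motion of a downward-looking camera by chaining
    frame-to-frame homographies estimated from sparse optical flow. The returned
    path is the accumulated image shift of the floor at the frame centre, in pixels
    (the negative of the camera translation, up to an unknown pixel-to-metre scale)."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("cannot read video")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = prev.shape
    pose = np.eye(3)                                  # cumulative homography
    path_px = [(0.0, 0.0)]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        p0 = cv2.goodFeaturesToTrack(prev, maxCorners=400, qualityLevel=0.01, minDistance=8)
        if p0 is not None and len(p0) >= 8:
            p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, p0, None)
            good = status.ravel() == 1
            if good.sum() >= 4:
                H, _ = cv2.findHomography(p0[good], p1[good], cv2.RANSAC, 3.0)
                if H is not None:
                    pose = H @ pose                   # chain the frame-to-frame estimates
                    c = pose @ np.array([w / 2.0, h / 2.0, 1.0])
                    path_px.append((c[0] / c[2] - w / 2.0, c[1] / c[2] - h / 2.0))
        prev = gray
    cap.release()
    return np.array(path_px)

# usage (the file name is a placeholder): track_base_displacement("c_arm_base.mp4")
```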

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ranson, W.F.; Schaeffel, J.A.; Murphree, E.A.

    The response of prestressed and preheated plates subject to an exponentially decaying blast load was experimentally determined. A grid was reflected from the front surface of the plate and the response was recorded with a high speed camera. The camera used in this analysis was a rotating drum camera operating at 20,000 frames per second with a maximum of 224 frames at 39 microseconds separation. In-plane tension loads were applied to the plate by means of air cylinders. Maximum biaxial load applied to the plate was 500 pounds. Plate preheating was obtained with resistance heaters located in the specimen plate holder with a maximum capability of 500°F. Data analysis was restricted to the maximum conditions at the center of the plate. Strains were determined from the photographic data and the stresses were calculated from the strain data. Results were obtained from zero preload conditions to a maximum of 480 pounds in-plane tension load and a plate temperature of 490°F. The blast load ranged from 6 to 23 psi.

  2. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  3. Fast visible imaging of turbulent plasma in TORPEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iraji, D.; Diallo, A.; Fasoli, A.

    2008-10-15

    Fast framing cameras constitute an important recent diagnostic development aimed at monitoring light emission from magnetically confined plasmas, and are now commonly used to study turbulence in plasmas. In the TORPEX toroidal device [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], low frequency electrostatic fluctuations associated with drift-interchange waves are routinely measured by means of extensive sets of Langmuir probes. A Photron Ultima APX-RS fast framing camera has recently been acquired to complement Langmuir probe measurements, which allows comparing statistical and spectral properties of visible light and electrostatic fluctuations. A direct imaging system has been developed, which allows viewing the light emitted from microwave-produced plasmas both tangentially and perpendicularly to the toroidal direction. The comparison of the probability density function, power spectral density, and autoconditional average of the camera data to those obtained using a multiple head electrostatic probe covering the plasma cross section shows reasonable agreement in the case of perpendicular view and in the plasma region where interchange modes dominate.

  4. Iodine filter imaging system for subtraction angiography using synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Umetani, K.; Ueda, K.; Takeda, T.; Itai, Y.; Akisada, M.; Nakajima, T.

    1993-11-01

    A new type of real-time imaging system was developed for transvenous coronary angiography. A combination of an iodine filter and a single energy broad-bandwidth X-ray produces two-energy images for the iodine K-edge subtraction technique. X-ray images are sequentially converted to visible images by an X-ray image intensifier. By synchronizing the timing of the movement of the iodine filter into and out of the X-ray beam, two output images of the image intensifier are focused side by side on the photoconductive layer of a camera tube by an oscillating mirror. Both images are read out by electron beam scanning of a 1050-scanning-line video camera within a camera frame time of 66.7 ms. One hundred ninety two pairs of iodine-filtered and non-iodine-filtered images are stored in the frame memory at a rate of 15 pairs/s. In vivo subtracted images of coronary arteries in dogs were obtained in the form of motion pictures.

  5. Pre-dawn stomatal opening does not substantially enhance early-morning photosynthesis in Helianthus annuus.

    PubMed

    Auchincloss, Lisa; Easlon, Hsien M; Levine, Diedre; Donovan, Lisa; Richards, James H

    2014-06-01

    Most C3 plant species have partially open stomata during the night especially in the 3-5 h before dawn. This pre-dawn stomatal opening has been hypothesized to enhance early-morning photosynthesis (A) by reducing diffusion limitations to CO2 at dawn. We tested this hypothesis in cultivated Helianthus annuus using whole-shoot gas exchange, leaf level gas exchange and modelling approaches. One hour pre-dawn low-humidity treatments were used to reduce pre-dawn stomatal conductance (g). At the whole-shoot level, a difference of pre-dawn g (0.40 versus 0.17 mol m(-2) s(-1)) did not significantly affect A during the first hour after dawn. Shorter term effects were investigated with leaf level gas exchange measurements and a difference of pre-dawn g (0.10 versus 0.04 mol m(-2) s(-1)) affected g and A for only 5 min after dawn. The potential effects of a wider range of stomatal apertures were explored with an empirical model of the relationship between A and intercellular CO2 concentration during the half-hour after dawn. Modelling results demonstrated that even extremely low pre-dawn stomatal conductance values have only a minimal effect on early-morning A for a few minutes after dawn. Thus, we found no evidence that pre-dawn stomatal opening enhances A.

  6. Polarization patterns of the twilight sky

    NASA Astrophysics Data System (ADS)

    Cronin, Thomas W.; Warrant, Eric J.; Greiner, Birgit

    2005-08-01

    Although natural light sources produce depolarized light, patterns of partially linearly polarized light appear in the sky due to scattering from air molecules, dust, and aerosols. Many animals, including bees and ants, orient themselves to patterns of polarization that are present in daytime skies, when the intensity is high and skylight polarization is strong and predictable. The halictid bee Megalopta genalis inhabits rainforests in Central America. Unlike typical bees, it forages before sunrise and after sunset, when light intensities under the forest canopy are very low, and must find its way to food sources and return to its nest in visually challenging circumstances. An important cue for this orientation could be patterns of polarization in the twilight sky. Therefore, we used a calibrated digital camera to image skylight polarization in an overhead patch of sky, 87.6° across, before dawn on Barro Colorado Island in Panama, where the bees are found. We simultaneously measured the spectral properties of polarized light in a cloudless patch of sky 15° across centered on the zenith. We also performed full-sky imaging of polarization before dawn and after dusk on Lizard Island in Australia, another tropical island. During twilight, celestial polarized light occurs in a wide band stretching perpendicular to the location of the hidden sun and reaching typical degrees of polarization near 80% at wavelengths >600 nm. This pattern appears about 45 minutes before local sunrise or disappears 45 minutes after local sunset (about 20 minutes after the onset of astronomical twilight at dawn, or before its end at dusk) and extends with little change through the entire twilight period. Such a strong and reliable orientation cue could be used for flight orientation by any animal with polarization sensitivity that navigates during twilight.

  7. FC colour images of dwarf planet Ceres reveal a complicated geological history

    NASA Astrophysics Data System (ADS)

    Nathues, A.; Hoffmann, M.; Platz, T.; Thangjam, G. S.; Cloutis, E. A.; Reddy, V.; Le Corre, L.; Li, J.-Y.; Mengel, K.; Rivkin, A.; Applin, D. M.; Schaefer, M.; Christensen, U.; Sierks, H.; Ripken, J.; Schmidt, B. E.; Hiesinger, H.; Sykes, M. V.; Sizemore, H. G.; Preusker, F.; Russell, C. T.

    2016-12-01

    The dwarf planet Ceres (equatorial diameter 963 km) is the largest object that has remained in the main asteroid belt (Russell and Raymond, 2012), while most large bodies have been destroyed or removed by dynamical processes (Petit et al. 2001; Minton and Malhotra, 2009). Pre-Dawn investigations (McCord and Sotin, 2005; Castillo-Rogez and McCord, 2010; Castillo-Rogez et al., 2011) suggest that Ceres is a thermally evolved, but still volatile-rich body with potential geological activity, that was never completely molten, but possibly differentiated into a rocky core, an ice-rich mantle, and may contain remnant internal liquid water. Thermal alteration should contribute to producing a (dark) carbonaceous chondritic-like surface (McCord and Sotin, 2005; Castillo-Rogez and McCord, 2010; Castillo-Rogez et al., 2011; Nathues et al., 2015) containing ammoniated phyllosilicates (King et al., 1992; De Sanctis et al., 2015 and 2016). Here we show and analyse global contrast-rich colour mosaics, derived from a camera on-board Dawn at Ceres (Russell et al., 2016). Colours are unexpectedly more diverse on a global scale than anticipated by Hubble Space Telescope (Li et al., 2006) and ground-based observations (Reddy et al. 2015). Dawn data led to the identification of five major colour units. The youngest units identified by crater counting, termed bright and bluish units, are exclusively found at equatorial and intermediate latitudes. We identified correlations between the distribution of the colour units, crater size, and formation age, inferring a crustal stratigraphy. Surface brightness and spectral properties are not correlated. The youngest surface features are the bright spots at crater Occator (Ø 92 km). Their colour spectra are highly consistent with the presence of carbonates while most of the remaining surface resembles modifications of various types of ordinary carbonaceous chondrites.

  8. Measuring full-field displacement spectral components using photographs taken with a DSLR camera via an analogue Fourier integral

    NASA Astrophysics Data System (ADS)

    Javh, Jaka; Slavič, Janko; Boltežar, Miha

    2018-02-01

    Instantaneous full-field displacement fields can be measured using cameras. In fact, using high-speed cameras, full-field spectral information up to a couple of kHz can be measured. The trouble is that high-speed cameras capable of measuring high-resolution fields-of-view at high frame rates prove to be very expensive (from tens to hundreds of thousands of euros per camera). This paper introduces a measurement set-up capable of measuring high-frequency vibrations using slow cameras such as DSLRs, mirrorless cameras and others. The high-frequency displacements are measured by harmonically blinking the lights at specified frequencies. This harmonic blinking of the lights modulates the intensity changes of the filmed scene, and the camera's image acquisition performs the integration over time, thereby producing full-field Fourier coefficients of the filmed structure's displacements.
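
    How a long exposure under harmonically blinking lights yields a Fourier coefficient can be written out explicitly. The following is a schematic derivation in our own notation (not the paper's), assuming the modulation spans an integer number of periods:

```latex
% Exposure of duration T = N/f under illumination L(t) = L_0 [1 + cos(2*pi*f*t + phi)],
% with scene brightness b(x, t) modulated by the structure's displacement:
\[
I(x) \;=\; \int_{0}^{T} L(t)\, b(x,t)\, \mathrm{d}t
       \;=\; L_0 \int_{0}^{T} b(x,t)\, \mathrm{d}t
       \;+\; L_0\, \mathrm{Re}\!\left\{ e^{\mathrm{i}\varphi}
             \int_{0}^{T} b(x,t)\, e^{\mathrm{i}\,2\pi f t}\, \mathrm{d}t \right\}.
\]
```

    Apart from the DC term, a single exposure therefore records the real part of the scene's Fourier coefficient at the blinking frequency f; repeating the exposure with the modulation phase shifted by 90° gives the imaginary part, so the amplitude and phase of the vibration at f can be recovered from ordinary photographs.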

  9. Masked-backlighter technique used to simultaneously image x-ray absorption and x-ray emission from an inertial confinement fusion plasma.

    PubMed

    Marshall, F J; Radha, P B

    2014-11-01

    A method to simultaneously image both the absorption and the self-emission of an imploding inertial confinement fusion plasma has been demonstrated on the OMEGA Laser System. The technique involves the use of a high-Z backlighter, half of which is covered with a low-Z material, and a high-speed x-ray framing camera aligned to capture images backlit by this masked backlighter. Two strips of the four-strip framing camera record images backlit by the high-Z portion of the backlighter, while the other two strips record images aligned with the low-Z portion of the backlighter. The emission from the low-Z material is effectively eliminated by a high-Z filter positioned in front of the framing camera, limiting the detected backlighter emission to that of the principal emission line of the high-Z material. As a result, half of the images are of self-emission from the plasma and the other half are of self-emission plus the backlighter. The advantage of this technique is that the self-emission simultaneous with backlighter absorption is independently measured from a nearby direction. The absorption occurs only in the high-Z backlit frames and is either spatially separated from the emission or the self-emission is suppressed by filtering, or by using a backlighter much brighter than the self-emission, or by subtraction. The masked-backlighter technique has been used on the OMEGA Laser System to simultaneously measure the emission profiles and the absorption profiles of polar-driven implosions.

  10. Triton Mosaic

    NASA Image and Video Library

    1999-08-25

    Mosaic of Triton constructed from 16 individual images. After globally minimizing the camera pointing errors, the frames were reprocessed by map projection, photometric function removal and placement in the mosaic.

  11. Texton-based super-resolution for achieving high spatiotemporal resolution in hybrid camera system

    NASA Astrophysics Data System (ADS)

    Kamimura, Kenji; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2010-05-01

    Many super-resolution methods have been proposed to enhance the spatial resolution of images by using iteration and multiple input images. In a previous paper, we proposed the example-based super-resolution method to enhance an image through pixel-based texton substitution to reduce the computational cost. In this method, however, we only considered the enhancement of a texture image. In this study, we modified this texton substitution method for a hybrid camera to reduce the required bandwidth of a high-resolution video camera. We applied our algorithm to pairs of high- and low-spatiotemporal-resolution videos, which were synthesized to simulate a hybrid camera. The results showed that the fine detail of the low-resolution video could be reproduced, in contrast to bicubic interpolation, and that the required bandwidth of the video camera could be reduced to about 1/5. It was also shown that the peak signal-to-noise ratios (PSNRs) of the images improved by about 6 dB in a trained frame and by 1.0-1.5 dB in a test frame, as determined by comparison with the processed image using bicubic interpolation, and the average PSNRs were higher than those obtained by Freeman’s well-known patch-based super-resolution method. Compared with Freeman’s patch-based super-resolution method, the computational time of our method was reduced to almost 1/10.

  12. Toward real-time quantum imaging with a single pixel camera

    DOE PAGES

    Lawrie, B. J.; Pooser, R. C.

    2013-03-19

    In this paper, we present a workbench for the study of real-time quantum imaging by measuring the frame-by-frame quantum noise reduction of multi-spatial-mode twin beams generated by four wave mixing in Rb vapor. Exploiting the multiple spatial modes of this squeezed light source, we utilize spatial light modulators to selectively pass macropixels of quantum correlated modes from each of the twin beams to a high quantum efficiency balanced detector. Finally, in low-light-level imaging applications, the ability to measure the quantum correlations between individual spatial modes and macropixels of spatial modes with a single pixel camera will facilitate compressive quantum imaging with sensitivity below the photon shot noise limit.

  13. An Application for Driver Drowsiness Identification based on Pupil Detection using IR Camera

    NASA Astrophysics Data System (ADS)

    Kumar, K. S. Chidanand; Bhowmick, Brojeshwar

    A driver drowsiness identification system has been proposed that generates alarms when the driver falls asleep while driving. A number of different physical phenomena can be monitored and measured in order to detect drowsiness of the driver in a vehicle. This paper presents a methodology for driver drowsiness identification using an IR camera by detecting and tracking the pupils. The face region is first determined using the Euler number and template matching. The pupils are then located in the face region. In subsequent frames of the video, the pupils are tracked in order to determine whether the eyes are open or closed. If the eyes are closed for several consecutive frames, it is concluded that the driver is fatigued and an alarm is generated.
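
    The closed-for-several-consecutive-frames rule described above is easy to state precisely; the sketch below implements it with an illustrative threshold (the frame count and alarm policy are our assumptions, not the paper's).

```python
class DrowsinessMonitor:
    """Raise an alarm when the eyes are judged closed (no pupils tracked) for a
    run of consecutive frames. The threshold of 15 frames (~0.5 s at 30 fps)
    is an illustrative choice."""

    def __init__(self, closed_frames_threshold=15):
        self.threshold = closed_frames_threshold
        self.closed_run = 0

    def update(self, pupils_detected: bool) -> bool:
        """Feed one frame's detection result; return True if the alarm should sound."""
        self.closed_run = 0 if pupils_detected else self.closed_run + 1
        return self.closed_run >= self.threshold

# usage on a synthetic detection sequence: eyes close at frame 40
monitor = DrowsinessMonitor()
alarms = [monitor.update(i < 40) for i in range(60)]
print(alarms.index(True))   # first alarm on the 15th consecutive closed frame
```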

  14. Compact Kirkpatrick–Baez microscope mirrors for imaging laser-plasma x-ray emission

    DOE PAGES

    Marshall, F. J.

    2012-07-18

    Compact Kirkpatrick–Baez microscope mirror components for use in imaging laser-plasma x-ray emission have been manufactured, coated, and tested. A single mirror pair has dimensions of 14 × 7 × 9 mm and a best resolution of ~5 μm. The mirrors are coated with Ir, providing a useful energy range of 2-8 keV when operated at a grazing angle of 0.7°. The mirrors can be circularly arranged to provide 16 images of the target emission, a configuration best suited for use in combination with a custom framing camera. As a result, an alternative arrangement of the mirrors would allow alignment of the images with a four-strip framing camera.

  15. A real-time ultrasonic field mapping system using a Fabry Pérot single pixel camera for 3D photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Huynh, Nam; Zhang, Edward; Betcke, Marta; Arridge, Simon R.; Beard, Paul; Cox, Ben

    2015-03-01

    A system for dynamic mapping of broadband ultrasound fields has been designed, with high frame rate photoacoustic imaging in mind. A Fabry-Pérot interferometric ultrasound sensor was interrogated using a coherent light single-pixel camera. Scrambled Hadamard measurement patterns were used to sample the acoustic field at the sensor, and either a fast Hadamard transform or a compressed sensing reconstruction algorithm was used to recover the acoustic pressure data. Frame rates of 80 Hz were achieved for 32x32 images even though no specialist hardware was used for the on-the-fly reconstructions. The ability of the system to obtain photoacoustic images with data compressions as low as 10% was also demonstrated.
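
    The measurement model described above (scrambled Hadamard patterns read out by a single detector) can be simulated in a few lines. The sketch below is an idealization: it uses ±1 patterns and, for partial sampling, simple zero-filling rather than the compressed-sensing reconstruction used in the paper.

```python
import numpy as np
from scipy.linalg import hadamard

def single_pixel_measure_and_reconstruct(image, keep_fraction=1.0, seed=0):
    """Simulate single-pixel detection of an n-pixel image with scrambled Hadamard
    patterns and reconstruct it with the (fast-invertible) Hadamard transform."""
    n = image.size
    H = hadamard(n)                            # +/-1 Hadamard matrix, n a power of two
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n)                  # the "scrambling" of pixel order
    x = image.ravel()[perm].astype(float)
    m = int(keep_fraction * n)
    y = H[:m] @ x                              # the m single-pixel measurements
    x_rec_scrambled = (H.T[:, :m] @ y) / n     # inverse Hadamard of zero-filled data
    x_rec = np.zeros(n)
    x_rec[perm] = x_rec_scrambled              # undo the scrambling
    return x_rec.reshape(image.shape)

# usage: a 32x32 test pattern; with full sampling the reconstruction is exact
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
rec = single_pixel_measure_and_reconstruct(img, keep_fraction=1.0)
print(np.allclose(rec, img))   # True
```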

  16. Correction And Use Of Jitter In Television Images

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B.; Fender, Derek H.; Fender, Antony R. H.

    1989-01-01

    The proposed system stabilizes a jittering television image and/or measures the jitter to extract information on the motions of objects in the image. In an alternative version, the system controls lateral motion of the camera to generate stereoscopic views for measuring distances to objects. In another version, the motion of the camera is controlled to keep an object in view. The heart of the system is a digital image-data processor called the "jitter-miser", which includes a frame buffer and logic circuits to correct for jitter in the image. Signals from motion sensors on the camera are sent to the logic circuits and processed into corrections for motion along and across the line of sight.

  17. Rapid and highly integrated FPGA-based Shack-Hartmann wavefront sensor for adaptive optics system

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Pin; Chang, Chia-Yuan; Chen, Shean-Jen

    2018-02-01

    In this study, a field programmable gate array (FPGA)-based Shack-Hartmann wavefront sensor (SHWS) programmed in LabVIEW can be highly integrated into customized applications such as an adaptive optics system (AOS) for performing real-time wavefront measurement. Further, a Camera Link frame grabber with an embedded FPGA is adopted to enhance the sensor's reaction speed, taking advantage of its very high data transmission bandwidth. Instead of waiting for a full frame image to be captured by the FPGA, the Shack-Hartmann algorithm is implemented as parallel processing blocks so that the image data transmission is synchronized with the wavefront reconstruction. In addition, we design a mechanism to control the deformable mirror in the same FPGA and verify the Shack-Hartmann sensor speed by controlling the frequency of the deformable mirror's dynamic surface deformation. Currently, this FPGA-based SHWS design can achieve a 266 Hz cyclic speed, limited by the camera frame rate, while leaving 40% of the logic slices available for additional flexible designs.
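
    The per-subaperture processing that such a sensor parallelizes is essentially spot centroiding. The sketch below shows the serial, software version of that computation (split the frame into subapertures, compute spot centroids, subtract a reference) purely as an illustration; it says nothing about the paper's FPGA block design.

```python
import numpy as np

def shack_hartmann_slopes(frame, n_sub, ref_centroids=None):
    """Split a square SHWS frame into n_sub x n_sub subapertures, compute the
    centre-of-mass spot position in each, and return offsets from the reference
    centroids (subaperture centres if no reference is given)."""
    h, w = frame.shape
    sy, sx = h // n_sub, w // n_sub
    centroids = np.zeros((n_sub, n_sub, 2))
    yy, xx = np.mgrid[0:sy, 0:sx]
    for i in range(n_sub):
        for j in range(n_sub):
            cell = frame[i * sy:(i + 1) * sy, j * sx:(j + 1) * sx].astype(float)
            total = cell.sum() + 1e-12
            centroids[i, j] = (np.sum(yy * cell) / total, np.sum(xx * cell) / total)
    if ref_centroids is None:
        ref_centroids = np.zeros((n_sub, n_sub, 2))
        ref_centroids[..., 0] = (sy - 1) / 2.0
        ref_centroids[..., 1] = (sx - 1) / 2.0
    return centroids - ref_centroids              # (dy, dx) slope proxies per subaperture

# usage: 8x8 lenslets on a 128x128 frame, every spot at local pixel (7, 9) of its cell
frame = np.zeros((128, 128))
frame[7::16, 9::16] = 1.0
print(shack_hartmann_slopes(frame, 8).mean(axis=(0, 1)).round(2))   # ~[-0.5, 1.5]
```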

  18. Online tracking of outdoor lighting variations for augmented reality with moving cameras.

    PubMed

    Liu, Yanli; Granier, Xavier

    2012-04-01

    In augmented reality, one of the key tasks in achieving convincing visual-appearance consistency between virtual objects and video scenes is to maintain coherent illumination along the whole sequence. As outdoor illumination is largely dependent on the weather, the lighting condition may change from frame to frame. In this paper, we propose a fully image-based approach for online tracking of outdoor illumination variations from videos captured with moving cameras. Our key idea is to estimate the relative intensities of sunlight and skylight via a sparse set of planar feature-points extracted from each frame. To address the inevitable feature misalignments, a set of constraints is introduced to select the most reliable ones. Exploiting the spatial and temporal coherence of illumination, the relative intensities of sunlight and skylight are finally estimated by using an optimization process. We validate our technique on a set of real-life videos and show that the results with our estimations are visually coherent along the video sequences.

  19. Eye pupil detection system using an ensemble of regression forest and fast radial symmetry transform with a near infrared camera

    NASA Astrophysics Data System (ADS)

    Jeong, Mira; Nam, Jae-Yeal; Ko, Byoung Chul

    2017-09-01

    In this paper, we focus on pupil center detection in various video sequences that include varying head poses and changes in illumination. To detect the pupil center, we first find four eye landmarks in each eye by using cascade local regression based on a regression forest. Based on the rough location of the pupil, a fast radial symmetry transform is applied using the previously found pupil location to refine the pupil center. As the final step, the pupil displacement is estimated between the previous frame and the current frame to maintain the level of accuracy against a false locating result occurring in a particular frame. We generated a new face dataset, called Keimyung University pupil detection (KMUPD), with an infrared camera. The proposed method was successfully applied to the KMUPD dataset, and the results indicate that its pupil center detection capability is better than that of other methods, with a shorter processing time.

  20. Alignment of cryo-EM movies of individual particles by optimization of image translations.

    PubMed

    Rubinstein, John L; Brubaker, Marcus A

    2015-11-01

    Direct detector device (DDD) cameras have revolutionized single particle electron cryomicroscopy (cryo-EM). In addition to an improved camera detective quantum efficiency, acquisition of DDD movies allows for correction of movement of the specimen, due to both instabilities in the microscope specimen stage and electron beam-induced movement. Unlike specimen stage drift, beam-induced movement is not always homogeneous within an image. Local correlation in the trajectories of nearby particles suggests that beam-induced motion is due to deformation of the ice layer. Algorithms have already been described that can correct movement for large regions of frames and for >1 MDa protein particles. Another algorithm allows individual <1 MDa protein particle trajectories to be estimated, but requires rolling averages to be calculated from frames and fits linear trajectories for particles. Here we describe an algorithm that allows for individual <1 MDa particle images to be aligned without frame averaging or linear trajectories. The algorithm maximizes the overall correlation of the shifted frames with the sum of the shifted frames. The optimum in this single objective function is found efficiently by making use of analytically calculated derivatives of the function. To smooth estimates of particle trajectories, rapid changes in particle positions between frames are penalized in the objective function and weighted averaging of nearby trajectories ensures local correlation in trajectories. This individual particle motion correction, in combination with weighting of Fourier components to account for increasing radiation damage in later frames, can be used to improve 3-D maps from single particle cryo-EM. Copyright © 2015 Elsevier Inc. All rights reserved.
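
    The objective sketched in the abstract (correlation of each shifted frame with the sum of shifted frames, plus a smoothness penalty on the shifts) can be written schematically as follows, in our own notation rather than the paper's:

```latex
% Frames f_j, unknown per-frame shifts s_j, smoothness weight lambda:
\[
\max_{\{\mathbf{s}_1,\dots,\mathbf{s}_J\}} \;
  \sum_{j=1}^{J} \operatorname{corr}\!\Big( f_j(\mathbf{r}-\mathbf{s}_j),\;
      \sum_{k=1}^{J} f_k(\mathbf{r}-\mathbf{s}_k) \Big)
  \;-\; \lambda \sum_{j=2}^{J} \lVert \mathbf{s}_j - \mathbf{s}_{j-1} \rVert^{2}
\]
```

    Because this is a single smooth objective, its derivatives with respect to the shifts can be written analytically and the optimum found efficiently, as the abstract notes; weighted averaging of nearby particle trajectories then enforces the local correlation of the estimated motion.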

  1. The Input-Interface of Webcam Applied in 3D Virtual Reality Systems

    ERIC Educational Resources Information Center

    Sun, Huey-Min; Cheng, Wen-Lin

    2009-01-01

    Our research explores a virtual reality application based on a Web camera (Webcam) input interface. The interface can replace the mouse for controlling the directional intention of a user by the method of frame differencing. We divide each Webcam frame into nine grids and make use of background registration to compute the moving object. In order to…

  2. View of Saudi Arabia and north eastern Africa from the Apollo 17 spacecraft

    NASA Image and Video Library

    1972-12-09

    AS17-148-22718 (7-19 Dec. 1972) --- This excellent view of Saudi Arabia and the north eastern portion of the African continent was photographed by the Apollo 17 astronauts with a hand-held camera on their trans-lunar coast toward man's last lunar visit. Egypt, Sudan and Ethiopia are some of the African nations visible. Iran, Iraq and Jordan are not so clearly visible because of cloud cover and their particular location in the picture. India is dimly visible at right of frame. The Red Sea is seen entirely in this one single frame, a rare occurrence in Apollo photography or any photography taken from manned spacecraft. The Gulf of Suez, the Dead Sea, Gulf of Aden, Persian Gulf and Gulf of Oman are also visible. This frame is one of 169 frames on film magazine NN carried aboard Apollo 17, all of which are SO368 (color) film. A 250mm lens on a 70mm Hasselblad camera recorded the image, one of 92 taken during the trans-lunar coast. Note AS17-148-22727 (also magazine NN) for an excellent full Earth picture showing the entire African continent.

  3. DAWN: Dynamic Ad-hoc Wireless Network

    DTIC Science & Technology

    2016-06-19

    DAWN: Dynamic Ad-hoc Wireless Network The DAWN (Dynamic Ad-hoc Wireless Networks) project is developing a general theory of complex and dynamic... wireless communication networks. To accomplish this, DAWN adopts a very different approach than those followed in the past and summarized above. DAWN... wireless communication networks. The members of DAWN investigated different aspects of wireless mobile ad hoc networks (MANET). The views, opinions and/or

  4. C-RED One and C-RED2: SWIR high-performance cameras using Saphira e-APD and Snake InGaAs detectors

    NASA Astrophysics Data System (ADS)

    Gach, Jean-Luc; Feautrier, Philippe; Stadler, Eric; Clop, Fabien; Lemarchand, Stephane; Carmignani, Thomas; Wanwanscappel, Yann; Boutolleau, David

    2018-02-01

    After the development of the OCAM2 EMCCD fast visible camera dedicated to advanced adaptive optics wavefront sensing, First Light Imaging moved to fast SWIR cameras with the development of the C-RED One and the C-RED 2 cameras. First Light Imaging's C-RED One infrared camera is capable of capturing up to 3500 full frames per second with subelectron readout noise and very low background. C-RED One is based on the latest version of the SAPHIRA detector developed by Leonardo UK. This breakthrough has been made possible thanks to the use of an e-APD infrared focal plane array, which is a truly disruptive technology in imaging. C-RED One is an autonomous system with an integrated cooling system and a vacuum regeneration system. It operates its sensor with a wide variety of readout techniques and processes video on-board thanks to an FPGA. We will show its performance and present its main features. In addition to this project, First Light Imaging developed an InGaAs 640x512 fast camera with unprecedented performance in terms of noise, dark current and readout speed, based on the SNAKE SWIR detector from Sofradir. The camera was called C-RED 2. The C-RED 2 characteristics and performance will be described. The C-RED One project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944. The C-RED 2 development is supported by the "Investments for the future" program and the Provence Alpes Côte d'Azur Region, in the framework of the CPER.

  5. Choreographing the Frame: A Critical Investigation into How Dance for the Camera Extends the Conceptual and Artistic Boundaries of Dance

    ERIC Educational Resources Information Center

    Preston, Hilary

    2006-01-01

    This essay investigates the collaboration between dance and choreographic practice and film/video medium in a contemporary context. By looking specifically at dance made for the camera and the proliferation of dance-film/video, critical issues will be explored that have surfaced in response to this burgeoning form. Presenting a view of avant-garde…

  6. Double and Multiple Star Measurements in the Northern Sky with a 10" Newtonian and a Fast CCD Camera in 2006 through 2009

    NASA Astrophysics Data System (ADS)

    Anton, Rainer

    2010-07-01

    Using a 10" Newtonian and a fast CCD camera, recordings of double and multiple stars were made at high frame rates with a notebook computer. From superpositions of "lucky images", measurements of 139 systems were obtained and compared with literature data. B/w and color images of some noteworthy systems are also presented.

  7. Camera selection for real-time in vivo radiation treatment verification systems using Cherenkov imaging.

    PubMed

    Andreozzi, Jacqueline M; Zhang, Rongxiao; Glaser, Adam K; Jarvis, Lesley A; Pogue, Brian W; Gladstone, David J

    2015-02-01

    To identify achievable camera performance and hardware needs in a clinical Cherenkov imaging system for real-time, in vivo monitoring of the surface beam profile on patients, as novel visual information, documentation, and possible treatment verification for clinicians. Complementary metal-oxide-semiconductor (CMOS), charge-coupled device (CCD), intensified charge-coupled device (ICCD), and electron multiplying-intensified charge coupled device (EM-ICCD) cameras were investigated to determine Cherenkov imaging performance in a clinical radiotherapy setting, with one emphasis on the maximum supportable frame rate. Where possible, the image intensifier was synchronized using a pulse signal from the Linac in order to image with room lighting conditions comparable to patient treatment scenarios. A solid water phantom irradiated with a 6 MV photon beam was imaged by the cameras to evaluate the maximum frame rate for adequate Cherenkov detection. Adequate detection was defined as an average electron count in the background-subtracted Cherenkov image region of interest in excess of 0.5% (327 counts) of the 16-bit maximum electron count value. Additionally, an ICCD and an EM-ICCD were each used clinically to image two patients undergoing whole-breast radiotherapy to compare clinical advantages and limitations of each system. Intensifier-coupled cameras were required for imaging Cherenkov emission on the phantom surface with ambient room lighting; standalone CMOS and CCD cameras were not viable. The EM-ICCD was able to collect images from a single Linac pulse delivering less than 0.05 cGy of dose at 30 frames/s (fps) and pixel resolution of 512 × 512, compared to an ICCD which was limited to 4.7 fps at 1024 × 1024 resolution. An intensifier with higher quantum efficiency at the entrance photocathode in the red wavelengths [30% quantum efficiency (QE) vs previous 19%] promises at least 8.6 fps at a resolution of 1024 × 1024 and lower monetary cost than the EM-ICCD. The ICCD with an intensifier better optimized for red wavelengths was found to provide the best potential for real-time display (at least 8.6 fps) of radiation dose on the skin during treatment at a resolution of 1024 × 1024.

  8. Using the OOI Cabled Array HD Camera to Explore Geophysical and Oceanographic Problems at Axial Seamount

    NASA Astrophysics Data System (ADS)

    Crone, T. J.; Knuth, F.; Marburg, A.

    2016-12-01

    A broad array of Earth science problems can be investigated using high-definition video imagery from the seafloor, ranging from those that are geological and geophysical in nature, to those that are biological and water-column related. A high-definition video camera was installed as part of the Ocean Observatory Initiative's core instrument suite on the Cabled Array, a real-time fiber optic data and power system that stretches from the Oregon Coast to Axial Seamount on the Juan de Fuca Ridge. This camera runs a 14-minute pan-tilt-zoom routine 8 times per day, focusing on locations of scientific interest on and near the Mushroom vent in the ASHES hydrothermal field inside the Axial caldera. The system produces 13 GB of lossless HD video every 3 hours, and at the time of this writing it has generated 2100 recordings totaling 28.5 TB since it began streaming data into the OOI archive in August of 2015. Because of the large size of this dataset, downloading the entirety of the video for long timescale investigations is not practical. We are developing a set of user-side tools for downloading single frames and frame ranges from the OOI HD camera raw data archive to aid users interested in using these data for their research. We use these tools to download about one year's worth of partial frame sets to investigate several questions regarding the hydrothermal system at ASHES, including the variability of bacterial "floc" in the water-column, and changes in high temperature fluid fluxes using optical flow techniques. We show that while these user-side tools can facilitate rudimentary scientific investigations using the HD camera data, a server-side computing environment that allows users to explore this dataset without downloading any raw video will be required for more advanced investigations to flourish.
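
    A minimal sketch of the kind of user-side frame extraction described above, assuming a locally downloaded (or otherwise accessible) recording from the archive; the file name and frame indices are hypothetical, and this is not the OOI team's tool.

      import cv2

      def extract_frames(video_path, frame_indices, out_pattern="frame_{:06d}.png"):
          """Save selected frames from an HD recording instead of keeping
          the whole video (a user-side data-reduction step)."""
          cap = cv2.VideoCapture(video_path)
          if not cap.isOpened():
              raise IOError(f"cannot open {video_path}")
          for idx in frame_indices:
              cap.set(cv2.CAP_PROP_POS_FRAMES, idx)   # seek to the requested frame
              ok, frame = cap.read()
              if ok:
                  cv2.imwrite(out_pattern.format(idx), frame)
          cap.release()

      # Hypothetical usage: one frame every 30 s from a ~30 fps recording
      extract_frames("CAMHD_recording.mov", range(0, 25_000, 900))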

  9. Sensors for 3D Imaging: Metric Evaluation and Calibration of a CCD/CMOS Time-of-Flight Camera.

    PubMed

    Chiabrando, Filiberto; Chiabrando, Roberto; Piatti, Dario; Rinaudo, Fulvio

    2009-01-01

    3D imaging with Time-of-Flight (ToF) cameras is a promising recent technique which allows 3D point clouds to be acquired at video frame rates. However, the distance measurements of these devices are often affected by systematic errors which decrease the quality of the acquired data. In order to evaluate these errors, some experimental tests on a CCD/CMOS ToF camera sensor, the SwissRanger (SR)-4000 camera, were performed and are reported in this paper. In particular, two main aspects are treated. The first is the calibration of the distance measurements of the SR-4000 camera, which includes evaluation of the camera warm-up time period, evaluation of the distance measurement error, and a study of the influence of the camera orientation with respect to the observed object on the distance measurements. The second aspect concerns the photogrammetric calibration of the amplitude images delivered by the camera, using a purpose-built multi-resolution field made of high-contrast targets.
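
    As a hedged illustration of the warm-up evaluation mentioned above (not the authors' procedure), repeated distance measurements of a fixed target can be logged and the mean error tracked per time window to see when the camera has stabilized; the variable names and reference distance are assumptions.

      import numpy as np

      def warmup_drift(timestamps_s, measured_m, reference_m, window_s=60.0):
          """Mean distance error per time window after power-on, to see when
          the ToF camera's measurements have stabilized."""
          timestamps_s = np.asarray(timestamps_s, dtype=float)
          errors = np.asarray(measured_m, dtype=float) - reference_m
          edges = np.arange(0.0, timestamps_s.max() + window_s, window_s)
          drift = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              sel = (timestamps_s >= lo) & (timestamps_s < hi)
              if sel.any():
                  drift.append((lo, errors[sel].mean(), errors[sel].std()))
          return drift  # list of (window start, mean error, std dev)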

  10. Water Ice Abundance on Ceres

    NASA Image and Video Library

    2016-12-15

    This frame from an animation shows dwarf planet Ceres overlaid with the concentration of hydrogen determined from data acquired by the gamma ray and neutron detector GRaND instrument aboard NASA Dawn spacecraft. The hydrogen is in the upper yard (or meter) of regolith, the loose surface material on Ceres. The color scale gives hydrogen content in water-equivalent units, which assumes all of the hydrogen is in the form of H2O. Blue indicates where hydrogen content is higher, near the poles, while red indicates lower content at lower latitudes. In reality, some of the hydrogen is in the form of water ice, while a portion of the hydrogen is in the form of hydrated minerals (such as OH, in serpentine group minerals). The color information is superimposed on shaded relief map for context. A second animation (Figure 2) compares the hydrogen content of Ceres' regolith with that of the giant asteroid Vesta, which Dawn orbited from 2011 to 2012. These data show Vesta is a much drier world, with a much lower percent of hydrogen in its regolith. Both maps were produced from data acquired by GRaND. Videos are available at http://photojournal.jpl.nasa.gov/catalog/PIA21081

  11. Solar System Portrait - 60 Frame Mosaic

    NASA Image and Video Library

    1996-09-13

    The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever portrait of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun. http://photojournal.jpl.nasa.gov/catalog/PIA00451

  12. Solar System Portrait - 60 Frame Mosaic

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The cameras of Voyager 1 on Feb. 14, 1990, pointed back toward the sun and took a series of pictures of the sun and the planets, making the first ever 'portrait' of our solar system as seen from the outside. In the course of taking this mosaic consisting of a total of 60 frames, Voyager 1 made several images of the inner solar system from a distance of approximately 4 billion miles and about 32 degrees above the ecliptic plane. Thirty-nine wide angle frames link together six of the planets of our solar system in this mosaic. Outermost Neptune is 30 times further from the sun than Earth. Our sun is seen as the bright object in the center of the circle of frames. The wide-angle image of the sun was taken with the camera's darkest filter (a methane absorption band) and the shortest possible exposure (5 thousandths of a second) to avoid saturating the camera's vidicon tube with scattered sunlight. The sun is not large as seen from Voyager, only about one-fortieth of the diameter as seen from Earth, but is still almost 8 million times brighter than the brightest star in Earth's sky, Sirius. The result of this great brightness is an image with multiple reflections from the optics in the camera. Wide-angle images surrounding the sun also show many artifacts attributable to scattered light in the optics. These were taken through the clear filter with one second exposures. The insets show the planets magnified many times. Narrow-angle images of Earth, Venus, Jupiter, Saturn, Uranus and Neptune were acquired as the spacecraft built the wide-angle mosaic. Jupiter is larger than a narrow-angle pixel and is clearly resolved, as is Saturn with its rings. Uranus and Neptune appear larger than they really are because of image smear due to spacecraft motion during the long (15 second) exposures. From Voyager's great distance Earth and Venus are mere points of light, less than the size of a picture element even in the narrow-angle camera. Earth was a crescent only 0.12 pixel in size. Coincidentally, Earth lies right in the center of one of the scattered light rays resulting from taking the image so close to the sun.

  13. Integration of USB and firewire cameras in machine vision applications

    NASA Astrophysics Data System (ADS)

    Smith, Timothy E.; Britton, Douglas F.; Daley, Wayne D.; Carey, Richard

    1999-08-01

    Digital cameras have been around for many years, but a new breed of consumer-market cameras is hitting the mainstream. By using these devices, system designers and integrators will be well positioned to take advantage of technological advances developed to support multimedia and imaging applications on the PC platform. Having these new cameras on the consumer market means lower cost, but it does not necessarily guarantee ease of integration. There are many issues that need to be accounted for, such as image quality, maintainable frame rates, image size and resolution, supported operating systems, and ease of software integration. This paper briefly describes a couple of the consumer digital standards and then discusses some of the advantages and pitfalls of integrating both USB and Firewire cameras into computer/machine vision applications.

  14. HDR video synthesis for vision systems in dynamic scenes

    NASA Astrophysics Data System (ADS)

    Shopovska, Ivana; Jovanov, Ljubomir; Goossens, Bart; Philips, Wilfried

    2016-09-01

    High dynamic range (HDR) image generation from a number of differently exposed low dynamic range (LDR) images has been extensively explored in the past few decades, and as a result of these efforts a large number of HDR synthesis methods have been proposed. Since HDR images are synthesized by combining well-exposed regions of the input images, one of the main challenges is dealing with camera or object motion. In this paper we propose a method for the synthesis of HDR video from a single camera using multiple, differently exposed video frames, with circularly alternating exposure times. One of the potential applications of the system is in driver assistance systems and autonomous vehicles, involving significant camera and object movement, non-uniform and temporally varying illumination, and the requirement of real-time performance. To achieve these goals simultaneously, we propose an HDR synthesis approach based on weighted averaging of aligned radiance maps. The computational complexity of high-quality optical flow methods for motion compensation is still prohibitively high for real-time applications. Instead, we rely on more efficient global projective transformations to solve camera movement, while moving objects are detected by thresholding the differences between the transformed and brightness adapted images in the set. To attain temporal consistency of the camera motion in the consecutive HDR frames, the parameters of the perspective transformation are stabilized over time by means of computationally efficient temporal filtering. We evaluated our results on several reference HDR videos, on synthetic scenes, and using 14-bit raw images taken with a standard camera.
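
    A minimal sketch of the weighted-averaging step described above, assuming the differently exposed frames have already been aligned (e.g., warped with a projective transformation); the hat-shaped weighting and the linear radiance estimate are illustrative assumptions, not the authors' exact implementation.

      import numpy as np

      def merge_hdr(aligned_frames, exposure_times):
          """Weighted average of radiance maps from aligned LDR frames.

          aligned_frames : list of float32 grayscale images scaled to [0, 1]
          exposure_times : list of exposure times in seconds
          """
          acc = np.zeros_like(aligned_frames[0], dtype=np.float64)
          wsum = np.zeros_like(acc)
          for img, t in zip(aligned_frames, exposure_times):
              # Hat weight: trust mid-tones, down-weight under/over-exposed pixels.
              w = 1.0 - np.abs(2.0 * img - 1.0) ** 2
              radiance = img / t                 # simple linear radiance estimate
              acc += w * radiance
              wsum += w
          return acc / np.maximum(wsum, 1e-6)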

  15. X-ray imaging using digital cameras

    NASA Astrophysics Data System (ADS)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.

  16. What Drives Pakistan’s Interest in Afghanistan?

    DTIC Science & Technology

    2011-05-19

    2010), 13. 172 Kamran Shafi, “Putting on a Brave Face and Standing Tall,” http://www.dawn.com/wps/wcm/connect/dawn-content-library/dawn/the-newspaper...columnists/ kamran -shafi- putting-on-a-brave-face-and-standing-tall-480 (accessed September 10, 2010). 59 include sanctuary, to militant...Shafi, Kamran . “Putting on a Brave Face and Standing Tall.” http://www.dawn.com/wps/wcm/connect/dawn-content-library/dawn/the- newspaper

  17. SarcOptiM for ImageJ: high-frequency online sarcomere length computing on stimulated cardiomyocytes.

    PubMed

    Pasqualin, Côme; Gannier, François; Yu, Angèle; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2016-08-01

    Accurate measurement of cardiomyocyte contraction is a critical issue for scientists working on cardiac physiology and the physiopathology of diseases involving contraction impairment. Cardiomyocyte contraction can be quantified by measuring sarcomere length, but few tools are available for this, and none is freely distributed. We developed a plug-in (SarcOptiM) for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. SarcOptiM computes sarcomere length via fast Fourier transform analysis of video frames captured or displayed in ImageJ and thus is not tied to a dedicated video camera. It can work in real time or offline, the latter overcoming rotating motion or displacement-related artifacts. SarcOptiM includes a simulator and video generator of cardiomyocyte contraction. Acquisition parameters, such as pixel size and camera frame rate, were tested with both experimental recordings of rat ventricular cardiomyocytes and synthetic videos. It is freely distributed, and its source code is available. It works under Windows, Mac, or Linux operating systems. The camera speed is the limiting factor, since the algorithm can compute online sarcomere shortening at frame rates >10 kHz. In conclusion, SarcOptiM is a free and validated user-friendly tool for studying cardiomyocyte contraction in all species, including humans. Copyright © 2016 the American Physiological Society.
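
    A sketch of the FFT-based idea behind such a measurement (not the plug-in's source code): given a 1-D intensity profile taken along the cell axis, the dominant spatial frequency of the striation pattern gives the sarcomere length. The pixel size is an assumed calibration value.

      import numpy as np

      def sarcomere_length_um(profile, pixel_size_um):
          """Estimate sarcomere length from a striation intensity profile
          via the dominant peak of its Fourier spectrum."""
          profile = np.asarray(profile, dtype=float)
          profile = profile - profile.mean()          # remove the DC component
          spectrum = np.abs(np.fft.rfft(profile))
          freqs = np.fft.rfftfreq(profile.size, d=pixel_size_um)  # cycles per µm
          peak = np.argmax(spectrum[1:]) + 1          # skip the zero-frequency bin
          return 1.0 / freqs[peak]                    # striation period in µm

      # Hypothetical test: a 1.9 µm striation pattern sampled at 0.1 µm/pixel
      x = np.arange(512) * 0.1
      profile = 100 + 20 * np.sin(2 * np.pi * x / 1.9)
      print(round(sarcomere_length_um(profile, 0.1), 2))   # ~1.9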

  18. Homography-based multiple-camera person-tracking

    NASA Astrophysics Data System (ADS)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of live targets for training. No calibration is required. Testing shows that the algorithm performs very well in real-world sequences. The consistent labelling problem is solved, even for targets that appear via in-scene entrances. Full occlusions are handled. Although implemented in Matlab, the multiple-camera tracking system runs at eight frames per second. A faster implementation would be suitable for real-world use at typical video frame rates.
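
    A hedged sketch of the homography step described above: once corresponding ground-plane (feet) point pairs have been collected from two overlapping cameras, a plane-induced homography can be estimated and used to transfer a target's feet location from one view to the other for consistent labelling. The point values below are illustrative only.

      import numpy as np
      import cv2

      # Corresponding feet locations (pixels) "dropped" by tracked targets
      # seen simultaneously in camera A and camera B (hypothetical values).
      feet_cam_a = np.array([[120, 400], [300, 390], [450, 420], [200, 350],
                             [340, 300]], dtype=np.float32)
      feet_cam_b = np.array([[ 80, 380], [260, 385], [410, 415], [160, 330],
                             [310, 290]], dtype=np.float32)

      # RANSAC keeps the estimate usable even if a few pairs are mismatched.
      H, inliers = cv2.findHomography(feet_cam_a, feet_cam_b, cv2.RANSAC, 3.0)

      def transfer_feet(point_a, H):
          """Map a feet location from camera A's image plane to camera B's."""
          p = np.array([[point_a]], dtype=np.float32)      # shape (1, 1, 2)
          return cv2.perspectiveTransform(p, H)[0, 0]

      print(transfer_feet((250, 370), H))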

  19. An adaptive enhancement algorithm for infrared video based on modified k-means clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Linze; Wang, Jingqi; Wu, Wen

    2016-09-01

    In this paper, we propose a video enhancement algorithm to improve the output of an infrared camera. Video obtained by an infrared camera can be very dark when there is no clear target, so the video is first divided into frame images by frame extraction in order to carry out the enhancement. The first frame is segmented into k sub-images by K-means clustering according to the gray intervals they occupy, and each sub-image is histogram-equalized according to the amount of information it contains; a modification is introduced to handle cases in which the final cluster centers lie too close to each other. For subsequent frames, the initial cluster centers are taken from the final cluster centers of the previous frame, and each sub-image is histogram-equalized after K-means segmentation. Histogram equalization spreads the gray values of the image over the full gray range, with the gray range assigned to each sub-image determined by its share of the pixels in the frame. Experimental results show that the algorithm improves the contrast of infrared video of dim scenes without an obvious night target and adaptively reduces, within a certain range, the negative effect of overexposed pixels.
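
    A rough sketch of the per-cluster equalization idea (our reading of the method, not the authors' code): gray levels of a frame are clustered with K-means, and each cluster's pixels are equalized into a gray interval proportional to the cluster's share of pixels; a video loop could reuse the returned centers to initialize the next frame's segmentation.

      import numpy as np
      import cv2

      def kmeans_equalize(gray, k=3):
          """Segment a gray-level frame into k clusters with K-means and
          equalize each cluster into a gray interval proportional to its
          share of pixels. Returns the enhanced frame and the final centers."""
          data = gray.reshape(-1, 1).astype(np.float32)
          criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
          _, labels, centers = cv2.kmeans(data, k, None, criteria, 5,
                                          cv2.KMEANS_PP_CENTERS)
          labels = labels.ravel()
          out = np.zeros(gray.size, dtype=np.uint8)
          lo = 0.0
          for i in np.argsort(centers.ravel()):          # darkest cluster first
              mask = labels == i
              if not mask.any():
                  continue
              hi = lo + mask.mean() * 255.0              # interval ∝ pixel share
              vals = data[mask, 0]
              ranks = np.argsort(np.argsort(vals))       # rank-based equalization
              out[mask] = np.round(lo + ranks / max(vals.size - 1, 1) * (hi - lo))
              lo = hi
          return out.reshape(gray.shape), centers.ravel()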

  20. Meteor Search by Spirit, Sol 668

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Annotated Meteor Search by Spirit, Sol 668

    The panoramic cameras on NASA's Mars Exploration Rovers are about as sensitive as the human eye at night. The cameras can see the same bright stars that we can see from Earth, and the same patterns of constellations dot the night sky. Scientists on the rover team have been taking images of some of these bright stars as part of several different projects. One project is designed to try to capture 'shooting stars,' or meteors, in the martian night sky. 'Meteoroids' are small pieces of comets and asteroids that travel through space and eventually run into a planet. On Earth, we can sometimes see meteoroids become brilliant, long 'meteors' streaking across the night sky as they burn up from the friction in our atmosphere. Some of these meteors survive their fiery flight and land on the surface (or in the ocean) where, if found, they are called 'meteorites.' The same thing happens in the martian atmosphere, and Spirit even accidentally discovered a meteor while attempting to obtain images of Earth in the pre-dawn sky back in March, 2004 (see http://marsrovers.jpl.nasa.gov/gallery/press/spirit/20040311a.html, and Selsis et al. (2005) Nature, vol 435, p. 581). On Earth, some meteors come in 'storms' or 'showers' at predictable times of the year, like the famous Perseid meteor shower in August or the Leonid meteor shower in November. These 'storms' happen when Earth passes through the same parts of space where comets sometimes pass. The meteors we see at these times are from leftover debris that was shed off of these comets.

    The same kind of thing is predicted for Mars, as well. Inspired by calculations about Martian meteor storms by meteor scientists from the University of Western Ontario in Canada and the Centre de Recherche en Astrophysique de Lyon in France, and also aided by other meteor research colleagues from NASA's Marshall Space Flight Center, scientists on the rover team planned some observations to try to detect predicted meteor storms in October and November, 2005. The views shown here are a composite of nine 60-second exposures taken with the panoramic camera on Spirit during night hours of sol 668 (Nov. 18, 2005), during a week when Mars was predicted to pass through a meteor stream associated with Halley's comet. The south celestial pole is at the center of the frame. Many stars can be seen in the images, appearing as short, curved streaks forming arcs around the center point. The star trails are curved because Mars is rotating while the camera takes the images. The brightest stars in this view would be easily visible to the naked eye, but the faintest ones are slightly dimmer than the human eye can detect.

    In addition to the star trails, there are several smaller linear streaks, dots and splotches that are the trails left by cosmic rays hitting the camera detectors. Cosmic rays are high-energy particles that are created in the Sun and in other stars throughout our galaxy and travel through space in all directions. Some of them strike Earth or other planets, and ones that strike a digital camera detector can leave little tracks or splotches like those seen in these images. Because they come from all directions, some strike the detector face-on, and others strike at glancing angles. Some even skip across the detector like flat rocks skipped across a pond. These are very common phenomena to astronomers used to working with sensitive digital cameras like those in the Mars rovers, the Hubble Space Telescope, or other space probes, and while they can be a nuisance when taking pictures, they generally do not cause any lasting damage to the cameras. Three of the streaks in the image, including one spanning most of the distance from the left edge of the frame to the center, might be meteor trails or could be the marks of other cosmic rays.

    While hunting for meteors on Mars is fun, ultimately the team wants to use the images and results for scientific purposes. These include helping to validate the models and predictions for interplanetary meteor storms, providing information on the rate of impacts of small meteoroids with Mars for comparison with rates for the Earth and Moon, assessing the rate and intensity of cosmic ray impact events in the Martian environment, and looking at whether some bright stars are being dimmed occasionally by water ice or dust clouds occurring at night during different Martian seasons.

  1. Experimental comparison of high-density scintillators for EMCCD-based gamma ray imaging

    NASA Astrophysics Data System (ADS)

    Heemskerk, Jan W. T.; Kreuger, Rob; Goorden, Marlies C.; Korevaar, Marc A. N.; Salvador, Samuel; Seeley, Zachary M.; Cherepy, Nerine J.; van der Kolk, Erik; Payne, Stephen A.; Dorenbos, Pieter; Beekman, Freek J.

    2012-07-01

    Detection of x-rays and gamma rays with high spatial resolution can be achieved with scintillators that are optically coupled to electron-multiplying charge-coupled devices (EMCCDs). These can be operated at typical frame rates of 50 Hz with low noise. In such a set-up, scintillation light within each frame is integrated after which the frame is analyzed for the presence of scintillation events. This method allows for the use of scintillator materials with relatively long decay times of a few milliseconds, not previously considered for use in photon-counting gamma cameras, opening up an unexplored range of dense scintillators. In this paper, we test CdWO4 and transparent polycrystalline ceramics of Lu2O3:Eu and (Gd,Lu)2O3:Eu as alternatives to currently used CsI:Tl in order to improve the performance of EMCCD-based gamma cameras. The tested scintillators were selected for their significantly larger cross-sections at 140 keV (99mTc) compared to CsI:Tl combined with moderate to good light yield. A performance comparison based on gamma camera spatial and energy resolution was done with all tested scintillators having equal (66%) interaction probability at 140 keV. CdWO4, Lu2O3:Eu and (Gd,Lu)2O3:Eu all result in a significantly improved spatial resolution over CsI:Tl, albeit at the cost of reduced energy resolution. Lu2O3:Eu transparent ceramic gives the best spatial resolution: 65 µm full-width-at-half-maximum (FWHM) compared to 147 µm FWHM for CsI:Tl. In conclusion, these 'slow' dense scintillators open up new possibilities for improving the spatial resolution of EMCCD-based scintillation cameras.

  2. Improving vehicle tracking rate and speed estimation in dusty and snowy weather conditions with a vibrating camera

    PubMed Central

    Yaghoobi Ershadi, Nastaran

    2017-01-01

    Traffic surveillance systems are of interest to many researchers seeking to improve traffic control and reduce the risk caused by accidents. In this area, many published works are concerned only with vehicle detection in normal conditions. The camera may vibrate due to wind or bridge movement, and detection and tracking of vehicles become very difficult in bad winter weather (snow, rain, wind, etc.), in the dusty weather of arid and semi-arid regions, at night, and so on. It is also important to estimate vehicle speed in these complicated weather conditions. In this paper, we improved our method to track and count vehicles in dusty weather with a vibrating camera. For this purpose, we used a background-subtraction-based strategy combined with extra processing to segment vehicles; here, the extra processing consisted of analysis of headlight size, location, and area. Tracking was done between consecutive frames via a generalized particle filter to detect the vehicle and pair the headlights using connected component analysis. Vehicle counting was then performed based on the pairing result: from the centroid of each blob we calculated the distance traveled between two frames with a simple formula and divided it by the inter-frame time obtained from the video to estimate speed. The proposed method was tested on several video surveillance records in different conditions such as dusty or foggy weather, a vibrating camera, and roads with medium-level traffic volumes. The results showed that the new method performed better than our previously published method and other methods, including the Kalman filter or Gaussian model, in different traffic conditions. PMID:29261719
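
    An illustrative sketch of the speed estimate described above (a simplification, not the paper's code): the displacement of a tracked blob centroid between two frames, converted to metres with an assumed image-to-ground scale, divided by the inter-frame time.

      def vehicle_speed_kmh(centroid_prev, centroid_curr, metres_per_pixel, fps):
          """Speed estimate from the centroid displacement between two
          consecutive frames.

          centroid_prev, centroid_curr : (x, y) pixel coordinates of the blob
          metres_per_pixel             : assumed ground-scale calibration
          fps                          : video frame rate
          """
          dx = centroid_curr[0] - centroid_prev[0]
          dy = centroid_curr[1] - centroid_prev[1]
          distance_m = (dx * dx + dy * dy) ** 0.5 * metres_per_pixel
          dt_s = 1.0 / fps                     # time between two frames
          return distance_m / dt_s * 3.6       # m/s -> km/h

      # Hypothetical example: 12-pixel displacement at 0.05 m/pixel and 25 fps
      print(round(vehicle_speed_kmh((100, 200), (112, 200), 0.05, 25), 1))  # 54.0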

  3. Improving vehicle tracking rate and speed estimation in dusty and snowy weather conditions with a vibrating camera.

    PubMed

    Yaghoobi Ershadi, Nastaran

    2017-01-01

    Traffic surveillance systems are of interest to many researchers seeking to improve traffic control and reduce the risk caused by accidents. In this area, many published works are concerned only with vehicle detection in normal conditions. The camera may vibrate due to wind or bridge movement, and detection and tracking of vehicles become very difficult in bad winter weather (snow, rain, wind, etc.), in the dusty weather of arid and semi-arid regions, at night, and so on. It is also important to estimate vehicle speed in these complicated weather conditions. In this paper, we improved our method to track and count vehicles in dusty weather with a vibrating camera. For this purpose, we used a background-subtraction-based strategy combined with extra processing to segment vehicles; here, the extra processing consisted of analysis of headlight size, location, and area. Tracking was done between consecutive frames via a generalized particle filter to detect the vehicle and pair the headlights using connected component analysis. Vehicle counting was then performed based on the pairing result: from the centroid of each blob we calculated the distance traveled between two frames with a simple formula and divided it by the inter-frame time obtained from the video to estimate speed. The proposed method was tested on several video surveillance records in different conditions such as dusty or foggy weather, a vibrating camera, and roads with medium-level traffic volumes. The results showed that the new method performed better than our previously published method and other methods, including the Kalman filter or Gaussian model, in different traffic conditions.

  4. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    NASA Astrophysics Data System (ADS)

    Rashidi, Mehdi; Dehmeshki, Jamshid; Dickenson, Eric; Daemi, M. Farhang

    1997-10-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid laced with a fluorescent dye or microspheres flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar sheet of laser light passes through the column as a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For velocity measurements, while the aqueous fluid, laced with fluorescent microspheres, flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired automatically frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are of poor quality at this stage, some preprocessing is used to enhance the particles within the images. Finally, these enhanced particles are tracked to calculate velocity vectors in the plane of the beam. For concentration measurements, while the aqueous fluid, laced with a fluorescent organic dye, flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices on the planes illuminated by the laser beam traveling simultaneously with the camera. Subsequently, these recorded images are transferred to the computer for processing in a similar fashion to the velocity measurement. In order to have a fully automatic vision system, several detailed image processing techniques were developed to match images that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentrations as a function of time within the porous column.

  5. Time and flow-direction responses of shear-stress-sensitive liquid crystal coatings

    NASA Technical Reports Server (NTRS)

    Reda, Daniel C.; Muratore, J. J.; Heinick, James T.

    1994-01-01

    Time and flow-direction responses of shear-stress-sensitive liquid crystal coatings were explored experimentally. For the time-response experiments, coatings were exposed to transient, compressible flows created during the startup and off-design operation of an injector-driven supersonic wind tunnel. Flow transients were visualized with a focusing schlieren system and recorded with a 100 frame/s color video camera.

  6. Earth observation taken by the Expedition 28 crew

    NASA Image and Video Library

    2011-09-07

    ISS028-E-043559 (7 Sept. 2011) --- This view, from the camera of an Expedition 28 crew member onboard the International Space Station, looks from the northwest toward southeast and covers many counties in southeast Texas that have been heavily affected by dozens of wild fires. Houston can be seen near frame center and the Gulf of Mexico takes up the upper right quadrant of the frame.

  7. Earth Observation as seen by Expedition Two crew

    NASA Image and Video Library

    2001-04-16

    ISS002-E-5656 (16 April 2001) --- Extreme southern topography of California, including inland portions of the San Diego area were captured in this digital still camera's image from the International Space Station's Expedition Two crew members. The previous frame (5655) and this one were both recorded with an 800mm lens, whereas the succeeding frame (5657) was shot with a 105mm lens.

  8. U.S. Geological Survey National Computer Technology Meeting (7th): Program and Abstracts, Held in New Orleans, Louisiana, April 10-15, 1994

    DTIC Science & Technology

    1994-01-01

    Magnolia Room 8:00 pm - 10:00 pm FrameMaker Techniques - Moderator, Terry A. Reinitz, USGS, WRD, Reston, Va. Wednesday, April 13,1994 7:30 am...Maker Interchange Format (MIF) strings, to an MIF file. The MIF file is imported into a blank FrameMaker template, creating a word-processor-formatted...draft to camera-ready stages using Data General workstations and software packages that include FrameMaker , CorelDRAW, USGS-G2, Statit, and

  9. Clouds over Tharsis

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48.

    Figure caption from Science Magazine

  10. SKYLAB (SL)-3 - TELEVISION

    NASA Image and Video Library

    1973-09-29

    S73-34619 (28 July 1973) --- A composite of four frames taken from 16mm movie camera footage showing an overhead view of the Skylab space station cluster in Earth orbit. The Maurer motion picture camera scenes were being filmed during the Skylab 3 Command/Service Module's (CSM) first "fly around" inspection of the space station. Close comparison of the four frames reveals movement of the improvised parasol solar shield over the Orbital Workshop (OWS). The "flapping" of the sun shade was caused from the exhaust of the reaction control subsystem (RCS) thrusters of the Skylab 3 CSM. The one remaining solar array system wing on the OWS is in the lower left background. The solar panel in the lower left foreground is on the Apollo Telescope Mount (ATM). Photo credit: NASA

  11. Various views of STS-95 Senator John Glenn during training

    NASA Image and Video Library

    1998-06-18

    S98-08732 (9 April 1998) --- Holding a 35mm camera, U.S. Sen. John H. Glenn Jr. (D.-Ohio) gets a refresher course in photography from a JSC crew trainer (out of frame, right). The STS-95 payload specialist carried a 35mm camera on his historic MA-6 flight over 36 years ago. The photo was taken by Joe McNally, National Geographic, for NASA.

  12. Digital Semaphore: Technical Feasibility of QR Code Optical Signaling for Fleet Communications

    DTIC Science & Technology

    2013-06-01

    Standards (http://www.iso.org) JIS Japanese Industrial Standard JPEG Joint Photographic Experts Group (digital image format; http://www.jpeg.org) LED...Denso Wave corporation in the 1990s for the Japanese automotive manufacturing industry. See Appendix A for full details. Reed-Solomon Error...eliminates camera blur induced by the shutter, providing clear images at extremely high frame rates. Thusly, digital cinema cameras are more suitable

  13. Synchronizing A Stroboscope With A Video Camera

    NASA Technical Reports Server (NTRS)

    Rhodes, David B.; Franke, John M.; Jones, Stephen B.; Dismond, Harriet R.

    1993-01-01

    Circuit synchronizes flash of light from stroboscope with frame and field periods of video camera. Sync stripper sends vertical-synchronization signal to delay generator, which generates trigger signal. Flashlamp power supply accepts delayed trigger signal and sends pulse of power to flash lamp. Designed for use in making short-exposure images that "freeze" flow in wind tunnel. Also used for making longer-exposure images obtained by use of continuous intense illumination.

  14. STREAM PROCESSING ALGORITHMS FOR DYNAMIC 3D SCENE ANALYSIS

    DTIC Science & Technology

    2018-02-15

    23 9 Ground truth creation based on marked building feature points in two different views 50 frames apart in...between just two views , each row in the current figure represents a similar assessment however between one camera and all other cameras within the dataset...BA4S. While Fig. 44 depicted the epipolar lines for the point correspondences between just two views , the current figure represents a similar

  15. Video monitoring in the Gadria debris flow catchment: preliminary results of large scale particle image velocimetry (LSPIV)

    NASA Astrophysics Data System (ADS)

    Theule, Joshua; Crema, Stefano; Comiti, Francesco; Cavalli, Marco; Marchi, Lorenzo

    2015-04-01

    Large scale particle image velocimetry (LSPIV) is a technique mostly used in rivers to measure two-dimensional velocities from high-resolution images at high frame rates. This technique still needs to be thoroughly explored in the field of debris flow studies. The Gadria debris flow monitoring catchment in Val Venosta (Italian Alps) has been equipped with four MOBOTIX M12 video cameras. Two cameras are located in a sediment trap close to the alluvial fan apex, one looking upstream and the other looking down and more perpendicular to the flow. The third camera is in the next reach upstream from the sediment trap, in closer proximity to the flow. These three cameras are connected to a field shelter equipped with a power supply and a server collecting all the monitoring data. The fourth camera is located in an active gully and is activated by a rain gauge when there is one minute of rainfall. Before LSPIV can be used, the highly distorted images need to be corrected and accurate reference points need to be established. We decided to use IMGRAFT (an open-source image georectification toolbox), which can correct distorted images using reference points and the camera location, and then rectifies the batch of images onto a DEM grid (or the DEM grid onto the image coordinates). With the orthorectified images, we used the freeware Fudaa-LSPIV (developed by EDF, IRSTEA, and the DeltaCAD Company) to generate the LSPIV calculations of the flow events. Calculated velocities can easily be checked manually because the images are already orthorectified. During the monitoring program (since 2011) we recorded three debris flow events at the sediment trap area, each with very different surge dynamics. The camera in the gully was in operation in 2014 and recorded granular flows and rockfalls, for which particle tracking may be more appropriate for velocity measurements. The four cameras allow us to explore the limitations of camera distance, angle, frame rate, and image quality.
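
    A minimal illustration of the image-velocimetry principle behind LSPIV (not Fudaa-LSPIV itself): the displacement of surface texture between two orthorectified frames can be estimated by phase correlation and converted to a velocity using the ground sampling distance and the frame interval. The file names, ground resolution, and frame rate below are assumptions, and a real LSPIV analysis would work on many small interrogation windows rather than one bulk shift.

      import cv2
      import numpy as np

      def surface_velocity(frame_a, frame_b, metres_per_pixel, dt_s):
          """Bulk surface velocity (m/s) between two orthorectified frames,
          estimated from the phase-correlation shift of the image texture."""
          a = np.float32(cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY))
          b = np.float32(cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY))
          (dx, dy), response = cv2.phaseCorrelate(a, b)
          displacement_m = np.hypot(dx, dy) * metres_per_pixel
          return displacement_m / dt_s

      # Hypothetical usage on two consecutive orthorectified images (placeholder paths)
      fa = cv2.imread("ortho_0001.png")
      fb = cv2.imread("ortho_0002.png")
      print(surface_velocity(fa, fb, metres_per_pixel=0.02, dt_s=0.25))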

  16. Measurement of the timing behaviour of off-the-shelf cameras

    NASA Astrophysics Data System (ADS)

    Schatz, Volker

    2017-04-01

    This paper presents a measurement method suitable for investigating the timing properties of cameras. A single light source illuminates the camera detector starting with a varying defined delay after the camera trigger. Pixels from the recorded camera frames are summed up and normalised, and the resulting function is indicative of the overlap between illumination and exposure. This allows one to infer the trigger delay and the exposure time with sub-microsecond accuracy. The method is therefore of interest when off-the-shelf cameras are used in reactive systems or synchronised with other cameras. It can supplement radiometric and geometric calibration methods for cameras in scientific use. A closer look at the measurement results reveals deviations from the ideal camera behaviour of constant sensitivity limited to the exposure interval. One of the industrial cameras investigated retains a small sensitivity long after the end of the nominal exposure interval. All three investigated cameras show non-linear variations of sensitivity at O(10⁻³) to O(10⁻²) during exposure. Due to its sign, the latter effect cannot be described by a sensitivity function depending on the time after triggering, but represents non-linear pixel characteristics.

  17. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling

    PubMed Central

    Tang, Shengjun; Zhu, Qing; Chen, Wu; Darwish, Walid; Wu, Bo; Hu, Han; Chen, Min

    2016-01-01

    RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach to geometrically integrate the depth scene and the RGB scene, in order to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from the depth images. First, precise calibration for RGB-D sensors is introduced. In addition to the calibration of the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the RGB camera and the IR camera is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined rejection method for false feature matches is introduced by combining the depth information and the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. In order to eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity problem encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated with the publicly available benchmark datasets collected with Kinect. Then, the proposed method is examined with two sets of datasets collected in both outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method. PMID:27690028
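
    A hedged sketch of the scene-registration step mentioned above, using the standard Umeyama/Kabsch SVD solution for a similarity transform (scale, rotation, translation) under our own assumptions; it is not necessarily the authors' formulation. Given corresponding 3D points from the RGB-derived scene and the depth scene, it returns the transform that registers one to the other and resolves the scale ambiguity.

      import numpy as np

      def similarity_transform(src, dst):
          """Least-squares similarity transform with dst ≈ s * R @ src + t,
          via the Umeyama/Kabsch SVD solution.

          src, dst : (N, 3) arrays of corresponding 3D points
          """
          src = np.asarray(src, float)
          dst = np.asarray(dst, float)
          mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
          src_c, dst_c = src - mu_s, dst - mu_d
          cov = dst_c.T @ src_c / len(src)           # cross-covariance matrix
          U, D, Vt = np.linalg.svd(cov)
          S = np.eye(3)
          if np.linalg.det(U) * np.linalg.det(Vt) < 0:
              S[2, 2] = -1                           # avoid a reflection
          R = U @ S @ Vt
          var_src = (src_c ** 2).sum() / len(src)
          s = np.trace(np.diag(D) @ S) / var_src
          t = mu_d - s * R @ mu_s
          return s, R, t

      # Hypothetical check: recover a known transform from synthetic points
      rng = np.random.default_rng(1)
      P = rng.normal(size=(100, 3))
      ang = np.deg2rad(30)
      R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                         [np.sin(ang),  np.cos(ang), 0],
                         [0, 0, 1]])
      Q = 2.0 * P @ R_true.T + np.array([0.5, -1.0, 3.0])
      s, R, t = similarity_transform(P, Q)
      print(round(s, 3))    # ~2.0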

  18. A novel method to reduce time investment when processing videos from camera trap studies.

    PubMed

    Swinnen, Kristijn R R; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but instead empty recordings or other species (together non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we supposed that recordings with the target species contain on average much more movement than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step in the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs.
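
    A simplified sketch of a variation-based filter of this kind (not the authors' exact implementation): the mean absolute pixel difference between subsampled, downscaled frames is averaged over a recording, and recordings below a chosen threshold are flagged as likely non-target. The threshold and file name are hypothetical.

      import cv2
      import numpy as np

      def recording_variation(video_path, resize_to=(160, 120), step=5):
          """Average frame-to-frame pixel variation of a camera-trap video."""
          cap = cv2.VideoCapture(video_path)
          prev, diffs, i = None, [], 0
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              if i % step == 0:                                 # subsample frames
                  small = cv2.cvtColor(cv2.resize(frame, resize_to),
                                       cv2.COLOR_BGR2GRAY).astype(np.float32)
                  if prev is not None:
                      diffs.append(np.abs(small - prev).mean())
                  prev = small
              i += 1
          cap.release()
          return float(np.mean(diffs)) if diffs else 0.0

      # Hypothetical decision rule: flag recordings with little movement
      if recording_variation("beaver_site_042.avi") < 2.0:
          print("likely non-target recording")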

  19. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling.

    PubMed

    Tang, Shengjun; Zhu, Qing; Chen, Wu; Darwish, Walid; Wu, Bo; Hu, Han; Chen, Min

    2016-09-27

    RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach to geometrically integrate the depth scene and the RGB scene, in order to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from the depth images. First, precise calibration for RGB-D sensors is introduced. In addition to the calibration of the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the RGB camera and the IR camera is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined rejection method for false feature matches is introduced by combining the depth information and the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. In order to eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity problem encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated with the publicly available benchmark datasets collected with Kinect. Then, the proposed method is examined with two sets of datasets collected in both outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method.

  20. OOM - OBJECT ORIENTATION MANIPULATOR, VERSION 6.1

    NASA Technical Reports Server (NTRS)

    Goza, S. P.

    1994-01-01

    The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray-traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM provides an interactive environment for the manipulation and animation of models, cameras, and light sources. Models are the basic entity upon which OOM operates and are therefore considered the primary animation elements. Cameras and light sources are considered secondary animation elements. A camera, in OOM, is simply a location within the three-space environment from which the contents of the environment are observed. OOM supports the creation and full animation of cameras. Light sources can be defined, positioned and linked to models, but they cannot be animated independently. OOM can simultaneously accommodate as many animation elements as the host computer's memory permits. Once the required animation elements are present, the user may position them, orient them, and define any initial relationships between them. Once the initial relationships are defined, the user can display individual still views for rendering and output, or define motion for the animation elements by using the Interp Animation Editor. The program provides the capability to save still images, animated sequences of frames, and the information that describes the initialization process for an OOM session. OOM provides the same rendering and output options for both still and animated images. OOM is equipped with a robust model manipulation environment featuring a full screen viewing window, a menu-oriented user interface, and an interpolative Animation Editor. It provides three display modes: solid, wire frame, and simple, that allow the user to trade off visual authenticity for update speed. In the solid mode, each model is drawn based on the shading characteristics assigned to it when it was built. All of the shading characteristics supported by SSM are recognized and properly rendered in this mode. If increasing model complexity impedes the operation of OOM in this mode, then wireframe and simple modes are available. These provide substantially faster screen updates than solid mode. The creation and placement of cameras and light sources is under complete control of the user. One light source is provided in the default element set. It is modeled as a direct light source providing a type of lighting analogous to that provided by the Sun. OOM can accommodate as many light sources as the memory of the host computer permits. Animation is created in OOM using a technique called key frame interpolation. First, various program functions are used to load models, load or create light sources and cameras, and specify initial positions for each element. 
When these steps are completed, the Interp function is used to create an animation sequence for each element to be animated. An animation sequence consists of a user-defined number of frames (screen images) with some subset of those being defined as key frames. The motion of the element between key frames is interpolated automatically by the software. Key frames thus act as transition points in the motion of an element. This saves the user from having to individually define element data at each frame of a sequence. Animation frames and still images can be output to videotape recorders, film recorders, color printers, and disk files. OOM is written in C-language for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for this program. The standard distribution medium for OOM is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. OOM is also offered as a bundle with a related program, SSM (Solid Surface Modeler). Please see the abstract for SSM/OOM (COS-10047) for information about the bundled package. OOM was released in 1993.
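
    As a conceptual sketch of key-frame interpolation in general (not OOM's Euler-angle implementation), intermediate frames can be generated by linearly interpolating an element's position between user-defined key frames; the frame counts and positions below are illustrative.

      import numpy as np

      def interpolate_keyframes(keyframes, n_frames):
          """Linear key-frame interpolation of a 3D position track.

          keyframes : dict {frame_index: (x, y, z)} of user-defined key frames
          n_frames  : total number of frames in the animation sequence
          Returns an (n_frames, 3) array of positions; frames between key frames
          are interpolated, frames outside the key range hold the nearest key.
          """
          keys = sorted(keyframes)
          key_pos = np.array([keyframes[k] for k in keys], dtype=float)
          frames = np.arange(n_frames)
          out = np.empty((n_frames, 3))
          for axis in range(3):
              out[:, axis] = np.interp(frames, keys, key_pos[:, axis])
          return out

      # Hypothetical 60-frame sequence with three key frames
      track = interpolate_keyframes({0: (0, 0, 0), 30: (10, 0, 5), 59: (10, 10, 5)}, 60)
      print(track[15])   # halfway between the first two key frames: ~[5, 0, 2.5]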

  1. 4K Video of Colorful Liquid in Space

    NASA Image and Video Library

    2015-10-09

    Once again, astronauts on the International Space Station dissolved an effervescent tablet in a floating ball of water, and captured images using a camera capable of recording four times the resolution of normal high-definition cameras. The higher resolution images and higher frame rate videos can reveal more information when used on science investigations, giving researchers a valuable new tool aboard the space station. This footage is one of the first of its kind. The cameras are being evaluated for capturing science data and vehicle operations by engineers at NASA's Marshall Space Flight Center in Huntsville, Alabama.

  2. System of technical vision for autonomous unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Bondarchuk, A. S.

    2018-05-01

    This paper is devoted to the implementation of an image recognition algorithm using the LabVIEW software. The created virtual instrument is designed to detect objects in the frames from the camera mounted on the UAV. The trained classifier is invariant to changes in rotation, as well as to small changes in the camera's viewing angle. Finding objects in the image using particle analysis allows regions of different sizes to be classified. This method allows the system of technical vision to more accurately determine the location of the objects of interest and their movement relative to the camera.

  3. Modeling of digital information optical encryption system with spatially incoherent illumination

    NASA Astrophysics Data System (ADS)

    Bondareva, Alyona P.; Cheremkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    State-of-the-art micromirror DMD spatial light modulators (SLMs) offer unprecedented frame rates of up to 30000 frames per second. This, in conjunction with a high-speed digital camera, should allow a high-speed optical encryption system to be built. Results of modeling of a digital information optical encryption system with spatially incoherent illumination are presented. Input information is displayed with the first SLM, and the encryption element with the second SLM. Factors taken into account are: the resolution of the SLMs and camera, hologram reconstruction noise, camera noise and signal sampling. Results of numerical simulation demonstrate high speed (several gigabytes per second), low bit error rate and high cryptographic strength.

  4. Circuit design of an EMCCD camera

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Song, Qian; Jin, Jianhui; He, Chun

    2012-07-01

    EMCCDs have been used in astronomical observations in many ways. Recently we developed a camera using an EMCCD TX285. The CCD chip is cooled to -100°C in an LN2 dewar. The camera controller consists of a driving board, a control board and a temperature control board. Power supplies and driving clocks for the CCD are provided by the driving board, while the timing generator is located in the control board. The timing generator and an embedded Nios II CPU are implemented in an FPGA. The ADC and the data transfer circuit are also in the control board and are controlled by the FPGA. Data transfer between the image workstation and the camera is done through a Camera Link frame grabber. The image acquisition software is built using VC++ and Sapera LT. This paper describes the camera structure, the main components, and the circuit design of the video signal processing channel, clock driver, FPGA and Camera Link interfaces, and the temperature metering and control system. Some testing results are presented.

  5. Single-photon sensitive fast ebCMOS camera system for multiple-target tracking of single fluorophores: application to nano-biophotonics

    NASA Astrophysics Data System (ADS)

    Cajgfinger, Thomas; Chabanat, Eric; Dominjon, Agnes; Doan, Quang T.; Guerin, Cyrille; Houles, Julien; Barbier, Remi

    2011-03-01

    Nano-biophotonics applications will benefit from new fluorescence microscopy methods based essentially on super-resolution techniques (beyond the diffraction limit) applied to large biological structures (membranes) at fast frame rates (1000 Hz). This trend pushes photon detectors toward the single-photon counting regime and camera acquisition systems toward real-time dynamic multiple-target tracking. The LUSIPHER prototype presented in this paper aims to offer an alternative to Electron-Multiplied CCD (EMCCD) technology and to meet the stringent demands of these new nano-biophotonics imaging techniques. The electron-bombarded CMOS (ebCMOS) device has the potential to respond to this challenge, thanks to the linear gain provided by the accelerating high voltage of the photocathode, the potentially ultra-fast frame rate of CMOS sensors, and single-photon sensitivity. We produced a camera system based on a 640 kpixel ebCMOS together with its acquisition system. The proof of concept of single-photon-based tracking of multiple single emitters is the main result of this paper.

  6. Eight-channel Kirkpatrick-Baez microscope for multiframe x-ray imaging diagnostics in laser plasma experiments.

    PubMed

    Yi, Shengzhen; Zhang, Zhe; Huang, Qiushi; Zhang, Zhong; Mu, Baozhong; Wang, Zhanshan; Fang, Zhiheng; Wang, Wei; Fu, Sizu

    2016-10-01

    Because grazing-incidence Kirkpatrick-Baez (KB) microscopes have better resolution and collection efficiency than pinhole cameras, they have been widely used for x-ray imaging diagnostics of laser inertial confinement fusion. The assembly and adjustment of a multichannel KB microscope must meet stringent requirements for image resolution and reproducible alignment. In the present study, an eight-channel KB microscope was developed for diagnostics by imaging self-emission x-rays with a framing camera at the Shenguang-II Update (SGII-Update) laser facility. A consistent object field of view is ensured in the eight channels using an assembly method based on conical reference cones, which also allow the intervals between the eight images to be tuned to couple with the microstrips of the x-ray framing camera. The eight-channel KB microscope was adjusted via real-time x-ray imaging experiments in the laboratory. This paper describes the details of the eight-channel KB microscope, its optical and multilayer design, the assembly and alignment methods, and results of imaging in the laboratory and at the SGII-Update.

  7. Preliminary Results from Initial Investigations of Ceres' Cratering Record from Dawn Imaging Data

    NASA Astrophysics Data System (ADS)

    Schmedemann, Nico; Michael, Gregory; Ivanov, Boris A.; Kneissl, Thomas; Neesemann, Adrian; Hiesinger, Harald; Jaumann, Ralf; Raymond, Carol A.; Russell, Christopher T.

    2015-04-01

    The highly successful Dawn mission [1] finished data collection at Vesta in 2012 and is now on its way to the dwarf planet Ceres. According to the current Ceres approach timeline of the Dawn mission, the ground resolution of the Dawn FC camera [2] will be about 10 times better than Hubble data [3] at the time of the presentation of this work. This may allow for identification of craters about 15 km in diameter. Initial mapping of sample areas may provide enough information about the cratering record to compare it with the theoretical Ceres crater production function we present at the 46th LPSC conference (March 16-20, 2015, The Woodlands, Texas) [4]. Our preliminary crater production function for Ceres is derived from the assumption of an icy crust just below a thin surface layer of dust [5], and a projectile population that is very similar to the one that impacted the Moon [6]. In order to scale the lunar cratering record to Ceres we use the Ivanov scaling laws [7], which allow for crater scaling based on parameters that can be derived from observations. The lunar-like approach gave reasonably good results for the crater production functions of the asteroids Vesta, Ida, Lutetia and Gaspra [8]. Since the lunar surface is of basaltic composition, the correct scaling between the different materials is challenging. One crucial parameter is the transition diameter from simple to complex craters. Based on the simple-to-complex transition diameter on Iapetus, an icy satellite of Saturn, we expect this transition at a crater size of about 12 km on Ceres. This value may be slightly different due to the different temperatures at Ceres and Iapetus. If the simple-to-complex transition is observed at much larger diameters, the reason could be a substantial fraction of rock in the shallow subsurface of Ceres. In ice-rich surface material, high relaxation rates may also be expected that could change the shape of the crater production function. A thorough geological mapping requires more time than is available and thus will not be completed by the time of the presentation. The first high-resolution imaging data will also provide details about crater morphologies and the major geologic units that will be analyzed during later stages of the Dawn mission. Acknowledgment: This work has been supported by the German Space Agency (DLR) on behalf of the Federal Ministry of Economic Affairs and Energy, grants 50OW1101 (NS, TK, AN) and 50QM1301 (GM). BAI is supported by Program 22 RAS. References: [1] Russell C.T. et al. (2012) Science, 336, 684-686; [2] Sierks H. et al. (2011) Space Science Reviews, 163, 263-327; [3] Li J.Y. et al. (2006) Icarus, 182, 143-160; [4] Schmedemann N. et al. (2015) 46th LPSC, The Woodlands, #1418; [5] McCord T.B. et al. (2012) Ceres: Its Origin, Evolution and Structure and Dawn's Potential Contribution. In: Russell, C.T., Raymond, C.A. (eds.) The Dawn Mission to Minor Planets 4 Vesta and 1 Ceres. Springer, New York, 63-76; [6] Neukum G. and Ivanov B.A. (1994) Crater size distributions and impact probabilities on Earth from lunar, terrestrial planet, and asteroid cratering data. In: Gehrels T. (ed) Hazards due to comets and asteroids. University of Arizona Press, Tucson, 359-416; [7] Ivanov B.A. (2001) Space Science Reviews, 96, 87-104; [8] Schmedemann N. et al. (2014), 103, 104-130.

  8. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    NASA Astrophysics Data System (ADS)

    Goldstein, N.; Dressler, R. A.; Richtsmeier, S. S.; McLean, J.; Dao, P. D.; Murray-Krezan, J.; Fulcoly, D. O.

    2013-09-01

    Recent ground testing of a wide-area camera system and automated star-removal algorithms has demonstrated the potential to detect, quantify, and track deep-space objects using small-aperture cameras and on-board processors. The camera system, which was originally developed for a space-based Wide Area Space Surveillance System (WASSS), operates in a fixed-stare mode, continuously monitoring a wide swath of space and differentiating celestial objects from satellites based on differential motion across the field of view. It would have the greatest utility in low Earth orbit, providing automated and continuous monitoring of deep space with high refresh rates and with particular emphasis on the GEO belt and GEO transfer space. Continuous monitoring allows a concept of change detection and custody maintenance not possible with existing sensors. The detection approach is equally applicable to Earth-based sensor systems. A distributed system of such sensors, either Earth-based or space-based, could provide automated, persistent night-time monitoring of all of deep space. The continuous monitoring provides a daily record of the light curves of all GEO objects above a certain brightness within the field of view. The daily updates of satellite light curves offer a means to identify specific satellites, to note changes in orientation and operational mode, and to cue other SSA assets for higher-resolution queries. The data processing approach may also be applied to larger-aperture, higher-resolution camera systems to extend the sensitivity toward dimmer objects. In order to demonstrate the utility of the WASSS system and data processing, a ground-based field test was conducted in October 2012. We report here the results of the observations made at Magdalena Ridge Observatory using the prototype WASSS camera, which has a 4×60° field of view, <0.05° resolution, a 2.8 cm² aperture, and the ability to view within 4° of the sun. A single camera pointed at the GEO belt provided a continuous night-long record of the intensity and location of more than 50 GEO objects detected within the camera's 60-degree field of view, with a detection sensitivity near the camera's shot-noise limit of Mv=13.7. Performance is anticipated to scale with aperture area, allowing the detection of dimmer objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and an image processing algorithm that exploits the different angular velocities of celestial objects and space objects (SOs). Principal Components Analysis (PCA) is used to filter out all objects moving with the velocity of the celestial frame of reference. The resulting filtered images are projected back into an Earth-centered frame of reference, or into any other relevant frame of reference, and co-added to form a series of images of the GEO objects as a function of time. The PCA approach not only removes the celestial background, but also removes systematic variations in system calibration, sensor pointing, and atmospheric conditions. The resulting images are shot-noise limited and can be exploited to automatically identify deep-space objects, produce approximate state vectors, and track their locations and intensities as a function of time.
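
    A minimal sketch of the PCA filtering step is given below, assuming a stack of frames already registered to the celestial frame and an arbitrary choice of how many leading components to remove; it illustrates the idea only and is not the flight or ground-test code.

      # Remove the celestial background by subtracting the leading principal
      # components of a registered frame stack (stars and slow systematics).
      import numpy as np

      def remove_celestial_background(frames, n_components=3):
          """frames: (T, H, W) stack registered to the sidereal frame."""
          T, H, W = frames.shape
          X = frames.reshape(T, -1)
          Xc = X - X.mean(axis=0)
          # SVD of the (small) T x HW matrix gives the temporal components
          U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
          leading = (U[:, :n_components] * S[:n_components]) @ Vt[:n_components]
          return (Xc - leading).reshape(T, H, W)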

  9. Clouds over Tharsis

    NASA Image and Video Library

    1998-03-13

    Color composite of condensate clouds over Tharsis made from red and blue images with a synthesized green channel. Mars Orbiter Camera wide angle frames from Orbit 48. http://photojournal.jpl.nasa.gov/catalog/PIA00812

  10. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor Active Pixel Sensor (CMOS), establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of its own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  11. Pulsed-neutron imaging by a high-speed camera and center-of-gravity processing

    NASA Astrophysics Data System (ADS)

    Mochiki, K.; Uragaki, T.; Koide, J.; Kushima, Y.; Kawarabayashi, J.; Taketani, A.; Otake, Y.; Matsumoto, Y.; Su, Y.; Hiroi, K.; Shinohara, T.; Kai, T.

    2018-01-01

    Pulsed-neutron imaging is an attractive technique in the research field of energy-resolved neutron radiography; RANS (RIKEN) and RADEN (J-PARC/JAEA) are small and large accelerator-driven pulsed-neutron facilities for such imaging, respectively. To overcome the insufficient spatial resolution of counting-type imaging detectors such as the μNID, nGEM and pixelated detectors, camera detectors combined with a neutron color image intensifier were investigated. At RANS, a center-of-gravity technique was applied to spot images obtained with a CCD camera, and the technique was confirmed to be effective in improving spatial resolution. At RADEN, a high-frame-rate CMOS camera was used, a super-resolution technique was applied, and the spatial resolution was further improved.
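
    The center-of-gravity step can be sketched as intensity-weighted centroiding of segmented spots; the snippet below is an assumed Python/SciPy illustration, not the detector software used at RANS or RADEN.

      # Sub-pixel centroids of isolated neutron spots in a camera frame.
      import numpy as np
      from scipy import ndimage

      def spot_centroids(frame, threshold):
          """Return intensity-weighted centroids of all spots above `threshold`."""
          mask = frame > threshold
          labels, n = ndimage.label(mask)
          # center_of_mass weights each pixel by its intensity, giving
          # sub-pixel spot positions that improve the effective resolution
          return ndimage.center_of_mass(frame, labels, index=range(1, n + 1))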

  12. Comparative Analysis of THOR-NT ATD vs. Hybrid III ATD in Laboratory Vertical Shock Testing

    DTIC Science & Technology

    2013-09-01

    Photographs were taken both pretest and post-test for each test event (Figure 5: rigid fixture placed on the drop table with the ATD seated). Test events were recorded with a Vision Research Phantom V9.1 (Wayne, NJ) high-speed video camera sampling 1000 frames per second.

  13. A Probability-Based Algorithm Using Image Sensors to Track the LED in a Vehicle Visible Light Communication System.

    PubMed

    Huynh, Phat; Do, Trong-Hop; Yoo, Myungsik

    2017-02-10

    This paper proposes a probability-based algorithm to track the LED in vehicle visible light communication systems using a camera. In this system, the transmitters are the vehicles' front and rear LED lights. The receivers are high-speed cameras that take a series of images of the LEDs. The data embedded in the light is extracted by first detecting the position of the LEDs in these images. Traditionally, LEDs are detected according to pixel intensity. However, when the vehicle is moving, motion blur occurs in the LED images, making it difficult to detect the LEDs. Particularly at high speeds, some frames are blurred to a high degree, which makes it impossible to detect the LED or to extract the information embedded in those frames. The proposed algorithm relies not only on the pixel intensity, but also on the optical flow of the LEDs and on statistical information obtained from previous frames. Based on this information, the conditional probability that a pixel belongs to an LED is calculated, and the position of the LED is then determined from this probability. To verify the suitability of the proposed algorithm, simulations are conducted that consider incidents that can happen in a real-world situation, including a change in the position of the LEDs at each frame, as well as motion blur due to the vehicle speed.
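
    A hedged sketch of the probability-fusion idea follows; the Gaussian cue models, the weighting, and the use of Farneback optical flow are assumptions made for illustration and do not reproduce the paper's exact formulation.

      # Combine an intensity cue, an optical-flow consistency cue, and a prior
      # from previous frames into a per-pixel probability of belonging to the LED.
      import cv2
      import numpy as np

      def led_probability(prev_gray, cur_gray, prior_map, expected_flow,
                          sigma_i=30.0, sigma_f=2.0):
          # Bright pixels are more likely to be the LED
          intensity_cue = np.exp(-((255.0 - cur_gray.astype(np.float32)) ** 2)
                                 / (2 * sigma_i ** 2))
          # Pixels whose motion matches the predicted LED motion are favored
          flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                              0.5, 3, 15, 3, 5, 1.2, 0)
          flow_err = np.linalg.norm(flow - expected_flow, axis=2)
          flow_cue = np.exp(-(flow_err ** 2) / (2 * sigma_f ** 2))
          prob = intensity_cue * flow_cue * prior_map
          return prob / (prob.max() + 1e-12)

      # The LED position can then be taken as
      # np.unravel_index(prob.argmax(), prob.shape).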

  14. Fast Fiber-Coupled Imaging Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brockington, Samuel; Case, Andrew; Witherspoon, Franklin Douglas

    HyperV Technologies Corp. has successfully designed, built and experimentally demonstrated a full-scale 1024-pixel, 100-Megaframe/s fiber-coupled camera with 12- or 14-bit depth and record lengths of 32K frames, exceeding our original performance objectives. This high-pixel-count, fiber-optically coupled imaging diagnostic can be used for investigating fast, bright plasma events. In Phase 1 of this effort, a 100-pixel fiber-coupled fast streak camera for imaging plasma jet profiles was constructed and successfully demonstrated. The response from outside plasma physics researchers emphasized development of increased pixel performance as a higher priority than increasing pixel count. In this Phase 2 effort, HyperV therefore focused on increasing the sample rate and bit depth of the photodiode pixel designed in Phase 1, while still maintaining a long record length and holding the cost per channel to levels that allowed up to 1024 pixels to be constructed. The cost per channel was $53.31, very close to our original target of $50 per channel. The system consists of an imaging "camera head" coupled to a photodiode bank with an array of optical fibers. The output of these fast photodiodes is digitized at 100 Megaframes per second and stored in record lengths of 32,768 samples with bit depths of 12 to 14 bits per pixel. Longer record lengths are possible with additional memory. A prototype imaging system with up to 1024 pixels was designed, constructed, and used to take movies of very fast moving plasma jets as a demonstration of the camera's performance capabilities. Some faulty electrical components on the 64 circuit boards resulted in only 1008 functional channels out of 1024 on this first-generation prototype system. We experimentally observed backlit high-speed fan blades in initial camera testing, and then followed that with full movies and streak images of free-flowing high-speed plasma jets (at 30-50 km/s). Jet structure and jet collisions onto metal pillars in the path of the plasma jets were recorded in a single shot. This new fast imaging system is an attractive alternative to conventional fast framing cameras for applications and experiments where imaging events using existing techniques is inefficient or impossible. The development of HyperV's new diagnostic was split into two tracks: a next-generation camera track, in which HyperV built, tested, and demonstrated a prototype 1024-channel camera at its own facility, and a plasma-community beta test track, in which selected plasma physics programs received small systems of a few test pixels to evaluate the expected performance of a full-scale camera on their experiments. These evaluations were performed as part of an unfunded collaboration with researchers at Los Alamos National Laboratory and the University of California at Davis. Results from the prototype 1024-pixel camera are discussed, as well as results from the collaborations with test pixel system deployment sites.

  15. TEM Video Compressive Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ TEM experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing methods [1, 2, 3, 4] to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical compressive sensing inversion. Our simulations show that it should be possible to increase the speed of any camera by at least an order of magnitude. Compressive sensing (CS) combines sensing and compression in one operation, and thus provides an approach that could further improve the temporal resolution while correspondingly reducing the electron dose rate. Because the signal is measured in a compressive manner, fewer total measurements are required. When applied to TEM video capture, compressive imaging could improve acquisition speed and reduce the electron dose rate. CS is a recent concept, and has come to the forefront due to the seminal work of Candès [5]. Since the publication of Candès, there has been enormous growth in the application of CS and development of CS variants. For electron microscopy applications, the concept of CS has also been recently applied to electron tomography [6], and reduction of electron dose in scanning transmission electron microscopy (STEM) imaging [7]. To demonstrate the applicability of coded aperture CS video reconstruction for atomic-level imaging, we simulate compressive sensing on observations of Pd nanoparticles and Ag nanoparticles during exposure to high temperatures and other environmental conditions. Figure 1 highlights the results from the Pd nanoparticle experiment. On the left, 10 frames are reconstructed from a single coded frame; the original frames are shown for comparison. On the right, a selection of three frames is shown from reconstructions at compression levels 10, 20, and 30. The reconstructions, which are not post-processed, are true to the original and degrade in a straightforward manner. The final choice of compression level will obviously depend on both the temporal and spatial resolution required for a specific imaging task, but the results indicate that an increase in speed of better than an order of magnitude should be possible for all experiments. References: [1] P Llull, X Liao, X Yuan et al. Optics Express 21(9), (2013), p. 10526. [2] J Yang, X Yuan, X Liao et al. Image Processing, IEEE Trans 23(11), (2014), p. 4863. [3] X Yuan, J Yang, P Llull et al. In ICIP 2013 (IEEE), p. 14. [4] X Yuan, P Llull, X Liao et al. In CVPR 2014, p. 3318. [5] EJ Candès, J Romberg and T Tao. Information Theory, IEEE Trans 52(2), (2006), p. 489. [6] P Binev, W Dahmen, R DeVore et al. In Modeling Nanoscale Imaging in Electron Microscopy, eds. T Vogt, W Dahmen and P Binev (Springer US), Nanostructure Science and Technology (2012), p. 73. [7] A Stevens, H Yang, L Carin et al. Microscopy 63(1), (2014), p. 41.
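
    The measurement model behind the coded-aperture approach can be sketched in a few lines: each sub-frame is modulated by its own pseudo-random binary mask, and the modulated sub-frames are summed into one detector frame. The toy example below (assumed sizes, random data) illustrates only this forward step; the actual reconstruction requires a compressive-sensing solver.

      # Toy forward model of coded-aperture temporal compression.
      import numpy as np

      rng = np.random.default_rng(1)
      K, H, W = 10, 64, 64
      subframes = rng.random((K, H, W))            # fast dynamics to be recovered
      masks = rng.integers(0, 2, size=(K, H, W))   # per-sub-frame binary coded apertures

      coded_frame = np.sum(masks * subframes, axis=0)   # what the slow camera records
      # A CS inversion (not shown) would estimate `subframes` from `coded_frame`
      # and the known `masks`.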

  16. VizieR Online Data Catalog: BV(RI)c light curves of FF Vul (Samec+, 2016)

    NASA Astrophysics Data System (ADS)

    Samec, R. G.; Nyaude, R.; Caton, D.; van Hamme, W.

    2017-02-01

    The present BVRcIc light curves were taken by DC with the Dark Sky Observatory 0.81m reflector at Phillips Gap, North Carolina. These were taken on 2015 September 12, 13, 14 and 15, and October 15, with a thermoelectrically cooled (-40°C) 2K×2K Apogee Alta camera. Additional observations were obtained remotely with the SARA North 0.91m reflector at KPNO on 2015 September 20 and October 11, with the ARC 2K×2K camera cooled to -110°C. Individual observations were taken at both sites with standard Johnson-Cousins filters, and included 444 field images in B, 451 in V, 443 in Rc, and 445 in Ic. The standard error was ~7 mmag in each of B, V, Rc and Ic. Nightly images were calibrated with 25 bias frames, five flat frames in each filter, and ten 300 s dark frames. The exposure times were 40-50 s in B, 25-30 s in V, and 15-25 s in Rc and Ic. Our observations are listed in Table 1. (1 data file).

  17. MPCM: a hardware coder for super slow motion video sequences

    NASA Astrophysics Data System (ADS)

    Alcocer, Estefanía; López-Granado, Otoniel; Gutierrez, Roberto; Malumbres, Manuel P.

    2013-12-01

    In the last decade, improvements in VLSI integration and image sensor technologies have led to a frenetic rush to provide image sensors with higher resolutions and faster frame rates. As a result, video devices have been designed to capture real-time video in high-resolution formats at frame rates reaching 1,000 fps and beyond. These ultrahigh-speed video cameras are widely used in scientific and industrial applications, such as car crash tests, combustion research, materials research and testing, fluid dynamics, and flow visualization, that demand real-time video capture at extremely high frame rates with high-definition formats. Therefore, data storage capacity, communication bandwidth, processing time, and power consumption are critical parameters that should be carefully considered in their design. In this paper, we propose a fast FPGA implementation of a simple codec called modulo-pulse code modulation (MPCM), which reduces the bandwidth requirements by up to a factor of 1.7 at the same image quality when compared with PCM coding. This allows current high-speed cameras to capture continuously through a 40-Gbit Ethernet point-to-point link.

  18. Visual tracking using neuromorphic asynchronous event-based cameras.

    PubMed

    Ni, Zhenjiang; Ieng, Sio-Hoi; Posch, Christoph; Régnier, Stéphane; Benosman, Ryad

    2015-04-01

    This letter presents a novel, computationally efficient, and robust pattern tracking method based on time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that the sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometries, similarities, and affine distortions and allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time, which is currently underexploited by most artificial vision systems, the presented method is able to solve ambiguous cases of object occlusion that classical frame-based techniques handle poorly.
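
    As a rough illustration of per-event updating, the sketch below reduces the method to a translation-only tracker with an assumed gain and gating distance; the published method estimates full isometric, similarity, and affine transformations rather than this simplification.

      # Per-event model update: each incoming event pulls the model position
      # toward itself by a small gain, processed as events arrive.
      import numpy as np

      class EventTracker:
          def __init__(self, init_xy, gain=0.05, gate=10.0):
              self.pos = np.asarray(init_xy, dtype=float)
              self.gain = gain      # per-event update weight (assumed value)
              self.gate = gate      # ignore events far from the current estimate

          def update(self, event_xy):
              e = np.asarray(event_xy, dtype=float)
              if np.linalg.norm(e - self.pos) < self.gate:
                  self.pos += self.gain * (e - self.pos)
              return self.pos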

  19. Practical low-cost visual communication using binary images for deaf sign language.

    PubMed

    Manoranjan, M D; Robinson, J A

    2000-03-01

    Deaf sign language transmitted by video requires a temporal resolution of 8 to 10 frames/s for effective communication. Conventional videoconferencing applications, when operated over low-bandwidth telephone lines, provide very low temporal resolution, on the order of less than a frame per second, resulting in jerky movement of objects. This paper presents a practical solution for sign language communication, offering adequate temporal resolution using moving binary sketches, or cartoons, implemented on standard personal computer hardware with low-cost cameras and communicating over telephone lines. To extract cartoon points, an efficient feature extraction algorithm adaptive to the global statistics of the image is proposed. To improve the subjective quality of the binary images, irreversible preprocessing techniques, such as isolated point removal and predictive filtering, are used. A simple, efficient, and fast recursive temporal prefiltering scheme, using histograms of successive frames, reduces the additive and multiplicative noise from low-cost cameras. An efficient three-dimensional (3-D) compression scheme codes the binary sketches. Subjective tests performed on the system confirm that it can be used for sign language communication over telephone lines.
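
    The isolated-point-removal preprocessing can be sketched as dropping any foreground pixel with no 8-connected foreground neighbour, as in the assumed NumPy/SciPy snippet below (not the paper's implementation).

      # Remove isolated foreground pixels from a binary cartoon image.
      import numpy as np
      from scipy import ndimage

      def remove_isolated_points(binary):
          """binary: 2-D array of 0/1 cartoon points."""
          kernel = np.ones((3, 3), dtype=int)
          kernel[1, 1] = 0                       # count neighbours only
          neighbours = ndimage.convolve(binary.astype(int), kernel, mode='constant')
          return np.where((binary > 0) & (neighbours == 0), 0, binary)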

  20. Vehicle counting system using real-time video processing

    NASA Astrophysics Data System (ADS)

    Crisóstomo-Romero, Pedro M.

    2006-02-01

    Transit studies are important for planning a road network with optimal vehicular flow, and a vehicle count is essential. This article presents a vehicle counting system based on video processing. An advantage of such a system is the greater detail it can obtain, such as the shape, size, and speed of vehicles. The system uses a video camera placed above the street to image traffic in real time. The camera must be placed at least 6 meters above street level to achieve adequate acquisition quality. Fast image processing algorithms and small image dimensions are used to allow real-time processing. Digital filters, mathematical morphology, segmentation, and other techniques allow all vehicles in the image sequences to be identified and counted. The system was implemented under Linux on a 1.8 GHz Pentium 4 computer. A successful count was obtained at frame rates of 15 frames per second for images of size 240x180 pixels and 24 frames per second for images of size 180x120 pixels, thus being able to count vehicles whose speeds do not exceed 150 km/h.
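
    A minimal sketch of such a processing chain, using background subtraction, morphological cleanup, and connected-component counting with assumed parameter values (not the authors' implementation), is shown below.

      # Count moving vehicles per frame from an overhead street camera.
      import cv2

      def count_vehicles(video_path, min_area=400):
          cap = cv2.VideoCapture(video_path)
          bg = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
          counts = []
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              mask = bg.apply(frame)
              mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove noise
              mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill gaps
              n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
              counts.append(sum(1 for i in range(1, n)
                                if stats[i, cv2.CC_STAT_AREA] >= min_area))
          cap.release()
          return counts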

  1. MS Musgrave conducts CFES experiment on middeck

    NASA Image and Video Library

    1983-04-09

    STS006-03-381 (4-9 April 1983) --- Astronaut F. Story Musgrave, STS-6 mission specialist, monitors the activity of a sample in the continuous flow electrophoresis system (CFES) aboard the Earth-orbiting space shuttle Challenger. Dr. Musgrave is in the middeck area of the spacecraft. He has mounted a 35mm camera to record the activity through the window of the experiment. This frame was also photographed with a 35mm camera. Photo credit: NASA

  2. Various views of STS-95 Senator John Glenn during training

    NASA Image and Video Library

    1998-06-18

    S98-08733 (9 April 1998) --- Looking through the view finder on a camera, U.S. Sen. John H. Glenn Jr. (D.-Ohio) gets a refresher course in photography from a JSC crew trainer (out of frame, right). The STS-95 payload specialist carried a 35mm camera on his historic MA-6 flight over 36 years ago. The photo was taken by Joe McNally, National Geographic, for NASA.

  3. Advantages of computer cameras over video cameras/frame grabbers for high-speed vision applications

    NASA Astrophysics Data System (ADS)

    Olson, Gaylord G.; Walker, Jo N.

    1997-09-01

    Cameras designed to work specifically with computers can have certain advantages over cameras loosely defined as 'video' cameras. In recent years the camera type distinctions have become somewhat blurred, with a large presence of 'digital cameras' aimed more at the home market. This latter category is not considered here. The term 'computer camera' herein means one that has low-level computer (and software) control of the CCD clocking. These can often be used to satisfy some of the more demanding machine vision tasks, and in some cases with a higher rate of measurements than video cameras. Several such applications are described here, including some that use recently designed CCDs offering good combinations of parameters such as noise, speed, and resolution. Among the considerations for the choice of camera type in any given application are effects such as 'pixel jitter' and 'anti-aliasing.' Some of these effects may only be relevant if there is a mismatch between the number of pixels per line in the camera CCD and the number of analog-to-digital (A/D) sampling points along a video scan line. For the computer camera case these numbers are guaranteed to match, which alleviates some measurement inaccuracies and leads to higher effective resolution.

  4. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
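
    The use of the measured response curve can be sketched as fitting a monotonic mapping from integrated signal back to source brightness; the interpolation-based snippet below is an assumed simplification of whatever fit the custom calibration software applies.

      # Build an inverse response curve from the artificial-variable-star frames
      # and apply it to later measurements.
      import numpy as np

      def build_inverse_response(known_brightness, measured_signal):
          """Returns a function mapping a measured integrated signal to brightness."""
          order = np.argsort(measured_signal)
          sig = np.asarray(measured_signal, dtype=float)[order]
          bri = np.asarray(known_brightness, dtype=float)[order]
          return lambda s: np.interp(s, sig, bri)

      # calibrate = build_inverse_response(brightness_ramp, integrated_counts)
      # debris_brightness = calibrate(integrated_counts_of_debris_frame)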

  5. Dawn approaches Ceres: Analysis of first FC color data

    NASA Astrophysics Data System (ADS)

    Hoffmann, Martin; Nathues, Andreas; Schäfer, Michael; Russell, Christopher T.; Schäfer, Tanja; Mengel, Kurt; Reddy, Vishnu; Thangjam, Guneshwar S.; Sierks, Holger; Christensen, Ulrich; Hiesinger, Harald; Le Corre, Lucille; Gutiérrez-Marqués, Pablo; Büttner, Irene; Hall, Ian; Ripken, Joachim; Sykes, Mark V.; Li, Jian-Yang

    2015-04-01

    Since December 1, 2014, the Dawn spacecraft has been obtaining images of Ceres with its onboard Framing Camera in seven color bands and one clear (panchromatic) filter. The apparent size of Ceres has increased during this time from a diameter of 9 pixels to about a quarter of the full frame. The higher resolution of the more recent data reveals first details of the topography and the distribution of reflectances. We will also show evidence of the extent and elevation of impact structures and other geologic features. These are first indications of their geologic context and of unique properties of the surface and evolution of Ceres. The relationship of these features to previously resolved HST observations (Li et al., 2006) and to the recent discussion of water emission activity (Küppers et al., 2014), as well as their distribution in longitude and latitude, will be discussed. The most recent data are anticipated to resolve water-related features comparable to those on icy satellites. Potential consequences for the upcoming high-resolution data and their planning will be shown. The first data, obtained on December 1, have been used to start studying the phase curve and to derive an integrated spectrum of Ceres (17 co-registered pixels around the center of the disk). The data were integrated into a single spectrum between 0.44 µm and 0.96 µm. The spectrum is essentially flat over all bands within the accuracy of the data (±0.01 in reflectance). It is consistent with previous Earth-based spectra (Vilas and McFadden, 1992; Burbine et al., 2002; Li et al., 2006). Potential sites showing spectral absorption features in the visual wavelength range will be discussed. The distribution of reflectances at positions relative to the sub-solar longitude also confirms the expected extrapolation of the phase curve from ground-based observations. A comparison of the observed phase effect and detected surface features will be presented. Thus differences in surface roughness on different size scales can be discussed. They are related to presumed effects like relaxation associated with a potential subsurface water regime of Ceres. References: Burbine, T. H., Rivkin, A. S., Noble, S. K., Mothe-Diniz, T., Bottke, W. F., McCoy, T. J., Dyar, M. D., Thomas, C. A. (2008). Oxygen and asteroids. Reviews in Mineralogy & Geochemistry 68, 273-343; Küppers, M., O'Rourke, L., Bockelée-Morvan, D., Zakharov, V., Lee, S., von Allmen, P., Carry, B., Teyssier, D., Marston, A., Müller, T., Crovisier, J., Barucci, M. A., Moreno, R. (2014). Localized sources of water vapour on the dwarf planet (1) Ceres. Nature 505, 525-527; Li, J.-Y., McFadden, L. A., Parker, J. Wm., Young, E. F., Stern, S. A., Thomas, P. C., Russell, C. T., Sykes, M. V. (2006). Photometric analysis of 1 Ceres and surface mapping from HST observations. Icarus 182, 143-160; Vilas, F., McFadden, L. A. (1992). CCD reflectance spectra of selected asteroids. I. Presentation and data analysis considerations. Icarus 100, 85-94.

  6. Ceres' darkest secret and its putative exosphere

    NASA Astrophysics Data System (ADS)

    Schorghofer, N.; Mazarico, E.; Platz, T.; Schroeder, S.; Byrne, S.; Carsenty, U.; Combe, J. P.; Ermakov, A.; McFadden, L. A.; Prettyman, T. H.; Preusker, F.; Raymond, C. A.; Russell, C. T.

    2016-12-01

    Craters near Ceres' rotational poles can be shadowed year-round and trap volatiles. The persistently shadowed regions (PSRs) have been mapped in the northern hemisphere in two ways: by illumination modeling based on the topography and by stacking of images acquired near summer solstice. Scattered light reveals bright crater floor deposits (BCFDs) in a few PSRs. The lack of BCFDs in most PSRs can in part be explained by changes in Ceres' obliquity (axis tilt). At least one BCFD is illuminated and spectroscopically identified as H2O ice; this deposit is exceptionally bright and unusual morphologically. The BCFDs are likely water ice, either delivered through the exosphere or exposed ground ice. The remarkably shallow depths at which water ice is encountered on Ceres, on a global scale, imply that only a small amount of H2O was supplied to its water exosphere from this endogenic source. Ice that accumulated in the PSRs is hence easily dominated by other sources. The lack of optically thick ice deposits in most PSRs provides an upper bound on the exogenic delivery of water to Ceres, estimated as <10^9 kg since the most recent obliquity maximum 14 kyr ago. Water molecules are only barely gravitationally bound to Ceres at thermal speeds, but heavier species can be long-lived in the exosphere due to the low photo-destruction rates. Nevertheless, there is no observational evidence of other exospheric species yet. These results are based on observations by the FC (Framing Camera), VIR (Visible and Infrared Spectrometer), GRaND (Gamma-Ray and Neutron Spectrometer), and Gravity Science investigation of the Dawn spacecraft, which continues to advance our understanding not only of Ceres but of processes relevant to other Solar System bodies as well.

  7. Geological Structures in the Walls of Vestan Craters

    NASA Technical Reports Server (NTRS)

    Mittlefehldt, David; Nathues, A.; Beck, A. W.; Hoffmann, M.; Schaefer, M.; Williams, D. A.

    2014-01-01

    A compelling case can be made that Vesta is the parent asteroid for the howardite, eucrite and diogenite (HED) meteorites [1], although this interpretation has been questioned [2]. Generalized models for the structure of the crust of Vesta have been developed based on petrologic studies of basaltic eucrites, cumulate eucrites and diogenites. These models use inferred cooling rates for different types of HEDs and compositional variations within the clan to posit that the lower crust is dominantly diogenitic in character, cumulate eucrites occur deep in the upper crust, and basaltic eucrites dominate the higher levels of the upper crust [3-5]. These models lack fine-scale resolution and thus do not allow for detailed predictions of crustal structure. Geophysical models predict dike and sill intrusions ought to be present, but their widths may be quite small [6]. The northern hemisphere of Vesta is heavily cratered, and the southern hemisphere is dominated by two 400-500 km diameter basins that excavated deep into the crust [7-8]. Physical modeling of regolith formation on 300 km diameter asteroids predicts that debris layers would reach a few km in thickness, while on asteroids of Vesta's diameter regolith thicknesses would be less [9]. This agrees well with the estimated ~1 km thickness of local debris excavated by a 45 km diameter vestan crater [10]. Large craters and basins may have punched through the regolith/megaregolith and exposed primary vestan crustal structures. We will use Dawn Framing Camera (FC) [11] images and color ratio maps from the High Altitude and Low Altitude Mapping Orbits (HAMO, 65 m/pixel; LAMO, 20 m/pixel) to evaluate structures exposed on the walls of craters: two examples are discussed here.

  8. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    In Astrotech's Payload Processing Facility, technicians help secure the Dawn spacecraft onto a moveable stand. Dawn will be moved into clean room C for unbagging and further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  9. Dawn Spacecraft Processing

    NASA Image and Video Library

    2007-04-10

    In Astrotech's Payload Processing Facility, an overhead crane lifts the Dawn spacecraft from its transporter. Dawn will be moved into clean room C for unbagging and further processing. Dawn's mission is to explore two of the asteroid belt's most intriguing and dissimilar occupants: asteroid Vesta and the dwarf planet Ceres. The Dawn mission is managed by JPL, a division of the California Institute of Technology in Pasadena, for NASA's Science Mission Directorate in Washington, D.C.

  10. MESSENGER Departs Mercury

    NASA Image and Video Library

    2008-01-30

    After NASA's MESSENGER spacecraft completed its successful flyby of Mercury, the Narrow Angle Camera (NAC), part of the Mercury Dual Imaging System (MDIS), took these images of the receding planet. This is a frame from an animation.

  11. Near-infrared high-resolution real-time omnidirectional imaging platform for drone detection

    NASA Astrophysics Data System (ADS)

    Popovic, Vladan; Ott, Beat; Wellig, Peter; Leblebici, Yusuf

    2016-10-01

    Recent technological advancements in hardware systems have enabled higher-quality cameras. State-of-the-art panoramic systems use them to produce videos with a resolution of 9000 x 2400 pixels at a rate of 30 frames per second (fps) [1]. Many modern applications use object tracking to determine the speed and the path taken by each object moving through a scene. The detection requires detailed pixel analysis between two frames. In fields like surveillance systems or crowd analysis, this must be achieved in real time [2]. In this paper, we focus on the system-level design of a multi-camera sensor acquiring the near-infrared (NIR) spectrum and its ability to detect mini-UAVs in a representative rural Swiss environment. The presented results show UAV detection from a field trial that we conducted in August 2015.

  12. Fast frame rate rodent cardiac x-ray imaging using scintillator lens coupled to CMOS camera

    NASA Astrophysics Data System (ADS)

    Swathi Lakshmi, B.; Sai Varsha, M. K. N.; Kumar, N. Ashwin; Dixit, Madhulika; Krishnamurthi, Ganapathy

    2017-03-01

    Micro-Computed Tomography (MCT) systems for small animal imaging play a critical role in monitoring disease progression and therapy evaluation. In this work, an in-house built micro-CT system equipped with an X-ray scintillator lens coupled to a commercial CMOS camera was used to test the feasibility of its application to Digital Subtraction Angiography (DSA). The literature has reported such studies being done with clinical X-ray tubes that can be pulsed rapidly or with rotating gantry systems, thus increasing the cost and infrastructural requirements. The feasibility of DSA was evaluated by injecting iodinated contrast agent (ICA) through the tail vein of a mouse. Projection images of the heart were acquired pre- and post-contrast using the high-frame-rate X-ray detector, and processing was done to visualize the transit of ICA through the heart.
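
    The subtraction step of DSA can be sketched as log-subtracting an averaged pre-contrast mask from each post-contrast projection, as in the assumed NumPy snippet below (not the authors' processing pipeline).

      # Isolate the iodinated contrast agent by mask subtraction in log space.
      import numpy as np

      def dsa_sequence(pre_frames, post_frames, eps=1.0):
          """pre_frames, post_frames: (N, H, W) arrays of raw projections."""
          mask = np.log(np.mean(pre_frames, axis=0) + eps)   # averaged pre-contrast mask
          return np.log(post_frames + eps) - mask            # contrast-only sequence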

  13. Tracking Sunspots from Mars, April 2015 Animation

    NASA Image and Video Library

    2015-07-10

    This single frame from a sequence of six images in an animation shows sunspots as viewed by NASA's Curiosity Mars rover from April 4 to April 15, 2015. From Mars, the rover was in position to see the side of the sun facing away from Earth. The images were taken by the right-eye camera of Curiosity's Mast Camera (Mastcam), which has a 100-millimeter telephoto lens. The view on the left of each pair in this sequence has little processing other than calibration and putting north toward the top of each frame. The view on the right of each pair has been enhanced to make sunspots more visible. The apparent granularity throughout these enhanced images is an artifact of this processing. The sunspots seen in this sequence eventually produced two solar eruptions, one of which affected Earth. http://photojournal.jpl.nasa.gov/catalog/PIA19802

  14. Research on target tracking algorithm based on spatio-temporal context

    NASA Astrophysics Data System (ADS)

    Li, Baiping; Xu, Sanmei; Kang, Hongjuan

    2017-07-01

    In this paper, a novel target tracking algorithm based on spatio-temporal context is proposed. During tracking, camera shake or occlusion may cause tracking to fail; the proposed algorithm addresses this problem effectively. The spatio-temporal context algorithm is the core of the method. The target region in the first frame is selected manually with the mouse, and the spatio-temporal context algorithm is then used to track the target through the sequence of frames. During this process, a similarity measure based on a perceptual hash algorithm is used to judge the tracking results. If tracking fails, the Mean Shift algorithm is re-initialized for subsequent target tracking. Experimental results show that the proposed algorithm achieves real-time and stable tracking under camera shake or target occlusion.
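
    The perceptual-hash check can be sketched with a simple average hash and a Hamming-distance similarity; the choice of hash variant and the threshold-based re-initialization below are assumptions, since the abstract does not give those details.

      # Judge tracking results by comparing perceptual hashes of image patches.
      import cv2
      import numpy as np

      def average_hash(patch, hash_size=8):
          small = cv2.resize(cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY),
                             (hash_size, hash_size), interpolation=cv2.INTER_AREA)
          return (small > small.mean()).flatten()

      def hash_similarity(patch_a, patch_b):
          ha, hb = average_hash(patch_a), average_hash(patch_b)
          return 1.0 - np.count_nonzero(ha != hb) / ha.size   # 1.0 means identical

      # If hash_similarity(template, tracked_patch) drops below a chosen threshold,
      # the tracker is judged to have failed and is re-initialized (Mean Shift in
      # the paper).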

  15. Precise Trajectory Reconstruction of CE-3 Hovering Stage By Landing Camera Images

    NASA Astrophysics Data System (ADS)

    Yan, W.; Liu, J.; Li, C.; Ren, X.; Mu, L.; Gao, X.; Zeng, X.

    2014-12-01

    Chang'E-3 (CE-3) is part of the second phase of the Chinese Lunar Exploration Program, incorporating a lander and China's first lunar rover. It landed successfully on 14 December 2013. The hovering and obstacle avoidance stages are essential for CE-3's safe soft landing, so a precise spacecraft trajectory in these stages is of great significance for verifying the orbital control strategy, optimizing the orbital design, accurately determining the landing site of CE-3, and analyzing the geological background of the landing site. Because these stages last only about 25 s, it is difficult to capture the spacecraft's subtle movements with the Measurement and Control System or with radio observations. Against this background, trajectory reconstruction based on landing camera images can be used to obtain the trajectory of CE-3, owing to technical advantages such as independence from a lunar-gravity-field spacecraft dynamics model, high resolution, and high frame rate. In this paper, the trajectory of CE-3 before and after entering the hovering stage was reconstructed from landing camera images from frame 3092 to frame 3180, spanning about 9 s, using Single Image Space Resection (SISR). The results show that CE-3's subtle movements during the hovering stage are revealed by the reconstructed trajectory. The horizontal accuracy of the spacecraft position was up to 1.4 m, while the vertical accuracy was up to 0.76 m. The results can be used for orbital control strategy analysis and other applications.
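
    Space resection from a single image is closely related to the perspective-n-point problem; the sketch below uses OpenCV's PnP solver with assumed control points and intrinsics to recover a camera position and attitude, as a stand-in for the SISR formulation used in the paper.

      # Recover camera (spacecraft) position and attitude from one frame, given
      # known ground control points and their pixel coordinates.
      import cv2
      import numpy as np

      def resect_single_image(object_pts, image_pts, K, dist=None):
          """object_pts: (N, 3) ground coordinates; image_pts: (N, 2) pixel coords;
          K: 3x3 intrinsic matrix. Returns camera position and rotation matrix."""
          ok, rvec, tvec = cv2.solvePnP(object_pts.astype(np.float64),
                                        image_pts.astype(np.float64),
                                        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
          if not ok:
              raise RuntimeError("resection failed")
          R, _ = cv2.Rodrigues(rvec)
          camera_position = -R.T @ tvec      # ground-frame position of the camera
          return camera_position.ravel(), R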

  16. High speed fluorescence imaging with compressed ultrafast photography

    NASA Astrophysics Data System (ADS)

    Thompson, J. V.; Mason, J. D.; Beier, H. T.; Bixler, J. N.

    2017-02-01

    Fluorescent lifetime imaging is an optical technique that facilitates imaging molecular interactions and cellular functions. Because the excited lifetime of a fluorophore is sensitive to its local microenvironment,1, 2 measurement of fluorescent lifetimes can be used to accurately detect regional changes in temperature, pH, and ion concentration. However, typical state of the art fluorescent lifetime methods are severely limited when it comes to acquisition time (on the order of seconds to minutes) and video rate imaging. Here we show that compressed ultrafast photography (CUP) can be used in conjunction with fluorescent lifetime imaging to overcome these acquisition rate limitations. Frame rates up to one hundred billion frames per second have been demonstrated with compressed ultrafast photography using a streak camera.3 These rates are achieved by encoding time in the spatial direction with a pseudo-random binary pattern. The time domain information is then reconstructed using a compressed sensing algorithm, resulting in a cube of data (x,y,t) for each readout image. Thus, application of compressed ultrafast photography will allow us to acquire an entire fluorescent lifetime image with a single laser pulse. Using a streak camera with a high-speed CMOS camera, acquisition rates of 100 frames per second can be achieved, which will significantly enhance our ability to quantitatively measure complex biological events with high spatial and temporal resolution. In particular, we will demonstrate the ability of this technique to do single-shot fluorescent lifetime imaging of cells and microspheres.
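
    The CUP measurement can be sketched as three operators applied to the dynamic scene: spatial encoding with a pseudo-random binary mask, temporal shearing as in a streak camera, and integration on the detector. The toy forward model below uses assumed sizes and random data and omits the compressed-sensing reconstruction.

      # Toy forward model of compressed ultrafast photography (encode, shear, integrate).
      import numpy as np

      rng = np.random.default_rng(2)
      T, H, W = 20, 64, 64
      scene = rng.random((T, H, W))            # dynamic scene I(x, y, t)
      mask = rng.integers(0, 2, size=(H, W))   # static pseudo-random binary code

      detector = np.zeros((H + T, W))
      for t in range(T):
          # encode with the mask, shear by t rows (streak), and integrate
          detector[t:t + H, :] += mask * scene[t]
      # A CS solver would recover the (x, y, t) cube from `detector` and `mask`.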

  17. Geologic Map of the Northern Hemisphere of Vesta

    NASA Astrophysics Data System (ADS)

    Hiesinger, Harald; Ruesch, Ottaviano; Blewett, Dave T.; Buczkowski, Debra L.; Scully, Jennifer; Williams, Dave A.; Aileen Yingst, R.; Russell, Chris T.; Raymond, Carol A.

    2013-04-01

    For more than a year, the NASA Dawn mission acquired Framing Camera (FC) images from orbit around Vesta. The surface of the asteroid was completely imaged [1] before Dawn left for its next target, the dwarf planet Ceres. In an early phase of the mission, the southern and equatorial regions were imaged, allowing the production of several geologic quadrangle maps [2]. During the second High Altitude Mapping Orbit (HAMO-2), the northern hemisphere became illuminated and visible. Here we present the first geologic map of the northern vestan hemisphere, from 21°N to 85°N, derived mainly from HAMO-2 observations. Detailed studies of specific geologic features within this hemisphere are presented elsewhere [e.g., 3, 4]. For our geologic map we used high-resolution FC images [5] at ~20 m/pixel from the Low Altitude Mapping Orbit (LAMO), which unfortunately only cover the southern part of the study area (21°N to 45°N). For areas farther north, LAMO images are supplemented with HAMO-2 images, which have a pixel scale of about 70 m/pixel. During the departure phase, images of the north pole area with even lower spatial resolution were acquired. Due to observational constraints, considerable shadowing is present north of 75°N. From these data, an albedo mosaic and a stereo-photogrammetric digital terrain model [6] were produced, which serve as the basis for our geologic map. For the geologic mapping at a scale of 1:500,000, all data were incorporated into a Geographic Information System (ArcGIS). We have identified several geologic units within the study area, including cratered highland material (ch) and the Saturnalia Formation (Sf), which is characterized by large-scale ridges and troughs, presumably associated with the south polar Veneneia impact [7]. In addition, we mapped undifferentiated crater material (uc), discontinuous ejecta material (dem), and dark/bright crater material and dark/bright crater ray material (dc/bc and dcr/bcr). We will present a detailed description of the geologic units and their relative stratigraphy [8]. References: [1] Russell C. T. et al. (2012) GSA Ann. Meet., 152-1. [2] Yingst R. A. et al. (2012) EGU Gen. Ass., 6225. [3] Blewett D. T. et al. (2012) GSA Ann. Meet., 152-9. [4] Scully J. (2012) DPS Meet. 44, #207.08. [5] Sierks H. et al. (2011) Space Sci. Rev. [6] Preusker F. et al. (2012) LPSC 43, #2012. [7] Jaumann R. et al. (2012) Science 336, 687-690. [8] Hiesinger H. et al. (2013) LPSC 44, #2582.

  18. Ceres' obliquity history: implications for permanently shadowed regions

    NASA Astrophysics Data System (ADS)

    Ermakov, A.; Mazarico, E.; Schroeder, S.; Carsenty, U.; Schorghofer, N.; Raymond, C. A.; Zuber, M. T.; Smith, D. E.; Russell, C. T.

    2016-12-01

    The Dawn spacecraft's Framing Camera (FC) images and radio-tracking data have allowed precise determination of Ceres' rotational pole and obliquity. Presently, the obliquity (ɛ) of Ceres is ≈4°. Because of the low obliquity, permanently shadowed regions (PSRs) can exist on Ceres, and have been identified using both images and shape models (Schorghofer et al., 2016). These observations make Ceres only the third body in the solar system with recognized PSRs after the Moon (Zuber et al., 1997) and Mercury (Chabot et al., 2012). Some craters in Ceres' polar regions possess bright crater floor deposits (BCFDs). These crater floors are typically in shadow. However, they receive light scattered from the surrounding sunlit crater walls and therefore can be seen by FC. These bright deposits are hypothesized to be water ice accumulated in PSR cold traps, analogous to the Moon (Watson et al., 1961). The existence of the PSRs critically depends on the body's obliquity. The goal of this work is to study the history of Ceres' obliquity. Knowing past obliquity variations can shed light on the history of PSRs, and can help constrain the water-ice deposition time scales. We integrate the obliquity of Ceres over the last 3 My for the range of C/(M R_vol^2) constrained by the Dawn gravity measurements (Park et al., 2016; Ermakov et al., 2016) using methods described in Wisdom & Holman (1991) and Touma & Wisdom (1994). The obliquity history for C/(M R_vol^2) = 0.392 is shown in Fig. 1. The integrations show that the obliquity of Ceres undergoes large oscillations with a main period of T = 25 ky and a maximum of 19.7°. The obliquity oscillations are driven by the periodic change of Ceres' orbit inclination (T = 22 ky) and the pole precession (T = 210 ky). Ceres passed a local obliquity minimum 1327 years ago (ɛmin = 2.4°). The most recent maximum was 13,895 years ago (ɛmax = 18.5°). At such high obliquity, most of the present-day PSRs receive direct sunlight. We find a correlation between BCFDs and the most persistent PSRs. In the northern hemisphere, we find that only two PSRs remain at ɛmax. Interestingly, these PSRs contain BCFDs. In the southern hemisphere, we find that only one crater with a BCFD remains in shadow at ɛmax. Ongoing work includes computation of the irradiance of individual BCFDs given the orbital and obliquity history.

  19. DRUG ABUSE WARNING NETWORK (DAWN) DATABASE

    EPA Science Inventory

    The Drug Abuse Warning Network (DAWN) is an ongoing drug abuse data collection system sponsored by SAMHSA's Office of Applied Studies. DAWN collects data from: (1) hospital emergency departments (EDs) and (2) medical examiners (MEs). The DAWN ED component relies on a nationally r...

  20. Barnacle Bill in Super Resolution from Super Panorama

    NASA Image and Video Library

    1998-07-03

    "Barnacle Bill" is a small rock immediately west-northwest of the Mars Pathfinder lander and was the first rock visited by the Sojourner Rover's alpha proton X-ray spectrometer (APXS) instrument. This image shows super resolution techniques applied to the first APXS target rock, which was never imaged with the rover's forward cameras. Super resolution was applied to help to address questions about the texture of this rock and what it might tell us about its mode of origin. This view of Barnacle Bill was produced by combining the "Super Panorama" frames from the IMP camera. Super resolution was applied to help to address questions about the texture of these rocks and what it might tell us about their mode of origin. The composite color frames that make up this anaglyph were produced for both the right and left eye of the IMP. The composites consist of 7 frames in the right eye and 8 frames in the left eye, taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be. These panchromatic frames were then colorized with the red, green, and blue filtered images from the same sequence. The color balance was adjusted to approximate the true color of Mars. The anaglyph view was produced by combining the left with the right eye color composite frames by assigning the left eye composite view to the red color plane and the right eye composite view to the green and blue color planes (cyan), to produce a stereo anaglyph mosaic. This mosaic can be viewed in 3-D on your computer monitor or in color print form by wearing red-blue 3-D glasses. http://photojournal.jpl.nasa.gov/catalog/PIA01409

  1. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
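
    As a quick consistency check on the numbers quoted above, the angular width of the detector is roughly the pixel count times the pixel scale; the small mismatches are expected because the quoted scales are nominal center values and the wide-angle Hazcam optics are far from rectilinear. A minimal sketch:

      import math

      # Approximate detector-wide field of view from pixel count x pixel scale.
      for name, pixels, mrad_per_px, quoted_fov in [("Navcam", 1024, 0.82, 45),
                                                    ("Hazcam", 1024, 2.1, 124)]:
          est_deg = math.degrees(pixels * mrad_per_px / 1000.0)
          print(f"{name}: ~{est_deg:.0f} deg estimated vs {quoted_fov} deg quoted")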

  2. C-RED one: ultra-high speed wavefront sensing in the infrared made possible

    NASA Astrophysics Data System (ADS)

    Gach, J.-L.; Feautrier, Philippe; Stadler, Eric; Greffe, Timothee; Clop, Fabien; Lemarchand, Stéphane; Carmignani, Thomas; Boutolleau, David; Baker, Ian

    2016-07-01

    First Light Imaging's CRED-ONE infrared camera is capable of capturing up to 3500 full frames per second with sub-electron readout noise. This breakthrough has been made possible by the use of an e-APD infrared focal plane array, a genuinely disruptive technology in imaging. We will show the performance of the camera and its main features, and compare them to other high-performance wavefront sensing cameras such as OCAM2 in the visible and in the infrared. The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement N° 673944.

  3. An interactive web-based system using cloud for large-scale visual analytics

    NASA Astrophysics Data System (ADS)

    Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.

    2015-03-01

    Network cameras have been growing rapidly in number in recent years. Thousands of public network cameras provide a tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g., different brands and resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
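
    The adaptation described above (wrap an existing single-frame analysis so the system can run it against many cameras) is essentially a callback pattern. The sketch below is purely hypothetical and does not use the system's actual API; it only illustrates the shape of the change a user would make.

      import cv2

      def analyze_frame(frame):
          """User's existing single-frame analysis (here: count edge pixels)."""
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          edges = cv2.Canny(gray, 100, 200)
          return int((edges > 0).sum())

      def on_new_frame(camera_id, frame, results):
          """Hypothetical per-frame hook a cloud framework might invoke."""
          results.setdefault(camera_id, []).append(analyze_frame(frame))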

  4. Linear array of photodiodes to track a human speaker for video recording

    NASA Astrophysics Data System (ADS)

    DeTone, D.; Neal, H.; Lougheed, R.

    2012-12-01

    Communication and collaboration using stored digital media has garnered more interest from many areas of business, government and education in recent years. This is due primarily to improvements in the quality of cameras and speed of computers. An advantage of digital media is that it can serve as an effective alternative when physical interaction is not possible. Video recordings that allow viewers to discern a presenter's facial features, lips and hand motions are more effective than videos that do not. To attain this, one must maintain a video capture in which the speaker occupies a significant portion of the captured pixels. However, camera operators are costly, and often do an imperfect job of tracking presenters in unrehearsed situations. This creates motivation for a robust, automated system that directs a video camera to follow a presenter as he or she walks anywhere in the front of a lecture hall or large conference room. Such a system is presented. The system consists of a commercial, off-the-shelf pan/tilt/zoom (PTZ) color video camera, a necklace of infrared LEDs and a linear photodiode array detector. Electronic output from the photodiode array is processed to generate the location of the LED necklace, which is worn by a human speaker. The computer controls the video camera movements to record video of the speaker. The speaker's vertical position and depth are assumed to remain relatively constant; the video camera is sent only panning (horizontal) movement commands. The LED necklace is flashed at 70 Hz at a 50% duty cycle to provide noise-filtering capability. The benefit of using a photodiode array versus a standard video camera is its higher frame rate (4 kHz vs. 60 Hz). The higher frame rate allows for the filtering of infrared noise such as sunlight and indoor lighting, a capability absent from other tracking technologies. The system has been tested in a large lecture hall and is shown to be effective.
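
    The noise-filtering benefit of the 70 Hz flashing and the 4 kHz photodiode readout can be illustrated with a lock-in style detector: correlating each array element against a 70 Hz reference suppresses sunlight and room lighting, which vary far more slowly. The sketch below is not the authors' implementation; the array length, window size, and signal levels are assumed.

      import numpy as np

      FS, F_LED, N_ELEM, WINDOW = 4000.0, 70.0, 128, 400   # assumed geometry

      def locate_led(block: np.ndarray) -> int:
          """block: (WINDOW, N_ELEM) raw photodiode samples; returns LED element."""
          t = np.arange(block.shape[0]) / FS
          ref_i, ref_q = np.sin(2 * np.pi * F_LED * t), np.cos(2 * np.pi * F_LED * t)
          # Quadrature projection: 70 Hz content of each element, phase independent.
          return int(np.argmax(np.hypot(block.T @ ref_i, block.T @ ref_q)))

      # Synthetic test: bright DC background plus a 50%-duty LED on element 37.
      rng = np.random.default_rng(0)
      t = np.arange(WINDOW) / FS
      block = 50.0 + rng.normal(0.0, 1.0, (WINDOW, N_ELEM))
      block[:, 37] += 20.0 * (np.sin(2 * np.pi * F_LED * t) > 0)
      print(locate_led(block))   # -> 37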

  5. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingram, S; Rao, A; Wendt, R

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination.
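
    A hedged sketch of the general two-view pipeline described (feature tracking, essential-matrix estimation, relative pose recovery, and triangulation) is shown below using OpenCV; it is not the authors' code, and the intrinsic matrix K is an assumed pinhole calibration.

      import cv2
      import numpy as np

      # Assumed pinhole intrinsics for the endoscope camera (placeholder values).
      K = np.array([[800.0, 0.0, 320.0],
                    [0.0, 800.0, 240.0],
                    [0.0, 0.0, 1.0]])

      def track_and_pose(prev_gray, next_gray):
          # Track surface points from one frame to the next.
          p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
          p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None)
          good0, good1 = p0[status.ravel() == 1], p1[status.ravel() == 1]

          # Two-view epipolar geometry: frame-to-frame rotation and translation
          # (the translation is only recovered up to scale).
          E, inliers = cv2.findEssentialMat(good0, good1, K,
                                            method=cv2.RANSAC, threshold=1.0)
          _, R, t, _ = cv2.recoverPose(E, good0, good1, K, mask=inliers)

          # Triangulate the tracked points to recover the surrounding structure.
          P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
          P1 = K @ np.hstack([R, t])
          X_h = cv2.triangulatePoints(P0, P1,
                                      good0.reshape(-1, 2).T, good1.reshape(-1, 2).T)
          return R, t, (X_h[:3] / X_h[3]).T    # pose and Nx3 structure points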

  6. Projective Structure from Two Uncalibrated Images: Structure from Motion and Recognition

    DTIC Science & Technology

    1992-09-01

    Only fragments of the abstract are recoverable from the two-column extraction: they concern point correspondences between two views, a reference plane fixed with respect to the camera coordinate frame under rigid camera motion, and four object points Pj, j = 1, ..., 4, defining a second reference plane, with citations to Rieger-Lawton 1985, Faugeras and Maybank 1990, Hildreth 1991, and Faugeras, Luong and Maybank 1992.

  7. Ground Testing of Prototype Hardware and Processing Algorithms for a Wide Area Space Surveillance System (WASSS)

    DTIC Science & Technology

    2013-09-01

    Ground testing of prototype hardware and processing algorithms for a Wide Area Space Surveillance System (WASSS), Neil Goldstein, Rainer A... at Magdalena Ridge Observatory using the prototype Wide Area Space Surveillance System (WASSS) camera, which has a 4 x 60 field-of-view, < 0.05... objects with larger-aperture cameras. The sensitivity of the system depends on multi-frame averaging and a Principal Component Analysis based image...

  8. The ET as it falls away from the orbiter after separation on STS-121

    NASA Image and Video Library

    2006-07-04

    S121-E-05006 (4 July 2006) --- This picture of the STS-121 external tank was taken with a digital still camera by an astronaut only seconds after separation from the Space Shuttle Discovery on launch day. Engineers, managers and flight controllers have carefully studied this image and other frames from this series as well as a number of pictures showing the falling ET as photographed from umbilical well cameras.

  9. The ET as it falls away from the orbiter after separation on STS-121

    NASA Image and Video Library

    2006-07-04

    STS121-E-05011 (4 July 2006)-- This picture of the STS-121 external tank was taken with a digital still camera by an astronaut only seconds after separation from the Space Shuttle Discovery on launch day. Engineers, managers and flight controllers have carefully studied this image and other frames from this series as well as a number of pictures showing the falling ET as photographed from umbilical well cameras.

  10. The ET as it falls away from the orbiter after separation on STS-121

    NASA Image and Video Library

    2006-07-04

    STS121-E-05008 (4 July 2006)-- This picture of the STS-121 external tank was taken with a digital still camera by an astronaut only seconds after separation from the Space Shuttle Discovery on launch day. Engineers, managers and flight controllers have carefully studied this image and other frames from this series as well as a number of pictures showing the falling ET as photographed from umbilical well cameras.

  11. Use of Space Shuttle Photography in the Study of Meteorological Phenomena.

    DTIC Science & Technology

    1985-06-01

    ... (Hurricane Kamisy in the Indian Ocean, 7-12 April 1984, and the Mauna Loa volcano smoke plume, Hawaii, 7-12 April 1984) is performed using handheld-camera photographs from the Space Transportation System (STS) 41-C mission (6-13 April 1984). High...

  12. Robust camera calibration for sport videos using court models

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang

    2003-12-01

    We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.
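
    The early stages of the pipeline described above (court-line pixel classification followed by a Hough transform to form line candidates) can be sketched with OpenCV as below; the brightness and Hough thresholds are illustrative assumptions, not the authors' values, and the model-correspondence search and gradient-descent refinement are omitted.

      import cv2
      import numpy as np

      def court_line_candidates(frame_bgr):
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

          # Crude stand-in for the paper's color/texture tests: keep bright pixels
          # that are also brighter than their local surroundings.
          g = gray.astype(np.int16)
          bg = cv2.blur(gray, (15, 15)).astype(np.int16)
          line_mask = ((g > 160) & (g > bg + 20)).astype(np.uint8) * 255

          # Probabilistic Hough transform yields straight-segment candidates that
          # the combinatorial model-matching stage would consume.
          segs = cv2.HoughLinesP(line_mask, rho=1, theta=np.pi / 180, threshold=80,
                                 minLineLength=60, maxLineGap=10)
          return np.empty((0, 4), dtype=np.int32) if segs is None else segs.reshape(-1, 4)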

  13. A 3D camera for improved facial recognition

    NASA Astrophysics Data System (ADS)

    Lewin, Andrew; Orchard, David A.; Scott, Andrew M.; Walton, Nicholas A.; Austin, Jim

    2004-12-01

    We describe a camera capable of recording 3D images of objects. It does this by projecting thousands of spots onto an object and then measuring the range to each spot by determining the parallax from a single frame. A second frame can be captured to record a conventional image, which can then be projected onto the surface mesh to form a rendered skin. The camera is able to locate the images of the spots to a precision of better than one tenth of a pixel, and from this it can determine range to an accuracy of less than 1 mm at 1 meter. The data can be recorded as a set of two images, and is reconstructed by forming a 'wire mesh' of range points and morphing the 2D image over this structure. The camera can be used to record images of faces and reconstruct the shape of the face, which allows viewing of the face from various angles. This allows images to be more critically inspected for the purpose of identifying individuals. Multiple images can be stitched together to create full panoramic images of head-sized objects that can be viewed from any direction. The system is being tested with a graph-matching system capable of fast and accurate shape comparisons for facial recognition. It can also be used with "models" of heads and faces to provide a means of obtaining biometric data.
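
    The quoted figures (0.1-pixel spot localization, better than 1 mm range accuracy at 1 m) are consistent with the standard triangulation relations Z = f B / d and dZ ≈ Z^2 dd / (f B) for plausible optics; the focal length and baseline in the sketch below are assumptions, since the abstract does not give them.

      # Back-of-envelope check with assumed optics (f and B are not from the paper).
      f_px = 1400.0    # focal length in pixels (assumed)
      B_m = 0.10       # spot-projector to camera baseline in metres (assumed)
      Z_m = 1.0        # working distance quoted in the abstract
      dd_px = 0.1      # spot-localization precision quoted in the abstract

      dZ_m = Z_m ** 2 * dd_px / (f_px * B_m)
      print(f"range uncertainty at {Z_m:.0f} m: {dZ_m * 1e3:.2f} mm")   # ~0.7 mm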

  14. Low power multi-camera system and algorithms for automated threat detection

    NASA Astrophysics Data System (ADS)

    Huber, David J.; Khosla, Deepak; Chen, Yang; Van Buer, Darrel J.; Martin, Kevin

    2013-05-01

    A key to any robust automated surveillance system is continuous, wide field-of-view sensor coverage and high-accuracy target detection algorithms. Newer systems typically employ an array of multiple fixed cameras that provide individual data streams, each of which is managed by its own processor. Such an array can continuously capture the entire field of view, but collecting all the data and running the back-end detection algorithm consumes additional power and increases the size, weight, and power (SWaP) of the package. This is often unacceptable, as many potential surveillance applications have strict system SWaP requirements. This paper describes a wide field-of-view video system that employs multiple fixed cameras and exhibits low SWaP without compromising the target detection rate. We cycle through the sensors, fetch a fixed number of frames, and process them through a modified target detection algorithm. During this time, the other sensors remain powered down, which reduces the required hardware and power consumption of the system. We show that the resulting gaps in coverage and irregular frame rate do not affect the detection accuracy of the underlying algorithms. This reduces the power consumption of an N-camera system by up to a factor of approximately N compared to baseline normal operation. This work was applied to Phase 2 of the DARPA Cognitive Technology Threat Warning System (CT2WS) program and used during field testing.
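
    The power-saving architecture described (visit one sensor at a time, fetch a fixed number of frames, run detection, then power the sensor down) amounts to a round-robin duty cycle; the sketch below is a generic illustration with a hypothetical camera interface, not the CT2WS implementation.

      from typing import Protocol, Sequence

      class Camera(Protocol):            # hypothetical sensor interface
          def power_on(self) -> None: ...
          def power_off(self) -> None: ...
          def grab_frame(self): ...

      def round_robin(cameras: Sequence[Camera], frames_per_visit: int, detect) -> None:
          """Only one camera is powered at a time; the rest stay powered down."""
          while True:
              for cam in cameras:
                  cam.power_on()
                  frames = [cam.grab_frame() for _ in range(frames_per_visit)]
                  cam.power_off()
                  detect(frames)         # modified target-detection algorithm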

  15. Multi-mode Observations of Cloud-to-Ground Lightning Strokes

    NASA Astrophysics Data System (ADS)

    Smith, M. W.; Smith, B. J.; Clemenson, M. D.; Zollweg, J. D.

    2015-12-01

    We present hyper-temporal and hyper-spectral data collected using a suite of three Phantom high-speed cameras configured to observe cloud-to-ground lightning strokes. The first camera functioned as a contextual imager to show the location and structure of the strokes. The other two cameras were operated as slit-less spectrometers, with resolutions of 0.2 to 1.0 nm. The imaging camera was operated at a readout rate of 48,000 frames per second and provided an image-based trigger mechanism for the spectrometers. Each spectrometer operated at a readout rate of 400,000 frames per second. The sensors were deployed on the southern edge of Albuquerque, New Mexico, and collected data over a 4-week period during the thunderstorm season in the summer of 2015. Strikes observed by the sensor suite were correlated to specific strikes recorded by the National Lightning Data Network (NLDN) and thereby geo-located. Sensor calibration factors, distance to each strike, and calculated values of atmospheric transmission were used to estimate absolute radiometric intensities for the spectral-temporal data. The data that we present show the intensity and time evolution of broadband and line emission features for both leader and return strokes. We highlight several key features and overall statistics of the observations. A companion poster describes a lightning model that is being developed at Sandia National Laboratories.
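
    The radiometric step described in the final sentences (a calibration factor, an NLDN-derived distance, and an atmospheric transmission estimate combining to give an absolute intensity) reduces to an inverse-square correction; the sketch below uses placeholder numbers only and is not the study's calibration.

      # Hedged sketch: counts -> at-sensor irradiance -> source radiant intensity.
      counts = 1.2e4              # background-subtracted counts in one frame (placeholder)
      counts_per_irrad = 5.0e9    # sensor calibration, counts per W/m^2 (placeholder)
      distance_m = 12.0e3         # range to the stroke from NLDN geolocation (placeholder)
      transmission = 0.7          # calculated atmospheric transmission (placeholder)

      irradiance = counts / counts_per_irrad                             # W/m^2 at the sensor
      radiant_intensity = irradiance * distance_m ** 2 / transmission    # W/sr at the source
      print(f"estimated radiant intensity: {radiant_intensity:.3g} W/sr")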

  16. Dawn Fields of View of Asteroid Vesta

    NASA Image and Video Library

    2007-01-01

    This graphic from NASA's Dawn shows fields of view of Dawn instruments from Survey orbit (red), High Altitude Mapping Orbit (green), and Low Altitude Mapping Orbit (blue) and is part of the Mission Art series from NASA's Dawn mission. http://photojournal.jpl.nasa.gov/catalog/PIA19371

  17. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyper-spectral Full Motion Video (FMV) sensors operating at visible (EO) and infrared (IR) wavelengths. This new class of hyper-spectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time while simultaneously providing an operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g., detection or classification. The second is presentation of the video to an operator, which can offer the best display of the content depending on the task performed, e.g., spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel, or they can utilize each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally sampled spectral bands has scarcely been explored. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.
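
    As a baseline against which the multi-band super-resolution approach can be judged, per-band interpolation of a periodic mosaic is easy to sketch; the 3x3 layout assumed below is illustrative and is not necessarily the filter geometry of the 9-band SWIR camera.

      import numpy as np
      import cv2

      def demosaic_9band(mosaic: np.ndarray) -> np.ndarray:
          """Bilinear per-band demosaicking of an assumed 3x3 periodic 9-band mosaic."""
          h, w = mosaic.shape
          cube = np.empty((h, w, 9), dtype=np.float32)
          for band in range(9):
              r, c = divmod(band, 3)
              sparse = mosaic[r::3, c::3].astype(np.float32)   # samples of this band
              # Upsample back to full resolution: spatial detail is traded for the
              # nine spectral samples available in each 3x3 super-pixel.
              cube[..., band] = cv2.resize(sparse, (w, h), interpolation=cv2.INTER_LINEAR)
          return cube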

  18. Development of low-cost high-performance multispectral camera system at Banpil

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

    Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for Visible to Short-Wave Infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512-pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity of less than 100 electrons, a high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are the features most desirable in military imaging applications, enabling deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor and an innovative integration of advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g., the focal plane array (FPA) and Read-Out Integrated Circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system, and their forecast cost structure, is presented.

  19. Radiometric calibration of wide-field camera system with an application in astronomy

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika

    2017-09-01

    The camera response function (CRF) is widely used to describe the relationship between scene radiance and image brightness. The most common application of the CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, are unfortunately of little use with astronomical image data, mostly due to the nature of such data (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods for use in an astronomical imaging application. Results are experimentally verified on a wide-field camera system using a Digital Single-Lens Reflex (DSLR) camera.
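
    One standard route of the kind surveyed here, Debevec-style CRF estimation from an exposure stack followed by HDR merging, is available directly in OpenCV; the file names and exposure times below are placeholders.

      import cv2
      import numpy as np

      files = ["exp_short.png", "exp_mid.png", "exp_long.png", "exp_longest.png"]
      times = np.array([1 / 30, 1 / 8, 1 / 2, 2.0], dtype=np.float32)   # seconds
      images = [cv2.imread(f) for f in files]           # placeholder exposure stack

      crf = cv2.createCalibrateDebevec().process(images, times)    # 256x1x3 response curve
      hdr = cv2.createMergeDebevec().process(images, times, crf)   # float32 radiance map
      cv2.imwrite("radiance_map.hdr", hdr)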

  20. Solid state television camera (CCD-buried channel)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.
