Science.gov

Sample records for airborne digital camera

  1. A simple method for vignette correction of airborne digital camera data

    SciTech Connect

    Nguyen, A.T.; Stow, D.A.; Hope, A.S.

    1996-11-01

    Airborne digital camera systems have gained popularity in recent years due to their flexibility, high geometric fidelity and spatial resolution, and fast data turn-around time. However, a common problem that plagues these framing systems is vignetting, which causes falloff in image brightness away from the principal point. This paper presents a simple method for vignetting correction that utilizes laboratory images of a uniform illumination source. Multiple lab images are averaged and inverted to create digital correction templates, which are then applied to actual airborne data. The vignette correction was effective in removing the systematic falloff in spectral values. We have shown that vignette correction is a necessary part of the preprocessing of raw digital airborne remote sensing data. The consequences of not correcting for these effects are demonstrated in the context of monitoring salt marsh habitat. 4 refs.
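    The correction described above can be sketched in a few lines. The 3x3 frames and values below are purely illustrative, not from the paper; a real template would be built from full-resolution lab frames.

```python
# Sketch of flat-field (vignette) correction: average lab images of a uniform
# source, normalize to the peak, and invert to get per-pixel gain factors.

def make_correction_template(lab_images):
    """Build a multiplicative correction template from uniform-source frames."""
    n = len(lab_images)
    rows, cols = len(lab_images[0]), len(lab_images[0][0])
    avg = [[sum(img[r][c] for img in lab_images) / n for c in range(cols)]
           for r in range(rows)]
    peak = max(max(row) for row in avg)
    # Inverting the normalized flat field gives the per-pixel gain.
    return [[peak / v for v in row] for row in avg]

def apply_template(image, template):
    """Multiply each pixel by its gain to undo the brightness falloff."""
    return [[p * g for p, g in zip(irow, trow)]
            for irow, trow in zip(image, template)]

# Two lab frames showing falloff toward the edges (illustrative values).
lab = [[[80, 100, 80], [100, 120, 100], [80, 100, 80]],
       [[82, 102, 82], [102, 122, 102], [82, 102, 82]]]
template = make_correction_template(lab)
corrected = apply_template(lab[0], template)  # a flat field should flatten
```

Applying the template to one of the lab frames itself is a quick sanity check: the corrected frame should be nearly uniform.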

  2. Application of multimode airborne digital camera system in Wenchuan earthquake disaster monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Xue; Li, Qingting; Fang, Junyong; Tong, Qingxi; Zheng, Lanfen

    2009-06-01

    Remote sensing, especially airborne remote sensing, can be an invaluable technique for quick response to natural disasters. Timely acquired airborne images can provide very important information for headquarters and decision makers to assess the disaster situation and make effective relief arrangements. The image acquisition and processing of the Multi-mode Airborne Digital Camera System (MADC) and its application in Wenchuan earthquake disaster monitoring are presented in this paper. The MADC system is a novel airborne digital camera developed by the Institute of Remote Sensing Applications, Chinese Academy of Sciences. This camera system can acquire high-quality images in three modes, namely wide-field, multi-spectral (hyper-spectral), and stereo configuration. The basic components and technical parameters of MADC are also presented. The MADC system played a very important role in the disaster monitoring of the Wenchuan earthquake; in particular, the map of dammed lakes in the Jianjiang River area was produced and provided to the front-line headquarters. Analytical methods and information extraction techniques for MADC imagery are introduced, and some typical analytical and imaging results are also given. Suggestions for the design and configuration of airborne sensors are discussed at the end of this paper.

  3. Georeferencing airborne images from a multiple digital camera system by GPS/INS

    NASA Astrophysics Data System (ADS)

    Mostafa, Mohamed Mohamed Rashad

    2000-10-01

    In this thesis, the development and testing of an airborne, fully digital, multi-sensor system for kinematic mapping is presented. The system acquires two streams of data, namely navigation data and imaging data. The navigation data are obtained by integrating an accurate strapdown inertial navigation system with two GPS receivers. The imaging data are acquired by two digital cameras, configured so as to reduce their geometric limitations. The two digital cameras capture strips of overlapping nadir and oblique images. The INS/GPS-derived trajectory contains the full translational and rotational motion of the carrier aircraft; thus, image exterior orientation information is extracted from the trajectory during postprocessing. This approach eliminates the need for ground control when computing 3D positions of objects that appear in the field of view of the system's imaging component. Test flights were conducted over the campus of The University of Calgary. Two approaches for calibrating the system are presented, namely pre-mission calibration and in-flight calibration. Testing the system in flight showed that the best ground-point positioning accuracy at 1:12000 average image scale is 0.2 m (RMS) in easting and northing and 0.3 m (RMS) in height. Preliminary results indicate that major applications of such a system in the future are in the field of digital mapping, at scales of 1:10000 and smaller, and the generation of digital elevation models for engineering applications.
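    The trajectory-to-exterior-orientation step can be illustrated with a minimal sketch: linearly interpolating the INS/GPS trajectory at the camera exposure time. The timestamps and coordinates below are invented for illustration; a real system also interpolates attitude and applies boresight and lever-arm corrections.

```python
# Sketch: exterior orientation position from an INS/GPS trajectory by linear
# interpolation between the two trajectory samples bracketing the exposure.

def interpolate(t, t0, v0, t1, v1):
    """Linearly interpolate a tuple-valued trajectory sample at time t."""
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(v0, v1))

# Trajectory samples (easting, northing, height) bracketing an exposure
# at t = 10.04 s; all numbers are illustrative.
pos = interpolate(10.04,
                  10.00, (500.0, 1000.0, 1200.0),
                  10.08, (504.0, 1000.8, 1200.4))
```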

  4. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data-reduction feasibility. The acquisition and data processing requirements of the system are discussed, and suggestions for future improvements are noted. The data reduction mathematics are outlined, and results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  5. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S.

    1995-12-31

    For many years, 16 mm film cameras have been used in severe environments, including Hy-G automotive sleds, airborne gun cameras, range tracking, and other hazardous settings. The companies and government agencies using these cameras need to replace them with a more cost-effective solution. Film-based cameras still offer the best resolving capability; however, film development time, chemical disposal, recurring media costs, and faster digital analysis are factors driving the desire for a 16 mm film camera replacement. This paper describes a new camera from Kodak that has been designed to replace 16 mm high-speed film cameras.

  6. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  7. An airborne four-camera imaging system for agricultural applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  8. Replacing 16-mm film cameras with high-definition digital cameras

    NASA Astrophysics Data System (ADS)

    Balch, Kris S.

    1995-09-01

    For many years, 16 mm film cameras have been used in severe environments, including Hy-G automotive sleds, airborne gun cameras, range tracking, and other hazardous settings. The companies and government agencies using these cameras need to replace them with a more cost-effective solution. Film-based cameras still offer the best resolving capability; however, film development time, chemical disposal, recurring media costs, and faster digital analysis are factors driving the desire for a 16 mm film camera replacement. This paper describes a new camera from Kodak that has been designed to replace 16 mm high-speed film cameras.

  9. Digital Cameras for Student Use.

    ERIC Educational Resources Information Center

    Simpson, Carol

    1997-01-01

    Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

  10. Cameras for digital microscopy.

    PubMed

    Spring, Kenneth R

    2013-01-01

    This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in detector technology. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements), which simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulty of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507
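    The three stages above determine the noise behaviour of a CCD measurement. As a rough sketch, with assumed, illustrative numbers for quantum efficiency and read noise (not values from the chapter), the signal-to-noise ratio combines photon shot noise with readout noise:

```python
import math

# Sketch of CCD pixel SNR: photons are converted to stored electrons
# (capture + storage stages), then readout adds its own noise floor.

def ccd_snr(photons, quantum_efficiency, read_noise_e):
    electrons = photons * quantum_efficiency        # captured and stored charge
    shot_noise = math.sqrt(electrons)               # Poisson photon statistics
    total_noise = math.sqrt(shot_noise**2 + read_noise_e**2)  # readout stage
    return electrons / total_noise

# Bright-field vs. dim fluorescence, roughly four orders of magnitude apart
# in photon count (illustrative numbers).
bright = ccd_snr(100_000, 0.6, 10.0)
dim = ccd_snr(10, 0.6, 10.0)
```

At low light the read noise dominates, which is why demanding fluorescence work pushes detector sensitivity so hard.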

  11. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
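    Although the abstract is truncated, the core relation for the parallel-plane case can be sketched from similar triangles in the pinhole model. The focal length and sizes below are illustrative, and this omits the tilted-plane case the paper also handles.

```python
# Sketch: distance to an object of known height from its size on the sensor,
# assuming the object plane is parallel to the image plane (pinhole model).

def object_distance_m(focal_length_mm, object_height_m, image_height_mm):
    """Similar triangles: D / H = f / h, so D = f * H / h."""
    return focal_length_mm * object_height_m / image_height_mm

# A 1.8 m tall object imaged 2.0 mm high through a 50 mm lens (illustrative):
d = object_distance_m(50.0, 1.8, 2.0)
```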

  12. MAPPING GRAIN SORGHUM YIELD VARIABILITY USING AIRBORNE DIGITAL VIDEOGRAPHY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Mapping crop yield variability is one important aspect of precision agriculture. This study was designed to assess airborne digital videography as a tool for mapping grain sorghum yields for precision farming. Color-infrared (CIR) imagery was acquired with a three-camera digital video imaging sys...

  13. A high-resolution airborne four-camera imaging system for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and testing of an airborne multispectral digital imaging system for remote sensing applications. The system consists of four high resolution charge coupled device (CCD) digital cameras and a ruggedized PC equipped with a frame grabber and image acquisition software. T...

  14. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded enormously in the last ten years. The zoom lens, in particular, is the key factor determining camera body size and image quality. Its technologies have built on several analog technological advances, including methods of aspherical lens manufacturing and mechanisms for image stabilization; Panasonic is one of the pioneers of both. I will review previous trends in zoom lens optics as well as original optical technologies of Panasonic's "LUMIX" digital cameras, along with the optics of 3D camera systems. Finally, I will speculate on future trends in digital cameras.

  15. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  16. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely-piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000-acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management, but few have examined the usefulness of high-resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 x 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).
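    The reported 10 cm pixels at 1,000 feet are consistent with a simple ground-sample-distance calculation. The pixel pitch and focal length below are assumptions chosen for illustration, not the actual specifications of the camera used.

```python
# Sketch: ground sample distance (pixel footprint on the ground) from flight
# altitude, sensor pixel pitch, and lens focal length.

def ground_sample_distance_cm(altitude_m, pixel_pitch_um, focal_length_mm):
    """GSD = altitude * pixel_pitch / focal_length, converted to centimetres."""
    return altitude_m * pixel_pitch_um / (focal_length_mm * 10.0)

# 1,000 ft is about 305 m; an assumed 9 um pitch behind an assumed 28 mm lens
# gives roughly the 10 cm pixels reported above.
gsd = ground_sample_distance_cm(305.0, 9.0, 28.0)
```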

  17. Television camera on RMS surveys insulation on Airborne Support Equipment

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The television camera on the end effector of the Canadian-built Remote Manipulator System (RMS) is seen surveying some of the insulation on the Airborne Support Equipment (ASE). Flight controllers called for the survey following the departure of the Advanced Communications Technology Satellite (ACTS) and its Transfer Orbit Stage (TOS).

  18. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  19. Tips and Tricks for Digital Camera Users.

    ERIC Educational Resources Information Center

    Ekhaml, Leticia

    2002-01-01

    Discusses the use of digital cameras in school library media centers and offers suggestions for teachers and students in elementary schools. Describes appropriate image-editing software; explains how to create panoramas, screen savers, and coloring books; and includes useful tips for digital photographers. (LRW)

  20. Camera! Action! Collaborate with Digital Moviemaking

    ERIC Educational Resources Information Center

    Swan, Kathleen Owings; Hofer, Mark; Levstik, Linda S.

    2007-01-01

    Broadly defined, digital moviemaking integrates a variety of media (images, sound, text, video, narration) to communicate with an audience. There is near-ubiquitous access to the necessary software (MovieMaker and iMovie are bundled free with their respective operating systems) and hardware (computers with Internet access, digital cameras, etc.).…

  1. A stereoscopic lens for digital cinema cameras

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny; Rupkalvis, John

    2015-03-01

    Live-action stereoscopic feature films are, for the most part, produced using a costly post-production process to convert planar cinematography into stereo-pair images and are only occasionally shot stereoscopically using bulky dual-cameras that are adaptations of the Ramsdell rig. The stereoscopic lens design described here might very well encourage more live-action image capture because it uses standard digital cinema cameras and workflow to save time and money.

  2. Digital Camera Project Fosters Communication Skills

    ERIC Educational Resources Information Center

    Fisher, Ashley; Lazaros, Edward J.

    2009-01-01

    This article details the many benefits of educators' use of digital camera technology and provides an activity in which students practice taking portrait shots of classmates, manipulate the resulting images, and add language arts practice by interviewing their subjects to produce a photo-illustrated Word document. This activity gives…

  3. National Guidelines for Digital Camera Systems Certification

    NASA Astrophysics Data System (ADS)

    Yaron, Yaron; Keinan, Eran; Benhamu, Moshe; Regev, Ronen; Zalmanzon, Garry

    2016-06-01

    Digital camera systems are a key component in the production of reliable, geometrically accurate, high-resolution geospatial products. These systems have replaced film imaging in photogrammetric data capturing. Today, we see a proliferation of imaging sensors collecting photographs in different ground resolutions, spectral bands, swath sizes, radiometric characteristics, and accuracies, carried on different mobile platforms. In addition, these imaging sensors are combined with navigational tools (such as GPS and IMU), active sensors such as laser scanning, and powerful processing tools to obtain high-quality geospatial products. The quality (accuracy, completeness, consistency, etc.) of these geospatial products depends on the use of calibrated, high-quality digital camera systems. The new survey regulations of the state of Israel specify the quality requirements for each geospatial product, including maps at different scales and for different purposes, elevation models, orthophotographs, three-dimensional models at different levels of detail (LOD), and more. In addition, the regulations require that digital camera systems used for mapping purposes be certified using a rigorous mapping-systems certification and validation process specified in the Director General Instructions. The Director General Instructions for digital camera systems certification specify a two-step process: 1. Theoretical analysis of the system components, including a study of the accuracy of each component and an integrative error-propagation evaluation, examination of the radiometric and spectral response curves of the imaging sensors, the calibration requirements, and the working procedures. 2. Empirical study of the digital mapping system, which examines a typical project (product scale, flight height, number and configuration of ground control points, and process). The study examines all aspects of the final product, including its accuracy and the product pixel size.

  4. Enviro-pix: Using Digital Cameras in the Classroom.

    ERIC Educational Resources Information Center

    Clements, Dan

    1997-01-01

    Provides technical information about digital photography including cost, quality, and applications in classrooms. Recommends that digital cameras be used for student classroom presentations and record keeping. (DDR)

  5. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

    The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation on either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board-mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.
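    The spectral-reflectance comparisons mentioned above often reduce to band ratios. Here is a minimal sketch, with invented reflectance values, of the normalized-difference index widely used to separate vegetation from soil in multispectral images:

```python
# Sketch: a normalized-difference band ratio (the NDVI form) computed from
# near-infrared and red reflectances. The values are illustrative.

def normalized_difference(nir, red):
    """(NIR - red) / (NIR + red); high for healthy vegetation."""
    return (nir - red) / (nir + red)

healthy_leaf = normalized_difference(nir=0.50, red=0.08)  # strong NIR, low red
dry_soil = normalized_difference(nir=0.30, red=0.25)      # similar in both bands
```

The same ratio can be formed from a visible image and a near-IR image taken through the filters described above.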

  6. Digital Earth Watch: Investigating the World with Digital Cameras

    NASA Astrophysics Data System (ADS)

    Gould, A. D.; Schloss, A. L.; Beaudry, J.; Pickle, J.

    2015-12-01

    Every digital camera, including the smartphone camera, can be a scientific tool. Pictures contain millions of color-intensity measurements organized spatially, allowing us to measure properties of objects in the images. This presentation will demonstrate how digital pictures can be used for a variety of studies, with a special emphasis on using repeat digital photographs to study change over time in outdoor settings with a Picture Post. Demonstrations will include using inexpensive color filters to take pictures that enhance features in images, such as unhealthy leaves on plants or clouds in the sky. Software available at no cost from the Digital Earth Watch (DEW) website, which lets students explore light, color, and pixels, manipulate color in images, and make measurements, will be demonstrated. DEW and Picture Post were developed with support from NASA. Please visit our websites: DEW: http://dew.globalsystemsscience.org and Picture Post: http://picturepost.unh.edu

  7. A digital ISO expansion technique for digital cameras

    NASA Astrophysics Data System (ADS)

    Yoo, Youngjin; Lee, Kangeui; Choe, Wonhee; Park, SungChan; Lee, Seong-Deok; Kim, Chang-Yong

    2010-01-01

    Market demand for digital cameras with higher sensitivity under low-light conditions is increasing remarkably, and the digital camera market is now a tough race to provide higher ISO capability. In this paper, we explore an approach for increasing the maximum ISO capability of digital cameras without changing any structure of the image sensor or CFA. Our method is applied directly to the raw Bayer-pattern CFA image to avoid the non-linearity and noise amplification that usually deteriorate the image after the ISP (image signal processor) of digital cameras. The proposed method fuses multiple short-exposure images, which are noisy but less blurred, and is designed to avoid the ghost artifacts caused by hand shake and object motion. To achieve the desired ISO image quality, both the low-frequency chromatic noise and the fine-grain noise that usually appear in high-ISO images are removed, and we then modify the different layers created by a two-scale non-linear decomposition of the image. Once our approach is performed on an input Bayer-pattern CFA image, the resultant Bayer image is further processed by the ISP to obtain a fully processed RGB image. The performance of our proposed approach is evaluated by comparing SNR (signal-to-noise ratio), MTF50 (modulation transfer function), color error ΔE*ab, and visual quality with reference images whose exposure times are properly extended to a variety of target sensitivities.
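    The core fusion idea can be sketched very simply: averaging N short, noisy exposures reduces zero-mean noise by roughly sqrt(N) while avoiding the blur of one long exposure. The numbers below are illustrative, and the paper's ghost rejection, chroma denoising, and two-scale decomposition are omitted.

```python
# Sketch: fusing several short exposures of one pixel. Each frame carries the
# true value plus zero-mean noise; the average cancels most of the noise.

true_signal = 100.0
noise = [9.0, -7.0, 4.0, -5.0, 6.0, -8.0, 3.0, -2.0]  # illustrative, zero-mean
frames = [true_signal + n for n in noise]

fused = sum(frames) / len(frames)

single_error = abs(frames[0] - true_signal)  # error of one noisy frame
fused_error = abs(fused - true_signal)       # error after fusion
```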

  8. X-ray imaging using digital cameras

    NASA Astrophysics Data System (ADS)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.

  9. Digital Camera Control for Faster Inspection

    NASA Technical Reports Server (NTRS)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low-resolution images to quickly spot problem areas and can then cause a rapid transition to high-resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  10. Process simulation in digital camera system

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift-invariant, and axial, with light propagation orthogonal to the system. We use a spectral image-processing algorithm to simulate the radiometric properties of a digital camera, taking into consideration the transmittances of the light source, lenses, and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the point spread functions of the optical components; we use a Cooke triplet, the aperture, the light falloff, and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion, and JPG compression. We reconstruct the noisy, blurred image by blending differently exposed images to reduce the photon shot noise, and we also filter the fixed-pattern noise and sharpen the image. Then come the color-processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
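    Two of the electrical-part blocks, Bayer sampling and interpolation, can be sketched on a toy green channel. An RGGB layout is assumed and the values are illustrative; the noise, color, and compression stages described above are omitted.

```python
# Sketch: Bayer-sample the green channel of a scene (RGGB: green sits where
# row + col is odd), then fill the missing sites by bilinear interpolation.

def sample_green(green, rows, cols):
    """Keep green values only at Bayer green sites; None elsewhere."""
    return [[green[r][c] if (r + c) % 2 == 1 else None
             for c in range(cols)] for r in range(rows)]

def interpolate_green(mosaic):
    """Replace each missing site with the mean of its 4-neighbours."""
    rows, cols = len(mosaic), len(mosaic[0])
    out = [row[:] for row in mosaic]
    for r in range(rows):
        for c in range(cols):
            if out[r][c] is None:
                nbrs = [mosaic[rr][cc]
                        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                        if 0 <= rr < rows and 0 <= cc < cols
                        and mosaic[rr][cc] is not None]
                out[r][c] = sum(nbrs) / len(nbrs)
    return out

green = [[10, 20, 10, 20],
         [20, 10, 20, 10],
         [10, 20, 10, 20],
         [20, 10, 20, 10]]
mosaic = sample_green(green, 4, 4)
full = interpolate_green(mosaic)
```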

  11. Data fusion techniques for object space classification using airborne laser data and airborne digital photographs

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong

    The objective of this research is to investigate possible strategies for the fusion of airborne laser data with passive optical data for object space classification. A significant contribution of our work is the development and implementation of a data-level fusion technique, direct digital image georeferencing (DDIG). In DDIG, we use navigation data from an integrated system (composed of a global positioning system (GPS) and an inertial measurement unit (IMU)) to project three-dimensional data points measured with the University of Florida's airborne laser swath mapping (ALSM) system onto digital aerial photographs. As the underlying math model, we use the familiar collinearity condition equations. After matching the ALSM object-space points to their corresponding image-space pixels, we resample the digital photographs using cubic convolution. We call the resulting images pseudo-ortho-rectified images (PORI) because they are orthographic at the ground surface but still exhibit some relief displacement for elevated objects, and because they have been resampled using an interpolation technique. Our accuracy tests on these PORI images show that they are planimetrically correct to about 0.4 meters. This accuracy is sufficient to remove most of the effects of the central perspective projection and enable a meaningful fusion of the RGB data with the height and intensity data produced by the laser. PORI images may also be sufficiently accurate for many other mapping applications, and may in some applications be an attractive alternative to traditional photogrammetric techniques. A second contribution of our research is the development of several strategies for the fusion of data from airborne laser and camera systems. We have conducted our work within the sensor fusion paradigm formalized in the optical engineering community. Our work explores the fusion of these two types of data for precision mapping applications. 
Specifically, we combine three different types of
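    The DDIG projection step rests on the collinearity condition equations. Here is a minimal sketch, with an invented nadir geometry rather than the ALSM system's actual parameters:

```python
import math

# Sketch: project an object-space point into image coordinates with the
# collinearity condition equations, given the GPS/IMU-derived camera
# position and omega-phi-kappa attitude.

def rotation_matrix(omega, phi, kappa):
    """Standard object-to-image rotation matrix for omega-phi-kappa angles."""
    co, so = math.cos(omega), math.sin(omega)
    cp, sp = math.cos(phi), math.sin(phi)
    ck, sk = math.cos(kappa), math.sin(kappa)
    return [
        [cp * ck, co * sk + so * sp * ck, so * sk - co * sp * ck],
        [-cp * sk, co * ck - so * sp * sk, so * ck + co * sp * sk],
        [sp, -so * cp, co * cp],
    ]

def collinearity(point, camera, angles, focal_mm):
    """Image coordinates (mm) of an object-space point."""
    m = rotation_matrix(*angles)
    dx = [p - c for p, c in zip(point, camera)]
    u = [sum(m[i][j] * dx[j] for j in range(3)) for i in range(3)]
    x = -focal_mm * u[0] / u[2]
    y = -focal_mm * u[1] / u[2]
    return x, y

# A nadir-pointing camera 1000 m above ground, imaging a point 50 m east of
# the ground nadir through an assumed 150 mm lens:
x, y = collinearity((50.0, 0.0, 0.0), (0.0, 0.0, 1000.0), (0.0, 0.0, 0.0), 150.0)
```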

  12. It's a Snap! Selecting the Right Digital Camera

    ERIC Educational Resources Information Center

    Browne, Ron

    2005-01-01

    Digital cameras can be wonderful teaching/learning tools in the preschool classroom. They can record and document student development, make text-free cues for pre-reading children, and develop learning prompts for discussion. In this article, the author discusses tips on selecting the right digital camera. Above all, it is important to consider…

  13. Comparison of Digital Surface Models for Snow Depth Mapping with Uav and Aerial Cameras

    NASA Astrophysics Data System (ADS)

    Boesch, R.; Bühler, Y.; Marty, M.; Ginzler, C.

    2016-06-01

    Photogrammetric workflows for aerial images have improved over the last years in a typically black-box fashion. Most parameters for building a dense point cloud are either excessive in number or left unexplained, and progress between software releases is often poorly documented. On the other hand, the development of better camera sensors and of the positional accuracy of image acquisition is evident when comparing product specifications. This study shows that hardware evolution over the last years has had a much stronger impact on height measurements than photogrammetric software releases. Snow height measurements with airborne sensors such as the ADS100 and UAV-based DSLR cameras can achieve accuracies close to 2 × GSD in comparison with ground-based GNSS reference measurements. Using a custom notch filter on the UAV camera sensor during image acquisition does not yield better height accuracies. UAV-based digital surface models are very robust; different workflow parameter variations for the ADS100 and UAV camera workflows seem to have only random effects.

  14. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  15. Methods for identification of images acquired with digital cameras

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Bijhold, Jurrien; Kieft, Martijn; Kurosawa, Kenji; Kuroki, Kenro; Saitoh, Naoki

    2001-02-01

    From the court we were asked whether it is possible to determine if an image has been made with a specific digital camera. This question has to be answered in child pornography cases, where evidence is needed that a certain picture has been made with a specific camera. We have looked into different methods of examining the cameras to determine if a specific image has been made with a camera: defects in CCDs, file formats that are used, noise introduced by the pixel arrays and watermarking in images used by the camera manufacturer.

  16. Digital dental photography. Part 4: choosing a camera.

    PubMed

    Ahmad, I

    2009-06-13

    With so many cameras and systems on the market, choosing the right one for your practice needs is a daunting task. As described in Part 1 of this series, a digital single-lens reflex (DSLR) camera is an ideal choice for dental use, enabling the taking of portraits, close-up or macro images of the dentition, and study casts. However, for the sake of completeness, some other camera systems used in dentistry are also discussed. PMID:19521372

  17. A Simple Spectrophotometer Using Common Materials and a Digital Camera

    ERIC Educational Resources Information Center

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-01-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…

  18. Characterizing Digital Camera Systems: A Prelude to Data Standards

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    2002-01-01

    This viewgraph presentation profiles: 1) Digital imaging systems; 2) Specifying a digital imagery product; and 3) Characterization of data acquisition systems. Advanced large array digital imaging systems are routinely being used. Digital imagery guidelines are being developed by ASPRS and ISPRS. Guidelines and standards are of little use without standardized characterization methods. Characterization of digital camera systems is important for supporting digital imagery guidelines. Specifications are characterized in the lab and/or the field. Laboratory characterization is critical for optimizing and defining performance. In-flight characterization is necessary for an end-to-end system test.

  19. Choosing the Best Digital Camera for Your Program

    ERIC Educational Resources Information Center

    Mikat, Richard P.; Anderson, Mandi

    2005-01-01

    Many educators in physical education, recreation, dance, and related fields have begun using digital images to enhance their teaching (e.g., Ryan, Marzilla, & Martindale, 2001). Many other educators would like to begin using this technology, but find the task of choosing an appropriate digital camera to be overwhelming. This article is designed to…

  20. PENTAX 645D Medium-Format Digital SLR Camera

    NASA Astrophysics Data System (ADS)

    Maekawa, Yasuyuki

    The PENTAX 645D has been developed with the aim of offering a relatively affordable, yet highly operable and durable, digital SLR camera equipped with a larger sensor to non-professional but highly enthusiastic photographers, whereas preceding cameras of a similar format had been the tools of professional photographers, both in terms of their extraordinary pricing and the extent of knowledge required to handle them. The focus of this article is to detail the unique features of this medium-format digital SLR camera, a product combining the latest functionality and well-thought-out user interface developed in the engineering of standard APS-C digital SLRs with the extensive knowledge of larger-format cameras accumulated through the development of the film medium-format SLR.

  1. Improvement on the polynomial modeling of digital camera colorimetric characterization

    NASA Astrophysics Data System (ADS)

    Huang, Xiaoqiao; Yu, Hongfei; Shi, Junsheng; Tai, Yonghang

    2014-11-01

    The digital camera has become a requisite of everyday life and is essential in imaging applications, so it is important to obtain accurate colors with a digital camera. The colorimetric characterization of a digital camera is the basis of image reproduction and of the color management process. One of the traditional methods for deriving a colorimetric mapping between camera RGB signals and the CIEXYZ tristimulus values is polynomial modeling with a 3×11 polynomial transfer matrix. In this paper, an improved polynomial modeling is presented, in which normalized luminance replaces the camera's inherent RGB values in the traditional polynomial modeling. The improved modeling can be described as a two-stage model. In the first stage, the relationship between camera RGB values and normalized luminance, described as "gamma", was established from six gray patches of the X-Rite ColorChecker 24-color card, and camera RGB values were converted into normalized luminance using this gamma. In the second stage, the traditional polynomial modeling was adapted into a colorimetric mapping between normalized luminance and CIEXYZ. Moreover, the method was applied under a daylight lighting environment, so that users who cannot measure the CIEXYZ values of the color target chart with professional instruments can still accomplish the colorimetric characterization of a digital camera. The experimental results show that: (1) the proposed method for the colorimetric characterization of digital cameras performs better than traditional polynomial modeling; and (2) it is a feasible approach to handling the color characteristics under a daylight environment without professional instruments, with results that satisfy the requirements of simple applications.
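    The traditional 3×11 polynomial mapping that the paper improves upon can be sketched as a least-squares fit (a hedged illustration: the exact choice of the 11 polynomial terms varies between implementations, and this particular set is our assumption):

```python
import numpy as np

def poly_terms(rgb):
    """Expand an (N, 3) array of camera RGB values into 11 polynomial terms,
    one common choice for 3x11 colorimetric characterization."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.stack([r, g, b, r*g, r*b, g*b, r*r, g*g, b*b, r*g*b,
                     np.ones_like(r)], axis=1)

def fit_characterization(rgb, xyz):
    """Least-squares fit of the 3x11 transfer matrix M mapping terms to CIEXYZ."""
    A = poly_terms(rgb)                              # (N, 11)
    M, *_ = np.linalg.lstsq(A, xyz, rcond=None)      # (11, 3)
    return M.T                                       # (3, 11)

def apply_characterization(M, rgb):
    """Predict CIEXYZ for camera RGB values using a fitted matrix M."""
    return poly_terms(rgb) @ M.T
```

    The paper's variant would replace the raw RGB inputs here with gamma-derived normalized luminance before fitting.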

  2. A simple spectrophotometer using common materials and a digital camera

    NASA Astrophysics Data System (ADS)

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-05-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a theoretical accuracy as high as 0.2 nm. Using this spectrophotometer, wavelengths are determined via image processing.
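    As an illustration of the calibration step such an instrument requires, a pixel-to-wavelength mapping can be fitted from a few known reference lines (a simplified sketch assuming locally linear dispersion, which is only an approximation for a DVD grating; the function names are ours):

```python
import numpy as np

def calibrate(pixels, wavelengths):
    """Fit a linear pixel-to-wavelength mapping from known reference lines."""
    slope, intercept = np.polyfit(pixels, wavelengths, 1)
    return slope, intercept

def pixel_to_wavelength(p, slope, intercept):
    """Convert a pixel position along the spectrum to a wavelength in nm."""
    return slope * p + intercept
```

    After calibration against a reference lamp, reading off the pixel position of an unknown spectral feature yields its wavelength directly.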

  3. Automated 3D measurement with the DCS200 digital camera

    NASA Astrophysics Data System (ADS)

    Van den Heuvel, Frank A.

    1994-03-01

    A digital photogrammetric system for automated 3D coordinate measurement in a production environment has been developed. For the image acquisition the Kodak DCS200 digital camera is used. This camera is based on a standard 35-mm camera. The results of the radiometric and geometric calibration of the DCS200 camera show the potential of this camera for photogrammetric applications. The software part of the system performs the detection, identification, and measurement of artificial targets present in digital images. These artificial targets are designed for automatic detection in images of a complex scene. For the identification of the targets, a circular bar code is read by the image processing software. The least-squares template matching method is implemented for the target image measurement. A precision better than 2% of a pixel was obtained for the target location. The 3D coordinate computation is performed by Geodelta's bundle adjustment package BINAER. It includes extensive statistical testing to assess the accuracy of the results. Tests with the DCS200 camera show a repeatability of 18 µm standard deviation on a 60 × 50 × 30 cm test field. The achieved relative precision is on the order of 2 · 10⁻⁵.

  4. Review of up-to date digital cameras interfaces

    NASA Astrophysics Data System (ADS)

    Linkemann, Joachim

    2013-04-01

    Over the past 15 years, various interfaces on digital industrial cameras have been available on the market. This tutorial will give an overview of interfaces such as LVDS (RS644), Channel Link and Camera Link. In addition, other interfaces such as FireWire, Gigabit Ethernet, and now USB 3.0 have become more popular. Owing to their ease of use, these interfaces cover most of the market. Nevertheless, for certain applications and especially for higher bandwidths, Camera Link and CoaXPress are very useful. This tutorial will give a description of the advantages and disadvantages, comment on bandwidths, and provide recommendations on when to use which interface.

  5. Compression of CCD raw images for digital still cameras

    NASA Astrophysics Data System (ADS)

    Sriram, Parthasarathy; Sudharsanan, Subramania

    2005-03-01

    Lossless compression of raw CCD images captured using color filter arrays has several benefits. The benefits include improved storage capacity, reduced memory bandwidth, and lower power consumption for digital still camera processors. The paper discusses the benefits in detail and proposes the use of a computationally efficient block adaptive scheme for lossless compression. Experimental results are provided that indicate that the scheme performs well for CCD raw images attaining compression factors of more than two. The block adaptive method also compares favorably with JPEG-LS. A discussion is provided indicating how the proposed lossless coding scheme can be incorporated into digital still camera processors enabling lower memory bandwidth and storage requirements.

  6. Digital image georeferencing from a multiple camera system by GPS/INS

    NASA Astrophysics Data System (ADS)

    Mostafa, Mohamed M. R.; Schwarz, Klaus-Peter

    In this paper, the development and testing of an airborne fully digital multi-sensor system for digital mapping data acquisition is presented. The system acquires two streams of data, namely, navigation (georeferencing) data and imaging data. The navigation data are obtained by integrating an accurate strapdown inertial navigation system with a differential GPS system (DGPS). The imaging data are acquired by two low-cost digital cameras, configured in such a way so as to reduce their geometric limitations. The two cameras capture strips of overlapping nadir and oblique images. The GPS/INS-derived trajectory contains the full translational and rotational motion of the carrier aircraft. Thus, image exterior orientation information is extracted from the trajectory, during post-processing. This approach eliminates the need for ground control (GCP) when computing 3D positions of objects that appear in the field of view of the system imaging component. Two approaches for calibrating the system are presented, namely, terrestrial calibration and in-flight calibration. Test flights were conducted over the campus of The University of Calgary. Testing the system showed that best ground point positioning accuracy at 1:12,000 average image scale is 0.2 m (RMS) in easting and northing and 0.3 m (RMS) in height. Preliminary results indicate that major applications of such a system in the future are in the field of digital mapping, at scales of 1:5000 and smaller, and in the generation of digital elevation models for engineering applications.

  7. Using a Digital Video Camera to Study Motion

    ERIC Educational Resources Information Center

    Abisdris, Gil; Phaneuf, Alain

    2007-01-01

    To illustrate how a digital video camera can be used to analyze various types of motion, this simple activity analyzes the motion and measures the acceleration due to gravity of a basketball in free fall. Although many excellent commercially available data loggers and software can accomplish this task, this activity requires almost no financial…

  8. Bringing the Digital Camera to the Physics Lab

    ERIC Educational Resources Information Center

    Rossi, M.; Gratton, L. M.; Oss, S.

    2013-01-01

    We discuss how compressed images created by modern digital cameras can lead to even severe problems in the quantitative analysis of experiments based on such images. Difficulties result from the nonlinear treatment of lighting intensity values stored in compressed files. To overcome such troubles, one has to adopt noncompressed, native formats, as…

  9. Utilization of consumer level digital cameras in astronomy

    NASA Astrophysics Data System (ADS)

    Páta, Petr; Fliegel, Karel; Klíma, Miloš; Blažek, Martin; Řeřábek, Martin

    2010-08-01

    This paper presents a study of the possible utilization of digital single-lens reflex (DSLR) cameras in astronomy. The DSLRs have a great advantage over professional equipment in better cost efficiency with comparable usability for selected purposes. The quality of the electro-optical system in the DSLR camera determines the area where it can be used with acceptable precision. First, a set of important camera parameters for astronomical utilization is introduced in the paper. The color filter array (CFA) structure, demosaicing algorithm, image sensor spectral properties, noise and transfer characteristics are among the most important of these parameters and are further analyzed in the paper. Compression of astronomical images using the KLT approach is also described. The potential impact of these parameters on position and photometric measurements is presented based on the analysis and measurements with a wide-angle lens. The prospective utilization of a consumer DSLR camera as a substitute for expensive devices is discussed.

  10. Airborne Digital Sensor System and GPS-aided inertial technology for direct geopositioning in rough terrain

    USGS Publications Warehouse

    Sanchez, Richard D.

    2004-01-01

    High-resolution airborne digital cameras with onboard data collection based on the Global Positioning System (GPS) and inertial navigation systems (INS) technology may offer a real-time means to gather accurate topographic map information by reducing ground control and eliminating aerial triangulation. Past evaluations of this integrated system over relatively flat terrain have proven successful. The author uses Emerge Digital Sensor System (DSS) combined with Applanix Corporation's Position and Orientation Solutions for Direct Georeferencing to examine the positional mapping accuracy in rough terrain. The positional accuracy documented in this study did not meet large-scale mapping requirements owing to an apparent system mechanical failure. Nonetheless, the findings yield important information on a new approach for mapping in Antarctica and other remote or inaccessible areas of the world.

  11. Use of the Digital Camera To Increase Student Interest and Learning in High School Biology.

    ERIC Educational Resources Information Center

    Tatar, Denise; Robinson, Mike

    2003-01-01

    Attempts to answer two research questions: (1) Does the use of a digital camera in laboratory activities increase student learning?; and (2) Does the use of digital cameras motivate students to take a greater interest in laboratory work? Results indicate that the digital camera did increase student learning of process skills in two biology…

  12. Airborne particle monitoring with urban closed-circuit television camera networks and a chromatic technique

    NASA Astrophysics Data System (ADS)

    Kolupula, Y. R.; Aceves-Fernandez, M. A.; Jones, G. R.; Deakin, A. G.; Spencer, J. W.

    2010-11-01

    An economic approach for the preliminary assessment of 2-10 µm sized (PM10) airborne particle levels in urban areas is described. It uses existing urban closed-circuit television (CCTV) surveillance camera networks in combination with particle accumulating units and chromatic quantification of polychromatic light scattered by the captured particles. Methods for accommodating extraneous light effects are discussed and test results obtained from real urban sites are presented to illustrate the potential of the approach.

  13. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to an apparent magnitude of 8. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  14. Digital camera system built on JPEG2000 compression and decompression

    NASA Astrophysics Data System (ADS)

    Atsumi, Eiji

    2003-05-01

    A processing architecture for digital cameras has been built on a JPEG2000 compression system. The concerns are to minimize processing power and data traffic inside the camera system (data bandwidth at interfaces) and outside it (compression efficiency). The key idea is to decompose the Bayer matrix data from the image sensor into four half-resolution planes instead of interpolating to three full-resolution planes. With JPEG2000, a new compression standard capable of handling multi-component images, the four-plane representation can be encoded into a single bit-stream. This representation reduces data traffic between the image reconstruction stage and the compression stage by 1/3 to 1/2 compared with Bayer-interpolated data. Not only reduced processing power prior to and during compression but also competitive or superior compression efficiency is achieved. On reconstruction to full resolution, Bayer interpolation and/or edge enhancement is required as post-processing to a standard decoder, while half or smaller resolution images are reconstructed without post-processing. For mobile terminals with an integrated camera (image reconstruction in camera hardware and compression in the terminal processor), this scheme helps accommodate increased resolution despite the limited data bandwidth from camera to terminal processor and the limited processing capability.
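    The core decomposition the abstract describes can be sketched in a few lines (assuming an RGGB Bayer layout, which is our assumption; other layouts merely permute the four planes):

```python
import numpy as np

def bayer_to_planes(raw):
    """Split an RGGB Bayer mosaic (H, W) into four half-resolution planes
    (R, G1, G2, B), each of shape (H//2, W//2), ready for multi-component
    JPEG2000 encoding instead of full-resolution demosaicing."""
    r  = raw[0::2, 0::2]   # red sites: even rows, even columns
    g1 = raw[0::2, 1::2]   # green sites on red rows
    g2 = raw[1::2, 0::2]   # green sites on blue rows
    b  = raw[1::2, 1::2]   # blue sites: odd rows, odd columns
    return r, g1, g2, b
```

    The four planes together hold exactly the sensor's raw samples, which is why this path carries 1/3 to 1/2 less data than the three interpolated full-resolution planes.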

  15. Improvement of digital photoelasticity based on camera response function.

    PubMed

    Chang, Shih-Hsin; Wu, Hsien-Huang P

    2011-09-20

    Studies on photoelasticity have been conducted by many researchers in recent years, and many equations for photoelastic analysis based on digital images were proposed. While these equations were all presented by the light intensity emitted from the analyzer, pixel values of the digital image were actually used in the real calculations. In this paper, a proposal of using relative light intensity obtained by the camera response function to replace the pixel value for photoelastic analysis was investigated. Generation of isochromatic images based on relative light intensity and pixel value were compared to evaluate the effectiveness of the new approach. The results showed that when relative light intensity was used, the quality of an isochromatic image can be greatly improved both visually and quantitatively. We believe that the technique proposed in this paper can also be used to improve the performance for the other types of photoelastic analysis using digital images. PMID:21947044
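    The substitution the authors propose, replacing pixel values with relative light intensity recovered through the camera response function, can be illustrated with the simplest possible response model (a pure power law; real camera response functions are usually estimated empirically, so the gamma parameter here is an assumption):

```python
import numpy as np

def inverse_crf(pixel, gamma=2.2, max_value=255.0):
    """Map 8-bit pixel values back to relative light intensity in [0, 1],
    assuming a power-law camera response with exponent 1/gamma."""
    return (np.asarray(pixel, dtype=float) / max_value) ** gamma
```

    Feeding these relative intensities, rather than raw pixel values, into the photoelastic equations is what the paper reports as improving the isochromatic images.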

  16. Measurement of solar extinction in tower plants with digital cameras

    NASA Astrophysics Data System (ADS)

    Ballestrín, J.; Monterreal, R.; Carra, M. E.; Fernandez-Reche, J.; Barbero, J.; Marzo, A.

    2016-05-01

    Atmospheric extinction of solar radiation between the heliostat field and the receiver is accepted as a non-negligible source of energy loss in the increasingly large central receiver plants. However, the reality is that there is currently no reliable measurement method for this quantity, and at present these plants are designed, built and operated without knowing this local parameter. Nowadays digital cameras are used in many scientific applications for their ability to convert available light into digital images. Their broad spectral range, high resolution and high signal-to-noise ratio make them an interesting device for solar technology. In this work a method for atmospheric extinction measurement based on digital images is presented. The possibility of defining a measurement setup in circumstances similar to those of a tower plant increases the credibility of the method. This procedure is currently being implemented at the Plataforma Solar de Almería.

  17. Night sky photometry with amateur-grade digital cameras

    NASA Astrophysics Data System (ADS)

    Mrozek, Tomasz; Gronkiewicz, Dominik; Kolomanski, Sylwester; Steslicki, Marek

    2015-08-01

    Measurements of night sky brightness can give us valuable information on light pollution. The more measurements we have, the better our knowledge of the spatial distribution of the pollution on local and global scales. High-accuracy professional photometry of the night sky can be performed with dedicated instruments. The main drawbacks of this approach are high price and low mobility, which limit the number of observers and therefore the amount of photometric data that can be collected. To overcome the problem of the limited amount of data, we can involve amateur astronomers in the photometry of the night sky. However, to achieve this goal we need a method that utilizes the equipment usually used by amateur astronomers, e.g. digital cameras. We propose a method that enables good-accuracy photometry of the night sky with the use of compact digital or DSLR cameras. In the method, reduction of observations and standardization to the Johnson UBV system are performed. We tested several cameras and compared the results to Sky Quality Meter (SQM) measurements. The overall consistency of the results is within 0.2 mag.
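    The standardization step described here reduces, at its simplest, to classical zero-point calibration against catalogue stars (a minimal sketch: the function names are ours, and a single mean zero point ignores the color terms a full UBV reduction would include):

```python
import numpy as np

def instrumental_magnitude(flux):
    """Instrumental magnitude from background-subtracted flux (ADU)."""
    return -2.5 * np.log10(flux)

def zero_point(inst_mag, catalog_mag):
    """Photometric zero point from stars with known catalogue magnitudes."""
    return np.mean(np.asarray(catalog_mag) - np.asarray(inst_mag))

def calibrated_magnitude(flux, zp):
    """Standardized magnitude of a source, e.g. a sky-background patch."""
    return instrumental_magnitude(flux) + zp
```

    With the zero point fixed by field stars, the same relation converts measured sky-background flux per solid angle into surface brightness in magnitudes.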

  18. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  19. Establishing imaging sensor specifications for digital still cameras

    NASA Astrophysics Data System (ADS)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor or one of the newer Foveon buried-photodiode sensors. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics and the sensor characteristics (including pixel size, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.

  20. A large distributed digital camera system for accelerator beam diagnostics

    NASA Astrophysics Data System (ADS)

    Catani, L.; Cianchi, A.; Di Pirro, G.; Honkavaara, K.

    2005-07-01

    Optical diagnostics, providing images of accelerated particle beams using radiation emitted by particles impinging on a radiator, typically a fluorescent screen, has been extensively used, especially on electron linacs, since the 1970s. Higher intensity beams available in the last decade allow extending the use of beam imaging techniques to perform precise measurements of important beam parameters such as emittance, energy, and energy spread using optical transition radiation (OTR). OTR-based diagnostics systems are extensively used on the superconducting TESLA Test Facility (TTF) linac driving the vacuum ultraviolet free electron laser (VUV-FEL) at the Deutsches Elektronen-Synchrotron facility. Up to 30 optical diagnostic stations have been installed at various positions along the 250-m-long linac, each equipped with a high-performance digital camera. This paper describes the new approach to the design of the hardware and software setups required by the complex topology of such a distributed camera system.

  1. Preparation of a Low-Cost Digital Camera System for Remote Sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Off-the-shelf consumer digital cameras are convenient and user-friendly. However, the use of these cameras in remote sensing is limited because convenient methods for concurrently determining visible and near-infrared (NIR) radiation have not been developed. Two Nikon COOLPIX 4300 digital cameras ...

  2. Investigating thin film interference with a digital camera

    NASA Astrophysics Data System (ADS)

    Atkins, Leslie J.; Elliott, Richard C.

    2010-12-01

    Thin film interference is discussed in most introductory physics courses as an intriguing example of wave interference. Although students may understand the interference mechanism that determines the colors of a film, they are likely to have difficulty understanding why soap bubbles and oil slicks have a distinctive set of colors—colors that are strikingly different from those present in the rainbow. This article describes a way to model these colors and a simple method for investigating them using a digital camera and a computer.
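    The colors of a film arise from two-beam interference between the reflections at its two surfaces; the reflectance curve behind them can be sketched as follows (a minimal model assuming normal incidence and a half-wave phase shift at the first surface, as for a soap film in air; the function name is ours):

```python
import numpy as np

def film_reflectance(wavelength_nm, thickness_nm, n=1.33, cos_theta=1.0):
    """Relative reflected intensity of a thin film in the two-beam model.
    Destructive interference occurs when 2*n*d equals a whole number of
    wavelengths, because of the half-wave shift at the first surface."""
    phase = 2.0 * np.pi * n * thickness_nm * cos_theta / wavelength_nm
    return np.sin(phase) ** 2
```

    Evaluating this over the visible band for a fixed thickness, and weighting by the camera's RGB responses, reproduces the characteristic non-rainbow palette the article discusses.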

  3. Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses

    ERIC Educational Resources Information Center

    Liu, Rong; Unger, John A.; Scullion, Vicki A.

    2014-01-01

    Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on…

  4. Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera

    NASA Astrophysics Data System (ADS)

    Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

    2009-05-01

    In recent years, high-performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Improving the discrimination and analysis of complex target information from active battlefields requires simultaneous multi-band measurements from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of both visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of around 0.3 m in entrance aperture and several relay optical sub-systems capable of delivering both high-spatial-resolution visible and IR images to the detectors. The camera design benefits from the use of several optical channels packaged in a compact space and the associated freedom to choose between wide (~3 degrees) and narrow (~1 degree) fields of view. In order to investigate both the imaging and radiometric performances of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system into an integrated ray-tracing simulation environment utilizing the Monte Carlo computation technique. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performances were then investigated. The simulation results demonstrate successfully that the camera design satisfies the NIIRS 7 detection criterion. The camera concept, details of the performance simulation computation, and the resulting performances are discussed together with the future development plan.

  5. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-01

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

  6. Off-axis digital holographic camera for quantitative phase microscopy

    PubMed Central

    Monemhaghdoust, Zahra; Montfort, Frédéric; Emery, Yves; Depeursinge, Christian; Moser, Christophe

    2014-01-01

    We propose and experimentally demonstrate a digital holographic camera which can be attached to the camera port of a conventional microscope for obtaining digital holograms in a self-reference configuration, under short coherence illumination and in a single shot. A thick holographic grating filters the beam containing the sample information in two dimensions through diffraction. The filtered beam creates the reference arm of the interferometer. The spatial filtering method, based on the high angular selectivity of the thick grating, reduces the alignment sensitivity to angular displacements compared with pinhole based Fourier filtering. The addition of a thin holographic grating alters the coherence plane tilt introduced by the thick grating so as to create high-visibility interference over the entire field of view. The acquired full-field off-axis holograms are processed to retrieve the amplitude and phase information of the sample. The system produces phase images of cheek cells qualitatively similar to phase images extracted with a standard commercial DHM. PMID:24940535

  7. Photogrammetry and Remote Sensing: New German Standards (din) Setting Quality Requirements of Products Generated by Digital Cameras, Pan-Sharpening and Classification

    NASA Astrophysics Data System (ADS)

    Reulke, R.; Baltrusch, S.; Brunn, A.; Komp, K.; Kresse, W.; von Schönermark, M.; Spreckels, V.

    2012-08-01

    Ten years after the first introduction of a digital airborne mapping camera at the ISPRS conference 2000 in Amsterdam, several digital cameras are now available. They are well established in the market and have replaced the analogue camera. A general improvement in image quality accompanied the digital camera development. The signal-to-noise ratio and the dynamic range are significantly better than with the analogue cameras. In addition, digital cameras can be spectrally and radiometrically calibrated. The use of these cameras, however, required a rethinking in many places. New data products were introduced. In recent years, some activities took place that should lead to a better understanding of the cameras and the data produced by them. Several projects, like those of the German Society for Photogrammetry, Remote Sensing and Geoinformation (DGPF) or EuroSDR (European Spatial Data Research), were conducted to test and compare the performance of the different cameras. In this paper the current DIN (Deutsches Institut fuer Normung - German Institute for Standardization) standards will be presented. These include the standard for digital cameras, the standard for ortho rectification, the standard for classification, and the standard for pan-sharpening. In addition, standards for the derivation of elevation models, the use of Radar / SAR, and image quality are in preparation. The OGC has indicated its interest in participating in that development. The OGC has already published specifications in the field of photogrammetry and remote sensing. One goal of joint future work could be to merge these formerly independent developments and to jointly develop a suite of implementation specifications for photogrammetry and remote sensing.

  8. Payette National Forest aerial survey project using the Kodak digital color infrared camera

    NASA Astrophysics Data System (ADS)

    Greer, Jerry D.

    1997-11-01

    Staff of the Payette National Forest located in central Idaho used the Kodak Digital Infrared Camera to collect digital photographic images over a wide variety of selected areas. The objective of this aerial survey project is to collect airborne digital camera imagery and to evaluate it for potential use in forest assessment and management. The data collected from this remote sensing system is being compared with existing resource information and with personal knowledge of the areas surveyed. Resource specialists are evaluating the imagery to determine if it may be useful for: identifying cultural sites (pre-European settlement tribal villages and camps); recognizing ecosystem landscape pattern; mapping recreation areas; evaluating the South Fork Salmon River road reconstruction project; designing the Elk Summit Road; assessing the impact of sediment on anadromous fish in the South Fork Salmon River; assessing any contribution of sediment to the South Fork from the reconstructed road; determining post-wildfire stress development in conifer timber; assessing the development of insect populations in areas initially determined to be within low-intensity wildfire burn polygons; and searching for Idaho Ground Squirrel habitat. Project sites include approximately 60 linear miles of the South Fork of the Salmon River; a parallel road over about half that distance; 3 archaeological sites; two transects of about 6 miles each for landscape patterns; 3 recreation areas; 5 miles of the Payette River; 4 miles of the Elk Summit Road; a pair of transects 4.5 miles long for stress assessment in timber; a triplet of transects about 3 miles long for assessing species identification; and an area of about 640 acres to evaluate habitat for the endangered Idaho Ground Squirrel. Preliminary results indicate that the imagery is an economically viable way to collect site-specific resource information that is of value in the management of a national forest.

  9. Digital camera calibration for color measurements on prints

    NASA Astrophysics Data System (ADS)

    Andersson, Mattias

    2007-01-01

    Flatbed scanners and digital cameras have become established and widely used color imaging devices. If colorimetrically calibrated, these trichromatic devices can provide fast color measurement tools in applications such as printer calibration, process control, objective print quality measurements and color management. However, in calibrations intended to be used for color measurements on printed matter, the media dependency must be considered. Very good results can be achieved when the calibration is carried out on a single media and then applied for measurements on the same media, or at least a media of a very similar type. Significantly poorer results can be observed when the calibration is carried out for one printer-substrate combination and then applied for measurements on targets produced with another printer-substrate combination. Even if the problem is restricted to the color calibration of a scanner or camera for different paper media printed on a single printer, it is still tedious work to make a separate calibration for each new paper grade to be used in the printer. Therefore, it would be of interest to find a method where it is sufficient to characterize for only one or a few papers within a grade segment and then be able to apply a correction based on measurable optical paper properties. However, before being able to make any corrections, the influence of measurable paper properties on color characterizations must be studied and modeled. Fluorescence has been mentioned [1-3] as a potential source of error in color calibrations for measurements on printed matter. In order to improve paper whiteness, producers of printing paper add bluish dye and fluorescent whitening agents (FWA) to the paper [4]. In this study, the influence of FWA in printing paper on the color calibration of a digital camera for color measurements on printed targets is discussed. To study the effect of FWA in the paper, a set of papers with varying additions of FWA but otherwise
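The least-squares characterization discussed above can be sketched in a few lines. The following minimal Python example, with purely illustrative patch values (not the paper's data), fits a 3x3 matrix M mapping camera RGB to CIE XYZ from measured training patches by solving the normal equations.

```python
# Hedged sketch: least-squares colour calibration of a camera.
# Fit a 3x3 matrix M so that XYZ ~= M * RGB over training patches.
# Patch values below are illustrative, not measured data.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(3):
            if r != c:
                f = M[r][c]
                M[r] = [v - f * w for v, w in zip(M[r], M[c])]
    return [M[r][3] for r in range(3)]

def fit_matrix(rgb, xyz):
    """Rows of M from the normal equations (R^T R) m_k = R^T y_k."""
    RtR = [[sum(r[i] * r[j] for r in rgb) for j in range(3)]
           for i in range(3)]
    return [solve3(RtR, [sum(r[i] * y[k] for r, y in zip(rgb, xyz))
                         for i in range(3)])
            for k in range(3)]

rgb = [(0.9, 0.1, 0.1), (0.1, 0.8, 0.2), (0.2, 0.1, 0.9), (0.5, 0.5, 0.5)]
xyz = [(0.4, 0.2, 0.05), (0.35, 0.7, 0.1), (0.2, 0.1, 0.8), (0.45, 0.5, 0.4)]
M = fit_matrix(rgb, xyz)   # rows of M predict X, Y, Z from a new RGB triple
```

A media-dependent effect such as FWA fluorescence would show up as a systematic residual of this fit that changes from paper to paper, which is exactly why the abstract argues for a correction driven by measurable paper properties.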

  10. A Comparative Study of Microscopic Images Captured by a Box Type Digital Camera Versus a Standard Microscopic Photography Camera Unit

    PubMed Central

    Desai, Nandini J.; Gupta, B. D.; Patel, Pratik Narendrabhai

    2014-01-01

    Introduction: Obtaining images of slides viewed by a microscope can be invaluable for both diagnosis and teaching. They can be transferred among technologically advanced hospitals for further consultation and evaluation. But a standard microscopic photography camera unit (MPCU) (MIPS, Microscopic Image Projection System) is costly and not available in resource-poor settings. The aim of our endeavour was to find a comparable and cheaper alternative method for photomicrography. Materials and Methods: We used a NIKON Coolpix S6150 camera (box type digital camera) with an Olympus CH20i microscope and a fluorescent microscope for the purpose of this study. Results: We got comparable results for capturing images of light microscopy, but the results were not as satisfactory for fluorescent microscopy. Conclusion: A box type digital camera is a comparable, less expensive and convenient alternative to a microscopic photography camera unit. PMID:25478350

  11. Encrypting Digital Camera with Automatic Encryption Key Deletion

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2007-01-01

    A digital video camera includes an image sensor capable of producing a frame of video data representing an image viewed by the sensor, an image memory for storing video data such as previously recorded frame data in a video frame location of the image memory, a read circuit for fetching the previously recorded frame data, an encryption circuit having an encryption key input connected to receive the previously recorded frame data from the read circuit as an encryption key, an un-encrypted data input connected to receive the frame of video data from the image sensor and an encrypted data output port, and a write circuit for writing a frame of encrypted video data received from the encrypted data output port of the encryption circuit to the memory and overwriting the video frame location storing the previously recorded frame data.
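The frame-chained scheme in this patent abstract can be illustrated with a minimal sketch. The patent does not specify the cipher, so a simple XOR stream stands in for the encryption circuit here; the point is the data flow: each stored (encrypted) frame becomes the key for the next frame, and writing the new ciphertext overwrites, and thereby deletes, the old key.

```python
# Illustrative sketch of the frame-chained encryption described above.
# XOR is an assumption standing in for the unspecified encryption circuit.

def xor_encrypt(frame: bytes, key: bytes) -> bytes:
    """Encrypt a frame by XOR-ing it with a key of equal length."""
    return bytes(f ^ k for f, k in zip(frame, key))

class EncryptingCamera:
    def __init__(self, first_key: bytes):
        self.stored = first_key          # plays the role of the image memory

    def record(self, frame: bytes) -> None:
        encrypted = xor_encrypt(frame, self.stored)
        self.stored = encrypted          # overwrite: the old key is destroyed

cam = EncryptingCamera(first_key=bytes(4))   # hypothetical 4-byte "frames"
cam.record(b"\x10\x20\x30\x40")
```

Because the write circuit overwrites the previously recorded frame, the key needed to decrypt the newest frame never persists in memory, which is the automatic key deletion the title refers to.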

  12. Verification of Potency of Aerial Digital Oblique Cameras for Aerial Photogrammetry in Japan

    NASA Astrophysics Data System (ADS)

    Nakada, Ryuji; Takigawa, Masanori; Ohga, Tomowo; Fujii, Noritsuna

    2016-06-01

    A digital oblique aerial camera (hereinafter called an "oblique camera") is an assembly of medium-format digital cameras capable of shooting digital aerial photographs in five directions, i.e. a nadir view and oblique views (forward, backward, left and right) simultaneously, and it is used for shooting digital aerial photographs efficiently for generating 3D models over a wide area. For aerial photogrammetry in public surveys in Japan, large-format cameras, like the DMC and UltraCam series, are required to ensure aerial photogrammetric accuracy. Although oblique cameras are intended for generating 3D models, the digital aerial photographs taken with them in 5 directions need not be limited to 3D model production; they may also be suitable for digital mapping and photomaps at the accuracy required for public surveys in Japan. In order to verify the potency of using oblique cameras for aerial photogrammetry (simultaneous adjustment, digital mapping and photomaps), (1) a viewer was developed to interpret digital aerial photographs taken with oblique cameras, (2) digital aerial photographs were shot with our own oblique camera, a Penta DigiCAM from IGI mbH, and (3) the accuracy of 3D measurements was verified.

  13. Camera system resolution and its influence on digital image correlation

    SciTech Connect

    Reu, Phillip L.; Sweatt, William; Miller, Timothy; Fleming, Darryn

    2014-09-21

    Digital image correlation (DIC) uses images from a camera and lens system to make quantitative measurements of the shape, displacement, and strain of test objects. Despite the method's growing popularity, little research has addressed the influence of imaging system resolution on DIC results. This paper investigates the entire imaging system and studies how both the camera and lens resolution influence the DIC results as a function of the system Modulation Transfer Function (MTF). It shows that when making spatial resolution decisions (including speckle size) the resolution-limiting component should be considered. A consequence of the loss of spatial resolution is that the DIC uncertainties are increased. This is demonstrated using both synthetic and experimental images with varying resolution. The loss of image resolution and DIC accuracy can be compensated for by increasing the subset size or, better, by increasing the speckle size. The speckle size and spatial resolution are then a function of the lens resolution rather than, as is more typically assumed, the pixel size. The study demonstrates the tradeoffs associated with limited lens resolution.
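The core DIC matching step that this resolution study builds on can be sketched in one dimension: slide a speckle subset along a deformed profile and pick the shift maximizing the zero-normalized cross-correlation (ZNCC). The subset and profiles below are illustrative, not the paper's data.

```python
# Minimal 1-D sketch of the DIC matching step: find the shift of a
# speckle subset that maximizes zero-normalized cross-correlation (ZNCC).
import math

def zncc(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_shift(deformed, subset, max_shift):
    scores = {s: zncc(subset, deformed[s:s + len(subset)])
              for s in range(max_shift)}
    return max(scores, key=scores.get)

ref = [0, 0, 5, 9, 4, 0, 0, 3, 8, 2, 0, 0]
subset = ref[2:7]              # speckle subset taken at position 2
deformed = [0] * 3 + ref[:-3]  # whole profile displaced by 3 pixels
print(best_shift(deformed, subset, 7))  # -> 5 (original position 2 + shift 3)
```

Lowering the system MTF amounts to blurring `deformed` before matching, which flattens the ZNCC peak and is one way to see why the paper finds larger uncertainties at reduced resolution.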


  15. <5cm Ground Resolution DEMs for the Atacama Fault System (Chile), Acquired With the Modular Airborne Camera System (MACS)

    NASA Astrophysics Data System (ADS)

    Zielke, O.; Victor, P.; Oncken, O.; Bucher, T. U.; Lehmann, F.

    2011-12-01

    A primary step towards assessing the time and size of future earthquakes is the identification of earthquake recurrence patterns in the existing seismic record. Geologic and geomorphic data are commonly analyzed for this purpose, owing to the lack of sufficiently long historical or instrumental seismic data sets. Until recently, those geomorphic data sets encompassed field observations, local total station surveys, and aerial photography. Over the last decade, LiDAR-based high-resolution topographic data sets have become an additional powerful means, contributing distinctly to a better understanding of earthquake rupture characteristics (e.g., single-event along-fault slip distribution, along-fault slip accumulation pattern) and their relation to fault geometric complexities. Typical shot densities of such data sets (e.g., airborne LiDAR data along the San Andreas Fault) permit generation of digital elevation models (DEM) with <50 cm ground resolution, sufficient for depiction of meter-scale tectonic landforms. Identification of submeter-scale features is however prohibited by this DEM resolution limit. Here, we present a high-resolution topographic and visual data set from the Atacama fault system near Antofagasta, Chile. Data were acquired with the Modular Airborne Camera System (MACS), developed by the DLR (German Aerospace Center) in Berlin, Germany. The photogrammetrically derived DEM and True Ortho Images with <5 cm ground resolution permit identification of very small-scale geomorphic features, thus enabling fault zone and earthquake rupture characterization at unprecedented detail. Compared to typical LiDAR DEMs, ground resolution is increased by an order of magnitude while the spatial extent of the data sets is essentially the same. Here, we present examples of the <5 cm resolution data set (DEM and visual results) and further explore its resolution capabilities and potential with regard to the aforementioned tectono-geomorphic questions.

  16. Quantifying biodiversity using digital cameras and automated image analysis.

    NASA Astrophysics Data System (ADS)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m), populated by a mix of heather, mosses and sedges. The cameras have been in continuous operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and
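One way to picture the "alert state" idea above is a simple drift detector: flag a trail-camera frame as unusual when its grey-level histogram strays too far from a running mean of recent frames. Everything below (bin count, threshold, toy frames) is an illustrative assumption, not the reserve's actual pipeline.

```python
# Hedged sketch of an image-statistics alert: compare a frame's
# grey-level histogram against a mean histogram of typical frames.

def histogram(frame, bins=4, levels=256):
    h = [0] * bins
    for px in frame:
        h[px * bins // levels] += 1
    return [c / len(frame) for c in h]

def is_alert(frame, mean_hist, threshold=0.5):
    """Alert when the L1 histogram distance exceeds the threshold."""
    h = histogram(frame)
    return sum(abs(a - b) for a, b in zip(h, mean_hist)) > threshold

typical = histogram([60, 70, 80, 90] * 16)        # stand-in for the running mean
print(is_alert([62, 70, 82, 92] * 16, typical))   # similar scene -> False
print(is_alert([200, 220, 240, 250] * 16, typical))  # bright anomaly -> True
```

A frame flagged True would be queued for a human (or a supervised classifier), matching the paper's goal of reserving manual effort for rare or undecidable images.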

  17. Tape measuring system using linear encoder and digital camera

    NASA Astrophysics Data System (ADS)

    Eom, Tae Bong; Jeong, Don Young; Kim, Myung Soon; Kim, Jae Wan; Kim, Jong Ahn

    2013-04-01

    We have designed and constructed a calibration system for line standards such as tapes and rules for secondary calibration laboratories. The system consists of a main body with a linear stage and linear encoder, an optical microscope with a digital camera, and a computer. The base of the system is an aluminum profile 2.9 m long, 0.09 m high and 0.18 m wide. The linear stage and the linear encoder are fixed on the aluminum profile. A micro-stage driven by a micrometer is fixed on the carriage of the long linear stage, and the optical microscope with the digital camera and the tablet PC are mounted on this stage. The linear encoder counts the moving distance of the linear stage with a resolution of 1 μm, and its count is transferred to the tablet PC. The image of the scale mark of the tape is captured by the CCD camera of the optical microscope and transferred to the PC through a USB interface. The computer automatically determines the center of the scale mark by image processing and at the same time reads the moving distance of the linear stage. As a result, the computer can calculate the interval between the scale marks of the tape. In order to achieve high accuracy, the linear encoder should be calibrated against a laser interferometer or a rigid steel rule. This calibration data of the linear encoder is stored in the computer, and the computer corrects the reading of the linear encoder. In order to determine the center of a scale mark, we use three different algorithms. First, the image profile over a specified threshold level is fitted with an even-order polynomial, and the axis of the polynomial is used as the center of the line. Second, the left-side and right-side areas at the center of the image profile are calculated so that the two areas are equal. Third, the left and right edges of the image profile are determined at every intensity level of the image, and the center of the graduation is calculated as an average of the centers of the left
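The second of the three centre-finding algorithms above (splitting the area under the intensity profile into two equal halves) is easy to sketch, with sub-pixel position obtained by linear interpolation inside the pixel that crosses the half-area point. The profile values are illustrative.

```python
# Sketch of the equal-area line-centre algorithm: find the coordinate
# that splits the area under the intensity profile into equal halves.
# Pixels are treated as spanning [i - 0.5, i + 0.5] with constant value.

def equal_area_center(profile):
    total = sum(profile)
    half, cum = total / 2.0, 0.0
    for i, v in enumerate(profile):
        if v and cum + v >= half:
            # fraction of pixel i needed to reach exactly half the area
            return i - 0.5 + (half - cum) / v
        cum += v

print(equal_area_center([0, 1, 4, 8, 4, 1, 0]))  # symmetric mark -> 3.0
```

For an asymmetric profile the returned coordinate shifts toward the heavier side, which is the intended sub-pixel behaviour.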

  18. Property of the Large Format Digital Aerial Camera Dmc II

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Neumann, K.

    2012-07-01

    With the DMC II 140, 230 and 250, Z/I Imaging introduced digital aerial cameras with a very large format CCD for the panchromatic channel. With 140 / 230 / 250 megapixels, the CCDs have a size not previously available in photogrammetry. CCDs in general have a very high relative accuracy, but the overall geometry has to be checked, as well as the influence of non-flat CCDs. A CCD with a size of 96 mm × 82 mm must have a flatness, or knowledge of its flatness, in the range of 1 μm if a camera accuracy in the range of 1.3 μm is not to be degraded. The DMC II cameras have been evaluated at three different flying heights leading to 5 cm, 9 cm and 15 cm or 20 cm GSD, with crossing flight lines and 60% side lap. The optimal test conditions guaranteed the precise determination of the object coordinates as well as the systematic image errors. All three camera types show only very small systematic image errors, with root mean square values between 0.12 μm and 0.3 μm and extreme values not exceeding 1.6 μm. The remaining systematic image errors, determined by analysis of the image residuals and not covered by the additional parameters, are negligible. A standard deviation of the object point heights below the GSD, determined at independent check points, is standard even in blocks with just 20% side lap and 60% end lap. Corresponding to the excellent image geometry, the object point coordinates are only slightly influenced by the self-calibration. For all DMC II types, the handling of image models for data acquisition need not be supported by an improvement of the image coordinates using the determined systematic image errors; such an improvement is in any case not yet standard in photogrammetric software packages. The advantage of a single monolithic CCD is obvious. An edge analysis of pan-sharpened DMC II 250 images resulted in factors for the effective resolution below 1.0. A result below 1.0 is only possible with contrast enhancement, but this requires low image noise, demonstrating the

  19. Use of a Digital Camera To Document Student Observations in a Microbiology Laboratory Class.

    ERIC Educational Resources Information Center

    Mills, David A.; Kelley, Kevin; Jones, Michael

    2001-01-01

    Points out the lack of microscopic images of wine-related microbes. Uses a digital camera during a wine microbiology laboratory to capture student-generated microscope images. Discusses the advantages of using a digital camera in a teaching lab. (YDS)

  20. Practical target location and accuracy indicator in digital close range photogrammetry using consumer grade cameras

    NASA Astrophysics Data System (ADS)

    Moriya, Gentaro; Chikatsu, Hirofumi

    2011-07-01

    Recently, the pixel counts and functions of consumer-grade digital cameras have been increasing remarkably thanks to modern semiconductor and digital technology, and there are many low-priced consumer-grade digital cameras with more than 10 megapixels on the market in Japan. In these circumstances, digital photogrammetry using consumer-grade cameras is keenly anticipated in various application fields. There is a large body of literature on the calibration of consumer-grade digital cameras and on circular target location. Target location with subpixel accuracy has been investigated as a star tracker issue, and many target location algorithms have been proposed. It is widely accepted that the least squares model with ellipse fitting is the most accurate algorithm. However, there are still problems for efficient digital close range photogrammetry: reconfirmation of the subpixel target location algorithms for consumer-grade digital cameras, the relationship between the number of edge points along the target boundary and accuracy, and an indicator for estimating the accuracy of normal digital close range photogrammetry using consumer-grade cameras. With this motivation, empirical tests of several algorithms for target location with subpixel accuracy, and an indicator for estimating the accuracy, are investigated in this paper using real data acquired indoors with 7 consumer-grade digital cameras having 7.2 to 14.7 megapixels.
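Of the sub-pixel target-location algorithms referred to above, the intensity-weighted centroid is the simplest to show (the abstract notes that least-squares ellipse fitting is the most accurate; this sketch only conveys how sub-pixel coordinates arise from pixel intensities). The 5x5 target image is illustrative.

```python
# Sketch of sub-pixel target location via the intensity-weighted centroid.

def centroid(image):
    total = sum(sum(row) for row in image)
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(image) for v in row) / total
    return cx, cy

target = [[0, 0, 0, 0, 0],
          [0, 1, 2, 1, 0],
          [0, 2, 9, 4, 0],   # brighter on the right of the peak
          [0, 1, 2, 1, 0],
          [0, 0, 0, 0, 0]]
print(centroid(target))      # centre pulled slightly right of (2, 2)
```

Because the result is a weighted average, the estimated centre moves continuously with sub-pixel intensity changes, which is exactly the property the star-tracker literature exploits.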

  1. Quantification of gully volume using very high resolution DSM generated through 3D reconstruction from airborne and field digital imagery

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Zarco-Tejada, Pablo; Laredo, Mario; Gómez, Jose Alfonso

    2013-04-01

    Major advances have been made recently in automatic 3D photo-reconstruction techniques using uncalibrated and non-metric cameras (James and Robson, 2012). However, their application to soil conservation studies and landscape feature identification is only beginning. The aim of this work is to compare the performance of a remote sensing technique using a digital camera mounted on an airborne platform with 3D photo-reconstruction, a method already validated for gully erosion assessment purposes (Castillo et al., 2012). A field survey was conducted in November 2012 in a 250 m-long gully located in field crops on a Vertisol in Cordoba (Spain). The airborne campaign was conducted with a 4000x3000 digital camera installed onboard an aircraft flying at 300 m above ground level to acquire 6 cm resolution imagery. A total of 990 images were acquired over the area, ensuring a large overlap in the across- and along-track directions of the aircraft. An ortho-mosaic and the digital surface model (DSM) were obtained through automatic aerial triangulation and camera calibration methods. For the field-level photo-reconstruction technique, the gully was divided into several reaches to allow appropriate reconstruction (about 150 pictures taken per reach) and, finally, the resulting point clouds were merged into a single mesh. A centimetric-accuracy GPS provided a benchmark dataset for the gully perimeter and distinguishable reference points in order to allow the assessment of measurement errors of the airborne technique and the georeferencing of the photo-reconstruction 3D model. The uncertainty in the definition of the gully limits was explicitly addressed by comparing several criteria obtained from the 3D models (slope and second derivative) with the outer perimeter obtained by the GPS operator identifying visually the change in slope at the top of the gully walls. In this study we discussed the magnitude of planimetric and altimetric errors and the differences observed between the
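Once a DSM of the gully exists, the volume quantification named in the title reduces to summing, over cells inside the gully perimeter, the depth below a reference (pre-erosion) surface times the cell area. The grids and the 5 cm cell size below are illustrative, not the paper's data.

```python
# Hedged sketch of gully volume from a DSM grid: sum of
# (reference height - DSM height) * cell area over eroded cells.

def gully_volume(reference, dsm, cell_size):
    depths = (max(r - d, 0.0)          # ignore cells at/above the reference
              for ref_row, dsm_row in zip(reference, dsm)
              for r, d in zip(ref_row, dsm_row))
    return sum(depths) * cell_size ** 2

ref = [[10.0, 10.0],
       [10.0, 10.0]]
dsm = [[10.0,  9.5],
       [ 9.0, 10.0]]                   # two cells eroded below the reference
print(round(gully_volume(ref, dsm, cell_size=0.05), 6))  # -> 0.00375 (m^3)
```

The comparison in the abstract then amounts to evaluating this sum on the airborne DSM versus the photo-reconstruction DSM, with the GPS perimeter deciding which cells count as gully.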

  2. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.
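The hash-and-sign flow in this patent abstract can be shown as a toy: hash the image file, "encrypt" the hash with the private key, and verify later with the public key. Textbook RSA with tiny, well-known demonstration primes is used purely for illustration; a real camera would use full-size keys and padding.

```python
# Toy sketch of the patent's sign/verify scheme. The tiny RSA modulus
# (p=61, q=53) is a classroom example, not a secure parameter choice.
import hashlib

n, e, d = 3233, 17, 2753         # toy modulus, public and private exponents

def image_hash(data: bytes) -> int:
    # Reduce the SHA-256 digest into the toy modulus range.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes) -> int:            # inside the camera (private key d)
    return pow(image_hash(data), d, n)

def verify(data: bytes, signature: int) -> bool:   # anywhere (public key e)
    return pow(signature, e, n) == image_hash(data)

image = b"raw sensor frame"
sig = sign(image)
print(verify(image, sig))        # -> True
print(verify(image + b"!", sig)) # a tampered file almost certainly fails
```

Matching the abstract, the verifier recomputes the image hash with the same algorithm and compares it with the hash recovered from the signature; any alteration of the file changes the hash and breaks the match.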

  3. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even one bit change in the image hash will cause the image hash to be totally different from the secure hash.

  4. Passive auto-focus for digital still cameras and camera phones: Filter-switching and low-light techniques

    NASA Astrophysics Data System (ADS)

    Gamadia, Mark Noel

    In order to gain valuable market share in the growing consumer digital still camera and camera phone market, camera manufacturers must continually add new features and improve existing ones in their latest product offerings. Auto-focus (AF) is one such feature, whose aim is to enable consumers to quickly take sharply focused pictures with little or no manual intervention in adjusting the camera's focus lens. While AF has been a standard feature in digital still and cell-phone cameras, consumers often complain about slow AF performance, which may lead to missed photographic opportunities, rendering valuable moments and events as undesired out-of-focus pictures. This dissertation addresses this critical issue to advance the state of the art in the digital band-pass filter based passive AF method. This method is widely used to realize AF in the camera industry: a focus actuator is adjusted via a search algorithm to locate the in-focus position by maximizing a sharpness measure extracted from a particular frequency band of the incoming image of the scene. There are no known systematic methods for automatically deriving parameters such as the digital pass-bands or the search step-size increments used in existing passive AF schemes. Conventional methods require time-consuming experimentation and tuning to arrive at a set of parameters which balance AF performance in terms of speed and accuracy, ultimately delaying product time-to-market. This dissertation presents a new framework for determining an optimal set of passive AF parameters, named Filter-Switching AF, providing an automatic approach to achieving superior AF performance in both good and low lighting conditions, based on the following performance measures (metrics): speed (total number of iterations), accuracy (offset from truth), power consumption (total distance moved), and user experience (in-focus position overrun). Performance results using three different prototype cameras are presented.
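The search-based passive AF scheme described above lends itself to a compact sketch. The following is a minimal, hypothetical coarse-to-fine hill-climb over focus positions; the step sizes and the `sharpness` callback (standing in for the band-pass sharpness measure) are illustrative assumptions, not the dissertation's actual Filter-Switching parameters.

```python
def hill_climb_focus(sharpness, positions, coarse=8, fine=1):
    """Coarse-to-fine search of the focus range: sweep the lens range
    coarsely, then refine around the best coarse position."""
    # Coarse sweep over every `coarse`-th lens position.
    coarse_idx = list(range(0, len(positions), coarse))
    best = max(coarse_idx, key=lambda i: sharpness(positions[i]))
    # Fine sweep in a window around the coarse peak.
    lo = max(0, best - coarse)
    hi = min(len(positions) - 1, best + coarse)
    best = max(range(lo, hi + 1, fine), key=lambda i: sharpness(positions[i]))
    return positions[best]
```

A real implementation would re-evaluate sharpness from live frames at each lens position; the iteration count and total lens travel correspond directly to the speed and power metrics listed above.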

  5. Control design for image tracking with an inertially stabilized airborne camera platform

    NASA Astrophysics Data System (ADS)

    Hurák, Zdenek; Rezáč, Martin

    2010-04-01

    The paper reports on several control engineering issues related to the design and implementation of an image-based pointing and tracking system for an inertially stabilized airborne camera platform. A medium-sized platform has been developed by the authors and a few more team members within a joint governmental project coordinated by the Czech Air Force Research Institute. The resulting experimental platform is based on a common double-gimbal configuration with two direct drive motors and off-the-shelf MEMS gyros. An automatic vision-based tracking system is built on top of the inertial stabilization. The choice of a suitable control configuration is discussed first, because the decoupled structure for the inner inertial rate controllers does not extend easily to the outer image-based pointing and tracking loop. It appears that the pointing and tracking controller can benefit greatly from the availability of measurements of the camera's inertial rate around its optical axis. The proposed pointing and tracking controller relies on feedback linearization, well known in image-based visual servoing. A simple compensation is proposed for the one-sample delay introduced into the (slow) visual pointing and tracking loop by the computer vision system. It relies on a simple modification of the well-known Smith predictor scheme in which the prediction takes advantage of the (fast and undelayed) inertial rate measurements.
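As a rough illustration of the delay-compensation idea, the one-sample-delayed visual error can be advanced using the fast, undelayed inertial rate samples accumulated since the frame was captured. This is only a sketch of the principle; the sign convention (positive rate reducing positive error) and the function name are assumptions, not the authors' implementation.

```python
def predict_current_error(e_delayed, rates, dt):
    """Advance a stale visual pointing error (rad) by the camera rotation
    integrated from gyro rate samples (rad/s) taken since the frame.
    Sign convention is an illustrative assumption."""
    return e_delayed - sum(r * dt for r in rates)
```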

  6. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high-resolution aerial imagery, its fully automatic processing and analysis, and its near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in the event of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  7. Accuracy assessment of airborne photogrammetrically derived high-resolution digital elevation models in a high mountain environment

    NASA Astrophysics Data System (ADS)

    Müller, Johann; Gärtner-Roer, Isabelle; Thee, Patrick; Ginzler, Christian

    2014-12-01

    High-resolution digital elevation models (DEMs) generated by airborne remote sensing are frequently used to analyze landform structures (monotemporal) and geomorphological processes (multitemporal) in remote areas or areas of extreme terrain. In order to assess and quantify such structures and processes, it is necessary to know the absolute accuracy of the available DEMs. This study assesses the absolute vertical accuracy of DEMs generated by the High Resolution Stereo Camera-Airborne (HRSC-A), the Leica Airborne Digital Sensors 40/80 (ADS40 and ADS80) and the analogue camera system RC30. The study area is located in the Turtmann valley, Valais, Switzerland, a glacially and periglacially formed hanging valley stretching from 2400 m to 3300 m a.s.l. The photogrammetrically derived DEMs are evaluated against geodetic field measurements and an airborne laser scan (ALS). Traditional and robust global and local accuracy measures are used to describe the vertical quality of the DEMs, which show a non-Gaussian distribution of errors. The results show that all four sensor systems produce DEMs of similar accuracy despite their different setups and generations. The ADS40 and ADS80 (both with a ground sampling distance of 0.50 m) generate the most accurate DEMs in complex high mountain areas, with an RMSE of 0.8 m and an NMAD of 0.6 m. They also show the highest accuracy relative to flying height (0.14‰). The pushbroom scanning system HRSC-A produces an RMSE of 1.03 m and an NMAD of 0.83 m (0.21‰ of the flying height and 10 times the ground sampling distance). The analogue camera system RC30 produces DEMs with a vertical accuracy of 1.30 m RMSE and 0.83 m NMAD (0.17‰ of the flying height and two times the ground sampling distance). It is also shown that the performance of the DEMs strongly depends on the inclination of the terrain. The RMSE of areas with an inclination below 40° is better than 1 m; in more inclined areas the error and outlier occurrence…
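The global accuracy measures quoted above, RMSE and the robust NMAD, are straightforward to compute from a vector of elevation errors. A sketch using NumPy; the 1.4826 factor scales the median absolute deviation so that NMAD matches the standard deviation for Gaussian errors:

```python
import numpy as np

def rmse(err):
    """Root mean square error of an elevation-error vector."""
    return float(np.sqrt(np.mean(np.square(err))))

def nmad(err):
    """Normalized Median Absolute Deviation: a robust spread estimate,
    insensitive to the outliers common in photogrammetric DEMs."""
    med = np.median(err)
    return float(1.4826 * np.median(np.abs(err - med)))
```

NMAD's insensitivity to outliers is why it is reported alongside RMSE for the non-Gaussian error distributions found here: a single gross blunder inflates RMSE but barely moves NMAD.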

  8. Aerosol retrieval from twilight photographs taken by a digital camera

    NASA Astrophysics Data System (ADS)

    Saito, M.; Iwabuchi, H.

    2014-12-01

    The twilight sky, one of the most beautiful sights in daily life, varies day by day because atmospheric components such as ozone and aerosols also vary day by day. Recent studies have revealed the effects of tropospheric aerosols on the twilight sky. In this study, we develop a new algorithm for aerosol retrieval from twilight photographs taken by a digital single-lens reflex camera at solar zenith angles of 90-96° at intervals of 1°. A radiative transfer model taking the spherical-shell atmosphere, multiple scattering and refraction into account is used as the forward model, and optimal estimation is used as the inversion calculation to infer the aerosol optical and radiative properties. Sensitivity tests show that the tropospheric (stratospheric) aerosol optical thickness governs the distribution of twilight sky color and brightness near the horizon (at viewing angles of 10° to 20°), and that the aerosol size distribution governs the angular distribution of brightness near the solar direction. The retrieved AOTs have small uncertainties and agree very well with those from the Skyradiometer. In this conference presentation, several case studies using the algorithm will be shown.
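The optimal-estimation inversion mentioned above is, at its core, a MAP update that weighs the measurement misfit against a prior state. A single Gauss-Newton iteration of such a retrieval might be sketched as follows; this is a generic Rodgers-style update, not the authors' code, and `forward` and `jacobian` stand in for the radiative transfer model and its linearization:

```python
import numpy as np

def optimal_estimation_update(x_a, y, forward, jacobian, S_a, S_e):
    """One Gauss-Newton step of a MAP (optimal estimation) retrieval:
    x = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - F(x_a)),
    where x_a is the prior state, S_a/S_e the prior/measurement covariances."""
    K = jacobian(x_a)
    Sa_inv = np.linalg.inv(S_a)
    Se_inv = np.linalg.inv(S_e)
    gain = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
    return x_a + gain @ (y - forward(x_a))
```

In the linear, weak-prior limit the update reduces to ordinary least squares, which makes it easy to verify.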

  9. The role of camera-bundled image management software in the consumer digital imaging value chain

    NASA Astrophysics Data System (ADS)

    Mueller, Milton; Mundkur, Anuradha; Balasubramanian, Ashok; Chirania, Virat

    2005-02-01

    This research was undertaken by the Convergence Center at the Syracuse University School of Information Studies (www.digital-convergence.info). Project ICONICA, the name for the research, focuses on the strategic implications of digital Images and the CONvergence of Image management and image CApture. Consumer imaging - the activity that we once called "photography" - is now recognized as being in the throes of a digital transformation. At the end of 2003, market researchers estimated that about 30% of the households in the U.S. and 40% of the households in Japan owned digital cameras. In 2004, of the 86 million new cameras sold (excluding one-time use cameras), a majority (56%) were estimated to be digital cameras. Sales of photographic film, while still profitable, are declining precipitously.

  10. Using Commercial Digital Cameras and Structure-from-Motion Software to Map Snow Cover Depth from Small Aircraft

    NASA Astrophysics Data System (ADS)

    Sturm, M.; Nolan, M.; Larsen, C. F.

    2014-12-01

    A long-standing goal in snow hydrology has been to map snow cover in detail, either as snow depth or as snow water equivalent (SWE), at sub-meter resolution. Airborne LiDAR and aerial photogrammetry have been used successfully for this purpose, but both require significant investments in equipment and substantial processing effort. Here we detail a relatively inexpensive and simple airborne photogrammetric technique that can be used to measure snow depth. The main airborne hardware consists of a consumer-grade digital camera attached to a survey-quality, dual-frequency GPS. Photogrammetric processing is done using commercially available Structure from Motion (SfM) software that does not require ground control points. Digital elevation models (DEMs) are made from snow-free acquisitions in summer and snow-covered acquisitions in winter, and the maps are then differenced to arrive at snow thickness. We tested the accuracy and precision of snow depths measured using this system through 1) a comparison with airborne scanning LiDAR, 2) a comparison of results from two independent and slightly different photogrammetric systems, and 3) a comparison with extensive on-the-ground snow depth measurements. Vertical accuracy and precision are on the order of ±30 cm and ±8 cm, respectively. The accuracy can be made to approach the precision if suitable snow-free ground control points exist and are used to co-register the summer and winter DEMs. Final snow depth accuracy from our series of tests was on the order of ±15 cm. This photogrammetric method substantially lowers the economic and expertise barriers to entry for mapping snow.
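The core of the method, differencing co-registered winter and summer DEMs, is a one-line array operation once the two DEMs share a grid. A minimal sketch; clipping slightly negative depths that arise from measurement noise is an assumed post-processing choice, not necessarily the authors' workflow:

```python
import numpy as np

def snow_depth(dem_winter, dem_summer, min_depth=0.0):
    """Difference co-registered winter (snow-on) and summer (snow-off)
    DEMs; clip small negative values caused by measurement noise."""
    depth = dem_winter - dem_summer
    return np.clip(depth, min_depth, None)
```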

  11. DR with a DSLR: Digital Radiography with a Digital Single-Lens Reflex camera

    PubMed Central

    Fan, Helen; Durko, Heather L.; Moore, Stephen K.; Moore, Jared; Miller, Brian W.; Furenlid, Lars R.; Pradhan, Sunil; Barrett, Harrison H.

    2010-01-01

    An inexpensive, portable digital radiography (DR) detector system for use in remote regions has been built and evaluated. The system utilizes a large-format digital single-lens reflex (DSLR) camera to capture the image from a standard fluorescent screen. The large sensor area allows relatively small demagnification factors and hence minimizes the light loss. The system has been used for initial phantom tests in urban hospitals and Himalayan clinics in Nepal, and it has been evaluated in the laboratory at the University of Arizona by additional phantom studies. Typical phantom images are presented in this paper, and a simplified discussion of the detective quantum efficiency of the detector is given. PMID:21516238

  12. Real-time object tracking for moving target auto-focus in digital camera

    NASA Astrophysics Data System (ADS)

    Guan, Haike; Niinami, Norikatsu; Liu, Tong

    2015-02-01

    Accurately focusing on a moving object is difficult but essential for successfully photographing the target with a digital camera. Because the object often moves randomly and changes its shape frequently, the position and distance of the target must be estimated in real time so that the camera can focus on the object precisely. We propose a new real-time object tracking method for moving-target auto-focus in digital cameras. The video stream in the camera is used for tracking the moving target. A particle filter is used to deal with the target's random movement and shape changes, with color and edge features serving as measurements of the object's state. A parallel processing algorithm is developed to realize real-time particle filter object tracking in the hardware environment of the digital camera. A movement prediction algorithm is also proposed to remove the focus error caused by the difference between the tracking result and the target's real position at the moment the photo is taken. Simulation and experimental results in a digital camera demonstrate the effectiveness of the proposed method. We embedded the real-time object tracking algorithm in the digital camera; the position and distance of the moving target are obtained accurately by object tracking from the video stream. A SIMD processor is applied to perform the parallel processing in real time, achieving a processing time of less than 60 ms per frame on the camera's 162 MHz CPU.
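A bootstrap particle filter of the kind described, with a random-walk motion model and a Gaussian likelihood standing in for the color/edge measurement score, can be sketched as follows. The noise parameters and the simplified position-only measurement are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, rng,
                         motion_std=2.0, meas_std=5.0):
    """One bootstrap-filter iteration: random-walk prediction, Gaussian
    likelihood weighting against the measured target position, then
    resampling with replacement."""
    n = len(particles)
    # Predict: diffuse particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight by a Gaussian likelihood of the measurement
    # (standing in for the color/edge feature score).
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights = weights / weights.sum()
    # Resample to concentrate particles in high-likelihood regions.
    idx = rng.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)

def estimate(particles, weights):
    """Weighted mean of the particle cloud = estimated target position."""
    return np.average(particles, axis=0, weights=weights)
```

The per-particle prediction and weighting loops are independent, which is what makes the SIMD parallelization mentioned above natural.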

  13. Radiometric calibration of digital cameras using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Schall, Martin; Grunwald, Michael; Umlauf, Georg; Franz, Matthias O.

    2015-05-01

    Digital cameras are subject to physical, electronic and optical effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting and different sensitivities of individual pixels. The task of radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes, a regression method widely used in machine learning that is particularly useful in our context. Gaussian process regression is used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and thus are unsuitable for many practical applications; in contrast, a single regression model for an entire image with high spatial resolution leads to a low-quality radiometric calibration, which also limits its practical use. The proposed algorithm is therefore predicated on a partitioning of the pixels such that each pixel partition can be represented by one regression model without quality loss. Partitioning is done by extracting features from the characteristic of each pixel and using them for lexicographic sorting; splitting the sorted data into partitions of equal size yields the final partitions, each of which is represented by its partition center. An individual Gaussian process regression and model selection is performed for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. An experimental comparison of the proposed approach to classical flat-field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
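The partitioning step, lexicographic sorting of per-pixel feature vectors followed by an equal-size split, might look like this. Feature extraction and the per-partition Gaussian process regressions are omitted, and the function name is an assumption:

```python
import numpy as np

def partition_pixels(features, n_parts):
    """Lexicographically sort pixels by their feature vectors (first
    column is the primary key) and split into equal-size partitions.
    Returns the partition index assigned to each pixel."""
    # np.lexsort treats the LAST key as primary, so reverse column order.
    order = np.lexsort(tuple(features[:, i]
                             for i in reversed(range(features.shape[1]))))
    part = np.empty(features.shape[0], dtype=int)
    size = int(np.ceil(features.shape[0] / n_parts))
    for p in range(n_parts):
        part[order[p * size:(p + 1) * size]] = p
    return part
```

Each partition would then be represented by its center pixel and fitted with its own Gaussian process model.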

  14. Design and application of a digital array high-speed camera system

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Yao, Xuefeng; Ma, Yinji; Yuan, Yanan

    2016-03-01

    In this paper, a digital array high-speed camera system is designed and applied in dynamic fracture experiments. First, the design scheme for the 3×3 array digital high-speed camera system is presented, including a 3×3 array light emitting diode (LED) light source unit, a 3×3 array charge coupled device (CCD) camera unit, a timing delay control unit, an optical imaging unit and an impact loading unit. Second, the influence of geometric optical parameters on optical parallax is analyzed based on the geometric optical imaging mechanism. Finally, combining the method of dynamic caustics with the digital high-speed camera system, the dynamic fracture behavior of crack initiation and propagation in a PMMA specimen under low-speed impact is investigated to verify the feasibility of the high-speed camera system.

  15. Issues in implementing services for a wireless web-enabled digital camera

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas

    2001-05-01

    The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be the services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-appliance based communities. In addition, the rapidly increasing need to upload images to the Internet for photo-finishing services, as well as the need to download software upgrades to the camera, is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper evaluates the system implications of deploying recurring revenue services and enterprise connectivity on a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity and security. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to enterprise systems.

  16. Work starts on building world's largest digital camera

    NASA Astrophysics Data System (ADS)

    Kruesi, Liz

    2015-10-01

    The $473m Large Synoptic Survey Telescope (LSST) has moved one step closer to completion after the US Department of Energy (DOE) approved the start of construction for the telescope's $168m 3.2-gigapixel camera.

  17. Investigation of a consumer-grade digital stereo camera

    NASA Astrophysics Data System (ADS)

    Menna, Fabio; Nocerino, Erica; Remondino, Fabio; Shortis, Mark

    2013-04-01

    The paper presents a metric investigation of the Fuji FinePix Real 3D W1 stereo photo-camera. The stereo-camera uses a synchronized Twin Lens-CCD System to acquire two images simultaneously, using two Fujinon 3x optical zoom lenses arranged in an aluminum die-cast frame integrated in a very compact body. The nominal baseline is 77 mm and the resolution of each CCD is 10 megapixels. Given the short baseline and the presence of two optical paths, the investigation aims to evaluate the accuracy of the 3D data that can be produced and the stability of the camera. From a photogrammetric point of view, the interest in this camera is its capability to acquire synchronized image pairs that contain important 3D metric information for many close-range applications (human body parts measurement, rapid prototyping, surveying of archeological artifacts, etc.). Calibration values - for the left and right cameras - at different focal lengths, derived with an in-house software application, are reported together with accuracy analyses. The object coordinates obtained from the bundle adjustment computation for each focal length were compared to reference coordinates of a test range by means of a similarity transformation. Additionally, the article reports on the investigation of the asymmetrical relative orientation between the left and right cameras.
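Comparing bundle-adjustment coordinates to reference coordinates "by means of a similarity transformation" amounts to a least-squares estimation of scale, rotation and translation. A sketch using Umeyama's closed-form solution, a standard method that may differ from the authors' software:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) aligning src points to dst points (Umeyama's method)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    sc, dc = src - mu_s, dst - mu_d
    cov = dc.T @ sc / len(src)                 # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))         # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / sc.var(axis=0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

After alignment, the residuals between transformed object coordinates and the test-range reference quantify the accuracy of the stereo reconstruction.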

  18. Fast measurement of temporal noise of digital camera's photosensors

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    Photo- and videocameras are currently widespread parts of both scientific experimental setups and consumer applications. They are used in optics, radiophysics, astrophotography, chemistry, and various other fields of science and technology such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and videocameras is the noise of the photosensor pixels. A camera photosensor's noise can be divided into random and pattern components: temporal noise comprises the random component, while spatial noise comprises the pattern component. The spatial component is usually several times lower in magnitude than the temporal one, so to a first approximation spatial noise may be neglected. Earlier we proposed a modification of the automatic segmentation of non-uniform targets (ASNT) method for measuring the temporal noise of photo- and videocameras; only two frames are sufficient for noise measurement with the modified method. As a result, the proposed ASNT modification should allow fast and accurate measurement of temporal noise. In this paper, we estimate the light and dark temporal noise of four cameras of different types using the modified ASNT method with only several frames. These cameras are: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PLB781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. We also measured the time elapsed in processing the shots used for temporal noise estimation. The results demonstrate that the proposed ASNT modification makes it possible to quickly obtain the dependence of a camera's full temporal noise on signal value.
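The key property exploited by two-frame noise measurement is that differencing two frames of a static scene cancels the fixed spatial pattern, leaving twice the temporal noise variance. A minimal sketch of that step only; the full ASNT method additionally segments pixels by signal level to obtain the noise-versus-signal dependence:

```python
import numpy as np

def temporal_noise_two_frames(f1, f2):
    """Estimate the temporal noise standard deviation from two frames of
    the same static scene. The frame difference removes the fixed
    spatial pattern; its variance is twice the temporal variance,
    hence the sqrt(2) normalization."""
    d = f1.astype(np.float64) - f2.astype(np.float64)
    return float(np.std(d) / np.sqrt(2.0))
```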

  19. Employing airborne multispectral digital imagery to map Brazilian pepper infestation in south Texas.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A study was conducted in south Texas to determine the feasibility of using airborne multispectral digital imagery for differentiating the invasive plant Brazilian pepper (Schinus terebinthifolius) from other cover types. Imagery obtained in the visible, near infrared, and mid infrared regions of th...

  20. Evaluating airborne multispectral digital video to differentiate giant Salvinia from other features in northeast Texas

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Giant salvinia is one of the world’s most noxious aquatic weeds. Researchers employed airborne digital video imagery and an unsupervised computer analysis to derive a map showing giant salvinia and other aquatic and terrestrial features within a study site located in northeast Texas. The map had a...

  1. CAMAC interface for digitally recording infrared camera images

    NASA Astrophysics Data System (ADS)

    Dyer, G. R.

    1986-06-01

    An instrument has been built to store the digital signals from a modified imaging infrared scanner directly in a digital memory. This procedure avoids the signal-to-noise degradation and dynamic range limitations associated with successive analog-to-digital and digital-to-analog conversions and the analog recording method normally used to store data from the scanner. The technique also allows digital data processing methods to be applied directly to recorded data and permits processing and image reconstruction to be done using either a mainframe or a microcomputer. If a suitable computer and CAMAC-based data collection system are already available, digital storage of up to 12 scanner images can be implemented for less than $1750 in materials cost. Each image is stored as a frame of 60×80 eight-bit pixels, with an acquisition rate of one frame every 16.7 ms. The number of frames stored is limited only by the available memory. Initially, data processing for this equipment was done on a VAX-11/780, but images may also be displayed on the screen of a microcomputer. Software for setting the displayed gray scale, generating contour plots and false-color displays, and subtracting one image from another (e.g., background suppression) has been developed for IBM-compatible personal computers.
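Background suppression by subtracting one stored frame from another, as mentioned above, must guard against underflow in eight-bit data. A small modern sketch of the idea (NumPy used for brevity; the original software ran on a VAX and IBM-compatible PCs):

```python
import numpy as np

def subtract_background(frame, background):
    """Background suppression for 8-bit frames: compute the signed
    difference in a wider type, then clip back into the displayable
    0-255 range."""
    d = frame.astype(np.int16) - background.astype(np.int16)
    return np.clip(d, 0, 255).astype(np.uint8)
```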

  2. DigiCam: fully digital compact camera for SST-1M telescope

    NASA Astrophysics Data System (ADS)

    Aguilar, J. A.; Bilnik, W.; Bogacz, L.; Bulik, T.; Christov, A.; della Volpe, D.; Dyrda, M.; Frankowski, A.; Grudzinska, M.; Grygorczuk, J.; Heller, M.; Idźkowski, B.; Janiak, M.; Jamrozy, M.; Karczewski, M.; Kasperek, J.; Lyard, E.; Marszałek, A.; Michałowski, J.; Moderski, R.; Montaruli, T.; Neronov, A.; Nicolau-Kukliński, J.; Niemiec, J.; Ostrowski, M.; Paśko, P.; Płatos, Ł.; Prandini, E.; Pruchniewicz, R.; Rafalski, J.; Rajda, P. J.; Rameez, M.; Rataj, M.; Rupiński, M.; Rutkowski, K.; Seweryn, K.; Sidz, M.; Stawarz, Ł.; Stodulska, M.; Stodulski, M.; Tokarz, M.; Toscano, S.; Troyano Pujadas, I.; Walter, R.; Wawer, P.; Wawrzaszek, R.; Wiśniewski, L.; Zietara, K.; Ziółkowski, P.; Żychowski, P.

    2014-08-01

    The single-mirror Small Size Telescopes (SST-1M), being built by a sub-consortium of Polish and Swiss institutions of the CTA Consortium, will be equipped with a fully digital camera with a compact photodetector plane based on silicon photomultipliers. The internal trigger signal transmission overhead will be kept low by introducing a high level of integration, achieved by massively deploying state-of-the-art multi-gigabit transceivers, from the ADC flash converters, through the internal data and trigger signal transmission over backplanes and cables, to the camera server's 10 Gb/s Ethernet links. This approach allows fitting the size and weight of the camera exactly to the SST-1M needs while retaining the flexibility of a fully digital design. The solution offers low power consumption, high reliability and a long lifetime. The concept of the camera will be described, along with some construction details and performance results.

  3. Photogrammetry of a 5m Inflatable Space Antenna With Consumer Digital Cameras

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Giersch, Louis R.; Quagliaroli, Jessica M.

    2000-01-01

    This paper discusses photogrammetric measurements of a 5m-diameter inflatable space antenna using four Kodak DC290 (2.1 megapixel) digital cameras. The study had two objectives: 1) Determine the photogrammetric measurement precision obtained using multiple consumer-grade digital cameras and 2) Gain experience with new commercial photogrammetry software packages, specifically PhotoModeler Pro from Eos Systems, Inc. The paper covers the eight steps required using this hardware/software combination. The baseline data set contained four images of the structure taken from various viewing directions. Each image came from a separate camera. This approach simulated the situation of using multiple time-synchronized cameras, which will be required in future tests of vibrating or deploying ultra-lightweight space structures. With four images, the average measurement precision for more than 500 points on the antenna surface was less than 0.020 inches in-plane and approximately 0.050 inches out-of-plane.

  4. Perspective Intensity Images for Co-Registration of Terrestrial Laser Scanner and Digital Camera

    NASA Astrophysics Data System (ADS)

    Liang, Yubin; Qiu, Yan; Cui, Tiejun

    2016-06-01

    Co-registration of a terrestrial laser scanner and a digital camera has been an important topic of research, since reconstruction of visually appealing and measurable models of the scanned objects can be achieved by using both point clouds and digital images. This paper presents an approach for co-registration of a terrestrial laser scanner and a digital camera. A perspective intensity image of the point cloud is first generated using the collinearity equations. Corner points are then extracted from the generated perspective intensity image and the camera image. The fundamental matrix F is estimated from several interactively selected tie points and used to obtain more matches with RANSAC. The 3D coordinates of all matched tie points are directly obtained or estimated using the least squares method. The robustness and effectiveness of the presented methodology are demonstrated by the experimental results. The methods presented in this work may also be used for automatic registration of terrestrial laser scanning point clouds.
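Generating a perspective intensity image starts from the collinearity equations, i.e. projecting each 3D point of the cloud through a pinhole model x = K[R|t]X and writing its intensity at the resulting pixel. A minimal sketch of the projection step; the camera parameters in the test are illustrative:

```python
import numpy as np

def project_points(points, K, R, t):
    """Project Nx3 world points into pixel coordinates with a pinhole
    camera (collinearity equations): x = K [R | t] X."""
    Xc = R @ points.T + t.reshape(3, 1)   # world -> camera frame
    uv = K @ Xc                           # camera -> homogeneous pixels
    return (uv[:2] / uv[2]).T             # perspective division, Nx2
```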

  5. Measuring the image quality of digital-camera sensors by a ping-pong ball

    NASA Astrophysics Data System (ADS)

    Pozo, Antonio M.; Rubiño, Manuel; Castro, José J.; Salas, Carlos; Pérez-Ocón, Francisco

    2014-07-01

    In this work, we present a low-cost experimental setup to evaluate the image quality of digital-camera sensors, which can be implemented in undergraduate and postgraduate teaching. The method consists of evaluating the modulation transfer function (MTF) of digital-camera sensors by speckle patterns using a ping-pong ball as a diffuser, with two handmade circular apertures acting as input and output ports, respectively. To specify the spatial-frequency content of the speckle pattern, it is necessary to use an aperture; for this, we made a slit in a piece of black cardboard. First, the MTF of a digital-camera sensor was calculated using the ping-pong ball and the handmade slit, and then the MTF was calculated using an integrating sphere and a high-quality steel slit. Finally, the results achieved with both experimental setups were compared, showing a similar MTF in both cases.
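The principle behind the speckle-based MTF measurement can be illustrated in 1D: for a spatially white (flat-spectrum) speckle input, the square root of the averaged measured power spectrum is proportional to the system MTF. A simulated sketch; the Gaussian MTF and frame count in the test are illustrative assumptions:

```python
import numpy as np

def estimate_mtf_1d(frames):
    """Average the power spectra of many speckle realizations; with a
    flat input spectrum, sqrt(mean PSD) is proportional to the system
    MTF, normalized here to 1 at zero frequency."""
    psd = np.mean([np.abs(np.fft.rfft(f)) ** 2 for f in frames], axis=0)
    mtf = np.sqrt(psd)
    return mtf / mtf[0]
```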

  6. 2010 A Digital Odyssey: Exploring Document Camera Technology and Computer Self-Efficacy in a Digital Era

    ERIC Educational Resources Information Center

    Hoge, Robert Joaquin

    2010-01-01

    Within the sphere of education, navigating throughout a digital world has become a matter of necessity for the developing professional, as with the advent of Document Camera Technology (DCT). This study explores the pedagogical implications of implementing DCT; to see if there is a relationship between teachers' comfort with DCT and to the…

  7. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible-band and near-infrared-band images in scenes lacking man-made objects, we present an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high-quality multispectral images, with a band-to-band alignment error of the composed multispectral images of less than 2.5 pixels. PMID:26205264
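The homography registration model maps matched points between bands as dst ~ H src; given SIFT/RANSAC inlier matches, H can be estimated with the standard Direct Linear Transform. A compact sketch, with coordinate normalization and the RANSAC loop omitted:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: estimate the 3x3 homography H with
    dst ~ H src from >= 4 point correspondences (no normalization)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: last row of V^T in the SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```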

  8. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.

  9. Estimation of spectral distribution of sky radiance using a commercial digital camera.

    PubMed

    Saito, Masanori; Iwabuchi, Hironobu; Murata, Isao

    2016-01-10

    Methods for estimating spectral distribution of sky radiance from images captured by a digital camera and for accurately estimating spectral responses of the camera are proposed. Spectral distribution of sky radiance is represented as a polynomial of the wavelength, with coefficients obtained from digital RGB counts by linear transformation. The spectral distribution of radiance as measured is consistent with that obtained by spectrometer and radiative transfer simulation for wavelengths of 430-680 nm, with standard deviation below 1%. Preliminary applications suggest this method is useful for detecting clouds and studying the relation between irradiance at the ground and cloud distribution. PMID:26835780
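The linear transformation from RGB counts to polynomial coefficients described above can be fitted by least squares once training pairs are available. A sketch; the polynomial degree, the wavelength normalization, and the function names are assumptions, not the paper's exact formulation:

```python
import numpy as np

def fit_rgb_to_coeffs(rgb, coeffs):
    """Least-squares linear map M with coeffs ≈ rgb @ M, fitted from
    training pairs of RGB counts and known polynomial coefficients."""
    M, *_ = np.linalg.lstsq(rgb, coeffs, rcond=None)
    return M

def spectral_radiance(rgb, M, wavelengths, wl0=555.0):
    """Evaluate the radiance polynomial in normalized wavelength, with
    per-pixel coefficients predicted linearly from RGB counts."""
    c = rgb @ M                                   # (n_pixels, degree + 1)
    x = (np.asarray(wavelengths) - wl0) / 100.0   # normalized wavelength
    V = np.vander(x, c.shape[1], increasing=True)
    return c @ V.T                                # (n_pixels, n_wavelengths)
```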

  10. In-plane displacement and strain measurements using a camera phone and digital image correlation

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2014-05-01

    In-plane displacement and strain measurements of planar objects are performed by processing digital images captured by a camera phone using digital image correlation (DIC). As a convenient communication tool for everyday use, the principal advantages of a camera phone are its low cost, easy accessibility, and compactness. However, when a camera phone is used as a two-dimensional DIC system for mechanical metrology, its assumed imaging model may be slightly altered during the measurement process due to camera misalignment, imperfect loading, sample deformation, and temperature variations of the camera phone, which can produce appreciable errors in the measured displacements. In order to obtain accurate DIC measurements using a camera phone, the virtual displacements caused by these issues are first identified using an unstrained compensating specimen and then corrected by means of a parametric model. The proposed technique is first verified using in-plane translation and out-of-plane translation tests. Then, it is validated through a determination of the tensile strains and elastic properties of an aluminum specimen. Results of the present study show that accurate DIC measurements can be conducted using a common camera phone provided that an adequate correction is employed.

  11. Measurement of noises and modulation transfer function of cameras used in optical-digital correlators

    NASA Astrophysics Data System (ADS)

    Evtikhiev, Nikolay N.; Starikov, Sergey N.; Cheryomkhin, Pavel A.; Krasnov, Vitaly V.

    2012-01-01

    Hybrid optical-digital systems based on diffractive correlators are being actively developed. To correctly estimate the application capabilities of different camera types in optical-digital correlation systems, knowledge of the modulation transfer function (MTF) and of the light-dependent temporal and spatial noises is required. A method for measuring the 2D MTF is presented. The method is based on the random-target method, but instead of a random target a specially created target with a flat power spectrum is used. This allows the MTF to be measured without averaging 1D Fourier spectra over rows or columns, as in the random-target method, and yields the full 2D MTF rather than just two orthogonal cross-sections. A simple method for measuring the dependence of camera temporal noise on the light signal value by shooting a single scene is described. Measurement results for the light and dark spatial and temporal noises of the cameras are presented, along with a procedure for obtaining a camera's light spatial noise portrait (an array of PRNU values for all photosensor pixels). MTF and noise measurements are reported for a consumer photo camera, a machine vision camera, and a video surveillance camera.
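    The single-scene temporal-noise measurement rests on the photon-transfer relation between mean signal and temporal variance; a simulated numpy sketch (shot noise only, not the authors' procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

# One "scene" spanning many signal levels: a graded target exercises the
# whole photon-transfer curve in a single shot.
mean_signal = np.linspace(10, 1000, 2000)              # electrons per pixel
frames = rng.poisson(mean_signal, size=(200, 2000))    # repeated exposures, shot noise only

pixel_mean = frames.mean(axis=0)
pixel_var = frames.var(axis=0, ddof=1)

# For pure shot noise the photon-transfer curve is linear: variance = gain * mean.
# A least-squares slope through the origin estimates the gain (1.0 here).
slope = np.sum(pixel_mean * pixel_var) / np.sum(pixel_mean ** 2)
```

    Binning real pixels by their mean signal in the same way yields the noise-versus-signal curve from a single static scene.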

  12. Lights, Camera, Reflection! Digital Movies: A Tool for Reflective Learning

    ERIC Educational Resources Information Center

    Genereux, Annie Prud'homme; Thompson, William A.

    2008-01-01

    At the end of a biology course entitled Ecology, Evolution, and Genetics, students were asked to consider how their learning experience had changed their perception of either ecology or genetics. Students were asked to express their thoughts in the form of a "digital story" using readily available software to create movies for the purpose of…

  13. The trustworthy digital camera: Restoring credibility to the photographic image

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L.

    1994-01-01

    The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continue to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
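    The authentication idea can be sketched with Python's standard library. Note that this uses a keyed HMAC as a simplified symmetric stand-in for the public-key digital signature the paper proposes; a real trustworthy camera would sign with a private key sealed inside the camera:

```python
import hashlib
import hmac

def tag_image(key: bytes, image_bytes: bytes) -> str:
    """Keyed SHA-256 tag over the raw image bytes: any post-capture
    alteration of the image invalidates the tag."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(key: bytes, image_bytes: bytes, tag: str) -> bool:
    """Constant-time check that the image still matches its tag."""
    return hmac.compare_digest(tag_image(key, image_bytes), tag)
```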

  14. Development of Digital SLR Camera: PENTAX K-7

    NASA Astrophysics Data System (ADS)

    Kawauchi, Hiraku

    The DSLR "PENTAX K-7" comes in an easy-to-carry, minimal yet functional small form factor, a long-inherited identity of the PENTAX brand. Despite its compact body, this camera offers up-to-date, enhanced fundamental features such as a high-quality viewfinder, an enhanced shutter mechanism, extended continuous shooting capabilities, reliable exposure control, and a fine-tuned AF system, as well as a string of the newest technologies such as movie recording capability and an automatic leveling function. The main focus of this article is to reveal the ideas behind the concept of this product and its distinguishing features.

  15. Vegetation indices derived from a modified digital camera in combination with different blocking filters

    NASA Astrophysics Data System (ADS)

    Krainer, Karl; Hammerle, Albin; Wohlfahrt, Georg

    2015-04-01

    Remote and proximal sensing have become valuable and broadly used tools in ecosystem research. Radiation reflected and scattered by vegetation is used to infer information about vegetation biomass, structure, vitality and functioning, to name a few. To this end numerous vegetation indices have been established, which relate reflectance in different wavelength bands to each other. While such indices are usually calculated from reflectance data measured by spectro-radiometers, we conducted a study using a commercially available digital camera from which the infrared (IR) band elimination filter was removed. By removing this filter, the camera sensor became sensitive to IR radiation in addition to the visible spectrum. By comparing measurements from this modified camera with those from a hyperspectral spectro-radiometer over different vegetation types and surfaces, we determined the potential of such a modified camera to measure different vegetation indices. To this end we compared 71 vegetation indices derived from the spectro-radiometer data with 63 indices derived from the modified digital camera. We found that many of these indices featured relatively high correlations. In particular, the rgR (green/red ratio) and the NDI (normalized difference vegetation index) calculated from the modified camera data correlate very well with vegetation indices known to represent the amount and vitality of green biomass, such as the NIDI (normalized infrared vegetation index) and the LIC (curvature index). We thus conclude from this experiment that, given a proper inter-calibration, a commercially available digital camera can be modified and used as a reasonable alternative tool to determine vegetation biomass and/or vitality. In addition to these measurements, different band elimination filters are currently being used to improve the information content of the digital images.

  16. A powerful ethernet interface module for digital camera control

    NASA Astrophysics Data System (ADS)

    Amato, Stephen M.; Geary, John C.

    2012-09-01

    We have found a commercially-available ethernet interface module with sufficient on-board resources to largely handle all timing generation tasks required by digital imaging systems found in astronomy. In addition to providing a high-bandwidth ethernet interface to the controller, it can largely replace the need for special-purpose timing circuitry. Examples for use with both CCD and CMOS imagers are provided.

  17. Digital data from the Great Sand Dunes airborne gravity gradient survey, south-central Colorado

    USGS Publications Warehouse

    Drenth, B.J.; Abraham, J.D.; Grauch, V.J.S.; Labson, V.F.; Hodges, G.

    2013-01-01

    This report contains digital data and supporting explanatory files describing data types, data formats, and survey procedures for a high-resolution airborne gravity gradient (AGG) survey at Great Sand Dunes National Park, Alamosa and Saguache Counties, south-central Colorado. In the San Luis Valley, the Great Sand Dunes survey covers a large part of Great Sand Dunes National Park and Preserve. The data described were collected from a high-resolution AGG survey flown in February 2012, by Fugro Airborne Surveys Corp., on contract to the U.S. Geological Survey. Scientific objectives of the AGG survey are to investigate the subsurface structural framework that may influence groundwater hydrology and seismic hazards, and to investigate AGG methods and resolution using different flight specifications. Funding was provided by an airborne geophysics training program of the U.S. Department of Defense's Task Force for Business & Stability Operations.

  18. Film cameras or digital sensors? The challenge ahead for aerial imaging

    USGS Publications Warehouse

    Light, D.L.

    1996-01-01

    Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at an 11 µm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid-state charge-coupled device linear and area arrays can yield quality resolution (7 to 12 µm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and assembled into a homogeneous scene acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems shows that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system of the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
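    The 432-million-pixel figure follows from simple arithmetic, assuming the standard 9 x 9 inch (228.6 mm) aerial film format:

```python
# Equivalent pixel count of a scanned NAPP frame: a 9 x 9 inch
# (228.6 mm) film frame digitized at an 11 micrometre spot size.
frame_mm = 228.6                         # assumed 9-inch film format
spot_mm = 0.011                          # 11 micrometre scanning spot
pixels_per_side = frame_mm / spot_mm
total_pixels = pixels_per_side ** 2
print(round(total_pixels / 1e6))         # ~432 million pixels
```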

  19. Color calibration of a CMOS digital camera for mobile imaging

    NASA Astrophysics Data System (ADS)

    Eliasson, Henrik

    2010-01-01

    As white balance algorithms employed in mobile phone cameras become increasingly sophisticated by using, e.g., elaborate white-point estimation methods, a proper color calibration is necessary. Without such a calibration, the estimation of the light source for a given situation may go wrong, giving rise to large color errors. At the same time, the demands for efficiency in the production environment require the calibration to be as simple as possible. Thus it is important to find the correct balance between image quality and production efficiency requirements. The purpose of this work is to investigate camera color variations using a simple model where the sensor and IR filter are specified in detail. As input to the model, spectral data of the 24-color Macbeth Colorchecker was used. This data was combined with the spectral irradiance of mainly three different light sources: CIE A, D65 and F11. The sensor variations were determined from a very large population from which 6 corner samples were picked out for further analysis. Furthermore, a set of 100 IR filters were picked out and measured. The resulting images generated by the model were then analyzed in the CIELAB space and color errors were calculated using the ΔE94 metric. The results of the analysis show that the maximum deviations from the typical values are small enough to suggest that a white balance calibration is sufficient. Furthermore, it is also demonstrated that the color temperature dependence is small enough to justify the use of only one light source in a production environment.
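    The ΔE94 metric used for the color errors has a compact closed form; a minimal sketch using the graphic-arts weighting constants, with the first sample's chroma as the reference:

```python
import math

def delta_e94(lab1, lab2, kL=1.0, K1=0.045, K2=0.015):
    """CIE94 colour difference with graphic-arts weights; the first
    sample supplies the reference chroma for the weighting terms."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    dH2 = max(da * da + db * db - dC * dC, 0.0)   # hue term, squared
    sC = 1.0 + K1 * C1
    sH = 1.0 + K2 * C1
    return math.sqrt((dL / kL) ** 2 + (dC / sC) ** 2 + dH2 / sH ** 2)
```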

  20. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that (1) the use of unprocessed image data did not improve the results of image analyses; (2) vignetting had a significant effect, especially for the modified camera; and (3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
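    The two corrections at the heart of such a protocol, flat-field removal of vignetting and an illumination-robust normalized index, can be sketched as follows (illustrative only, not the study's exact processing chain):

```python
import numpy as np

def vignette_correct(image, flat):
    """Divide out the lens falloff measured on a uniform target,
    normalized so the correction equals 1.0 at the frame centre."""
    gain = flat / flat[flat.shape[0] // 2, flat.shape[1] // 2]
    return image / gain

def normalized_index(nir, red):
    """NDVI-style normalized difference, insensitive to the overall
    illumination scaling that remains after flat-fielding."""
    return (nir - red) / (nir + red)
```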

  1. 75 FR 7519 - In the Matter of Certain Digital Cameras; Notice of Commission Determination Not To Review an...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... ] on February 27, 2009 and March 11, 2009. 74 FR 12377-78 (Mar. 24, 2009). The complaint, as... COMMISSION In the Matter of Certain Digital Cameras; Notice of Commission Determination Not To Review an... importation, or the sale within the United States after importation of certain digital cameras by reason...

  2. Developing Mental Imagery Using a Digital Camera: A Study of Adult Vocational Training

    ERIC Educational Resources Information Center

    Ryba, Ken; Selby, Linda; Brown, Roy

    2004-01-01

    This study was undertaken to explore the use of a digital camera for mental imagery training of a vocational task with two young adult men with Down syndrome. The results indicate that these particular men benefited from the use of a collaborative training process that involved mental imagery for learning a series of photocopying operations. An…

  3. Estimating the Infrared Radiation Wavelength Emitted by a Remote Control Device Using a Digital Camera

    ERIC Educational Resources Information Center

    Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol

    2011-01-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)

  4. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  5. A Mid-infrared Digital Electronic Camera System for Assessing Natural Resources

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Water strongly absorbs mid-infrared (1300-2500 nm) radiation, resulting in this region of the spectrum being sensitive to the water content within features. Little information is available on using an electronic digital camera filtered to this region of the spectrum to assess natural resources. Th...

  6. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  7. Development of an XYZ Digital Camera with Embedded Color Calibration System for Accurate Color Acquisition

    NASA Astrophysics Data System (ADS)

    Kretkowski, Maciej; Jablonski, Ryszard; Shimodaira, Yoshifumi

    Acquisition of accurate colors is important in the modern era of widespread exchange of electronic multimedia. The variety of device-dependent color spaces causes trouble with accurate color reproduction. In this paper we present the outline of an accomplished digital camera system with device-independent output formed from tristimulus XYZ values. The outstanding accuracy and fidelity of the acquired color is achieved in our system by employing an embedded color calibration system based on an emissive device generating reference calibration colors with user-defined spectral distributions and chromaticity coordinates. The system was tested by calibrating the camera using 24 reference colors spectrally reproduced from the 24 color patches of the Macbeth Chart. The average color difference (CIEDE2000) was found to be ΔE = 0.83, an outstanding result compared to commercially available digital cameras.

  8. MTF measurement and imaging quality evaluation of digital camera with slanted-edge method

    NASA Astrophysics Data System (ADS)

    Xiang, Chunchang; Chen, Xinhua; Chen, Yuheng; Zhou, Jiankang; Shen, Weimin

    2010-11-01

    The modulation transfer function (MTF) is the spatial frequency response of an imaging system and has become an objective figure of merit for evaluating the quality of both lenses and cameras. The slanted-edge method and its principle for measuring the MTF of a digital camera are introduced in this paper. A setup and software for testing digital cameras were established and developed, respectively. Measurement results with different tilt angles of the knife edge are compared to assess the influence of the tilt angle, and the knife-edge image is carefully denoised to decrease the noise sensitivity of the measurement. Comparisons are made between the results obtained with the slanted-edge method and with the grating-target technique, and their deviation is analyzed.
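    The core of the slanted-edge method, once the edge profile has been projected into an oversampled edge-spread function (ESF), is differentiation followed by a Fourier transform; a 1-D numpy sketch with a synthetic Gaussian-blurred edge:

```python
import numpy as np
from math import erf

# Differentiate the edge-spread function (ESF) to get the line-spread
# function (LSF); the magnitude of its Fourier transform is the MTF.
# The ESF here is a synthetic edge blurred by a Gaussian, sigma = 2 px.
sigma = 2.0
x = np.arange(-64, 64)
esf = np.array([0.5 * (1.0 + erf(v / (sigma * np.sqrt(2)))) for v in x])

lsf = np.gradient(esf)             # ESF -> LSF (central differences)
lsf /= lsf.sum()                   # normalize so that MTF(0) = 1
mtf = np.abs(np.fft.rfft(lsf))
freqs = np.fft.rfftfreq(lsf.size)  # spatial frequency in cycles/pixel
```

    In the actual method the slight tilt of the edge lets many image rows be re-binned into one finely oversampled ESF, pushing the measurement beyond the sensor's native sampling.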

  9. A versatile digital camera trigger for telescopes in the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Schwanke, U.; Shayduk, M.; Sulanke, K.-H.; Vorobiov, S.; Wischnewski, R.

    2015-05-01

    This paper describes the concept of an FPGA-based digital camera trigger for imaging atmospheric Cherenkov telescopes, developed for the future Cherenkov Telescope Array (CTA). The proposed camera trigger is designed to select images initiated by the Cherenkov emission of extended air showers from very-high energy (VHE, E > 20 GeV) photons and charged particles while suppressing signatures from background light. The trigger comprises three stages. A first stage employs programmable discriminators to digitize the signals arriving from the camera channels (pixels). At the second stage, a grid of low-cost FPGAs is used to process the digitized signals for camera regions with 37 pixels. At the third stage, trigger conditions found independently in any of the overlapping 37-pixel regions are combined into a global camera trigger by a few central FPGAs. Trigger prototype boards based on Xilinx FPGAs have been designed, built and tested and were shown to function properly. Using these components a full camera trigger with a power consumption and price per channel of about 0.5 W and 19 €, respectively, can be built. With the described design the camera trigger algorithm can take advantage of pixel information in both the space and the time domain allowing, for example, the creation of triggers sensitive to the time-gradient of a shower image; the time information could also be exploited to adjust online the time window of the acquisition system for pixel data. Combining the results of the parallel execution of different trigger algorithms (optimized, for example, for the lowest and highest energies, respectively) on each FPGA can result in a better response over all photon energies (as demonstrated by Monte Carlo simulation in this work).

  10. Metric Potential of a 3D Measurement System Based on Digital Compact Cameras

    PubMed Central

    Sanz-Ablanedo, Enoc; Rodríguez-Pérez, José Ramón; Arias-Sánchez, Pedro; Armesto, Julia

    2009-01-01

    This paper presents an optical measuring system based on low cost, high resolution digital cameras. Once the cameras are synchronised, the portable and adjustable system can be used to observe living beings, bodies in motion, or deformations of very different sizes. Each of the cameras has been modelled individually and studied with regard to the photogrammetric potential of the system. We have investigated the photogrammetric precision obtained from the crossing of rays, the repeatability of results, and the accuracy of the coordinates obtained. Systematic and random errors are identified in validity assessment of the definition of the precision of the system from crossing of rays or from marking residuals in images. The results have clearly demonstrated the capability of a low-cost multiple-camera system to measure with sub-millimetre precision. PMID:22408520

  11. Comparison of Kodak Professional Digital Camera System images to conventional film, still video, and freeze-frame images

    NASA Astrophysics Data System (ADS)

    Kent, Richard A.; McGlone, John T.; Zoltowski, Norbert W.

    1991-06-01

    Electronic cameras provide near-real-time image evaluation with the benefits of digital storage methods for rapid transmission or computer processing and enhancement of images. But how does their image quality compare to that of conventional film? A standard Nikon F-3 35 mm SLR camera was transformed into an electro-optical camera by replacing the film back with Kodak's KAF-1400V (or KAF-1300L) megapixel CCD array detector back and a processing accessory. Images taken with these Kodak electronic cameras were compared to those using conventional films and to several still video cameras, using quantitative and qualitative methods. Images captured on conventional analog video systems provide a maximum of 450-500 TV lines of resolution, depending upon the camera resolution, storage method, and viewing system resolution. The Kodak Professional Digital Camera System exceeded this resolution and more closely approached that of film.

  12. A novel ultra-high speed camera for digital image processing applications

    NASA Astrophysics Data System (ADS)

    Hijazi, A.; Madhavan, V.

    2008-08-01

    Multi-channel gated-intensified cameras are commonly used for capturing images at ultra-high frame rates. The use of image intensifiers reduces the image resolution and increases the error in applications requiring high-quality images, such as digital image correlation. We report the development of a new type of non-intensified multi-channel camera system that permits recording of image sequences at ultra-high frame rates at the native resolution afforded by the imaging optics and the cameras used. This camera system is based upon the concept of using a sequence of short-duration light pulses of different wavelengths for illumination and using wavelength selective elements in the imaging system to route each particular wavelength of light to a particular camera. As such, the duration of the light pulses controls the exposure time and the timing of the light pulses controls the interframe time. A prototype camera system built according to this concept comprises four dual-frame cameras synchronized with four dual-cavity pulsed lasers producing 5 ns pulses in four different wavelengths. The prototype is capable of recording four-frame full-resolution image sequences at frame rates up to 200 MHz and eight-frame image sequences at frame rates up to 8 MHz. This system is built around a stereo microscope to capture stereoscopic image sequences usable for 3D digital image correlation. The camera system is used for imaging the chip-workpiece interface area during high speed machining, and the images are used to map the strain rate in the primary shear zone.

  13. Arthropod eye-inspired digital camera with unique imaging characteristics

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-06-01

    In nature, arthropods have a remarkably sophisticated class of imaging systems, with a hemispherical geometry, a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. There is great interest in building systems with similar geometries and properties because of numerous potential applications. However, established semiconductor sensor technologies and optics are essentially planar, which poses great challenges for building such systems with hemispherical, compound apposition layouts. With the recent advancement of stretchable optoelectronics, we have successfully developed strategies to build a fully functional artificial apposition compound-eye camera by combining principles of optics, materials and mechanics. The strategies start with fabricating stretchable arrays of thin silicon photodetectors and elastomeric optical elements in planar geometries, which are then precisely aligned, integrated, and elastically transformed into hemispherical shapes. This imaging device has a nearly full hemispherical shape (about 160 degrees), with densely packed artificial ommatidia. The number of ommatidia (180) is comparable to that of the eyes of fire ants and bark beetles. We illustrate key features of the operation of compound eyes through experimental imaging results and quantitative ray-tracing-based simulations. The general strategies shown in this development could be applicable to other compound-eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  14. Development and Utilization of High Precision Digital Elevation Data taken by Airborne Laser Scanner

    NASA Astrophysics Data System (ADS)

    Akutsu, Osamu; Ohta, Masataka; Isobe, Tamio; Ando, Hisamitsu; Noguchi, Takahiro; Shimizu, Masayuki

    2005-03-01

    Disasters caused by heavy rain in urban areas bring damage such as chaos in road and railway transport systems, power failures, breakdown of the telephone system, and submersion of built-up areas, subways and underground shopping arcades. It is important to obtain high-precision elevation data showing the detailed landform, because a slight height difference considerably affects flood damage. Therefore, the Geographical Survey Institute (GSI) is preparing a 5 m grid digital terrain model (DTM) based on precise ground elevation data taken by airborne laser scanner. This paper describes the process and gives an example of the use of the 5 m grid digital data set.

  15. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  16. Optical design of high resolution and large format CCD airborne remote sensing camera on unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Cheng, Xiaowei; Shao, Jie

    2010-11-01

    Unmanned aerial vehicle remote sensing (UAVRS) is low in cost, flexible in task arrangement, and automatic and intelligent in application; it has been used widely for mapping, surveillance, reconnaissance and city planning. Airborne remote sensing missions require sensors with both high resolution and large fields of view, and large-format CCD digital airborne imaging systems are now a reality. A refractive system was designed to meet these requirements with the help of the Code V software. It has a focal length of 150 mm, an F-number of 5.6, a waveband of 0.45-0.7 µm, and a field of view of 20°. It is shown that the modulation transfer function is higher than 0.5 at 55 lp/mm, distortion is less than 0.1%, and image quality reaches the diffraction limit. The system, with its large-format CCD and wide field, can satisfy the demands of wide ground coverage and high resolution. The optical system, with its simple structure, small size and light weight, is well suited to airborne remote sensing.
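    The claim that the design reaches the diffraction limit can be sanity-checked against the incoherent cutoff frequency 1/(λ·F#), here assuming a mid-band wavelength of 550 nm:

```python
# Diffraction-limited cutoff of the stated optics: f_c = 1 / (lambda * F#).
wavelength_mm = 0.55e-3          # assumed mid-band wavelength, 550 nm
f_number = 5.6
cutoff_lp_mm = 1.0 / (wavelength_mm * f_number)
print(round(cutoff_lp_mm))       # ~325 lp/mm, far above the 55 lp/mm test frequency
```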

  17. Self-calibration of digital aerial camera using combined orthogonal models

    NASA Astrophysics Data System (ADS)

    Babapour, Hadi; Mokhtarzade, Mehdi; Valadan Zoej, Mohamad Javad

    2016-07-01

    The emergence of new digital aerial cameras and the diverse designs and technologies used in this type of camera require in-situ calibration. Self-calibration methods, e.g. the Fourier model, are primarily used; however, the additional parameters employed in such methods have not yet adequately modeled the complex multiple distortions found in digital aerial cameras. The present study proposes the Chebyshev-Fourier (CHF) and Jacobi-Fourier (JF) combined orthogonal models. The models are evaluated against multiple distortions using both simulated and real data, the latter derived from an UltraCam digital camera. The results indicate that the JF model is superior to the other methods: in the UltraCam scenario, for example, it improves the planimetric and vertical accuracy over the Fourier model by 18% and 22%, respectively. Furthermore, reductions of 30% and 16% in external and internal correlation, respectively, are obtained with this approach, which is very promising.

  18. Illuminant spectrum estimation using a digital color camera and a color chart

    NASA Astrophysics Data System (ADS)

    Shi, Junsheng; Yu, Hongfei; Huang, Xiaoqiao; Chen, Zaiqing; Tai, Yonghang

    2014-10-01

    Illumination estimation is the main step in color constancy processing and an important prerequisite for digital color image reproduction and many computer vision applications. In this paper, a method for estimating the illuminant spectrum is investigated using a digital color camera and a color chart whose spectral reflectance is known. The method is based on measuring the CIEXYZ values of the chart with the camera. The first step is to obtain the camera's color correction matrix and gamma values by photographing the chart under a standard illuminant. The second step is to photograph the chart under the illuminant to be estimated; the camera's inherent RGB values are converted to standard sRGB values and then to CIEXYZ values for the chart. From the measured CIEXYZ values and the known spectral reflectance of the chart, the spectral power distribution (SPD) of the illuminant is estimated using Wiener estimation and smoothing estimation. To evaluate the performance of the method quantitatively, the goodness-of-fit coefficient (GFC) was used to measure the spectral match, and the CIELAB color difference metric was used to evaluate the color match between color patches under the estimated and actual SPDs. A simulated experiment was carried out to estimate CIE standard illuminants D50 and C using the X-rite ColorChecker 24-color chart, and an actual experiment was carried out to estimate daylight and illuminant A using two consumer-grade cameras and the chart. The experimental results verified the feasibility of the investigated method.

  19. Temporal monitoring of groundcover change using digital cameras

    NASA Astrophysics Data System (ADS)

    Zerger, A.; Gobbett, D.; Crossman, C.; Valencia, P.; Wark, T.; Davies, M.; Handcock, R. N.; Stol, J.

    2012-10-01

    This paper describes the development and testing of an automated method for detecting change in groundcover vegetation in response to kangaroo grazing using visible wavelength digital photography. The research is seen as a precursor to the future deployment of autonomous vegetation monitoring systems (environmental sensor networks). The study was conducted over six months with imagery captured every 90 min and post-processed using supervised image processing techniques. Synchronous manual assessments of groundcover change were also conducted to evaluate the effectiveness of the automated procedures. Results show that for particular cover classes such as Live Vegetation and Bare Ground, there is excellent temporal concordance between automated and manual methods. However, litter classes were difficult to consistently differentiate. A limitation of the method is the inability to effectively deal with change in the vertical profile of groundcover. This indicates that the three-dimensional structure associated with species composition and plant traits plays an important role in shaping future experimental designs. The paper concludes by providing lessons for conducting future groundcover monitoring experiments.

  20. Comparison of fractional vegetation cover derived from digital camera and MODIS NDVI in Mongolia

    NASA Astrophysics Data System (ADS)

    Jaebeom, K.; Jang, K.; Kang, S.

    2014-12-01

    Satellite remote sensing can repeatedly and continuously observe land surface vegetation over large areas, though it requires complex processing to correct errors arising from the atmosphere and topography. On the other hand, imagery captured by digital camera provides several benefits, such as high spatial resolution, a simple acquisition method, and relatively low instrument cost. Furthermore, digital cameras suffer less from atmospheric effects such as path radiance than satellite imagery, and they capture the actual land cover directly. The objective of this study is to compare fractional vegetation cover derived from digital cameras and from the MODIS Normalized Difference Vegetation Index (NDVI) in Mongolia. A total of 670 ground images, including green leaves and soil surface, captured by digital camera at 134 sites in Mongolia from 2011 to 2014 were used to classify the vegetation cover fraction. Thirteen images captured in Mongolia and South Korea were selected to determine the best classification method. Various classification methods, including 4 supervised classifications, 2 unsupervised classifications, and a histogram method, were used to separate green vegetation in the camera images, which were converted to two color spaces: Red-Green-Blue (RGB) and Hue-Intensity-Saturation (HIS). The results were validated against a manually counted dataset from local plant experts. Among the classification methods, the maximum likelihood classification (MLC) with the HIS color space showed the best agreement with the manually counted dataset; the correlation coefficient and root mean square error were 1.008 and 7.88%, respectively. Our preliminary results indicate that MLC with the HIS color space has the potential to classify green vegetation in Mongolia.
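The RGB-to-HIS conversion applied before classification can be sketched with the standard hue/saturation/intensity formulas. This is a generic implementation of that color-space transform, not the paper's own code.

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an (..., 3) array of RGB values in [0, 1] to (H, S, I).

    H is in radians [0, 2*pi); S and I are in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    # Saturation: distance of the minimum channel from the intensity
    s = 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(i, 1e-12)
    # Hue from the standard arccos formulation
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2.0 * np.pi - theta)
    return np.stack([h, s, i], axis=-1)
```

Green vegetation pixels then cluster around a hue of 120° (2π/3 rad), which is what makes HIS attractive for the green/soil separation described above.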

  1. Combining laser scan and photogrammetry for 3D object modeling using a single digital camera

    NASA Astrophysics Data System (ADS)

    Xiong, Hanwei; Zhang, Hong; Zhang, Xiangwei

    2009-07-01

    In the fields of industrial design, artistic design, and heritage conservation, physical objects are usually digitized by reverse engineering through 3D scanning. Laser scanning and photogrammetry are the two main methods used: laser scanning requires a video camera and a laser source, while photogrammetry requires a high-resolution digital still camera. In some 3D modeling tasks the two methods are integrated to obtain satisfactory results. Although much research has been done on combining the results of the two methods, no work has been reported on designing a low-cost integrated device. In this paper, a new 3D scanning system combining laser scanning and photogrammetry using a single consumer digital camera is proposed. Many consumer digital cameras, such as the Canon EOS 5D Mark II, offer both still-photo recording at more than 10 Mpixels and full 1080p HD movie recording, so an integrated scanning system can be designed around such a camera. A square plate glued with coded marks is used to hold the 3D objects, and two straight wooden rulers, also glued with coded marks, can be laid on the plate freely. In the photogrammetry module, the coded marks on the plate define a world coordinate system and serve as a control network to calibrate the camera, and the planes of the two rulers can also be determined; the feature points of the object and a rough volume representation from the silhouettes are obtained in this module. In the laser scanning module, a hand-held line laser is used to scan the object, with the two rulers serving as reference planes to determine the laser's position. The laser scan yields a dense point cloud that can be aligned automatically through the calibrated camera parameters. The final complete digital model is obtained through a new patchwise energy-functional method by fusing the feature points, rough volume, and dense point cloud. The design

  2. CMOS image sensor noise reduction method for image signal processor in digital cameras and camera phones

    NASA Astrophysics Data System (ADS)

    Yoo, Youngjin; Lee, SeongDeok; Choe, Wonhee; Kim, Chang-Yong

    2007-02-01

    Digital images captured with CMOS image sensors suffer from Gaussian and impulsive noise. To reduce this noise efficiently in the Image Signal Processor (ISP), we analyze the noise characteristics along the ISP imaging pipeline where the noise reduction algorithm is performed. Gaussian and impulsive noise reduction methods are proposed for proper ISP implementation in the Bayer domain. The proposed method uses the analyzed noise characteristics to calculate the noise reduction filter coefficients, so noise is reduced adaptively according to the scene. Because noise is amplified and its characteristics change as the sensor signal passes through several processing steps, it is better to remove noise at an early stage of the pipeline; noise reduction is therefore carried out in the Bayer domain. The method was tested on the ISP imaging pipeline with images captured from a Samsung 2M CMOS image sensor test module. The experimental results show that the proposed method removes noise while effectively preserving edges.
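The paper's adaptive filter coefficients are not public, but the impulsive-noise part of Bayer-domain processing can be illustrated with a threshold median filter over same-CFA-phase neighbours (pixels two samples apart share a color in an RGGB mosaic). The window size and threshold below are illustrative assumptions, not the authors' values.

```python
import numpy as np

def bayer_median_denoise(raw, threshold=30):
    """Remove impulsive noise from an RGGB Bayer mosaic.

    A pixel is replaced by the median of its 3x3 same-colour
    neighbourhood (stride-2 in the CFA) only when it deviates from
    that median by more than `threshold`, which preserves edges.
    """
    out = raw.astype(np.int32).copy()
    h, w = raw.shape
    pad = np.pad(out, 2, mode="reflect")
    for y in range(h):
        for x in range(w):
            # Same-colour neighbours lie on a grid with stride 2
            win = pad[y:y + 5:2, x:x + 5:2]   # 3x3 same-CFA-phase window
            med = int(np.median(win))
            if abs(out[y, x] - med) > threshold:
                out[y, x] = med
    return out.astype(raw.dtype)
```

The threshold is what makes the filter "adaptive" in spirit: flat regions and edges below the threshold pass through untouched, while isolated spikes are clipped to the local median.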

  3. Quantitative Evaluation of Surface Color of Tomato Fruits Cultivated in Remote Farm Using Digital Camera Images

    NASA Astrophysics Data System (ADS)

    Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu

    To measure quantitative surface color information of agricultural products together with ambient information during cultivation, a color calibration method for digital camera images and a Web-based remote color-imaging monitoring system were developed. Single-lens reflex and web digital cameras were used for image acquisition. Tomato images through the post-ripening process were taken with the digital camera both in a standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with a standard RGB color chart set up just behind the tomato fruit on a black matte, and a color calibration was carried out. The influence of sunlight could be experimentally eliminated, and the calibrated color information consistently agreed with the standard values acquired in the system throughout the post-ripening process. Furthermore, the surface color change of tomatoes on the tree in a greenhouse was remotely monitored during maturation using digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time course of the tomato surface color change during maturation could be measured using a color parameter calculated from the calibrated color images along with the ambient atmospheric record. This study is an important step toward surface color analysis for simple, rapid evaluation of crop vigor in the field and toward an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.
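Chart-based calibration of this kind is commonly done by linearizing the camera response with a gamma curve and then fitting a 3x3 matrix from camera RGB to the reference patch values by least squares. The sketch below makes those two simplifying assumptions (a single global gamma and a purely linear 3x3 model); the paper's exact procedure may differ.

```python
import numpy as np

def fit_color_correction(camera_rgb, reference_xyz, gamma=2.2):
    """Fit a 3x3 correction matrix from chart patches.

    camera_rgb:    (p, 3) camera RGB of the chart patches in [0, 1]
    reference_xyz: (p, 3) known reference values of the same patches
    """
    linear = camera_rgb ** gamma                 # undo display-referred gamma
    M, *_ = np.linalg.lstsq(linear, reference_xyz, rcond=None)
    return M                                     # apply as: xyz = rgb_lin @ M

def apply_correction(camera_rgb, M, gamma=2.2):
    """Map camera RGB to calibrated values with a fitted matrix."""
    return (camera_rgb ** gamma) @ M
```

With the chart photographed in the same frame as the fruit, the fitted matrix absorbs the illumination change, which is how the influence of sunlight can be eliminated in the field images.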

  4. Low-cost conversion of the Polaroid MD-4 land camera to a digital gel documentation system.

    PubMed

    Porch, Timothy G; Erpelding, John E

    2006-04-30

    A simple, inexpensive design is presented for the rapid conversion of the popular MD-4 Polaroid land camera to a high quality digital gel documentation system. Images of ethidium bromide stained DNA gels captured using the digital system were compared to images captured on Polaroid instant film. Resolution and sensitivity were enhanced using the digital system. In addition to the low cost and superior image quality of the digital system, there is also the added convenience of real-time image viewing through the swivel LCD of the digital camera, wide flexibility of gel sizes, accurate automatic focusing, variable image resolution, and consistent ease of use and quality. Images can be directly imported to a computer by using the USB port on the digital camera, further enhancing the potential of the digital system for documentation, analysis, and archiving. The system is appropriate for use as a start-up gel documentation system and for routine gel analysis. PMID:16472866

  5. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
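Of the DSP functions listed, white balance is the simplest to illustrate. A gray-world version is sketched below; this is a common textbook method chosen for illustration, not necessarily the algorithm the authors implemented.

```python
import numpy as np

def gray_world_awb(img):
    """Gray-world automatic white balance.

    Scales each channel so its mean equals the overall mean,
    assuming the scene averages to neutral gray.
    """
    img = img.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)        # per-channel means
    gains = means.mean() / np.maximum(means, 1e-12)
    return np.clip(img * gains, 0, 255)
```

In a hardware pipeline like the one described, the per-channel gains would be accumulated over a frame and applied as fixed multipliers in the next frame, which keeps the per-pixel cost to one multiply per channel.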

  6. Semantic evaluation of high-resolution low-cost digital camera data for urban classification

    NASA Astrophysics Data System (ADS)

    Frangesch, Alexander; Greiwe, Ansgar; Ehlers, Manfred

    2005-10-01

    Many applications of remote sensing - for example, urban monitoring - require high-resolution image data for a correct determination of object geometry. The desired geometry of an object's surface is derived in different studies by use of well-known segmentation techniques. In this study, we evaluate the influence of the image quality of analog and digital image data on the results of an image segmentation in eCognition. We compare the suitability of analog medium-format camera data with image data produced by a commercial off-the-shelf digital camera during two campaigns in 2003 and 2004. Furthermore, the results of a multiresolution classification of an urban test site using both datasets are presented. An outlook on future work on multiresolution data fusion with hyperspectral data is given at the end of this paper.

  7. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

    We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m³/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers who work in regulated rivers to quantify behavior of aquatic fauna in response to a discharge disturbance.

  8. Measuring the Orbital Period of the Moon Using a Digital Camera

    ERIC Educational Resources Information Center

    Hughes, Stephen W.

    2006-01-01

    A method of measuring the orbital velocity of the Moon around the Earth using a digital camera is described. Separate images of the Moon and stars taken 24 hours apart were loaded into Microsoft PowerPoint and the centre of the Moon marked on each image. Four stars common to both images were connected together to form a "home-made" constellation.…

  9. Use of a new high-speed digital data acquisition system in airborne ice-sounding

    USGS Publications Warehouse

    Wright, David L.; Bradley, Jerry A.; Hodge, Steven M.

    1989-01-01

    A high-speed digital data acquisition and signal averaging system for borehole, surface, and airborne radio-frequency geophysical measurements was designed and built by the US Geological Survey. The system permits signal averaging at rates high enough to achieve significant signal-to-noise enhancement in profiling, even in airborne applications. The first field use of the system took place in Greenland in 1987, recording data on a 150 by 150 km grid centered on the summit of the Greenland ice sheet. About 6000 line-km were flown and recorded using the new system. The data can be used to aid in siting a proposed scientific corehole through the ice sheet.

  10. Monitoring of phenological control on ecosystem fluxes using digital cameras and eddy covariance data

    NASA Astrophysics Data System (ADS)

    Toomey, M. P.; Friedl, M. A.; Hufkens, K.; Sonnentag, O.; Milliman, T. E.; Frolking, S.; Richardson, A. D.

    2012-12-01

    Digital repeat photography is an emerging platform for monitoring land surface phenology. Despite the great potential of digital repeat photography to yield insights into phenological cycles, relatively few studies have compared digital repeat photography to in situ measures of ecosystem fluxes. We used 60 site-years of concurrent camera and eddy covariance data at 13 sites, representing five distinct ecosystem types - temperate deciduous forest, temperate coniferous forest, boreal forest, grasslands and crops - to measure and model phenological controls on carbon and water exchange with the atmosphere. Camera-derived relative greenness was strongly correlated with estimated gross primary productivity among the five ecosystem types and was moderately correlated with water fluxes. Camera-derived canopy development was also compared with phenological phase as predicted by a generalized, bioclimatic phenology model and Moderate Resolution Imaging Spectroradiometer (MODIS) imagery to assess the potential for cross-biome phenological monitoring. This study demonstrates the potential of webcam networks such as PhenoCam (phenocam.unh.edu) to conduct long-term, continental monitoring and modeling of ecosystem response to climate change.
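The "camera-derived relative greenness" used by PhenoCam-style analyses is typically the green chromatic coordinate (GCC), the green fraction of total RGB digital numbers averaged over a canopy region of interest. A minimal sketch:

```python
import numpy as np

def green_chromatic_coordinate(img, roi=None):
    """Relative greenness (GCC) of a camera image.

    img: (h, w, 3) RGB array; roi: optional boolean mask of the canopy.
    Returns the mean G / (R + G + B) over the region of interest.
    """
    img = img.astype(np.float64)
    pixels = img[roi] if roi is not None else img.reshape(-1, 3)
    total = pixels.sum(axis=1)
    valid = total > 0                      # skip all-black pixels
    return float((pixels[valid, 1] / total[valid]).mean())
```

Because GCC is a ratio of channels, it is fairly robust to overall brightness changes between images, which is what makes a seasonal GCC time series comparable across days and weather conditions.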

  11. Using Digital Cameras to Teach about Infrared Radiation and Instrumentation Technology

    NASA Astrophysics Data System (ADS)

    Pompea, S. M.; Croft, S. K.

    1998-12-01

    Digital cameras and image processing are used to create color composite images that illustrate the importance of the near infrared portion of the spectrum in providing additional information about an astronomical object. Demonstrations with digital cameras also help make infrared radiation real to students and illustrate the different aspects of a sensing system including the spectral emission properties of the source, the reflectivity of the object of interest, the use of filters, detector sensitivity, and the use of image processing. Using appropriate, easily available filters, students can demonstrate that two objects that appear green (such as a car and a plant) have very different properties in the near infrared, since chlorophyll in plants is reflective in the near IR. The results can be applied to imaging of the planets to look for chlorophyll features indicative of life. Digital cameras are affordable, relatively common devices which can be used in a wide variety of classroom and experimental settings. As such they can have a profound influence, in conjunction with image processing, on participatory teaching of observational astronomy and in sharing observations across the web. Some other general applications in this area as well as extensions to several areas of spectroscopy will also be discussed. This work was supported by an NSF instructional materials grant as part of the Astronomy Village: Investigating the Solar System development program. S. Pompea is an adjunct faculty member of Steward Observatory, University of Arizona.

  12. A two-camera imaging system for pest detection and aerial application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This presentation reports on the design and testing of an airborne two-camera imaging system for pest detection and aerial application assessment. The system consists of two digital cameras with 5616 x 3744 effective pixels. One camera captures normal color images with blue, green and red bands, whi...

  13. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    NASA Astrophysics Data System (ADS)

    Song, Huaibo; Yang, Chenghai; Zhang, Jian; Hoffmann, Wesley Clint; He, Dongjian; Thomasson, J. Alex

    2016-01-01

    Images captured from airborne imaging systems can be mosaicked for diverse remote sensing applications. The objective of this study was to identify appropriate mosaicking techniques and software to generate mosaicked images for use by aerial applicators and other users. Three software packages (Photoshop CC, Autostitch, and Pix4Dmapper) were selected for mosaicking airborne images acquired over a large cropping area. Ground control points were collected for georeferencing the mosaicked images and for evaluating the accuracy of eight mosaicking techniques. Analysis and accuracy assessment showed that Pix4Dmapper can be the first choice if georeferenced imagery with high accuracy is required; the spherical method in Photoshop CC can be an alternative for cost considerations, and Autostitch can be used to quickly mosaic images at reduced spatial resolution. The results also showed that the accuracy of image mosaicking techniques can be greatly affected by the size of the imaging area or the number of images, and that accuracy was higher for a small area than for a large one. The results from this study will provide useful information for the selection of image mosaicking software and techniques for aerial applicators and other users.

  14. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low-cost airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic, real-time, long-term tracking of individual vehicles is presented. It combines detection and tracking of the vehicle in low-frame-rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages that occur in complex traffic situations, e.g. tunnels. The CTM uses the traffic conditions in the proximity of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500 m in a complex urban traffic situation. An artificial tracking outage of 240 m is simulated and handled by the CTM: all vehicles in close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared, showing good correspondence even in congested traffic.
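The density-flow relations estimated from the surrounding traffic feed a standard CTM update: each interface passes the minimum of the upstream cell's sending capacity (demand) and the downstream cell's receiving capacity (supply). A sketch with a triangular fundamental diagram follows; all parameter values are illustrative, not from the paper.

```python
import numpy as np

def ctm_step(rho, v=25.0, w=6.0, rho_jam=0.15, q_max=0.5, dx=100.0, dt=2.0):
    """One Cell Transmission Model update with a triangular fundamental diagram.

    rho: vehicle density per cell [veh/m]; v: free-flow speed [m/s];
    w: backward wave speed [m/s]; q_max: capacity [veh/s].
    The inlet is closed (zero inflow) and the outlet unrestricted.
    """
    demand = np.minimum(v * rho, q_max)               # what each cell can send
    supply = np.minimum(q_max, w * (rho_jam - rho))   # what each cell can receive
    flow = np.minimum(demand[:-1], supply[1:])        # interface flows
    flow = np.concatenate([[0.0], flow, [demand[-1]]])
    return rho + dt / dx * (flow[:-1] - flow[1:])     # conservation update
```

Iterating this update over the outage window propagates the target vehicle's expected position with the surrounding flow, which is how the model predicts where the vehicle reappears.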

  15. Digital EO camera system VOS 40/270 for tactical reconnaissance pod RecceLite

    NASA Astrophysics Data System (ADS)

    Uhl, Bernd

    2002-11-01

    This paper describes the design concept and technical features of a newly developed EO camera. The camera is integrated on a stabilized sensor platform of a tactical reconnaissance pod for extended pointing capability and wide-field ground coverage. One of the major design constraints was the extremely small space available in the front gimbal, shared with a high-resolution IR FPA sensor in its other half. The lines of sight of both sensors are harmonized, yielding multi-spectral information about the same ground target. The sensor features a high-resolution zoom lens, a large FPA detector, and automatic focus and exposure control. A unique feature of the camera is automatic mass compensation: when the zoom lens elements change position during a field-of-view change, the balance of the sensor on the gimbal is maintained. The VOS 40/270 camera represents a new generation of tactical reconnaissance sensors for stabilized platform integration, and its design employs the latest emerging technologies in an all-digital reconnaissance system.

  16. Use of a Digital Camera to Monitor the Growth and Nitrogen Status of Cotton

    PubMed Central

    Jia, Biao; He, Haibing; Ma, Fuyu; Diao, Ming; Jiang, Guiying; Zheng, Zhong; Cui, Jin; Fan, Hua

    2014-01-01

    The main objective of this study was to develop a nondestructive method for monitoring cotton growth and N status using a digital camera. Digital images were taken of the cotton canopies between emergence and full bloom. Green and red values were extracted from the digital images and used to calculate canopy cover. The values of canopy cover were closely correlated with the normalized difference vegetation index and the ratio vegetation index measured with a GreenSeeker handheld sensor. Models were calibrated to describe the relationship between canopy cover and three growth properties of the cotton crop (aboveground total N content, LAI, and aboveground biomass). There were close exponential relationships between canopy cover and all three growth properties, with the relationship for aboveground total N content the most precise: the coefficient of determination (R2) was 0.978 and the root mean square error (RMSE) was 1.479 g m−2. The models were then validated in three high-yield cotton fields, where the best relationship between canopy cover and aboveground total N content had an R2 of 0.926 and an RMSE of 1.631 g m−2. In conclusion, as a near-ground remote assessment tool, digital cameras have good potential for monitoring cotton growth and N status. PMID:24723817
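An exponential relationship of the kind described can be calibrated with a log-linear least-squares fit. The sketch below assumes the model form n = a * exp(b * cc); the parameterization is an illustrative assumption, since the abstract does not give the fitted equation.

```python
import numpy as np

def fit_exponential(canopy_cover, n_content):
    """Fit n = a * exp(b * cc) by linear regression on log(n).

    Returns (a, b, r_squared) where r_squared is the R^2 of the
    log-linear fit.
    """
    x = np.asarray(canopy_cover, dtype=float)
    y = np.log(np.asarray(n_content, dtype=float))
    b, log_a = np.polyfit(x, y, 1)          # straight line in log space
    yhat = log_a + b * x
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return np.exp(log_a), b, 1.0 - ss_res / ss_tot
```

Fitting in log space weights relative rather than absolute errors, which is usually appropriate when the response spans an order of magnitude over the season.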

  17. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost, increasing functionality and stability of ready-to-fly multi-copter systems has proliferated their use among non-hobbyists. As such, they became a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts indicate that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. 
As such, the

  18. Greenness indices from digital cameras predict the timing and seasonal dynamics of canopy-scale photosynthesis.

    PubMed

    Toomey, Michael; Friedl, Mark A; Frolking, Steve; Hufkens, Koen; Klosterman, Stephen; Sonnentag, Oliver; Baldocchi, Dennis D; Bernacchi, Carl J; Biraud, Sebastien C; Bohrer, Gil; Brzostek, Edward; Burns, Sean P; Coursolle, Carole; Hollinger, David Y; Margolis, Hank A; Mccaughey, Harry; Monson, Russell K; Munger, J William; Pallardy, Stephen; Phillips, Richard P; Torn, Margaret S; Wharton, Sonia; Zeri, Marcelo; Richardson, Andrew D

    2015-01-01

    The proliferation of digital cameras co-located with eddy covariance instrumentation provides new opportunities to better understand the relationship between canopy phenology and the seasonality of canopy photosynthesis. In this paper we analyze the abilities and limitations of canopy color metrics measured by digital repeat photography to track seasonal canopy development and photosynthesis, determine phenological transition dates, and estimate intra-annual and interannual variability in canopy photosynthesis. We used 59 site-years of camera imagery and net ecosystem exchange measurements from 17 towers spanning three plant functional types (deciduous broadleaf forest, evergreen needleleaf forest, and grassland/crops) to derive color indices and estimate gross primary productivity (GPP). GPP was strongly correlated with greenness derived from camera imagery in all three plant functional types. Specifically, the beginning of the photosynthetic period in deciduous broadleaf forest and grassland/crops and the end of the photosynthetic period in grassland/crops were both correlated with changes in greenness; changes in redness were correlated with the end of the photosynthetic period in deciduous broadleaf forest. However, it was not possible to accurately identify the beginning or ending of the photosynthetic period using camera greenness in evergreen needleleaf forest. At deciduous broadleaf sites, anomalies in integrated greenness and total GPP were significantly correlated up to 60 days after the mean onset date for the start of spring. More generally, results from this work demonstrate that digital repeat photography can be used to quantify both the duration of the photosynthetically active period as well as total GPP in deciduous broadleaf forest and grassland/crops, but that new and different approaches are required before comparable results can be achieved in evergreen needleleaf forest. PMID:26255360

  19. Digital Elevation Model from Non-Metric Camera in Uas Compared with LIDAR Technology

    NASA Astrophysics Data System (ADS)

    Dayamit, O. M.; Pedro, M. F.; Ernesto, R. R.; Fernando, B. L.

    2015-08-01

    Digital Elevation Model (DEM) data, as a representation of surface topography, are in high demand for spatial analysis and modelling. To that end, many methods of acquiring and processing data have been developed, from traditional surveying to modern technologies such as LIDAR. On the other hand, in the past four years the development of Unmanned Aerial Systems (UAS) for geomatics has brought the possibility of acquiring surface data with an on-board non-metric digital camera in a short time and with good quality for many analyses. UAS have attracted tremendous attention for tasks such as determining volume changes over time, monitoring breakwaters, and hydrological modelling including flood simulation and drainage networks, all of which rely on DEMs for proper analysis. DEM quality is considered a combination of DEM accuracy and DEM suitability, so this paper analyses the quality of a DEM from a non-metric digital camera on a UAS compared with a DEM from LIDAR covering the same 4 km2 of geographic space in Artemisa province, Cuba. The area is part of an urban planning effort that needs to know the topographic characteristics in order to analyse hydrological behaviour and decide the best places for roads, buildings, and so on. Because LIDAR remains the more accurate method, it offers a reference against which to test DEMs from non-metric digital cameras on UAS, which are much more flexible and provide a solution for many applications that need detailed DEMs.

  20. Digital Camera Calibration Using Images Taken from an Unmanned Aerial Vehicle

    NASA Astrophysics Data System (ADS)

    Pérez, M.; Agüera, F.; Carvajal, F.

    2011-09-01

For calibrating a camera, an accurate determination of the interior orientation parameters is needed, and for the most accurate results the calibration images should be taken under conditions similar to those of the field samples. The aim of this work is the establishment of an efficient and accurate digital camera calibration method for particular working conditions, such as those found in our UAV (Unmanned Aerial Vehicle) photogrammetric projects. The UAV used in this work was the md4-200, manufactured by Microdrones. The microdrone is equipped with a standard non-metric digital camera, the Pentax Optio A40. To find the interior orientation parameters of the digital camera, two calibration methods were carried out: a lab calibration based on a flat pattern and a field calibration. In both cases, Photomodeler Scanner software was used. The lab calibration process was completely automatic, using a calibration grid; the focal length was fixed at the widest angle and the network included a total of twelve images with ±90º roll angles. For the field calibration, a flight plan was programmed, also including a total of twelve images, with the focal length again fixed at the widest angle. The field test used in the study was a flat surface located on the University of Almería campus, on which a set of 67 target points was placed. The calibration field area was approximately 25 × 25 m and the flight altitude above ground was 50 m. After the software processing, the camera calibration parameter values were obtained. The paper presents the process, the results, and the accuracy of these calibration methods. The field calibration method reduced the final total error obtained in the previous lab calibration. Furthermore, the overall RMSs obtained from both methods are similar. Therefore we will apply the field calibration results to all our photogrammetric projects in which the flight high
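The "overall RMS" compared between the two calibrations is, in essence, the root-mean-square of image-plane residuals between measured target points and the points reprojected through the estimated camera model. A minimal sketch of that metric (the point coordinates are made up; this is not the Photomodeler computation itself):

```python
import math

def rms_reprojection_error(observed, reprojected):
    """RMS of Euclidean residuals (in pixels) between measured image points
    and the same points reprojected through the calibrated camera model."""
    if len(observed) != len(reprojected):
        raise ValueError("point lists must have equal length")
    sq = [(xo - xr) ** 2 + (yo - yr) ** 2
          for (xo, yo), (xr, yr) in zip(observed, reprojected)]
    return math.sqrt(sum(sq) / len(sq))

obs = [(100.0, 200.0), (150.0, 250.0)]   # measured target centers
rep = [(100.3, 199.6), (149.7, 250.4)]   # model reprojections
err = rms_reprojection_error(obs, rep)
```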

  1. Camera model and calibration process for high-accuracy digital image metrology of inspection planes

    NASA Astrophysics Data System (ADS)

    Correia, Bento A. B.; Dinis, Joao

    1998-10-01

High-accuracy digital image-based metrology must rely on an integrated model of image generation that simultaneously considers the geometry of camera vs. object positioning and the conversion of the optical image on the sensor into an electronic digital format. In applications of automated visual inspection involving the analysis of approximately planar objects, these models are generally simplified in order to facilitate camera calibration. In this context, the lack of rigor in the determination of the intrinsic parameters of such models is particularly relevant. Aiming at high-accuracy metrology of the contours of objects lying on an analysis plane, involving sub-pixel measurements, this paper presents a three-stage camera model that includes an extrinsic component of perspective distortion and the intrinsic components of radial lens distortion and sensor misalignment. The latter two factors are crucial in machine vision applications that rely on low-cost optical components. A polynomial model for the negative radial lens distortion of wide-field-of-view CCTV lenses is also established.
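A polynomial radial distortion model of the kind described, with a negative leading coefficient producing the barrel distortion typical of wide-angle CCTV lenses, can be sketched as follows. The coefficient values are illustrative and not taken from the paper:

```python
def apply_radial_distortion(x, y, k1, k2=0.0):
    """Map ideal (undistorted) normalized image coordinates to distorted ones
    using the common polynomial model x_d = x * (1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# Negative k1 (barrel distortion) pulls off-axis points toward the image center
xd, yd = apply_radial_distortion(0.5, 0.0, k1=-0.1)
```

Calibration estimates k1 (and k2) from measured point residuals; correction then amounts to inverting this mapping, usually iteratively.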

  2. Retrieval of cloud optical properties using airborne hyperspectral cameras during the VOCALS campaign.

    NASA Astrophysics Data System (ADS)

    Labrador, L.; Vaughan, G.

    2009-09-01

Two hyperspectral imaging sensors have been used to analyze the optical properties of stratocumulus cloud off the coast of northern Chile within the framework of the VAMOS Ocean-Cloud-Atmosphere-Land Study (VOCALS) during September-October 2008. The SPECIM Aisa Eagle and Hawk are tandem pushbroom-type hyperspectral imagers scanning the 400-970 and 970-2500 nm ranges, respectively. The instruments were mounted on board the Natural Environment Research Council's (NERC) Dornier DO-228 aircraft, based in Arica, northern Chile, during the campaign. An area of approximately 600 x 200 km was surveyed off the northern coast of Chile, and a total of 14 science flights were carried out in which hyperspectral data were successfully collected over the stratocumulus deck at altitudes varying between 10000 and 15000 ft. Cloud optical properties, such as cloud optical thickness, cloud effective radius, and liquid water path, can be retrieved and then compared with retrievals from space-borne hyperspectral imagers. Atmospheric corrections have been applied to enable comparison between the different types of sensors, and the analysis requires, amongst other things, solving the back-scattering problems associated with off-nadir views. The high spatial and temporal resolution of these airborne sensors makes them ideal for validating satellite retrievals of cloud optical properties.

  3. Teaching with Technology: Step Back and Hand over the Cameras! Using Digital Cameras to Facilitate Mathematics Learning with Young Children in K-2 Classrooms

    ERIC Educational Resources Information Center

    Northcote, Maria

    2011-01-01

    Digital cameras are now commonplace in many classrooms and in the lives of many children in early childhood centres and primary schools. They are regularly used by adults and teachers for "saving special moments and documenting experiences." The use of previously expensive photographic and recording equipment has often remained in the domain of…

  4. Improvements in remote cardiopulmonary measurement using a five band digital camera.

    PubMed

    McDuff, Daniel; Gontarek, Sarah; Picard, Rosalind W

    2014-10-01

Remote measurement of the blood volume pulse via photoplethysmography (PPG) using digital cameras and ambient light has great potential for healthcare and affective computing. However, traditional RGB cameras have limited frequency resolution. We present results of PPG measurements from a novel five-band camera and show that alternate frequency bands, in particular an orange band, allowed physiological measurements much more highly correlated with an FDA-approved contact PPG sensor. In a study with participants (n = 10) at rest and under stress, correlations of over 0.92 (p < 0.01) were obtained for heart rate, breathing rate, and heart rate variability measurements. In addition, the remotely measured heart rate variability spectrograms closely matched those from the contact approach. The best results were obtained using a combination of cyan, green, and orange (CGO) bands; incorporating red and blue channel observations did not improve performance. In short, RGB is not optimal for this problem: CGO is better. Incorporating alternative color channel sensors should not dramatically increase the cost of such cameras. PMID:24835124

  5. The cryogenic multiplexer and shift register for submillimeter-wave digital camera

    NASA Astrophysics Data System (ADS)

    Hibi, Yasunori; Matsuo, Hiroshi; Arai, Hideaki; Nagata, Hirohisa; Ikeda, Hirokazu; Fujiwara, Mikio

    2009-11-01

We have been developing cryogenic readout integrated circuits using Sony GaAs JFETs for large-format arrays of high-impedance detectors, especially for submillimeter/terahertz astronomy. The GaAs JFETs manufactured by Sony Co., Ltd. have excellent static properties below 1 K. These JFETs also perform well as electrical switches: they have very low gate capacitance (<50 fF), low on-resistance (~10 kΩ), and high off-resistance (>100 TΩ). To realize a cryogenic readout system for a submillimeter-wave/terahertz camera, we designed multiplexers with sample-and-holds and shift registers. We report the first test results for each circuit and show the prospects for a cryogenic multiplexing system for a submillimeter-wave/terahertz digital camera.

  6. Development of High Speed Digital Camera: EXILIM EX-F1

    NASA Astrophysics Data System (ADS)

    Nojima, Osamu

The EX-F1 is a high-speed digital camera featuring a revolutionary improvement in burst shooting speed that is expected to create entirely new markets. This model incorporates a high-speed CMOS sensor and a high-speed LSI processor. With this model, CASIO has achieved an ultra-high-speed 60 frames per second (fps) burst rate for still images, together with 1,200 fps high-speed movie capture that records movements too fast for the human eye to see. Moreover, this model can record movies in full high definition. After its launch, it received high praise as an innovative camera. We introduce the concept, features, and technologies of the EX-F1.

  7. Technique for improving the quality of images from digital cameras using ink-jet printers and smoothed RGB transfer curves

    NASA Astrophysics Data System (ADS)

    Sampat, Nitin; Grim, John F.; O'Hara, James E.

    1998-04-01

The digital camera market is growing at an explosive rate. At the same time, the quality of photographs printed on ink-jet printers continues to improve. Most consumer cameras are designed with the monitor, not the printer, as the target output device. When users print images from a camera, they need to optimize the camera and printer combination in order to maximize image quality. We describe the details of one such method for improving image quality using an AGFA digital camera and ink-jet printer combination. Using Adobe Photoshop, we generated optimum red, green, and blue transfer curves that match the scene content to the printer's output capabilities. Application of these curves to the original digital image resulted in a print with more shadow detail, no loss of highlight detail, a smoother tone scale, and more saturated colors; the result was visually more pleasing than images captured and printed without any 'correction'. While we report the results for one camera-printer combination, we tested this technique on numerous digital camera and printer combinations and in each case produced a better-looking image. We also discuss the problems we encountered in implementing this technique.
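A per-channel transfer curve of the kind described is, in effect, a look-up table applied independently to each 8-bit channel. A minimal sketch: the gamma-shaped curve below is a simple stand-in for the hand-tuned curves the authors generated in Photoshop, not their actual curves:

```python
def apply_transfer_curve(channel_values, curve):
    """Apply a 256-entry transfer look-up table to 8-bit channel values."""
    return [curve[v] for v in channel_values]

def gamma_curve(gamma):
    """Smooth monotone tone curve as a 256-entry LUT (illustrative stand-in
    for a hand-tuned curve); gamma < 1 lifts shadows, gamma > 1 deepens them."""
    return [round(255 * (v / 255) ** gamma) for v in range(256)]

# Lift shadow detail on the red channel with gamma < 1
red = [0, 64, 128, 255]
adjusted = apply_transfer_curve(red, gamma_curve(0.8))
```

Black and white points are preserved (0 maps to 0, 255 to 255), which matches the reported outcome of gaining shadow detail without losing highlights.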

  8. Instrument-quality digital camera that transitioned to low-cost high-volume production

    NASA Astrophysics Data System (ADS)

    Mandl, William J.

    2002-12-01

MOSAD(copyright), Multiplexed OverSample Analog-to-Digital conversion, is a low-power, on-focal-plane analog-to-digital (A/D) process that places an oversampling A/D converter at each pixel site. Two full-custom designs for a visible-light staring array were developed with this approach: one uses a silicon photodiode in combination with photogates at the pixel, and the other uses an all-photogate sensor for detection. Both arrays were designed in a 320x240 format with pixels on 16 micron centers. The system includes the camera assembly, a driver interface assembly, a frame grabber board with integrated decimator, and Windows 2000-compatible software for real-time image display. The camera includes the sensor, either photogate or photodiode, mounted on a PC card with support electronics. A custom lens mount attaches the camera to a C- or CS-mount lens; testing was done with a Tamron 13VM2812 CCTV CS-mount lens. Both RS644 and RS422 parallel interface card assemblies were developed to attach to the frame grabber board. The final-iteration cameras were tested at the Amain facility and pictures were taken. At 400 samples per second, measured on-chip power consumption is under 10 milliwatts. Noise measurements at sample rates from 400 to 1,600 samples per second were taken for both parts. The photodiode version worked and produced images, but a sense amplifier problem prevented adequate noise measurement. At 28 times oversampling, the photogate version achieved a typical 9 to 11 bits signal-to-noise ratio, with a best case of 13 bits. Nonuniformity variation was below the noise floor.

  9. A novel multi-digital camera system based on tilt-shift photography technology.

    PubMed

    Sun, Tao; Fang, Jun-Yong; Zhao, Dong; Liu, Xue; Tong, Qing-Xi

    2015-01-01

Multi-digital camera systems (MDCSs) are constantly being improved to meet the increasing requirements for high-resolution spatial data. This study identifies the insufficiencies of traditional MDCSs and proposes a new category of MDCS based on tilt-shift photography to improve the ability of the MDCS to acquire high-accuracy spatial data. A prototype system, including two or four tilt-shift cameras (TSCs; camera model: Nikon D90), was developed to validate the feasibility and correctness of the proposed MDCS. As with the cameras of traditional MDCSs, calibration is essential for the TSCs of the new MDCS. The study constructs indoor control fields and proposes appropriate calibration methods for the TSC, including a digital distortion model (DDM) approach and a two-step calibration strategy. The characteristics of the TSC, such as its edge distortion, are analyzed in detail via a calibration experiment. Finally, the ability of the new MDCS to acquire high-accuracy spatial data is verified through flight experiments. The results illustrate that the geo-position accuracy of the prototype system reaches 0.3 m at a flight height of 800 m, with a spatial resolution of 0.15 m. In addition, a comparison between a traditional MDCS (MADC II) and the proposed MDCS demonstrates that the latter (0.3 m) provides spatial data with higher accuracy than the former (only 0.6 m) under the same conditions. We also believe that using higher-accuracy TSCs in the new MDCS should further improve the accuracy of senior photogrammetric products. PMID:25835187

  10. A Novel Multi-Digital Camera System Based on Tilt-Shift Photography Technology

    PubMed Central

    Sun, Tao; Fang, Jun-yong; Zhao, Dong; Liu, Xue; Tong, Qing-xi

    2015-01-01

Multi-digital camera systems (MDCSs) are constantly being improved to meet the increasing requirements for high-resolution spatial data. This study identifies the insufficiencies of traditional MDCSs and proposes a new category of MDCS based on tilt-shift photography to improve the ability of the MDCS to acquire high-accuracy spatial data. A prototype system, including two or four tilt-shift cameras (TSCs; camera model: Nikon D90), was developed to validate the feasibility and correctness of the proposed MDCS. As with the cameras of traditional MDCSs, calibration is essential for the TSCs of the new MDCS. The study constructs indoor control fields and proposes appropriate calibration methods for the TSC, including a digital distortion model (DDM) approach and a two-step calibration strategy. The characteristics of the TSC, such as its edge distortion, are analyzed in detail via a calibration experiment. Finally, the ability of the new MDCS to acquire high-accuracy spatial data is verified through flight experiments. The results illustrate that the geo-position accuracy of the prototype system reaches 0.3 m at a flight height of 800 m, with a spatial resolution of 0.15 m. In addition, a comparison between a traditional MDCS (MADC II) and the proposed MDCS demonstrates that the latter (0.3 m) provides spatial data with higher accuracy than the former (only 0.6 m) under the same conditions. We also believe that using higher-accuracy TSCs in the new MDCS should further improve the accuracy of senior photogrammetric products. PMID:25835187

  11. Retrieval of water quality algorithms from airborne HySpex camera for oxbow lakes in north-eastern Poland

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Berezowski, Tomasz; Frąk, Magdalena; Chormański, Jarosław

    2016-04-01

The aim of this study was to retrieve empirical formulas for the water quality of oxbow lakes in the Lower Biebrza Basin (NE Poland) using the HySpex airborne imaging spectrometer. The Biebrza River is one of the biggest wetlands in Europe. It is characterised by low contamination levels and little human influence, and can therefore be treated as a reference area for other floodplain and fen ecosystems in Europe. Oxbow lakes are an important part of the Lower Biebrza Basin due to their retention and habitat functions. Hyperspectral remote sensing data were acquired by the HySpex sensor (which covers the range 400-2500 nm) on 01-02.08.2015, with a ground measurement campaign conducted on 03-04.08.2015. The ground measurements consisted of two parts. The first part included spectral reflectance sampling with an ASD FieldSpec 3 spectroradiometer, which covers the wavelength range 350-2500 nm at 1 nm intervals; in situ data were collected both for water and for specific objects within the area. The second part of the campaign measured water parameters such as Secchi disc depth (SDD), electric conductivity (EC), pH, temperature, and phytoplankton. The measured reflectance enabled an empirical line atmospheric correction, which was applied to the HySpex data. Our results indicated that proper atmospheric correction was very important for further data analysis. The empirical formulas for the water parameters were retrieved from the reflectance data. This study confirmed the applicability of the HySpex camera for retrieving water quality.

  12. Development of a computer-aided alignment simulator for an EO/IR dual-band airborne camera

    NASA Astrophysics Data System (ADS)

    Lee, Jun Ho; Ryoo, Seungyeol; Park, Kwang-Woo; Lee, Haeng Bok

    2012-10-01

An airborne sensor was developed for remote sensing on an unmanned aerial vehicle (UAV). The sensor is an optical payload for an electro-optical/infrared (EO/IR) dual-band camera that combines visible and IR imaging capabilities in a compact and lightweight manner. It adopts a Ritchey-Chrétien telescope for the common front-end optics, with relay optics that divide and deliver the EO and IR bands to a charge-coupled device (CCD) and an IR detector, respectively. To ease the assembly of such complicated optics, a computer-aided alignment program (herein called the simulator) was developed. The simulator first estimates the details of the misalignments, such as their locations, types, and amounts, from test results such as the modulation transfer function (MTF), Zernike polynomial coefficients, and RMS wavefront errors at different field positions. It then recommends the compensator movement(s) together with the estimated optical performance. The simulator is coded in MATLAB with a hidden connection to the optical analysis/design software ZEMAX. By interfacing ZEMAX and MATLAB, the GUI-based alignment simulator will help even those not familiar with the two programs to obtain accurate results more easily and quickly.

  13. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
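The bivariate model described, a regression on two predictors (a band quantity and true color), amounts to an ordinary least-squares fit. A minimal sketch with made-up numbers (not the estuary data), assuming NumPy is available:

```python
import numpy as np

def fit_bivariate(y, x1, x2):
    """Ordinary least squares for y = b0 + b1*x1 + b2*x2; returns (b0, b1, b2)."""
    X = np.column_stack([np.ones_like(x1), x1, x2])  # design matrix with intercept
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Synthetic, noise-free example constructed as y = 1 + 2*x1 + 3*x2
x1 = np.array([0.0, 1.0, 2.0, 3.0])
x2 = np.array([1.0, 0.0, 1.0, 2.0])
y = 1.0 + 2.0 * x1 + 3.0 * x2
b0, b1, b2 = fit_bivariate(y, x1, x2)
```

With real data, the R² of this fit against the ground-truthed Chl a plays the role of the coefficients of determination reported in the abstract.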

  14. Assessment of digital camera-derived vegetation indices in quantitative monitoring of seasonal rice growth

    NASA Astrophysics Data System (ADS)

    Sakamoto, Toshihiro; Shibayama, Michio; Kimura, Akihiko; Takada, Eiji

    2011-11-01

A commercially available digital camera can be used as a low-cost automatic observation system for monitoring crop growth in open-air fields. We developed a prototype Crop Phenology Recording System (CPRS) for monitoring rice growth, but the ready-made waterproof cases that we used produced shadows on the images. After modifying the waterproof cases, we repeated the fixed-point camera observations to clarify questions regarding digital camera-derived vegetation indices (VIs), namely the visible atmospherically resistant index (VARI) based on daytime normal color (RGB) images and the nighttime relative brightness index (NRBI-NIR) based on nighttime near-infrared (NIR) images. We also took frequent measurements of agronomic data such as plant length, leaf area index (LAI), and aboveground dry matter weight to gain a detailed understanding of the temporal relationship between the VIs and the biophysical parameters of rice. In addition, we conducted another nighttime outdoor experiment to establish the link between NRBI-NIR and camera-to-object distance. The study produced the following findings. (1) The customized waterproof cases succeeded in preventing large shadows from being cast, especially on nighttime images, and it was confirmed that the brightness of the nighttime NIR images had spatial heterogeneity when a point light source (flashlight) was used, in contrast to the daytime RGB images. (2) The additional experiment using a forklift showed that both the ISO sensitivity and the calibrated digital number of the NIR channel (cDN-NIR) had significant effects on the sensitivity of NRBI-NIR to the camera-to-object distance. (3) Detailed measurements of a reproductive stem were collected to investigate the connection between the morphological change caused by the panicle sagging process and the downtrend in NRBI-NIR during the reproductive stages. However, these agronomic data were not completely in accord with NRBI-NIR in terms of the temporal pattern.
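The visible atmospherically resistant index mentioned above is conventionally defined as VARI = (G − R)/(G + R − B), computed from mean channel digital numbers. A minimal sketch (the values are illustrative):

```python
def vari(r, g, b):
    """Visible atmospherically resistant index from mean RGB digital numbers:
    VARI = (G - R) / (G + R - B)."""
    denom = g + r - b
    if denom == 0:
        raise ZeroDivisionError("G + R - B must be non-zero")
    return (g - r) / denom

# A dense green canopy (G dominant) gives a positive VARI
v = vari(80.0, 120.0, 40.0)
```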

  15. Geo-Referenced Mapping Using an Airborne 3D Time-of-Flight Camera

    NASA Astrophysics Data System (ADS)

    Kohoutek, T. K.; Nitsche, M.; Eisenbeiss, H.

    2011-09-01

This paper presents the first experience of close-range bird's-eye-view photogrammetry with range imaging (RIM) sensors for the real-time generation of high-resolution geo-referenced 3D surface models. The aim of this study was to develop a mobile, versatile, and less costly outdoor survey methodology for measuring natural surfaces, as an alternative to terrestrial laser scanning (TLS). Two commercial RIM cameras (an SR4000 by MESA Imaging AG and a CamCube 2.0 by PMDTechnologies GmbH) were mounted on a lightweight crane and on an unmanned aerial vehicle (UAV). The field experiments revealed various challenges in the real-time deployment of the two state-of-the-art RIM systems, e.g. processing of the large data volume. The acquisition strategy, data processing, and first measurements are presented. The precision of the measured distances is better than 1 cm under good conditions. However, the measurement precision degraded under the test conditions due to direct sunlight, strong illumination contrasts, and helicopter vibrations.

  16. Determination of the diffusion coefficient between corn syrup and distilled water using a digital camera

    NASA Astrophysics Data System (ADS)

    Ray, E.; Bunton, P.; Pojman, J. A.

    2007-10-01

    A simple technique for determining the diffusion coefficient between two miscible liquids is presented based on observing concentration-dependent ultraviolet-excited fluorescence using a digital camera. The ultraviolet-excited visible fluorescence of corn syrup is proportional to the concentration of the syrup. The variation of fluorescence with distance from the transition zone between the fluids is fit by the Fick's law solution to the diffusion equation. By monitoring the concentration at successive times, the diffusion coefficient can be determined in otherwise transparent materials. The technique is quantitative and makes measurement of diffusion accessible in the advanced undergraduate physics laboratory.
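The Fick's-law solution referred to here, for two initially separated semi-infinite miscible liquids, is an error-function concentration profile; the diffusion coefficient D is the single fit parameter. A minimal sketch of the model being fitted (the t and D values are illustrative, not measured):

```python
import math

def concentration(x, t, D):
    """Fick's-law solution for two initially separated miscible liquids:
    C(x, t) = 1/2 * (1 + erf(x / (2*sqrt(D*t)))),
    with C -> 0 as x -> -inf and C -> 1 as x -> +inf."""
    return 0.5 * (1.0 + math.erf(x / (2.0 * math.sqrt(D * t))))

# The concentration at the initial interface (x = 0) remains 1/2 at all times;
# the profile broadens as sqrt(D*t), which is what the fit exploits.
c_mid = concentration(0.0, t=3600.0, D=1e-9)
```

Fitting this curve to the fluorescence-versus-distance data at successive times, as the abstract describes, yields D.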

  17. Cataract screening by minimally trained remote observer with non-mydriatic digital fundus camera

    NASA Astrophysics Data System (ADS)

    Choi, Ann; Hjelmstad, David; Taibl, Jessica N.; Sayegh, Samir I.

    2013-03-01

We propose a method that allows an inexperienced observer, through examination of the digital fundus image of a retina on a computer screen, to determine the presence of a cataract and the need to refer the patient for further evaluation. Fundus photos obtained with a non-mydriatic camera were presented to an inexperienced observer who had been briefly instructed on fundus imaging, the nature of cataracts and their probable effect on the image of the retina, and the use of a computer program presenting fundus image pairs. Preliminary results of the pair testing indicate that the method is very effective.

  18. Conversion from light to numerical signal in a digital camera pipeline

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2010-11-01

The goal of this paper is to simulate the conversion from light to numerical signal that occurs as an image propagates through the digital camera pipeline. We focus on the spectral and resolution analysis of the optical system, Bayer sampling, photon shot and fixed-pattern noise, high-dynamic-range imaging, amplitude and bilateral filters, and the analog-to-digital conversion. The image capture system consists of a flash illumination source, a Cooke triplet photographic objective, and a passive-pixel CMOS sensor. We use a spectral image to simulate the illumination and the propagation of light through the optical system components. Fourier optics is used to compute the point spread function specific to each optical component. We consider the image acquisition system to be linear, shift-invariant, and axial; light propagation is orthogonal to the system.
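Of the pipeline stages listed, Bayer sampling is the simplest to illustrate: the sensor's color filter array keeps only one color component per pixel. A minimal sketch of an RGGB mosaic (illustrative, not the paper's simulation code):

```python
def bayer_mosaic(rgb):
    """Sample a full RGB image (rows of (r, g, b) tuples) onto an RGGB Bayer
    pattern, keeping one color component per pixel as a single-plane raw image."""
    raw = []
    for y, row in enumerate(rgb):
        raw_row = []
        for x, (r, g, b) in enumerate(row):
            if y % 2 == 0:
                raw_row.append(r if x % 2 == 0 else g)   # even rows: R G R G ...
            else:
                raw_row.append(g if x % 2 == 0 else b)   # odd rows:  G B G B ...
        raw.append(raw_row)
    return raw

# Uniform image in which every pixel is (10, 20, 30)
raw = bayer_mosaic([[(10, 20, 30)] * 2 for _ in range(2)])
```

Demosaicing, the interpolation step that reconstructs the missing two components per pixel, is the inverse problem a real pipeline then has to solve.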

  19. Continuous-wave terahertz digital holography by use of a pyroelectric array camera.

    PubMed

    Ding, Sheng-Hui; Li, Qi; Li, Yun-Da; Wang, Qi

    2011-06-01

Terahertz (THz) digital holography is realized with a 2.52 THz far-IR gas laser and a commercial 124 × 124 pyroelectric array camera. Off-axis THz holograms are obtained by recording interference patterns between light passing through the sample and a reference wave. A numerical reconstruction process is performed to obtain the field distribution at the object surface. Different targets were imaged to test the system's imaging capability. Compared with THz focal-plane images, the quality of the reconstructed images is greatly improved. The results show that the system's imaging resolution reaches at least 0.4 mm, and the system has potential for real-time imaging applications. This study confirms that digital holography is a promising technique for real-time, high-resolution THz imaging, with extensive application prospects. PMID:21633426

  20. Erosion research with a digital camera: the structure from motion method used in gully monitoring - field experiments from southern Morocco

    NASA Astrophysics Data System (ADS)

    Kaiser, Andreas; Rock, Gilles; Neugirg, Fabian; Müller, Christoph; Ries, Johannes

    2014-05-01

From a geoscientific view, arid and semiarid landscapes are often associated with soil-degrading erosion processes and thus active geomorphology. In this regard, gully incision represents one of the most important influences on surface dynamics. Established approaches to monitoring and quantifying soil loss require costly and labor-intensive measuring methods: terrestrial or airborne LiDAR scans to create digital elevation models and unmanned aerial vehicles for image acquisition provide adequate tools for geomorphological surveying. Despite their ever-advancing abilities, they are limited in their applicability to detailed recordings of complex surfaces. In particular, undercuttings and plunge pools in the headcut area of gully systems are invisible to them or cause shadowing effects. The presented work aims to apply and advance an adequate tool that avoids the above-mentioned obstacles and weaknesses of the established methods. The emerging structure-from-motion-based high-resolution 3D visualisation has proved useful not only for gully erosion; it also provides a solid basis for additional applications in the geosciences such as surface roughness measurements, quantification of gravitational mass movements, and capturing stream connectivity. During field campaigns in semiarid southern Morocco, a commercial DSLR camera was used to produce images that served as input data for software-based point cloud and mesh generation. Complex land surfaces could thus be reconstructed entirely, in high resolution, by photographing the object from different perspectives. At different scales, the resulting 3D mesh represents a holistic reconstruction of the actual shape complexity, with its limits set only by computing capacity. Analysis and visualization of time series of different erosion-related events illustrate the additional benefit of the method. It opens new perspectives on process understanding that can be exploited with open-source and commercial software. Results depicted a soil loss of 5

  1. The PANOPTES project: discovering exoplanets with low-cost digital cameras

    NASA Astrophysics Data System (ADS)

    Guyon, Olivier; Walawender, Josh; Jovanovic, Nemanja; Butterfield, Mike; Gee, Wilfred T.; Mery, Rawad

    2014-07-01

The Panoptic Astronomical Networked OPtical observatory for Transiting Exoplanets Survey (PANOPTES, www.projectpanoptes.org) project aims to identify transiting exoplanets using a wide network of low-cost imaging units. Each unit consists of two commercial digital single lens reflex (DSLR) cameras equipped with 85mm F1.4 lenses, mounted on a small equatorial mount. At a few thousand dollars per unit, the system offers uniquely advantageous survey efficiency for the cost, and can easily be assembled by amateur astronomers or students. Three generations of prototype units have been tested so far, and the baseline unit design, which optimizes robustness, simplicity, and cost, is now ready to be duplicated. We describe the hardware and software for the PANOPTES project, focusing on its key challenges. We show that obtaining high-precision photometric measurements with commercial DSLR color cameras is possible, using a PSF-matching algorithm we developed for this project. On-sky tests show that percent-level photometric precision is achieved in 1 min with a single camera. We also discuss hardware choices aimed at optimizing system robustness while maintaining adequate cost. PANOPTES is both an outreach project and a scientifically compelling survey for transiting exoplanets. In its current phase, experienced PANOPTES members are deploying a limited number of units, acquiring the experience necessary to run the network. A much wider community will then be able to participate in the project, with schools and citizen scientists integrating their units into the network.

  2. Remote detection of photoplethysmographic systolic and diastolic peaks using a digital camera.

    PubMed

    McDuff, Daniel; Gontarek, Sarah; Picard, Rosalind W

    2014-12-01

    We present a new method for measuring photoplethysmogram signals remotely using ambient light and a digital camera that allows for accurate recovery of the waveform morphology (from a distance of 3 m). In particular, we show that the peak-to-peak time between the systolic peak and diastolic peak/inflection can be automatically recovered using the second-order derivative of the remotely measured waveform. We compare measurements from the face with those captured using a contact fingertip sensor and show high agreement in peak and interval timings. Furthermore, we show that results can be significantly improved using orange, green, and cyan color channels compared to the traditional red, green, and blue channel combination. The absolute error in interbeat intervals was 26 ms and the absolute error in mean systolic-diastolic peak-to-peak times was 12 ms. The mean systolic-diastolic peak-to-peak times measured using the contact sensor and the camera were highly correlated, ρ = 0.94 (p < 0.001). The results were obtained with a camera frame rate of only 30 Hz. This technology has significant potential for advancing healthcare. PMID:25073159
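    The second-derivative peak recovery described above can be sketched as follows. This is a hypothetical illustration, not the authors' code: the synthetic pulse shape, bump widths, and the rule of taking the two strongest curvature minima are all assumptions.

```python
import numpy as np

def peak_to_peak_time(ppg, fs):
    """Recover the systolic-diastolic peak-to-peak time of one PPG pulse
    from the second-order derivative of the waveform (sketch only)."""
    d2 = np.gradient(np.gradient(ppg))      # second-order derivative
    # candidate peaks = local minima of the second derivative
    idx = [i for i in range(1, len(d2) - 1)
           if d2[i] < d2[i - 1] and d2[i] < d2[i + 1]]
    idx.sort(key=lambda i: d2[i])           # strongest curvature first
    sys_i, dia_i = sorted(idx[:2])          # systolic precedes diastolic
    return (dia_i - sys_i) / fs             # seconds

# one synthetic pulse at the paper's 30 Hz frame rate: systolic bump at
# 0.20 s and a weaker diastolic bump at 0.45 s (widths are assumptions)
fs = 30.0
t = np.arange(0, 1, 1 / fs)
pulse = (np.exp(-((t - 0.20) / 0.08) ** 2)
         + 0.4 * np.exp(-((t - 0.45) / 0.10) ** 2))
print(round(peak_to_peak_time(pulse, fs), 3))
```

    The second derivative makes the weak diastolic bump detectable even when it appears only as an inflection rather than a local maximum of the raw waveform.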

  3. High-resolution image digitizing through 12x3-bit RGB-filtered CCD camera

    NASA Astrophysics Data System (ADS)

    Cheng, Andrew Y. S.; Pau, Michael C. Y.

    1996-09-01

    A high-resolution computer-controlled CCD image capturing system is developed using a 12-bit 1024 x 1024 pixel CCD camera and motorized RGB filters to capture an image with color depth up to 36 bits. The filters distinguish the major components of color and collect them separately, while the CCD camera maintains the spatial resolution and detector filling factor. The color separation can thus be done optically rather than electronically. Operation consists simply of placing the object to be captured, such as a color photo, slide, or even an x-ray transparency, under the camera system; the necessary parameters such as integration time, mixing level, and light intensity are automatically adjusted by an on-line expert system. This greatly reduces the restrictions on the kinds of objects that can be captured. This unique approach can save considerable time in adjusting image quality and gives much more flexibility in manipulating the captured object, even a 3D object, with minimal setup fixtures. In addition, the cross-sectional dimensions of a 3D object can be analyzed by adapting a fiber-optic ring light source, which is particularly useful in non-contact metrology of 3D structures. The digitized information can be stored in an easily transferable format. Users can also perform a special LUT mapping automatically or manually. Applications of the system include medical image archiving, printing quality control, and 3D machine vision.

  4. Performance of low-cost X-ray area detectors with consumer digital cameras

    NASA Astrophysics Data System (ADS)

    Panna, A.; Gomella, A. A.; Harmon, K. J.; Chen, P.; Miao, H.; Bennett, E. E.; Wen, H.

    2015-05-01

    We constructed X-ray detectors using consumer-grade digital cameras coupled to commercial X-ray phosphors. Several detector configurations were tested against the Varian PaxScan 3024M (Varian 3024M) digital flat panel detector. These include consumer cameras (Nikon D800, Nikon D700, and Nikon D3X) coupled to a green emission phosphor in a back-lit, normal incidence geometry, and in a front-lit, oblique incidence geometry. We used the photon transfer method to evaluate detector sensitivity and dark noise, and the edge test method to evaluate their spatial resolution. The essential specifications provided by our evaluation include discrete charge events captured per mm2 per unit exposure surface dose, dark noise in equivalents of charge events per pixel, and spatial resolution in terms of the full width at half maximum (FWHM) of the detector's line spread function (LSF). Measurements were performed using a tungsten anode X-ray tube at 50 kVp. The results show that the home-built detectors provide better sensitivity and lower noise than the commercial flat panel detector, and some have better spatial resolution. The trade-off is substantially smaller imaging areas. Given their much lower costs, these home-built detectors are attractive options for prototype development of low-dose imaging applications.
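    The photon transfer method mentioned above rests on a simple relation: for a linear detector, temporal noise variance grows linearly with mean signal, and the slope gives the conversion gain. A minimal simulated sketch (the gain, read noise, and signal levels are assumed values, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
gain = 0.5            # assumed conversion factor, DN per photoelectron
read_noise = 2.0      # assumed read noise in DN

means, variances = [], []
for electrons in [200, 500, 1000, 2000, 4000]:
    # two flat frames per signal level; differencing cancels fixed-pattern noise
    a = gain * rng.poisson(electrons, 100_000) + rng.normal(0, read_noise, 100_000)
    b = gain * rng.poisson(electrons, 100_000) + rng.normal(0, read_noise, 100_000)
    means.append((a.mean() + b.mean()) / 2)
    variances.append(np.var(a - b) / 2)     # temporal noise variance only

# photon-transfer curve: variance = gain * mean + read_noise**2
slope, intercept = np.polyfit(means, variances, 1)
print(round(slope, 2))
```

    The fitted slope recovers the simulated gain and the intercept approximates the read-noise variance, which is how sensitivity and dark noise can be characterized without knowing the absolute light level.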

  5. Using the auxiliary camera for system calibration of 3D measurement by digital speckle

    NASA Astrophysics Data System (ADS)

    Xue, Junpeng; Su, Xianyu; Zhang, Qican

    2014-06-01

    The study of 3D shape measurement by digital speckle temporal-sequence correlation has drawn considerable attention because of its advantages; however, the measurement mainly yields the depth (z) coordinate, while the horizontal physical coordinates (x, y) are usually given only as image pixel coordinates. In this paper, a new approach for system calibration is proposed. With an auxiliary camera, we set up a temporary binocular vision system, which is used to calibrate the horizontal coordinates (in mm) while the temporal-sequence reference speckle sets are calibrated. First, the binocular vision system is calibrated using the traditional method. Then, digital speckles are projected onto the reference plane, which is moved by equal distances in the depth direction, and temporal-sequence speckle images are acquired with the camera as reference sets. When the reference plane is in the first and final positions, a crossed fringe pattern is projected onto the plane. The pixel coordinates of the control points are extracted from the images by Fourier analysis, and their physical coordinates are calculated by the binocular vision system. The physical coordinates corresponding to each pixel of the images are then calculated by an interpolation algorithm. Finally, the x and y corresponding to an arbitrary depth value z are obtained from a geometric formula. Experiments prove that our method can quickly and flexibly measure the 3D shape of an object as a point cloud.

  6. Application Of A 1024X1024 Pixel Digital Image Store, With Pulsed Progressive Readout Camera, For Gastro-Intestinal Radiology

    NASA Astrophysics Data System (ADS)

    Edmonds, E. W.; Rowlands, J. A.; Hynes, D. M.; Toth, B. D.; Porter, A. J.

    1986-06-01

    We discuss the applicability of intensified x-ray television systems for general digital radiography and the requirements necessary for physician acceptance. Television systems for videofluorography, when limited to conventional fluoroscopic exposure rates (25 µR/s at the x-ray image intensifier), with particular application to the gastro-intestinal system, all suffer from three problems that tend to degrade the image: (a) lack of resolution, (b) noise, and (c) patient movement. The system described in this paper addresses each of these problems. Resolution is provided by the use of a 1024 x 1024 pixel frame store combined with a 1024-line video camera and a 10"/6" x-ray image intensifier. Problems of noise and sensitivity to patient movement are overcome by using a short but intense burst of radiation to produce the latent image, which is then read off the video camera in a progressive fashion and placed in the digital store. Hard copy is produced from a high-resolution multiformat camera or a high-resolution digital laser camera. It is intended that this PPR system will replace the 100mm spot film camera in present use, and will provide information in digital form for further processing and eventual digital archiving.

  7. Cloud Height Estimation with a Single Digital Camera and Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Carretas, Filipe; Janeiro, Fernando M.

    2014-05-01

    Clouds influence the local weather, the global climate, and are an important parameter in weather prediction models. Clouds are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as at most small aerodromes where it is not economically viable to install instruments for assisted flying. It is therefore important to develop low-cost and robust systems that can be easily deployed in the field, enabling large-scale acquisition of cloud parameters. Recently, the authors developed a low-cost system for the measurement of cloud-base height using stereo vision and digital photography. However, due to the stereo nature of the system, some challenges were presented. In particular, the relative camera orientation requires calibration, and the two cameras need to be synchronized so that the photos from both cameras are acquired simultaneously. In this work we present a new system that estimates cloud height between 1000 and 5000 meters. This prototype is composed of one digital camera controlled by a Raspberry Pi and is installed at Centro de Geofísica de Évora (CGE) in Évora, Portugal. The camera is periodically triggered to acquire images of the overhead sky, and the photos are downloaded to the Raspberry Pi, which forwards them to a central computer that processes the images and estimates the cloud height in real time. Estimating cloud height from just one image requires a computer model that is able to learn from previous experience and perform pattern recognition. The model proposed in this work is an Artificial Neural Network (ANN) that was previously trained with cloud features at different heights. The chosen Artificial Neural Network is a three-layer network, with six parameters in the input layer, 12 neurons in the hidden intermediate layer, and an output layer with only one output. The six input parameters are the average intensity values and the intensity standard deviation of each RGB channel. The output
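    The 6-12-1 network architecture described above can be sketched as a simple forward pass. This is an illustrative shape check only: the weights here are untrained random values, the sigmoid activation is an assumption, and the synthetic image stands in for a real sky photograph.

```python
import numpy as np

def sky_features(rgb):
    """The six ANN inputs: per-channel mean and standard deviation (R, G, B)."""
    flat = rgb.reshape(-1, 3).astype(float)
    return np.concatenate([flat.mean(axis=0), flat.std(axis=0)])

def ann_forward(x, W1, b1, W2, b2):
    """Three-layer network: 6 inputs -> 12 sigmoid hidden units -> 1 output."""
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
    return float(W2 @ h + b2)

# untrained random weights, for shape illustration only
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(12, 6)), np.zeros(12)
W2, b2 = rng.normal(size=(1, 12)), np.zeros(1)

sky = rng.integers(0, 256, size=(240, 320, 3))  # stand-in for a sky photo
x = sky_features(sky) / 255.0                   # normalised feature vector
y = ann_forward(x, W1, b1, W2, b2)              # cloud-height score (arbitrary units)
print(len(x), isinstance(y, float))
```

    In the actual system the weights would be fitted against training images labelled with known cloud heights.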

  8. First Results from an Airborne Ka-band SAR Using SweepSAR and Digital Beamforming

    NASA Technical Reports Server (NTRS)

    Sadowy, Gregory; Ghaemi, Hirad; Hensley, Scott

    2012-01-01

    NASA/JPL has developed the SweepSAR technique, which breaks the typical Synthetic Aperture Radar (SAR) trade space by using time-dependent multi-beam digital beamforming (DBF) on receive. A SweepSAR implementation using an array-fed reflector is being developed for the proposed DESDynI Earth Radar Mission concept. A first-of-a-kind airborne demonstration of the SweepSAR concept was performed at Ka-band (35.6 GHz), validating calibration and antenna pattern data sufficient for beamforming in elevation. The demonstration (1) provides validation evidence that the proposed Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) SAR architecture is sound, and (2) shows that it functions well even with large variations in receiver gain and phase. Future plans include using prototype DESDynI SAR digital flight hardware to perform the beamforming in real time onboard the aircraft.

  9. A digital elevation model of the Greenland ice sheet and validation with airborne laser altimeter data

    NASA Technical Reports Server (NTRS)

    Bamber, Jonathan L.; Ekholm, Simon; Krabill, William B.

    1997-01-01

    A 2.5 km resolution digital elevation model (DEM) of the Greenland ice sheet was produced from the 336 days of the geodetic phase of ERS-1. During this period the altimeter was operating in ice-mode over land surfaces providing improved tracking around the margins of the ice sheet. Combined with the high density of tracks during the geodetic phase, a unique data set was available for deriving a DEM of the whole ice sheet. The errors present in the altimeter data were investigated via a comparison with airborne laser altimeter data obtained for the southern half of Greenland. Comparison with coincident satellite data showed a correlation with surface slope. An explanation for the behavior of the bias as a function of surface slope is given in terms of the pattern of surface roughness on the ice sheet.

  10. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation, which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  11. Portable retinal imaging for eye disease screening using a consumer-grade digital camera

    NASA Astrophysics Data System (ADS)

    Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter

    2012-03-01

    The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held, retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50mm focal lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.

  12. Estimating the spatial position of marine mammals based on digital camera recordings

    PubMed Central

    Hoekendijk, Jeroen P A; de Vries, Jurre; van der Bolt, Krissy; Greinert, Jens; Brasseur, Sophie; Camphuysen, Kees C J; Aarts, Geert

    2015-01-01

    Estimating the spatial position of organisms is essential to quantify interactions between the organism and the characteristics of its surroundings, for example, predator–prey interactions, habitat selection, and social associations. Because marine mammals spend most of their time under water and may appear at the surface only briefly, determining their exact geographic location can be challenging. Here, we developed a photogrammetric method to accurately estimate the spatial position of marine mammals or birds at the sea surface. Digital recordings containing landscape features with known geographic coordinates can be used to estimate the distance and bearing of each sighting relative to the observation point. The method can correct for frame rotation, estimates pixel size based on the reference points, and can be applied to scenarios with and without a visible horizon. A set of R functions was written to process the images and obtain accurate geographic coordinates for each sighting. The method is applied to estimate the spatiotemporal fine-scale distribution of harbour porpoises in a tidal inlet. Video recordings of harbour porpoises were made from land, using a standard digital single-lens reflex (DSLR) camera, positioned at a height of 9.59 m above mean sea level. Porpoises were detected up to a distance of ~3136 m (mean 596 m), with a mean location error of 12 m. The method presented here allows for multiple detections of different individuals within a single video frame and for tracking movements of individuals based on repeated sightings. In comparison with traditional methods, this method only requires a digital camera to provide accurate location estimates. It especially has great potential in regions with ample data on local (a)biotic conditions, to help resolve functional mechanisms underlying habitat selection and other behaviors in marine mammals in coastal areas. PMID:25691982
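    The core geometry behind this kind of land-based photogrammetric ranging can be sketched with a flat-sea approximation: a sighting's distance follows from the camera height and its depression angle below the horizon. The pixel scale and pixel offset used here are hypothetical, and the published method additionally corrects for frame rotation, earth curvature, and reference-point geometry.

```python
import math

def sighting_distance(cam_height_m, px_below_horizon, rad_per_px):
    """Flat-sea sketch: range from the camera height and the depression
    angle of a sighting below the horizon line in the image."""
    angle = px_below_horizon * rad_per_px   # depression angle (radians)
    return cam_height_m / math.tan(angle)

# camera height from the study (9.59 m); the pixel scale and pixel offset
# are hypothetical values for illustration only
d = sighting_distance(9.59, 160, 1.0e-4)
print(round(d, 1))
```

    With this geometry, ranging precision degrades rapidly as sightings approach the horizon, which is consistent with location error growing with distance.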

  13. Estimating the spatial position of marine mammals based on digital camera recordings.

    PubMed

    Hoekendijk, Jeroen P A; de Vries, Jurre; van der Bolt, Krissy; Greinert, Jens; Brasseur, Sophie; Camphuysen, Kees C J; Aarts, Geert

    2015-02-01

    Estimating the spatial position of organisms is essential to quantify interactions between the organism and the characteristics of its surroundings, for example, predator-prey interactions, habitat selection, and social associations. Because marine mammals spend most of their time under water and may appear at the surface only briefly, determining their exact geographic location can be challenging. Here, we developed a photogrammetric method to accurately estimate the spatial position of marine mammals or birds at the sea surface. Digital recordings containing landscape features with known geographic coordinates can be used to estimate the distance and bearing of each sighting relative to the observation point. The method can correct for frame rotation, estimates pixel size based on the reference points, and can be applied to scenarios with and without a visible horizon. A set of R functions was written to process the images and obtain accurate geographic coordinates for each sighting. The method is applied to estimate the spatiotemporal fine-scale distribution of harbour porpoises in a tidal inlet. Video recordings of harbour porpoises were made from land, using a standard digital single-lens reflex (DSLR) camera, positioned at a height of 9.59 m above mean sea level. Porpoises were detected up to a distance of ~3136 m (mean 596 m), with a mean location error of 12 m. The method presented here allows for multiple detections of different individuals within a single video frame and for tracking movements of individuals based on repeated sightings. In comparison with traditional methods, this method only requires a digital camera to provide accurate location estimates. It especially has great potential in regions with ample data on local (a)biotic conditions, to help resolve functional mechanisms underlying habitat selection and other behaviors in marine mammals in coastal areas. PMID:25691982

  14. Characterization of digital cameras for reflected ultraviolet photography; implications for qualitative and quantitative image analysis during forensic examination.

    PubMed

    Garcia, Jair E; Wilksch, Philip A; Spring, Gale; Philp, Peta; Dyer, Adrian

    2014-01-01

    Reflected ultraviolet imaging techniques allow for the visualization of evidence normally outside the human visible spectrum. Specialized digital cameras possessing extended sensitivity can be used for recording reflected ultraviolet radiation. Currently, there is a lack of standardized methods for ultraviolet image recording and processing using digital cameras, potentially limiting their implementation and interpretation. A methodology is presented for processing ultraviolet images based on linear responses and the sensitivity of the respective color channels. The methodology is applied to a Fuji S3 UVIR camera and a modified Nikon D70s camera to reconstruct their respective spectral sensitivity curves between 320 and 400 nm. This method results in images with low noise and high contrast, suitable for qualitative and/or quantitative analysis. The application of this methodology is demonstrated in the recording of latent fingerprints. PMID:24117678

  15. Realization of the FPGA based TDI algorithm in digital domain for CMOS cameras

    NASA Astrophysics Data System (ADS)

    Tao, Shuping; Jin, Guang; Zhang, Xuyan; Qu, Hongsong

    2012-10-01

    In order to make CMOS image sensors suitable for high-resolution space imaging applications, a new method realizing TDI in the digital domain by FPGA is proposed in this paper, which improves the imaging mode for area-array CMOS sensors. The TDI algorithm accumulates the corresponding pixels of adjoining frames in the digital domain, so the gray values increase by a factor of M, where M is the integration number, and the image quality in terms of signal-to-noise ratio is improved. In addition, the TDI optimization algorithm is discussed. Firstly, signal storage is optimized with two slices of external RAM, using memory-depth expansion and a ping-pong buffering mechanism. Secondly, a FIFO operation mechanism reduces the read and write operations on memory by M×(M-1) times; this saves signal-transfer time proportional to the square of the integration number, M², so the frame frequency can be increased greatly. Finally, a CMOS camera based on TDI in the digital domain was developed, and the algorithm was validated by experiments on it.
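    The core accumulation step can be sketched as follows: adjoining frames of a scene that scrolls one row per frame are shifted into alignment and summed, so each ground pixel contributes M times. This is a minimal simulation of the principle, not the FPGA implementation; the frame geometry and one-row-per-frame motion are assumptions.

```python
import numpy as np

def digital_tdi(frames):
    """Sum corresponding pixels of adjoining frames in the digital domain.

    Each frame is shifted by the assumed along-track motion of one row per
    frame, so every scene row accumulates M times (gray values grow by M)."""
    M = len(frames)
    acc = np.zeros_like(frames[0], dtype=np.int64)
    for k, f in enumerate(frames):
        acc += np.roll(f, k, axis=0)        # align frame k with frame 0
    return acc[M - 1:]                      # keep rows seen by all M frames

# synthetic scene scrolling one row per frame (ideal platform motion)
rng = np.random.default_rng(2)
M, rows, cols = 8, 64, 32
scene = rng.integers(20, 40, size=(rows + M, cols))
frames = [scene[k:k + rows] for k in range(M)]
tdi = digital_tdi(frames)
ok = bool((tdi == M * scene[M - 1:rows]).all())
print(ok)
```

    For shot-noise-limited signals the summed image gains up to a factor of √M in signal-to-noise ratio, which is the motivation for TDI in low-light push-broom imaging.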

  16. High-end aerial digital cameras and their impact on the automation and quality of the production workflow

    NASA Astrophysics Data System (ADS)

    Paparoditis, Nicolas; Souchon, Jean-Philippe; Martinoty, Gilles; Pierrot-Deseilligny, Marc

    The IGN digital camera project was established in the early 1990s. The first research surveys were carried out in 1996 and the digital camera was first used in production in 2000. In 2004 approximately 10 French departments (accounting for 10% of the territory) were covered with a four-head camera system and since summer 2005 all IGN imagery has been acquired digitally. Nevertheless the camera system is still evolving, with tests on new geometric configurations being continuously undertaken. The progressive integration of the system in IGN production workflows has allowed IGN to keep the system evolving in accordance with production needs. Remaining problems are due to specific camera characteristics such as CCD format, the optical quality of off-the-shelf lenses, and because some production tools are ill-adapted to digital images with a large dynamic range. However, when considering the pros and cons of integrating these images into production lines, the disadvantages are largely balanced by the numerous benefits this technology offers.

  17. Combining multi-spectral proximal sensors and digital cameras for monitoring grazed tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, R. N.; Gobbett, D. L.; González, L. A.; Bishop-Hurley, G. J.; McGavin, S. L.

    2015-11-01

    Timely and accurate monitoring of pasture biomass and ground-cover is necessary in livestock production systems to ensure productive and sustainable management of forage for livestock. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since such sensors can return data in near real-time, and have the potential to be deployed on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. However, there are unresolved challenges in developing calibrations to convert raw sensor data to quantitative biophysical values, such as pasture biomass or vegetation ground-cover, to allow meaningful interpretation of sensor data by livestock producers. We assessed the use of multiple proximal sensors for monitoring tropical pastures with a pilot deployment of sensors at two sites on Lansdown Research Station near Townsville, Australia. Each site was monitored by a Skye SKR four-band multi-spectral sensor (every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each operated over 18 months. Raw data from each sensor were processed to calculate a number of multi-spectral vegetation indices. Visual observations of pasture characteristics, including above-ground standing biomass and ground cover, were made every 2 weeks. A methodology was developed to manage the sensor deployment and the quality control of the data collected. The data capture from the digital cameras was more reliable than that from the multi-spectral sensors, which had up to 63% of data discarded after data cleaning and quality control. We found a strong relationship between sensor and pasture measurements during the wet-season period of maximum pasture growth (January to April), especially when data from the multi-spectral sensors were combined with weather data. RatioNS34 (a simple band ratio between the near-infrared (NIR) and lower shortwave-infrared (SWIR) bands) and rainfall since 1

  18. A compact high-definition low-cost digital stereoscopic video camera for rapid robotic surgery development.

    PubMed

    Carlson, Jay; Kowalczuk, Jędrzej; Psota, Eric; Pérez, Lance C

    2012-01-01

    Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration. PMID:22356964

  19. Color segmentation as an aid to white balancing for digital still cameras

    NASA Astrophysics Data System (ADS)

    Cooper, Ted J.

    2000-12-01

    Digital Still Cameras employ automatic white balance techniques to adjust sensor amplifier gains so that white imaged objects appear white. A color cast detection algorithm is presented that uses histogram and segmentation techniques to select near-neutral objects in the image. Once identified and classified, these objects permit determination of the scene illuminant and implicitly the respective amplifier gains. Under certain circumstances, a scene may contain no near-neutral objects. By using the segmentation operations on non-neutral image objects, memory colors, from skin, sky, and foliage objects, may be identified. If identified, these memory colors provide enough chromatic information to predict the scene illuminant. By combining the approaches from near-neutral objects with those of memory color objects, a reasonable automatic white balance over a wide range of scenes is possible.
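    The near-neutral-object idea described above can be sketched with a simple chromaticity threshold: pixels close to grey are segmented out, their mean approximates the scene illuminant, and per-channel gains follow. This is a simplified stand-in for the paper's histogram/segmentation approach; the tolerance, the simulated grey wall, and the illuminant cast are all assumptions.

```python
import numpy as np

def near_neutral_gains(img, tol=0.08):
    """Estimate white-balance gains from near-neutral pixels.

    A pixel is treated as near-neutral when its chromaticity deviates from
    grey (1/3, 1/3, 1/3) by less than `tol`; the mean of those pixels then
    approximates the scene illuminant."""
    rgb = img.reshape(-1, 3).astype(float)
    chroma = rgb / (rgb.sum(axis=1, keepdims=True) + 1e-9)
    mask = np.all(np.abs(chroma - 1.0 / 3.0) < tol, axis=1)
    illuminant = rgb[mask].mean(axis=0)
    return illuminant.max() / illuminant    # per-channel amplifier gains

# grey wall photographed under a warm illuminant cast (simulated)
rng = np.random.default_rng(3)
img = np.full((100, 100, 3), 120.0) * np.array([1.2, 1.0, 0.8])
img += rng.normal(0, 2.0, img.shape)
gains = near_neutral_gains(img)
print(np.round(gains, 2))
```

    When no pixels pass the neutrality test, the paper's memory-color fallback (skin, sky, foliage) would supply the illuminant estimate instead.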

  20. Colorimetric characterization of digital cameras with unrestricted capture settings applicable for different illumination circumstances

    NASA Astrophysics Data System (ADS)

    Fang, Jingyu; Xu, Haisong; Wang, Zhehong; Wu, Xiaomin

    2016-05-01

    With colorimetric characterization, digital cameras can be used as image-based tristimulus colorimeters for color communication. To overcome the restriction to fixed capture settings adopted in conventional colorimetric characterization procedures, a novel method that takes capture settings into account was proposed. The method for calculating the colorimetric values of a measured image consists of five main steps. These include converting the RGB values to equivalent values under the training settings, using factors derived from an imaging-system model, to bridge different settings; and applying scaling factors in the preparation steps of the transformation mapping, to avoid errors resulting from the nonlinearity of polynomial mapping across different ranges of illumination level. The experimental results indicate that the prediction error of the proposed method, measured by the CIELAB color-difference formula, is less than 2 CIELAB units under different illumination levels and different correlated color temperatures. This prediction accuracy for varying capture settings is on the same level as that of the conventional method for a fixed lighting condition.
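    The polynomial mapping at the heart of such characterizations can be sketched as a least-squares fit from expanded RGB terms to tristimulus values. The training data here are synthetic (a random 24-patch chart and an assumed linear camera-to-XYZ matrix), so this only illustrates the fitting machinery, not the paper's multi-setting correction.

```python
import numpy as np

def poly_expand(rgb):
    """Second-order polynomial expansion of camera RGB for colorimetric mapping."""
    r, g, b = rgb.T
    return np.stack([r, g, b, r * g, r * b, g * b,
                     r ** 2, g ** 2, b ** 2, np.ones_like(r)], axis=1)

# hypothetical training data: camera RGB of a 24-patch chart and matching
# tristimulus values from a simulated linear camera-to-XYZ relationship
rng = np.random.default_rng(4)
rgb = rng.uniform(0.05, 1.0, size=(24, 3))
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
xyz = rgb @ M_true.T

# least-squares fit of the polynomial mapping, then check the fit error
A, *_ = np.linalg.lstsq(poly_expand(rgb), xyz, rcond=None)
err = float(np.abs(poly_expand(rgb) @ A - xyz).max())
print(err < 1e-6)
```

    Because polynomial terms scale nonlinearly with exposure, a single fitted mapping is only valid near its training illumination level, which is why the paper introduces per-setting scaling factors.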

  1. Noctilucent clouds: modern ground-based photographic observations by a digital camera network.

    PubMed

    Dubietis, Audrius; Dalin, Peter; Balčiūnas, Ričardas; Černis, Kazimieras; Pertsev, Nikolay; Sukhodoev, Vladimir; Perminov, Vladimir; Zalcik, Mark; Zadorozhny, Alexander; Connors, Martin; Schofield, Ian; McEwan, Tom; McEachran, Iain; Frandsen, Soeren; Hansen, Ole; Andersen, Holger; Grønne, Jesper; Melnikov, Dmitry; Manevich, Alexander; Romejko, Vitaly

    2011-10-01

    Noctilucent, or "night-shining," clouds (NLCs) are a spectacular optical nighttime phenomenon that is very often neglected in the context of atmospheric optics. This paper gives a brief overview of current understanding of NLCs by providing a simple physical picture of their formation, relevant observational characteristics, and scientific challenges of NLC research. Modern ground-based photographic NLC observations, carried out in the framework of automated digital camera networks around the globe, are outlined. In particular, the obtained results refer to studies of single quasi-stationary waves in the NLC field. These waves exhibit specific propagation properties--high localization, robustness, and long lifetime--that are the essential requisites of solitary waves. PMID:22016249

  2. Lobate Scarp Modeling with Lunar Reconnaissance Orbiter Camera Digital Terrain Models

    NASA Astrophysics Data System (ADS)

    Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.; Robinson, M. S.; Tran, T.

    2011-12-01

    Lobate scarps are a type of contractional tectonic landform expressed on the Moon's surface in both highlands and maria. Typically only tens of meters in relief, these linear or curvilinear topographic rises are interpreted to be low-angle thrust fault scarps resulting from global radial contraction. Radial contraction of the Moon can be inferred from shortening across the population of lobate scarps and is estimated at ~100 m. However, the geometry and depth of the underlying faults and mechanical properties of the near-surface lunar crustal materials are not well constrained. The Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NACs) acquire 0.5 to 2.0 m/pixel panchromatic images and digital terrain models (DTMs) with spatial resolutions of 2 m are derived from NAC stereo pairs. Topographic data are being used to constrain models of the lobate scarp thrust faults. DTMs are analyzed for relief and morphology of the Slipher (48.3°N, 160.6°E), Racah X-1 (10°S, 178°E), and Simpelius-1 (73.5°S, 13°E) scarps. Profiles are extracted, detrended, and compared along strike. LROC Wide Angle Camera (WAC) 100 m/pixel image mosaics and topography provide regional contexts. Using elastic dislocation modeling, the fault dip angles, depths, slip, and taper are each varied until the predicted surface displacement best fits the DTM profiles for each lobate scarp. Preliminary best-fit dip angles vary from 30-40°, maximum fault depths extend to several hundred meters, and the amount of slip varies from 10 to 30 meters for the three scarps. The modeled maximum depths suggest that the thrust faults are not deeply rooted.

  3. Estimating information from image colors: an application to digital cameras and natural scenes.

    PubMed

    Marín-Franch, Iván; Foster, David H

    2013-01-01

    The colors present in an image of a scene provide information about its constituent elements. But the amount of information depends on the imaging conditions and on how information is calculated. This work had two aims. The first was to derive explicitly estimators of the information available and the information retrieved from the color values at each point in images of a scene under different illuminations. The second was to apply these estimators to simulations of images obtained with five sets of sensors used in digital cameras and with the cone photoreceptors of the human eye. Estimates were obtained for 50 hyperspectral images of natural scenes under daylight illuminants with correlated color temperatures 4,000, 6,500, and 25,000 K. Depending on the sensor set, the mean estimated information available across images with the largest illumination difference varied from 15.5 to 18.0 bits and the mean estimated information retrieved after optimal linear processing varied from 13.2 to 15.5 bits (each about 85 percent of the corresponding information available). With the best sensor set, 390 percent more points could be identified per scene than with the worst. Capturing scene information from image colors depends crucially on the choice of camera sensors. PMID:22450817

  4. Skin hydration imaging using a long-wavelength near-infrared digital camera

    NASA Astrophysics Data System (ADS)

    Attas, E. Michael; Posthumus, Trevor B.; Schattka, Bernhard J.; Sowa, Michael G.; Mantsch, Henry H.; Zhang, Shuliang L.

    2001-07-01

Skin hydration is a key factor in skin health. Hydration measurements can provide diagnostic information on the condition of skin and can indicate the integrity of the skin barrier function. Near-infrared spectroscopy measures the water content of living tissue by its effect on tissue reflectance at a particular wavelength. Imaging has the important advantage of showing the degree of hydration as a function of location. Short-wavelength (650-1050 nm) near-infrared spectroscopic reflectance imaging has previously been used in vivo to determine the relative water content of skin under carefully controlled laboratory conditions. We have recently developed a novel spectroscopic imaging system to acquire image sets in the long-wavelength region of the near infrared (960 to 1700 nm), where the water absorption bands are more intense. The LW-NIR system uses a liquid-crystal tunable filter in front of the objective lens and incorporates a 12-bit digital camera with a 320-by-240-pixel indium gallium arsenide array sensor. Custom software controls the camera and tunable filter, allowing image sets to be acquired and displayed in near-real time. Forearm skin hydration was measured in a clinical context using the long-wavelength imaging system, a short-wavelength imaging system, and non-imaging instrumentation. Among these, the LW-NIR system appears to be the most sensitive at measuring dehydration of skin.

  5. Evaluation of the geometric stability and the accuracy potential of digital cameras — Comparing mechanical stabilisation versus parameterisation

    NASA Astrophysics Data System (ADS)

    Rieke-Zapp, D.; Tecklenburg, W.; Peipe, J.; Hastedt, H.; Haig, Claudia

Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown that the accomplished accuracies in object space are either limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with a German guideline for the evaluation of optical 3D measuring systems [VDI/VDE, VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems-Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ringflashes, which was considered a standard method for close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ringflash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ringflash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introduction of an image-variant interior orientation in the calibration process improved results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space). Extending the parameter model with Fi

  6. Calibration of Low Cost Digital Camera Using Data from Simultaneous LIDAR and Photogrammetric Surveys

    NASA Astrophysics Data System (ADS)

    Mitishita, E.; Debiasi, P.; Hainosz, F.; Centeno, J.

    2012-07-01

Digital photogrammetric products from the integration of imagery and lidar datasets are a reality nowadays. When the imagery and lidar surveys are performed together and the camera is connected to the lidar system, direct georeferencing can be applied to compute the exterior orientation parameters of the images. Direct georeferencing of the images requires accurate interior orientation parameters for photogrammetric applications. Camera calibration is the procedure applied to compute the interior orientation parameters (IOPs). Calibration research has established that, to obtain accurate IOPs, the calibration must be performed under the same conditions in which the photogrammetric survey is carried out. This paper presents the methodology and experimental results of in situ self-calibration using a simultaneously acquired image block and lidar dataset. The calibration results are analyzed and discussed. To perform this research, a test field was established in an urban area. A set of signalized points was implanted on the test field to serve as check points or control points. The photogrammetric images and lidar dataset of the test field were taken simultaneously. Four flight strips were used to obtain a cross layout. The strips were taken in opposite directions of flight (W-E, E-W, N-S and S-N). The Kodak DSC Pro SLR/c digital camera was connected to the lidar system. The coordinates of the exposure stations were computed from the lidar trajectory. Different layouts of vertical control points were used in the calibration experiments. The experiments use vertical coordinates from a precise differential GPS survey or computed by an interpolation procedure using the lidar dataset. The positions of the exposure stations are used as control points in the calibration procedure to eliminate the linear dependency of the group of interior and exterior orientation parameters.
This linear dependency happens, in the calibration procedure, when the vertical images and flat test field are

  7. Quality Metrics of Semi Automatic DTM from Large Format Digital Camera

    NASA Astrophysics Data System (ADS)

    Narendran, J.; Srinivas, P.; Udayalakshmi, M.; Muralikrishnan, S.

    2014-11-01

The high resolution digital images from the Ultracam-D Large Format Digital Camera (LFDC) were used for near-automatic DTM generation. In the past, manual methods for DTM generation were used, which are time consuming and labour intensive. In this study, the LFDC was used in synergy with an accurate position and orientation system and processes such as image matching algorithms, distributed processing and filtering techniques for near-automatic DTM generation. Traditionally, DTM accuracy is reported using check points collected from the field, which are limited in number, time consuming and costly to obtain. This paper discusses the reliability of the near-automatic DTM generated from the Ultracam-D for an operational project covering an area of nearly 600 sq. km, using 21,000 check points captured stereoscopically by experienced operators. The reliability of the DTM for the three study areas with different morphology is presented using a large number of stereo check points and parameters related to the statistical distribution of residuals, such as skewness, kurtosis, standard deviation and linear error at the 90% confidence interval. The residuals obtained for the three areas follow a normal distribution, in agreement with the majority of standards on positional accuracy. Quality metrics in terms of reliability were computed for the DTMs generated, and the tables and graphs show the potential of the Ultracam-D for semi-automatic DTM generation over different terrain types.
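
The residual statistics named above (standard deviation, skewness, kurtosis, linear error at 90% confidence) can be computed from stereo check-point residuals roughly as follows. This is a generic sketch; the 1.6449 factor for LE90 assumes normally distributed residuals, as the abstract reports:

```python
import numpy as np

def dtm_quality_metrics(residuals):
    """Summary statistics used to assess DTM reliability from
    check-point height residuals."""
    r = np.asarray(residuals, float)
    z = (r - r.mean()) / r.std()
    return {
        "std": r.std(),
        "skewness": (z ** 3).mean(),
        "kurtosis": (z ** 4).mean() - 3.0,      # excess kurtosis
        "le90": 1.6449 * r.std(),               # normal-theory LE90
    }
```

For normally distributed residuals, skewness and excess kurtosis should both be near zero.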

  8. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC) and Submillimeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  9. Field test of an autocorrelation technique for determining grain size using a digital camera

    NASA Astrophysics Data System (ADS)

    Barnard, P. L.; Rubin, D. M.; Harney, J.; Mustain, N.

    2007-12-01

An extensive field test using Rubin's (2004) autocorrelation technique shows that median and mean grain size can be determined with suitable accuracy using a digital camera and associated autocorrelation when compared to traditional methods such as mechanical sieving and settling-tube analysis. The field test included 205 sediment samples and > 1200 digital images from a variety of beaches on the west coast of the United States, with grain sizes ranging from sand to granules. To test the accuracy of the digital-image grain-size algorithm, we compared results with manual point counts of a large image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2=0.93; n=79) and had an error of only 1%. Although grain sizes calculated from digital images give an accurate result for grains in the image, natural lateral and vertical variability in grain size can cause differences between grain size measured in digital images of the bed surface and grain size measured by sieving a grab sample that includes subsurface sediment. Lateral spatial variability was tested by analyzing the results of up to 100 images taken in a series of 1 m2 sample areas. Comparisons of calculated grain sizes and grain sizes measured from grab samples show small differences between surface sediment and grab samples on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 > 0.92; n=115). In contrast, on less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, differences between surface and subsurface grain size are greater (r2 > 0.70; n=67; within 3% accuracy). In all field tests the autocorrelation method was able to predict the mean and median grain size with ~96% accuracy, which is more than adequate for the majority of sedimentological applications. When properly automated for large numbers of samples, the
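
The core idea of the autocorrelation technique can be illustrated as below: coarser sediment decorrelates over larger pixel offsets than fine sediment. This is a sketch of the idea only (function name is illustrative), omitting the calibration curve against sieved samples that the published method requires:

```python
import numpy as np

def autocorrelation_curve(img, max_offset=20):
    """Spatial autocorrelation of image intensity at increasing
    horizontal pixel offsets; the decay rate reflects grain size."""
    img = img.astype(float)
    img = (img - img.mean()) / img.std()
    return np.array([
        np.corrcoef(img[:, :-d].ravel(), img[:, d:].ravel())[0, 1]
        for d in range(1, max_offset + 1)
    ])
```

In practice the curve is compared against curves computed from images of sediment with known (sieved) grain-size distributions.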

  10. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    NASA Astrophysics Data System (ADS)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

Clouds play an important role in many aspects of everyday life. They affect both the local weather as well as the global climate and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models, which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. Such instruments should also be automated and robust, since they may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, they are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of the pilots and planes. Although there are instruments available in the market to measure those parameters, their relatively high cost makes them unavailable in many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new developments consist of a new geometry that allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry to measure the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after the measurement of the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
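
For the simple case of two vertically pointing cameras, the parallax relation for cloud base height can be sketched as below. The abstract notes the actual prototype uses a more complex camera geometry; this is the textbook stereo formula, and all numeric values in the usage are hypothetical:

```python
def cloud_base_height_m(baseline_m, focal_mm, pixel_mm, disparity_px):
    """Stereo parallax: height of a cloud feature above two
    upward-looking cameras separated by a horizontal baseline."""
    return baseline_m * focal_mm / (disparity_px * pixel_mm)
```

With a 100 m baseline, an 8 mm lens, 5 μm pixels and a 10-pixel disparity, the feature would sit at 16 km; larger disparities correspond to lower cloud bases.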

  11. The use of a consumer grade photo camera in optical-digital correlator for pattern recognition and input scene restoration

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Starikov, Sergey N.

    2009-11-01

In this work an optical-digital correlator for pattern recognition and input scene restoration is described. The main features of the described correlator are portability and the ability to process multi-element input scenes. The correlator consists of a consumer grade digital photo camera with a diffractive optical element (DOE) inserted as a correlation filter. Correlation of an input scene with a reference image recorded on the DOE is performed optically and registered by the digital photo camera for further processing. Using the obtained correlation signals and the DOE's point spread function (PSF), one can restore the image of the input scene from the image of correlation signals by digital deconvolution algorithms. The construction of the correlator based on the consumer grade digital photo camera is presented. The software procedure necessary for linearization of the registered correlation signals is described. Experimental results on optical correlation are compared with numerical simulation. The results of image restoration from conventionally and specially processed correlation signals are reported. Quantitative estimates of the accuracy of the correlation signals, as well as of the restored images of the input scene, are presented.
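
Digital restoration of the input scene from the registered signal, given the system PSF, can be illustrated with a Wiener-regularized inverse filter. This is one common deconvolution choice, shown under a circular-convolution model; the paper does not specify which algorithm its authors used:

```python
import numpy as np

def wiener_deconvolve(observed, psf, k=1e-3):
    """Restore the input scene from a PSF-blurred image by
    Wiener-regularized inverse filtering in the Fourier domain."""
    H = np.fft.fft2(psf, s=observed.shape)
    G = np.fft.fft2(observed)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))
```

The regularization constant `k` trades off sharpness against noise amplification at frequencies where the PSF spectrum is weak.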

  12. Analysis of airborne LiDAR as a basis for digital soil mapping in Alpine areas

    NASA Astrophysics Data System (ADS)

    Kringer, K.; Tusch, M.; Geitner, C.; Meißl, G.; Rutzinger, M.

    2009-04-01

Especially in mountainous regions like the Alps, the formation of soil is highly influenced by relief characteristics. Among all factors included in Jenny's (1941) model for soil development, relief is the one most commonly used in approaches to create digital soil maps and to derive soil properties from secondary data sources (McBratney et al. 2003). Elevation data, first order (slope, aspect) and second order derivatives (plan, profile and cross-sectional curvature) as well as complex morphometric parameters (various landform classifications, e.g., Wood 1996) and compound indices (e.g., topographic wetness indices, vertical distance to drainage network, insolation) can be calculated from digital elevation models (DEM). However, while being an important source of information for digital soil mapping on small map scales, "conventional" DEMs are of limited use for the design of large scale conceptual soil maps for small areas due to rather coarse raster resolutions with cell sizes ranging from 20 to 100 meters. Slight variations in elevation and small landform features might not be discernible even though they might have a significant effect on soil formation, e.g., regarding the influence of groundwater in alluvial soils or the extent of alluvial fans. Nowadays, airborne LiDAR (Light Detection And Ranging) provides highly accurate data for the elaboration of high-resolution digital terrain models (DTM) even in forested areas. In the project LASBO (Laserscanning in der Bodenkartierung) the applicability of digital terrain models derived from LiDAR for the identification of soil-relevant geomorphometric parameters is investigated. Various algorithms which were initially designed for coarser raster data are applied on high-resolution DTMs. Test areas for LASBO are located in the region of Bruneck (Italy) and near the municipality of Kramsach in the Inn Valley (Austria).
The freely available DTM for Bruneck has a raster resolution of 2.5 meters while in Kramsach a DTM with

  13. Characterizing arid region alluvial fan surface roughness with airborne laser swath mapping digital topographic data

    NASA Astrophysics Data System (ADS)

    Frankel, Kurt L.; Dolan, James F.

    2007-06-01

    Range-front alluvial fan deposition in arid environments is episodic and results in multiple fan surfaces and ages. These distinct landforms are often defined by descriptions of their surface morphology, desert varnish accumulation, clast rubification, desert pavement formation, soil development, and stratigraphy. Although quantifying surface roughness differences between alluvial fan units has proven to be difficult in the past, high-resolution airborne laser swath mapping (ALSM) digital topographic data are now providing researchers with an opportunity to study topography in unprecedented detail. Here we use ALSM data to calculate surface roughness on two alluvial fans in northern Death Valley, California. We define surface roughness as the standard deviation of slope in a 5-m by 5-m moving window. Comparison of surface roughness values between mapped fan surfaces shows that each unit is statistically unique at the 99% confidence level. Furthermore, there is an obvious smoothing trend from the presently active channel to a deposit with cosmogenic 10Be and 36Cl surface exposure ages of ˜70 ka. Beyond 70 ka, alluvial landforms become progressively rougher with age. These data suggest that alluvial fans in arid regions smooth out with time until a threshold is crossed where roughness increases at greater wavelength with age as a result of surface runoff and headward tributary incision into the oldest surfaces.
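
The roughness measure defined above, the standard deviation of slope in a moving window, can be sketched with NumPy/SciPy as follows. The window size is in grid cells, so a 5 m by 5 m window on 1 m posting corresponds to `window=5`; this is a generic implementation of the stated definition, not the authors' code:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def surface_roughness(dem, cell=1.0, window=5):
    """Standard deviation of slope (degrees) in a moving window,
    computed via the identity var = E[x^2] - E[x]^2."""
    dzdy, dzdx = np.gradient(dem, cell)
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    mean = uniform_filter(slope, size=window)
    mean_sq = uniform_filter(slope ** 2, size=window)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
```

A planar surface, however steep, has zero roughness under this definition; only local slope variability registers.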

  14. Respiratory-Gated MRgHIFU in Upper Abdomen Using an MR-Compatible In-Bore Digital Camera

    PubMed Central

    Petrusca, Lorena; Viallon, Magalie; Muller, Arnaud; Breguet, Romain; Becker, Christoph D.; Salomir, Rares

    2014-01-01

    Objective. To demonstrate the technical feasibility and the potential interest of using a digital optical camera inside the MR magnet bore for monitoring the breathing cycle and subsequently gating the PRFS MR thermometry, MR-ARFI measurement, and MRgHIFU sonication in the upper abdomen. Materials and Methods. A digital camera was reengineered to remove its magnetic parts and was further equipped with a 7 m long USB cable. The system was electromagnetically shielded and operated inside the bore of a closed 3T clinical scanner. Suitable triggers were generated based on real-time motion analysis of the images produced by the camera (resolution 640 × 480 pixels, 30 fps). Respiratory-gated MR-ARFI prepared MRgHIFU ablation was performed in the kidney and liver of two sheep in vivo, under general anaesthesia and ventilator-driven forced breathing. Results. The optical device demonstrated very good MR compatibility. The current setup permitted the acquisition of motion artefact-free and high resolution MR 2D ARFI and multiplanar interleaved PRFS thermometry (average SNR 30 in liver and 56 in kidney). Microscopic histology indicated precise focal lesions with sharply delineated margins following the respiratory-gated HIFU sonications. Conclusion. The proof-of-concept for respiratory motion management in MRgHIFU using an in-bore digital camera has been validated in vivo. PMID:24716196

  15. The integration of digital camera derived images with a computer based diabetes register for use in retinal screening.

    PubMed

    Taylor, D J; Jacob, J S; Tooke, J E

    2000-07-01

Exeter district provides a retinal screening service based on a mobile non-mydriatic camera operated by a dedicated retinal screener visiting general practices on a 2-yearly cycle. Digital attachments to eye cameras can now provide a cost effective alternative to the use of film in population based eye screening programmes. Whilst the manufacturers of digital cameras provide a database for the storage of pictures, the images do not as yet interface readily with the rest of the patient's computer-held data or allow for a sophisticated grading, reporting and administration system. The system described is a development of the Exeter diabetes register (EXSYST) which can import digitally derived pictures from either the Ris-Lite TM or Imagenet TM camera systems or scanned Polaroids. Pictures can be reported by the screener, checked by a consultant ophthalmologist via the hospital network, and a report, consisting of colour pictures, a map of relevant pathology and referral recommendations, produced. This concise report can be hard copied inexpensively on a high resolution ink-jet printer to be returned to the patient's general practitioner. Eye images remain available within the hospital diabetes centre computer network to facilitate shared care. This integrated system would form an ideal platform for the addition of computer based pathology recognition and total paperless transmission when suitable links to GP surgeries become available. PMID:10837903

  16. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    Technology Transfer Automated Retrieval System (TEKTRAN)

Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications has not been well documented in related ...

  17. Implementing PET-guided biopsy: integrating functional imaging data with digital x-ray mammography cameras

    NASA Astrophysics Data System (ADS)

    Weinberg, Irving N.; Zawarzin, Valera; Pani, Roberto; Williams, Rodney C.; Freimanis, Rita L.; Lesko, Nadia M.; Levine, E. A.; Perrier, N.; Berg, Wendie A.; Adler, Lee P.

    2001-05-01

Purpose: Phantom trials using the PET data for localization of hot spots have demonstrated positional accuracies in the millimeter range. We wanted to perform biopsy based on information from both anatomic and functional imaging modalities; however, we had a communication challenge. Despite the digital nature of DSM stereotactic X-ray mammography devices, and the large number of such devices in Radiology Departments (approximately 1600 in the US alone), we are not aware of any methods of connecting stereo units to other computers in the Radiology department. Methods: We implemented a local network between an external IBM PC (running Linux) and the Lorad Stereotactic Digital Spot Mammography PC (running DOS). The application used the IP protocol over the parallel port, and could be run in the background on the Lorad PC without disrupting important clinical activities such as image acquisition or archiving. With this software application, users of the external PC could pull x-ray images on demand from the Lorad DSM computer. Results: X-ray images took about a minute to transfer to the external PC for analysis or forwarding to other computers on the University's network. Using image fusion techniques we were able to designate locations of functional imaging features as additional targets on the anatomic x-rays. These pseudo-features could then potentially be used to guide biopsy using the stereotactic gun stage on the Lorad camera. New Work to be Presented: A method of transferring and processing stereotactic x-ray mammography images to a functional PET workstation for implementing image-guided biopsy.

  18. A new lunar digital elevation model from the Lunar Orbiter Laser Altimeter and SELENE Terrain Camera

    NASA Astrophysics Data System (ADS)

    Barker, M. K.; Mazarico, E.; Neumann, G. A.; Zuber, M. T.; Haruyama, J.; Smith, D. E.

    2016-07-01

    We present an improved lunar digital elevation model (DEM) covering latitudes within ±60°, at a horizontal resolution of 512 pixels per degree (∼60 m at the equator) and a typical vertical accuracy ∼3 to 4 m. This DEM is constructed from ∼ 4.5 ×109 geodetically-accurate topographic heights from the Lunar Orbiter Laser Altimeter (LOLA) onboard the Lunar Reconnaissance Orbiter, to which we co-registered 43,200 stereo-derived DEMs (each 1° × 1°) from the SELENE Terrain Camera (TC) (∼1010 pixels total). After co-registration, approximately 90% of the TC DEMs show root-mean-square vertical residuals with the LOLA data of <5 m compared to ∼ 50% prior to co-registration. We use the co-registered TC data to estimate and correct orbital and pointing geolocation errors from the LOLA altimetric profiles (typically amounting to <10 m horizontally and <1 m vertically). By combining both co-registered datasets, we obtain a near-global DEM with high geodetic accuracy, and without the need for surface interpolation. We evaluate the resulting LOLA + TC merged DEM (designated as "SLDEM2015") with particular attention to quantifying seams and crossover errors.

  19. Digital chemiluminescence imaging of DNA sequencing blots using a charge-coupled device camera.

    PubMed Central

    Karger, A E; Weiss, R; Gesteland, R F

    1992-01-01

    Digital chemiluminescence imaging with a cryogenically cooled charge-coupled device (CCD) camera is used to visualize DNA sequencing fragments covalently bound to a blotting membrane. The detection is based on DNA hybridization with an alkaline phosphatase(AP) labeled oligodeoxyribonucleotide probe and AP triggered chemiluminescence of the substrate 3-(2'-spiro-adamantane)-4-methoxy-4-(3"-phosphoryloxy)phenyl- 1,2-dioxetane (AMPPD). The detection using a direct AP-oligonucleotide conjugate is compared to the secondary detection of biotinylated oligonucleotides with respect to their sensitivity and nonspecific binding to the nylon membrane by quantitative imaging. Using the direct oligonucleotide-AP conjugate as a hybridization probe, sub-attomol (0.5 pg of 2.7 kb pUC plasmid DNA) quantities of membrane bound DNA are detectable with 30 min CCD exposures. Detection using the biotinylated probe in combination with streptavidin-AP was found to be background limited by nonspecific binding of streptavidin-AP and the oligo(biotin-11-dUTP) label in equal proportions. In contrast, the nonspecific background of AP-labeled oligonucleotide is indistinguishable from that seen with 5'-32P-label, in that respect making AP an ideal enzymatic label. The effect of hybridization time, probe concentration, and presence of luminescence enhancers on the detection of plasmid DNA were investigated. Images PMID:1480487

  20. New long-zoom lens for 4K super 35mm digital cameras

    NASA Astrophysics Data System (ADS)

    Thorpe, Laurence J.; Usui, Fumiaki; Kamata, Ryuhei

    2015-05-01

The world of television production is beginning to adopt 4K Super 35 mm (S35) image capture for a widening range of program genres that seek both the unique imaging properties of that large image format and the protection of their program assets in a world anticipating future 4K services. Documentary and natural history production in particular are transitioning to this form of production. The nature of their shooting demands long zoom lenses. In their traditional world of 2/3-inch digital HDTV cameras they have a broad choice of portable lenses, with zoom ranges as high as 40:1. In the world of Super 35mm the longest zoom lens is limited to 12:1, offering a telephoto of 400mm. Canon was requested to consider a significantly longer focal range lens while severely curtailing its size and weight. Extensive computer simulation explored countless combinations of optical and optomechanical systems in a quest to ensure that all operational requests and full 4K performance could be met. The final lens design is anticipated to have applications beyond entertainment production, including a variety of security systems.

  1. Measuring the orbital period of the Moon using a digital camera

    NASA Astrophysics Data System (ADS)

    Hughes, Stephen W.

    2006-03-01

    A method of measuring the orbital velocity of the Moon around the Earth using a digital camera is described. Separate images of the Moon and stars taken 24 hours apart were loaded into Microsoft PowerPoint and the centre of the Moon marked on each image. Four stars common to both images were connected together to form a 'home-made' constellation. On each image the Moon and constellation were grouped together. The group from one image was pasted onto the other image and translated and rotated so that the two constellations overlay each other. The distance between the Moon centres in pixels was converted into a physical distance on the CCD chip in order to calculate the angular separation on the sky. The angular movement was then used to calculate the orbital period of the Moon. A metre rule was photographed from a known distance in order to calculate the physical size of the CCD pixels. The orbital period of the Moon was measured as 27.1 days, which is within 0.7% of the actual period of 27.3 days.
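
The conversion from pixel displacement to orbital period follows small-angle geometry: the Moon's angular motion in 24 hours is the on-chip displacement divided by the focal length, and the period is the time for a full 360° of that motion. A sketch with hypothetical camera parameters (the pixel pitch and focal length below are illustrative, not from the paper):

```python
import math

def orbital_period_days(pixel_shift, pixel_size_mm, focal_length_mm):
    """Orbital period from the Moon's image displacement over 24 h,
    via the small-angle approximation."""
    angle_deg = math.degrees(pixel_shift * pixel_size_mm / focal_length_mm)
    return 360.0 / angle_deg  # days per full orbit
```

With a 50 mm lens and 9 μm pixels, a ~1280-pixel daily shift corresponds to the Moon's ~13.2°/day sidereal motion, i.e. a period near 27.3 days.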

  2. Tone-transfer (OECF) characteristics and spatial frequency response measurements for digital cameras and scanners

    NASA Astrophysics Data System (ADS)

    Burns, Peter D.

    2005-01-01

    Measurement of the spatial frequency response (SFR) of digital still cameras by slanted-edge analysis has been established for several years. The method, described in standard ISO 12233, has also been applied to other image acquisition subsystems such as document and print scanners. With the frequent application of the method and use of supporting software, questions often arise about the form of the input test image data. The tone-transfer characteristics of the system under test can influence the results, as can signal quantization and clipping. For this reason, the original standard called for a transformation of the input data prior to the slanted-edge analysis. The transformation is based on the measured opto-electronic conversion function (OECF) and can convert the image data to a reference-exposure signal space. This is often helpful when comparing different devices, if the intent is to do so in terms of the performance of optics, detector, and primary signal processing. We describe the use of the OECF and its inverse to derive the signal transformation in question. The influence of typical characteristics will be shown in several examples. It was found that, for test target data of modest contrast, the resulting SFR measurements were only moderately sensitive to the use of the inverse OECF transformation.
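
The inverse-OECF transformation to reference-exposure space described above amounts to inverting a monotonic lookup table. A minimal sketch, assuming the measured OECF is available as paired arrays of digital levels and exposures (the gamma curve in the usage is an illustrative stand-in for a measured OECF):

```python
import numpy as np

def inverse_oecf(image, levels, exposures):
    """Map camera output levels back to reference-exposure space by
    inverting the measured OECF (levels must be monotonic)."""
    return np.interp(image, levels, exposures)
```

Applying this before slanted-edge analysis lets devices be compared in terms of optics, detector, and primary signal processing rather than tone rendering.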

  4. ESPI of a transient shock wave flow using an ultrafast digital camera

    NASA Astrophysics Data System (ADS)

    Andrag, Roland; Barbosa, Filipe J.; Skews, Beric W.

    2001-04-01

    The application of electronic speckle pattern interferometry (ESPI) to the visualization of a typical high-speed compressible flow is investigated. ESPI is an interferometric technique that has established itself as a reliable alternative to holographic interferometry for measuring small displacements and vibrations, and is increasingly being used in flow visualization. It can instantly and in real time produce interferometric images in digital form on a video screen, with no photographic processing required. In this paper two flows are examined: the first, a low-speed thermal plume rising from a hot soldering iron, for which real-time visualization is achievable; the second, single-frame imaging of a shock wave emerging from a small round open-ended shock tube. ESPI is shown to be a valuable tool in the visualization of compressible flows and a good alternative to holographic interferometry for obtaining quantitative density data about a flow field. A method for obtaining interferograms with finite fringe width is presented. The main benefit of using ESPI for flow visualization is that the interferometric image is immediately accessible for viewing on a monitor, avoiding the tedious photographic holographic reconstruction process. Advances in camera technology are fast overcoming its main disadvantage, low image resolution.
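The subtraction principle behind ESPI correlation fringes can be illustrated with a minimal 1-D numerical sketch (illustrative values only, not from the experiment): each point carries a random speckle phase, and the flow adds a deterministic optical phase change between exposures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each point has a random speckle phase; the flow adds a phase ramp dphi(x) = x
x = np.linspace(0.0, 4.0 * np.pi, 512)
speckle_phase = rng.uniform(0.0, 2.0 * np.pi, x.size)

i_before = 2.0 + 2.0 * np.cos(speckle_phase)        # reference exposure
i_after = 2.0 + 2.0 * np.cos(speckle_phase + x)     # exposure after phase change

# Subtracting and rectifying yields correlation fringes that are dark
# wherever dphi is a multiple of 2*pi -- no photographic processing needed.
fringes = np.abs(i_after - i_before)
```

The dark fringes at multiples of 2π are what carry the quantitative density information in a compressible flow.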

  5. Time-to-digital converter based on analog time expansion for 3D time-of-flight cameras

    NASA Astrophysics Data System (ADS)

    Tanveer, Muhammad; Nissinen, Ilkka; Nissinen, Jan; Kostamovaara, Juha; Borg, Johan; Johansson, Jonny

    2014-03-01

    This paper presents an architecture and the achievable performance of a time-to-digital converter for 3D time-of-flight cameras. The design is partitioned into two levels. In the first level, analog time expansion, where the time interval to be measured is stretched by a factor k, is achieved by charging a capacitor with a current I and then discharging it with a current I/k. In the second level, the final time-to-digital conversion is performed by a global gated ring oscillator based time-to-digital converter. Performance can be increased by exploiting its intrinsic scrambling of quantization noise and mismatch error, and its first-order noise shaping. The stretched time interval is measured by counting full clock cycles and storing the states of nine phases of the gated ring oscillator. The frequency of the gated ring oscillator is approximately 131 MHz, and an appropriate stretch factor k can give a resolution of ≈ 57 ps. The combined low nonlinearity of the time stretcher and the gated ring oscillator based time-to-digital converter can achieve a distance resolution of a few centimeters with low power consumption and small area occupation. The carefully optimized circuit configuration, achieved by using an edge aligner, the time-amplification property, and the gated ring oscillator based time-to-digital converter, may lead to a compact, low-power single-photon configuration for 3D time-of-flight cameras aimed at a measurement range of 10 meters.
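The quoted resolution is consistent with a back-of-envelope calculation; the stretch factor k below is an assumed value, since the abstract does not state it:

```python
f_gro = 131e6                       # gated ring oscillator frequency, Hz
n_phases = 9                        # GRO phases stored per measurement

# Resolution of the GRO-based converter alone: one phase step of the oscillator
lsb_gro = 1.0 / (f_gro * n_phases)  # ~848 ps

# Analog time expansion stretches the interval by k before conversion,
# so the effective resolution improves by the same factor.
k = 15                              # assumed stretch factor (not given in the abstract)
resolution = lsb_gro / k            # ~57 ps, matching the abstract's figure
```

At ~57 ps, light travels about 17 mm, i.e. a round-trip distance resolution under a centimeter per LSB, consistent with the claimed few-centimeter distance resolution after nonlinearities.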

  6. Digital holographic PTV for complicated flow in a water by two cameras and refractive index-matching method

    NASA Astrophysics Data System (ADS)

    Kuniyasu, Masataka; Aoyagi, Yusuke; Unno, Noriyuki; Satake, Shin-ichi; Yuki, Kazuhisa; Seki, Yohji

    2016-03-01

    Packed beds of spheres are a basic heat-transfer promoter that exploits turbulent mixing. We carried out 3-D visualization by digital holographic PTV to understand the complicated flow in a sphere-packed pipe (SPP), using a refractive index-matching method with water as the working fluid; the spheres were made of MEXFLON, whose refractive index matches that of water. To visualize the detailed flow structure around the spheres, three-dimensional simultaneous measurements of the velocity field in the SPP were performed with our proposed holography technique using two cameras. The velocity field obtained with two cameras resolved finer flow structures than that obtained with one camera.

  7. Digital holographic PTV for complicated flow in a water by two cameras and refractive index-matching method

    NASA Astrophysics Data System (ADS)

    Kuniyasu, Masataka; Aoyagi, Yusuke; Unno, Noriyuki; Satake, Shin-ichi; Yuki, Kazuhisa; Seki, Yohji

    2016-06-01

    Packed beds of spheres are a basic heat-transfer promoter that exploits turbulent mixing. We carried out 3-D visualization by digital holographic PTV to understand the complicated flow in a sphere-packed pipe (SPP), using a refractive index-matching method with water as the working fluid; the spheres were made of MEXFLON, whose refractive index matches that of water. To visualize the detailed flow structure around the spheres, three-dimensional simultaneous measurements of the velocity field in the SPP were performed with our proposed holography technique using two cameras. The velocity field obtained with two cameras resolved finer flow structures than that obtained with one camera.

  8. Estimates of the error caused by atmospheric turbulence in determining object's motion speed using a digital camera

    NASA Astrophysics Data System (ADS)

    Valley, M. T.; Dudorov, V. V.; Kolosov, V. V.; Filimonov, G. A.

    2006-11-01

    The paper considers the error caused by atmospheric turbulence in determining the motion speed of an object from its successive images recorded on the matrix sensor of a digital camera. Numerical modeling of the image of a moving object at successive time moments is performed. The variance of fluctuations of the image centre of mass, which affects the measurement error, is calculated. Error dependences on the distance to the object and the path slope angle are obtained for different turbulence models. Both situations are considered: when the angular displacement of the object between two successive shots of the digital camera is greater than the isoplanatic angle, and when it is smaller than this angle.
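The centre-of-mass speed estimate underlying this error analysis can be sketched as follows; the pixel scale and frame interval are assumed values, and in the paper's setting turbulence would jitter the computed centroids and hence the speed:

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centre of mass (row, column) in pixel units."""
    rows, cols = np.indices(img.shape)
    total = img.sum()
    return np.array([(rows * img).sum() / total, (cols * img).sum() / total])

# Two synthetic frames of the same small object, shifted 3 pixels horizontally
frame1 = np.zeros((32, 32)); frame1[15:18, 10:13] = 1.0
frame2 = np.zeros((32, 32)); frame2[15:18, 13:16] = 1.0

pixel_scale = 0.05   # metres per pixel at the object distance (assumed)
dt = 0.1             # time between the two shots, seconds (assumed)

shift = centroid(frame2) - centroid(frame1)    # turbulence perturbs these centroids
speed = np.hypot(*shift) * pixel_scale / dt    # estimated speed, m/s
```

The variance of the centroid fluctuations propagates directly into the speed error, scaled by pixel_scale/dt, which is why the error grows with distance and path slope.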

  9. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature

  10. Digital X-ray camera for quality evaluation and three-dimensional topographic reconstruction of single crystals of biological macromolecules

    NASA Technical Reports Server (NTRS)

    Borgstahl, Gloria (Inventor); Lovelace, Jeff (Inventor); Snell, Edward Holmes (Inventor); Bellamy, Henry (Inventor)

    2008-01-01

    The present invention provides a digital topography imaging system for determining the crystalline structure of a biological macromolecule, wherein the system employs a charge coupled device (CCD) camera with antiblooming circuitry to directly convert x-ray signals to electrical signals without the use of phosphor and measures reflection profiles from the x-ray emitting source after x-rays are passed through a sample. Methods for using said system are also provided.

  11. Digital monochrome CCD camera for robust pixel correspondence, data compression, and preprocessing in an integrated PC-based image-processing environment

    NASA Astrophysics Data System (ADS)

    Arshad, Norhashim M.; Harvey, David M.; Hobson, Clifford A.

    1996-12-01

    This paper describes the development of a compact digital CCD camera that contains image digitization and processing and interfaces to a personal computer (PC) via a standard enhanced parallel port. Precise digitization of pixel samples, coupled with a single-chip FPGA for data processing, forms the main digital stage of the camera before the data are sent to the PC. A compression scheme is applied so that the digital images can be transferred within the existing parallel-port bandwidth. The data are decompressed in the PC environment for real-time display of the video images using purely native processor resources. Frame capture is built into the camera so that a full uncompressed digital image can be sent for special processing.

  12. Fault dislocation modeled structure of lobate scarps from Lunar Reconnaissance Orbiter Camera digital terrain models

    NASA Astrophysics Data System (ADS)

    Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.

    2013-02-01

    Before the launch of the Lunar Reconnaissance Orbiter, known characteristics of lobate scarps on the Moon were limited to studies of only a few dozen scarps revealed in Apollo-era photographs within ~20° of the equator. The Lunar Reconnaissance Orbiter Camera now provides meter-scale images of more than 100 lobate scarps, as well as stereo-derived topography of about a dozen scarps. High-resolution digital terrain models (DTMs) provide unprecedented insight into scarp morphology and dimensions. Here, we analyze images and DTMs of the Slipher, Racah X-1, Mandel'shtam-1, Feoktistov, Simpelius-1, and Oppenheimer F lobate scarps. Parameters in fault dislocation models are iteratively varied to provide best fits to DTM topographic profiles to test previous interpretations that the observed landforms are the result of shallow, low-angle thrust faults. Results suggest that these faults occur from the surface down to depths of hundreds of meters, have dip angles of 35-40°, and have typical maximum slips of tens of meters. These lunar scarp models are comparable to modeled geometries of lobate scarps on Mercury, Mars, and asteroid 433 Eros, but are shallower and ~10° steeper than geometries determined in studies with limited Apollo-era data. Frictional and rock mass strength criteria constrain the state of global differential stress between 3.5 and 18.6 MPa at the modeled maximum depths of faulting. Our results are consistent with thermal history models that predict relatively small compressional stresses that likely arise from cooling of a magma ocean.

  13. Deformation monitoring with off-the-shelf digital cameras for civil engineering fatigue testing

    NASA Astrophysics Data System (ADS)

    Detchev, I.; Habib, A.; He, F.; El-Badry, M.

    2014-06-01

    Deformation monitoring of civil infrastructure systems is important in terms of both their safety and serviceability. The former refers to estimating the maximum loading capacity during the design stages of a building project, and the latter means performing regularly scheduled maintenance of an already existing structure. Traditionally, large structures have been monitored using surveying techniques, while fine-scale monitoring of structural components such as beams and trusses has been done with strain gauge instrumentation. In the past decade, digital photogrammetric systems coupled with image processing techniques have also been used for deformation monitoring. The major advantage of this remote sensing method for performing deformation monitoring is that there is no need to access the object of interest while testing is in progress. The paper is a result of an experiment where concrete beams with polymer support sheets are subjected to dynamic loading conditions by a hydraulic actuator in a structures laboratory. This type of loading is also known as fatigue testing, and is used to simulate the typical use of concrete beams over a long period of time. From a photogrammetric point of view, the challenge for this type of experiment is to avoid motion artifacts by maximizing the sensor frame rate, and at the same time to have good enough image quality to achieve satisfactory reconstruction precision. This research effort will investigate the optimal camera settings (e.g., aperture, shutter speed, sensor sensitivity, and image resolution) in order to balance high sensor frame rate against good image quality. The results will be evaluated first in terms of their repeatability, and then in terms of their accuracy. The accuracy of the results will be checked against another set of results coming from high-quality laser transducers.
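The trade-off between shutter speed and motion blur that drives the camera-settings search can be illustrated with a rough feasibility check (all numbers assumed for illustration, not taken from the experiment):

```python
# Keep motion blur below half a pixel during fatigue cycling.
peak_speed = 20.0   # peak specimen speed during a load cycle, mm/s (assumed)
gsd = 0.1           # object-space pixel footprint, mm/pixel (assumed)
max_blur = 0.5      # tolerated motion blur, pixels (assumed)

def blur_pixels(t_exp):
    """Motion blur in pixels for a given exposure time (seconds)."""
    return peak_speed * t_exp / gsd

# Candidate shutter speeds; shorter exposures need wider apertures or higher ISO,
# which is exactly the image-quality trade-off the abstract describes.
usable = [t for t in (1/2000, 1/1000, 1/500, 1/250) if blur_pixels(t) <= max_blur]
```

Under these assumed numbers, 1/250 s already blurs by 0.8 px and would be rejected, pushing the settings toward faster shutters at the cost of noise.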

  14. Snow process monitoring in mountain forest environments with a digital camera network

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu; Menzel, Lucas

    2016-04-01

    Snow processes are important components of the hydrologic cycle in mountainous areas and at high latitudes. Sparse observations in remote regions, in combination with complex topography, local climate specifics and the impact of heterogeneous vegetation cover, complicate a detailed investigation of snow-related processes. In this study, a camera network is applied to monitor complex snow processes with high temporal resolution in montane forest environments (800-1200 m a.s.l.) in southwestern Germany. A typical feature of this region is the high temporal variability of weather conditions, with frequent snow accumulation and ablation processes and recurrent snow interception on conifers. We developed a semi-automatic procedure to derive snow depths from the digital images, which shows high consistency with manual readings and station-based measurements. To extract the snow canopy interception dynamics from the pictures, six binary classification methods are compared. The MaxEntropy classifier performs markedly better than the others under various illumination conditions, and it is thus selected for the snow interception quantification. The snow accumulation and ablation processes on the ground, as well as snow loading and unloading in forest canopies, are investigated based on the snow parameters derived from time-lapse photography monitoring. In addition, the influences of meteorological conditions, forest cover and elevation on snow processes are considered. Furthermore, our investigations serve to improve the snow and interception modules of a hydrological model. We found that time-lapse photography is an effective and low-cost approach to collect snow-related information that supports our understanding of snow processes and the further development of hydrological models. We will present selected results from our investigations over two consecutive winters.
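A maximum-entropy threshold in the spirit of the MaxEntropy classifier can be sketched as follows: a generic Kapur-style implementation for snow/no-snow segmentation of a grayscale image, not the authors' exact classifier:

```python
import numpy as np

def max_entropy_threshold(gray):
    """Kapur-style maximum-entropy threshold: choose the gray level t that
    maximises the summed entropies of the below- and above-threshold
    histogram segments."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, 256):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue  # one segment is empty; entropy undefined
        q0 = p[:t][p[:t] > 0] / p0          # normalized sub-histograms
        q1 = p[t:][p[t:] > 0] / p1
        h = -np.sum(q0 * np.log(q0)) - np.sum(q1 * np.log(q1))
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

Entropy-based thresholds tend to be robust to the illumination changes a forest camera sees, which is consistent with the abstract's finding that the MaxEntropy classifier handled varying light best.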

  15. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    PubMed Central

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-01-01

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype of a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. In principle, this approach can increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution using a 25 fps camera. PMID:26959023
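The idea of per-pixel coded exposure can be sketched numerically: the DMD opens each pixel during exactly one sub-interval of a slow frame, so the slow frame stores scattered temporal samples of a faster video. This is a toy model of the sampling principle, not the authors' prototype or reconstruction method:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, S = 8, 8, 4               # toy image size; S sub-frames per slow exposure

scene = rng.random((S, H, W))   # hypothetical fast scene within one slow frame

# The DMD opens each pixel during exactly one sub-frame (per-pixel coded exposure)
code = rng.integers(0, S, size=(H, W))
mask = np.stack([code == s for s in range(S)]).astype(float)

# The low-frame-rate sensor integrates the masked scene over the slow exposure,
# so each pixel stores one temporal sample of the faster video.
slow_frame = (mask * scene).sum(axis=0)
```

Decoding then scatters each pixel's value back to its sub-frame and interpolates the gaps, trading spatial sampling density for temporal resolution at fixed sensor bandwidth.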

  16. Comparison of a new laser beam wound camera and a digital photoplanimetry-based method for wound measurement in horses.

    PubMed

    Van Hecke, L L; De Mil, T A; Haspeslagh, M; Chiers, K; Martens, A M

    2015-03-01

    The aim of this study was to compare the accuracy, precision, inter- and intra-operator reliability of a new laser beam (LB) wound camera and a digital photoplanimetry-based (DPB) method for measuring the dimensions of equine wounds. Forty-one wounds were created on equine cadavers. The area, circumference, maximum depth and volume of each wound were measured four times with both techniques by two operators. A silicone cast was made of each wound and served as the reference standard to measure the wound dimensions. The DPB method had a higher accuracy and precision in determining the wound volume compared with the LB camera, which had a higher accuracy in determining the wound area and maximum depth and better precision in determining the area and circumference. The LB camera also had a significantly higher overall inter-operator reliability for measuring the wound area, circumference and volume. In contrast, the DPB method had poor intra-operator reliability for the wound circumference. The LB camera was more user-friendly than the DPB method. The LB wound camera is recommended as the better objective method to assess the dimensions of wounds in horses, despite its poorer performance for the measurement of wound volume. However, if the wound measurements are performed by one operator on cadavers or animals under general anaesthesia, the DPB method is a less expensive and valid alternative. PMID:25665920

  17. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera.

    PubMed

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-01-01

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype of a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. In principle, this approach can increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution using a 25 fps camera. PMID:26959023

  18. Comparison of Target- and Mutual-Information-Based Calibration of Terrestrial Laser Scanner and Digital Camera for Deformation Monitoring

    NASA Astrophysics Data System (ADS)

    Omidalizarandi, M.; Neumann, I.

    2015-12-01

    In the current state of the art, geodetic deformation analysis of natural and artificial objects (e.g. dams, bridges, ...) is ongoing research in both static and kinematic mode and has received considerable interest from researchers and geodetic engineers. In this work, to increase the accuracy of geodetic deformation analysis, a terrestrial laser scanner (TLS; here the Zoller+Fröhlich IMAGER 5006) and a high-resolution digital camera (Nikon D750) are integrated so that each complements the other. In order to optimally combine the acquired data of the hybrid sensor system, a highly accurate estimation of the extrinsic calibration parameters between TLS and digital camera is a vital preliminary step. The calibration of the hybrid sensor system can thus be separated into three single calibrations: calibration of the camera, calibration of the TLS, and extrinsic calibration between TLS and digital camera. In this research, we focus on highly accurate estimation of the extrinsic parameters between the fused sensors; both target-based and targetless (mutual information based) methods are applied. In target-based calibration, different types of observations (image coordinates, TLS measurements, and laser tracker measurements for validation) are utilized, and variance component estimation is applied to optimally assign adequate weights to the observations. Space resection bundle adjustment based on the collinearity equations is solved using the Gauss-Markov and Gauss-Helmert models. Statistical tests are performed to discard outliers and large residuals in the adjustment procedure. Finally, the two approaches are compared, their advantages and disadvantages are investigated, and numerical results are presented and discussed.

  19. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  20. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.; Simpson, A. D. (Technical Monitor)

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC "Pop-Up" Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(Registered Trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  1. Co-Registration Airborne LIDAR Point Cloud Data and Synchronous Digital Image Registration Based on Combined Adjustment

    NASA Astrophysics Data System (ADS)

    Yang, Z. H.; Zhang, Y. S.; Zheng, T.; Lai, W. B.; Zou, Z. R.; Zou, B.

    2016-06-01

    To address the problem of co-registering airborne laser point cloud data with synchronously acquired digital images, this paper proposes a registration method based on combined adjustment. By integrating tie points and point cloud data with elevation-constraint pseudo-observations, and solving for the corrections of the exterior orientation elements of each image by least-squares adjustment, high-precision registration results can be obtained. To ensure the reliability of the tie points and the effectiveness of the pseudo-observations, a point-cloud-constrained SIFT matching and optimization method is proposed, which ensures that the tie points are located in flat terrain areas. In experiments with airborne laser point cloud data and synchronous digital images, there is an error of about 43 pixels in image space when the original POS data are used. If only the bore-sight angles of the POS system are corrected, an error of 1.3 pixels remains in image space. The proposed method treats the corrections of the exterior orientation elements of each image as unknowns and reduces the error to 0.15 pixels.

  2. Diabetic Retinopathy Screening Ratio Is Improved When Using a Digital, Nonmydriatic Fundus Camera Onsite in a Diabetes Outpatient Clinic

    PubMed Central

    Roser, Pia; Kalscheuer, Hannes; Groener, Jan B.; Lehnhoff, Daniel; Klein, Roman; Auffarth, Gerd U.; Nawroth, Peter P.; Schuett, Florian; Rudofsky, Gottfried

    2016-01-01

    Objective. To evaluate the effect of onsite screening with a nonmydriatic, digital fundus camera for diabetic retinopathy (DR) at a diabetes outpatient clinic. Research Design and Methods. This cross-sectional study included 502 patients, 112 with type 1 and 390 with type 2 diabetes. Patients attended screenings for microvascular complications, including diabetic nephropathy (DN), diabetic polyneuropathy (DP), and DR. Single-field retinal imaging with a digital, nonmydriatic fundus camera was used to assess DR. Prevalence and incidence of microvascular complications were analyzed and the ratio of newly diagnosed to preexisting complications for all entities was calculated in order to differentiate natural progress from missed DRs. Results. For both types of diabetes, prevalence of DR was 25.0% (n = 126) and incidence 6.4% (n = 32) (T1DM versus T2DM: prevalence: 35.7% versus 22.1%, incidence 5.4% versus 6.7%). 25.4% of all DRs were newly diagnosed. Furthermore, the ratio of newly diagnosed to preexisting DR was higher than those for DN (p = 0.12) and DP (p = 0.03) representing at least 13 patients with missed DR. Conclusions. The results indicate that implementing nonmydriatic, digital fundus imaging in a diabetes outpatient clinic can contribute to improved early diagnosis of diabetic retinopathy. PMID:26904690

  3. Use of commercial off-the-shelf digital cameras for scientific data acquisition and scene-specific color calibration.

    PubMed

    Akkaynak, Derya; Treibitz, Tali; Xiao, Bei; Gürkan, Umut A; Allen, Justine J; Demirci, Utkan; Hanlon, Roger T

    2014-02-01

    Commercial off-the-shelf digital cameras are inexpensive and easy-to-use instruments that can be used for quantitative scientific data acquisition if images are captured in raw format and processed so that they maintain a linear relationship with scene radiance. Here we describe the image-processing steps required for consistent data acquisition with color cameras. In addition, we present a method for scene-specific color calibration that increases the accuracy of color capture when a scene contains colors that are not well represented in the gamut of a standard color-calibration target. We demonstrate applications of the proposed methodology in the fields of biomedical engineering, artwork photography, perception science, marine biology, and underwater imaging. PMID:24562030
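The linear relationship with scene radiance that the abstract requires starts with black-level subtraction and normalization of the raw sensor codes. A minimal sketch (the levels are hypothetical and sensor-specific; real pipelines also apply per-channel white balance and lens corrections):

```python
import numpy as np

def linearize_raw(raw, black_level, white_level):
    """Map raw sensor codes to values linear in scene radiance.

    A sketch only: black_level and white_level must come from the camera's
    metadata or a dark/flat calibration for the specific sensor."""
    lin = (np.asarray(raw, dtype=float) - black_level) / (white_level - black_level)
    return np.clip(lin, 0.0, 1.0)  # clip below black and above saturation

# Example: a 14-bit sensor with an assumed black level of 2048
codes = np.array([1000, 2048, 9215, 16383])
radiance = linearize_raw(codes, 2048.0, 16383.0)
```

Values at or above the white level are clipped, which is why scientific capture should avoid saturated pixels entirely rather than rely on the clip.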

  4. Photometric-based recovery of illuminant-free color images using a red-green-blue digital camera

    NASA Astrophysics Data System (ADS)

    Luis Nieves, Juan; Plata, Clara; Valero, Eva M.; Romero, Javier

    2012-01-01

    Albedo estimation has traditionally been used to make computational simulations of real objects under different conditions, but as yet no device is capable of measuring albedo directly. The aim of this work is to introduce a photometric-based color imaging framework that can estimate albedo and can reproduce the appearance both indoors and outdoors of images under different lights and illumination geometry. Using a calibration sample set composed of chips made of the same material but different colors and textures, we compare two photometric-stereo techniques, one of them avoiding the effect of shadows and highlights in the image and the other ignoring this constraint. We combined a photometric-stereo technique and a color-estimation algorithm that directly relates the camera sensor outputs with the albedo values. The proposed method can produce illuminant-free images with good color accuracy when a three-channel red-green-blue (RGB) digital camera is used, even outdoors under solar illumination.
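Classic Lambertian photometric stereo, on which such frameworks build, recovers albedo as the norm of a least-squares solution. A minimal single-pixel sketch with three assumed light directions (not the authors' calibration set, and ignoring the shadow/highlight handling they compare):

```python
import numpy as np

# Three assumed (hypothetical) light directions, normalized to unit length
L = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
L = L / np.linalg.norm(L, axis=1, keepdims=True)

# Synthetic Lambertian pixel: intensity_i = albedo * dot(light_i, normal)
albedo_true = 0.7
normal_true = np.array([0.0, 0.0, 1.0])
I = albedo_true * (L @ normal_true)

# Least-squares solve for g = albedo * normal; the albedo is its norm
g, *_ = np.linalg.lstsq(L, I, rcond=None)
albedo = np.linalg.norm(g)
normal = g / albedo
```

With the albedo separated from the illumination geometry, the scene can be re-rendered under any light, which is the "illuminant-free" reproduction the abstract describes.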

  5. Increasing signal-to-noise ratio of reconstructed digital holograms by using light spatial noise portrait of camera's photosensor

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Sergey N.

    2015-01-01

    Digital holography is a technique that includes recording an interference pattern with a digital photosensor, processing the resulting holographic data, and reconstructing the object wavefront. Increasing the signal-to-noise ratio (SNR) of reconstructed digital holograms is especially important in fields such as image encryption, pattern recognition, and static and dynamic display of 3D scenes. In this paper, compensation of the photosensor's light spatial noise portrait (LSNP) is proposed to increase the SNR of reconstructed digital holograms. To verify the proposed method, numerical experiments with computer-generated Fresnel holograms of 512×512 elements were performed, simulating shot registration with a Canon EOS 400D digital camera. It is shown that averaging over frames alone increases the SNR only up to 4 times, with further improvement limited by spatial noise. Applying LSNP compensation together with frame averaging yields a 10-fold SNR increase; this value was obtained for an LSNP measured with 20% error. With a more accurately measured LSNP, the SNR can be increased up to 20 times.
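
    The two-stage noise reduction described above, frame averaging followed by subtraction of the fixed spatial noise pattern, can be sketched with synthetic data; the noise levels and frame count below are illustrative assumptions, not the paper's measured values:

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.uniform(0.2, 0.8, size=(64, 64))      # "true" hologram intensity
lsnp = 0.05 * rng.standard_normal((64, 64))        # fixed light spatial noise portrait
frames = np.stack([signal + lsnp + 0.05 * rng.standard_normal((64, 64))
                   for _ in range(50)])

# Averaging shrinks temporal noise ~ 1/sqrt(50), but the fixed pattern remains.
avg = frames.mean(axis=0)
# Subtracting the separately measured noise portrait removes the spatial term.
compensated = avg - lsnp

err_avg = np.sqrt(np.mean((avg - signal) ** 2))
err_comp = np.sqrt(np.mean((compensated - signal) ** 2))
```

    After compensation the residual error is dominated by the averaged-down temporal noise alone, which is why the combined method outperforms averaging by itself.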

  6. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed and used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, which had no structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by an MTS machine in increments from 0 N to 500 N, simulating a double-leg standing stance. Anterior and lateral images of the specimen were obtained through two CCD cameras. The digital 8-bit images were processed with Image J, digital image processing software freely available from the National Institutes of Health. The procedure includes marker recognition, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The digital image measurements showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the relative error was about 1.11‰. The average vertical displacement of S1 was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system
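
    The center-of-mass step can be sketched as an intensity-weighted average of pixel coordinates; the Gaussian test marker below is a hypothetical stand-in for the black-on-white markers used in the experiment:

```python
import numpy as np

def marker_centroid(gray):
    """Sub-pixel marker position as the intensity-weighted center of mass."""
    ys, xs = np.indices(gray.shape)
    w = gray.astype(float)
    total = w.sum()
    return (ys * w).sum() / total, (xs * w).sum() / total

# Synthetic Gaussian marker centered at row 20.3, column 31.7.
yy, xx = np.indices((64, 64))
img = np.exp(-((yy - 20.3) ** 2 + (xx - 31.7) ** 2) / (2 * 2.0 ** 2))
cy, cx = marker_centroid(img)
```

    Because the weighting uses the full gray-value distribution of the marker blob, the recovered position is not limited to whole-pixel precision, which is how sub-pixel displacement resolution is obtained.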

  7. Evaluation of a novel laparoscopic camera for characterization of renal ischemia in a porcine model using digital light processing (DLP) hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.

    2012-03-01

    Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.

  8. First Results from an Airborne Ka-Band SAR Using SweepSAR and Digital Beamforming

    NASA Technical Reports Server (NTRS)

    Sadowy, Gregory A.; Ghaemi, Hirad; Hensley, Scott C.

    2012-01-01

    SweepSAR is a wide-swath synthetic aperture radar technique being studied for application on future Earth science radar missions. This paper describes the design of an airborne radar demonstration that simulates an 11-m L-band (1.2-1.3 GHz) reflector geometry at Ka-band (35.6 GHz) using a 40-cm reflector. The Ka-band SweepSAR demonstration system was flown on the NASA DC-8 airborne laboratory and used to study engineering performance trades and array calibration for SweepSAR configurations. We present an instrument and experiment overview, instrument calibration, and first results.

  9. Stochastic modeling and triangulation for an airborne digital three line scanner

    NASA Astrophysics Data System (ADS)

    Jung, Won Jo

    The 3-OC is one of the newest digital three-line scanners on the market. Unlike other three-line scanners, which use a single optical system, the 3-OC uses three separate optical systems moving together; this thesis therefore aimed to develop a photogrammetric model for the 3-OC. To precisely relate ground space to the corresponding image space, all the exterior orientation (E.O.) parameters of the image lines must be estimated with a bundle block adjustment. The biggest hurdle is the large number of exterior orientation parameters, because one image strip of the 3-OC usually contains tens of thousands of lines. To reduce the number of unknowns, the E.O. parameters of all three cameras at each imaging instant were represented by transformed parameters with respect to the gimbal rotation center, cutting the unknowns to one third of the original number. However, the number of E.O. parameters was still too large, and estimating them requires more observations than are practical to obtain. Two kinds of approaches address this problem: reducing the number of unknown parameters, and providing fictitious observations through a stochastic model. As the title of this thesis implies, a stochastic trajectory model was implemented here. The stochastic relationships between two adjacent lines, as described in previous work, were expanded to relationships between two adjacent image observations, so that the E.O. parameters of the lines between two adjacent observations can be recovered by interpolation. With enough pass points, all the E.O. parameters could be recovered accurately, and the number of unknown E.O. parameters was drastically reduced as well. In this thesis, aerial triangulations of the suggested photogrammetric model were performed with self-calibrating some of the

  10. Reading Out Single-Molecule Digital RNA and DNA Isothermal Amplification in Nanoliter Volumes with Unmodified Camera Phones.

    PubMed

    Rodriguez-Manzano, Jesus; Karymov, Mikhail A; Begolo, Stefano; Selck, David A; Zhukov, Dmitriy V; Jue, Erik; Ismagilov, Rustem F

    2016-03-22

    Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing approach for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709
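
    The idea behind a ratiometric readout, dividing one color channel by another so that overall brightness cancels, can be sketched as follows; the channel choice and threshold are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def ratiometric_call(rgb_well, threshold=1.2):
    """Classify a well as positive if mean(R)/mean(G) exceeds a threshold.

    Channel ratios cancel overall brightness, so the call is robust to
    lighting and exposure differences between phone cameras.
    """
    r = rgb_well[..., 0].mean()
    g = rgb_well[..., 1].mean()
    return (r / g) > threshold

# The same well imaged under bright and dim lighting yields the same call.
well = np.dstack([np.full((8, 8), 150.0),
                  np.full((8, 8), 100.0),
                  np.full((8, 8), 80.0)])
dim = well * 0.4   # uniformly darker exposure of the same scene
```

    A per-channel threshold on raw intensity would flip its decision between the two exposures; the ratio does not, which is the property that makes the readout hardware-agnostic.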

  11. New measuring concepts using integrated online analysis of color and monochrome digital high-speed camera sequences

    NASA Astrophysics Data System (ADS)

    Renz, Harald

    1997-05-01

    High-speed sequences allow a subjective assessment of very fast processes and serve as an important basis for the quantitative analysis of movements. Computer systems help to acquire, handle, display, and store digital image sequences and to perform measurement tasks automatically. High-speed cameras have been used for years in safety testing, material testing, and production optimization. To reach speeds of 1000 or more images per second, mainly 16 mm film cameras have been used, since they provide excellent image resolution and the required time resolution; until now, however, most results have been judged only by viewing. For special applications such as crash or high-g sled tests in the automobile industry, image-analysis techniques have been used to measure the motion of objects within the images. High-speed films shot during the short impact allow judgment of the dynamic scene and serve as an important basis for quantitative analysis of the very fast movements, yielding exact values of the velocity and acceleration to which the dummies or vehicles are exposed. To analyze the sequences, the positions of signalized points, usually markers fixed by the test engineers before a test, must be measured frame by frame. The trajectories show the temporal behavior of the test objects and are the basis for calibrated diagrams of distance, velocity, and acceleration. Today, 16 mm film cameras are increasingly being replaced by electronic high-speed cameras. The development of high-speed recording systems is far advanced, their prices are increasingly comparable to those of traditional film cameras, and their resolution has increased greatly. The new cameras are 'crashproof' and can be used at similar sizes for tasks similar to those of 16 mm film cameras. High-speed video cameras now offer an easy setup and direct access to

  12. An image compression algorithm for a high-resolution digital still camera

    NASA Technical Reports Server (NTRS)

    Nerheim, Rosalee

    1989-01-01

    The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach that of 35 mm film. The camera, which will have the same general shape and handling as a 35 mm camera, will be able to send images to Earth in near real time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, each image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.
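
    The lossless/lossy distinction can be illustrated with two toy schemes: delta coding, which round-trips exactly, and uniform quantization, which trades bounded error for a smaller symbol alphabet. Neither is the ESC's actual algorithm; they are minimal stand-ins for the two classes compared in the study:

```python
import numpy as np

def delta_encode(pixels):
    """Lossless: store successive differences (exactly invertible)."""
    return np.diff(pixels.astype(np.int16), prepend=np.int16(0))

def delta_decode(deltas):
    """Undo delta coding by cumulative summation."""
    return np.cumsum(deltas).astype(np.uint8)

def quantize(pixels, step=16):
    """Lossy: snap values to coarse levels; error is bounded by step/2."""
    return ((pixels // step) * step + step // 2).astype(np.uint8)

row = np.array([10, 12, 11, 13, 200, 201, 199, 198], dtype=np.uint8)
assert np.array_equal(delta_decode(delta_encode(row)), row)   # exact round trip
assert np.max(np.abs(quantize(row).astype(int) - row.astype(int))) <= 8
```

    Delta coding shrinks smooth regions because small differences compress well under entropy coding, while quantization discards precision outright; the ESC study weighs exactly this fidelity-versus-storage trade-off.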

  13. Dgnss/ins Van Project For Road Survey In The Cei Countries: The Problem of Digital Cameras Calibration

    NASA Astrophysics Data System (ADS)

    Deruda, G.; Falchi, E.; Sanna, G.; Vacca, G.

    In order to assess the influence of objective-lens distortion in digital photo and video cameras, a series of experiments was performed using a Nikon digital photocamera, a Samsung videocamera, and a Creative webcam, with the aim of testing the possibility of enhancing camera images by means of resampling techniques. For this purpose, a network of fiducial points was established on two walls of a building in the Faculty of Engineering of Cagliari, and point coordinates were obtained by means of a topographic survey. Images and video sequences of the fronts were taken at several distances and different focal lengths, yielding an estimate of the lens behavior, on the basis of which a regular grid of point displacements on the photo was generated for each camera. The grid was used in a resampling procedure to remove the influence of distortion from the images. The improvement in accuracy was estimated at between about 30 and 50%.
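
    A resampling correction of this kind reduces to bilinear interpolation of the image at positions shifted by a displacement grid. A minimal sketch follows; the displacement grids here are synthetic, standing in for grids interpolated from the measured distortion of the fiducial network:

```python
import numpy as np

def remap_bilinear(img, dy, dx):
    """Resample img at (y + dy, x + dx) with bilinear interpolation.

    dy, dx: per-pixel displacement grids; zero grids give the identity.
    """
    h, w = img.shape
    ys, xs = np.indices((h, w), dtype=float)
    sy = np.clip(ys + dy, 0, h - 1)
    sx = np.clip(xs + dx, 0, w - 1)
    y0 = np.floor(sy).astype(int); x0 = np.floor(sx).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    fy = sy - y0; fx = sx - x0
    top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
    bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
    return top * (1 - fy) + bot * fy

img = np.arange(25, dtype=float).reshape(5, 5)
zero = np.zeros_like(img)
assert np.allclose(remap_bilinear(img, zero, zero), img)  # identity grid
shifted = remap_bilinear(img, zero, zero + 1.0)           # sample one pixel right
assert np.allclose(shifted[:, :-1], img[:, 1:])
```

    In the paper's workflow the dy, dx grids encode the lens distortion measured at the fiducial points, so the remap pulls each pixel back to its undistorted position.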

  14. Trend of digital camera and interchangeable zoom lenses with high ratio based on patent application over the past 10 years

    NASA Astrophysics Data System (ADS)

    Sensui, Takayuki

    2012-10-01

    Although digitalization has tripled the consumer camera market, extreme price reductions for fixed-lens cameras have cut profitability. As a result, a number of manufacturers have entered the market for system DSCs (digital still cameras with interchangeable lenses), where large profit margins are possible, and many high-ratio zoom lenses with image stabilization have been released. Quiet actuators are another indispensable component. A design that degrades little under all types of errors is preferred, for a good balance of size, lens performance, and production yield. Decentering sensitivity of the moving groups, such as that caused by tilting, is especially important, and image stabilization mechanisms actively shift lens groups. The development of high-ratio zoom lenses with vibration reduction is therefore confronted by the challenge of performance loss due to decentering, making control of decentering sensitivity between lens groups essential. While there are a number of ways to align lenses (axial alignment), shock resistance and tolerance of environmental conditions must also be considered. Naturally, it is very difficult, if not impossible, to make lenses smaller and achieve low decentering sensitivity at the same time. A 4-group zoom construction helps make lenses smaller but has greater decentering sensitivity; a 5-group configuration makes small lenses more difficult but enables lower decentering sensitivities. At Nikon, the most advantageous construction is selected for each lens based on its specifications. The AF-S DX NIKKOR 18-200mm f/3.5-5.6G ED VR II and AF-S NIKKOR 28-300mm f/3.5-5.6G ED VR are excellent examples of this.

  15. Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions

    NASA Astrophysics Data System (ADS)

    Kirk, R. L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E. M.; Gaddis, L. R.; Johnson, J. R.; Soderblom, L. A.; Ward, A. W.; Smith, P. H.; Britt, D. T.

    1999-04-01

    This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ~10^3 features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ~3×10^5 closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used.

  16. Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E.M.; Gaddis, L.R.; Johnson, J. R.; Soderblom, L.A.; Ward, A.W.; Smith, P.H.; Britt, D.T.

    1999-01-01

    This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ~10^3 features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ~3×10^5 closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used. Copyright 1999 by the American Geophysical Union.

  17. Low-cost camera modifications and methodologies for very-high-resolution digital images

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...

  18. Greenness indices from digital cameras predict the timing and seasonal dynamics of canopy-scale photosynthesis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The proliferation of tower-mounted cameras co-located with eddy covariance instrumentation provides a novel opportunity to better understand the relationship between canopy phenology and the seasonality of canopy photosynthesis. In this paper, we describe the abilities and limitations of webcams to ...

  19. Levee crest elevation profiles derived from airborne lidar-based high resolution digital elevation models in south Louisiana

    USGS Publications Warehouse

    Palaseanu-Lovejoy, Monica; Thatcher, Cindy A.; Barras, John A.

    2014-01-01

    This study explores the feasibility of using airborne lidar surveys to derive high-resolution digital elevation models (DEMs) and develops an automated procedure to extract levee longitudinal elevation profiles, both for federal levees in the Atchafalaya Basin and for local levees in Lafourche Parish. Traditional manual surveying methods for mapping levees are costly and time-consuming, typically producing cross-levee profiles every few hundred meters at best. The purpose of our paper is to describe and test methods for extracting levee crest elevations in an efficient, comprehensive manner using high-resolution lidar-generated DEMs. In addition, the vertical uncertainty in the elevation data and its effect on the resulting estimates of levee crest heights is addressed in assessing whether the federal levees in our study meet the USACE minimum height design criteria.
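
    An automated crest extraction of this kind can be sketched as taking, for each along-levee position, the maximum DEM elevation within a cross-levee search window; the window width and the synthetic ridge below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def crest_profile(dem, centerline_cols, half_width=2):
    """Levee crest elevation profile from a DEM.

    dem: (rows, cols) elevation grid; centerline_cols: column index of the
    mapped levee centerline at each row. For each row, the crest height is
    the maximum elevation within +/- half_width columns of the centerline.
    """
    crest = np.empty(dem.shape[0])
    for i, c in enumerate(centerline_cols):
        lo = max(c - half_width, 0)
        hi = min(c + half_width + 1, dem.shape[1])
        crest[i] = dem[i, lo:hi].max()
    return crest

# Synthetic levee: a ridge near column 5 whose crest dips at the third row.
dem = np.zeros((4, 10))
dem[:, 5] = [3.0, 3.1, 2.4, 3.0]   # crest heights along the levee
dem[:, 4] = dem[:, 6] = 1.5        # levee shoulders
profile = crest_profile(dem, centerline_cols=[5, 5, 5, 5])
```

    Taking the window maximum makes the profile tolerant of a centerline digitized a pixel or two off the true crest, and a low point in the profile (row 2 here) flags a potential overtopping location.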

  20. Design of Digital Controller for a CCD Camera with Dual-Speed Tracking Imaging on Same Frame

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Juan; Li, Bin-Hua; Li, Yong-Ming; He, Chun

    2007-12-01

    High-performance CCD cameras have been widely used in astronomical observations, and the techniques for observing moving objects or still objects independently are mature. However, when both moving objects (such as satellites, debris, and asteroids) and still objects (such as stars) are observed at the same time with the same CCD camera, the images of one of the two object types are elongated most of the time. To solve this problem, the authors developed a novel imaging technique and a corresponding observation method. The photosensitive areas of some CCD arrays are physically divided into two or more zones; based on such arrays, the new idea can be implemented: one half of the photosensitive area images the still objects in stare mode, while the other half images the moving objects in drift-scan mode. Both moving and still objects can thus be tracked at the same time, without elongation of their images, on the same CCD frame, so the technique is called Dual-Speed Tracking Imaging on Same Frame (DSTIS). This paper briefly introduces the operating principle of the DSTIS CCD camera. After discussing the requirements for a digital controller for the camera, the design philosophy and basic structure of the controller are presented. Simulation and testing results are then shown, and the problems encountered during simulation and testing are analyzed in detail and solved. The software simulation and hardware testing results verify that the design is correct.

  1. Applying emerging digital video interface standards to airborne avionics sensor and digital map integrations: benefits outweigh the initial costs

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    1996-06-01

    Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) by tailoring of the vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Analog video interfaces degrade when induced system noise is present, and further signal degradation has traditionally accompanied the data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied while developing the avionics video and computing architecture, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring of, and technical involvement in, video standards groups provides the knowledge base that avionics systems engineering organizations need to architect adaptable and extensible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums such as ITU-R (formerly CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog broadcasting facilities to fully digital computerized systems. An opportunity exists to implement these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal

  2. A Digital Readout System For The CSO Microwave Kinetic Inductance Camera

    NASA Astrophysics Data System (ADS)

    Max-Moerbeck, Walter; Mazin, B. A.; Zmuidzinas, J.

    2007-12-01

    Submillimeter galaxies are important to the understanding of galaxy formation and evolution, and determination of their spectral energy distributions in the millimeter and submillimeter regimes provides important and powerful diagnostics. Our group is developing a camera for the Caltech Submillimeter Observatory (CSO) using Microwave Kinetic Inductance Detectors (MKIDs), superconducting devices whose impedance changes with the absorption of photons. The camera will have 600 spatial pixels and 4 bands at 750 μm, 850 μm, 1.1 mm, and 1.3 mm. For each spatial pixel, the radiation is coupled to the MKIDs using phased-array antennas; the signal is split into the 4 bands using filters and detected using the superconductor as part of an MKID's resonant circuit. Detection consists of measuring the changes in transmission through the resonator when it is illuminated. By designing resonant circuits with different resonant frequencies and high transmission out of resonance, MKIDs can be frequency-domain multiplexed, allowing the simultaneous readout of many detectors through a single coaxial cable. The readout system uses microwave IQ modulation and is based on commercial electronic components operating at room temperature. The basic readout has been demonstrated at the CSO, and we are implementing an improved design to be tested on a prototype system with 6×6 pixels and 4 colors next April at the CSO.

  3. Detecting Chlorophyll and Phycocyanin in Lake Texoma Using in Situ Photo from GPS Digital Camera and Landsat 8 OLI Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Hambright, K.; Xiao, X.

    2013-12-01

    Characterizing the temporal and spatial changes of algal blooms across lake systems is difficult with conventional sampling methodologies. The application of remote sensing to lake water quality has improved significantly in recent years, but there have been few reports of in situ photos from GPS digital cameras or the new Landsat 8 OLI satellite being used to monitor algal blooms in freshwater lakes. A pilot study was carried out in Lake Texoma, Oklahoma, on April 25, 2013. At each of 12 sites, pigments (chlorophyll a and phycocyanin concentrations), in situ spectral data, and digital photos were acquired using a Hydrolab DS5X sonde (calibrated routinely against laboratory standards), an ASD FieldSpec, and a GPS camera, respectively. The field spectral data were resampled to blue, green, and red ranges matching the spectral resolution of the first four Landsat 8 OLI bands by averaging the spectral reflectance signature. Compared with other ratio indices, red/blue was the best index for predicting phycocyanin and chlorophyll a concentrations, and whole-depth pigment concentrations were selected for remote sensing detection in Lake Texoma in the subsequent analysis. An image-based darkest-pixel subtraction method was used for atmospheric correction of the Landsat 8 OLI images, after which DN values were extracted and used to compute the band 4 (red) / band 1 (blue) ratio. High correlations were found both between resampled spectral reflectance and the red/blue ratio of photo DN values (R2=0.9425, n=12) and between resampled spectral reflectance and the red/blue ratio of Landsat 8 OLI image DN values (R2=0.8476, n=12). Finally, we analyzed the correlation between whole-depth pigment concentrations and the red/blue DN ratios of both the Landsat 8 OLI images and the digital photos. There were higher correlation coefficients
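
    A band-ratio regression of the kind used here can be sketched with NumPy's polyfit; the DN values and pigment concentrations below are synthetic illustrations, not the Lake Texoma measurements:

```python
import numpy as np

# Hypothetical calibration data: red and blue DN values at sampling sites,
# with matched laboratory chlorophyll-a concentrations (synthetic, ug/L).
red = np.array([80, 95, 110, 70, 120, 100], dtype=float)
blue = np.array([100, 98, 96, 105, 95, 99], dtype=float)
chl = np.array([8.0, 12.1, 16.3, 5.2, 19.0, 13.5])

ratio = red / blue                            # red/blue band ratio index
slope, intercept = np.polyfit(ratio, chl, 1)  # linear calibration fit
predicted = slope * ratio + intercept
r2 = 1 - np.sum((chl - predicted) ** 2) / np.sum((chl - chl.mean()) ** 2)
```

    Once the slope and intercept are calibrated against field samples, the same ratio computed from image DN values maps every pixel to a pigment estimate.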

  4. Digital multi-focusing from a single photograph taken with an uncalibrated conventional camera.

    PubMed

    Cao, Yang; Fang, Shuai; Wang, Zengfu

    2013-09-01

    The demand to restore all-in-focus images from defocused images and to produce photographs focused at different depths is emerging in more and more cases, such as low-end hand-held cameras and surveillance cameras. In this paper, we solve this challenging multi-focusing problem with a single image taken with an uncalibrated conventional camera. Unlike all existing multi-focusing approaches, our method does not need a deconvolution step, which is time-consuming and causes ringing artifacts in the focused region and low depth of field. This paper proposes a novel systematic approach to multi-focusing from a single photograph. First, with an optical justification of the local smoothness assumption, we present a new point-to-point defocus model. Next, the blur map of the input image, which reflects the amount of defocus blur at each pixel, is estimated in two steps: 1) with the sharp-edge prior, a rough blur map is obtained by estimating the blur amount in edge regions; 2) the guided image filter propagates the blur values from the edge regions to the whole image, yielding a refined blur map. This allows the all-in-focus photograph to be restored from a defocused input. To further produce photographs focused at different depths, a depth map must be derived from the blur map; user interaction and a binary graph-cut algorithm are introduced to eliminate the ambiguity about the focal plane. Coupled with the camera parameters, this approach produces images focused at different depths. The performance of the new multi-focusing algorithm is evaluated both objectively and subjectively on various test images, and both evaluations demonstrate that it produces high-quality depth maps and multi-focusing results, outperforming previous approaches. PMID

  5. Estimation of Leaf Area Index Using Downward and Upward Looking Digital Cameras in a Deciduous Broadleaf Forest

    NASA Astrophysics Data System (ADS)

    Choi, J.; Kang, S.; Lim, J.; Nasahara, K. N.

    2010-12-01

    Monitoring the distribution and changes of leaf area index (LAI) is important for assessing the growth of a forest ecosystem. However, direct measurement of LAI is difficult and time consuming. In this study, we suggest an indirect method to calculate LAI based on analyses of digital spectral images from the Phenological Eyes Network (PEN) system, which consists of an Automatic-capturing Digital Fisheye Camera (ADFC) and a Hemi-Spherical Spectroradiometer (HSSR). Our main purpose is to develop indirect methods for estimating LAI using either an upward or a downward ADFC without other ancillary field observations. In the development stage, we used field LAI measured by the LAI-2000 plant canopy analyzer (PCA, LI-COR), two ADFCs, and the Hemiview software. The ADFC is a set of a Nikon Coolpix 4500 camera and an FC-E8 fisheye lens, and it automatically captures downward and upward canopy images at hourly intervals. The downward ADFC was used to calculate various vegetation indices through RGB analysis, while the upward ADFC was used to estimate LAI using the Hemiview software. The threshold value in Hemiview is important for separating leaves from background such as sky, wood, and edges in the digital image. To determine an accurate threshold value, we compared field-measured LAI with Hemiview LAI derived from upward ADFC digital images. Based on the determined threshold value, an objective method that recognizes peculiar patterns in the RGB histogram around the threshold was developed and applied to estimate LAI from upward ADFC images alone. In addition, two spectral indices (the G/R ratio and 2G-RB) were calculated from the downward ADFC images. The relations between these spectral indices and the LAI time series from the upward ADFC images were investigated, and regression models were developed. The regression models were then used to reconstruct the seasonal LAI variation from the downward ADFC images alone.
Both field-measured and upward ADFC-derived LAIs showed good agreement (R2
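
The two downward-camera indices are simple per-image statistics. A minimal sketch, reading "2G-RB" as the excess-green index 2G - R - B (our assumption; the abstract does not spell it out):

```python
import numpy as np

def greenness_indices(img):
    """Scene-average vegetation indices from an RGB image array
    (H x W x 3), as computed from the downward ADFC images:
    the G/R ratio and excess green 2G - R - B."""
    r = img[..., 0].astype(float).mean()
    g = img[..., 1].astype(float).mean()
    b = img[..., 2].astype(float).mean()
    return g / r, 2.0 * g - r - b

# a toy 2 x 2 "canopy" image, uniform color (R, G, B) = (50, 100, 25)
img = np.tile(np.array([50, 100, 25], dtype=np.uint8), (2, 2, 1))
print(greenness_indices(img))   # -> (2.0, 125.0)
```

The regression step then relates a season-long series of such index values to the upward-camera LAI estimates.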

  6. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than
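
The heart of Rubin's method is that spatial autocorrelation decays more slowly with pixel offset for coarse sediment than for fine sediment; the calibration of that curve against sieved samples is omitted in this sketch:

```python
import numpy as np

def autocorrelation_curve(img, max_offset=8):
    """Normalized horizontal autocorrelation versus pixel offset -- the
    quantity the algorithm calibrates against samples of known size:
    coarse grains stay correlated over larger offsets than fine ones."""
    img = img.astype(float) - img.mean()
    var = (img * img).mean()
    return np.array([(img[:, :-d] * img[:, d:]).mean() / var
                     for d in range(1, max_offset + 1)])

rng = np.random.default_rng(0)
fine = rng.random((64, 64))                          # "grains" of 1 pixel
coarse = np.repeat(rng.random((64, 16)), 4, axis=1)  # "grains" of 4 pixels
print(autocorrelation_curve(coarse)[1], autocorrelation_curve(fine)[1])
```

At an offset of 2 pixels the coarse texture remains strongly correlated while the fine texture has already decorrelated, which is what lets the calibrated curve resolve grain size.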

  7. Design and analysis of filter-based optical systems for spectral responsivity estimation of digital video cameras

    NASA Astrophysics Data System (ADS)

    Chang, Gao-Wei; Jian, Hong-Da; Yeh, Zong-Mu; Cheng, Chin-Pao

    2004-02-01

    In this paper, a filter-based optical system with sophisticated filter selection is designed for estimating the spectral responsivities of digital video cameras. Filter selection in the presence of noise is central to the design of the optical system, since the spectral filters largely prescribe the structure of the perturbed system. A theoretical basis is presented to confirm that sophisticated filter selection can make this system as insensitive to noise as possible. We also propose a filter selection method based on the orthogonal-triangular (QR) decomposition with column pivoting (QRCP). To investigate the noise effects, we assess the estimation errors between the actual and estimated spectral responsivities at different signal-to-noise ratio (SNR) levels of an eight-bit-per-channel camera. Simulation results indicate that the proposed method yields satisfactory estimation accuracy: the filter-based optical system with spectral filters selected by the QRCP-based method is much less sensitive to noise than systems with filters chosen by other selection methods.
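
A minimal sketch of QRCP-based column selection: the pivot order of a pivoted QR factorization greedily ranks candidate filters by how much independent spectral information each adds (the paper's full design criterion also accounts for noise, which this sketch ignores):

```python
import numpy as np
from scipy.linalg import qr

def select_filters(T, k):
    """Choose k columns (candidate filters) from a transmittance matrix
    T (wavelengths x candidates) using QR with column pivoting: at each
    step the pivot picks the column with the largest residual norm,
    i.e. the filter least explained by those already chosen."""
    _, _, pivots = qr(T, pivoting=True)
    return pivots[:k]

# three candidates: two identical filters and one independent one;
# a well-conditioned selection must not pick both duplicates
T = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.2, 0.2, 0.1]])
chosen = select_filters(T, 2)
print(sorted(chosen))
```

The selected pair spans a rank-2 subspace, whereas choosing the two duplicate columns would leave the estimation problem singular.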

  8. A digital architecture for striping noise compensation in push-broom hyperspectral cameras

    NASA Astrophysics Data System (ADS)

    Valenzuela, Wladimir E.; Figueroa, Miguel; Pezoa, Jorge E.; Meza, Pablo

    2015-09-01

    We present a striping-noise compensation architecture for hyperspectral push-broom cameras, implemented on a Field-Programmable Gate Array (FPGA). The circuit is fast, compact, and low power, and is capable of eliminating striping noise in-line during image acquisition. The architecture implements a multidimensional neural network (MDNN) algorithm for striping-noise compensation previously reported by our group. The algorithm relies on the assumption that the amount of light impinging on neighboring photodetectors is approximately the same in the spatial and spectral dimensions. Under this assumption, two striping-noise parameters are estimated using spatial and spectral information from the raw data. We implemented the circuit on a Xilinx ZYNQ XC7Z2010 FPGA and tested it with images obtained from a NIR N17E push-broom camera with a frame rate of 25 fps and a band-pixel rate of 1.888 MHz. The setup consists of a loop of 320 samples of 320 spatial lines and 236 spectral bands between 900 and 1700 nanometers, captured under laboratory conditions with a rigid push-broom controller. The noise compensation core can run at more than 100 MHz and consumes less than 30 mW of dynamic power, using less than 10% of the logic resources available on the chip. It also uses one of the two ARM processors available on the FPGA for data acquisition and communication.
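
The neighboring-detector assumption can be illustrated with a much simpler destriping scheme than the MDNN: moment matching, which normalizes each detector's apparent gain and offset to the average column statistics. This is a stand-in for the paper's algorithm, not a reimplementation:

```python
import numpy as np

def destripe(frame):
    """Moment-matching destriping for one push-broom band
    (lines x detectors): assumes neighboring detectors see nearly the
    same radiance, so per-detector deviations in column mean and
    standard deviation are treated as fixed-pattern gain/offset noise."""
    mu = frame.mean(axis=0)                # per-detector offset estimate
    sd = frame.std(axis=0)                 # per-detector gain estimate
    sd = np.where(sd > 0, sd, 1.0)         # guard flat columns
    return (frame - mu) / sd * sd.mean() + mu.mean()

# smooth scene (constant across detectors) + per-detector gain/offset
rng = np.random.default_rng(1)
scene = np.tile(np.linspace(100.0, 200.0, 320), (240, 1)).T  # 320 x 240
gain = rng.uniform(0.8, 1.2, 240)
offset = rng.uniform(-10.0, 10.0, 240)
striped = scene * gain + offset
clean = destripe(striped)
```

On this synthetic frame the per-line spread across detectors collapses to zero after compensation, while the striped input varies strongly from detector to detector.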

  9. Definition and trade-off study of reconfigurable airborne digital computer system organizations

    NASA Technical Reports Server (NTRS)

    Conn, R. B.

    1974-01-01

    A highly reliable, fault-tolerant, reconfigurable computer system for aircraft applications was developed. The development and application of reliability and fault-tolerance assessment techniques are described. Particular emphasis is placed on the needs of an all-digital fly-by-wire control system appropriate for a passenger-carrying airplane.

  10. A practical enhanced-resolution integrated optical-digital imaging camera (PERIODIC)

    NASA Astrophysics Data System (ADS)

    Mirotznik, M.; Mathews, S.; Plemmons, R.; Pauca, P.; Torgersen, T.; Barnard, R.; Gray, B.; Zhang, Q.; van der Gracht, J.; Curt, P.; Bodnar, M.; Prasad, S.

    2009-05-01

    An integrated array computational imaging system, dubbed PERIODIC, is presented which is capable of exploiting a diverse variety of optical information including sub-pixel displacements, phase, polarization, intensity, and wavelength. Several applications of this technology will be presented including digital superresolution, enhanced dynamic range and multi-spectral imaging. Other applications include polarization based dehazing, extended depth of field and 3D imaging. The optical hardware system and software algorithms are described, and sample results are shown.

  11. Study on key techniques for camera-based hydrological record image digitization

    NASA Astrophysics Data System (ADS)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping

    2015-10-01

    With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine, and the mining industry, grid drawing sheets are commonly used to record observations from sensors. However, these paper drawings may be destroyed or contaminated through improper preservation or overuse, and manually transcribing their data into a computer is laborious and error prone. Digitizing these drawings and building the corresponding database therefore ensures the integrity of the data and provides invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques: image segmentation, intersection point localization, and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water-level sheet image is proposed, based on adaptively fusing gradient and color information. Second, a fast search strategy for cross-point location is introduced, so that point-by-point processing is avoided with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes domain knowledge of hydrology. The processing speed is accelerated while the accuracy remains satisfactory. Experiments on several real water-level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.

  12. High precision digital control LED spot light source used to calibrate camera

    NASA Astrophysics Data System (ADS)

    Du, Boyu; Xu, Xiping; Liu, Yang

    2015-04-01

    This paper introduces a method of using an LED point light source as the camera calibration light. According to the characteristics of the LED point light source, a constant current source provides the necessary current, and an illuminometer measures the luminance of the LED point light source. The constant current source is controlled by an ARM MCU, which exchanges data with the host computer via serial communication. A PC is used as the host computer; it adjusts the current according to the measured luminance of the LED point light source until the luminance reaches the anticipated value. Experimental analysis shows that the LED point light source meets the requirements of a calibration light source, achieves the desired accuracy, and that its luminance can be controlled adaptively. The system is convenient and flexible, and its performance is stable and reliable.
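
The host-side adjustment described above is essentially a proportional feedback loop; a sketch with illustrative names, gains, and a simulated LED (not the paper's firmware):

```python
def settle_current(measure, target_lux, current=0.10, gain=5e-4,
                   tol=0.5, max_iter=100):
    """Repeatedly read the illuminometer (`measure`) and nudge the
    constant-current setpoint in proportion to the luminance error
    until the LED output reaches the target."""
    for _ in range(max_iter):
        error = target_lux - measure(current)
        if abs(error) <= tol:
            break
        current += gain * error      # proportional correction
    return current

# simulated LED: luminance proportional to drive current
led = lambda amps: 1000.0 * amps
print(round(settle_current(led, target_lux=120.0), 3))   # -> 0.12
```

With this loop gain the error halves on every iteration, so the setpoint converges within a handful of read-adjust cycles.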

  13. In vivo imaging of scattering and absorption properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Yoshida, Keiichiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2014-03-01

    We investigate a method to estimate spectral images of the reduced scattering coefficients and absorption coefficients of in vivo exposed brain tissue over the visible to near-infrared range (500-760 nm), based on diffuse reflectance spectroscopy using a digital RGB camera. In the proposed method, multispectral reflectance images of the in vivo exposed brain are reconstructed from the digital red, green, and blue images using the Wiener estimation algorithm. Monte Carlo simulation-based multiple regression analysis of the absorbance spectra is then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated and deoxygenated hemoglobin are estimated as the absorption parameters, whereas the scattering amplitude a and the scattering power b in the expression μs' = aλ^-b are the scattering parameters. The spectra of the absorption and reduced scattering coefficients are reconstructed from these parameters, and finally, the spectral images of the absorption and reduced scattering coefficients are estimated. The estimated images of absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of reduced scattering coefficients showed a broad scattering spectrum with larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. In vivo experiments with the exposed brain of rats during cortical spreading depolarization (CSD) confirmed the ability of the method to evaluate both hemodynamics and changes in tissue morphology due to electrical depolarization.
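
Wiener estimation, the first stage of the pipeline, maps 3-channel camera responses back to multispectral reflectance using second-order statistics of training spectra. A noise-free sketch with toy data (the practical estimator adds a noise covariance term inside the inverse):

```python
import numpy as np

def wiener_estimator(spectra, response):
    """Wiener estimation matrix W mapping camera responses to spectra:
    `spectra` is a training set (samples x wavelengths), `response` the
    camera's spectral sensitivity (3 x wavelengths). A spectrum is then
    recovered as W @ rgb."""
    Css = spectra.T @ spectra / len(spectra)   # spectral autocorrelation
    return Css @ response.T @ np.linalg.inv(response @ Css @ response.T)

# toy example: spectra confined to a 3-D subspace are recovered exactly
rng = np.random.default_rng(0)
basis = rng.random((6, 3))                 # 6 "wavelengths", 3-D subspace
spectra = rng.random((20, 3)) @ basis.T    # 20 training spectra
response = rng.random((3, 6))              # 3 camera channels
W = wiener_estimator(spectra, response)
rgb = response @ spectra[0]
print(np.allclose(W @ rgb, spectra[0]))    # -> True
```

Real reflectance spectra are not confined to a 3-D subspace, which is why the reconstruction is approximate in practice and why the training set should resemble the tissue being imaged.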

  14. Noncontact imaging of plethysmographic pulsation and spontaneous low-frequency oscillation in skin perfusion with a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Hoshi, Akira; Aoki, Yuta; Nakano, Kazuya; Niizeki, Kyuichi; Aizu, Yoshihisa

    2016-03-01

    A non-contact imaging method with a digital RGB camera is proposed to evaluate the plethysmogram and spontaneous low-frequency oscillation. In vivo experiments with human skin during mental stress induced by the Stroop color-word test demonstrated the feasibility of the method for evaluating the activity of the autonomic nervous system.

  15. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  16. Digital Intermediate Frequency Receiver Module For Use In Airborne Sar Applications

    DOEpatents

    Tise, Bertice L.; Dubbert, Dale F.

    2005-03-08

    A digital IF receiver (DRX) module directly compatible with advanced radar systems such as synthetic aperture radar (SAR) systems. The DRX can combine a 1 G-Sample/sec 8-bit ADC with high-speed digital signal processor, such as high gate-count FPGA technology or ASICs to realize a wideband IF receiver. DSP operations implemented in the DRX can include quadrature demodulation and multi-rate, variable-bandwidth IF filtering. Pulse-to-pulse (Doppler domain) filtering can also be implemented in the form of a presummer (accumulator) and an azimuth prefilter. An out of band noise source can be employed to provide a dither signal to the ADC, and later be removed by digital signal processing. Both the range and Doppler domain filtering operations can be implemented using a unique pane architecture which allows on-the-fly selection of the filter decimation factor, and hence, the filter bandwidth. The DRX module can include a standard VME-64 interface for control, status, and programming. An interface can provide phase history data to the real-time image formation processors. A third front-panel data port (FPDP) interface can send wide bandwidth, raw phase histories to a real-time phase history recorder for ground processing.
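
The quadrature demodulation stage can be sketched as mixing with a complex oscillator followed by low-pass filtering and decimation; the boxcar filter and parameters below are illustrative stand-ins, not the DRX's actual multi-rate, variable-bandwidth filters:

```python
import numpy as np

def quadrature_demodulate(x, f_if, fs, decim):
    """Mix real IF samples to complex baseband with a numerically
    controlled oscillator, low-pass with a crude moving average to
    reject the image at 2*f_if, and decimate."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_if * n / fs)   # mix down
    kernel = np.ones(decim) / decim                      # boxcar low-pass
    return np.convolve(baseband, kernel, mode="same")[::decim]

# a pure tone at the IF demodulates to a (near-)constant complex level
fs, f_if = 1000.0, 250.0
n = np.arange(1024)
tone = np.cos(2 * np.pi * f_if * n / fs)
iq = quadrature_demodulate(tone, f_if, fs, decim=8)
```

Away from the boundary samples, the magnitude of the demodulated output settles at 0.5, the expected amplitude of one complex sideband of a unit-amplitude real cosine.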

  17. Stream Segment Energy Balances Using Digital Cameras and Inexpensive Light Sensors

    NASA Astrophysics Data System (ADS)

    Barnes, P.; Zhu, J.; Li, D.; Villamizar-Amaya, S.; Butler, C. A.; Pai, H.; Harmon, T. C.

    2009-12-01

    The energy balance within a river reach is influenced significantly by the availability of light that can be used by primary and secondary producers. We are proposing a method to rapidly and inexpensively characterize the amount of available light within a river reach. We tested our approach within a 50 meter segment in the lower Merced River in California, where flows are heavily affected by the current reservoir operations and agricultural withdrawals and inputs. Overhead images taken with a network security camera were used in conjunction with inexpensive, self-logging temperature and light sensors installed at multiple locations at the water surface (used as ground truth points) to capture changes in the light/shade patterns within the study reach. The Multivariate Alteration Detection (MAD) technique with decision thresholds was used to identify changes in shadow for a given image pixel. The changes in shadow (increase or decrease) were classified based on scale, color (spectral properties), and shape (smoothness and compactness). A strong inverse relationship between changes in shadow and changes in light intensity was observed. The preliminary results suggest the feasibility of the proposed method for rapidly characterizing riparian vegetation-stream shading conditions over time and space, and their relation to river metabolism.

  18. Design of a fault tolerant airborne digital computer. Volume 1: Architecture

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Levitt, K. N.; Green, M. W.; Goldberg, J.; Neumann, P. G.

    1973-01-01

    This volume is concerned with the architecture of a fault tolerant digital computer for an advanced commercial aircraft. All of the computations of the aircraft, including those presently carried out by analogue techniques, are to be carried out in this digital computer. Among the important qualities of the computer are the following: (1) the capacity is to be matched to the aircraft environment; (2) the reliability is to be selectively matched to the criticality and deadline requirements of each of the computations; (3) the system is to be readily expandable and contractible; and (4) the design is to be appropriate to post-1975 technology. Three candidate architectures are discussed and assessed in terms of the above qualities. Of the three candidates, a newly conceived architecture, Software Implemented Fault Tolerance (SIFT), provides the best match to the above qualities; in addition, SIFT is particularly simple and believable. The other candidates, the Bus Checker System (BUCS), also newly conceived in this project, and the Hopkins multiprocessor, are potentially more efficient than SIFT in the use of redundancy but are otherwise not as attractive.

  19. Airborne digital-image data for monitoring the Colorado River corridor below Glen Canyon Dam, Arizona, 2009 - Image-mosaic production and comparison with 2002 and 2005 image mosaics

    USGS Publications Warehouse

    Davis, Philip A.

    2012-01-01

    Airborne digital-image data were collected for the Arizona part of the Colorado River ecosystem below Glen Canyon Dam in 2009. These four-band image data are similar in wavelength band (blue, green, red, and near infrared) and spatial resolution (20 centimeters) to image collections of the river corridor in 2002 and 2005. These periodic image collections are used by the Grand Canyon Monitoring and Research Center (GCMRC) of the U.S. Geological Survey to monitor the effects of Glen Canyon Dam operations on the downstream ecosystem. The 2009 collection used the latest model of the Leica ADS40 airborne digital sensor (the SH52), which uses a single optic for all four bands and collects and stores band radiance in 12-bits, unlike the image sensors that GCMRC used in 2002 and 2005. This study examined the performance of the SH52 sensor, on the basis of the collected image data, and determined that the SH52 sensor provided superior data relative to the previously employed sensors (that is, an early ADS40 model and Zeiss Imaging's Digital Mapping Camera) in terms of band-image registration, dynamic range, saturation, linearity to ground reflectance, and noise level. The 2009 image data were provided as orthorectified segments of each flightline to constrain the size of the image files; each river segment was covered by 5 to 6 overlapping, linear flightlines. Most flightline images for each river segment had some surface-smear defects and some river segments had cloud shadows, but these two conditions did not generally coincide in the majority of the overlapping flightlines for a particular river segment. Therefore, the final image mosaic for the 450-kilometer (km)-long river corridor required careful selection and editing of numerous flightline segments (a total of 513 segments, each 3.2 km long) to minimize surface defects and cloud shadows. The final image mosaic has a total of only 3 km of surface defects. 
The final image mosaic for the western end of the corridor has

  20. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology expected to be available for the implementation of the computer in the 1975-1985 period. With regard to the computational task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large-scale integration (LSI) on the realization of logic and memory, and also considers module interconnection possibilities so as to minimize fault propagation.

  1. Noninvasive imaging of human skin hemodynamics using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Tanaka, Noriyuki; Kawase, Tatsuya; Maeda, Takaaki; Yuasa, Tomonori; Aizu, Yoshihisa; Yuasa, Tetsuya; Niizeki, Kyuichi

    2011-08-01

    In order to visualize human skin hemodynamics, we investigated a method specifically developed to visualize the concentrations of oxygenated blood, deoxygenated blood, and melanin in skin tissue from digital RGB color images. Images of total blood concentration and oxygen saturation can also be reconstructed from the oxygenated and deoxygenated blood results. Experiments using tissue-like agar gel phantoms demonstrated the ability of the method to quantitatively visualize the transition from oxygenated to deoxygenated blood in the dermis. In vivo imaging of the chromophore concentrations and tissue oxygen saturation in the skin of the human hand was performed for 14 subjects during upper limb occlusion at 50 and 250 mm Hg. The response of the total blood concentration in the skin acquired by this method and the forearm volume changes obtained from a conventional strain-gauge plethysmograph were comparable during upper arm occlusion at pressures of both 50 and 250 mm Hg. These results indicate the possibility of visualizing the hemodynamics of subsurface skin tissue.

  2. Noninvasive imaging of human skin hemodynamics using a digital red-green-blue camera.

    PubMed

    Nishidate, Izumi; Tanaka, Noriyuki; Kawase, Tatsuya; Maeda, Takaaki; Yuasa, Tomonori; Aizu, Yoshihisa; Yuasa, Tetsuya; Niizeki, Kyuichi

    2011-08-01

    In order to visualize human skin hemodynamics, we investigated a method specifically developed to visualize the concentrations of oxygenated blood, deoxygenated blood, and melanin in skin tissue from digital RGB color images. Images of total blood concentration and oxygen saturation can also be reconstructed from the oxygenated and deoxygenated blood results. Experiments using tissue-like agar gel phantoms demonstrated the ability of the method to quantitatively visualize the transition from oxygenated to deoxygenated blood in the dermis. In vivo imaging of the chromophore concentrations and tissue oxygen saturation in the skin of the human hand was performed for 14 subjects during upper limb occlusion at 50 and 250 mm Hg. The response of the total blood concentration in the skin acquired by this method and the forearm volume changes obtained from a conventional strain-gauge plethysmograph were comparable during upper arm occlusion at pressures of both 50 and 250 mm Hg. These results indicate the possibility of visualizing the hemodynamics of subsurface skin tissue. PMID:21895324

  3. Beyond photography: Evaluation of the consumer digital camera to identify strabismus and anisometropia by analyzing the Bruckner's reflex

    PubMed Central

    Bani, Sadat A. O.; Amitava, Abadan K.; Sharma, Richa; Danish, Alam

    2013-01-01

    Amblyopia screening is often either costly or laborious. We evaluated the Canon PowerShot TX1 (CPTX1) digital camera as an efficient screener for amblyogenic risk factors (ARF). We included 138 subjects: 84 amblyopes and 54 normal. With the red-eye-reduction feature off, we obtained Bruckner reflex photographs; crescents of different sizes suggested anisometropia, asymmetrical brightness indicated strabismus, and symmetry implied normalcy. Eight sets of the 138 randomly arranged photographs were made, and after training, eight personnel marked each photograph as normal or abnormal. Of the 84 amblyopes, 42 had strabismus alone (SA), 36 had anisometropia alone (AA), and six were mixed amblyopes (MA). Overall mean sensitivity for amblyopes was 0.86 (95% CI: 0.83-0.89) and specificity 0.85 (95% CI: 0.77-0.93). Sub-group analyses on SA, AA, and MA returned sensitivities of 0.86, 0.89, and 0.69, with a specificity of 0.85 for all three. Overall Cohen's kappa was 0.66 (95% CI: 0.62-0.71). The CPTX1 appears to be a feasible option for ARF screening, although the results need to be validated in appropriate age groups. PMID:24212318
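
The reported figures of merit all follow from a 2 x 2 confusion table of gradings versus ground truth; the counts below are illustrative, not the study's data:

```python
def screening_stats(tp, fn, tn, fp):
    """Sensitivity, specificity, and Cohen's kappa for a binary screen,
    computed from true/false positive and negative counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    n = tp + fn + tn + fp
    p_obs = (tp + tn) / n                      # observed agreement
    p_exp = ((tp + fp) * (tp + fn)             # chance agreement
             + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sens, spec, kappa

vals = screening_stats(tp=40, fn=10, tn=45, fp=5)
print(tuple(round(v, 3) for v in vals))   # -> (0.8, 0.9, 0.7)
```

Kappa discounts the agreement expected by chance, which is why it sits below the raw 85% agreement of this hypothetical table.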

  4. Evaluation of diabetic retinopathy screening using a non-mydriatic retinal digital camera in primary care settings in south Israel.

    PubMed

    Mizrachi, Yossi; Knyazer, Boris; Guigui, Sara; Rosen, Shirley; Lifshitz, Tova; Belfair, Nadav; Klemperer, Itamar; Schneck, Marina; Levy, Jaime

    2014-08-01

    To evaluate the effectiveness of the non-mydriatic digital camera for diabetic retinopathy (DR) screening. Secondary purposes of the study were to characterize the diabetic patients being screened for DR and to calculate the sensitivity, specificity, and positive predictive value of the test. All 6,962 consecutive patients with type 2 diabetes undergoing non-mydriatic digital retinal photography between January 1, 2009 and June 30, 2010 in eight community health clinics in the south of the country were included. A random sample of patients who underwent non-mydriatic retinal photography was also examined by an ophthalmologist with pupil dilation. The average age of the patients was 64.2 years. A total of 5,960 photographs (85.6% of all photographs) were of adequate quality for diagnosis. DR of any degree was found in 1,092 (18.3%) patients. Normal fundus pictures were found in 49.4% of patients, and non-DR pathologies were found in 32.2% of cases. Among the cases in which DR was found, 73.3% (801 cases) had mild non-proliferative retinopathy (NPDR), 7.1% (77 cases) had moderate NPDR, 6.8% (74 cases) had proliferative retinopathy, and 12.8% (140 cases) had diabetic macular edema. Older patients were more likely to have poor-quality pictures (p < 0.001 between patients older and younger than 70 years). When non-mydriatic fundus photography was compared with dilated fundus examination by an ophthalmologist, a sensitivity of 99.3%, a specificity of 88.3%, and a positive predictive value of 85.3% were found. Non-mydriatic digital retinal photography is an efficient method for DR screening, with high sensitivity and specificity. The test, as performed in community health centers in the south of the country, contributed to the early diagnosis of >1,000 cases of DR.
Many patients can be followed up in a fast and efficient way, although the test cannot replace a complete eye examination after pupil dilation mainly

  5. Solar-Powered Airplane with Cameras and WLAN

    NASA Technical Reports Server (NTRS)

    Higgins, Robert G.; Dunagan, Steve E.; Sullivan, Don; Slye, Robert; Brass, James; Leung, Joe G.; Gallmeyer, Bruce; Aoyagi, Michio; Wei, Mei Y.; Herwitz, Stanley R.; Johnson, Lee; Arvesen, John C.

    2004-01-01

    An experimental airborne remote sensing system includes a remotely controlled, lightweight, solar-powered airplane (see figure) that carries two digital-output electronic cameras and communicates with a nearby ground control and monitoring station via a wireless local-area network (WLAN). The speed of the airplane -- typically <50 km/h -- is low enough to enable loitering over farm fields, disaster scenes, or other areas of interest to collect high-resolution digital imagery that could be delivered to end users (e.g., farm managers or disaster-relief coordinators) in nearly real time.

  6. Waste reduction efforts through evaluation and procurement of a digital camera system for the Alpha-Gamma Hot Cell Facility at Argonne National Laboratory-East.

    SciTech Connect

    Bray, T. S.; Cohen, A. B.; Tsai, H.; Kettman, W. C.; Trychta, K.

    1999-11-08

    The Alpha-Gamma Hot Cell Facility (AGHCF) at Argonne National Laboratory-East is a research facility where sample examinations involve traditional photography. The AGHCF documents samples with photographs (both Polaroid self-developing and negative film). Wastes generated include developing chemicals. The AGHCF evaluated, procured, and installed a digital camera system for the Leitz metallograph to significantly reduce labor, supplies, and wastes associated with traditional photography with a return on investment of less than two years.

  7. Processor architecture for airborne SAR systems

    NASA Technical Reports Server (NTRS)

    Glass, C. M.

    1983-01-01

    Digital processors for spaceborne imaging radars and application of the technology developed for airborne SAR systems are considered. Transferring algorithms and implementation techniques from airborne to spaceborne SAR processors offers obvious advantages. The following topics are discussed: (1) a quantification of the differences in processing algorithms for airborne and spaceborne SARs; and (2) an overview of three processors for airborne SAR systems.

  8. Using digital time-lapse cameras to monitor species-specific understorey and overstorey phenology in support of wildlife habitat assessment.

    PubMed

    Bater, Christopher W; Coops, Nicholas C; Wulder, Michael A; Hilker, Thomas; Nielsen, Scott E; McDermid, Greg; Stenhouse, Gordon B

    2011-09-01

    Critical to habitat management is the understanding of not only the location of animal food resources, but also the timing of their availability. Grizzly bear (Ursus arctos) diets, for example, shift seasonally as different vegetation species enter key phenological phases. In this paper, we describe the use of a network of seven ground-based digital camera systems to monitor understorey and overstorey vegetation within species-specific regions of interest. Established across an elevation gradient in western Alberta, Canada, the cameras collected true-colour (RGB) images daily from 13 April 2009 to 27 October 2009. Fourth-order polynomials were fit to an RGB-derived index, which was then compared to field-based observations of phenological phases. Using linear regression to statistically relate the camera and field data, results indicated that 61% (r² = 0.61, df = 1, F = 14.3, p = 0.0043) of the variance observed in the field phenological phase data is captured by the cameras for the start of the growing season and 72% (r² = 0.72, df = 1, F = 23.09, p = 0.0009) of the variance in length of growing season. Based on the linear regression models, the mean absolute differences in residuals between predicted and observed start of growing season and length of growing season were 4 and 6 days, respectively. This work extends upon previous research by demonstrating that specific understorey and overstorey species can be targeted for phenological monitoring in a forested environment, using readily available digital camera technology and RGB-based vegetation indices. PMID:21082343
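The curve-fitting step described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the greenness index is assumed here to be a green chromatic coordinate, G/(R+G+B), and the season data are synthetic.

```python
import numpy as np

def fit_phenology(doy, index, order=4):
    """Fit a fourth-order polynomial to a daily greenness-index series
    and return the smoothed seasonal curve evaluated on each day."""
    return np.polyval(np.polyfit(doy, index, order), doy)

# Synthetic season, 13 April (DOY 103) to 27 October (DOY 300):
doy = np.arange(103, 301)
rng = np.random.default_rng(0)
greenness = (0.30 + 0.10 * np.exp(-((doy - 200) / 50.0) ** 2)
             + rng.normal(0.0, 0.005, doy.size))

fitted = fit_phenology(doy, greenness)
peak_doy = doy[np.argmax(fitted)]  # smoothed peak lands near mid-season
```

Phase dates (start and length of growing season) would then be read off the fitted curve, e.g. from threshold crossings or inflection points, before regressing them against the field observations.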

  9. Plenoptic processing methods for distributed camera arrays

    NASA Astrophysics Data System (ADS)

    Boyle, Frank A.; Yancey, Jerry W.; Maleh, Ray; Deignan, Paul

    2011-05-01

    Recent advances in digital photography have enabled the development and demonstration of plenoptic cameras with impressive capabilities. They function by recording sub-aperture images that can be combined to re-focus images or to generate stereoscopic pairs. Plenoptic methods are being explored for fusing images from distributed arrays of cameras, with a view toward applications in which hardware resources are limited (e.g. size, weight, power constraints). Through computer simulation and experimental studies, the influences of non-idealities such as camera position uncertainty are being considered. Component image rescaling and balancing methods are being explored to compensate. Of interest is the impact on precision passive ranging and super-resolution. In a preliminary experiment, a set of images from a camera array was recorded and merged to form a 3D representation of a scene. Conventional plenoptic refocusing was demonstrated and techniques were explored for balancing the images. Nonlinear methods for combining the images were explored to limit the ghosting caused by sub-sampling. Plenoptic processing was explored as a means for determining 3D information from airborne video. Successive frames were processed as camera array elements to extract the heights of structures. Practical means were considered for rendering the 3D information in color.
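The refocusing described above is, at its core, shift-and-add: each sub-aperture image is shifted in proportion to its camera's baseline offset times a candidate depth, then the shifted images are averaged. A minimal sketch on a toy point target (the offsets, depth scale, and wrap-around shifts are simplifying assumptions, not the authors' pipeline):

```python
import numpy as np

def shift_and_add(images, offsets, depth):
    """Synthetic refocus from a camera array: shift each sub-aperture
    image by its baseline offset scaled by a candidate depth, then
    average. Objects at that depth align and sharpen; objects at other
    depths smear into the ghosting that nonlinear combiners try to limit."""
    out = np.zeros_like(images[0], dtype=float)
    for img, (dx, dy) in zip(images, offsets):
        out += np.roll(np.roll(img, int(round(dy * depth)), axis=0),
                       int(round(dx * depth)), axis=1)
    return out / len(images)

# Toy scene: one bright point whose disparity grows with the baseline.
offsets = [(0, 0), (1, 0), (2, 0)]
true_depth = 3
images = []
for dx, dy in offsets:
    img = np.zeros((16, 16))
    img[8, 8 - dx * true_depth] = 1.0
    images.append(img)

focused = shift_and_add(images, offsets, depth=true_depth)    # point realigns
unfocused = shift_and_add(images, offsets, depth=0)           # point smears
```

Sweeping `depth` and looking for the sharpest result is also the basis of the passive-ranging and structure-height extraction mentioned in the abstract.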

  10. Product Accuracy Effect of Oblique and Vertical Non-Metric Digital Camera Utilization in Uav-Photogrammetry to Determine Fault Plane

    NASA Astrophysics Data System (ADS)

    Amrullah, C.; Suwardhi, D.; Meilano, I.

    2016-06-01

    This study examines the effect of combining oblique and vertical non-metric cameras, along with the configuration of the ground control points, on the precision and accuracy of a UAV-Photogrammetry project. Field observation is used for data acquisition of aerial photographs and ground control points. All data are processed through a digital photogrammetric workflow under several scenarios of camera combination and ground control point configuration. The models indicate that precision and accuracy increase with the combination of oblique and vertical cameras at every control point configuration. The best UAV-Photogrammetry product is a Digital Elevation Model (DEM), which is compared against a LiDAR DEM. Both DEMs are then used to define the fault plane, by drawing cross-sections on the models and interpreting the points of extreme change in terrain height. The fault planes so defined indicate no significant difference between the two models.

  11. Ground-based detection of nighttime clouds above Manila Observatory (14.64°N, 121.07°E) using a digital camera.

    PubMed

    Gacal, Glenn Franco B; Antioquia, Carlo; Lagrosas, Nofel

    2016-08-01

    Ground-based cloud detection at nighttime is achieved by using cameras, lidars, and ceilometers. Despite these numerous instruments gathering cloud data, there is still an acknowledged scarcity of information on quantified local cloud cover, especially at nighttime. In this study, a digital camera is used to continuously collect images near the sky zenith at nighttime in an urban environment. An algorithm is developed to analyze the pixel values of images of nighttime clouds. A minimum threshold pixel value of 17 is assigned to determine cloud occurrence. The algorithm uses temporal averaging to estimate the cloud fraction based on the results within the limited field of view. The analysis of the data from the months of January, February, and March 2015 shows that cloud occurrence is low during the months with relatively lower minimum temperature (January and February), while cloud occurrence during the warmer month (March) increases. PMID:27505386
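The threshold-and-average scheme above can be sketched directly. This is an illustrative reading of the abstract, not the paper's code: treating pixels at or above the threshold as cloud, and the frame data, are assumptions of the sketch.

```python
import numpy as np

def cloud_fraction(gray_frames, threshold=17):
    """Estimate nighttime cloud fraction: threshold each grayscale
    zenith image at the minimum cloud pixel value (17 in the paper),
    then temporally average the per-frame cloudy-pixel fraction."""
    return float(np.mean([(f >= threshold).mean() for f in gray_frames]))

# One overcast frame and one clear frame -> fraction 0.5.
frames = [np.full((4, 4), 30), np.zeros((4, 4))]
cf = cloud_fraction(frames)  # -> 0.5
```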

  12. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) imagery, Large Format Camera color infrared photography, and black-and-white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  13. Using a slit lamp-mounted digital high-speed camera for dynamic observation of phakic lenses during eye movements: a pilot study

    PubMed Central

    Leitritz, Martin Alexander; Ziemssen, Focke; Bartz-Schmidt, Karl Ulrich; Voykov, Bogomil

    2014-01-01

    Purpose To evaluate a digital high-speed camera combined with digital morphometry software for dynamic measurements of phakic intraocular lens movements to observe kinetic influences, particularly in fast direction changes and at lateral end points. Materials and methods A high-speed camera taking 300 frames per second observed movements of eight iris-claw intraocular lenses and two angle-supported intraocular lenses. Standardized saccades were performed by the patients to trigger mass inertia with lens position changes. Freeze images with maximum deviation were used for digital software-based morphometry analysis with ImageJ. Results Two eyes from each of five patients (median age 32 years, range 28–45 years) without findings other than refractive errors were included. The high-speed images showed sufficient usability for further morphometric processing. In the primary eye position, the median decentrations downward and in a lateral direction were −0.32 mm (range −0.69 to 0.024) and 0.175 mm (range −0.37 to 0.45), respectively. Despite the small sample size of asymptomatic patients, we found a considerable amount of lens dislocation. The median distance amplitude during eye movements was 0.158 mm (range 0.02–0.84). There was a slight positive correlation (r=0.39, P<0.001) between the grade of deviation in the primary position and the distance increase triggered by movements. Conclusion With the use of a slit lamp-mounted high-speed camera system and morphometry software, observation and objective measurement of iris-claw and angle-supported intraocular lens movements seem to be possible. Slight decentration in the primary position might be an indicator of increased lens mobility under kinetic stress during eye movements. Long-term assessment by high-speed analysis with higher case numbers has to clarify the relationship between progressing motility and endothelial cell damage. PMID:25071365

  14. Optical engineering application of modeled photosynthetically active radiation (PAR) for high-speed digital camera dynamic range optimization

    NASA Astrophysics Data System (ADS)

    Alves, James; Gueymard, Christian A.

    2009-08-01

    As efforts to create accurate yet computationally efficient estimation models for clear-sky photosynthetically active solar radiation (PAR) have succeeded, the range of practical engineering applications where these models can be successfully applied has increased. This paper describes a novel application of the REST2 radiative model (developed by the second author) in optical engineering. The PAR predictions in this application are used to predict the possible range of instantaneous irradiances that could impinge on the image plane of a stationary video camera designed to image license plates on moving vehicles. The overall spectral response of the camera (including lens and optical filters) is similar to the 400-700 nm PAR range, thereby making PAR irradiance (rather than luminance) predictions most suitable for this application. The accuracy of the REST2 irradiance predictions for horizontal surfaces, coupled with another radiative model to obtain irradiances on vertical surfaces and with standard optical image-formation models, enables setting the dynamic range controls of the camera to ensure that the license plate images are legible (unsaturated with adequate contrast) regardless of the time of day, sky condition, or vehicle speed. A brief description of how these radiative models are utilized as part of the camera control algorithm is provided. Several comparisons of the irradiance predictions derived from the radiative model versus actual PAR measurements under varying sky conditions with three Licor sensors (one horizontal and two vertical) have been made and showed good agreement. Various camera-to-plate geometries and compass headings have been considered in these comparisons. Time-lapse sequences of license plate images taken with the camera under various sky conditions over a 30-day period are also analyzed. They demonstrate the success of the approach at creating legible plate images under highly variable lighting, which is the main goal of this application.

  15. Measurement of Young’s modulus and Poisson’s ratio of metals by means of ESPI using a digital camera

    NASA Astrophysics Data System (ADS)

    Francisco, J. B. Pascual; Michtchenko, A.; Barragán Pérez, O.; Susarrey Huerta, O.

    2016-09-01

    In this paper, mechanical experiments with a low-cost interferometry set-up are presented. The set-up is suitable for an undergraduate laboratory where optical equipment is absent. The arrangement consists of two planes of illumination, allowing the measurement of the two perpendicular in-plane displacement directions. An axial load was applied on three different metals, and the longitudinal and transversal displacements were measured sequentially. A digital camera was used to acquire the images of the different states of load of the illuminated area. A personal computer was used to perform the digital subtraction of the images to obtain the fringe correlations, which are needed to calculate the displacements. Finally, Young’s modulus and Poisson’s ratio of the metals were calculated using the displacement data.
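The digital subtraction step at the heart of the ESPI set-up above can be sketched with simulated speckle. This is a minimal illustration under stated assumptions (random speckle phase per pixel, a linear deformation phase ramp between the two load states), not the authors' processing code:

```python
import numpy as np

def fringe_pattern(before, after):
    """Absolute difference of two speckle images recorded at different
    load states; the result is dark where the speckle fields stayed
    correlated (phase change a multiple of 2*pi) and bright elsewhere,
    revealing the correlation fringes used to compute displacements."""
    return np.abs(after.astype(float) - before.astype(float))

# Simulated speckle: random phase per pixel, plus a horizontal phase
# ramp representing the deformation between the two load states.
rng = np.random.default_rng(1)
phi = rng.uniform(0.0, 2.0 * np.pi, (64, 64))
dphi = np.tile(np.linspace(0.0, 4.0 * np.pi, 64), (64, 1))
before = 1.0 + np.cos(phi)
after = 1.0 + np.cos(phi + dphi)

fringes = fringe_pattern(before, after)
dark = fringes[:, 0].mean()    # dphi = 0 here: correlated, dark fringe
bright = fringes[:, 16].mean() # dphi ~ pi here: decorrelated, bright
```

Counting fringes (each corresponding to a known fraction of the laser wavelength along the sensitivity vector) then yields the in-plane displacements from which the elastic constants are computed.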

  16. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The authors' group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.
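As a concrete illustration of one such performance metric, here is a minimal sketch (not the authors' actual tool) of a per-pixel signal-to-noise estimate from repeated flat-field frames; the frame data below are synthetic:

```python
import numpy as np

def snr_map(frames):
    """Per-pixel signal-to-noise ratio from repeated flat-field
    exposures: temporal mean divided by temporal standard deviation.
    One simple metric among those (sensitivity, noise, resolution)
    such an analysis tool can report."""
    stack = np.asarray(frames, dtype=float)
    std = stack.std(axis=0)
    return stack.mean(axis=0) / np.where(std > 0, std, np.inf)

rng = np.random.default_rng(3)
frames = 100.0 + rng.normal(0.0, 5.0, size=(200, 8, 8))  # signal 100, noise 5
snr = snr_map(frames)  # per-pixel SNR, expected near 100/5 = 20
```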

  17. A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table

    NASA Technical Reports Server (NTRS)

    Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

    1989-01-01

    The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel and overall error in RMS was estimated at less than eight meters. Such approaches offer time and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.

  18. Experimental Demonstration of Extended Depth-of-Field F/1.2 Visible High Definition Camera with Jointly Optimized Phase Mask and Real-Time Digital Processing

    NASA Astrophysics Data System (ADS)

    Burcklen, M.-A.; Diaz, F.; Lepretre, F.; Rollin, J.; Delboulbé, A.; Lee, M.-S. L.; Loiseaux, B.; Koudoli, A.; Denel, S.; Millet, P.; Duhem, F.; Lemonnier, F.; Sauer, H.; Goudail, F.

    2015-10-01

    Increasing the depth of field (DOF) of compact visible high resolution cameras while maintaining high imaging performance in the DOF range is crucial for such applications as night vision goggles or industrial inspection. In this paper, we present the end-to-end design and experimental validation of an extended depth-of-field visible High Definition camera with a very small f-number, combining a six-ring pyramidal phase mask in the aperture stop of the lens with a digital deconvolution. The phase mask and the deconvolution algorithm are jointly optimized during the design step so as to maximize the quality of the deconvolved image over the DOF range. The deconvolution processing is implemented in real time on a Field-Programmable Gate Array and we show that it requires very low power consumption. By means of MTF measurements and imaging experiments we experimentally characterize the performance of the camera with and without the phase mask, and thereby demonstrate a significant increase in depth of field by a factor of 2.5, as expected from the design step.
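The digital restoration stage can be sketched as a frequency-domain Wiener filter. This is a simplified stand-in for the kind of deconvolution described above (the paper's filter is jointly optimized with the phase mask and runs on an FPGA); the impulse image, box PSF, and regularization constant below are illustrative assumptions:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Frequency-domain Wiener filter: multiply the image spectrum by
    conj(H) / (|H|^2 + k), where H is the optical transfer function of
    the PSF and k a noise-to-signal regularizer."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))

# Toy demo: blur an impulse with a 3x3 box PSF, then restore it.
img = np.zeros((16, 16))
img[8, 8] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img)
                               * np.fft.fft2(psf, s=img.shape)))
restored = wiener_deconvolve(blurred, psf)
```

With a phase-mask system, the design goal is a PSF (and hence H) that stays nearly invariant across the DOF range, so a single such filter restores sharpness at all depths.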

  19. The Laser Vegetation Imaging Sensor (LVIS): A Medium-Altitude, Digitization-Only, Airborne Laser Altimeter for Mapping Vegetation and Topography

    NASA Technical Reports Server (NTRS)

    Blair, J. Bryan; Rabine, David L.; Hofton, Michelle A.

    1999-01-01

    The Laser Vegetation Imaging Sensor (LVIS) is an airborne, scanning laser altimeter designed and developed at NASA's Goddard Space Flight Center. LVIS operates at altitudes up to 10 km above ground, and is capable of producing a data swath up to 1000 m wide nominally with 25 m wide footprints. The entire time history of the outgoing and return pulses is digitized, allowing unambiguous determination of range and return pulse structure. Combined with aircraft position and attitude knowledge, this instrument produces topographic maps with decimeter accuracy and vertical height and structure measurements of vegetation. The laser transmitter is a diode-pumped Nd:YAG oscillator producing 1064 nm, 10 nsec, 5 mJ pulses at repetition rates up to 500 Hz. LVIS has recently demonstrated its ability to determine topography (including sub-canopy) and vegetation height and structure on flight missions to various forested regions in the U.S. and Central America. The LVIS system is the airborne simulator for the Vegetation Canopy Lidar (VCL) mission (a NASA Earth remote sensing satellite due for launch in 2000), providing simulated data sets and a platform for instrument proof-of-concept studies. The topography maps and return waveforms produced by LVIS provide Earth scientists with a unique data set allowing studies of topography, hydrology, and vegetation with unmatched accuracy and coverage.

  20. Column-integrated aerosol optical properties from ground-based spectroradiometer measurements at Barrax (Spain) during the Digital Airborne Imaging Spectrometer Experiment (DAISEX) campaigns

    NASA Astrophysics Data System (ADS)

    Pedrós, Roberto; Martinez-Lozano, Jose A.; Utrillas, Maria P.; Gómez-Amo, José L.; Tena, Fernando

    2003-09-01

    The Digital Airborne Imaging Spectrometer Experiment (DAISEX) was carried out for the European Space Agency (ESA) in order to develop the potential of spaceborne imaging spectroscopy for a range of different scientific applications. DAISEX involved simultaneous data acquisitions using different airborne imaging spectrometers over test sites in southeast Spain (Barrax) and the Upper Rhine valley (Colmar, France, and Hartheim, Germany). This paper presents the results corresponding to the column-integrated aerosol optical properties from ground-based spectroradiometer measurements over the Barrax area during the DAISEX campaign days in the years 1998, 1999, and 2000. The instruments used for spectral irradiance measurements were two Licor 1800 and one Optronic OL-754 spectroradiometers. The analysis of the spectral aerosol optical depth in the visible range shows in all cases the predominance of the coarse-particle mode over the fine-particle mode. The analysis of the back trajectories of the air masses indicates a predominance of marine-type aerosols in the lower atmospheric layers in all cases. Overall, the results obtained show that during the DAISEX there was a combination of maritime aerosols with smaller continental aerosols.

  1. Getting the Picture: Using the Digital Camera as a Tool to Support Reflective Practice and Responsive Care

    ERIC Educational Resources Information Center

    Luckenbill, Julia

    2012-01-01

    Many early childhood educators use cameras to share the charming things that children do and the artwork they make. Programs often bind these photographs into portfolios and give them to children and their families as mementos at the end of the year. In the author's classrooms, they use photography on a daily basis to document children's…

  2. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components including microchannel plate or proximity focused diode image intensifiers, electro-static image tubes, or electron-bombarded CCDs affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  3. The Outdoor Rapid Calibration Technique and Realization of Nonmetric Digital Camera Based on the Method of Multi-Image Dlt and Resection

    NASA Astrophysics Data System (ADS)

    Qiang, Zhang; Jian-jing, Shen; Meng-qing, Sun

    2016-06-01

    For the characteristics of non-metric CCD digital cameras and the need for rapid field calibration of non-metric cameras, the error sources were analyzed in detail and a mathematical calibration model was established. Both the multi-image group iterative method for solving the DLT coefficients, the elements of interior orientation, and the lens distortion parameters, and the multi-image resection method for solving the elements of interior orientation, the elements of exterior orientation, and the lens distortion parameters are discussed. A standard steel cage (e.g. Figure 1) was built for rapid outdoor calibration of non-metric cameras. To verify the accuracy, each method was used to solve the elements of interior orientation and the distortion parameters with the same camera (e.g. Figure 2) and the same test images. The results show a maximum X error of 0.2585 mm, maximum Y error of 0.6719 mm, and maximum Z error of 0.1319 mm using the multi-image DLT algorithm, versus a maximum X error of 0.1914 mm, Y error of 0.9808 mm, and Z error of 0.1453 mm using the multi-image resection algorithm. The forward intersection accuracy of the two methods was comparable, and both were below 1 mm. With the multi-image DLT algorithm the planimetric accuracy was better than 0.2585 mm and the height accuracy better than 0.6719 mm; with the multi-image resection algorithm the planimetric accuracy was better than 0.1914 mm and the height accuracy better than 0.9808 mm. The resection algorithm gave better planimetric accuracy, while the DLT algorithm gave better elevation accuracy. In summary, both methods are acceptable for non-metric camera calibration, although it was noted that the solved interior orientation elements and distortion parameters were not highly accurate.
However, for non-metric cameras, the true value of the interior orientation elements and
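The linear core of the DLT step discussed above can be sketched as a least-squares solve for the 11 coefficients. This is an illustrative simplification: the paper's iterative scheme additionally estimates lens distortion, which is omitted here, and the synthetic camera and cube-corner control points are stand-ins for the steel calibration cage.

```python
import numpy as np

def solve_dlt(obj_pts, img_pts):
    """Solve the 11 DLT coefficients from >= 6 non-coplanar control
    points by linear least squares (distortion terms omitted)."""
    rows, rhs = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        rhs.append(x)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        rhs.append(y)
    coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return coeffs

def dlt_project(L, pt):
    """Project an object point through the DLT camera model."""
    X, Y, Z = pt
    den = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    return ((L[0] * X + L[1] * Y + L[2] * Z + L[3]) / den,
            (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / den)

# Recover a known synthetic camera from the corners of a unit cube.
L_true = np.array([1.0, 0.1, 0.2, 5.0, 0.0, 1.0, 0.1, 3.0, 0.01, 0.02, 0.03])
cube = [(i, j, k) for i in (0, 1) for j in (0, 1) for k in (0, 1)]
images = [dlt_project(L_true, p) for p in cube]
L_est = solve_dlt(cube, images)
```

In the full calibration, the interior orientation elements are then extracted from the recovered coefficients and the distortion parameters refined iteratively against the image residuals.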

  4. Anger Camera Firmware

    2010-11-19

    The firmware is responsible for the operation of the Anger Camera Electronics, calculation of position, time of flight, and digital communications. It provides a first-stage analysis of 48 analog signals that have been converted to digital values using A/D converters.
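The position calculation in such firmware is classically an Anger-logic centroid. A generic sketch, not the actual firmware algorithm, over a hypothetical handful of channels:

```python
def anger_position(signals, positions):
    """Classic Anger-logic centroid: weight each digitized channel
    signal by its known detector position and normalize by the summed
    signal, yielding the event's interaction position."""
    total = sum(signals)
    return sum(s * p for s, p in zip(signals, positions)) / total

# Symmetric light spread over three channels centers on the middle one.
x = anger_position([1.0, 2.0, 1.0], [0.0, 1.0, 2.0])  # -> 1.0
```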

  5. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  6. Integration of airborne Thematic Mapper Simulator (TMS) data and digitized aerial photography via an ISH transformation. [Intensity Saturation Hue

    NASA Technical Reports Server (NTRS)

    Ambrosia, Vincent G.; Myers, Jeffrey S.; Ekstrand, Robert E.; Fitzgerald, Michael T.

    1991-01-01

    A simple method for enhancing the spatial and spectral resolution of disparate data sets is presented. Two data sets, digitized aerial photography at a nominal spatial resolution of 3.7 meters and TMS digital data at 24.6 meters, were coregistered through a bilinear interpolation to solve the problem of blocky pixel groups resulting from rectification expansion. The two data sets were then subjected to intensity-saturation-hue (ISH) transformations in order to 'blend' the high-spatial-resolution (3.7 m) digitized RC-10 photography with the high spectral (12-band) and lower spatial (24.6 m) resolution TMS digital data. The resultant merged products make it possible to perform large-scale mapping, ease photointerpretation, and can be derived for any of the 12 available TMS spectral bands.

  7. Hierarchical object-based classification of ultra-high-resolution digital mapping camera (DMC) imagery for rangeland mapping and assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...

  8. In vivo multispectral imaging of the absorption and scattering properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Ishizuka, Tomohiro; Mizushima, Chiharu; Nishidate, Izumi; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-04-01

    To evaluate multi-spectral images of the absorption and scattering properties in the cerebral cortex of rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital red-green-blue camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters. The spectral images of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters. We performed in vivo experiments on exposed rat brain to confirm the feasibility of this method. The estimated images of the absorption coefficients were dominated by hemoglobin spectra. The estimated images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature.
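The Wiener estimation step that maps camera RGB values back to spectral reflectance can be sketched as a single matrix build. This is a minimal illustration: the sensitivity matrix, training spectra, and noise variance below are random stand-ins (real sensitivity curves must be measured per camera), and the paper additionally feeds the estimated spectra into a Monte Carlo simulation-based regression.

```python
import numpy as np

def wiener_matrix(train_spectra, cam_sens, noise_var=1e-4):
    """Build the Wiener estimation matrix
        W = Rss S^T (S Rss S^T + Rnn)^-1,
    where S holds the camera's spectral sensitivities (3 x L), Rss is
    the autocorrelation of representative training spectra, and Rnn a
    diagonal noise term. W maps an RGB triple to an L-band spectrum."""
    Rss = train_spectra.T @ train_spectra / len(train_spectra)  # (L, L)
    S = cam_sens                                                # (3, L)
    Rnn = noise_var * np.eye(3)
    return Rss @ S.T @ np.linalg.inv(S @ Rss @ S.T + Rnn)

# Toy example with 9 wavelength bands, as in the paper's band set.
L = 9
rng = np.random.default_rng(2)
S = np.abs(rng.normal(size=(3, L)))           # hypothetical sensitivities
train = np.abs(rng.normal(size=(50, L)))      # hypothetical training spectra
W = wiener_matrix(train, S)

rgb = S @ train[0]        # simulate the camera observing one spectrum
estimate = W @ rgb        # reconstructed 9-band reflectance
```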

  9. A microcontroller-based system for automated and continuous sky glow measurements with the use of digital single-lens reflex cameras

    NASA Astrophysics Data System (ADS)

    Solano Lamphar, Hétor Antonio; Kundracik, Frantisek

    2014-02-01

    In recent years, the scientific community has shown an increased interest in sky glow research. This has revealed an increased need for automated technology that enables continuous evaluation of sky glow. As a result, a reliable low-cost platform has been developed and constructed for automating sky glow measurement. The core of the system is embedded software and hardware managed by a microcontroller with ARM architecture. A monolithic photodiode transimpedance amplifier is used to allow linear light measurement. Data from the diode are collected and used to arrange the exposure time of every image captured by the digital single-lens reflex camera. This proposal supports experimenters by providing a low-cost system to analyse sky glow variations overnight without a human interface.

  10. Low altitude remote-sensing method to monitor marine and beach litter of various colors using a balloon equipped with a digital camera.

    PubMed

    Kako, Shin'ichiro; Isobe, Atsuhiko; Magome, Shinya

    2012-06-01

    This study aims to establish a low-altitude remote sensing system for surveying litter on a beach or the ocean using a remote-controlled digital camera suspended from a balloon filled with helium gas. The resultant images are processed to identify the litter using projective transformation method and color difference in the CIELUV color space. Low-altitude remote sensing experimental observations were conducted on two locations in Japan. Although the sizes of the litter and the areas covered are distorted in the original photographs taken at various angles and heights, the proposed image process system is capable of identifying object positions with a high degree of accuracy (1-3 m). Furthermore, the color difference approach in the CIELUV color space used in this study is well capable of extracting pixels of litter objects of various colors allowing us to estimate the number of objects from the photographs. PMID:22525012
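The CIELUV colour-difference step above reduces to a Euclidean distance in (L*, u*, v*) space. A minimal sketch; the background and litter values and the threshold of 20 are illustrative assumptions, not figures from the paper:

```python
import numpy as np

def delta_e_luv(luv, reference):
    """Euclidean colour difference in CIELUV space; pixels whose
    difference from the background reference exceeds a threshold are
    flagged as candidate litter."""
    return np.sqrt(((luv - reference) ** 2).sum(axis=-1))

# Hypothetical L*, u*, v* values: sandy background vs. a blue object.
background = np.array([70.0, 20.0, 40.0])
pixels = np.array([[70.0, 21.0, 39.0],      # sand-coloured pixel
                   [40.0, -30.0, -60.0]])   # blue litter pixel
diffs = delta_e_luv(pixels, background)
is_litter = diffs > 20.0
```

Working in CIELUV rather than raw RGB makes the threshold roughly perceptually uniform, which is what lets objects of widely varying colours be extracted with a single criterion.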

  11. Target-Tracking Camera for a Metrology System

    NASA Technical Reports Server (NTRS)

    Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David

    2009-01-01

    An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrological system is used to determine the varying relative positions of radiating elements of an airborne synthetic-aperture radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology system practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: Because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
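The PSD readout described above reduces to a normalized current imbalance between the two end contacts. A minimal sketch of the relation, assuming an ideal one-dimensional detector (in the camera itself this mapping is computed by analog electronics, not software):

```python
def psd_position(i1, i2, length):
    """Light-spot position on a one-dimensional position-sensitive
    detector of the given active length, from the photocurrents i1 and
    i2 at its two end contacts. The normalized current imbalance maps
    linearly to position, with 0 at the detector center."""
    return (i2 - i1) / (i1 + i2) * (length / 2.0)

center = psd_position(1.0, 1.0, 10.0)    # equal currents -> 0.0 (center)
offset = psd_position(0.25, 0.75, 10.0)  # imbalance -> 2.5 (toward contact 2)
```

Because the division uses the summed current, the position estimate is insensitive to overall brightness changes of the LED target.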

  12. Camera-based measurement for transverse vibrations of moving catenaries in mine hoists using digital image processing techniques

    NASA Astrophysics Data System (ADS)

    Yao, Jiannan; Xiao, Xingming; Liu, Yao

    2016-03-01

    This paper proposes a novel, non-contact sensing method to measure the transverse vibrations of hoisting catenaries in mine hoists. Hoisting catenaries are moving cables, so traditional contact methods cannot be used to measure their transverse vibrations. To obtain the transverse displacements of an arbitrary point in a moving catenary, a mask image containing a predefined reference line perpendicular to the catenary is superimposed on each frame of the processed image sequence, and the dynamic intersection points, which have a grey value of 0, are identified in each frame. By traversing the coordinates of the pixels with a grey value of 0 and calculating the distance between the identified dynamic points and the reference, the transverse displacements of the selected point in the hoisting catenary are obtained. Based on a theoretical model, the reasonableness and applicability of the proposed camera-based method were confirmed. Additionally, a laboratory experiment was carried out, which validated the accuracy of the proposed method. The results indicate that the proposed camera-based method is suitable for measuring the transverse vibrations of moving cables.
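The mask-and-traverse step described above can be sketched as follows, assuming a vertical reference line and catenary pixels with grey value 0; the function and argument names are hypothetical:

```python
def transverse_displacement(frame, ref_col, baseline_row):
    """Offset of the catenary along a fixed reference line, in pixels.

    frame: 2-D grayscale image (list of rows) in which catenary pixels
    have grey value 0; ref_col: column index of the vertical reference
    line from the mask; baseline_row: row of the catenary at rest.
    """
    # traverse the reference line and collect the dark (grey value 0) pixels
    rows = [r for r, row in enumerate(frame) if row[ref_col] == 0]
    # distance of the intersection from the rest position
    return sum(rows) / len(rows) - baseline_row
```

Repeating this per frame yields the transverse displacement time series of the chosen point.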

  13. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  14. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital single-lens reflex (DSLR) cameras have appeared. As a consequence we can buy cameras with higher and higher pixel counts, and mass production has greatly reduced prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household; most of these are DSLRs. They can be used very well for astronomical imaging, as is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rising standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily adapt these to their own circumstances.
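Once the pixel coordinates of the two components are known, separation and position angle follow from plane geometry. A minimal sketch, assuming north is up and east is left in the image and that the plate scale (arcsec/pixel) is known from calibration; none of these names or conventions come from the chapter itself:

```python
import math

def double_star_measure(p1, p2, plate_scale_arcsec):
    """Separation (arcsec) and position angle (degrees, north through east)
    of a companion at pixel p2 relative to a primary at p1.

    Assumes north is toward -y and east toward -x in pixel coordinates.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    sep = math.hypot(dx, dy) * plate_scale_arcsec
    pa = math.degrees(math.atan2(-dx, -dy)) % 360  # 0 deg = N, 90 deg = E
    return sep, pa
```

With real DSLR frames, many exposures would be averaged and the plate scale derived from calibration pairs.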

  15. Quantitative single-particle digital autoradiography with α-particle emitters for targeted radionuclide therapy using the iQID camera

    SciTech Connect

    Miller, Brian W.; Frost, Sofia H. L.; Frayo, Shani L.; Kenoyer, Aimee L.; Santos, Erlinda; Jones, Jon C.; Orozco, Johnnie J.; Green, Damian J.; Press, Oliver W.; Pagel, John M.; Sandmaier, Brenda M.; Hamlin, Donald K.; Wilbur, D. Scott; Fisher, Darrell R.

    2015-07-15

    Purpose: Alpha-emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm), causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with α emitters may thus inactivate targeted cells with minimal radiation damage to surrounding tissues. Tools are needed to visualize and quantify the radioactivity distribution and absorbed doses to targeted and nontargeted cells for accurate dosimetry of all treatment regimens utilizing α particles, including RIT and others (e.g., Ra-223), especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, the ionizing-radiation quantum imaging detector (iQID) camera, for use in α-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection system that images and identifies charged-particle and gamma-ray/x-ray emissions spatially and temporally on an event-by-event basis. It employs CCD-CMOS cameras and high-performance computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, the authors evaluated its characteristics for α-particle imaging, including measurements of intrinsic detector spatial resolutions and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (²¹¹At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ∼20 μm full width at half maximum and the α-particle background was measured at a rate as low as (2.6 ± 0.5) × 10⁻⁴ cpm/cm² (40 mm diameter detector area
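The event-by-event localization the iQID performs can be illustrated by thresholding a camera frame and centroiding each connected bright blob. This toy sketch is mine, not the authors' processing chain, and all names and thresholds are assumptions:

```python
def detect_events(frame, threshold):
    """Threshold a frame and return intensity-weighted (row, col) centroids
    of each connected bright blob, mimicking event-by-event localization."""
    h, w = len(frame), len(frame[0])
    seen, events = set(), []
    for r0 in range(h):
        for c0 in range(w):
            if frame[r0][c0] >= threshold and (r0, c0) not in seen:
                # flood-fill one connected blob of above-threshold pixels
                stack, blob = [(r0, c0)], []
                seen.add((r0, c0))
                while stack:
                    r, c = stack.pop()
                    blob.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < h and 0 <= cc < w
                                and (rr, cc) not in seen
                                and frame[rr][cc] >= threshold):
                            seen.add((rr, cc))
                            stack.append((rr, cc))
                # intensity-weighted centroid of the blob
                tot = sum(frame[r][c] for r, c in blob)
                events.append((sum(r * frame[r][c] for r, c in blob) / tot,
                               sum(c * frame[r][c] for r, c in blob) / tot))
    return events
```

Accumulating such centroids over many frames builds up the activity image one particle at a time.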

  16. Quantitative single-particle digital autoradiography with α-particle emitters for targeted radionuclide therapy using the iQID camera

    PubMed Central

    Miller, Brian W.; Frost, Sofia H. L.; Frayo, Shani L.; Kenoyer, Aimee L.; Santos, Erlinda; Jones, Jon C.; Green, Damian J.; Hamlin, Donald K.; Wilbur, D. Scott; Orozco, Johnnie J.; Press, Oliver W.; Pagel, John M.; Sandmaier, Brenda M.

    2015-01-01

    Purpose: Alpha-emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm), causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with α emitters may thus inactivate targeted cells with minimal radiation damage to surrounding tissues. Tools are needed to visualize and quantify the radioactivity distribution and absorbed doses to targeted and nontargeted cells for accurate dosimetry of all treatment regimens utilizing α particles, including RIT and others (e.g., Ra-223), especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, the ionizing-radiation quantum imaging detector (iQID) camera, for use in α-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection system that images and identifies charged-particle and gamma-ray/x-ray emissions spatially and temporally on an event-by-event basis. It employs CCD-CMOS cameras and high-performance computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, the authors evaluated its characteristics for α-particle imaging, including measurements of intrinsic detector spatial resolutions and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (²¹¹At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ∼20 μm full width at half maximum and the α-particle background was measured at a rate as low as (2.6 ± 0.5) × 10⁻⁴ cpm/cm² (40 mm diameter detector area). Simultaneous imaging of

  17. Accuracy Potential and Applications of MIDAS Aerial Oblique Camera System

    NASA Astrophysics Data System (ADS)

    Madani, M.

    2012-07-01

    Airborne oblique cameras such as the Fairchild T-3A were first used for military reconnaissance in the 1930s. A modern professional digital oblique camera such as MIDAS (Multi-camera Integrated Digital Acquisition System) is used to generate lifelike three-dimensional views for visualization, GIS applications, architectural modeling, city modeling, games, simulators, etc. Oblique imagery provides the best vantage for assessing and reviewing changes to the local government tax base and property valuation, and supports better, more timely decisions in the buying and selling of residential and commercial property. Oblique imagery is also used for infrastructure monitoring, helping ensure the safe operation of transportation, utilities, and facilities. Sanborn Mapping Company acquired one MIDAS from TrackAir in 2011. This system consists of four tilted (45 degree) cameras and one vertical camera connected to a dedicated data acquisition computer system. The five digital cameras are based on the Canon EOS 1DS Mark III with Zeiss lenses. The CCD size is 5,616 by 3,744 (21 megapixels) with a pixel size of 6.4 microns. Multiple flights using different camera configurations (nadir/oblique (28 mm/50 mm) and (50 mm/50 mm)) were flown over downtown Colorado Springs, Colorado. Boresight flights for the 28 mm nadir camera were flown at 600 m and 1,200 m, and for the 50 mm nadir camera at 750 m and 1,500 m. Cameras were calibrated using a 3D cage and multiple convergent images with the Australis model. In this paper, the MIDAS system is described; a number of real data sets collected during the aforementioned flights are presented together with their associated flight configurations; the data processing workflow, system calibration and quality control workflows are highlighted; and the achievable accuracy is presented in some detail. This study revealed that an accuracy of about 1 to 1.5 GSD (Ground Sample Distance) in planimetry and about 2 to 2.5 GSD in height can be achieved. Remaining systematic
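The GSD figures quoted in such studies follow from similar triangles between the pixel pitch, focal length, and flying height; a minimal sketch (the function name is mine):

```python
def ground_sample_distance(pixel_size_um, focal_mm, altitude_m):
    """Nadir ground sample distance in meters per pixel.

    Similar triangles: GSD = pixel_size * altitude / focal_length,
    with all quantities converted to meters.
    """
    return (pixel_size_um * 1e-6) * altitude_m / (focal_mm * 1e-3)
```

With the MIDAS 6.4 micron pixels, the 50 mm nadir camera at 750 m gives about 0.096 m per pixel, so the reported 1-1.5 GSD planimetric accuracy corresponds to roughly a decimeter.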

  18. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.

  19. Oblique Multi-Camera Systems - Orientation and Dense Matching Issues

    NASA Astrophysics Data System (ADS)

    Rupnik, E.; Nex, F.; Remondino, F.

    2014-03-01

    The use of oblique imagery has become a standard for many civil and mapping applications, thanks to the development of airborne digital multi-camera systems offered by many companies (Blomoblique, IGI, Leica, Midas, Pictometry, Vexcel/Microsoft, VisionMap, etc.). The indisputable virtue of oblique photography lies in its simplicity of interpretation and understanding for inexperienced users, allowing oblique images to be used in very different applications, such as building detection and reconstruction, building structural damage classification, road and land updating, and administration services. The paper gives an overview of current commercial oblique systems and presents a workflow for the automated orientation and dense matching of large image blocks. Perspectives, potentialities, pitfalls and suggestions for achieving satisfactory results are given. Tests performed on two datasets acquired with two multi-camera systems over urban areas are also reported.

  20. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images from digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse fields such as computing, mechanics, thermodynamics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based or non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of the dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  1. Development and Application of a new DACOM Airborne Trace Gas Instrument based on Room-Temperature Laser and Detector Technology and all-Digital Control and Data Processing

    NASA Astrophysics Data System (ADS)

    Diskin, G. S.; Sachse, G. W.; DiGangi, J. P.; Pusede, S. E.; Slate, T. A.; Rana, M.

    2014-12-01

    The DACOM (Differential Absorption Carbon monOxide Measurements) instrument has been used for airborne measurements of carbon monoxide, methane, and nitrous oxide for nearly four decades. Over the years, the instrument has undergone a nearly continuous series of modifications, taking advantage of improvements in available technology and the benefits of experience, but always relying on cryogenically cooled lasers and detectors. More recently, the availability of room-temperature, higher-power single-mode lasers at the mid-infrared wavelengths used by DACOM has made it possible to replace both the cryogenic lasers and detectors with thermoelectrically cooled versions, and the relative stability of these lasers has allowed us to incorporate an all-digital wavelength stabilization technique developed previously for the Diode Laser Hygrometer (DLH) instrument. The new DACOM first flew in the summer 2013 SEAC4RS campaign, measuring CO from the DC-8 aircraft, and more recently measured all three gases from the NASA P-3B aircraft in support of the summer 2014 DISCOVER-AQ campaign. We will present relevant aspects of the new instrument design and operation as well as selected data from recent campaigns illustrating instrument performance and some preliminary science.

  2. Validating NASA's Airborne Multikilohertz Microlaser Altimeter (Microaltimeter) by Direct Comparison of Data Taken Over Ocean City, Maryland Against an Existing Digital Elevation Model

    NASA Technical Reports Server (NTRS)

    Abel, Peter

    2003-01-01

    NASA's Airborne Multikilohertz Microlaser Altimeter (Microaltimeter) is a scanning, photon-counting laser altimeter that uses a low-energy (less than 10 microjoules), high-repetition-rate (approximately 10 kHz) laser transmitting at 532 nm. A 14 cm diameter telescope images the ground return onto a segmented-anode photomultiplier, which provides up to 16 range returns for each fire. Multiple engineering flights were made during 2001 and 2002 over the Maryland and Virginia coastal area, all during daylight hours. Post-processing of the data to geolocate the laser footprint and determine the terrain height requires post-detection Poisson filtering techniques to extract the actual ground returns from the noise. Validation of the instrument's ability to produce accurate terrain heights will be accomplished by direct comparison of data taken over Ocean City, Maryland with a Digital Elevation Model (DEM) of the region produced at Ohio State University (OSU) from other laser altimeter and photographic sources. The techniques employed to produce terrain heights from the Microaltimeter ranges will be shown, along with some preliminary comparisons with the OSU DEM.
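The post-detection Poisson filtering idea, picking the range bin whose photon count is improbable under a uniform noise background, can be sketched as follows. The bin width, false-alarm rate, and the crude noise-rate estimate are illustrative assumptions, not the mission's actual parameters:

```python
import math
from collections import Counter

def find_ground_bin(ranges_m, bin_m=1.0, false_alarm=1e-3):
    """Return the range (m) of the bin whose count exceeds a Poisson
    noise threshold, or None if no bin is statistically significant."""
    bins = Counter(int(r // bin_m) for r in ranges_m)
    n_bins = max(bins) - min(bins) + 1
    lam = len(ranges_m) / n_bins          # rough mean noise counts per bin
    # smallest k with P(X >= k) < false_alarm for X ~ Poisson(lam)
    k, tail = 0, 1.0
    while tail >= false_alarm:
        k += 1
        tail -= math.exp(-lam) * lam ** (k - 1) / math.factorial(k - 1)
    best, count = bins.most_common(1)[0]
    return best * bin_m if count >= k else None
```

Solar background photons spread uniformly over the range gate, while true ground returns pile up in one bin, which is why a simple tail-probability test can separate them in daylight data.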

  3. Single chip camera active pixel sensor

    NASA Technical Reports Server (NTRS)

    Shaw, Timothy (Inventor); Pain, Bedabrata (Inventor); Olson, Brita (Inventor); Nixon, Robert H. (Inventor); Fossum, Eric R. (Inventor); Panicacci, Roger A. (Inventor); Mansoorian, Barmak (Inventor)

    2003-01-01

    A totally digital single chip camera includes communications circuitry to operate most of its structure in serial communication mode. The digital single chip camera includes a D/A converter for converting an input digital word into an analog reference signal. The chip includes all of the circuitry necessary for operating the chip using a single pin.

  4. Integrated Digital Image Correlation considering gray level and blur variations: Application to distortion measurements of IR camera

    NASA Astrophysics Data System (ADS)

    Charbal, Ali; Dufour, John-Eric; Guery, Adrien; Hild, François; Roux, Stéphane; Vincent, Ludovic; Poncelet, Martin

    2016-03-01

    The acquisition of images with different modalities may involve different alterations with respect to an ideal imaging model: inhomogeneous brightness and contrast, blur due to imperfect focusing, and distortions are common. It is proposed herein to account for such effects, for instance by registering a calibration target image with an actual optical image to measure lens distortions. An Integrated Digital Image Correlation (I-DIC) algorithm that accounts for the above artifacts is proposed and detailed. The resolution and uncertainty of the technique are first investigated on synthetic images; the technique is then applied to the measurement of distortions in infrared (IR) images. The procedure is shown to drastically reduce the residual level, assessing the validity of the image formation model and, more importantly, allowing much improved registration of the images.

  5. New approach to color calibration of high fidelity color digital camera by using unique wide gamut color generator based on LED diodes

    NASA Astrophysics Data System (ADS)

    Kretkowski, M.; Shimodaira, Y.; Jabłoński, R.

    2008-11-01

    Development of a high accuracy color reproduction system requires certain instrumentation and a reference for color calibration. Our research led to the development of a high fidelity color digital camera with implemented filters that realize the color matching functions. The output signal returns XYZ values, which provide an absolute description of color. In order to produce XYZ output, a mathematical conversion must be applied to the CCD output values, introducing a conversion matrix. The conversion matrix coefficients are calculated using a color reference with known XYZ values and the corresponding output signals from the CCD sensor under each filter, acquired from a certain number of color samples. The most important feature of the camera is its ability, due to the implemented filters, to acquire colors from the complete theoretically visible color gamut. However, market-available color references such as various color checkers are enclosed within the HDTV gamut, which is insufficient for calibration over the whole operating color range. This led to the development of a unique color reference based on LEDs called the LED Color Generator (LED CG). It is capable of displaying colors in a wide color gamut estimated by the chromaticity coordinates of 12 primary colors; the total number of producible colors is 255¹². Its biggest advantage is the possibility of displaying colors with a desired spectral distribution (with certain approximations) due to the multiple primary colors it comprises. The average color difference obtained for test colors was found to be ΔE ≈ 0.78 for calibration with the LED CG. The result is much better and more repeatable than with the Macbeth ColorChecker™, which typically gives ΔE ≈ 1.2 and, in the best case, ΔE ≈ 0.83 with specially developed techniques.
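Given a set of reference colors with known XYZ values, a 3x3 conversion matrix can be fitted by least squares. A self-contained sketch using the normal equations; the helper names are mine, and a real calibration would use many more samples than shown in the test:

```python
def solve(A, b):
    """Gauss-Jordan solve of A x = b with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_conversion_matrix(sensor, xyz):
    """Least-squares 3x3 matrix mapping sensor triplets to reference XYZ.

    sensor: list of (s1, s2, s3) filter responses; xyz: matching XYZ values.
    Solves the normal equations (S^T S) m = S^T x for each XYZ channel.
    """
    StS = [[sum(a[i] * a[j] for a in sensor) for j in range(3)]
           for i in range(3)]
    rows = []
    for k in range(3):  # X, Y, Z channels in turn
        Stx = [sum(a[i] * t[k] for a, t in zip(sensor, xyz)) for i in range(3)]
        rows.append(solve(StS, Stx))
    return rows
```

Each row of the fitted matrix converts one camera triplet into one XYZ component; applying it to new CCD outputs yields the absolute color description.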

  6. Quantitative Single-Particle Digital Autoradiography with α-Particle Emitters for Targeted Radionuclide Therapy using the iQID Camera

    SciTech Connect

    Miller, Brian W.; Frost, Sophia; Frayo, Shani; Kenoyer, Aimee L.; Santos, E. B.; Jones, Jon C.; Green, Damian J.; Hamlin, Donald K.; Wilbur, D. Scott; Fisher, Darrell R.; Orozco, Johnnie J.; Press, Oliver W.; Pagel, John M.; Sandmaier, B. M.

    2015-07-01

    Abstract Alpha emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm) causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with alpha emitters may inactivate targeted cells with minimal radiation damage to surrounding tissues. For accurate dosimetry in alpha-RIT, tools are needed to visualize and quantify the radioactivity distribution and absorbed dose to targeted and non-targeted cells, especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, iQID (ionizing-radiation Quantum Imaging Detector), for use in alpha-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection technology that images and identifies charged-particle and gamma-ray/X-ray emissions spatially and temporally on an event-by-event basis. It employs recent advances in CCD/CMOS cameras and computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, we evaluated this system's characteristics for alpha particle imaging including measurements of spatial resolution and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (²¹¹At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ~20 μm full width at half maximum (FWHM) and the alpha particle background was measured at a rate of (2.6 ± 0.5) × 10⁻⁴ cpm/cm² (40 mm diameter detector area). Simultaneous imaging of multiple tissue sections was performed using a large-area iQID configuration (ø 11.5 cm

  7. High-resolution digital elevation model of lower Cowlitz and Toutle Rivers, adjacent to Mount St. Helens, Washington, based on an airborne lidar survey of October 2007

    USGS Publications Warehouse

    Mosbrucker, Adam

    2015-01-01

    The lateral blast, debris avalanche, and lahars of the May 18th, 1980, eruption of Mount St. Helens, Washington, dramatically altered the surrounding landscape. Lava domes were extruded during the subsequent eruptive periods of 1980–1986 and 2004–2008. More than three decades after the emplacement of the 1980 debris avalanche, high sediment production persists in the Toutle River basin, which drains the northern and western flanks of the volcano. Because this sediment increases the risk of flooding to downstream communities on the Toutle and lower Cowlitz Rivers, the U.S. Army Corps of Engineers (USACE), under the direction of Congress to maintain an authorized level of flood protection, continues to monitor and mitigate excess sediment in North and South Fork Toutle River basins to help reduce this risk and to prevent sediment from clogging the shipping channel of the Columbia River. From October 22–27, 2007, Watershed Sciences, Inc., under contract to USACE, collected high-precision airborne lidar (light detection and ranging) data that cover 273 square kilometers (105 square miles) of lower Cowlitz and Toutle River tributaries from the Columbia River at Kelso, Washington, to upper North Fork Toutle River (below the volcano's edifice), including lower South Fork Toutle River. These data provide a digital dataset of the ground surface, including beneath forest cover. Such remotely sensed data can be used to develop sediment budgets and models of sediment erosion, transport, and deposition. The U.S. Geological Survey (USGS) used these lidar data to develop digital elevation models (DEMs) of the study area. DEMs are fundamental to monitoring natural hazards and studying volcanic landforms, fluvial and glacial geomorphology, and surface geology. Watershed Sciences, Inc., provided files in the LASer (LAS) format containing laser returns that had been filtered, classified, and georeferenced. The USGS produced a hydro-flattened DEM from ground-classified points at

  8. High-resolution digital elevation model of Mount St. Helens crater and upper North Fork Toutle River basin, Washington, based on an airborne lidar survey of September 2009

    USGS Publications Warehouse

    Mosbrucker, Adam

    2014-01-01

    The lateral blast, debris avalanche, and lahars of the May 18th, 1980, eruption of Mount St. Helens, Washington, dramatically altered the surrounding landscape. Lava domes were extruded during the subsequent eruptive periods of 1980–1986 and 2004–2008. More than three decades after the emplacement of the 1980 debris avalanche, high sediment production persists in the North Fork Toutle River basin, which drains the northern flank of the volcano. Because this sediment increases the risk of flooding to downstream communities on the Toutle and Cowlitz Rivers, the U.S. Army Corps of Engineers (USACE), under the direction of Congress to maintain an authorized level of flood protection, built a sediment retention structure on the North Fork Toutle River in 1989 to help reduce this risk and to prevent sediment from clogging the shipping channel of the Columbia River. From September 16–20, 2009, Watershed Sciences, Inc., under contract to USACE, collected high-precision airborne lidar (light detection and ranging) data that cover 214 square kilometers (83 square miles) of Mount St. Helens and the upper North Fork Toutle River basin from the sediment retention structure to the volcano's crater. These data provide a digital dataset of the ground surface, including beneath forest cover. Such remotely sensed data can be used to develop sediment budgets and models of sediment erosion, transport, and deposition. The U.S. Geological Survey (USGS) used these lidar data to develop digital elevation models (DEMs) of the study area. DEMs are fundamental to monitoring natural hazards and studying volcanic landforms, fluvial and glacial geomorphology, and surface geology. Watershed Sciences, Inc., provided files in the LASer (LAS) format containing laser returns that had been filtered, classified, and georeferenced. The USGS produced a hydro-flattened DEM from ground-classified points at Castle, Coldwater, and Spirit Lakes. Final results averaged about five laser last

  9. A Low Noise, Microprocessor-Controlled, Internally Digitizing Rotating-Vane Electric Field Mill for Airborne Platforms

    NASA Technical Reports Server (NTRS)

    Bateman, M. G.; Stewart, M. F.; Blakeslee, R. J.; Podgorny, s. J.; Christian, H. J.; Mach, D. M.; Bailey, J. C.; Daskar, D.

    2006-01-01

    This paper reports on a new generation of aircraft-based rotating-vane style electric field mills designed and built at NASA's Marshall Spaceflight Center. The mills have individual microprocessors that digitize the electric field signal at the mill and respond to commands from the data system computer. The mills are very sensitive (1 V/m per bit), have a wide dynamic range (115 dB), and are very low noise (+/-1 LSB). Mounted on an aircraft, these mills can measure fields from +/-1 V/m to +/-500 kV/m. Once-per-second commanding from the data collection computer to each mill allows for precise timing and synchronization. The mills can also be commanded to execute a self-calibration in flight, which is done periodically to monitor the status and health of each mill.
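The quoted dynamic range can be checked from the full-scale field and the 1 V/m resolution; a one-line sketch (the function name is mine):

```python
import math

def dynamic_range_db(max_field_v_per_m, resolution_v_per_m):
    """Dynamic range in dB between full scale and one least significant bit."""
    return 20 * math.log10(max_field_v_per_m / resolution_v_per_m)
```

For a 500 kV/m full scale at 1 V/m per bit this gives about 114 dB, consistent with the stated 115 dB.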

  10. Use of a digital camera onboard an unmanned aerial vehicle to monitor spring phenology at individual tree level

    NASA Astrophysics Data System (ADS)

    Berra, Elias; Gaulton, Rachel; Barr, Stuart

    2016-04-01

    The monitoring of forest phenology in a cost-effective manner, at a fine spatial scale and over relatively large areas, remains a significant challenge. To address this issue, unmanned aerial vehicles (UAVs) appear to be a potential new option for forest phenology monitoring. The aim of this study is to assess the potential of imagery acquired from a UAV to track seasonal changes in the leaf canopy at individual tree level. UAV flights, deploying consumer-grade standard and near-infrared-modified cameras, were carried out over a deciduous woodland during the spring season of 2015, from which a temporal series of calibrated and georeferenced 5 cm spatial resolution orthophotos was generated. Initial results from a subset of trees are presented in this paper. Four trees with different observed Start of Season (SOS) dates were selected to monitor UAV-derived Green Chromatic Coordinate (GCC) as a measure of canopy greenness. Mean GCC values were extracted from within the four individual tree crowns and were plotted against the day of year (DOY) when the data were acquired. The temporal GCC trajectory of each tree was associated with the visual observations of leaf canopy phenology (SOS) and also with the development of understory vegetation. The chronological order in which sudden increases of GCC values occurred matched the chronological order of observed SOS: the first sudden increase in GCC was detected in the tree which first reached SOS; 18.5 days later (on average) the last sudden increase of GCC was detected in the tree which last reached SOS (18 days later than the first one). Trees with later observed SOS presented GCC values increasing slowly over time, which were associated with the development of understory vegetation. Ongoing work addresses: 1) testing different indices; 2) radiometric calibration (retrieval of spectral reflectance); 3) expanding the analysis to more tree individuals, more tree species and larger forest areas; and 4) deriving
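The greenness index used in such studies, the Green Chromatic Coordinate, is the green fraction of the total digital number; a minimal sketch applied to mean crown values:

```python
def green_chromatic_coordinate(r, g, b):
    """GCC = G / (R + G + B): canopy greenness from mean crown digital numbers.

    Being a ratio, GCC suppresses common brightness changes (illumination),
    which is why it is widely used for phenology time series.
    """
    return g / (r + g + b)
```

A sudden sustained rise in a crown's GCC trajectory marks its start of season.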

  11. Suitability of low cost commercial off-the-shelf aerial platforms and consumer grade digital cameras for small format aerial photography

    NASA Astrophysics Data System (ADS)

    Turley, Anthony Allen

    Many research projects require the use of aerial images. Wetlands evaluation, crop monitoring, wildfire management, environmental change detection, and forest inventory are but a few of the applications of aerial imagery. Low altitude Small Format Aerial Photography (SFAP) is a bridge between satellite and man-carrying aircraft image acquisition and ground-based photography. The author's project evaluates digital images acquired using low cost commercial digital cameras and standard model airplanes to determine their suitability for remote sensing applications. Images from two different sites were obtained. Several photo missions were flown over each site, acquiring images in the visible and near infrared electromagnetic bands. Images were sorted and analyzed to select those with the least distortion, and blended together with Microsoft Image Composite Editor. By selecting images taken within minutes apart, radiometric qualities of the images were virtually identical, yielding no blend lines in the composites. A commercial image stitching program, Autopano Pro, was purchased during the later stages of this study. Autopano Pro was often able to mosaic photos that the free Image Composite Editor was unable to combine. Using telemetry data from an onboard data logger, images were evaluated to calculate scale and spatial resolution. ERDAS ER Mapper and ESRI ArcGIS were used to rectify composite images. Despite the limitations inherent in consumer grade equipment, images of high spatial resolution were obtained. Mosaics of as many as 38 images were created, and the author was able to record detailed aerial images of forest and wetland areas where foot travel was impractical or impossible.

  12. Wide-Area Persistent Airborne Video: Architecture and Challenges

    NASA Astrophysics Data System (ADS)

    Palaniappan, Kannappan; Rao, Raghuveer M.; Seetharaman, Guna

    The need for persistent video covering large geospatial areas using embedded camera networks and stand-off sensors has increased over the past decade. The availability of inexpensive, compact, light-weight, energy-efficient, high-resolution optical sensors and associated digital image processing hardware has led to a new class of airborne surveillance platforms. The traditional tradeoff between lens size and resolution, that is, the numerical aperture of the system, can now be mitigated using an array of cameras mounted in a specific geometry. This fundamental advancement enables new imaging systems to cover very large fields of view at high resolution, albeit with spatially varying point spread functions. Airborne imaging systems capable of acquiring 88 megapixels per frame, over a wide field of view of 160 degrees or more, at low frame rates of several hertz, and with color sampling have been built using an optical array of up to eight cameras. These platforms, fitted with accurate orientation sensors, circle above an area of interest at constant altitude, steadily adjusting the orientation of the camera array so that it remains fixed on a narrow area of interest, ideally locked to a point on the ground. The resulting image sequence maintains a persistent observation of an extended geographical area, depending on the altitude of the platform and the configuration of the camera array. Suitably geo-registering and stabilizing these very large format videos provides a virtual nadir view of the region being monitored, enabling a new class of urban-scale activity analysis applications. The sensor geometry, processing challenges, and scene interpretation complexities are highlighted.

  13. Multi-illumination Gabor holography recorded in a single camera snap-shot for high-resolution phase retrieval in digital in-line holographic microscopy

    NASA Astrophysics Data System (ADS)

    Sanz, Martin; Picazo-Bueno, Jose A.; Garcia, Javier; Micó, Vicente

    2015-05-01

    In this contribution we introduce MISHELF microscopy, a new concept and design of a lensless holographic microscope based on wavelength multiplexing, single-hologram acquisition, and digital image processing. The technique, whose name comes from Multi-Illumination Single-Holographic-Exposure Lensless Fresnel microscopy, is based on the simultaneous illumination and recording of three diffraction patterns in the Fresnel domain. In combination with a novel and fast iterative phase retrieval algorithm, MISHELF microscopy is capable of high-resolution (micron range), phase-retrieved (twin-image eliminated) biological imaging of dynamic events (video-rate recording speed), since it avoids the time multiplexing needed for in-line hologram sequence recording when using conventional phase-shifting or phase retrieval algorithms. MISHELF microscopy is validated using two different experimental layouts: one using RGB illumination and detection schemes, and another using IRRB illumination while keeping the RGB color camera as the detection device. Preliminary experimental results are provided for both experimental layouts using a synthetic object (USAF resolution test target).

  14. Real-time look-up table-based color correction for still image stabilization of digital cameras without using frame memory

    NASA Astrophysics Data System (ADS)

    Luo, Lin-Bo; An, Sang-Woo; Wang, Chang-Shuai; Li, Ying-Chun; Chong, Jong-Wha

    2012-09-01

    Digital cameras usually decrease exposure time to capture motion-blur-free images. However, this operation will generate an under-exposed image with a low-budget complementary metal-oxide semiconductor image sensor (CIS). Conventional color correction algorithms can efficiently correct under-exposed images; however, they generally do not run in real time and need at least one frame memory if implemented in hardware. The authors propose a real-time look-up table-based color correction method that corrects under-exposed images in hardware without using frame memory. The method uses histogram matching of two preview images, exposed for a long and a short time, respectively, to construct an improved look-up table (ILUT), and then corrects the captured under-exposed image in real time. Because the ILUT is calculated in real time before the captured image is processed, this method does not require frame memory to buffer image data and can therefore greatly reduce the cost of the CIS. The method supports not only single-image capture but also bracketing, capturing three images at a time. The proposed method was implemented in a hardware description language and verified on a field-programmable gate array with a 5 M CIS. Simulations show that the system performs in real time at low cost and corrects the color of under-exposed images well.
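
    The heart of the method, deriving a tone-mapping LUT by matching the cumulative histograms of the short- and long-exposure previews, can be sketched in a few lines. This is a single-channel software sketch under assumed 8-bit data (illustrative only; the paper describes a hardware implementation):

```python
import numpy as np

def build_ilut(short_preview, long_preview, levels=256):
    """LUT mapping short-exposure tones to long-exposure tones by matching
    cumulative histograms (single-channel sketch of the ILUT idea)."""
    cdf_s = np.cumsum(np.bincount(short_preview.ravel(), minlength=levels))
    cdf_l = np.cumsum(np.bincount(long_preview.ravel(), minlength=levels))
    cdf_s = cdf_s / cdf_s[-1]
    cdf_l = cdf_l / cdf_l[-1]
    # for each input level, pick the output level with the matching CDF value
    lut = np.searchsorted(cdf_l, cdf_s, side="left").clip(0, levels - 1)
    return lut.astype(np.uint8)

def correct(image, lut):
    """Per-pixel table lookup -- no frame buffering required."""
    return lut[image]

rng = np.random.default_rng(0)
long_img = rng.integers(80, 250, (64, 64), dtype=np.uint8)  # well-exposed preview
short_img = (long_img * 0.4).astype(np.uint8)               # under-exposed preview
lut = build_ilut(short_img, long_img)
corrected = correct(short_img, lut)
print(short_img.mean(), "->", corrected.mean())
```

    Because `correct` is a pure per-pixel lookup, it maps naturally onto streaming hardware with no frame memory, which is the point of the paper's design.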

  15. Airborne system for testing multispectral reconnaissance technologies

    NASA Astrophysics Data System (ADS)

    Schmitt, Dirk-Roger; Doergeloh, Heinrich; Keil, Heiko; Wetjen, Wilfried

    1999-07-01

    There is an increasing demand for future airborne reconnaissance systems to obtain aerial images for tactical or peacekeeping operations. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensor systems and real-time, jam-resistant data transmission capabilities are of particular interest. An airborne experimental platform has been developed as a testbed to investigate different concepts of reconnaissance systems before their application in UAVs. It is based on a Dornier DO 228 aircraft, which is used as the flying platform. Great care has been taken to allow testing of different kinds of multispectral sensors: the platform can be equipped with an IR sensor head, high-resolution aerial cameras covering the whole optical spectrum, and radar systems. The onboard equipment further includes systems for digital image processing, compression, coding, and storage. The data are RF-transmitted to the ground station using technologies with high jam resistance. The images, after merging with enhanced vision components, are delivered to the observer, who has an uplink data channel available to control flight and imaging parameters.

  16. Assessing modern ground survey methods and airborne laser scanning for digital terrain modelling: A case study from the Lake District, England

    NASA Astrophysics Data System (ADS)

    Gallay, Michal; Lloyd, Christopher D.; McKinley, Jennifer; Barry, Lorraine

    2013-02-01

    This paper compares the applicability of three ground survey methods for modelling terrain: one-man electronic tachymetry (TPS), real-time kinematic GPS (GPS), and terrestrial laser scanning (TLS). The vertical accuracy of digital terrain models (DTMs) derived from GPS, TLS, and airborne laser scanning (ALS) data is assessed. Point elevations acquired by the four methods represent two sections of a mountainous area in Cumbria, England, chosen so that the presence of non-terrain features is minimized. The vertical accuracy of the DTMs was addressed by subtracting each DTM from TPS point elevations. The error was assessed using exploratory measures including summary statistics, histograms, and normal probability plots. The results showed that the internal measurement accuracy of TPS, GPS, and TLS was below a centimetre. TPS and GPS can be considered equally applicable alternatives for sampling the terrain in areas accessible on foot. The highest DTM vertical accuracy was achieved with GPS data, both on sloped terrain (RMSE 0.16 m) and flat terrain (RMSE 0.02 m). TLS surveying was the most efficient overall, but the fidelity of the terrain representation suffered under dense vegetation cover. Therefore, the DTM accuracy was the lowest for the sloped area with dense bracken (RMSE 0.52 m), although it was the second highest on the flat, unobscured terrain (RMSE 0.07 m). ALS data represented the sloped terrain more realistically (RMSE 0.23 m) than the TLS. However, due to a systematic bias identified on the flat terrain, the DTM accuracy there was the lowest (RMSE 0.29 m), exceeding the error level stated by the data provider. The error distributions were more closely approximated by a normal distribution defined using the median and normalized median absolute deviation, which supports the use of robust measures in DEM error modelling and error propagation.
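
    The accuracy measures used above are straightforward to compute once each DTM has been sampled at the TPS checkpoint locations. A minimal sketch with invented toy numbers; NMAD is the robust spread estimate behind the paper's median-based normal-distribution comparison:

```python
import numpy as np

def dtm_errors(dtm_elev, checkpoint_elev):
    """Vertical errors: DTM elevation minus checkpoint (TPS) elevation."""
    return np.asarray(dtm_elev) - np.asarray(checkpoint_elev)

def rmse(err):
    """Root mean square error."""
    return float(np.sqrt(np.mean(np.square(err))))

def nmad(err):
    """Normalized median absolute deviation: a robust spread estimate that
    matches the standard deviation for normally distributed errors."""
    err = np.asarray(err)
    return float(1.4826 * np.median(np.abs(err - np.median(err))))

# toy checkpoints with one outlier: RMSE is inflated, NMAD stays robust
err = dtm_errors([10.02, 9.98, 10.05, 10.00, 10.52],
                 [10.00, 10.00, 10.00, 10.00, 10.00])
print(f"RMSE: {rmse(err):.3f} m, NMAD: {nmad(err):.3f} m")
```

    The contrast between the two figures on outlier-contaminated data is exactly why the robust measures are recommended for DEM error modelling.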

  17. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge-coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample-and-hold circuit before it is converted to a digital signal. The analog-to-digital converter has eight-bit resolution to preserve accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general-purpose industrial standard port and the other a high-speed/high-performance application-specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  18. Introduction to an airborne remote sensing system equipped onboard the Chinese marine surveillance plane

    NASA Astrophysics Data System (ADS)

    Gong, Fang; Wang, Difeng; Pan, Delu; Hao, Zengzhou

    2008-10-01

    The airborne remote sensing system onboard the Chinese Marine Surveillance Plane currently has three scanners: the marine airborne multi-spectrum scanner (MAMS), the airborne hyperspectral system (AISA+), and the optical-electric platform (MOP). MAMS was developed by the Shanghai Institute of Technology and Physics, CAS, with 11 bands from ultraviolet to infrared, and is mainly used for retrieval of key oceanic parameters and pollution information, such as chlorophyll, sea surface temperature, and red tide. The AISA+, made by the Finnish company Specim, is a push-broom system consisting of a hyperspectral scanner head, a miniature GPS/INS sensor, and a data-collecting PC. It is an airborne imaging spectrometer with the ability to image ground targets and measure their spectral characteristics. The MOP mainly supports object observation, recording, and tracking, and includes three instruments: a Sony DXC-390 digital CCD camera, a Canon EOS film camera, and a Sony F717 digital camera. This paper introduces these three remote sensing instruments as well as the ground processing information system, covering the system's hardware and software design, related algorithm research, etc.

  19. Targetless Camera Calibration

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Mussio, L.; Remondino, F.; Scaioni, M.

    2011-09-01

    In photogrammetry a camera is considered calibrated if its interior orientation parameters are known. These encompass the principal distance, the principal point position, and some Additional Parameters used to model possible systematic errors. The current state of the art for automated camera calibration relies on the use of coded targets to accurately determine the image correspondences. This paper presents a new methodology for the efficient and rigorous photogrammetric calibration of digital cameras which no longer requires the use of targets. A set of images depicting a well-textured scene is sufficient for the extraction of natural corresponding image points. These are automatically matched with feature-based approaches and robust estimation techniques. The subsequent photogrammetric bundle adjustment retrieves the unknown camera parameters and their theoretical accuracies. Examples, considerations, and comparisons with real data and different case studies are illustrated to show the potential of the proposed methodology.

  20. New portable system for dental plaque measurement using a digital single-lens reflex camera and image analysis: Study of reliability and validation

    PubMed Central

    Rosa, Guillermo Martin; Elizondo, Maria Lidia

    2015-01-01

    Background: The quantification of dental plaque (DP) by indices has limitations: they depend on the operator's subjective evaluation and are measured on an ordinal scale. The purpose of this study was to develop and evaluate a method to measure DP on a proportional scale. Materials and Methods: A portable photographic positioning device (PPPD) was designed and added to a digital single-lens reflex camera. Seventeen subjects participated in this study; after DP disclosure with erythrosine, their incisors and a calibration scale were photographed by two operators in duplicate, re-positioning the PPPD between acquisitions. A third operator registered the Quigley-Hein modified by Turesky DP index (Q-H/TPI). After tooth brushing, the same operators repeated the photographs and the Q-H/TPI. The image analysis system (IAS) technique allowed measurement, in mm2, of the total vestibular tooth area and the area with DP. Results: Reliability was determined with the intra-class correlation coefficient, which was 0.9936 (P < 0.05) for intra-operator repeatability and 0.9931 (P < 0.05) for inter-operator reproducibility. Validity was assessed using Spearman's correlation coefficient, which indicated a strong positive correlation with the Q-H/TPI, rs = 0.84 (P < 0.01). The sensitivity of the IAS was evaluated with two sample sizes; only the IAS was able to detect significant differences (P < 0.05) with the smaller sample size (n = 8). Conclusions: The image analysis system proved to be a reliable and valid method to measure the quantity of DP on a proportional scale, allowing a more powerful statistical analysis and thus facilitating trials with a smaller sample size. PMID:26229267
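
    The conversion from pixel counts to a proportional (mm2) scale rests on the calibration scale included in every photograph. A minimal sketch with hypothetical numbers (ruler length, pixel spans, and region sizes are invented for illustration, not taken from the study):

```python
import numpy as np

def mm_per_pixel(scale_len_mm, scale_len_px):
    """Scale factor from the calibration ruler captured in each photograph."""
    return scale_len_mm / scale_len_px

def area_mm2(mask, mm_px):
    """Area of a boolean region mask (e.g. disclosed plaque) in mm^2."""
    return int(mask.sum()) * mm_px ** 2

mm_px = mm_per_pixel(10.0, 200)            # a 10 mm ruler segment spans 200 px
plaque = np.zeros((100, 100), dtype=bool)
plaque[20:60, 30:80] = True                # 40 x 50 px plaque region
tooth = np.ones((100, 100), dtype=bool)    # whole vestibular surface
ratio = area_mm2(plaque, mm_px) / area_mm2(tooth, mm_px)
print(f"plaque: {area_mm2(plaque, mm_px):.2f} mm^2, coverage: {ratio:.1%}")
```

    Because areas are continuous quantities, such measurements live on a proportional scale, unlike the ordinal Q-H/TPI scores.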

  1. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated and deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum, approximated by a power law function, were estimated as the scattering parameters. The spectra of the absorption and reduced scattering coefficients were reconstructed from these parameters, and the spectral images of the absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the method's potential for evaluating the pathophysiological condition of in vivo brain tissue subject to loss of viability.
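
    The first stage, Wiener estimation of spectral images from RGB responses, amounts to a linear estimator learned from training pairs of spectra and camera responses, W = &lt;r v^T&gt;&lt;v v^T&gt;^{-1}, applied per pixel. A minimal noise-free sketch in which the camera sensitivities and training spectra are random placeholders (in the study they would come from measured data):

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical training data: reflectance spectra at 9 wavelengths (columns)
# and their RGB camera responses via a placeholder 3x9 sensitivity matrix
n_wl, n_train = 9, 500
spectra = rng.random((n_wl, n_train))
sensitivity = rng.random((3, n_wl))
rgb = sensitivity @ spectra

# Wiener estimation matrix: W = <r v^T> <v v^T>^{-1}
W = (spectra @ rgb.T) @ np.linalg.inv(rgb @ rgb.T)

# estimate the spectrum of a new sample from its RGB triplet
r_true = rng.random(n_wl)
v = sensitivity @ r_true
r_est = W @ v
print(f"mean abs estimation error: {np.abs(r_est - r_true).mean():.3f}")
```

    Recovering nine spectral values from three camera channels is under-determined, so the estimate is only as good as the statistical prior encoded in the training set.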

  2. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters.

    PubMed

    Cambraia Lopes, Patricia; Clementel, Enrico; Crespo, Paulo; Henrotin, Sebastien; Huizenga, Jan; Janssens, Guillaume; Parodi, Katia; Prieels, Damien; Roellinghoff, Frauke; Smeets, Julien; Stichelbaut, Frederic; Schaart, Dennis R

    2015-08-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digital photon counters (DPCs). PG profiles emitted from a PMMA target upon irradiation with a 160 MeV proton pencil beam (about 6.5 × 10^9 protons delivered in total) were measured using detector modules equipped with four DPC arrays coupled to BGO or LYSO:Ce crystal matrices. The knife-edge slit collimator and detector module were placed at 15 cm and 30 cm from the beam axis, respectively, in all cases. The use of LYSO:Ce enabled time-of-flight (TOF) rejection of background events, by synchronizing the DPC readout electronics with the 106 MHz radiofrequency signal of the cyclotron. The signal-to-background (S/B) ratio of 1.6 obtained with a 1.5 ns TOF window and a 3 MeV-7 MeV energy window was about 3 times higher than that obtained with the same detector module without TOF discrimination and 2 times higher than the S/B ratio obtained with the BGO module. Even 1 mm shifts of the Bragg peak position translated into clear and consistent shifts of the PG profile if TOF discrimination was applied, for a total number of protons as low as about 6.5 × 10^8 and a detector surface of 6.6 cm × 6.6 cm. PMID:26216269

  3. Development of an airborne laser bathymeter

    NASA Technical Reports Server (NTRS)

    Kim, H. H.; Cervenka, P. O.; Lankford, C. B.

    1975-01-01

    An airborne laser depth sounding system was built and taken through a complete series of field tests. Two green laser sources were tried: a pulsed neon laser at 540 nm and a frequency-doubled Nd:YAG transmitter at 532 nm. To obtain a depth resolution of better than 20 cm, the pulses had a duration of 5 to 7 nanoseconds and could be fired at rates of up to 50 pulses per second. In the receiver, the signal was detected by a photomultiplier tube connected to a 28 cm diameter Cassegrainian telescope that was aimed vertically downward. Oscilloscopic traces of the signal reflected from the sea surface and the ocean floor could either be recorded by a movie camera on 35 mm film or digitized into 500 discrete channels of information and stored on magnetic tape, from which depth information could be extracted. An aerial color movie camera recorded the geographic footprint while a boat crew of oceanographers measured depth and other relevant water parameters. About two hundred hours of flight time on the NASA C-54 airplane in the areas of Chincoteague, Virginia, the Chesapeake Bay, and Key West, Florida, have yielded information on the actual operating conditions of such a system and helped to optimize the design. One can predict the maximum depth attainable in a mission by measuring the effective attenuation coefficient in flight; this quantity is four times smaller than the usual narrow-beam attenuation coefficient. Several square miles of varied underwater landscape were also mapped.
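
    The depth measurement itself follows from the delay between the surface and bottom returns, using the reduced speed of light in water. A minimal sketch (n ≈ 1.33 for sea water is an assumed round figure, and the timing numbers are illustrative):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33      # assumed refractive index of sea water

def depth_from_returns(t_surface_ns, t_bottom_ns):
    """Depth from the delay between surface and bottom returns: light
    travels at c/n in water and covers the depth twice (down and back)."""
    dt_s = (t_bottom_ns - t_surface_ns) * 1e-9
    return (C / N_WATER) * dt_s / 2.0

# a bottom return arriving 100 ns after the surface return
print(f"{depth_from_returns(0.0, 100.0):.2f} m")  # → 11.27 m
```

    The quoted 20 cm depth resolution corresponds to resolving return delays on the order of a couple of nanoseconds, which is why the short 5-7 ns pulses and fast digitization matter.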

  4. Spectra-view: A high performance, low-cost multispectral airborne imaging system

    SciTech Connect

    Helder, D.

    1996-11-01

    Although a variety of airborne platforms are available for collecting remote sensing data, a niche exists for a low-cost, compact system capable of collecting accurate visible and infrared multispectral data in a digital format. To fill this void, an instrument known as Spectra-View was developed by Airborne Data Systems. Multispectral data is collected in the visible and near-infrared using an array of CCD cameras with appropriate spectral filtering. Infrared imaging is accomplished using commercially available cameras. Although the current system images in five spectral bands, a modular design approach allows various configurations for imaging in the visible and infrared regions with up to 10 or more channels. It was built entirely through integration of readily available commercial components, is compact enough to fly in an aircraft as small as a Cessna 172, and can record imagery at airspeeds in excess of 150 knots. A GPS-based navigation system provides a course deviation indicator for the pilot to follow and allows for georeferencing of the data. To maintain precise pointing knowledge and at the same time keep system cost low, attitude sensors are mounted directly with the cameras rather than on a stabilized mounting system. Aircraft/camera attitude along the yaw, pitch, and roll axes is recorded at the moment of camera firing. All data are collected in a digital format on a hard disk that is removable during flight, so that virtually unlimited amounts of data may be recorded. Following collection, imagery is readily available for viewing and incorporation into computer-based systems for analysis and reduction. Ground processing software has been developed to perform radiometric calibration and georeference the imagery. Since June 1995, the system has been collecting high-quality data in a variety of applications for numerous customers, including agriculture, forestry, and global change research. Several examples will be presented.

  5. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, Patricia; Clementel, Enrico; Crespo, Paulo; Henrotin, Sebastien; Huizenga, Jan; Janssens, Guillaume; Parodi, Katia; Prieels, Damien; Roellinghoff, Frauke; Smeets, Julien; Stichelbaut, Frederic; Schaart, Dennis R.

    2015-08-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digital photon counters (DPCs). PG profiles emitted from a PMMA target upon irradiation with a 160 MeV proton pencil beam (about 6.5 × 10^9 protons delivered in total) were measured using detector modules equipped with four DPC arrays coupled to BGO or LYSO:Ce crystal matrices. The knife-edge slit collimator and detector module were placed at 15 cm and 30 cm from the beam axis, respectively, in all cases. The use of LYSO:Ce enabled time-of-flight (TOF) rejection of background events, by synchronizing the DPC readout electronics with the 106 MHz radiofrequency signal of the cyclotron. The signal-to-background (S/B) ratio of 1.6 obtained with a 1.5 ns TOF window and a 3 MeV-7 MeV energy window was about 3 times higher than that obtained with the same detector module without TOF discrimination and 2 times higher than the S/B ratio obtained with the BGO module. Even 1 mm shifts of the Bragg peak position translated into clear and consistent shifts of the PG profile if TOF discrimination was applied, for a total number of protons as low as about 6.5 × 10^8 and a detector surface of 6.6 cm × 6.6 cm.

  6. Expected accuracy of tilt measurements on a novel hexapod-based digital zenith camera system: a Monte-Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Hirt, Christian; Papp, Gábor; Pál, András; Benedek, Judit; Szũcs, Eszter

    2014-08-01

    Digital zenith camera systems (DZCS) are dedicated astronomical-geodetic measurement systems for the observation of the direction of the plumb line. A DZCS key component is a pair of tilt meters for the determination of the instrumental tilt with respect to the plumb line. The highest accuracy (i.e., 0.1 arc-seconds or better) is achieved in practice through observation with precision tilt meters in opposite faces (180° instrumental rotation) and application of rigorous tilt reduction models. A novel concept proposes the development of a hexapod (Stewart platform)-based DZCS. However, hexapod-based total rotations are limited to about 30°-60° in azimuth (equivalent to ±15° to ±30° yaw rotation), which raises the question of the impact of the rotation angle between the two faces on the accuracy of the tilt measurement. The goal of the present study is to investigate the expected accuracy of tilt measurements to be carried out on future hexapod-based DZCS, with special focus placed on the role of the limited rotation angle. A Monte-Carlo simulation study is carried out in order to derive accuracy estimates for the tilt determination as a function of several input parameters, and the results are validated against analytical error propagation. As the main result of the study, limiting the instrumental rotation to 60° (30°) degrades the tilt accuracy by a factor of about 2 (4) compared to a 180° rotation between the faces. Nonetheless, tilt accuracy at the 0.1 arc-second level is expected when the rotation is at least 45° and 0.05 arc-second (about 0.25 microradian) accurate tilt meters are deployed. As such, a hexapod-based DZCS can be expected to allow sufficiently accurate determination of the instrumental tilt. This provides supporting evidence for the feasibility of such a novel instrumentation. The outcomes of our study are not only relevant to the field of DZCS, but also to all other types of instruments where the instrumental tilt
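
    The reported factor-of-2 (factor-of-4) accuracy loss for 60° (30°) rotations can be reproduced with a small Monte-Carlo sketch. The measurement model below is a simplification assumed for illustration (dual-axis tilt meter reading = rotated tilt vector + constant zero offset + Gaussian noise), not the paper's full reduction model; solving the two faces m1 = t + b and m2 = R(θ)t + b gives t = (I − R(θ))⁻¹(m1 − m2), whose noise amplification relative to 180° behaves as 1/sin(θ/2):

```python
import numpy as np

rng = np.random.default_rng(42)

def rot(theta):
    """2D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def tilt_std(theta_deg, sigma=0.05, n=200_000):
    """Monte-Carlo std (arcsec) of the recovered tilt for two faces
    separated by theta_deg, given per-axis reading noise sigma."""
    theta = np.radians(theta_deg)
    t_true = np.array([0.3, -0.2])   # arbitrary true tilt (arcsec)
    b = np.array([1.0, -0.5])        # tilt-meter zero offset (cancels out)
    m1 = t_true + b + sigma * rng.standard_normal((n, 2))
    m2 = rot(theta) @ t_true + b + sigma * rng.standard_normal((n, 2))
    A = np.eye(2) - rot(theta)       # from m1 - m2 = (I - R) t + noise
    t_est = (m1 - m2) @ np.linalg.inv(A).T
    return t_est.std(axis=0).mean()

s180 = tilt_std(180.0)
for ang in (60.0, 30.0):
    print(f"{ang:4.0f} deg faces: noise amplification vs 180 deg = "
          f"{tilt_std(ang) / s180:.2f}")
```

    Under this toy model the amplification is 1/sin(30°) = 2.0 at 60° and 1/sin(15°) ≈ 3.9 at 30°, consistent with the factors quoted in the abstract.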

  7. Airborne Imagery Collections Barrow 2013

    DOE Data Explorer

    Cherry, Jessica; Crowder, Kerri

    2015-07-20

    The data here are orthomosaics, digital surface models (DSMs), and individual frames captured during low altitude airborne flights in 2013 at the Barrow Environmental Observatory. The orthomosaics, thermal IR mosaics, and DSMs were generated from the individual frames using Structure from Motion techniques.

  8. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    Yates, George J.; McDonald, Thomas E., Jr.; Bliss, David E.; Cameron, Stewart M.; Zutavern, Fred J.

    2000-12-20

    Research is presented on infrared (IR) and near-infrared (NIR) sensitive sensor technologies for use in a high-speed shuttered/intensified digital video camera system for range-gated imaging at ''eye-safe'' wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible-range light for detection with generic S-20 photocathodes. The intensifiers are ''stripline''-geometry 18-mm diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  9. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightnesses, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance, and multispectral military systems. PMID:27410361

  10. Hybrid 3D laser sensor based on a high-performance long-range wide-field-of-view laser scanner and a calibrated high-resolution digital camera

    NASA Astrophysics Data System (ADS)

    Ullrich, Andreas; Studnicka, Nikolaus; Riegl, Johannes

    2004-09-01

    We present a hybrid sensor consisting of a high-performance 3D imaging laser sensor and a high-resolution digital camera. The laser sensor uses the time-of-flight principle based on near-infrared pulses. We demonstrate the performance capabilities of the system by presenting example data, and we describe the software package used for data acquisition, data merging, and visualization. The advantages of using both near-range photogrammetry and laser scanning for data registration and data extraction are discussed.

  11. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  12. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  13. Airborne laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven E.

    2002-06-01

    The US Air Force Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the risk reduction approach being utilized to ensure program success.

  14. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, the increase in their computational performance, combined with higher storage capacity, allows them to process large amounts of data. In this paper an overview of the current trends of the consumer camera market and technology will be given, providing also some details about the recent past (from the digital still camera up to today) and forthcoming key issues.

  15. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact, network-addressable, scientific-grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and a power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.
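    Assuming the usual sensor definition DR = 20·log10(full well / read noise) — an assumption, not stated in the record — the quoted 70 dB and 14-electron read noise imply a full-well capacity of roughly 44,000 electrons:

```python
# Back-of-envelope check, assuming DR_dB = 20*log10(full_well / read_noise).
read_noise_e = 14.0   # electrons (stated upper bound)
dr_db = 70.0          # stated dynamic range

full_well_e = read_noise_e * 10.0 ** (dr_db / 20.0)
print(f"implied full well: {full_well_e:.0f} e-")  # ~44272 e-
```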

  16. The Geospectral Camera: a Compact and Geometrically Precise Hyperspectral and High Spatial Resolution Imager

    NASA Astrophysics Data System (ADS)

    Delauré, B.; Michiels, B.; Biesemans, J.; Livens, S.; Van Achteren, T.

    2013-04-01

    Small unmanned aerial vehicles are increasingly being employed for environmental monitoring at local scale, which drives the demand for compact and lightweight spectral imagers. This paper describes the geospectral camera, a novel compact imager concept. The camera is built around an innovative detector which has two sensor elements on a single chip and therefore offers the functionality of two cameras within the volume of a single one. The two sensor elements allow the camera to derive both spectral and geometric information (high spatial resolution imagery and a digital surface model) of the scene of interest. A first geospectral camera prototype has been developed. It uses a linear variable optical filter installed in front of one of the two sensors of the MEDUSA CMOS imager chip. An accompanying software approach has been developed which exploits the simultaneous information of the two sensors in order to extract an accurate spectral image product. This method has been functionally demonstrated by applying it to image data acquired during an airborne acquisition.

  17. Using Digital Imaging in Classroom and Outdoor Activities.

    ERIC Educational Resources Information Center

    Thomasson, Joseph R.

    2002-01-01

    Explains how to use digital cameras and related basic equipment during indoor and outdoor activities. Uses digital imaging in general botany class to identify unknown fungus samples. Explains how to select a digital camera and other necessary equipment. (YDS)

  18. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  19. LISS-4 camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Paul, Sandip; Dave, Himanshu; Dewan, Chirag; Kumar, Pradeep; Sansowa, Satwinder Singh; Dave, Amit; Sharma, B. N.; Verma, Anurag

    2006-12-01

    The Indian Remote Sensing Satellites use indigenously developed high-resolution cameras for generating data related to vegetation, landform/geomorphic and geological boundaries. The data from this camera are used for producing maps at 1:12500 scale for national-level policy development in town planning, vegetation monitoring etc. The LISS-4 camera was launched onboard the Resourcesat-1 satellite by ISRO in 2003. LISS-4 is a high-resolution multi-spectral camera with three spectral bands, a resolution of 5.8 m and a swath of 23 km from an altitude of 817 km. The panchromatic mode provides a swath of 70 km and 5-day revisit. This paper briefly discusses the configuration of the LISS-4 camera of Resourcesat-1, its onboard performance and also the changes in the camera being developed for Resourcesat-2. The LISS-4 camera images the earth in push-broom mode. It is designed around a three-mirror unobscured telescope, three linear 12K CCDs and associated electronics for each band. The three spectral bands are realized by splitting the focal plane in the along-track direction using an isosceles prism. High-speed camera electronics is designed for each detector, with 12-bit digitization and digital double sampling of the video. Seven bits of data, selected from the 10 MSBs by telecommand, are transmitted. The total dynamic range of the sensor covers up to 100% albedo. The camera structure has heritage from IRS-1C/D. The optical elements are precisely glued to specially designed flexure mounts. The camera is assembled onto a rotating deck on the spacecraft to facilitate +/- 26° steering in the pitch-yaw plane. The camera is held on the spacecraft in a stowed condition before deployment. The excellent imagery from the LISS-4 camera onboard Resourcesat-1 is routinely used worldwide. A second such camera, with similar performance, is being developed for the Resourcesat-2 launch in 2007. The camera electronics has been optimized and miniaturized; the size and weight are reduced to one third, and the power to half, of the Resourcesat-1 values.

  20. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has simplification features derived from cameras designed for easy yet accurate use in a weightless environment. One innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  1. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown, wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  2. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  3. Estimation of the Atmospheric Refraction Effect in Airborne Images Using Radiosonde Data

    NASA Astrophysics Data System (ADS)

    Beisl, U.; Tempelmann, U.

    2016-06-01

    The influence of atmospheric refraction on the geometric accuracy of airborne photogrammetric images was already considered in the days of analogue photography. The effect is a function of the varying refractive index along the path from the ground to the image sensor, and therefore depends on the height above ground, the view zenith angle and the atmospheric constituents. It leads to a gradual increase of scale towards the borders of the image, i.e., a magnification takes place. Textbooks list a shift of several pixels at the borders of standard wide-angle images. Since, in the analogue era, images could only be acquired in good weather conditions, the effect was calculated using standard atmospheres for good atmospheric conditions, leading to simple empirical formulas. Often the pixel shift caused by refraction was approximated as linear with height and compensated by an adjustment of the focal length. With the advent of sensitive digital cameras, the image dynamics allows for capturing images in adverse weather conditions, so the influence of atmospheric profiles on the geometric accuracy of the images has to be investigated and the validity of the standard correction formulas checked. This paper compares the results from the standard formulas by Saastamoinen with results calculated from a broad selection of atmospheres obtained from radiosonde profile data. The geometric deviation is calculated by numerical integration of the refractive index as a function of height, using the refractive index formula by Ciddor. It turns out that the effect of different atmospheric profiles (including inversion situations) is generally small compared to the overall effect, except at low camera heights, where the absolute deviation is in any case small. Since the necessary atmospheric profile data are often not readily available for airborne images, a formula proposed by Saastamoinen is verified that uses only camera height, the pressure
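    A minimal sketch of the kind of numerical integration described above: a ray is traced downward from the camera through a layered atmosphere using Snell's law (n·sinθ invariant with θ measured from the vertical), and its lateral ground displacement is compared with the straight-line path. The exponential refractive-index profile is a crude stand-in for the Ciddor formula with radiosonde data; all numbers are illustrative.

```python
import math

def n_air(h):
    # Crude exponential model of the optical refractive index of air
    # (assumed for illustration; the paper instead uses the Ciddor
    # formula with measured radiosonde profiles).
    return 1.0 + 2.8e-4 * math.exp(-h / 8000.0)

def ground_shift(h_cam, zenith_deg, dh=1.0):
    """Lateral ground displacement of a refracted ray versus a straight
    line, traced downward from the camera through 1 m atmospheric layers
    (Snell's law: n*sin(theta) is invariant along the ray)."""
    theta0 = math.radians(zenith_deg)
    invariant = n_air(h_cam) * math.sin(theta0)
    x_refracted = 0.0
    h = h_cam
    while h > 0.0:
        theta = math.asin(invariant / n_air(h))
        x_refracted += math.tan(theta) * dh
        h -= dh
    x_straight = h_cam * math.tan(theta0)
    return x_refracted - x_straight

# Camera at 3000 m, 30 deg off nadir: the ray bends toward the vertical
# as n increases downward, so the shift is negative and decimetre-scale.
print(f"{ground_shift(3000.0, 30.0):+.3f} m")
```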

  4. SITHON: An Airborne Fire Detection System Compliant with Operational Tactical Requirements

    PubMed Central

    Kontoes, Charalabos; Keramitsoglou, Iphigenia; Sifakis, Nicolaos; Konstantinidis, Pavlos

    2009-01-01

    In response to the urgent need of fire managers for timely information on fire location and extent, the SITHON system was developed. SITHON is a fully digital thermal imaging system, integrating an INS/GPS and a digital camera, designed to provide timely positioned and projected thermal images and video data streams that can be rapidly integrated into the GIS operated by crisis control centres. This article presents in detail the hardware and software components of SITHON, and demonstrates the first encouraging results of test flights over the Sithonia Peninsula in Northern Greece. It is envisaged that the SITHON system will soon be operated onboard various airborne platforms, including fire brigade airplanes and helicopters as well as UAV platforms owned and operated by the Greek Air Force. PMID:22399963

  5. Development of an airborne remote sensing system for aerial applicators

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An airborne remote sensing system was developed and tested for recording aerial images of field crops, which were analyzed for variations of crop health or pest infestation. The multicomponent system consists of a multi-spectral camera system, a camera control system, and a radiometer for normalizi...

  6. Making Connections with Digital Data

    ERIC Educational Resources Information Center

    Leonard, William; Bassett, Rick; Clinger, Alicia; Edmondson, Elizabeth; Horton, Robert

    2004-01-01

    State-of-the-art digital cameras open up enormous possibilities in the science classroom, especially when used as data collectors. Because most high school students are not fully formal thinkers, the digital camera can provide a much richer learning experience than traditional observation. Data taken through digital images can make the…

  7. Cameras Monitor Spacecraft Integrity to Prevent Failures

    NASA Technical Reports Server (NTRS)

    2014-01-01

    The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained while working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

  8. Extracting Roof Parameters and Heat Bridges Over the City of Oldenburg from Hyperspectral, Thermal, and Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Bannehr, L.; Luhmann, Th.; Piechel, J.; Roelfs, T.; Schmidt, An.

    2011-09-01

    Remote sensing methods are used to obtain different kinds of information about the state of the environment. Within the cooperative research project HiReSens, funded by the German BMBF, a hyperspectral scanner, an airborne laser scanner, a thermal camera, and an RGB camera are employed on a small aircraft to determine roof material parameters and heat bridges of rooftops over the city of Oldenburg, Lower Saxony. HiReSens aims to combine various geometrically highly resolved data in order to obtain relevant evidence about the state of the city's buildings. Thermal data are used to obtain the energy distribution of single buildings. Hyperspectral data yield information about the material composition of roofs. From airborne laser scanning (ALS) data, digital surface models are inferred; these form the basis for locating the best orientations for solar panels on the city's buildings. The combination of the different data sets offers the opportunity to capitalize on synergies between differently working systems. Central goals are the development of tools for the detection of heat bridges by means of thermal data, spectral characterization of roof parameters on the basis of hyperspectral data, and 3D capture of buildings from airborne laser scanner data. Collecting, analyzing and merging the data are not trivial, especially when resolution and accuracy in the order of a few decimetres are targeted. The results achieved need to be regarded as preliminary; further investigations are still required to prove the accuracy in detail.

  9. Automatic calibration method for plenoptic camera

    NASA Astrophysics Data System (ADS)

    Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao

    2016-04-01

    An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images on the white image are searched and recognized automatically based on digital morphology. Then, the center points of the microlens images are rearranged according to their relative position relationships. Consequently, the microlens images are located, i.e., the plenoptic camera is calibrated, without prior knowledge of the camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, even the multifocus plenoptic camera, the plenoptic camera with arbitrarily arranged microlenses, or the plenoptic camera with different sizes of microlenses. Finally, we verify our method on raw data from Lytro. The experiments show that our method requires less manual intervention than previously published methods.
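    A toy illustration of the first step described above (locating microlens images on the white image). The paper uses digital morphology; this sketch stands in a simple threshold, flood fill, and intensity centroid on a tiny synthetic flat-field image.

```python
# Hypothetical sketch: find microlens image centers on a white
# (flat-field) image by thresholding bright blobs, labeling them by
# flood fill, and taking intensity-weighted centroids.

def find_centers(img, thresh):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centers = []
    for y0 in range(h):
        for x0 in range(w):
            if img[y0][x0] <= thresh or seen[y0][x0]:
                continue
            # flood fill one bright blob, accumulating its centroid
            stack, sx, sy, sw = [(y0, x0)], 0.0, 0.0, 0.0
            seen[y0][x0] = True
            while stack:
                y, x = stack.pop()
                wgt = img[y][x]
                sx += wgt * x; sy += wgt * y; sw += wgt
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and \
                       not seen[ny][nx] and img[ny][nx] > thresh:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            centers.append((sx / sw, sy / sw))  # (x, y) center
    return centers

# Tiny synthetic flat-field: two bright 3x3 "microlens" spots.
img = [[0] * 12 for _ in range(6)]
for cy, cx in ((2, 3), (2, 9)):
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            img[cy + dy][cx + dx] = 200
print(sorted(find_centers(img, 100)))  # [(3.0, 2.0), (9.0, 2.0)]
```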

  10. Digitized Photography: What You Can Do with It.

    ERIC Educational Resources Information Center

    Kriss, Jack

    1997-01-01

    Discusses benefits of digital cameras which allow users to take a picture, store it on a digital disk, and manipulate/export these photos to a print document, Web page, or multimedia presentation. Details features of digital cameras and discusses educational uses. A sidebar presents prices and other information for 12 digital cameras. (AEF)

  11. Enhancing Positioning Accuracy in Urban Terrain by Fusing Data from a GPS Receiver, Inertial Sensors, Stereo-Camera and Digital Maps for Pedestrian Navigation

    PubMed Central

    Przemyslaw, Baranski; Pawel, Strumillo

    2012-01-01

    The paper presents an algorithm for estimating a pedestrian's location in an urban environment. The algorithm is based on the particle filter and uses different data sources: a GPS receiver, inertial sensors, probability maps and a stereo camera. Inertial sensors are used to estimate the relative displacement of a pedestrian. A gyroscope estimates the change in heading direction, and an accelerometer is used to count the pedestrian's steps and their lengths. The so-called probability maps help to limit GPS inaccuracy by imposing constraints on pedestrian kinematics, e.g., it is assumed that a pedestrian cannot cross buildings, fences etc. This limits position inaccuracy to ca. 10 m. Incorporation of depth estimates derived from a stereo camera, compared against a 3D model of the environment, has enabled further reduction of positioning errors. As a result, for 90% of the time, the algorithm is able to estimate a pedestrian's location with an error smaller than 2 m, compared to an error of 6.5 m for navigation based solely on GPS. PMID:22969321
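    A toy 1-D particle filter in the spirit of the fusion described above: particles are propagated with noisy step lengths (the inertial side) and weighted against noisy absolute fixes (the GPS side). All noise parameters are invented for illustration and do not come from the paper.

```python
import math
import random

random.seed(0)

N = 500
particles = [0.0] * N          # initial position assumed known
true_pos = 0.0

for _ in range(50):            # 50 pedestrian steps
    true_pos += 0.7            # true step length: 0.7 m (assumed)
    gps = true_pos + random.gauss(0.0, 6.5)  # noisy GPS fix, ~6.5 m error
    # predict: propagate each particle by a noisy step
    particles = [p + random.gauss(0.7, 0.1) for p in particles]
    # update: weight by Gaussian GPS likelihood, then resample
    weights = [math.exp(-0.5 * ((gps - p) / 6.5) ** 2) for p in particles]
    particles = random.choices(particles, weights=weights, k=N)

estimate = sum(particles) / N
print(f"error: {abs(estimate - true_pos):.2f} m")
```

    Even though each individual GPS fix is off by several metres, fusing it with the step-length odometry keeps the running estimate within a couple of metres, which is the qualitative effect the paper reports.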

  12. The Continuous wavelet in airborne gravimetry

    NASA Astrophysics Data System (ADS)

    Liang, X.; Liu, L.

    2013-12-01

    Airborne gravimetry is an efficient method to recover the medium- and high-frequency band of the earth's gravity field over any region, especially inaccessible areas. It can measure gravity data with high accuracy, high resolution and broad coverage in a rapid and economical way, and it will play an important role in geoid determination and geophysical exploration. Filtering to reduce high-frequency errors is critical to the success of airborne gravimetry, because aircraft acceleration is determined from GPS. Traditional filters used in airborne gravimetry are FIR and IIR filters. This study recommends an improved continuous wavelet approach to process airborne gravity data. Here we focus on how to construct the continuous wavelet filters and show their working principle. In particular, the technical parameters (window width parameter and scale parameter) of the filters are tested. Then the raw airborne gravity data from the first Chinese airborne gravimetry campaign are filtered using an FIR low-pass filter and continuous wavelet filters to remove the noise. A comparison to reference data is performed to determine the external accuracy, which shows that the continuous wavelet filters applied to airborne gravity in this study perform well. The advantages of the continuous wavelet filters over digital filters are also introduced. The effectiveness of the continuous wavelet filters for airborne gravimetry is demonstrated through real data computation.
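    The FIR low-pass baseline mentioned above can be sketched as a symmetric moving average over a synthetic noisy gravity profile (signal and noise levels invented for illustration); the paper's continuous wavelet filters would replace this fixed kernel with scale-tunable ones.

```python
import math
import random

random.seed(1)

def moving_average(x, half_width):
    """Symmetric moving-average FIR low-pass (window shrinks at edges)."""
    n = len(x)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

# Synthetic "gravity disturbance" profile: slow sinusoid + white noise.
t = [i / 100.0 for i in range(1000)]
clean = [5.0 * math.sin(2 * math.pi * 0.2 * ti) for ti in t]
noisy = [c + random.gauss(0.0, 2.0) for c in clean]
smooth = moving_average(noisy, 25)

rmse = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
print(f"RMSE vs truth: {rmse(noisy, clean):.2f} -> {rmse(smooth, clean):.2f}")
```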

  13. Benchmarking High Density Image Matching for Oblique Airborne Imagery

    NASA Astrophysics Data System (ADS)

    Cavegn, S.; Haala, N.; Nebiker, S.; Rothermel, M.; Tutzauer, P.

    2014-08-01

    Both improvements in camera technology and new pixel-wise matching approaches have triggered the further development of software tools for image-based 3D reconstruction. Meanwhile, research groups as well as commercial vendors provide photogrammetric software to generate dense, reliable and accurate 3D point clouds and Digital Surface Models (DSM) from highly overlapping aerial images. In order to evaluate the potential of these algorithms in view of the ongoing software developments, a suitable test bed is provided by the ISPRS/EuroSDR initiative Benchmark on High Density Image Matching for DSM Computation. This paper discusses the proposed test scenario to investigate the potential of dense matching approaches for 3D data capture from oblique airborne imagery. For this purpose, an oblique aerial image block captured at a GSD of 6 cm in the west of Zürich by a Leica RCD30 Oblique Penta camera is used. Within this paper, the potential test scenario is demonstrated using matching results from two software packages, Agisoft PhotoScan and SURE from the University of Stuttgart. As oblique images are frequently used for data capture at building facades, 3D point clouds are mainly investigated in such areas. Reference data from terrestrial laser scanning is used to evaluate the quality of the dense image matching results for several facade patches with respect to accuracy, density and reliability.
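    The ground sample distance quoted above follows from the standard relation GSD = pixel pitch × flying height / focal length; the values below are hypothetical, not the benchmark's actual RCD30 flight parameters.

```python
# GSD = pitch * H / f, with hypothetical flight parameters.
pixel_pitch_m = 6.0e-6    # 6 um pixel pitch
focal_m = 0.050           # 50 mm lens
height_m = 500.0          # flying height above ground

gsd_m = pixel_pitch_m * height_m / focal_m
print(f"GSD = {gsd_m * 100:.1f} cm")  # 6.0 cm
```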

  14. A Synergistic Approach to Atmospheric Compensation of Neon's Airborne Hyperspectral Imagery Utilizing an Airborne Solar Spectral Irradiance Radiometer

    NASA Astrophysics Data System (ADS)

    Wright, L.; Karpowicz, B. M.; Kindel, B. C.; Schmidt, S.; Leisso, N.; Kampe, T. U.; Pilewskie, P.

    2014-12-01

    A wide variety of critical information regarding bioclimate, biodiversity, and biogeochemistry is embedded in airborne hyperspectral imagery. Most, if not all, of the primary signal relies upon first deriving the surface reflectance of land cover and vegetation from the measured hyperspectral radiance. This places stringent requirements on terrain and atmospheric compensation algorithms to accurately derive surface reflectance properties. An observatory designed to measure bioclimate, biodiversity, and biogeochemistry variables from surface reflectance must take great care in developing an approach that chooses the algorithms with the highest accuracy, while providing those algorithms with the data necessary to describe the physical mechanisms that affect the measured at-sensor radiance. The Airborne Observation Platform (AOP), part of the National Ecological Observatory Network (NEON), is developing such an approach. NEON is a continental-scale ecological observation platform designed to collect and disseminate data to enable the understanding and forecasting of the impacts of climate change, land use change, and invasive species on ecology. The instrumentation package used by the AOP includes a visible and shortwave infrared hyperspectral imager, waveform LiDAR, and a high-resolution (RGB) digital camera. In addition to airborne measurements, ground-based CIMEL sun photometers will be used to help characterize atmospheric aerosol loading, and ground validation measurements with field spectrometers will be made at select NEON sites. While the core instrumentation package provides critical information to derive the surface reflectance of land surfaces and vegetation, the addition of a Solar Spectral Irradiance Radiometer (SSIR) is being investigated as an additional source of data to help identify and characterize atmospheric aerosol and cloud contributions to the radiance measured by the hyperspectral imager. The addition of the SSIR provides the opportunity to

  15. Correction of dark current in consumer cameras

    NASA Astrophysics Data System (ADS)

    Dunlap, Justin C.; Bodegom, Erik; Widenhorn, Ralf

    2010-01-01

    A study of dark current in the digital imagers of digital single-lens reflex (DSLR) and compact consumer-grade digital cameras is presented. Dark current is shown to vary with temperature, exposure time, and ISO setting. Further, dark current is shown to increase in successive images during a series of images. DSLR and compact consumer cameras are often designed such that they are contained within a densely packed camera body, and therefore the digital imagers within the camera frame are prone to heat generated by the sensor as well as by nearby elements within the camera body. It is the scope of this work to characterize the dark current in such cameras and to show that the dark current, in part due to heat generated by the camera itself, can be corrected by using hot pixels on the imager. This method generates computed dark frames based on the dark current indicator value of the hottest pixels on the chip. We compare this method to standard methods of dark current correction.
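    A minimal sketch of the hot-pixel idea: the hottest pixels act as dark-current indicators, so a master dark frame can be rescaled per exposure before subtraction. The synthetic 1-D "frames" below assume the hot pixels are dominated by dark current; the actual method in the paper is more elaborate.

```python
def correct_dark(frame, master_dark, hot_idx):
    """Subtract a master dark frame rescaled by the ratio of hot-pixel
    values in this exposure to those in the master dark."""
    hot_now = sum(frame[i] for i in hot_idx) / len(hot_idx)
    hot_ref = sum(master_dark[i] for i in hot_idx) / len(hot_idx)
    scale = hot_now / hot_ref
    return [f - scale * d for f, d in zip(frame, master_dark)]

# Synthetic 1-D frames: hot pixels at indices 3 and 5 dominate the dark.
master_dark = [1.0, 2.0, 1.5, 400.0, 1.2, 550.0]
scene = [10.0, 12.0, 11.0, 9.0, 10.5, 11.5]
# This exposure ran warmer: dark current is 1.5x the master dark.
frame = [s + 1.5 * d for s, d in zip(scene, master_dark)]

corrected = correct_dark(frame, master_dark, hot_idx=(3, 5))
# Non-hot pixels recover the scene values closely; the hot pixels
# themselves are zeroed out and remain unreliable, as expected.
print([round(c, 2) for c in corrected])
```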

  16. 75 FR 8112 - In the Matter of Certain Mobile Telephones and Wireless Communication Devices Featuring Digital...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-23

    ... Cameras, and Components Thereof; Notice of Investigation AGENCY: U.S. International Trade Commission... communication devices featuring digital cameras, and components thereof by reason of infringement of certain... mobile telephones or wireless communication devices featuring digital cameras, or ] components...

  17. Solid state television camera has no imaging tube

    NASA Technical Reports Server (NTRS)

    Huggins, C. T.

    1972-01-01

    Camera with characteristics of vidicon camera and greater resolution than home TV receiver uses mosaic of phototransistors. Because of low power and small size, camera has many applications. Mosaics can be used as cathode ray tubes and analog-to-digital converters.

  18. Imaging Emission Spectra with Handheld and Cellphone Cameras

    ERIC Educational Resources Information Center

    Sitar, David

    2012-01-01

    As point-and-shoot digital camera technology advances, it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Canon…

  19. Extracting dynamic spatial data from airborne imaging sensors to support traffic flow estimation

    NASA Astrophysics Data System (ADS)

    Toth, C. K.; Grejner-Brzezinska, D.

    The recent transition from analog to fully digital data acquisition and processing techniques in airborne surveying represents a major milestone in the evolution of spatial information science and practice. On one hand, the improved quality of the primary sensor data can provide the foundation for better automation of the information extraction processes. This trend is also strongly supported by continuously expanding computer technology, which offers almost unlimited processing power. On the other hand, the variety of the data, including rich information content and better temporal characteristics, acquired by the new digital sensors and coupled with rapidly advancing processing techniques, is broadening the applications of airborne surveying. One of these new application areas is traffic flow extraction, aimed at supporting better traffic monitoring and management. Transportation mapping has always represented a significant segment of civilian mapping and is mainly concerned with road corridor mapping for design and engineering purposes, infrastructure mapping and facility management, and more recently, environmental mapping. In all these cases, the objective of the mapping is to extract the static features of the object space, such as man-made and natural objects, typically along the road network. In contrast, the traffic moving in the transportation network represents a very dynamic environment, which complicates the spatial data extraction processes, as the signals of moving vehicles should be identified and removed. Rather than removing and discarding the signals, however, they can be turned into traffic flow information. This paper reviews initial research efforts to extract traffic flow information from laser scanner and digital camera sensors installed on airborne platforms.

  20. Long range translocations ashore to offshore using Spacelab-1 metric camera imagery in the perspective of integrated geophysical surveys

    NASA Astrophysics Data System (ADS)

    Galibert, G.

    1985-04-01

    It is shown that long-range translocations ashore to offshore of points or objects are possible (up to 150,000 m, X and Y) using Spacelab metric camera imagery when conventional methods of positioning by triangulation, Doppler satellites or radio-electric hyperbolic systems are unusable. Experiments in the Saint Malo coastal area along the French side of the English Channel are described. Dimensional measurements of colored crystals give an accuracy of 8 to 20 m through analog or digital methods, X and Y, with a possible transfer onto topographical maps or charts at scales of 1:25000 or 1:15560. Transfer of digitized images onto helicopter head-up displays (theoretical accuracy 1 m) is also possible to prepare airborne coastal surveys over shallow waters.

  1. [Guide to buying a camera for dermatological photography].

    PubMed

    Barco, L; Ribera, M; Casanova, J M

    2012-01-01

    Choosing a camera for use in the dermatology office is difficult, particularly in the case of a digital camera because the market is constantly evolving. This article explains the features that should be taken into account, including camera type, sensor, lens and macro capability, aperture priority mode, screen, viewfinder, operating speed, flash, battery, memory card, and image format. The most recent advances in the field of digital photography relevant to the dermatologist are discussed. PMID:22463769

  2. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera.

    PubMed

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on Bae Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft(®)'s Photosynth™ service available in the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479
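    The geometric core of turning dense image matches into a point cloud is the rectified-stereo relation Z = f·B/d (depth from focal length, baseline and pixel disparity); the numbers below are hypothetical UAV values, not from the study.

```python
# Depth from disparity for an ideal rectified stereo pair: Z = f * B / d.
# All values are hypothetical, chosen only to illustrate the relation.
focal_px = 3000.0      # focal length expressed in pixels
baseline_m = 5.0       # distance between the two exposure stations
disparity_px = 150.0   # pixel disparity of one matched point

depth_m = focal_px * baseline_m / disparity_px
print(f"depth: {depth_m:.1f} m")  # 100.0 m
```

    Repeating this per matched pixel (plus the camera orientations from the bundle adjustment) yields the dense point cloud; matching errors of a fraction of a pixel in d map directly into depth errors, which is why careful self-calibration matters for accuracy.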

  3. Point Cloud Generation from Aerial Image Data Acquired by a Quadrocopter Type Micro Unmanned Aerial Vehicle and a Digital Still Camera

    PubMed Central

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems’ SOCET SET classical commercial photogrammetric software and another is built using Microsoft®’s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but some artifacts were also detected. The point clouds from the Photosynth processing were sparser and noisier, to a large extent because the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for the properties of the imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479
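
    The accuracy behaviour discussed above follows the normal-case stereo relations of photogrammetry: object distance is Z = f·B/p, and depth precision degrades with the square of the distance. A minimal sketch with hypothetical focal length, stereo base, and parallax values (not the study's actual sensor or flight parameters):

```python
# Normal-case stereo photogrammetry relations (hypothetical numbers).

def depth_from_parallax(focal_mm: float, base_m: float, parallax_mm: float) -> float:
    """Object distance for the stereo normal case: Z = f * B / p."""
    return focal_mm * base_m / parallax_mm

def depth_precision(z_m: float, focal_mm: float, base_m: float,
                    sigma_p_mm: float) -> float:
    """Propagated depth precision: sigma_Z = Z**2 / (f * B) * sigma_p."""
    return (z_m ** 2) / (focal_mm * base_m) * sigma_p_mm

z = depth_from_parallax(focal_mm=8.0, base_m=10.0, parallax_mm=2.0)  # 40 m
sigma_z = depth_precision(z, 8.0, 10.0, 0.005)  # ~0.1 m at 5 um matching noise
```

    The quadratic growth of sigma_Z with distance is one reason low flying heights and high image overlaps matter for accurate UAV point clouds.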

  4. NASA IceBridge: Scientific Insights from Airborne Surveys of the Polar Sea Ice Covers

    NASA Astrophysics Data System (ADS)

    Richter-Menge, J.; Farrell, S. L.

    2015-12-01

    The NASA Operation IceBridge (OIB) airborne sea ice surveys are designed to continue a valuable series of sea ice thickness measurements by bridging the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat), which operated from 2003 to 2009, and ICESat-2, which is scheduled for launch in 2017. Initiated in 2009, OIB has conducted campaigns over the western Arctic Ocean (March/April) and the Southern Ocean (October/November) on an annual basis, when the thickness of the sea ice cover is nearing its maximum. More recently, a series of Arctic surveys have also collected observations in the late summer, at the end of the melt season. The Airborne Topographic Mapper (ATM) laser altimeter is one of OIB's primary sensors, in combination with the Digital Mapping System digital camera, a Ku-band radar altimeter, a frequency-modulated continuous-wave (FMCW) snow radar, and a KT-19 infrared radiation pyrometer. Data from the campaigns are available to the research community at: http://nsidc.org/data/icebridge/. This presentation will summarize the spatial and temporal extent of the OIB campaigns and their complementary role in linking in situ and satellite measurements, advancing observations of sea ice processes across all length scales. Key scientific insights gained on the state of the sea ice cover will be highlighted, including snow depth, ice thickness, surface roughness and morphology, and melt pond evolution.

  5. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of its own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  6. Lava Transport and Accumulation Processes on EPR 9°27'N to 10°N: Interpretations Based on Recent Near-Bottom Sonar Imaging and Seafloor Observations Using ABE, Alvin and a new Digital Deep Sea Camera

    NASA Astrophysics Data System (ADS)

    Schouten, H.; Tivey, M.; Fornari, D.; Yoerger, D.; Bradley, A.; Johnson, P.; Edwards, M.; Kurokawa, T.

    2002-12-01

    Sonar and digital photographic images of the seafloor on the East Pacific Rise (EPR) crest at 9°50'N and 9°28.5'N were collected during two cruises using the DSL-120A sidescan sonar, the autonomous underwater vehicle ABE (Autonomous Benthic Explorer), a new digital towed camera, and the submersible Alvin. High-resolution micro-bathymetry and sidescan sonar images of the EPR crest were correlated with visual and photographic observations of the seafloor. These data show evidence of channeled lava transport and extensive re-paving by successive volcanic flows, consistent with eruptions originating from the axis and flowing onto the upper rise flank out to ~2 km. The EPR crest out to approximately 2 km from the axis, between 9°27'N and 10°N, is, in map view, dominated by en echelon and coalescing scalloped acoustic reflectors that we infer to be interfingered lava flow fronts. These reflectors are a few hundred meters to more than 1 km long, and are convex outward (away from the axis), suggesting primary flow patterns are downslope to either side of the ridge axis. Eruption of lava within or proximal to the axial trough and downslope transport is the dominant constructional mode for building the fast-spreading crust in this area. ABE microbathymetry (1 m contour interval) shows the fronts to be several meters high, while towed-camera digital images indicate that the lava fronts are either bulbous pillows and tubes, or distal lobes of lobate flows. Lava flows within 2 km of the EPR axis have predominantly lobate morphologies. There are numerous indications in the sonar imagery of low-amplitude, dendritic flow channels emanating east and west from the axial trough along various sections of the ridge axis in the 9°28'N and 9°50'N region. Towed camera imagery confirmed that the channels are floored by smooth-surfaced sheet flows. Characteristics of the near-bottom magnetic field as measured by ABE correlate positively with interpretations and observations of lava

  7. SAND DEVIL: A digital video link for telemetry applications

    NASA Astrophysics Data System (ADS)

    Meacham, James A.; Ottesen, Cory W.; Bell, R. Michael

    1989-10-01

    A digital video encoder/decoder was built which is suitable for airborne telemetry. The system allows the use of multiple black and white or a single RGB camera. The spatial resolution, frame rate, and pixel compression algorithm can be tailored to specific mission requirements. The output bit rate of the encoder can be varied from 0.89 to 7.16 Mbit/sec, depending on test range capability and RF data link considerations. The digital output of the encoder can be encrypted for data security. The system architecture is flexible, yet very simple, leading to a compact design. Also, the entire system is implemented with off-the-shelf components, thus reducing development time and cost. The size of the encoder and decoder can be reduced substantially by using surface mount devices.

  8. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  9. Digital photography

    PubMed Central

    Windsor, J S; Rodway, G W; Middleton, P M; McCarthy, S

    2006-01-01

    Objective The emergence of a new generation of “point‐and‐shoot” digital cameras offers doctors a compact, portable and user‐friendly solution to the recording of highly detailed digital photographs and video images. This work highlights the use of such technology, and provides information for those who wish to record, store and display their own medical images. Methods Over a 3‐month period, a digital camera was carried by a doctor in a busy, adult emergency department and used to record a range of clinical images that were subsequently transferred to a computer database. Results In total, 493 digital images were recorded, of which 428 were photographs and 65 were video clips. These were successfully used for teaching purposes, publications and patient records. Conclusions This study highlights the importance of informed consent, the selection of a suitable package of digital technology and the role of basic photographic technique in developing a successful digital database in a busy clinical environment. PMID:17068281

  10. Using Google Earth for Rapid Dissemination of Airborne Remote Sensing Lidar and Photography

    NASA Astrophysics Data System (ADS)

    Wright, C. W.; Nayegandhi, A.; Brock, J. C.

    2006-12-01

    In order to visualize and disseminate vast amounts of lidar and digital photography data, we present a unique method that makes these data layers available via the Google Earth interface. The NASA Experimental Advanced Airborne Research Lidar (EAARL) provides unprecedented capabilities to survey coral reefs, nearshore benthic habitats, coastal vegetation, and sandy beaches. The EAARL sensor suite includes a water-penetrating lidar that provides high-resolution topographic information, a down-looking color digital camera, a down-looking high-resolution color-infrared (CIR) digital camera, and precision kinematic GPS receivers which provide for sub-meter geo-referencing of each laser and multispectral sample. Google Earth "kml" files are created for each EAARL multispectral and processed lidar image. A hierarchical structure of network links allows the user to download high-resolution images within the region of interest. The first network link (kmz file) downloaded by the user contains a color-coded flight path and "minute marker" icons along the flight path. Each "minute" icon provides access to the image overlays, and additional network links for each second along the flight path as well as flight navigation information. Layers of false-color-coded lidar Digital Elevation Model (DEM) data are made available in 2 km by 2 km tiles. These layers include canopy-top, bare-Earth, submerged topography, and links to any other lidar products. The user has the option to download the x,y,z ASCII point data or a DEM in the GeoTIFF file format for each tile. The NASA EAARL project captured roughly 250,000 digital photographs in five flights conducted a few days after Hurricane Katrina made landfall along the Gulf Coast in 2005. All of the photos and DEM layers are georeferenced and viewable online using Google Earth.
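
    The per-tile image overlays described above are plain KML GroundOverlay records behind network links. A minimal sketch of writing one such overlay (element layout per the OGC KML 2.2 schema; the tile name, image file, and bounding box are invented examples, not EAARL's actual tiling scheme):

```python
# Sketch of one KML GroundOverlay tile of the kind the abstract describes.

def ground_overlay_kml(name: str, href: str,
                       north: float, south: float,
                       east: float, west: float) -> str:
    """Return a KML document draping one image tile over a lat/lon box."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        '  <GroundOverlay>\n'
        f'    <name>{name}</name>\n'
        f'    <Icon><href>{href}</href></Icon>\n'
        '    <LatLonBox>\n'
        f'      <north>{north}</north><south>{south}</south>\n'
        f'      <east>{east}</east><west>{west}</west>\n'
        '    </LatLonBox>\n'
        '  </GroundOverlay>\n'
        '</kml>\n'
    )

kml = ground_overlay_kml("tile_0001", "tile_0001.jpg",
                         30.31, 30.29, -89.60, -89.62)
```

    A NetworkLink element pointing at files like this is what lets Google Earth fetch higher-resolution tiles on demand as the user navigates.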

  11. Spherical Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Developed largely through a Small Business Innovation Research contract through Langley Research Center, Interactive Picture Corporation's IPIX technology provides spherical photography: a panoramic, 360-degree view. NASA found the technology appropriate for use in guiding space robots, in the space shuttle and space station programs, as well as in research in cryogenic wind tunnels and for remote docking of spacecraft. Images of any location are captured in their entirety in a 360-degree immersive digital representation. The viewer can navigate to any desired direction within the image. Several car manufacturers already use IPIX to give viewers a look at their latest line-up of automobiles. Another application is for non-invasive surgeries. By using OmniScope, surgeons can look more closely at various parts of an organ with medical viewing instruments now in use. Potential applications of IPIX technology include viewing of homes for sale, hotel accommodations, museum sites, news events, and sports stadiums.

  12. Lights, camera, action research: The effects of didactic digital movie making on students' twenty-first century learning skills and science content in the middle school classroom

    NASA Astrophysics Data System (ADS)

    Ochsner, Karl

    Students are moving away from content consumption to content production. Short movies are uploaded onto video social networking sites and shared around the world. Unfortunately they usually contain little to no educational value, lack a narrative and are rarely created in the science classroom. According to new Arizona Technology standards and ISTE NET*S, along with the framework from the Partnership for 21st Century Learning Standards, our society demands students not only to learn curriculum, but to think critically, problem solve effectively, and become adept at communicating and collaborating. Didactic digital movie making in the science classroom may be one way that these twenty-first century learning skills may be implemented. An action research study using a mixed-methods approach to collect data was used to investigate if didactic moviemaking can help eighth grade students learn physical science content while incorporating 21st century learning skills of collaboration, communication, problem solving and critical thinking skills through their group production. Over a five week period, students researched lessons, wrote scripts, acted, video recorded and edited a didactic movie that contained a narrative plot to teach a science strand from the Arizona State Standards in physical science. A pretest/posttest science content test and KWL chart was given before and after the innovation to measure content learned by the students. Students then took a 21st Century Learning Skills Student Survey to measure how much they perceived that communication, collaboration, problem solving and critical thinking were taking place during the production. An open ended survey and a focus group of four students were used for qualitative analysis. Three science teachers used a project evaluation rubric to measure science content and production values from the movies. Triangulating the science content test, KWL chart, open ended questions and the project evaluation rubric, it

  13. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1.5 to 1 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  14. Phoenix Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Goetz, W.; Hartwig, H.; Hviid, S. F.; Kramm, R.; Markiewicz, W. J.; Reynolds, R.; Shinohara, C.; Smith, P.; Tanner, R.; Woida, P.; Woida, R.; Bos, B. J.; Lemmon, M. T.

    2008-10-01

    The Phoenix Robotic Arm Camera (RAC) is a variable-focus color camera mounted to the Robotic Arm (RA) of the Phoenix Mars Lander. It is designed to acquire both close-up images of the Martian surface and microscopic images (down to a scale of 23 μm/pixel) of material collected in the RA scoop. The mounting position at the end of the Robotic Arm allows the RAC to be actively positioned for imaging of targets not easily seen by the Stereo Surface Imager (SSI), such as excavated trench walls and targets under the Lander structure. Color information is acquired by illuminating the target with red, green, and blue light-emitting diodes. Digital terrain models (DTM) can be generated from RAC images acquired from different view points. This can provide high-resolution stereo information about fine details of the trench walls. The large stereo baseline possible with the arm can also provide a far-field DTM. The primary science objectives of the RAC are the search for subsurface soil/ice layering at the landing site and the characterization of scoop samples prior to delivery to other instruments on board Phoenix. The RAC shall also provide low-resolution panoramas in support of SSI activities and acquire images of the Lander deck for instrument and Lander check out. The camera design was inherited from the unsuccessful Mars Polar Lander mission (1999) and further developed for the (canceled) Mars Surveyor 2001 Lander (MSL01). Extensive testing and partial recalibration qualified the MSL01 RAC flight model for integration into the Phoenix science payload.

  15. Matching image color from different cameras

    NASA Astrophysics Data System (ADS)

    Fairchild, Mark D.; Wyble, David R.; Johnson, Garrett M.

    2008-01-01

    Can images from professional digital SLR cameras be made equivalent in color using simple colorimetric characterization? Two cameras were characterized, these characterizations were implemented on a variety of images, and the results were evaluated both colorimetrically and psychophysically. A Nikon D2x and a Canon 5D were used. The colorimetric analyses indicated that accurate reproductions were obtained. The median CIELAB color differences between the measured ColorChecker SG and the reproduced image were 4.0 and 6.1 for the Canon (chart and spectral respectively) and 5.9 and 6.9 for the Nikon. The median differences between cameras were 2.8 and 3.4 for the chart and spectral characterizations, near the expected threshold for reliable image difference perception. Eight scenes were evaluated psychophysically in three forced-choice experiments in which a reference image from one of the cameras was shown to observers in comparison with a pair of images, one from each camera. The three experiments were (1) a comparison of the two cameras with the chart-based characterizations, (2) a comparison with the spectral characterizations, and (3) a comparison of chart vs. spectral characterization within and across cameras. The results for the three experiments are 64%, 64%, and 55% correct respectively. Careful and simple colorimetric characterization of digital SLR cameras can result in visually equivalent color reproduction.
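
    The color differences quoted above are CIE 1976 ΔE*ab values: the Euclidean distance between two CIELAB triples, with differences around 2-3 commonly taken as near the perceptibility threshold for images. A minimal sketch (the Lab values are illustrative, not measurements from the study):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two made-up Lab measurements of the same patch from two cameras:
d = delta_e_ab((52.0, 10.0, -4.0), (50.0, 12.0, -3.0))  # 3.0
```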

  16. Proactive PTZ Camera Control

    NASA Astrophysics Data System (ADS)

    Qureshi, Faisal Z.; Terzopoulos, Demetri

    We present a visual sensor network—comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras—capable of automatically capturing closeup video of selected pedestrians in a designated area. The passive cameras can track multiple pedestrians simultaneously and any PTZ camera can observe a single pedestrian at a time. We propose a strategy for proactive PTZ camera control where cameras plan ahead to select optimal camera assignment and handoff with respect to predefined observational goals. The passive cameras supply tracking information that is used to control the PTZ cameras.

  17. Imagers for digital still photography

    NASA Astrophysics Data System (ADS)

    Bosiers, Jan; Dillen, Bart; Draijer, Cees; Manoury, Erik-Jan; Meessen, Louis; Peters, Inge

    2006-04-01

    This paper gives an overview of the requirements for, and current state-of-the-art of, CCD and CMOS imagers for use in digital still photography. Four market segments will be reviewed: mobile imaging, consumer "point-and-shoot cameras", consumer digital SLR cameras and high-end professional camera systems. The paper will also present some challenges and innovations with respect to packaging, testing, and system integration.

  18. Computer simulator for training operators of thermal cameras

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof; Krupski, Marcin

    2004-08-01

    A PC-based image generator, SIMTERM, developed for training operators of non-airborne military thermal imaging systems is presented in this paper. SIMTERM allows its users to generate images closely resembling the thermal images of many military-type targets in different scenarios, as obtained with the simulated thermal camera. High fidelity of simulation was achieved through the use of measurable thermal camera parameters as input data. Two modified versions of this computer simulator, developed for designers and test teams, are presented as well.

  19. Key performance requirements for military low-light television cameras

    NASA Astrophysics Data System (ADS)

    Shimer, Steven; Heim, Gerald

    2007-04-01

    Low-light-level video cameras have benefited from rapid advances in digital technology during the past two decades. In legacy cameras, the video signal was processed using analog electronics which made real-time, nonlinear processing of the video signal very difficult. In state-of-the-art cameras, the analog signal is digitized directly from the sensor and processed entirely in the digital domain, enabling the application of advanced processing techniques to the video signal in real time. In fact, all aspects of modern low-light television cameras are controlled via digital technology, resulting in various enhancements that surpass analog electronics. In addition to video processing, large-scale digital integration in these low-light level cameras enables precise control of the image intensifier and image sensor, facilitating large inter-scene dynamic range capability, extended intra-scene dynamic range and blooming control. Digital video processing and digital camera control are used to provide improved system-level performance, including nearly perfect pixel response uniformity, correction of blemishes, and electronic boresight. Compact digital electronics also enable comprehensive camera built-in-test (BIT) capability which provides coverage for the entire camera--from photons into the sensor to the processed video signal going out the connector. Individuals involved in the procurement of present and future low-light-level cameras need to understand these advanced camera capabilities in order to write accurate specifications for their advanced video system requirements. This paper provides an overview of these modern video system capabilities along with example specification text.

  20. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  1. New two-dimensional photon camera

    NASA Technical Reports Server (NTRS)

    Papaliolios, C.; Mertz, L.

    1982-01-01

    A photon-sensitive camera, applicable to speckle imaging of astronomical sources, high-resolution spectroscopy of faint galaxies in a crossed-dispersion spectrograph, or narrow-band direct imaging of galaxies, is presented. The camera is shown to supply 8-bit by 8-bit photon positions (256 x 256 pixels) for as many as 10^6 photons/sec with a maximum linear resolution of approximately 10 microns. The sequence of photon positions is recorded digitally with a VHS-format video tape recorder or formed into an immediate image via a microcomputer. The four basic elements of the camera are described in detail: a high-gain image intensifier with fast-decay output phosphor, a glass-prism optical-beam splitter, a set of Gray-coded masks, and a photomultiplier tube for each mask. The characteristics of the camera are compared to those of other photon cameras.
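
    Gray-coded masks matter here because successive Gray codes differ in exactly one bit, so a photon landing near a mask boundary can be wrong by at most one position. A sketch of the standard binary/reflected-Gray conversion that the mask set implements optically (illustrative code, not from the paper):

```python
def to_gray(n: int) -> int:
    """Binary to reflected Gray code."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Reflected Gray code back to binary."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# One 8-bit axis of the 256 x 256 position grid:
codes = [to_gray(i) for i in range(256)]
```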

  2. Determining camera parameters for round glassware measurements

    NASA Astrophysics Data System (ADS)

    Baldner, F. O.; Costa, P. B.; Gomes, J. F. S.; Filho, D. M. E. S.; Leta, F. R.

    2015-01-01

    Nowadays there are many types of accessible cameras, including digital single-lens reflex ones. Although these cameras are not usually employed in machine vision applications, they can be an interesting choice. However, they offer many user-selectable parameters, and it may be difficult to choose the settings that yield images of the needed metrological quality. This paper proposes a methodology to select a set of parameters that will supply a machine vision system with images of the needed quality, considering the measurements required of laboratory glassware.

  3. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F.; Jorge, Jorge M.

    1997-12-01

    The early evaluation of the visual status of human infants is of critical importance. It is essential to the development of the child's visual system that she perceives clear, focused retinal images; furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool well suited to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using recent advances in imaging devices, image processing and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes: it is refracted by the ocular media, strikes the retina (focusing or not), reflects off, and is collected by a camera. The system is formed by a CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation; an optomechanical system also allows eccentric illumination. The light source is a flash, synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  4. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Jorge, Jorge M.

    1998-01-01

    The early evaluation of the visual status of human infants is of critical importance. It is essential to the development of the child's visual system that she perceives clear, focused retinal images; furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool well suited to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using recent advances in imaging devices, image processing and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes: it is refracted by the ocular media, strikes the retina (focusing or not), reflects off, and is collected by a camera. The system is formed by a CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation; an optomechanical system also allows eccentric illumination. The light source is a flash, synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  5. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation, and a test of this method.
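
    The adjustment described amounts to subtracting the dark frames' mean and variance before forming the photon-transfer (mean-variance) ratio. A hedged sketch of that gain estimate, using two flat-field and two dark frames so that fixed-pattern noise cancels in the frame difference (the function name and the synthetic pixel lists are illustrative, not code from the report):

```python
from statistics import mean, pvariance

def camera_gain(flat1, flat2, dark1, dark2):
    """Photon-transfer gain estimate (e-/DN) with the dark-frame
    mean and variance subtracted out before taking the ratio."""
    mu_flat = (mean(flat1) + mean(flat2)) / 2
    mu_dark = (mean(dark1) + mean(dark2)) / 2
    # Differencing two like frames cancels fixed-pattern noise;
    # dividing the variance by 2 restores the single-frame value.
    var_flat = pvariance([a - b for a, b in zip(flat1, flat2)]) / 2
    var_dark = pvariance([a - b for a, b in zip(dark1, dark2)]) / 2
    return (mu_flat - mu_dark) / (var_flat - var_dark)
```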

  6. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    The Experimental Advanced Airborne Research Lidar (EAARL) is an example of a Light Detection and Ranging (Lidar) system that utilizes a blue-green wavelength (532 nanometers) to determine the distance to an object. The distance is determined by recording the travel time of a transmitted pulse at the speed of light (fig. 1). This system uses raster laser scanning with full-waveform (multi-peak) resolving capabilities to measure submerged topography and adjacent coastal land elevations simultaneously (Nayegandhi and others, 2009). This document reviews procedures for the post-processing of EAARL data using the custom-built Airborne Lidar Processing System (ALPS). ALPS software was developed in an open-source programming environment operated on a Linux platform. It has the ability to combine the laser return backscatter digitized at 1-nanosecond intervals with aircraft positioning information. This solution enables the exploration and processing of the EAARL data in an interactive or batch mode. ALPS also includes modules for the creation of bare earth, canopy-top, and submerged topography Digital Elevation Models (DEMs). The EAARL system uses an Earth-centered coordinate and reference system that removes the necessity to reference submerged topography data relative to water level or tide gages (Nayegandhi and others, 2006). The EAARL system can be mounted in an array of small twin-engine aircraft that operate at 300 meters above ground level (AGL) at a speed of 60 meters per second (117 knots). While other systems strive to maximize operational depth limits, EAARL has a narrow transmit beam and receiver field of view (1.5 to 2 milliradians), which improves the depth-measurement accuracy in shallow, clear water but limits the maximum depth to about 1.5 Secchi disk depth (~20 meters) in clear water. 
The laser transmitter [Continuum EPO-5000 yttrium aluminum garnet (YAG)] produces up to 5,000 short-duration (1.2 nanosecond), low-power (70 microjoules) pulses each second.
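
The ranging principle described above (distance from the recorded two-way travel time at the speed of light) amounts to a one-line computation; a sketch with an illustrative function name, not taken from the manual:

```python
# Illustrative lidar time-of-flight ranging: the one-way range is half the
# round-trip travel time multiplied by the speed of light.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_travel_time(t_seconds):
    """One-way distance (m) for a recorded round-trip travel time."""
    return C * t_seconds / 2.0

# Each 1-nanosecond digitizer sample therefore spans about 15 cm of range.
bin_m = range_from_travel_time(1e-9)
```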

  7. Performance Evaluation of Thermographic Cameras for Photogrammetric Measurements

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Guler, E.

    2013-05-01

    The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in deformation analyses caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For the performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and a lens with 18 mm focal length was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a lens with 20 mm focal length was used as the reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both cameras, and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, the position of the principal point, and the radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustment. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The standard deviations of the measured image coordinates were 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The image points measured in the Flir A320 thermographic images thus reached almost the same accuracy level as the digital camera, despite a pixel size four times larger. The results of this research show that the interior geometry of thermographic cameras and their lens distortion can be modelled efficiently
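
The radial and tangential distortion terms estimated in such a bundle adjustment can be illustrated with the standard Brown distortion model; the parameter names below (k1, k2, p1, p2) are generic conventions, not values from the paper:

```python
# Sketch of the radial/tangential (Brown) lens-distortion model commonly
# estimated as "additional parameters" in bundle block adjustment.
def distort(x, y, k1, k2, p1, p2):
    """Map ideal image coordinates (relative to the principal point)
    to distorted coordinates."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2       # radial term
    dx = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    dy = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return dx, dy
```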

  8. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising, essentially and in order from the front outer or gamma-ray-impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer, where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
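
The center-of-gravity calculation mentioned in the patent is, in essence, an intensity-weighted centroid over the count map; a minimal sketch of that general idea (our own, not the patent's algorithm):

```python
# Intensity-weighted centroid of a 2-D count map: the kind of
# center-of-gravity computation used to localize an imaged feature.
def center_of_gravity(counts):
    """counts: list of rows of non-negative values; returns (x, y)."""
    total = x_sum = y_sum = 0.0
    for y, row in enumerate(counts):
        for x, c in enumerate(row):
            total += c
            x_sum += c * x
            y_sum += c * y
    return x_sum / total, y_sum / total
```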

  9. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir camera and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern were determined by Pix4Dmapper and were independently adjusted and analyzed by the program system BLUH. Dense matching by Pix4Dmapper provided 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centres are satisfactorily but not very strongly connected. This leads to very high values of the Student test (T-test) for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration by IGI, but there are still radial symmetric distortions for the inclined cameras exceeding 5 μm in size, even though they were described as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding

  10. Performance of Large-Format Digital Cameras

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.

    2014-03-01

    Based on test blocks and standard production blocks, Z/I Imaging DMCII-140, 230 and 250 as well as UltraCam Eagle images have been analyzed by bundle block adjustment with self-calibration. By analysis of image coordinate residuals it is possible to check remaining systematic image errors and to update the set of additional parameters to the required combination. The Hannover program BLUH uses a basic set of 12 additional parameters, a combination of geometric parameters and Fourier parameters in polar coordinates. In addition, it is necessary to use a set of special additional parameters for the combination of the 9 UltraCam sub-CCDs. CCDs are often not flat enough, causing a bending at the edges; for the correct handling of such effects, special additional parameters are required to eliminate, or at least reduce, remaining systematic effects at the image corners. For all DMCII blocks with 5 cm, 7 cm, 9 cm and 20 cm GSD, the root mean square size of the systematic image errors, at 0.05 pixels, is very small. Only the basic set of 12 additional parameters is required; the special parameters for the image corners did not improve the accuracy determined with independent check points. In contrast to former results with UltraCam images, the monolithic stitching of the panchromatic sub-images to the green image improved the image geometry; nevertheless, for reaching the highest accuracy the full set of 52 additional parameters is required, leading to systematic image errors of approximately 0.2 pixels in the root mean square. This seems small, but it causes a model deformation of up to more than 1.0 GSD in height, while in the case of the DMCII images the model deformation did not exceed 0.2 GSD in Z. The major reason for the UltraCam Eagle image deformation seems to be corner effects of the green reference image. Such an effect can be avoided with a better calibration.
The DMCII and UltraCam Eagle images were sharpened by the firmware for edge enhancement, which also influences the effective image resolution as determined by edge analysis. No real loss of effective against nominal resolution can be seen. Nevertheless, the UltraCam Eagle images are a little noisy, which may be caused by the edge enhancement and by the imaging in January.

  11. The influence of the in situ camera calibration for direct georeferencing of aerial imagery

    NASA Astrophysics Data System (ADS)

    Mitishita, E.; Barrios, R.; Centeno, J.

    2014-11-01

    The direct determination of exterior orientation parameters (EOPs) of aerial images via GNSS/INS technologies is an essential prerequisite in photogrammetric mapping nowadays. Although direct sensor orientation provides a high degree of automation due to the GNSS/INS technologies, the accuracy of the obtained results depends on the quality of a group of parameters that accurately models the conditions of the system at the moment the job is performed. One sub-group of parameters (lever arm offsets and boresight misalignments) models the position and orientation of the sensors with respect to the IMU body frame, since it is impossible to have all sensors at the same position and orientation on the airborne platform. Another sub-group of parameters models the internal characteristics of the sensor (IOP). A system calibration procedure has been recommended by worldwide studies to obtain accurate parameters (mounting and sensor characteristics) for applications of direct sensor orientation. Commonly, mounting and sensor characteristics are not stable; they can vary under different flight conditions. System calibration requires a geometric arrangement of the flight and/or control points to decouple correlated parameters, which is not available in a conventional photogrammetric flight. Considering this difficulty, this study investigates the feasibility of in situ camera calibration to improve the accuracy of direct georeferencing of aerial images. The camera calibration uses a minimal image block, extracted from the conventional photogrammetric flight, and a control point arrangement. A digital Vexcel UltraCam XP camera connected to a POS AV(TM) system was used to acquire two photogrammetric image blocks. The blocks have different flight directions and opposite flight lines. In situ calibration procedures to compute different sets of IOPs are performed, and their results are analyzed and used in photogrammetric experiments. The IOPs

  12. Lightweight Electronic Camera for Research on Clouds

    NASA Technical Reports Server (NTRS)

    Lawson, Paul

    2006-01-01

    "Micro-CPI" (wherein "CPI" signifies "cloud-particle imager") is the name of a small, lightweight electronic camera that has been proposed for use in research on clouds. It would acquire and digitize high-resolution (3-μm-pixel) images of ice particles and water drops at a rate up to 1,000 particles (and/or drops) per second.

  13. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    Bressel, C.; Itzkan, I.; Nunes, J. E.; Hoge, F.

    1977-01-01

    The characteristics of an Airborne Oceanographic Lidar (AOL) are given. The AOL system is described and its potential for various measurement applications including bathymetry and fluorosensing is discussed.

  14. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  15. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  16. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

    A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since as early as the 1900s, little has changed in the design of the fundus camera, and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of fundus cameras are determined, and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled, and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities, such as optical coherence tomography, imaging spectroscopy and imaging polarimetry, to gather additional information about the properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. The design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  17. Advanced camera for surveys

    NASA Astrophysics Data System (ADS)

    Clampin, Mark; Ford, Holland C.; Bartko, Frank; Bely, Pierre Y.; Broadhurst, Tom; Burrows, Christopher J.; Cheng, Edward S.; Crocker, James H.; Franx, Marijn; Feldman, Paul D.; Golimowski, David A.; Hartig, George F.; Illingworth, Garth; Kimble, Randy A.; Lesser, Michael P.; Miley, George H.; Postman, Marc; Rafal, Marc D.; Rosati, Piero; Sparks, William B.; Tsvetanov, Zlatan; White, Richard L.; Sullivan, Pamela; Volmer, Paul; LaJeunesse, Tom

    2000-07-01

    The Advanced Camera for Surveys (ACS) is a third generation instrument for the Hubble Space Telescope (HST). It is currently planned for installation in HST during the fourth servicing mission in Summer 2001. The ACS will have three cameras.

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  19. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  20. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost, but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  1. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras comprise three combinations of gating and data recording: Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film. The advantages and disadvantages of each device will be discussed.

  2. The ethics of using cameras in care homes.

    PubMed

    Fisk, Malcolm; Flórez-Revuelta, Francisco

    There are concerns about how cameras in care homes might intrude on residents' and staff's privacy, but worries about resident abuse must also be recognised. This article outlines an ethical way forward and calls for a rethink about cameras that focuses less on their ability to "see" and more on their use as data-gathering tools. PMID:27141719

  3. Accuracy potential of large-format still-video cameras

    NASA Astrophysics Data System (ADS)

    Maas, Hans-Gerd; Niederoest, Markus

    1997-07-01

    High-resolution digital stillvideo cameras have found wide interest in digital close-range photogrammetry in the last five years. They can be considered fully autonomous digital image acquisition systems, without the requirement of a permanent connection to an external power supply and a host computer for camera control and data storage, thus allowing for convenient data acquisition in many applications of digital photogrammetry. The accuracy potential of stillvideo cameras has been extensively discussed. While large-format CCD sensors themselves can be considered very accurate measurement devices, the lenses, camera bodies and sensor mounts of stillvideo cameras are not necessarily as stable, and many cameras employ compression techniques in image storage, which may also affect the accuracy potential. This presentation shows recent experiences from accuracy tests with a number of large-format stillvideo cameras, including a modified Kodak DCS200, a Kodak DCS460, a Nikon E2 and a Polaroid PDC-2000. The tests of the cameras include absolute and relative measurements and were performed using strong photogrammetric networks and good external reference. The results of the tests indicate that very high accuracies can be achieved with large blocks of stillvideo imagery, especially in deformation measurements. In absolute measurements, however, the accuracy potential of the large-format CCD sensors is partly ruined by a lack of stability of the cameras.

  4. Automatic source camera identification using the intrinsic lens radial distortion

    NASA Astrophysics Data System (ADS)

    Choi, Kai San; Lam, Edmund Y.; Wong, Kenneth K. Y.

    2006-11-01

    Source camera identification refers to the task of matching digital images with the cameras that are responsible for producing these images. This is an important task in image forensics, which in turn is a critical procedure in law enforcement. Unfortunately, few digital cameras are equipped with the capability of producing watermarks for this purpose. In this paper, we demonstrate that it is possible to achieve a high rate of accuracy in the identification by noting the intrinsic lens radial distortion of each camera. To reduce manufacturing cost, the majority of digital cameras are equipped with lenses having rather spherical surfaces, whose inherent radial distortions serve as unique fingerprints in the images. We extract, for each image, parameters from aberration measurements, which are then used to train and test a support vector machine classifier. We conduct extensive experiments to evaluate the success rate of a source camera identification with five cameras. The results show that this is a viable approach with high accuracy. Additionally, we also present results on how the error rates may change with images captured using various optical zoom levels, as zooming is commonly available in digital cameras.
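
As a toy illustration of the fingerprinting idea, the sketch below uses a nearest-centroid rule on a single radial-distortion coefficient in place of the paper's support vector machine classifier; the camera names and k1 values are invented:

```python
# Each camera's lens leaves a characteristic radial-distortion coefficient
# (here a single k1 per image) that can serve as a fingerprint. A
# nearest-centroid rule stands in for the paper's SVM.
def train_centroids(samples):
    """samples: {camera_id: [k1 measurements]} -> {camera_id: mean k1}"""
    return {cam: sum(v) / len(v) for cam, v in samples.items()}

def identify(k1, centroids):
    """Return the camera whose mean k1 is closest to the measurement."""
    return min(centroids, key=lambda cam: abs(centroids[cam] - k1))

# Invented training measurements for two hypothetical cameras.
training = {"camA": [0.10, 0.12, 0.11], "camB": [-0.05, -0.04, -0.06]}
centroids = train_centroids(training)
```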

  5. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision"), camera specifications differed from those of the present-day system, and cameras using all kinds of components, with different arrangements of components and different appearances, were developed into products, with time spent on experimentation, design, fabrication, adjustment, and inspection. But recently the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to make HDTV cameras by methods similar to those of the present system. In addition, more efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanical parts, and software for both HDTV cameras and cameras that operate by the present system.

  6. Reading Challenging Barcodes with Cameras

    PubMed Central

    Gallo, Orazio; Manduchi, Roberto

    2010-01-01

    Current camera-based barcode readers do not work well when the image has low resolution, is out of focus, or is motion-blurred. One main reason is that virtually all existing algorithms perform some sort of binarization, either by gray scale thresholding or by finding the bar edges. We propose a new approach to barcode reading that never needs to binarize the image. Instead, we use deformable barcode digit models in a maximum likelihood setting. We show that the particular nature of these models enables efficient integration over the space of deformations. Global optimization over all digits is then performed using dynamic programming. Experiments with challenging UPC-A barcode images show substantial improvement over other state-of-the-art algorithms. PMID:20617113

  7. Dark current behavior in DSLR cameras

    NASA Astrophysics Data System (ADS)

    Dunlap, Justin C.; Sostin, Oleg; Widenhorn, Ralf; Bodegom, Erik

    2009-02-01

    Digital single-lens reflex (DSLR) cameras are examined and their dark current behavior is presented. We examine the influence of varying temperature, exposure time, and gain setting on dark current. Dark current behavior unique to sensors within such cameras is observed. In particular, heat is trapped within the camera body resulting in higher internal temperatures and an increase in dark current after successive images. We look at the possibility of correcting for the dark current, based on previous work done for scientific grade imagers, where hot pixels are used as indicators for the entire chip's dark current behavior. Standard methods of dark current correction are compared to computed dark frames. Dark current is a concern for DSLR cameras as optimum conditions for limiting dark current, such as cooling the imager, are not easily obtained in the typical use of such imagers.
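
The hot-pixel idea from the abstract can be sketched as follows: a master dark frame is scaled by the ratio observed in a few hot pixels, then subtracted. This is a simplified sketch of the general approach, not the authors' exact procedure; the pixel values are illustrative:

```python
# Dark-current correction by subtracting a scaled master dark frame,
# where a few hot pixels serve as indicators of the current dark level.
def correct_dark(image, master_dark, hot_idx):
    """image, master_dark: flat lists of pixel values;
    hot_idx: indices of hot pixels used to estimate the scale factor."""
    scale = (sum(image[i] for i in hot_idx) /
             sum(master_dark[i] for i in hot_idx))
    return [p - scale * d for p, d in zip(image, master_dark)]
```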

  8. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  9. Vacuum compatible miniature CCD camera head

    SciTech Connect

    Conder, A.D.

    2000-06-20

    A charge-coupled device (CCD) camera head is disclosed which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04 inches for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  10. Aviation spectral camera infinity target simulation system

    NASA Astrophysics Data System (ADS)

    Liu, Xinyue; Ming, Xing; Liu, Jiu; Guo, Wenji; Lv, Gunbo

    2014-11-01

    With the development of science and technology, applications of aviation spectral cameras are becoming more widespread, and developing a test system for dynamic targets is increasingly important. An aviation spectral camera infinity target simulation system can be used to test the resolution and the modulation transfer function of a camera. The construction and working principle of the infinity target simulation system are introduced in detail. A dynamic target generator based on a digital micromirror device (DMD) and the required performance of the collimation system are analyzed and reported. The DMD-based dynamic target generator has the advantages of convenient image replacement, small size and flexibility. According to the requirements of the tested camera, a full-field infinity dynamic target test plan was completed by rotating and moving a mirror.
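
Resolution targets of this kind are typically evaluated via the modulation (Michelson contrast) of the imaged pattern, from which the modulation transfer function is built up frequency by frequency; a minimal sketch, with illustrative intensity values:

```python
# Michelson modulation of an imaged bar target: the quantity an MTF test
# measures at each spatial frequency.
def modulation(i_max, i_min):
    """Contrast of a sinusoidal/bar pattern from its peak and trough."""
    return (i_max - i_min) / (i_max + i_min)
```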

  11. Research on airborne infrared leakage detection of natural gas pipeline

    NASA Astrophysics Data System (ADS)

    Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie

    2011-12-01

    An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can sense traces of methane on the ground with high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, a camera, a helicopter mount, an electronic unit with a DGPS antenna, a notebook computer and a pilot monitor, all mounted on a helicopter. The principle and architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting pipeline gas leaks over plains, desert and hills, but unfit for areas with large variations in altitude.
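
TDLAS retrievals rest on the Beer-Lambert law: the transmitted fraction of laser light decays exponentially with absorber concentration along the optical path. A minimal sketch of that relation; symbols and values are illustrative, not instrument specifications:

```python
import math

# Beer-Lambert law: I/I0 = exp(-alpha * C * L), where alpha is the
# absorption coefficient, C the absorber concentration, L the path length.
def transmitted_fraction(alpha, concentration, path_length):
    """Fraction of laser power surviving one pass through the absorber."""
    return math.exp(-alpha * concentration * path_length)
```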

  12. Airborne remote sensing of spatiotemporal change (1955-2004) in indigenous and exotic forest cover in the Taita Hills, Kenya

    NASA Astrophysics Data System (ADS)

    Pellikka, Petri K. E.; Lötjönen, Milla; Siljander, Mika; Lens, Luc

    2009-08-01

    We studied changes in area and species composition of six indigenous forest fragments in the Taita Hills, Kenya using 1955 and 1995 aerial photography with 2004 airborne digital camera mosaics. The study area is part of Eastern Arc Mountains, a global biodiversity hot spot that boasts an outstanding diversity of flora and fauna and a high level of endemism. While a total of 260 ha (50%) of indigenous tropical cloud forest was lost to agriculture and bushland between 1955 and 2004, large-scale planting of exotic pines, eucalyptus, grevillea, black wattle and cypress on barren land during the same period resulted in a balanced total forest area. In the Taita Hills, like in other Afrotropical forests, indigenous forest loss may adversely affect ecosystem services.

  13. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  14. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  16. Enhancement of document images from cameras

    NASA Astrophysics Data System (ADS)

    Taylor, Michael J.; Dance, Christopher R.

    1998-04-01

    As digital cameras become cheaper and more powerful, driven by the consumer digital photography market, we anticipate significant value in extending their utility as a general office peripheral by adding a paper scanning capability. The main technical challenges in realizing this new scanning interface are insufficient resolution, blur and lighting variations. We have developed an efficient technique for the recovery of text from digital camera images, which simultaneously treats these three problems, unlike other local thresholding algorithms which do not cope with blur and resolution enhancement. The technique first performs deblurring by deconvolution, and then resolution enhancement by linear interpolation. We compare the performance of a threshold derived from the local mean and variance of all pixel values within a neighborhood with a threshold derived from the local mean of just those pixels with high gradient. We assess performance using OCR error scores.
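
The local mean/variance threshold the authors compare is, in essence, a Niblack-style rule: threshold each pixel against the mean plus a weighted standard deviation of its neighborhood. A sketch of that rule; the weight k is an illustrative free parameter, not the paper's value:

```python
# Niblack-style local threshold over one pixel neighborhood:
# threshold = local mean + k * local standard deviation.
def local_threshold(window, k=-0.2):
    """window: flat list of pixel values in a neighborhood."""
    n = len(window)
    mean = sum(window) / n
    var = sum((p - mean) ** 2 for p in window) / n
    return mean + k * var ** 0.5
```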

  17. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, and to give some indication of present weather. Similarly, during springtime, the camera images show the changes in ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.


  18. Digital In, Digital Out: Digital Editing with Firewire.

    ERIC Educational Resources Information Center

    Doyle, Bob; Sauer, Jeff

    1997-01-01

    Reviews linear and nonlinear digital video (DV) editing equipment and software, using the IEEE 1394 (FireWire) connector. Includes a chart listing specifications and rating eight DV editing systems, reviews two DV still-photo cameras, and previews beta DV products. (PEN)

  19. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target whose fiducial marks have known 3D locations, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
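The 3D-to-2D correspondences that calibration requires must satisfy the camera model being calibrated. A minimal sketch of the standard perspective (pinhole) projection follows; the intrinsic matrix below is hypothetical, and ACAL's actual camera model is not specified here.

```python
import numpy as np

def project(points_3d, K, R, t):
    """Project 3D world points to 2D pixel coordinates with the
    perspective (pinhole) camera model: x ~ K [R | t] X."""
    X = np.asarray(points_3d, dtype=float)   # (N, 3) world points
    cam = X @ R.T + t                        # world -> camera frame
    uvw = cam @ K.T                          # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]          # perspective divide

# Hypothetical intrinsics: focal length 800 px, principal point (320, 240)
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])
R, t = np.eye(3), np.zeros(3)
uv = project([[0., 0., 2.]], K, R, t)   # a point on the optical axis
# -> maps to the principal point (320, 240)
```

Fitting K, R, and t so that projected fiducials match their measured 2D locations is exactly the optimization that calibration data feeds.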

  20. Measuring Positions of Objects using Two or More Cameras

    NASA Technical Reports Server (NTRS)

    Klinko, Steve; Lane, John; Nelson, Christopher

    2008-01-01

    An improved method of computing positions of objects from digitized images acquired by two or more cameras (see figure) has been developed for use in tracking debris shed by a spacecraft during and shortly after launch. The method is also readily adaptable to such applications as (1) tracking moving and possibly interacting objects in other settings in order to determine causes of accidents and (2) measuring positions of stationary objects, as in surveying. Images acquired by cameras fixed to the ground and/or cameras mounted on tracking telescopes can be used in this method. In this method, processing of image data starts with creation of detailed computer-aided design (CAD) models of the objects to be tracked. By rotating, translating, resizing, and overlaying the models with digitized camera images, parameters that characterize the position and orientation of the camera can be determined. The final position error depends on how well the centroids of the objects in the images are measured; how accurately the centroids are interpolated for synchronization of cameras; and how effectively matches are made to determine rotation, scaling, and translation parameters. The method involves use of the perspective camera model (also denoted the point camera model), which is one of several mathematical models developed over the years to represent the relationships between external coordinates of objects and the coordinates of the objects as they appear on the image plane in a camera. The method also involves extensive use of the affine camera model, in which the distance from the camera to an object (or to a small feature on an object) is assumed to be much greater than the size of the object (or feature), resulting in a truly two-dimensional image. The affine camera model does not require advance knowledge of the positions and orientations of the cameras. This is because ultimately, positions and orientations of the cameras and of all objects are computed in a coordinate
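For comparison, here is a minimal sketch of standard linear (DLT) two-camera triangulation, a common alternative when the camera projection matrices are already known; it is not the CAD-overlay matching method the record describes.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from its pixel
    projections uv1, uv2 in two cameras with 3x4 projection
    matrices P1, P2. Solves A X = 0 via SVD."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each observation contributes two homogeneous linear constraints
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector = homogeneous 3D point
    return X[:3] / X[3]
```

With noisy centroids, the SVD solution minimizes the algebraic residual; a nonlinear reprojection-error refinement is typically run afterwards.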

  1. Camera Calibration with Radial Variance Component Estimation

    NASA Astrophysics Data System (ADS)

    Mélykuti, B.; Kruck, E. J.

    2014-11-01

    Camera calibration plays an increasingly important role. Besides true digital aerial survey cameras, the photogrammetric market is dominated by a large number of non-metric digital cameras mounted on UAVs or other low-weight flying platforms. The in-flight calibration of those systems plays a significant role in considerably enhancing the geometric accuracy of survey photos. Photo measurements are expected to be more precise in the center of images than along the edges or in the corners. With statistical methods, the accuracy of photo measurements has been analyzed in dependency of the distance of points from the image center. This test provides a curve of measurement precision as a function of the photo radius. A large number of camera types have been tested with well-distributed point measurements in image space. The tests led to a general conclusion demonstrating a functional connection between accuracy and radial distance, and to a method for checking and enhancing the geometric capability of the cameras in light of these results.
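The precision-versus-radius curve described above can be sketched by binning point-measurement residuals by their radial distance from the image center. This is a generic illustration of the idea, not the authors' statistical method; the bin count is arbitrary.

```python
import numpy as np

def radial_precision_curve(xy, residuals, center, n_bins=5):
    """RMS of photo-measurement residuals as a function of radial
    distance from the image center, binned into n_bins annuli.
    Returns (bin_center_radii, rms_per_bin)."""
    xy = np.asarray(xy, dtype=float)
    res = np.asarray(residuals, dtype=float)
    r = np.linalg.norm(xy - np.asarray(center, dtype=float), axis=1)
    edges = np.linspace(0.0, r.max() + 1e-9, n_bins + 1)
    idx = np.digitize(r, edges) - 1          # annulus index per point
    rms = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            rms[b] = np.sqrt(np.mean(res[sel] ** 2))
    return 0.5 * (edges[:-1] + edges[1:]), rms
```

A rising RMS toward the outer annuli is the radial accuracy falloff the paper models.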

  2. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using enough cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system places cameras along a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security cameras. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of an event is reported to the host computer in Cartesian coordinates computed from data correlated across multiple cameras. In this way, events in the field of view present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly generated by the cameras. By using many small, low-cost cameras with overlapping fields of view, this approach offers greater flexibility than conventional systems without compromising performance. It provides significantly increased coverage without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, because a single cable is shared for power and data, installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/vehicular monitoring systems are also potential applications.

  3. Digital security technology simplified.

    PubMed

    Scaglione, Bernard J

    2007-01-01

    Digital security technology is making great strides in replacing analog and other traditional security systems, including CCTV, card access, personal identification, and alarm monitoring applications. Like any new technology, the author says, it is important to understand its benefits and limitations before purchasing and installing it, to ensure its proper operation and effectiveness. This article is a primer for security directors on how digital technology works. It provides an understanding of the key components which make up the foundation for digital security systems, focusing on three key aspects of the digital security world: the security network, IP cameras and IP recorders. PMID:17907609

  4. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the (brands and) models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of the Y and Cb components from JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by the image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are used directly as features for classification. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
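A hedged sketch of one directional transition probability matrix computed from a thresholded difference array follows. It illustrates the Markov-feature idea only; the paper works on JPEG-domain difference arrays of the Y and Cb components with four directions, and its exact preprocessing is not reproduced here.

```python
import numpy as np

def transition_matrix(block, T=4):
    """Horizontal-direction Markov transition probability matrix of a
    2-D difference array, with differences clipped to [-T, T]
    (illustrative sketch of one of the four directions)."""
    d = np.diff(block.astype(int), axis=1)     # horizontal difference array
    d = np.clip(d, -T, T)                      # thresholding step
    size = 2 * T + 1
    M = np.zeros((size, size))
    # Count transitions between horizontally adjacent difference values,
    # shifted from [-T, T] to matrix indices [0, 2T]
    src, dst = d[:, :-1] + T, d[:, 1:] + T
    for i, j in zip(src.ravel(), dst.ravel()):
        M[i, j] += 1
    row_sums = M.sum(axis=1, keepdims=True)
    return M / np.maximum(row_sums, 1)         # conditional probabilities
```

The flattened matrix elements (here (2T+1)² = 81 values per direction) are what feed the SVM as features.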

  5. Calibration of Low Cost RGB and NIR Uav Cameras

    NASA Astrophysics Data System (ADS)

    Fryskowska, A.; Kedzierski, M.; Grochala, A.; Braula, A.

    2016-06-01

    Non-metric digital cameras are widely used for photogrammetric studies. The increase in resolution and quality of images obtained by non-metric cameras allows them to be used in low-cost UAV and terrestrial photogrammetry. Imagery acquired with non-metric cameras can be used in 3D modeling of objects or landscapes, reconstruction of historical sites, generation of digital terrain models (DTM), orthophotos, or in the assessment of accidents. Non-metric digital cameras are characterized by instability and unknown interior orientation parameters. Therefore, the use of these devices requires prior calibration. Calibration research was conducted using a non-metric camera, different calibration tests, and various software packages. The first part of the paper contains a brief theoretical introduction including basic definitions, such as the construction of non-metric cameras and a description of different optical distortions. The second part of the paper covers the camera calibration process and the details of the calibration methods and models that have been used. The Sony NEX-5 camera was calibrated using the following software: Image Master Calib, Matlab's Camera Calibrator application, and Agisoft Lens. 2D test fields were used for the study. As part of the research, a comparative analysis of the results has been carried out.

  6. Assessing the Photogrammetric Potential of Cameras in Portable Devices

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Kokkas, N.

    2012-07-01

    In recent years, an increasing number of portable devices, tablets, and smartphones have employed high-resolution digital cameras to satisfy consumer demand. In most cases, these cameras are designed primarily for capturing visually pleasing images, and the potential of using smartphone and tablet cameras for metric applications remains uncertain. The compact nature of the host devices leads to very small cameras and therefore smaller geometric characteristics. It also makes them extremely portable, and their integration into a multi-function device as part of the basic unit cost often makes them readily available. Many application specialists may find them an attractive proposition where some modest photogrammetric capability would be useful. This paper investigates the geometric potential of these cameras for close-range photogrammetric applications by: • investigating their geometric characteristics using the self-calibration method of camera calibration and comparing the results with those from a state-of-the-art digital SLR camera; • investigating their capability for 3D building modelling, again comparing the results with those obtained from a digital SLR camera. The early results presented show that the iPhone has greater potential for photogrammetric use than the iPad.

  7. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
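The dynamic-range claim for log-converting pixels can be illustrated with a toy response model: if the output code is proportional to log(I), each decade of irradiance maps to a fixed code step, so a ~10⁶:1 range fits a modest output swing. The constants a, b, and i_dark below are assumptions for illustration, not from any specific sensor.

```python
import numpy as np

def log_pixel_response(irradiance, a=0.0, b=50.0, i_dark=1e-6):
    """Toy model of a logarithmic ('eye-like') pixel: output code is
    a + b*log10(I), with i_dark as a small dark-floor offset.
    All constants are illustrative assumptions."""
    I = np.asarray(irradiance, dtype=float)
    return a + b * np.log10(I + i_dark)

# Six decades of irradiance span only 6*b output codes:
codes = log_pixel_response(np.array([1e-3, 1e3]))
```

By contrast, a linear charge-integrating pixel would need a 10⁶:1 ratio between its largest and smallest resolvable signals to cover the same range directly.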

  8. An evaluation of onshore digital elevation models for modelling tsunami inundation zones

    NASA Astrophysics Data System (ADS)

    Griffin, Jonathan; Latief, Hamzah; Kongko, Widjo; Harig, Sven; Horspool, Nick; Hanung, Raditya; Rojali, Aditia; Maher, Nicola; Fuchs, Annika; Hossen, Jakir; Upi, Supriyati; Edi, Dewanto; Rakowsky, Natalja; Cummins, Phil

    2015-06-01

    A sensitivity study is undertaken to assess the utility of different onshore digital elevation models (DEM) for simulating the extent of tsunami inundation using case studies from two locations in Indonesia. We compare airborne IFSAR, ASTER and SRTM against high resolution LiDAR and stereo-camera data in locations with different coastal morphologies. Tsunami inundation extents modelled with airborne IFSAR DEMs are comparable with those modelled with the higher resolution datasets and are also consistent with historical run-up data, where available. Large vertical errors and poor resolution of the coastline in the ASTER and SRTM elevation datasets cause the modelled inundation extent to be much less compared with the other datasets and observations. Therefore, ASTER and SRTM should not be used to underpin tsunami inundation models. A model mesh resolution of 25 m was sufficient for estimating the inundated area when using elevation data with high vertical accuracy in the case studies presented here. Differences in modelled inundation between digital terrain models (DTM) and digital surface models (DSM) for LiDAR and IFSAR are greater than differences between the two data types. Models using DTM may overestimate inundation while those using DSM may underestimate inundation when a constant Manning's roughness value is used. We recommend using DTM for modelling tsunami inundation extent, with further work needed to resolve the scale at which surface roughness should be parameterised.

  9. Airborne gravity is here

    SciTech Connect

    Hammer, S.

    1982-01-11

    After 20 years of development efforts, the airborne gravity survey has finally become a practical exploration method. Besides gravity data, the airborne survey can also collect simultaneous, continuous records of high-precision magnetic-field data as well as terrain clearance; these provide a topographic contour map useful in calculating terrain conditions and in subsequent planning and engineering. Compared with a seismic survey, the airborne gravity method can cover the same area much more quickly and cheaply; a seismograph could then detail the interesting spots.

  10. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  11. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  12. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  13. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  14. Camera-enabled techniques for organic synthesis

    PubMed Central

    Ingham, Richard J; O’Brien, Matthew; Browne, Duncan L

    2013-01-01

    Summary A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future. PMID:23766820

  15. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  16. NIR-green-blue high-resolution digital images for assessment of winter cover crop biomass

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many small unmanned aerial systems use true-color digital cameras for remote sensing. For some cameras, only the red channel is sensitive to near-infrared (NIR) light; we attached a custom red-blocking filter to a digital camera to obtain NIR-green-blue digital images. One advantage of this low-co...

  17. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  18. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  19. Airborne Laser Polar Nephelometer

    NASA Technical Reports Server (NTRS)

    Grams, Gerald W.

    1973-01-01

    A polar nephelometer has been developed at NCAR to measure the angular variation of the intensity of light scattered by air molecules and particles. The system has been designed for airborne measurements using outside air ducted through a 5-cm diameter airflow tube; the sample volume is that which is common to the intersection of a collimated source beam and the detector field of view within the airflow tube. The source is a linearly polarized helium-neon laser beam. The optical system defines a collimated field of view (0.5° half-angle) through a series of diaphragms located behind a 172-mm focal length objective lens. A photomultiplier tube is located immediately behind an aperture in the focal plane of the objective lens. The laser beam is mechanically chopped (on-off) at a rate of 5 Hz; a two-channel pulse counter, synchronized to the laser output, measures the photomultiplier pulse rate with the light beam both on and off. The difference in these measured pulse rates is directly proportional to the intensity of the light scattered from the volume common to the intersection of the laser beam and the detector field of view. Measurements can be made at scattering angles from 15° to 165° with reference to the direction of propagation of the light beam. Intermediate angles are obtained by selecting the desired angular increments between these extreme angles (any multiple of 0.1° can be selected for the angular increment; 5° is used in normal operation). Pulses provided by digital circuits control a stepping motor which sequentially rotates the detector by pre-selected angular increments. The synchronous photon-counting system automatically begins measurement of the scattered-light intensity immediately after the rotation to a new angle has been completed. The instrument has been flown on the NASA Convair 990 airborne laboratory to obtain data on the complex index of refraction of atmospheric aerosols. A particle impaction device is operated simultaneously

  20. Seeing the trees yet not missing the forest: an airborne lidar approach

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Li, W.; Flanagan, J.

    2011-12-01

    Light Detection and Ranging (lidar) is an optical remote sensing technology that measures properties of scattered light to find the range and/or other information of a distant object. Due to its ability to generate 3-dimensional data with high spatial resolution and accuracy, lidar technology is being increasingly used in ecology, geography, geology, geomorphology, seismology, remote sensing, and atmospheric physics. In this study, we acquire airborne lidar data for the study of hydrologic, geomorphologic, and geochemical processes at six Critical Zone Observatories: Southern Sierra, Boulder Creek, Shale Hills, Luquillo, Jemez, and Christina River Basin. Each site will have two lidar flights (leaf on/off, or snow on/off). Based on the lidar data, we derive various products, including a high-resolution Digital Elevation Model (DEM), Digital Surface Model (DSM), Canopy Height Model (CHM), canopy cover & closure, tree height, DBH, canopy base height, canopy bulk density, biomass, LAI, etc. A novel approach is also developed to map individual trees based on segmentation of lidar point clouds, and a virtual forest is simulated using the locations of individual trees as well as tree structure information. The simulated image is then compared to a camera photo taken at the same location. The two images look very similar; moreover, our simulated image not only provides a visually impressive visualization of the landscape, but also contains all the detailed information about individual tree locations and forest structure properties.
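The canopy height model (CHM) listed among the products is conventionally derived as the difference between the surface and terrain models. A minimal sketch follows; the nodata convention is an assumption for illustration.

```python
import numpy as np

def canopy_height_model(dsm, dtm, nodata=-9999.0):
    """Derive a canopy height model (CHM) as DSM - DTM, propagating
    nodata cells and clamping small negative differences to zero
    (which arise from interpolation noise in open ground)."""
    dsm = np.asarray(dsm, dtype=float)
    dtm = np.asarray(dtm, dtype=float)
    chm = dsm - dtm
    chm[(dsm == nodata) | (dtm == nodata)] = nodata
    chm[(chm < 0) & (chm != nodata)] = 0.0
    return chm
```

Per-tree heights then come from local maxima of the CHM within each segmented crown.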

  1. Toolsets for Airborne Data

    Atmospheric Science Data Center

    2015-04-02

    article title: Toolsets for Airborne Data. The ... limit of detection values. Prior to accessing the TAD Web Application ( https://tad.larc.nasa.gov ) for the first time, users must ...

  2. The airborne laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven; Schall, Harold; Shattuck, Paul

    2007-05-01

    The Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the current program status.

  3. Smart Camera Technology Increases Quality

    NASA Technical Reports Server (NTRS)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels' worth of information from incident light. However, at frame rates of more than a few per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full; subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

  4. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  5. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  6. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed, where luminescence is typically found.

  7. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  8. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors allow photometry to be performed in three filters simultaneously, which is a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics, namely read noise (e^{-}/pix), thermal noise (e^{-}/pix/sec) and electronic gain (e^{-}/ADU), are presented for the commercial digital camera Canon 5D Mark III, along with the same characteristics for the scientific high-performance cooled CCD camera system ALTA E47. Comparison of the test results for the Canon 5D Mark III and the ALTA E47 CCD shows that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
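
Gain figures of the kind quoted here (e-/ADU) are typically derived with the photon-transfer (mean-variance) method. As a hedged illustration (function names and the synthetic data are ours, not the paper's), gain can be estimated from a pair of flat-field frames:

```python
import numpy as np

def gain_e_per_adu(flat1, flat2, bias):
    """Photon-transfer estimate of gain (e-/ADU) from two flat frames.
    Differencing the flats cancels fixed-pattern noise; the shot-noise
    variance is half the variance of the difference image."""
    s = 0.5 * (flat1.mean() + flat2.mean()) - bias      # mean signal, ADU
    var = np.var(flat1.astype(float) - flat2.astype(float)) / 2.0
    return s / var                                      # gain = signal/variance

# Synthetic check: Poisson light of ~10,000 e- at a true gain of 2 e-/ADU
rng = np.random.default_rng(0)
true_gain = 2.0
f1 = rng.poisson(10000, (512, 512)) / true_gain
f2 = rng.poisson(10000, (512, 512)) / true_gain
g = gain_e_per_adu(f1, f2, bias=0.0)
```

For Poisson-limited light, variance in electrons equals the signal, so variance in ADU² is signal/gain; the ratio of mean to variance therefore recovers the gain.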

  9. Target detection algorithm for airborne thermal hyperspectral data

    NASA Astrophysics Data System (ADS)

    Marwaha, R.; Kumar, A.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    Airborne hyperspectral imaging is increasingly used for classification purposes, but airborne thermal hyperspectral imagery is usually a challenge for conventional classification approaches. The Telops Hyper-Cam sensor is an interferometer-based imaging system that enables spatial and spectral analysis of targets with a single sensor. It is based on Fourier-transform spectroscopy, which yields high spectral resolution and enables high-accuracy radiometric calibration. The Hyper-Cam instrument has 84 spectral bands in the 868 cm-1 to 1280 cm-1 region (7.8 μm to 11.5 μm), at a spectral resolution of 6 cm-1 (full width at half maximum) in the LWIR (long-wave infrared) range. Due to the Hughes effect, only a few classifiers are able to handle high-dimensional classification tasks. MNF (Minimum Noise Fraction) rotation is a dimensionality-reduction approach that segregates noise in the data. In this study, MNF component selection was analyzed in terms of classification accuracy, using the constrained energy minimization (CEM) algorithm as the classifier, for the airborne thermal hyperspectral image alone and for the combination of the airborne LWIR hyperspectral image with a color digital photograph. On comparing the accuracy of all the classified images, accuracy was found to be highest with twenty MNF components, and it increased when the airborne LWIR hyperspectral image was combined with the color digital photograph rather than used alone.
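
The CEM classifier used in this study has a closed-form filter: w = R⁻¹d / (dᵀR⁻¹d), where R is the sample correlation matrix of the pixels and d the target spectrum; each pixel's score is wᵀx. A minimal sketch on toy data (our own synthetic scene, not the Hyper-Cam imagery):

```python
import numpy as np

def cem_detector(pixels, target):
    """Constrained energy minimization: minimize average output energy
    subject to a unit response to the target spectrum.
    pixels: (N, bands) array; target: (bands,) vector.
    Returns a per-pixel detection score."""
    R = pixels.T @ pixels / pixels.shape[0]   # sample correlation matrix
    Rinv_d = np.linalg.solve(R, target)
    w = Rinv_d / (target @ Rinv_d)            # CEM filter weights
    return pixels @ w                          # score is 1 for an exact match

# Toy scene: 3-band background clutter plus one pixel equal to the target
rng = np.random.default_rng(1)
bg = rng.normal(1.0, 0.1, (200, 3))
target = np.array([2.0, 0.5, 1.5])
scene = np.vstack([bg, target])
scores = cem_detector(scene, target)
```

A pixel that exactly matches the target spectrum scores 1 by construction, while the unit-response constraint forces the filter to suppress the average energy of the background.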

  10. A neutron camera system for MAST

    SciTech Connect

    Cecconello, M.; Conroy, S.; Ericsson, G.; Ronchi, E.; Sangaroon, S.; Weiszflog, M.; Turnyanskiy, M.; Akers, R.; Fitzgerald, I.; Cullen, A.

    2010-10-15

    A prototype neutron camera has been developed and installed at MAST as part of a feasibility study for a multichord neutron camera system, with the aim of measuring the spatially and temporally resolved 2.45 MeV neutron emissivity profile. Liquid scintillators coupled to a fast digitizer are used for neutron/gamma-ray digital pulse shape discrimination. The preliminary results clearly show the capability of this diagnostic to measure neutron emissivity profiles with sufficient time resolution to study the effect of fast-ion loss and redistribution due to magnetohydrodynamic activity. A minimum time resolution of 2 ms has been achieved with a modest 1.5 MW of neutral beam injection heating, at a measured neutron count rate of a few hundred kHz.
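
Digital pulse shape discrimination of the kind mentioned here is commonly implemented as a charge-comparison ratio: neutrons excite more of the liquid scintillator's slow decay component than gamma rays, so their pulses carry a larger tail fraction. A toy sketch with synthetic pulses (decay constants and the tail window are illustrative choices, not values from this diagnostic):

```python
import numpy as np

def psd_ratio(pulse, tail_start):
    """Charge-comparison pulse shape discrimination: the tail/total
    integral ratio, which is larger for neutron-induced pulses."""
    total = pulse.sum()
    tail = pulse[tail_start:].sum()
    return tail / total

# Synthetic pulses: fast decay for a gamma, added slow component for a neutron
t = np.arange(200)                                   # sample index
gamma = np.exp(-t / 5.0)                             # fast component only
neutron = 0.8 * np.exp(-t / 5.0) + 0.2 * np.exp(-t / 50.0)
r_gamma = psd_ratio(gamma, tail_start=20)
r_neutron = psd_ratio(neutron, tail_start=20)
```

Plotting tail/total against the total pulse integral for many events yields the familiar two-branch PSD scatter plot from which neutron events are selected.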

  11. A Detailed Examination of DTM Source Data: Light, Camera, Action

    NASA Astrophysics Data System (ADS)

    Mosbrucker, A. R.; Spicer, K.; Major, J. J.; Pitlick, J.; Normandeau, J.

    2013-12-01

    High-resolution, multi-temporal remote sensing technologies have revolutionized geomorphic analysis. Topographic point measurements (XYZ) acquired from airborne and terrestrial laser scanning (ALS and TLS) and photogrammetry are commonly used to generate 3D digital terrain models (DTMs). Here, we compare DTMs generated using Structure-from-Motion (SfM) photogrammetry to those from ALS, TLS, and classic photogrammetry. Our investigation utilized 5 years of remotely sensed topographic data, from ALS (2007, 2009), TLS (2010-2012), and airborne and terrestrial close-range oblique photographs (processed with both classic and SfM photogrammetry) (2010-2012), of a 70,000 m2, 500 m-long reach of the upper North Fork Toutle River, Washington, devastated by the cataclysmic 1980 eruption of Mount St. Helens. The study reach is sparsely vegetated and features 10-30 m-tall vertical banks separated by a 170 m-wide floodplain. In addition to remotely sensed data, we surveyed more than 300 ground control points (GCPs) using a 1-arc-second reflectorless total station and map- and survey-grade GPS and RTK-GNSS. Few, if any, data sets have been obtained with this variety of technologies in spatial and temporal coincidence. We examine the application of each technique to assess fluvial morphological change, as computed by DTM differencing. A subset of GCPs was used to transform image coordinates into a geodetic datum; DTM uncertainty was then quantified using the remaining GCPs, and this uncertainty was used to determine the minimum level of detectable change. Owing to highly variable topography and differing point-to-surface interpolation techniques, method strengths and weaknesses were identified. ALS data were found to have the greatest uncertainty in areas of low point density on steep slopes. TLS produced highly variable point density in the floodplain, where interpolation error is likely to be minimal. In contrast, classic and SfM photogrammetry using oblique photographs with a high degree of image overlap produced…
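
The minimum level of detectable change mentioned above is usually computed by propagating the two surfaces' vertical uncertainties into a threshold on the DTM of Difference. A hedged sketch (the uncertainty values below are illustrative, not those of this study):

```python
import numpy as np

def dod_with_lod(dtm_new, dtm_old, sigma_new, sigma_old, t=1.96):
    """DTM of Difference with a minimum level of detection: combine the
    two surfaces' vertical uncertainties in quadrature, scale by the
    confidence multiplier t, and zero out change below the threshold."""
    dod = dtm_new - dtm_old
    lod = t * np.hypot(sigma_new, sigma_old)   # combined uncertainty
    return np.where(np.abs(dod) >= lod, dod, 0.0), lod

# Example: 0.15 m and 0.10 m vertical uncertainties at 95% confidence
new = np.array([[101.0, 100.2], [100.0, 102.0]])
old = np.array([[100.0, 100.0], [100.0, 100.0]])
change, lod = dod_with_lod(new, old, 0.15, 0.10)
```

With these uncertainties the threshold is about 0.35 m, so the 0.2 m cell is treated as noise while the 1.0 m and 2.0 m cells survive as detected change.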

  12. Determination of the spatial structure of vegetation on the repository of the mine "Fryderyk" in Tarnowskie Góry, based on airborne laser scanning from the ISOK project and digital orthophotomaps

    NASA Astrophysics Data System (ADS)

    Szostak, Marta; Wężyk, Piotr; Pająk, Marek; Haryło, Paweł; Lisańczuk, Marek

    2015-06-01

    The purpose of this study was to determine the spatial structure of vegetation on the repository of the mine "Fryderyk" in Tarnowskie Góry. The test area is located in the Upper Silesian Industrial Region, a large industrial region in Poland, and is a unique refuge habitat (Natura 2000 site PLH240008). The main aspect of this work was to investigate the possible use of geotechniques and generally available geodata for mapping LULC changes and determining the spatial structure of vegetation. The analysis was based on aerial images and orthophotomaps from 1947, 1998, 2003, 2009 and 2011, and on airborne laser scanning data (2011, ISOK project). Forest succession changes between 1947 and 2011 were analysed, and selected features of the vegetation overgrowing the "Fryderyk" spoil heap were determined. The results demonstrate a gradual succession of vegetation on the spoil heap. In 1947, 84% of the area was covered by low vegetation. Tree expansion proceeded in westerly and northwesterly directions, and by 2011 the canopy layer covered almost 50% of the research area. Parameters such as vegetation height, crown length and cover density were derived from the airborne laser scanning data. These analyses indicated significant diversity in the vertical and horizontal structure of the vegetation. The study demonstrates the potential of airborne laser scanning for an objective evaluation of vegetation structure.
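
The ALS-derived parameters mentioned here (vegetation height, cover density) reduce to simple statistics once point heights are normalized to height above ground. A minimal sketch (the 2 m canopy threshold is a common convention, not a value stated in this study):

```python
import numpy as np

def canopy_metrics(heights, canopy_threshold=2.0):
    """Simple vegetation-structure metrics from height-normalized ALS
    point heights (metres above ground): canopy cover fraction (share
    of returns above the threshold) and mean/max canopy height."""
    canopy = heights[heights >= canopy_threshold]
    cover = canopy.size / heights.size
    return {"cover": cover,
            "mean_height": float(canopy.mean()) if canopy.size else 0.0,
            "max_height": float(heights.max())}

# Toy point cloud: half ground returns, half tree returns around 10 m
pts = np.array([0.1, 0.3, 0.0, 0.2, 9.5, 10.0, 11.0, 12.5])
m = canopy_metrics(pts)
```

On this toy cloud, half the returns are canopy, so the cover fraction is 0.5 and the mean canopy height is 10.75 m.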

  13. Optical Communications Link to Airborne Transceiver

    NASA Technical Reports Server (NTRS)

    Regehr, Martin W.; Kovalik, Joseph M.; Biswas, Abhijit

    2011-01-01

    An optical link from Earth to an aircraft demonstrates the ability to establish a link from a ground platform to a transceiver moving overhead. An airplane presents a challenging disturbance environment, including airframe vibrations and occasional abrupt changes in attitude during flight. These disturbances make it difficult to maintain pointing lock in an optical transceiver on an airplane. Acquisition can also be challenging; in the case of the aircraft link, the ground station initially has no precise knowledge of the aircraft's location. An airborne pointing system has been designed, built, and demonstrated using direct-drive brushless DC motors for passive isolation of pointing disturbances and for high-bandwidth control feedback. The airborne transceiver uses a GPS-INS system to determine the aircraft's position and attitude, and then illuminates the ground station to initiate acquisition. The ground transceiver participates in link-pointing acquisition by first using a wide-field camera to detect the initial illumination from the airborne beacon and to perform coarse pointing, then transferring control to a high-precision pointing detector. Using this scheme, live video was successfully streamed from the ground to the aircraft at 270 Mb/s while simultaneously downlinking a 50 kb/s data stream from the aircraft to the ground.
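
The GPS-INS-initiated acquisition described above ultimately reduces to pointing: converting the aircraft's reported position into azimuth and elevation for the ground telescope. A simplified sketch in local east-north-up coordinates (the real system works from geodetic coordinates and platform attitude; the names here are our own):

```python
import math

def az_el_from_enu(east, north, up):
    """Azimuth (degrees, clockwise from north) and elevation (degrees)
    of a target given its local east/north/up offset in metres from
    the ground station."""
    az = math.degrees(math.atan2(east, north)) % 360.0
    el = math.degrees(math.atan2(up, math.hypot(east, north)))
    return az, el

# Aircraft 3 km east, 4 km north, and 2 km above the ground station
az, el = az_el_from_enu(3000.0, 4000.0, 2000.0)
```

Note the argument order in `atan2(east, north)`: azimuth is measured clockwise from north, not counterclockwise from east as in the usual mathematical convention.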

  14. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  16. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam builds on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include microelectromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximate flight-like configuration for demonstration on an air-bearing table. A pilot-in-the-loop and hardware-in-the-loop simulation of on-orbit navigation and dynamics will complement the air-bearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  17. Novel computer-based endoscopic camera

    NASA Astrophysics Data System (ADS)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed glared areas, brightening dark areas, and accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to an external host via network. The patient data included with every image describe essential information on the patient and procedure. The operator can assign custom data descriptors and can search for stored images by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can fill the whole screen area. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  18. Digital photography in orthopaedic surgery.

    PubMed

    Elbeshbeshy, B; Trepman, E

    2001-01-01

    Digital photography has become a practical alternative to film photography for documentation, communication, and education about orthopaedic problems and treatment. Digital cameras may be used to document preoperative and postoperative condition, intraoperative findings, and imaging studies. Digital photographs are captured on the charge-coupled device (CCD) of the camera and processed as digital data. Images may be immediately viewed on the liquid crystal display (LCD) screen of the camera and reshot if necessary. Photographic image files may be stored in the camera on a floppy diskette, CompactFlash card, or SmartMedia card, and transferred to a computer. The images may be manipulated using photo-editing software, stored on media such as Zip disks or CD-R discs, printed, and incorporated into digital presentations. Digital photographs may be transmitted to others using electronic mail (e-mail) and Internet web sites. Transparency film slides may be converted to digital format and used in digital presentations. Despite the initial expense of the required hardware, major cost savings in film and processing charges may be realized over time compared with film photography. PMID:11206828

  19. Digital field ion microscopy

    SciTech Connect

    Sijbrandij, S.J.; Russell, K.F.; Miller, M.K.; Thomson, R.C.

    1998-01-01

    Due to environmental concerns, there is a trend to avoid the chemicals needed to develop negatives and process photographic paper, and to use digital technologies instead. Digital technology also offers the advantage of convenience: it enables quick access to the end result, allows image storage and processing on computer, allows rapid hard-copy output, and simplifies electronic publishing. Recently, significant improvements have been made in the performance and cost of camera sensors and printers. In this paper, field ion images recorded with two digital cameras of different resolution are compared to images recorded on standard 35 mm negative film. It should be noted that field ion images exhibit low light intensity and high contrast. Field ion images were recorded from a standard microchannel plate and phosphor screen, with acceptance angles of approximately 60°. Digital recordings were made with a Digital Vision Technologies (DVT) MICAM VHR1000 camera with a resolution of 752 x 582 pixels, and a Kodak DCS 460 digital camera with a resolution of 3,060 x 2,036 pixels. Film-based recordings were made with Kodak T-MAX film rated at 400 ASA. The resolving power of T-MAX film, as specified by Kodak, is between 50 and 125 lines per mm, which corresponds to between 1,778 x 1,181 and 4,445 x 2,953 pixels, i.e. similar to that of the DCS 460 camera. The intensities of the images were sufficient to be recorded with standard f/1.2 lenses at exposure times of less than 2 s. Many digital cameras were excluded from these experiments due to their lack of sensitivity or their inability to record a full-frame image at the fixed working distance defined by the vacuum system. The digital images were output on a Kodak Digital Science 8650 PS dye sublimation color printer (300 dpi). All field ion micrographs presented were obtained from a Ni-Al-Be specimen.
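
The film-to-pixel comparison above is simple arithmetic: resolving power in lines per mm multiplied by the frame dimensions. A quick check (the roughly 35.6 x 23.6 mm frame size used here is inferred from the figures quoted in the text, not a stated specification):

```python
def film_pixels(lines_per_mm, frame_w_mm=35.56, frame_h_mm=23.62):
    """Approximate pixel-equivalent resolution of film: resolving power
    (lines/mm) times frame dimensions. The default frame size is back-
    calculated from the figures quoted in the paper, not a spec sheet."""
    return round(lines_per_mm * frame_w_mm), round(lines_per_mm * frame_h_mm)

lo = film_pixels(50)    # lower bound of the T-MAX resolving-power range
hi = film_pixels(125)   # upper bound
```

This reproduces the quoted lower bound of 1,778 x 1,181 pixel-equivalents at 50 lines/mm, and a frame width of 4,445 pixel-equivalents at 125 lines/mm.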

  20. Lens assemblies for multispectral camera

    NASA Astrophysics Data System (ADS)

    Lepretre, Francois

    1994-09-01

    In the framework of a contract with the Indian Space Research Organization (ISRO), MATRA DEFENSE - DOD/UAO has developed, produced and tested 36 LISS-1 and LISS-2 type lenses and 12 LISS-3 lenses equipped with their interference filters. These lenses form the optical systems of multispectral imaging sensors aboard the Indian earth observation satellites IRS 1A, 1B, 1C, and 1D. It should be noted that the multispectral cameras of the IRS 1A and 1B satellites have been in operation for two years and, according to ISRO, have given very satisfactory results. Each multispectral LISS-3 camera consists of four lenses, each working in a different spectral band (B2: 520-590 nm; B3: 620-680 nm; B4: 770-860 nm; B5: 1550-1700 nm). In order to superimpose the images of each spectral band without digital processing, the image formats (60 mm) of the lenses are registered to better than 2 micrometers and remain so throughout all the environmental tests. Similarly, due to the absence of precise thermal control aboard the satellite, the lenses are made as athermal as possible.

  1. Kodak DCS200: a camera for high-accuracy measurements?

    NASA Astrophysics Data System (ADS)

    Gruen, Armin; Maas, Hans-Gerd; Keller, Andrea

    1995-09-01

    The digital high-resolution still-video camera Kodak DCS200 has reached a high degree of popularity among photogrammetrists within a very short time. Consisting of a mirror reflex camera, a high-resolution CCD sensor, A/D conversion, power supply, and data storage capacity for 50 images, it can basically be considered a comfortable, autonomous device for digital image data acquisition, especially for industrial applications and for architectural photogrammetry. First tests of the camera showed a high precision potential: 1/20-1/30 pixel in image space could be achieved in several applications, and with large self-calibrating networks relative precisions of 1:100,000 and better have been reported. To make more detailed statements on the accuracy potential of the camera, a thorough accuracy test was performed at ETH Zurich by taking 150 images of a 186-target 3D testfield. Although the precision estimates of this large block were exceptionally good, strong systematic object deformations were found in comparison with theodolite-measured reference coordinates of the testfield points. The most probable reasons for these deformations are temporal instabilities of some camera parameters, which could make the use of this camera very problematic for high-accuracy applications. It is argued that these instabilities are caused by the weak fixture of the CCD chip to the camera body. In this context it is often overlooked that this camera was developed not for precise measurement applications but rather for professional photographers.

  2. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low-light conditions, lower cost, and a longer lifetime. It is used commercially for research and aviation.

  3. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next-generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 to 4.7 seconds of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and the isoplanatic patch of the atmosphere.

  4. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera being designed jointly by the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology's Faculty of Electronics and Information Technology) combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques that enable the development of advanced applications such as a real-time 3D viewer of moving objects with free viewpoint, or a 3D modeller for still objects.

  5. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized to make efficient use of extreme ultraviolet radiation from a large-area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused by reflection from the four aspheric mirrors to form an image on a wafer that is reduced relative to the mask. 11 figs.

  6. Airborne and Ground-Based Platforms for Data Collection in Small Vineyards: Examples from the UK and Switzerland

    NASA Astrophysics Data System (ADS)

    Green, David R.; Gómez, Cristina; Fahrentrapp, Johannes

    2015-04-01

    This paper presents an overview of some of the low-cost ground and airborne platforms and technologies now becoming available for data collection in small-area vineyards. Low-cost UAV or UAS platforms and cameras are now widely available as the means to collect both vertical and oblique aerial still photography and airborne videography in vineyards. Examples of small aerial platforms include the AR Parrot Drone, the DJI Phantom (1 and 2), and the 3D Robotics IRIS+. Both fixed-wing and rotary-wing platforms offer numerous advantages for aerial image acquisition, including the freedom to obtain high-resolution imagery at any time required. Imagery captured can be stored on mobile devices such as an Apple iPad and shared, written directly to a memory stick or card, or saved to the Cloud. The imagery can either be visually interpreted or subjected to semi-automated analysis using digital image processing (DIP) software to extract information about vine status or the vineyard environment. At the ground level, a radio-controlled 'rugged' model 4x4 vehicle can also be used as a mobile platform to carry a number of sensors (e.g. a Go-Pro camera) around a vineyard, thereby facilitating quick and easy field data collection from both within the vine canopy and rows. For the small vineyard owner/manager with limited financial resources, this technology has a number of distinct advantages to aid in vineyard management practices: it is relatively cheap to purchase; has a short learning curve to use and master; can make use of autonomous ground control units for repetitive coverage, enabling reliable monitoring; and information can easily be analysed and integrated within a GIS with minimal expertise. In addition, these platforms make widespread use of familiar, everyday, off-the-shelf technologies such as WiFi, Go-Pro cameras, Cloud computing, and smartphones or tablets as the control interface, all with a large and well-established end-user support base. Whilst there are…

  7. The Digital Divide

    ERIC Educational Resources Information Center

    Hudson, Hannah Trierweiler

    2011-01-01

    Megan is a 14-year-old from Nebraska who just started ninth grade. She has her own digital camera, cell phone, Nintendo DS, and laptop, and one or more of these devices is usually by her side. Compared to the interactions and exploration she's engaged in at home, Megan finds the technology in her classroom falls a little flat. Most of the…

  8. Study on airborne multispectral imaging fusion detection technology

    NASA Astrophysics Data System (ADS)

    Ding, Na; Gao, Jiaobo; Wang, Jun; Cheng, Juan; Gao, Meng; Gao, Fei; Fan, Zhe; Sun, Kefeng; Wu, Jun; Li, Junna; Gao, Zedong; Cheng, Gang

    2014-11-01

    Airborne multispectral imaging fusion detection technology is proposed in this paper. In this design, the airborne multispectral imaging system consists of a multispectral camera, an image processing unit, and a stabilized platform. The multispectral camera operates in the spectral region from the visible to the near infrared (0.4-1.0 um); it has four identical, independent imaging channels, and sixteen typical wavelengths that can be selected according to the targets and backgrounds of interest. Experiments were carried out with the airborne multispectral imaging system; in particular, camouflaged targets were fused and detected in different complex environments, such as land vegetation backgrounds, hot desert backgrounds, and underwater. Within the 0.4-1.0 um region, three characteristic wavelengths are selected and combined from the sixteen available according to the background and targets. The spectral images corresponding to the three characteristic wavelengths are registered and fused in real time by the image processing unit, and a fusion video carrying the typical target signature is output. In these fusion images, the contrast between target and background is greatly increased. Experimental results confirm that this airborne multispectral imaging fusion detection technology can acquire high-contrast multispectral fusion images in real time, and can detect and identify camouflaged objects against complex backgrounds and underwater.
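
The three-wavelength fusion step can be illustrated by a simple false-color composite: three registered band images are contrast-stretched and stacked into RGB, so a target that differs from the background in any one band stands out in the composite. A toy sketch (the real system performs registration and fusion in dedicated hardware; this is only the stacking idea):

```python
import numpy as np

def false_color_fusion(band_r, band_g, band_b):
    """Stack three registered spectral-band images into a false-color
    composite, stretching each band to the full 0-255 range so that
    contrast differences in any single band become visible."""
    def stretch(b):
        b = b.astype(float)
        lo, hi = b.min(), b.max()
        return np.zeros_like(b) if hi == lo else (b - lo) / (hi - lo) * 255.0
    return np.dstack([stretch(band_r), stretch(band_g),
                      stretch(band_b)]).astype(np.uint8)

# Toy 2x2 bands; the "target" pixel is bright only in the third band
b1 = np.array([[10, 20], [30, 40]])
b2 = np.array([[5, 5], [5, 5]])     # flat band: carries no contrast
b3 = np.array([[0, 0], [0, 200]])
rgb = false_color_fusion(b1, b2, b3)
```

The target pixel saturates the blue channel of the composite even though it is indistinguishable in the flat second band, which is the essence of choosing characteristic wavelengths where target and background differ.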

  9. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  10. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  11. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  12. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma ray camera is improved by substituting a gamma-ray-sensitive, proximity-type image intensifier tube for the scintillator screen of the Anger camera. The image intensifier tube has a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded flat output phosphor display screen, all of the same dimension (unity image magnification), all within a grounded metallic tube envelope, and a metallic, inwardly concave input window between the scintillator screen and the collimator.

  13. The Airborne Laser

    NASA Astrophysics Data System (ADS)

    Lamberson, Steven E.

    2002-09-01

    The US Air Force Airborne Laser (ABL) is an airborne, megawatt-class laser system with a state-of-the-art atmospheric compensation system to destroy enemy ballistic missiles at long ranges. This system will provide both deterrence and defense against the use of such weapons during conflicts. This paper provides an overview of the ABL weapon system including: the notional operational concept, the development approach and schedule, the overall aircraft configuration, the technologies being incorporated in the ABL, and the risk reduction approach being utilized to ensure program success.

  14. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Specifications and preliminary design of an Airborne Oceanographic Lidar (AOL) system, to be constructed for installation and use on a NASA Wallops Flight Center (WFC) C-54 research aircraft, are reported. The AOL system is to provide an airborne facility for use by various government agencies to demonstrate the utility and practicality of hardware of this type in the wide-area collection of oceanographic data on an operational basis. System measurement and performance requirements are presented, followed by a description of the conceptual system approach and the considerations attendant to its development. System performance calculations are addressed, and the system specifications and preliminary design are presented and discussed.

  15. Monitoring human and vehicle activities using airborne video

    NASA Astrophysics Data System (ADS)

    Cutler, Ross; Shekhar, Chandra S.; Burns, B.; Chellappa, Rama; Bolles, Robert C.; Davis, Larry S.

    2000-05-01

    Ongoing work in Activity Monitoring (AM) for the Airborne Video Surveillance (AVS) project is described. The goal of AM is to recognize activities of interest involving humans and vehicles in airborne video. AM consists of three major components: (1) moving object detection, tracking, and classification; (2) image-to-site-model registration; (3) activity recognition. Detecting and tracking humans and vehicles from airborne video is a challenging problem due to image noise, low GSD, poor contrast, motion parallax, motion blur, and camera jitter. We use frame-to-frame affine-warping stabilization and temporally integrated intensity differences to detect independent motion. Moving objects are initially tracked using nearest-neighbor correspondence, followed by a greedy method that favors long track lengths and assumes locally constant velocity. Object classification is based on object size, velocity, and periodicity of motion. Site-model registration uses GPS information and camera/airplane orientations to provide an initial geolocation with +/- 100 m accuracy at an elevation of 1000 m. A semi-automatic procedure is used to improve the accuracy to +/- 5 m. The activity recognition component uses the geolocated tracked objects and the site model to detect pre-specified activities, such as people entering a forbidden area or a group of vehicles leaving a staging area.
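    The "temporally integrated intensity differences" step in this record can be sketched in a few lines. The following is a minimal numpy illustration, not code from the project: it assumes the frames have already been affine-warp stabilized, and the function name, window size, and threshold are all hypothetical.

    ```python
    import numpy as np

    def detect_moving_pixels(frames, window=3, thresh=20.0):
        """Flag independently moving pixels by summing absolute
        frame-to-frame intensity differences over a sliding window.
        Assumes background pixels are already aligned (stabilized)."""
        frames = np.asarray(frames, dtype=float)
        # Integrate |difference| between consecutive stabilized frames.
        diffs = np.abs(np.diff(frames[-window:], axis=0)).sum(axis=0)
        return diffs > thresh  # boolean motion mask

    # Synthetic example: a bright 2x2 "vehicle" moving one pixel per frame.
    frames = []
    for t in range(4):
        img = np.zeros((16, 16))
        img[5:7, 3 + t:5 + t] = 255.0
        frames.append(img)

    mask = detect_moving_pixels(frames)
    print(mask.any())
    ```

    In the real system this mask would feed the nearest-neighbor tracker; static background pixels cancel in the differences, so only independently moving objects survive the threshold.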

  16. Basic digital photography in dermatology.

    PubMed

    Kaliyadan, Feroze; Manoj, Jayasree; Venkitakrishnan, S; Dharmaratnam, A D

    2008-01-01

    Digital photography has virtually replaced conventional film photography as far as clinical imaging is concerned. Though most dermatologists are familiar with digital cameras, there is room for improvement in the quality of clinical images. We aim to give an overview of the basics of digital photography in relation to dermatology, which would be useful to a dermatologist in his or her future clinical practice. PMID:19052435

  17. Camera Edge Response

    NASA Astrophysics Data System (ADS)

    Zisk, Stanley H.; Wittels, Norman

    1988-02-01

    Edge location is an important machine vision task. Machine vision systems perform mathematical operations on rectangular arrays of numbers that are intended to faithfully represent the spatial distribution of scene luminance. The numbers are produced by periodic sampling and quantization of the camera's video output. This sequence can cause artifacts to appear in the data with a noise spectrum that is high in power at high spatial frequencies. This is a problem because most edge detection algorithms are preferentially sensitive to the high-frequency content in an image. Solid state cameras can introduce errors because of the spatial periodicity of their sensor elements. This can result in problems when image edges are aligned with camera pixel boundaries: (a) some cameras introduce transients into the video signal while switching between sensor elements; (b) most cameras use analog low-pass filters to minimize sampling artifacts and these introduce video phase delays that shift the locations of edges. The problems compound when the vision system samples asynchronously with the camera's pixel rate. Moire patterns (analogous to beat frequencies) can result. In this paper, we examine and model quantization effects in a machine vision system with particular emphasis on edge detection performance. We also compare our models with experimental measurements.
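    The interaction between sampling, quantization, and edge location that this record models can be illustrated with a small simulation. This is a generic numpy sketch, not the authors' model: the blurred-edge profile, the 8-bit quantization, and the half-amplitude interpolation estimator are all assumptions for illustration.

    ```python
    import numpy as np
    from math import erf

    def sampled_edge(edge_pos, n=32, blur=1.0, levels=256):
        """Sample a Gaussian-blurred step edge at integer pixel positions,
        then quantize to `levels` gray levels (simulating the camera's
        periodic sampling and quantization of scene luminance)."""
        x = np.arange(n)
        profile = 0.5 * (1 + np.array([erf((xi - edge_pos) / (blur * np.sqrt(2)))
                                       for xi in x]))
        return np.round(profile * (levels - 1)) / (levels - 1)

    def estimate_edge(samples):
        """Estimate subpixel edge location by linearly interpolating the
        crossing of the half-amplitude level."""
        i = int(np.argmax(samples >= 0.5))   # first sample at/above half level
        lo, hi = samples[i - 1], samples[i]
        return (i - 1) + (0.5 - lo) / (hi - lo)

    # The estimated location shifts slightly with the sub-pixel phase of
    # the true edge relative to the sampling grid.
    for true_pos in (15.0, 15.25, 15.5):
        print(true_pos, round(float(estimate_edge(sampled_edge(true_pos))), 3))
    ```

    Varying the phase of `edge_pos` against the pixel grid exposes the kind of position-dependent bias the paper attributes to sampling and quantization; adding an analog low-pass phase delay would shift all estimates systematically.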

  18. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).
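    The system check described at the end, computing measurement residuals between predicted and measured landmark/star locations, can be sketched as follows. This is a minimal illustration only; the array layout, the (azimuth, elevation) convention, and the units are assumptions, not taken from the patent.

    ```python
    import numpy as np

    def registration_residuals(predicted, measured):
        """Measurement residuals: the difference between landmark/star
        locations predicted by the ground-based orbit/attitude model and
        those actually measured by the spacecraft cameras. Small residuals
        indicate the compensation model tracks the real perturbations."""
        predicted = np.asarray(predicted, dtype=float)
        measured = np.asarray(measured, dtype=float)
        return measured - predicted

    # Hypothetical example: three landmarks, (azimuth, elevation) pairs.
    pred = [(100.0, 50.0), (200.0, 80.0), (300.0, 120.0)]
    meas = [(100.5, 49.8), (200.2, 80.1), (299.7, 120.4)]
    res = registration_residuals(pred, meas)
    print(np.abs(res).max())
    ```

    In practice a residual exceeding some tolerance would indicate that the model coefficients (K, A) need to be refreshed via the command unit.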

  19. Digital Photography and Its Impact on Instruction.

    ERIC Educational Resources Information Center

    Lantz, Chris

    Today the chemical processing of film is being replaced by a virtual digital darkroom. Digital image storage makes new levels of consistency possible because its nature is less volatile and more mutable than traditional photography. The potential of digital imaging is great, but issues of disk storage, computer speed, camera sensor resolution,…

  20. NASA Airborne Lidar July 1991

    Atmospheric Science Data Center

    2016-05-26

    Data from the 1991 NASA Langley Airborne Lidar flights following the eruption of Pinatubo in July ... and Osborn [1992a, 1992b]. Project Title: NASA Airborne Lidar. Discipline: Field Campaigns ...