Science.gov

Sample records for airborne digital camera

  1. A simple method for vignette correction of airborne digital camera data

    SciTech Connect

    Nguyen, A.T.; Stow, D.A.; Hope, A.S.

    1996-11-01

    Airborne digital camera systems have gained popularity in recent years due to their flexibility, high geometric fidelity and spatial resolution, and fast data turn-around time. However, a common problem that plagues these framing systems is vignetting, which causes a falloff in image brightness away from the principal point. This paper presents a simple method for vignetting correction that utilizes laboratory images of a uniform illumination source. Multiple lab images are averaged and inverted to create digital correction templates, which are then applied to actual airborne data. The vignette correction was effective in removing the systematic falloff in spectral values. We have shown that vignette correction is a necessary part of the preprocessing of raw digital airborne remote sensing data. The consequences of not correcting for these effects are demonstrated in the context of monitoring salt marsh habitat. 4 refs.
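
    The template construction the abstract describes (average uniform-source lab images, invert, multiply into each frame) can be sketched in a few lines. This is a minimal illustration, not the authors' code; NumPy and the function names are assumptions:

```python
import numpy as np

def vignette_template(lab_images):
    """Average lab images of a uniform source, normalize to the peak
    (nadir) brightness, and invert to get a per-pixel gain map."""
    mean = np.mean(np.stack(lab_images), axis=0)
    falloff = mean / mean.max()      # 1.0 at the brightest point
    return 1.0 / falloff             # gain >= 1 toward the image edges

def correct(image, template):
    """Apply the multiplicative vignette correction to an airborne frame."""
    return image * template
```

    Multiplying by the inverted falloff restores a flat response: a frame of a uniform scene comes out uniform after correction.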

  2. Integrating an RGB - CIR Digital Camera With an Airborne Laser Swath Mapping System

    NASA Astrophysics Data System (ADS)

    Lee, M.; Carter, W.; Shrestha, R.

    2003-12-01

    The National Science Foundation supported Center for Airborne Laser Mapping (NCALM) utilizes the airborne laser swath mapping (ALSM) system jointly owned by the University of Florida (UF) and Florida International University (FIU). The UF/FIU ALSM system comprises an Optech Inc. Model 1233 ALTM unit, with supporting GPS receiver and real-time navigation display, mounted in a twin-inline-engine Cessna 337 aircraft. Shortly after taking delivery of the ALSM system, UF researchers, in collaboration with a commercial partner, added a small-format digital camera (Kodak 420) to the system, rigidly mounting it to the ALSM sensor head. Software was developed to use the GPS position and the orientation parameters from the IMU in the ALSM sensor to rectify and mosaic the digital images. The ALSM height and intensity values were combined pixel by pixel with the RGB digital images to classify surface materials. Based on our experience with the initial camera, and recommendations received at the NCALM workshop, UF researchers decided to upgrade the system to a Redlake MASD Inc. model MS4100 RGB/CIR camera. The MS4100 contains three CCD arrays, which simultaneously capture full spatial resolution images in the red and near-IR bands, and images at a factor of two lower spatial resolution in the blue and green bands (the blue and green bands share a single CCD array, with the color bands separated by a Bayer filter). The CCD arrays are rectangular with 1920 x 1080 elements, each element being 7.4 x 7.4 micrometers. With a 28 mm focal length lens, and at a flying height of 550 meters, the effective ground element (groundel) is approximately 15 x 15 cm. The new digital camera should be particularly useful for studies of vegetation, including agricultural and forestry applications, and for computer-automated classification of surface materials. Examples of early results using the improved ALSM digital imaging capabilities will be presented.
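
    The quoted groundel size follows directly from similar triangles in the pinhole camera model; a quick check (the helper name is ours, not from the abstract):

```python
def ground_sample_distance(pixel_pitch_m, flying_height_m, focal_length_m):
    # Similar triangles: the ground footprint of one pixel is pitch * H / f
    return pixel_pitch_m * flying_height_m / focal_length_m

# 7.4 um pixels, 550 m flying height, 28 mm lens -> ~0.145 m per pixel
gsd = ground_sample_distance(7.4e-6, 550.0, 0.028)
```

    which reproduces the approximately 15 x 15 cm figure stated above.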

  3. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  4. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
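
    The trade-offs students can explore here follow simple rules of thumb. As one hedged sketch (the formula choice and function names are ours, not the article's): Lord Rayleigh's rule gives a sharpest pinhole diameter of about 1.9 * sqrt(f * wavelength), and exposure time scales as the square of the working f-number:

```python
import math

def optimal_pinhole_diameter(focal_length_m, wavelength_m=550e-9):
    # Lord Rayleigh's rule of thumb for the sharpest image: d = 1.9 * sqrt(f * wl)
    return 1.9 * math.sqrt(focal_length_m * wavelength_m)

def relative_exposure_time(focal_length_m, diameter_m):
    # Exposure time scales as the square of the working f-number N = f / d
    return (focal_length_m / diameter_m) ** 2
```

    For a 50 mm pinhole-to-sensor distance this gives a diameter around 0.3 mm and an f-number near f/160, which is why pinhole exposures run so much longer than lens exposures.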

  5. CCD video camera and airborne applications

    NASA Astrophysics Data System (ADS)

    Sturz, Richard A.

    2000-11-01

    The human need to see for one's self, and to do so remotely, has given rise to video camera applications never before imagined, and the list grows constantly. The instant understanding and verification offered by video lends its applications to every facet of life. Once an entertainment medium, video is now ever-present in our daily life. The application to the aircraft platform is one aspect of the video camera's versatility; integrating the video camera into the aircraft platform is yet another story. Applied to more conventional scene imaging, the typical video camera faces far less demanding parameters and considerations. This paper explores the video camera as applied to the more complicated airborne environment.

  6. MEMS digital camera

    NASA Astrophysics Data System (ADS)

    Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

    2007-02-01

    MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 μm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 μm with < 5 μm hysteresis and < 2 μm repeatability. Settling time is < 15 ms for a 200 μm step, and < 5 ms for a 20 μm step, enabling AF within 0.36 s at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion are maintained after 8,000 g shocks, thermal cycling from -40 °C to +85 °C, and operation over 20 million cycles.

  7. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
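
    The pinhole-camera relation underlying such methods is easy to state. A minimal sketch under the simplifying assumption that the object plane is parallel to the image plane (the names and the worked numbers are ours; the paper's full method also handles tilted planes):

```python
def object_distance(focal_length_m, object_size_m, image_size_px, pixel_pitch_m):
    # Pinhole model with parallel planes: size_on_sensor / f = object_size / distance
    image_size_m = image_size_px * pixel_pitch_m
    return focal_length_m * object_size_m / image_size_m
```

    For example, a 1.8 m object spanning 200 pixels of 5 μm pitch through a 50 mm lens works out to a 90 m distance.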

  8. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded enormously over the last ten years. The zoom lens, in particular, is the key factor determining camera body size and image quality. Its technologies build on several advances from the analog era, including methods of aspherical lens manufacturing and mechanisms of image stabilization. Panasonic is one of the pioneers of both technologies. I will review past trends in zoom lens optics and the original optical technologies of Panasonic's "LUMIX" digital cameras, along with the optics of 3D camera systems, and conclude with the likely future direction of digital cameras.

  9. Digital Cameras in the K-12 Classroom.

    ERIC Educational Resources Information Center

    Clark, Kenneth; Hosticka, Alice; Bedell, Jacqueline

    This paper discusses the use of digital cameras in K-12 education. Examples are provided of the integration of the digital camera and visual images into: reading and writing; science, social studies, and mathematics; projects; scientific experiments; desktop publishing; visual arts; data analysis; computer literacy; classroom atmosphere; and…

  10. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  11. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely-piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000 acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 × 10⁶ pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  12. Television camera on RMS surveys insulation on Airborne Support Equipment

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The television camera on the end effector of the Canadian-built Remote Manipulator System (RMS) is seen surveying some of the insulation on the Airborne Support Equipment (ASE). Flight controllers called for the survey following the departure of the Advanced Communications Technology Satellite (ACTS) and its Transfer Orbit Stage (TOS).

  13. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  14. Camera! Action! Collaborate with Digital Moviemaking

    ERIC Educational Resources Information Center

    Swan, Kathleen Owings; Hofer, Mark; Levstik, Linda S.

    2007-01-01

    Broadly defined, digital moviemaking integrates a variety of media (images, sound, text, video, narration) to communicate with an audience. There is near-ubiquitous access to the necessary software (MovieMaker and iMovie are bundled free with their respective operating systems) and hardware (computers with Internet access, digital cameras, etc.).…

  15. A stereoscopic lens for digital cinema cameras

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny; Rupkalvis, John

    2015-03-01

    Live-action stereoscopic feature films are, for the most part, produced using a costly post-production process to convert planar cinematography into stereo-pair images and are only occasionally shot stereoscopically using bulky dual-cameras that are adaptations of the Ramsdell rig. The stereoscopic lens design described here might very well encourage more live-action image capture because it uses standard digital cinema cameras and workflow to save time and money.

  16. Digital Camera Project Fosters Communication Skills

    ERIC Educational Resources Information Center

    Fisher, Ashley; Lazaros, Edward J.

    2009-01-01

    This article details the many benefits of educators' use of digital camera technology and provides an activity in which students practice taking portrait shots of classmates, manipulate the resulting images, and add language arts practice by interviewing their subjects to produce a photo-illustrated Word document. This activity gives…

  17. National Guidelines for Digital Camera Systems Certification

    NASA Astrophysics Data System (ADS)

    Yaron, Yaron; Keinan, Eran; Benhamu, Moshe; Regev, Ronen; Zalmanzon, Garry

    2016-06-01

    Digital camera systems are a key component in the production of reliable, geometrically accurate, high-resolution geospatial products. These systems have replaced film imaging in photogrammetric data capturing. Today, we see a proliferation of imaging sensors collecting photographs in different ground resolutions, spectral bands, swath sizes, radiometric characteristics, accuracies and carried on different mobile platforms. In addition, these imaging sensors are combined with navigational tools (such as GPS and IMU), active sensors such as laser scanning and powerful processing tools to obtain high quality geospatial products. The quality (accuracy, completeness, consistency, etc.) of these geospatial products is based on the use of calibrated, high-quality digital camera systems. The new survey regulations of the state of Israel specify the quality requirements for each geospatial product including: maps at different scales and for different purposes, elevation models, orthophotographs, three-dimensional models at different levels of details (LOD) and more. In addition, the regulations require that digital camera systems used for mapping purposes should be certified using a rigorous mapping systems certification and validation process which is specified in the Director General Instructions. The Director General Instructions for digital camera systems certification specify a two-step process as follows: 1. Theoretical analysis of system components that includes: study of the accuracy of each component and an integrative error propagation evaluation, examination of the radiometric and spectral response curves for the imaging sensors, the calibration requirements, and the working procedures. 2. Empirical study of the digital mapping system that examines a typical project (product scale, flight height, number and configuration of ground control points and process). 
The study examines all aspects of the final product, including its accuracy and pixel size.

  18. Digital Earth Watch: Investigating the World with Digital Cameras

    NASA Astrophysics Data System (ADS)

    Gould, A. D.; Schloss, A. L.; Beaudry, J.; Pickle, J.

    2015-12-01

    Every digital camera, including the smart phone camera, can be a scientific tool. Pictures contain millions of color intensity measurements organized spatially, allowing us to measure properties of objects in the images. This presentation will demonstrate how digital pictures can be used for a variety of studies, with a special emphasis on using repeat digital photographs to study change over time in outdoor settings with a Picture Post. Demonstrations will include using inexpensive color filters to take pictures that enhance features in images, such as unhealthy leaves on plants or clouds in the sky. Software available at no cost from the Digital Earth Watch (DEW) website, which lets students explore light, color, and pixels, manipulate color in images, and make measurements, will be demonstrated. DEW and Picture Post were developed with support from NASA. Please visit our websites: DEW: http://dew.globalsystemsscience.org and Picture Post: http://picturepost.unh.edu

  19. X-ray imaging using digital cameras

    NASA Astrophysics Data System (ADS)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.
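
    For readers unfamiliar with the metric, the detective quantum efficiency compares output and input signal-to-noise ratios. A minimal illustration assuming Poisson photon statistics (this is the textbook definition, not the paper's radioisotope measurement procedure; names are ours):

```python
import math

def detective_quantum_efficiency(snr_out, photons_in):
    # DQE = (SNR_out / SNR_in)^2, with SNR_in = sqrt(N) for Poisson photons
    return (snr_out / math.sqrt(photons_in)) ** 2
```

    A DQE of 0.017, as reported for the CMOS camera, means the system delivers the SNR an ideal detector would achieve with only 1.7% of the incident quanta.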

  20. The Sloan Digital Sky Survey Photometric Camera

    SciTech Connect

    Gunn, J.E.; Carr, M.; Rockosi, C.; Sekiguchi, M.; Berry, K.; Elms, B.; de Haas, E.; Ivezic, Z.; Knapp, G.; Lupton, R.; Pauls, G.; Simcoe, R.; Hirsch, R.; Sanford, D.; Wang, S.; York, D.; Harris, F.; Annis, J.; Bartozek, L.; Boroski, W.; Bakken, J.; Haldeman, M.; Kent, S.; Holm, S.; Holmgren, D.; Petravick, D.; Prosapio, A.; Rechenmacher, R.; Doi, M.; Fukugita, M.; Shimasaku, K.; Okada, N.; Hull, C.; Siegmund, W.; Mannery, E.; Blouke, M.; Heidtman, D.; Schneider, D.; Lucinio, R.; and others

    1998-12-01

    We have constructed a large-format mosaic CCD camera for the Sloan Digital Sky Survey. The camera consists of two arrays, a photometric array that uses 30 2048 × 2048 SITe/Tektronix CCDs (24 μm pixels) with an effective imaging area of 720 cm² and an astrometric array that uses 24 400 × 2048 CCDs with the same pixel size, which will allow us to tie bright astrometric standard stars to the objects imaged in the photometric camera. The instrument will be used to carry out photometry essentially simultaneously in five color bands spanning the range accessible to silicon detectors on the ground in the time-delay-and-integrate (TDI) scanning mode. The photometric detectors are arrayed in the focal plane in six columns of five chips each such that two scans cover a filled stripe 2.5° wide. This paper presents engineering and technical details of the camera. © 1998 The American Astronomical Society

  1. Digital Camera Control for Faster Inspection

    NASA Technical Reports Server (NTRS)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low-resolution images to quickly spot problem areas and can then cause a rapid transition to high-resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  2. Optimum color filters for CCD digital cameras.

    PubMed

    Engelhardt, K; Seitz, P

    1993-06-01

    A procedure for the definition of optimum spectral transmission curves for any solid-state (especially silicon-based CCD) color camera is presented. The design of the target curves is based on computer simulation of the camera system and on the use of test colors with known spectral reflectances. Color errors are measured in a uniform color space (CIELUV) by application of the Commission Internationale de l'Eclairage color difference formula. Dielectric filter stacks were designed by simulated thermal annealing, and a stripe filter pattern was fabricated with transmission properties close to the specifications. Optimization of the color transformation minimizes the residual average color error; an average color error of ~1 just-noticeable difference should be feasible, meaning that color differences in a side-by-side comparison of original and reproduced color are practically imperceptible. In addition, electrical cross talk within the solid-state imager can be compensated by adapting the color matrixing coefficients. The theoretical findings of this work were employed for the design and fabrication of a high-resolution digital CCD color camera with high colorimetric accuracy. PMID:20829908
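
    The CIELUV error metric used in such optimizations can be computed from tristimulus values with the standard CIE 1976 formulas; a sketch assuming a D65 white point (not the authors' code):

```python
import math

def xyz_to_luv(X, Y, Z, Xn=95.047, Yn=100.0, Zn=108.883):
    """CIE 1976 L*u*v* from XYZ tristimulus values, D65 white assumed."""
    def lightness(t):
        return 116.0 * t ** (1 / 3) - 16.0 if t > (6 / 29) ** 3 else (29 / 3) ** 3 * t
    L = lightness(Y / Yn)
    d = X + 15.0 * Y + 3.0 * Z
    dn = Xn + 15.0 * Yn + 3.0 * Zn
    up, vp = (4.0 * X / d, 9.0 * Y / d) if d else (0.0, 0.0)
    unp, vnp = 4.0 * Xn / dn, 9.0 * Yn / dn
    return L, 13.0 * L * (up - unp), 13.0 * L * (vp - vnp)

def delta_e_uv(xyz1, xyz2):
    """CIE 1976 color difference; ~1 is roughly a just-noticeable difference."""
    return math.dist(xyz_to_luv(*xyz1), xyz_to_luv(*xyz2))
```

    Averaging `delta_e_uv` over a set of test colors gives exactly the kind of scalar error that the filter optimization described above can minimize.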

  3. Picture Perfect: Using Digital Cameras for Teaching Mathematics.

    ERIC Educational Resources Information Center

    Teahan, John; Sharp, Brian

    2002-01-01

    Discusses positive effects of digital photography on the teaching of mathematics and cost-effectiveness for schools. Discusses appropriate digital camera resolution, storage, printers, and handheld options for classroom use. (KHR)

  4. Quality criterion for digital still camera

    NASA Astrophysics Data System (ADS)

    Bezryadin, Sergey

    2007-02-01

    The main quality requirements for a digital still camera are color capture accuracy, low noise level, and quantum efficiency. Different consumers assign different priorities to these parameters, and camera designers need clearly formulated methods for their evaluation. While there are procedures for estimating noise level and quantum efficiency, there are no effective means for estimating color capture accuracy; the criterion introduced in this paper fills that gap. The Luther-Ives condition for a correct color reproduction system has been known since the beginning of the last century. However, since no detector system satisfies the Luther-Ives condition, there are always stimuli that are distinctly different to an observer but that the detectors are unable to distinguish. To estimate a detector set's conformity with the Luther-Ives condition and calculate a measure of discrepancy, the angle between the detector spectral sensitivities and Cohen's Fundamental Color Space may be used. In this paper, this divergence angle is calculated for some typical CCD sensors, and it is demonstrated how the angle might be reduced with a corrective filter. In addition, it is shown that with a specific corrective filter Foveon sensors turn into a detector system with good Luther-Ives compliance.
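
    The divergence-angle idea can be illustrated with a least-squares projection: if a channel's sampled spectral sensitivity lies in the subspace spanned by the color matching functions, the angle is zero. This is a simplified per-channel sketch under our own formulation (the paper works with the full detector set against Cohen's Fundamental Color Space):

```python
import numpy as np

def divergence_angle_deg(sensor, cmfs):
    """Angle (degrees) between one sensor channel's sampled spectral
    sensitivity and the subspace spanned by the columns of `cmfs`.
    Zero means that channel is a linear combination of the CMFs."""
    coeffs, *_ = np.linalg.lstsq(cmfs, sensor, rcond=None)
    projection = cmfs @ coeffs
    cos = np.linalg.norm(projection) / np.linalg.norm(sensor)
    return float(np.degrees(np.arccos(np.clip(cos, 0.0, 1.0))))
```

    A corrective filter reshapes the sensitivity curve, which in this picture rotates the sensor vector toward the CMF subspace and shrinks the angle.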

  5. Camera Ready: Capturing a Digital History of Chester

    ERIC Educational Resources Information Center

    Lehman, Kathy

    2008-01-01

    Armed with digital cameras, voice recorders, and movie cameras, students from Thomas Dale High School in Chester, Virginia, have been exploring neighborhoods, interviewing residents, and collecting memories of their hometown. In this article, the author describes "Digital History of Chester", a project for creating a commemorative DVD. This…

  6. Comparison of Digital Surface Models for Snow Depth Mapping with Uav and Aerial Cameras

    NASA Astrophysics Data System (ADS)

    Boesch, R.; Bühler, Y.; Marty, M.; Ginzler, C.

    2016-06-01

    Photogrammetric workflows for aerial images have improved over the last years in a typically black-box fashion. Most parameters for building dense point clouds are either excessive in number or unexplained, and the progress between software releases is often poorly documented. On the other hand, comparing product specifications shows significant development in camera sensors and in the positional accuracy of image acquisition. This study shows that hardware evolution over the last years has had a much stronger impact on height measurements than photogrammetric software releases. Snow height measurements with airborne sensors like the ADS100 and UAV-based DSLR cameras can achieve accuracies close to GSD * 2 in comparison with ground-based GNSS reference measurements. Using a custom notch filter on the UAV camera sensor during image acquisition does not yield better height accuracies. UAV-based digital surface models are very robust. Different workflow parameter variations for ADS100 and UAV camera workflows seem to have only random effects.

  7. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  8. Methods for identification of images acquired with digital cameras

    NASA Astrophysics Data System (ADS)

    Geradts, Zeno J.; Bijhold, Jurrien; Kieft, Martijn; Kurosawa, Kenji; Kuroki, Kenro; Saitoh, Naoki

    2001-02-01

    We were asked by the court whether it is possible to determine if an image has been made with a specific digital camera. This question has to be answered in child pornography cases, where evidence is needed that a certain picture has been made with a specific camera. We have looked into several methods of examining cameras to determine whether a specific image was made with a given camera: defects in the CCD, the file formats used, noise introduced by the pixel array, and watermarking embedded in images by the camera manufacturer.
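
    The pixel-array-noise approach mentioned here is the ancestor of sensor-pattern-noise (PRNU) fingerprinting: average the noise residuals of many frames from a camera, then correlate a questioned image's residual against that fingerprint. A toy sketch with a crude box-filter denoiser (published methods use wavelet denoising; all names are ours):

```python
import numpy as np

def noise_residual(img):
    """Residual after a crude 3x3 box-filter denoise; the residual
    carries the sensor's fixed pattern."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    box = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img - box

def fingerprint(images):
    # Average residuals of many frames to estimate the camera's pattern.
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlation(a, b):
    # Normalized cross-correlation between a residual and a fingerprint.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
```

    A residual from the source camera correlates strongly with that camera's fingerprint and only weakly with fingerprints of other cameras, which is the basis for attributing an image to a device.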

  9. A Simple Spectrophotometer Using Common Materials and a Digital Camera

    ERIC Educational Resources Information Center

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-01-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…
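
    The physics behind the DVD grating is the first-order diffraction equation m·λ = d·sin(θ). A minimal sketch assuming the nominal 740 nm DVD track pitch as the grating period (the paper calibrated against a reference light instead; names are ours):

```python
import math

DVD_TRACK_PITCH_NM = 740.0   # DVD track spacing acts as the grating period

def wavelength_nm(x_mm, screen_distance_mm, order=1):
    """Grating equation m * wl = d * sin(theta), where theta is the
    diffraction angle of a spectral line seen x_mm from the zero-order
    spot on a screen screen_distance_mm away."""
    theta = math.atan2(x_mm, screen_distance_mm)
    return DVD_TRACK_PITCH_NM * math.sin(theta) / order
```

    Mapping camera pixel positions to x_mm via a known reference line turns the photographed spectrum into a wavelength scale.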

  10. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next-generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality, and interoperability features, driven by advances in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single-chip camera solutions on the market: the Motorola MPC 823 and the LSI DCAM-101. Real-time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed, and the real-time operating system. This paper presents the LSI DCAM-101, a single-chip digital camera solution, with an overview of its architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data-flow software architecture, and testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  11. Characterizing Digital Camera Systems: A Prelude to Data Standards

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    2002-01-01

    This viewgraph presentation profiles: 1) Digital imaging systems; 2) Specifying a digital imagery product; and 3) Characterization of data acquisition systems. Advanced large array digital imaging systems are routinely being used. Digital imagery guidelines are being developed by ASPRS and ISPRS. Guidelines and standards are of little use without standardized characterization methods. Characterization of digital camera systems is important for supporting digital imagery guidelines. Specifications are characterized in the lab and/or the field. Laboratory characterization is critical for optimizing and defining performance. In-flight characterization is necessary for an end-to-end system test.

  12. Choosing the Best Digital Camera for Your Program

    ERIC Educational Resources Information Center

    Mikat, Richard P.; Anderson, Mandi

    2005-01-01

    Many educators in physical education, recreation, dance, and related fields have begun using digital images to enhance their teaching (e.g., Ryan, Marzilla, & Martindale, 2001). Many other educators would like to begin using this technology, but find the task of choosing an appropriate digital camera to be overwhelming. This article is designed to…

  13. Low light performance of digital cameras

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror; Hertel, Dirk

    2009-01-01

    Photospace data previously measured on large image sets have shown that a high percentage of camera phone pictures are taken under low-light conditions. Corresponding image quality measurements linked the lowest quality to these conditions, and subjective analysis of image quality failure modes identified image blur as the most important contributor to image quality degradation. Camera phones without flash must manage a trade-off when adjusting shutter time to low-light conditions: the shutter time has to be long enough to avoid extreme underexposure, but short enough that hand-held picture taking is still possible without excessive motion blur. There is still a lack of quantitative data on motion blur. Camera phones often do not record basic operating parameters such as shutter speed in their image metadata, and when recorded, the data are often inaccurate. We introduce a device and process for tracking camera motion and measuring its point spread function (PSF). Vision-based metrics are introduced to assess the impact of camera motion on image quality so that the low-light performance of different cameras can be compared. Statistical distributions of user variability will also be discussed.
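
    Accumulating a tracked motion trajectory into a PSF, and reducing it to a scalar blur measure, can be sketched as follows (illustrative only; the authors' device and vision-based metrics are not described in this abstract, and these names are ours):

```python
import numpy as np

def motion_psf(trajectory, size=16):
    """Accumulate a tracked camera trajectory (pixel coordinates sampled
    over the exposure) into a normalized point spread function."""
    psf = np.zeros((size, size))
    for x, y in trajectory:
        ix, iy = int(round(x)), int(round(y))
        if 0 <= ix < size and 0 <= iy < size:
            psf[iy, ix] += 1.0
    return psf / psf.sum()

def psf_extent(psf):
    # RMS spread of the PSF about its centroid: a simple scalar blur metric.
    ys, xs = np.indices(psf.shape)
    cx, cy = (psf * xs).sum(), (psf * ys).sum()
    return float(np.sqrt((psf * ((xs - cx) ** 2 + (ys - cy) ** 2)).sum()))
```

    A perfectly still camera yields a single-point PSF with zero extent; hand shake during a long shutter spreads the PSF and increases the metric.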

  14. Acquisition and evaluation of radiography images by digital camera.

    PubMed

    Cone, Stephen W; Carucci, Laura R; Yu, Jinxing; Rafiq, Azhar; Doarn, Charles R; Merrell, Ronald C

    2005-04-01

    To determine applicability of low-cost digital imaging for different radiographic modalities used in consultations from remote areas of the Ecuadorian rainforest with limited resources, both medical and financial. Low-cost digital imaging, consisting of hand-held digital cameras, was used for image capture at a remote location. Diagnostic radiographic images were captured in Ecuador by digital camera and transmitted to a password-protected File Transfer Protocol (FTP) server at VCU Medical Center in Richmond, Virginia, using standard Internet connectivity with standard security. After capture and subsequent transfer of images via low-bandwidth Internet connections, attending radiologists in the United States compared diagnoses to those from Ecuador to evaluate quality of image transfer. Corroborative diagnoses were obtained with the digital camera images for greater than 90% of the plain film and computed tomography studies. Ultrasound (U/S) studies demonstrated only 56% corroboration. Images of radiographs captured utilizing commercially available digital cameras can provide quality sufficient for expert consultation for many plain film studies for remote, underserved areas without access to advanced modalities.

  15. Review of up-to-date digital camera interfaces

    NASA Astrophysics Data System (ADS)

    Linkemann, Joachim

    2013-04-01

    Over the past 15 years, various interfaces on digital industrial cameras have been available on the market. This tutorial will give an overview of interfaces such as LVDS (RS644), Channel Link and Camera Link. In addition, other interfaces such as FireWire, Gigabit Ethernet, and now USB 3.0 have become more popular. Owing to their ease of use, these interfaces cover most of the market. Nevertheless, for certain applications and especially for higher bandwidths, Camera Link and CoaXPress are very useful. This tutorial will give a description of the advantages and disadvantages, comment on bandwidths, and provide recommendations on when to use which interface.

  16. Digital image georeferencing from a multiple camera system by GPS/INS

    NASA Astrophysics Data System (ADS)

    Mostafa, Mohamed M. R.; Schwarz, Klaus-Peter

    In this paper, the development and testing of an airborne fully digital multi-sensor system for digital mapping data acquisition is presented. The system acquires two streams of data, namely, navigation (georeferencing) data and imaging data. The navigation data are obtained by integrating an accurate strapdown inertial navigation system with a differential GPS system (DGPS). The imaging data are acquired by two low-cost digital cameras, configured so as to reduce their geometric limitations. The two cameras capture strips of overlapping nadir and oblique images. The GPS/INS-derived trajectory contains the full translational and rotational motion of the carrier aircraft. Thus, image exterior orientation information is extracted from the trajectory during post-processing. This approach eliminates the need for ground control points (GCPs) when computing 3D positions of objects that appear in the field of view of the system imaging component. Two approaches for calibrating the system are presented, namely, terrestrial calibration and in-flight calibration. Test flights were conducted over the campus of The University of Calgary. Testing the system showed that best ground point positioning accuracy at 1:12,000 average image scale is 0.2 m (RMS) in easting and northing and 0.3 m (RMS) in height. Preliminary results indicate that major applications of such a system in the future are in the field of digital mapping, at scales of 1:5000 and smaller, and in the generation of digital elevation models for engineering applications.
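    The core of direct georeferencing is the collinearity relation: an image ray, rotated into object space by the GPS/INS-derived exterior orientation, is intersected with the ground. A minimal sketch for a flat-ground intersection follows; the angle convention and numbers are illustrative, not the paper's actual calibration.

```python
import numpy as np

def rotation(omega, phi, kappa):
    """Image-to-object rotation matrix from exterior-orientation angles (rad)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def ground_point(xy_img, f, X0, Z_ground, angles):
    """Intersect the image ray with a horizontal plane Z = Z_ground."""
    R = rotation(*angles)
    ray = R @ np.array([xy_img[0], xy_img[1], -f])  # direction in object space
    s = (Z_ground - X0[2]) / ray[2]                 # scale factor to the plane
    return X0 + s * ray

# Nadir camera 1000 m above flat ground, 100 mm focal length:
P = ground_point((0.01, 0.0), 0.1, np.array([0.0, 0.0, 1000.0]), 0.0,
                 (0.0, 0.0, 0.0))
print(P)  # a 10 mm image offset maps to 100 m on the ground
```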

  17. Reflectance and illuminant estimation for digital cameras

    NASA Astrophysics Data System (ADS)

    Dicarlo, Jeffrey Michael

    Several important problems in color imaging can be traced to differences in how cameras and humans sample the spectral properties of light. Color processing within the imaging pipeline, loosely referred to as color correction, transforms the sampled camera responses to a form that matches the human responses. The accuracy of the color correction transformation is limited for two reasons. First, the human visual system and most color acquisition devices critically undersample the spectral information, making the differences in their sampling functions quite significant. Second, the human visual system derives a relatively constant surface color appearance despite variations in the illuminant, complicating color correction with the need to estimate the illuminant. Assuming complete knowledge of the illuminant, we formulate color correction as an input-referred estimation problem. In particular, we analyze how a small number of camera measurements can be used to estimate a complete spectral surface reflectance function. We introduce conventional linear color transformations, and then extend these transformations using forms of local linear regression that we refer to as submanifold estimation methods. These methods are based on the observation that for many data sets the deviations between the signal and the linear estimate are systematic; submanifold methods incorporate knowledge of these systematic deviations to improve upon linear estimation methods. We describe the geometric intuition of these methods and evaluate the submanifold method on printed material data and hyperspectral image data. Next, we discard the assumption of complete knowledge of the illuminant and analyze a technique to estimate the illuminant. Conventional algorithms rely on statistical assumptions about the scene properties (surface reflectance functions and geometry) to estimate the ambient illuminant. We introduce a new illuminant estimation paradigm that uses an active imaging method to
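    The baseline linear estimator the dissertation extends can be sketched as follows: when reflectances lie on a low-dimensional linear model, a least-squares map recovers the full spectrum from a few camera responses. All matrices below are synthetic stand-ins, not the author's measured sensors or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic world: reflectances lie on a 3-dimensional linear model
# sampled at 31 wavelengths; the camera has 3 broadband channels.
n_wl, n_basis, n_train = 31, 3, 50
basis = rng.standard_normal((n_wl, n_basis))
sensors = rng.standard_normal((n_wl, 3))

weights = rng.standard_normal((n_basis, n_train))
reflectances = basis @ weights          # (31, 50) training spectra
responses = sensors.T @ reflectances    # (3, 50) camera responses

# Linear estimator: least-squares map from responses back to spectra.
M, *_ = np.linalg.lstsq(responses.T, reflectances.T, rcond=None)

test_w = rng.standard_normal((n_basis, 1))
true_r = basis @ test_w
est_r = ((sensors.T @ true_r).T @ M).T
err = np.linalg.norm(est_r - true_r) / np.linalg.norm(true_r)
print(err)  # essentially zero: 3 channels suffice for a 3-D model
```

    Submanifold methods refine this by modeling the systematic residuals of the linear fit; the sketch shows only the linear baseline.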

  18. Toward a digital camera to rival the human eye

    NASA Astrophysics Data System (ADS)

    Skorka, Orit; Joseph, Dileepan

    2011-07-01

    All things considered, electronic imaging systems do not rival the human visual system despite notable progress over 40 years since the invention of the CCD. This work presents a method that allows design engineers to evaluate the performance gap between a digital camera and the human eye. The method identifies limiting factors of the electronic systems by benchmarking against the human system. It considers power consumption, visual field, spatial resolution, temporal resolution, and properties related to signal and noise power. A figure of merit is defined as the performance gap of the weakest parameter. Experimental work done with observers and cadavers is reviewed to assess the parameters of the human eye, and assessment techniques are also covered for digital cameras. The method is applied to 24 modern image sensors of various types, where an ideal lens is assumed to complete a digital camera. Results indicate that dynamic range and dark limit are the most limiting factors. The substantial functional gap, from 1.6 to 4.5 orders of magnitude, between the human eye and digital cameras may arise from architectural differences between the human retina, arranged in a multiple-layer structure, and image sensors, mostly fabricated in planar technologies. Functionality of image sensors may be significantly improved by exploiting technologies that allow vertical stacking of active tiers.
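    The paper's figure of merit, the performance gap of the weakest parameter, can be sketched as below. The parameter values are hypothetical placeholders, not the paper's measured data; only the "weakest link in orders of magnitude" logic is illustrated.

```python
import math

# Hypothetical parameter values (not from the paper): each entry maps a
# parameter to (human-eye value, image-sensor value) in matched units,
# where larger is better.
params = {
    "dynamic_range": (1e8, 1e4),    # unitless contrast ratio
    "dark_limit":    (1e6, 3e2),    # low-light sensitivity (a.u.)
    "spatial_res":   (60.0, 50.0),  # cycles/degree with an ideal lens
}

# Performance gap per parameter, in orders of magnitude (eye / sensor).
gaps = {k: math.log10(eye / cam) for k, (eye, cam) in params.items()}

# Figure of merit: the gap of the weakest parameter.
worst = max(gaps, key=gaps.get)
print(worst, round(gaps[worst], 2))
```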

  19. Bringing the Digital Camera to the Physics Lab

    ERIC Educational Resources Information Center

    Rossi, M.; Gratton, L. M.; Oss, S.

    2013-01-01

    We discuss how compressed images created by modern digital cameras can lead to even severe problems in the quantitative analysis of experiments based on such images. Difficulties result from the nonlinear treatment of lighting intensity values stored in compressed files. To overcome such troubles, one has to adopt noncompressed, native formats, as…

  20. Using a Digital Video Camera to Study Motion

    ERIC Educational Resources Information Center

    Abisdris, Gil; Phaneuf, Alain

    2007-01-01

    To illustrate how a digital video camera can be used to analyze various types of motion, this simple activity analyzes the motion and measures the acceleration due to gravity of a basketball in free fall. Although many excellent commercially available data loggers and software can accomplish this task, this activity requires almost no financial…

  1. Airborne Digital Sensor System and GPS-aided inertial technology for direct geopositioning in rough terrain

    USGS Publications Warehouse

    Sanchez, Richard D.

    2004-01-01

    High-resolution airborne digital cameras with onboard data collection based on the Global Positioning System (GPS) and inertial navigation systems (INS) technology may offer a real-time means to gather accurate topographic map information by reducing ground control and eliminating aerial triangulation. Past evaluations of this integrated system over relatively flat terrain have proven successful. The author uses Emerge Digital Sensor System (DSS) combined with Applanix Corporation's Position and Orientation Solutions for Direct Georeferencing to examine the positional mapping accuracy in rough terrain. The positional accuracy documented in this study did not meet large-scale mapping requirements owing to an apparent system mechanical failure. Nonetheless, the findings yield important information on a new approach for mapping in Antarctica and other remote or inaccessible areas of the world.

  2. Demosaicing images from colour cameras for digital image correlation

    NASA Astrophysics Data System (ADS)

    Forsey, A.; Gungor, S.

    2016-11-01

    Digital image correlation is not the intended use for consumer colour cameras, but with care they can be successfully employed in such a role. The main obstacle is the sparsely sampled colour data caused by the use of a colour filter array (CFA) to separate the colour channels. It is shown that the method used to convert consumer camera raw files into a monochrome image suitable for digital image correlation (DIC) can have a significant effect on the DIC output. A number of widely available software packages and two in-house methods are evaluated in terms of their performance when used with DIC. Using an in-plane rotating disc to produce a highly constrained displacement field, it was found that the bicubic spline based in-house demosaicing method outperformed the other methods in terms of accuracy and aliasing suppression.
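    A minimal sketch of one simple way to get a monochrome image from a Bayer mosaic without interpolation artifacts: collapse each 2x2 RGGB quad into one grayscale pixel. This is a common fallback, not the paper's bicubic-spline demosaicing method; the pattern values are made up.

```python
import numpy as np

def bayer_to_mono_binned(raw):
    """Collapse each 2x2 RGGB quad of a raw Bayer mosaic into one
    grayscale pixel by averaging R, G, G, B. Half resolution, but free
    of the interpolation artifacts demosaicing can inject into DIC."""
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return (r + g1 + g2 + b) / 4.0

# A flat grey scene seen through a Bayer filter (unequal channel gains):
raw = np.tile(np.array([[0.8, 1.0], [1.0, 0.4]]), (3, 3))
mono = bayer_to_mono_binned(raw)
print(mono.shape)         # half the mosaic resolution in each axis
print(float(mono[0, 0]))  # uniform output, mosaic pattern removed
```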

  3. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8^m apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  4. Measurement of solar extinction in tower plants with digital cameras

    NASA Astrophysics Data System (ADS)

    Ballestrín, J.; Monterreal, R.; Carra, M. E.; Fernandez-Reche, J.; Barbero, J.; Marzo, A.

    2016-05-01

    Atmospheric extinction of solar radiation between the heliostat field and the receiver is accepted as a non-negligible source of energy loss in the increasingly large central receiver plants. However, the reality is that there is currently no reliable measurement method for this quantity, and at present these plants are designed, built and operated without knowing this local parameter. Nowadays digital cameras are used in many scientific applications for their ability to convert available light into digital images. Their broad spectral range, high resolution and high signal-to-noise ratio make them an interesting device in solar technology. In this work a method for atmospheric extinction measurement based on digital images is presented. The possibility of defining a measurement setup in circumstances similar to those of a tower plant increases the credibility of the method. This procedure is currently being implemented at Plataforma Solar de Almería.

  5. Night sky photometry with amateur-grade digital cameras

    NASA Astrophysics Data System (ADS)

    Mrozek, Tomasz; Gronkiewicz, Dominik; Kolomanski, Sylwester; Steslicki, Marek

    2015-08-01

    Measurements of night sky brightness can give us valuable information on light pollution. The more measurements we have, the better our knowledge of the spatial distribution of the pollution on local and global scales. High-accuracy professional photometry of the night sky can be performed with dedicated instruments. The main drawbacks of this method are high price and low mobility, which limit the number of observers and therefore the amount of photometric data that can be collected. To overcome the problem of limited data we can involve amateur astronomers in photometry of the night sky. However, to achieve this goal we need a method that utilizes equipment typically used by amateur astronomers, e.g. digital cameras. We propose a method that enables good-accuracy photometry of the night sky with digital compact or DSLR cameras. In the method, reduction of observations and standardization to the Johnson UBV system are performed. We tested several cameras and compared the results to Sky Quality Meter (SQM) measurements. The overall consistency of the results is within 0.2 mag.
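    The standardization step can be sketched with a minimal zero-point calibration: reference stars of known catalogue magnitude tie the camera's instrumental fluxes to the standard system, after which the sky background flux converts to magnitudes per square arcsecond. The star and flux values are invented for illustration; the authors' full UBV reduction is more involved.

```python
import math

def zero_point(ref_stars):
    """Mean photometric zero point from reference stars with known
    catalogue magnitudes and measured instrumental fluxes (counts/s)."""
    zps = [m + 2.5 * math.log10(f) for m, f in ref_stars]
    return sum(zps) / len(zps)

# Hypothetical calibration stars: (catalogue V magnitude, measured flux)
refs = [(5.0, 1000.0), (7.5, 100.0)]
zp = zero_point(refs)

# Sky brightness from the measured background flux per square arcsecond:
sky_flux = 0.001  # counts/s per arcsec^2, read off the frame
sky_mag = zp - 2.5 * math.log10(sky_flux)
print(round(zp, 3), round(sky_mag, 2))
```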

  6. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  7. Establishing imaging sensor specifications for digital still cameras

    NASA Astrophysics Data System (ADS)

    Kriss, Michael A.

    2007-02-01

    Digital still cameras (DSCs) have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of megapixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics and sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.

  8. A large distributed digital camera system for accelerator beam diagnostics

    NASA Astrophysics Data System (ADS)

    Catani, L.; Cianchi, A.; Di Pirro, G.; Honkavaara, K.

    2005-07-01

    Optical diagnostics, providing images of accelerated particle beams using radiation emitted by particles impinging on a radiator, typically a fluorescent screen, has been extensively used, especially on electron linacs, since the 1970s. Higher intensity beams available in the last decade allow extending the use of beam imaging techniques to perform precise measurements of important beam parameters such as emittance, energy, and energy spread using optical transition radiation (OTR). OTR-based diagnostics systems are extensively used on the superconducting TESLA Test Facility (TTF) linac driving the vacuum ultraviolet free electron laser (VUV-FEL) at the Deutsches Elektronen-Synchrotron facility. Up to 30 optical diagnostic stations have been installed at various positions along the 250-m-long linac, each equipped with a high-performance digital camera. This paper describes the new approach to the design of the hardware and software setups required by the complex topology of such a distributed camera system.

  9. Investigating thin film interference with a digital camera

    NASA Astrophysics Data System (ADS)

    Atkins, Leslie J.; Elliott, Richard C.

    2010-12-01

    Thin film interference is discussed in most introductory physics courses as an intriguing example of wave interference. Although students may understand the interference mechanism that determines the colors of a film, they are likely to have difficulty understanding why soap bubbles and oil slicks have a distinctive set of colors—colors that are strikingly different from those present in the rainbow. This article describes a way to model these colors and a simple method for investigating them using a digital camera and a computer.
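    The two-beam interference model behind soap-film colors can be sketched as below: the ray reflected at the top surface picks up a half-wave phase shift, so reflectance oscillates with wavelength in a way that differs from a rainbow spectrum. The film thickness and refractive index are assumed example values, and the model ignores multiple reflections and dispersion.

```python
import math

def film_reflectance(wavelength_nm, thickness_nm, n_film=1.33):
    """Two-beam interference for a soap film in air: phase difference
    delta = 2*pi*(2*n*d)/lambda + pi (the extra pi from the half-wave
    shift at the top surface). Returns normalized intensity in [0, 1]."""
    delta = 2 * math.pi * (2 * n_film * thickness_nm) / wavelength_nm + math.pi
    return 0.5 * (1 + math.cos(delta))

# A 500 nm film at normal incidence: which visible wavelengths are
# reflected most and least strongly?
d = 500.0
bright = max(range(400, 701), key=lambda w: film_reflectance(w, d))
dark = min(range(400, 701), key=lambda w: film_reflectance(w, d))
print(bright, dark)  # a green peak and a red null, unlike a rainbow
```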

  10. Use of a computerized digital camera in podiatric medical practice.

    PubMed

    Stacpoole-Shea, S; Shea, G

    1999-03-01

    Multimedia technology was once rarely found outside the realm of commercial production studios or in elaborate computer games. However, with the addition of only a few simple accessories, recent advances have made this technology readily available to the podiatric medical practitioner on a desktop office computer. The role that the application of multimedia technology using a computerized digital camera can play in a podiatric medical practice--including in such areas as record keeping, outcome measurement, patient education, interdisciplinary communications, and practice-management tools--is discussed.

  11. Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses

    ERIC Educational Resources Information Center

    Liu, Rong; Unger, John A.; Scullion, Vicki A.

    2014-01-01

    Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on…

  12. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-01

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

  13. Off-axis digital holographic camera for quantitative phase microscopy.

    PubMed

    Monemhaghdoust, Zahra; Montfort, Frédéric; Emery, Yves; Depeursinge, Christian; Moser, Christophe

    2014-06-01

    We propose and experimentally demonstrate a digital holographic camera which can be attached to the camera port of a conventional microscope for obtaining digital holograms in a self-reference configuration, under short coherence illumination and in a single shot. A thick holographic grating filters the beam containing the sample information in two dimensions through diffraction. The filtered beam creates the reference arm of the interferometer. The spatial filtering method, based on the high angular selectivity of the thick grating, reduces the alignment sensitivity to angular displacements compared with pinhole based Fourier filtering. The addition of a thin holographic grating alters the coherence plane tilt introduced by the thick grating so as to create high-visibility interference over the entire field of view. The acquired full-field off-axis holograms are processed to retrieve the amplitude and phase information of the sample. The system produces phase images of cheek cells qualitatively similar to phase images extracted with a standard commercial DHM.

  14. Spatial statistical analysis of tree deaths using airborne digital imagery

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael

    2013-04-01

    High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).
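    The spatially varying risk idea can be illustrated with a toy kernel estimate: the density of case points (dead trees) divided by the density of all points. This 1-D sketch with invented coordinates is only an analogue of the paper's formal spatial point process analysis.

```python
import numpy as np

def relrisk(cases, controls, grid, bw):
    """Kernel estimate of spatially varying risk on a 1-D transect:
    density of case points divided by density of all points."""
    def kde(pts, x):
        d = x[:, None] - pts[None, :]
        return np.exp(-0.5 * (d / bw) ** 2).sum(axis=1)
    all_pts = np.concatenate([cases, controls])
    return kde(cases, grid) / kde(all_pts, grid)

# Toy transect: dead trees clustered near x = 0, survivors spread out.
dead = np.array([-0.1, 0.0, 0.1])
alive = np.array([-2.0, -1.0, 1.0, 2.0])
grid = np.array([-2.0, 0.0, 2.0])
risk = relrisk(dead, alive, grid, bw=0.5)
print(risk.argmax())  # risk peaks at the centre of the mortality cluster
```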

  15. Payette National Forest aerial survey project using the Kodak digital color infrared camera

    NASA Astrophysics Data System (ADS)

    Greer, Jerry D.

    1997-11-01

    Staff of the Payette National Forest located in central Idaho used the Kodak Digital Infrared Camera to collect digital photographic images over a wide variety of selected areas. The objective of this aerial survey project is to collect airborne digital camera imagery and to evaluate it for potential use in forest assessment and management. The data collected from this remote sensing system are being compared with existing resource information and with personal knowledge of the areas surveyed. Resource specialists are evaluating the imagery to determine if it may be useful for: identifying cultural sites (pre-European settlement tribal villages and camps); recognizing ecosystem landscape pattern; mapping recreation areas; evaluating the South Fork Salmon River road reconstruction project; designing the Elk Summit Road; assessing the impact of sediment on anadromous fish in the South Fork Salmon River; assessing any contribution of sediment to the South Fork from the reconstructed road; determining post-wildfire stress development in conifer timber; assessing the development of insect populations in areas initially determined to be within low-intensity wildfire burn polygons; and searching for Idaho ground squirrel habitat. Project sites include approximately 60 linear miles of the South Fork of the Salmon River; a parallel road over about half that distance; 3 archaeological sites; two transects of about 6 miles each for landscape patterns; 3 recreation areas; 5 miles of the Payette River; 4 miles of the Elk Summit Road; a pair of transects 4.5 miles long for stress assessment in timber; a triplet of transects about 3 miles long for the assessment of species identification; and an area of about 640 acres to evaluate habitat for the endangered Idaho ground squirrel. Preliminary results indicate that the imagery is an economically viable way to collect site-specific resource information that is of value in the management of a national forest.

  16. Design of an in-line, digital holographic imaging system for airborne measurement of clouds.

    PubMed

    Spuler, Scott M; Fugal, Jacob

    2011-04-01

    We discuss the design and performance of an airborne (underwing) in-line digital holographic imaging system developed for characterizing atmospheric cloud water droplets and ice particles in situ. The airborne environment constrained the design space to the simple optical layout that in-line non-beam-splitting holography affords. The desired measurement required the largest possible sample volume in which the smallest desired particle size (∼5 μm) could still be resolved, and consequently the magnification requirement was driven by the pixel size of the camera and this particle size. The resulting design was a seven-element, double-telecentric, high-precision optical imaging system used to relay and magnify a hologram onto a CCD surface. The system was designed to preserve performance and high resolution over a wide temperature range. Details of the optical design and construction are given. Experimental results demonstrate that the system is capable of recording holograms that can be reconstructed with resolution of better than 6.5 μm within a 15 cm³ sample volume.

  17. A Comparative Study of Microscopic Images Captured by a Box Type Digital Camera Versus a Standard Microscopic Photography Camera Unit

    PubMed Central

    Desai, Nandini J.; Gupta, B. D.; Patel, Pratik Narendrabhai

    2014-01-01

    Introduction: Obtaining images of slides viewed by a microscope can be invaluable for both diagnosis and teaching. They can be transferred among technologically advanced hospitals for further consultation and evaluation. But a standard microscopic photography camera unit (MPCU) (MIPS, Microscopic Image Projection System) is costly and not available in resource-poor settings. The aim of our endeavour was to find a comparable and cheaper alternative method for photomicrography. Materials and Methods: We used a NIKON Coolpix S6150 camera (box type digital camera) with an Olympus CH20i microscope and a fluorescent microscope for the purpose of this study. Results: We obtained comparable results when capturing images of light microscopy, but the results were not as satisfactory for fluorescent microscopy. Conclusion: A box type digital camera is a comparable, less expensive and convenient alternative to a microscopic photography camera unit. PMID:25478350

  18. Verification of Potency of Aerial Digital Oblique Cameras for Aerial Photogrammetry in Japan

    NASA Astrophysics Data System (ADS)

    Nakada, Ryuji; Takigawa, Masanori; Ohga, Tomowo; Fujii, Noritsuna

    2016-06-01

    A digital oblique aerial camera (hereinafter called "oblique camera") is an assembly of medium-format digital cameras capable of shooting digital aerial photographs in five directions, i.e. nadir view and oblique views (forward and backward, left and right), simultaneously; it is used for shooting digital aerial photographs efficiently for generating 3D models over a wide area. For aerial photogrammetry of public survey in Japan, it is required to use large-format cameras, such as the DMC and UltraCam series, to ensure aerial photogrammetric accuracy. Although oblique cameras are intended to generate 3D models, digital aerial photographs taken with them in five directions need not be limited to 3D model production; they may also be usable for digital mapping and photomaps meeting the accuracy required for public survey in Japan. In order to verify the potency of using oblique cameras for aerial photogrammetry (simultaneous adjustment, digital mapping and photomaps), (1) a viewer was developed to interpret digital aerial photographs taken with oblique cameras, (2) digital aerial photographs were shot with an oblique camera owned by us, a Penta DigiCAM of IGI mbH, and (3) the accuracy of 3D measurements was verified.

  19. Encrypting Digital Camera with Automatic Encryption Key Deletion

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2007-01-01

    A digital video camera includes an image sensor capable of producing a frame of video data representing an image viewed by the sensor, an image memory for storing video data such as previously recorded frame data in a video frame location of the image memory, a read circuit for fetching the previously recorded frame data, an encryption circuit having an encryption key input connected to receive the previously recorded frame data from the read circuit as an encryption key, an un-encrypted data input connected to receive the frame of video data from the image sensor and an encrypted data output port, and a write circuit for writing a frame of encrypted video data received from the encrypted data output port of the encryption circuit to the memory and overwriting the video frame location storing the previously recorded frame data.
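    A minimal software analogue of the patented scheme, keying each new frame against the previously recorded frame, can be sketched with a per-pixel XOR. This is an illustration of the key-reuse idea only, not the patent's hardware encryption circuit, and the frame contents here are random placeholders.

```python
import numpy as np

def encrypt_frame(frame, key_frame):
    """XOR each pixel with the corresponding pixel of the previously
    recorded frame, which serves as the encryption key."""
    return np.bitwise_xor(frame, key_frame)

rng = np.random.default_rng(1)
key = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)    # prior frame
frame = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)  # new image

cipher = encrypt_frame(frame, key)
recovered = encrypt_frame(cipher, key)  # XOR is its own inverse
print(np.array_equal(recovered, frame))
```

    Overwriting the key frame with ciphertext, as the patent describes, deletes the key automatically once the next frame is written.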

  20. Thin-filament pyrometry with a digital still camera.

    PubMed

    Maun, Jignesh D; Sunderland, Peter B; Urban, David L

    2007-02-01

    A novel thin-filament pyrometer is presented. It involves a consumer-grade color digital still camera with 6 megapixels and 12 bits per color plane. SiC fibers were used, and scanning-electron microscopy found them to be uniform with diameters of 13.9 μm. Measurements were performed in a methane-air coflowing laminar jet diffusion flame with a luminosity length of 72 mm. Calibration of the pyrometer was accomplished with B-type thermocouples. The pyrometry measurements yielded gas temperatures in the range of 1400-2200 K with an estimated uncertainty of ±60 K, a relative temperature resolution of ±0.215 K, a spatial resolution of 42 μm, and a temporal resolution of 0.66 ms. Fiber aging for 10 min had no effect on the results. Soot deposition was less problematic for the pyrometer than for the thermocouple. PMID:17230239
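    The thermocouple calibration step can be sketched with a Wien-approximation signal model, S = k·exp(-C2/(λT)): one reference reading fixes the constant k, after which camera signal inverts to temperature. The effective wavelength and signal values below are assumptions for illustration, not the paper's calibration data.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K
LAM = 650e-9    # assumed effective wavelength of the red channel, m

def calibrate(signal_ref, T_ref):
    """One-point calibration against a thermocouple reading, using the
    Wien approximation S = k * exp(-C2 / (lambda * T))."""
    return signal_ref / math.exp(-C2 / (LAM * T_ref))

def temperature(signal, k):
    """Invert the Wien-approximation signal model for temperature."""
    return -C2 / (LAM * math.log(signal / k))

k = calibrate(1000.0, 1800.0)  # thermocouple reads 1800 K
T = temperature(1000.0, k)     # the same signal maps back to 1800 K
print(round(T, 1))
```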

  1. <5cm Ground Resolution DEMs for the Atacama Fault System (Chile), Acquired With the Modular Airborne Camera System (MACS)

    NASA Astrophysics Data System (ADS)

    Zielke, O.; Victor, P.; Oncken, O.; Bucher, T. U.; Lehmann, F.

    2011-12-01

A primary step towards assessing the timing and size of future earthquakes is the identification of earthquake recurrence patterns in the existing seismic record. Geologic and geomorphic data are commonly analyzed for this purpose, given the lack of sufficiently long historical or instrumental seismic data sets. Until recently, those geomorphic data sets encompassed field observations, local total station surveys, and aerial photography. Over the last decade, LiDAR-based high-resolution topographic data sets became an additional powerful means, contributing distinctly to a better understanding of earthquake rupture characteristics (e.g., single-event along-fault slip distribution, along-fault slip accumulation pattern) and their relation to fault geometric complexities. Typical shot densities of such data sets (e.g., airborne-LiDAR data along the San Andreas Fault) permit generation of digital elevation models (DEM) with <50 cm ground resolution, sufficient for depiction of meter-scale tectonic landforms. Identification of submeter-scale features is however prevented by this resolution limit. Here, we present a high-resolution topographic and visual data set from the Atacama fault system near Antofagasta, Chile. Data were acquired with the Modular Airborne Camera System (MACS) - developed by the DLR (German Aerospace Center) in Berlin, Germany. The photogrammetrically derived DEM and True Ortho Images with <5 cm ground resolution permit identification of very small-scale geomorphic features, thus enabling fault zone and earthquake rupture characterization at unprecedented detail. Compared to typical LiDAR-DEMs, ground resolution is increased by an order of magnitude while the spatial extent of the data set is essentially the same. We present examples of the <5 cm resolution data set (DEM and visual results) and further explore its resolution capabilities and potential with regard to the aforementioned tectono-geomorphic questions.

  2. Camera system resolution and its influence on digital image correlation

    DOE PAGES

    Reu, Phillip L.; Sweatt, William; Miller, Timothy; Fleming, Darryn

    2014-09-21

Digital image correlation (DIC) uses images from a camera and lens system to make quantitative measurements of the shape, displacement, and strain of test objects. This increasingly popular method has had little research on the influence of the imaging system resolution on the DIC results. This paper investigates the entire imaging system and studies how both the camera and lens resolution influence the DIC results as a function of the system Modulation Transfer Function (MTF). It will show that when making spatial resolution decisions (including speckle size) the resolution limiting component should be considered. A consequence of the loss of spatial resolution is that the DIC uncertainties will be increased. This is demonstrated using both synthetic and experimental images with varying resolution. The loss of image resolution and DIC accuracy can be compensated for by increasing the subset size, or better, by increasing the speckle size. The speckle-size and spatial resolution are now a function of the lens resolution rather than the more typical assumption of the pixel size. The study will demonstrate the tradeoffs associated with limited lens resolution.
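
As a rough illustration of what DIC computes, the displacement of a speckle pattern can be recovered by searching for the shift that maximizes correlation between the reference and deformed images. This brute-force integer-pixel sketch omits the subset-based, subpixel interpolation used by real DIC codes:

```python
import numpy as np

rng = np.random.default_rng(1)
ref = rng.random((64, 64))                    # speckle-like reference image
shifted = np.roll(ref, (3, 5), axis=(0, 1))   # "deformed" image: a pure 3,5-pixel shift

# Brute-force correlation search over integer shifts
best, best_shift = -np.inf, (0, 0)
for dy in range(-8, 9):
    for dx in range(-8, 9):
        cand = np.roll(shifted, (-dy, -dx), axis=(0, 1))
        score = np.sum(ref * cand)            # un-normalized correlation score
        if score > best:
            best, best_shift = score, (dy, dx)

print(best_shift)  # → (3, 5)
```

Real DIC divides the image into subsets and fits subpixel shape functions, which is why subset and speckle size interact with system resolution as the abstract discusses.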

  3. Camera system resolution and its influence on digital image correlation

    SciTech Connect

    Reu, Phillip L.; Sweatt, William; Miller, Timothy; Fleming, Darryn

    2014-09-21

    Digital image correlation (DIC) uses images from a camera and lens system to make quantitative measurements of the shape, displacement, and strain of test objects. This increasingly popular method has had little research on the influence of the imaging system resolution on the DIC results. This paper investigates the entire imaging system and studies how both the camera and lens resolution influence the DIC results as a function of the system Modulation Transfer Function (MTF). It will show that when making spatial resolution decisions (including speckle size) the resolution limiting component should be considered. A consequence of the loss of spatial resolution is that the DIC uncertainties will be increased. This is demonstrated using both synthetic and experimental images with varying resolution. The loss of image resolution and DIC accuracy can be compensated for by increasing the subset size, or better, by increasing the speckle size. The speckle-size and spatial resolution are now a function of the lens resolution rather than the more typical assumption of the pixel size. The study will demonstrate the tradeoffs associated with limited lens resolution.

  4. Quantifying biodiversity using digital cameras and automated image analysis.

    NASA Astrophysics Data System (ADS)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m), populated by a mix of heather, mosses and sedges. The cameras have been in continuous operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial-intelligence-based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected, and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets of the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  5. Use of a Digital Camera To Document Student Observations in a Microbiology Laboratory Class.

    ERIC Educational Resources Information Center

    Mills, David A.; Kelley, Kevin; Jones, Michael

    2001-01-01

    Points out the lack of microscopic images of wine-related microbes. Uses a digital camera during a wine microbiology laboratory to capture student-generated microscope images. Discusses the advantages of using a digital camera in a teaching lab. (YDS)

  6. Range camera self-calibration based on integrated bundle adjustment via joint setup with a 2D digital camera.

    PubMed

    Shahbazi, Mozhdeh; Homayouni, Saeid; Saadatseresht, Mohammad; Sattari, Mehran

    2011-01-01

Time-of-flight cameras, based on photonic mixer device (PMD) technology, are capable of measuring distances to objects at high frame rates; however, the measured ranges and the intensity data contain systematic errors that need to be corrected. In this paper, a new integrated range camera self-calibration method via joint setup with a digital (RGB) camera is presented. This method can simultaneously estimate the systematic range error parameters as well as the interior and exterior orientation parameters of the camera. The calibration approach is based on photogrammetric bundle adjustment of observation equations originating from the collinearity condition and a range error model. Addition of a digital camera to the calibration process overcomes the limitations of small field of view and low pixel resolution of the range camera. The tests are performed on a dataset captured by a PMD[vision]-O3 camera from a multi-resolution test field of high contrast targets. An average improvement of 83% in RMS of range error and 72% in RMS of coordinate residual, over that achieved with basic calibration, was realized in an independent accuracy assessment. Our proposed calibration method also achieved 25% and 36% improvement on RMS of range error and coordinate residual, respectively, over that obtained by integrated calibration of the single PMD camera. PMID:22164102

  8. Range Camera Self-Calibration Based on Integrated Bundle Adjustment via Joint Setup with a 2D Digital Camera

    PubMed Central

    Shahbazi, Mozhdeh; Homayouni, Saeid; Saadatseresht, Mohammad; Sattari, Mehran

    2011-01-01

Time-of-flight cameras, based on Photonic Mixer Device (PMD) technology, are capable of measuring distances to objects at high frame rates; however, the measured ranges and the intensity data contain systematic errors that need to be corrected. In this paper, a new integrated range camera self-calibration method via joint setup with a digital (RGB) camera is presented. This method can simultaneously estimate the systematic range error parameters as well as the interior and exterior orientation parameters of the camera. The calibration approach is based on photogrammetric bundle adjustment of observation equations originating from the collinearity condition and a range error model. Addition of a digital camera to the calibration process overcomes the limitations of small field of view and low pixel resolution of the range camera. The tests are performed on a dataset captured by a PMD[vision]-O3 camera from a multi-resolution test field of high contrast targets. An average improvement of 83% in RMS of range error and 72% in RMS of coordinate residual, over that achieved with basic calibration, was realized in an independent accuracy assessment. Our proposed calibration method also achieved 25% and 36% improvement on RMS of range error and coordinate residual, respectively, over that obtained by integrated calibration of the single PMD camera. PMID:22164102

  9. Quantification of gully volume using very high resolution DSM generated through 3D reconstruction from airborne and field digital imagery

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Zarco-Tejada, Pablo; Laredo, Mario; Gómez, Jose Alfonso

    2013-04-01

Major advances have been made recently in automatic 3D photo-reconstruction techniques using uncalibrated and non-metric cameras (James and Robson, 2012). However, their application to soil conservation studies and landscape feature identification is still in its early stages. The aim of this work is to compare the performance of a remote sensing technique using a digital camera mounted on an airborne platform with 3D photo-reconstruction, a method already validated for gully erosion assessment purposes (Castillo et al., 2012). A field survey was conducted in November 2012 in a 250 m-long gully located in field crops on a Vertisol in Cordoba (Spain). The airborne campaign was conducted with a 4000x3000 digital camera installed onboard an aircraft flying at 300 m above ground level to acquire 6 cm resolution imagery. A total of 990 images were acquired over the area, ensuring a large overlap in the across- and along-track directions of the aircraft. An ortho-mosaic and the digital surface model (DSM) were obtained through automatic aerial triangulation and camera calibration methods. For the field-level photo-reconstruction technique, the gully was divided into several reaches to allow appropriate reconstruction (about 150 pictures taken per reach) and, finally, the resulting point clouds were merged into a unique mesh. A centimetric-accuracy GPS provided a benchmark dataset for the gully perimeter and distinguishable reference points in order to allow the assessment of measurement errors of the airborne technique and the georeferencing of the photo-reconstruction 3D model. The uncertainty in the definition of the gully limits was explicitly addressed by comparing several criteria obtained from the 3D models (slope and second derivative) with the outer perimeter obtained by the GPS operator visually identifying the change in slope at the top of the gully walls. In this study we discussed the magnitude of planimetric and altimetric errors and the differences observed between the

  10. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high-resolution aerial imagery, its fully automatic processing and analysis, and its near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  11. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even a one-bit change in the image file will cause the computed image hash to be totally different from the secure hash.
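
The sign-and-verify scheme described here can be sketched with a hash plus textbook RSA. The tiny key pair below is for illustration only; a real camera would embed a cryptographically strong key.

```python
import hashlib

# Toy RSA parameters (illustrative only; a real camera would embed e.g. a 2048-bit key)
p, q = 61, 53
n = p * q                           # public modulus (3233)
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, embedded in the camera

def sign(image_bytes: bytes) -> int:
    """Hash the image file, then encrypt the hash with the private key."""
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(h, d, n)

def verify(image_bytes: bytes, signature: int) -> bool:
    """Decrypt the signature with the public key and compare image hashes."""
    h = int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % n
    return pow(signature, e, n) == h

img = b"raw image file contents"
sig = sign(img)
print(verify(img, sig))         # → True: untampered file authenticates
print(verify(img + b"x", sig))  # tampering detected (False, except for the tiny
                                # hash-collision chance these toy parameters allow)
```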

  12. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.

  13. Accuracy assessment of airborne photogrammetrically derived high-resolution digital elevation models in a high mountain environment

    NASA Astrophysics Data System (ADS)

    Müller, Johann; Gärtner-Roer, Isabelle; Thee, Patrick; Ginzler, Christian

    2014-12-01

High-resolution digital elevation models (DEMs) generated by airborne remote sensing are frequently used to analyze landform structures (monotemporal) and geomorphological processes (multitemporal) in remote areas or areas of extreme terrain. In order to assess and quantify such structures and processes it is necessary to know the absolute accuracy of the available DEMs. This study assesses the absolute vertical accuracy of DEMs generated by the High Resolution Stereo Camera-Airborne (HRSC-A), the Leica Airborne Digital Sensors 40/80 (ADS40 and ADS80) and the analogue camera system RC30. The study area is located in the Turtmann valley, Valais, Switzerland, a glacially and periglacially formed hanging valley stretching from 2400 m to 3300 m a.s.l. The photogrammetrically derived DEMs are evaluated against geodetic field measurements and an airborne laser scan (ALS). Traditional and robust global and local accuracy measures are used to describe the vertical quality of the DEMs, which show a non-Gaussian distribution of errors. The results show that all four sensor systems produce DEMs with similar accuracy despite their different setups and generations. The ADS40 and ADS80 (both with a ground sampling distance of 0.50 m) generate the most accurate DEMs in complex high mountain areas, with an RMSE of 0.8 m and an NMAD of 0.6 m. They also show the highest accuracy relative to flying height (0.14‰). The pushbroom scanning system HRSC-A produces an RMSE of 1.03 m and an NMAD of 0.83 m (0.21‰ of the flying height and 10 times the ground sampling distance). The analogue camera system RC30 produces DEMs with a vertical accuracy of 1.30 m RMSE and 0.83 m NMAD (0.17‰ of the flying height and two times the ground sampling distance). It is also shown that the performance of the DEMs strongly depends on the inclination of the terrain. The RMSE of areas with an inclination <40° is better than 1 m. In more inclined areas the error and outlier occurrence
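
The two accuracy measures quoted above (RMSE and NMAD) are easy to compute directly; the height differences below are invented, with one outlier, to show why the robust NMAD sits well below the RMSE when errors are non-Gaussian:

```python
import numpy as np

# DEM-minus-checkpoint height differences in metres; 3.0 is a gross outlier
dh = np.array([0.1, -0.2, 0.15, 0.05, -0.1, 3.0])

# RMSE assumes Gaussian errors and is dragged up by the outlier
rmse = np.sqrt(np.mean(dh ** 2))
# NMAD = 1.4826 * median absolute deviation: robust to outliers
nmad = 1.4826 * np.median(np.abs(dh - np.median(dh)))

print(round(rmse, 2), round(nmad, 2))  # → 1.23 0.19
```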

  14. Aerosol retrieval from twilight photographs taken by a digital camera

    NASA Astrophysics Data System (ADS)

    Saito, M.; Iwabuchi, H.

    2014-12-01

The twilight sky, one of the most beautiful sights in daily life, varies from day to day because atmospheric components such as ozone and aerosols also vary from day to day. Recent studies have revealed the effects of tropospheric aerosols on the twilight sky. In this study, we develop a new algorithm for aerosol retrieval from twilight photographs taken by a digital single-lens reflex camera at solar zenith angles of 90-96° in intervals of 1°. A radiative transfer model taking a spherical-shell atmosphere, multiple scattering, and refraction into account is used as the forward model, and optimal estimation is used as the inversion to infer the aerosol optical and radiative properties. Sensitivity tests show that the tropospheric (stratospheric) aerosol optical thickness governs the distribution of twilight sky color and brightness near the horizon (at viewing angles of 10° to 20°), and that the aerosol size distribution governs the angular distribution of brightness near the solar direction. The AOTs are inferred with small uncertainties and agree very well with those from the Skyradiometer. In this conference, several case studies using the algorithm will be shown.
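
In the linear case, the optimal-estimation inversion mentioned above has a closed-form gain-matrix update. The Jacobian, covariances, and two-element state (e.g. tropospheric and stratospheric AOT) below are invented for illustration:

```python
import numpy as np

# Linearized optimal estimation: retrieve a 2-element state from 3 observations.
# K, the covariances, and the "true" state are made-up illustrative values.
K = np.array([[1.0, 0.2], [0.5, 0.8], [0.3, 1.0]])  # Jacobian of the forward model
Sa = np.diag([0.04, 0.04])       # a priori covariance
Se = np.diag([1e-4] * 3)         # measurement-noise covariance
xa = np.array([0.10, 0.02])      # a priori state

x_true = np.array([0.25, 0.05])
y = K @ x_true                    # noise-free synthetic observation

# Gain matrix and state update (standard linear optimal-estimation solution)
G = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)
x_hat = xa + G @ (y - K @ xa)
print(np.round(x_hat, 3))         # close to x_true since Se is small
```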

  15. Using Commercial Digital Cameras and Structure-from-Motion Software to Map Snow Cover Depth from Small Aircraft

    NASA Astrophysics Data System (ADS)

    Sturm, M.; Nolan, M.; Larsen, C. F.

    2014-12-01

A long-standing goal in snow hydrology has been to map snow cover in detail, either mapping snow depth or snow water equivalent (SWE) with sub-meter resolution. Airborne LiDAR and air photogrammetry have been used successfully for this purpose, but both require significant investments in equipment and substantial processing effort. Here we detail a relatively inexpensive and simple airborne photogrammetric technique that can be used to measure snow depth. The main airborne hardware consists of a consumer-grade digital camera attached to a survey-quality, dual-frequency GPS. Photogrammetric processing is done using commercially available Structure from Motion (SfM) software that does not require ground control points. Digital elevation models (DEMs) are made from snow-free acquisitions in the summer and snow-covered acquisitions in winter, and the maps are then differenced to arrive at snow thickness. We tested the accuracy and precision of snow depths measured using this system through 1) a comparison with airborne scanning LiDAR, 2) a comparison of results from two independent and slightly different photogrammetric systems, and 3) a comparison with extensive on-the-ground snow depth measurements. Vertical accuracy and precision are on the order of ±30 cm and ±8 cm, respectively. The accuracy can be made to approach the precision if suitable snow-free ground control points exist and are used to co-register summer and winter DEM maps. Final snow depth accuracy from our series of tests was on the order of ±15 cm. This photogrammetric method substantially lowers the economic and expertise barriers to entry for mapping snow.
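
The depth-by-differencing step can be sketched in a few lines; the tiny DEM arrays below stand in for the co-registered summer and winter SfM surfaces:

```python
import numpy as np

# Synthetic 2x2 DEMs in metres; real summer/winter DEMs would come from the
# SfM processing described above, co-registered to common ground control.
summer_dem = np.array([[100.0, 100.2], [100.1, 100.3]])
winter_dem = summer_dem + np.array([[0.45, 0.50], [0.40, 0.55]])  # snow on top

snow_depth = winter_dem - summer_dem       # per-pixel snow thickness
snow_depth = np.clip(snow_depth, 0, None)  # negative values are noise, not snow

print(snow_depth.mean())  # → 0.475
```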

  16. Side oblique real-time orthophotography with the 9Kx9K digital framing camera

    NASA Astrophysics Data System (ADS)

    Gorin, Brian A.

    2003-08-01

    BAE SYSTEMS has reported on a new framing camera incorporating an ultra high resolution CCD detector array comprised of 9,216 x 9,216 pixels fabricated on one silicon wafer. The detector array features a 1:2 frame-per-second readout capable of stereo imagery with Nyquist resolution of 57 lp/mm from high velocity, low altitude (V/H) airborne platforms. Flight tests demonstrated the capability of the focal plane electronics for differential image motion compensation (IMC) with Nyquist performance utilizing a focal plane shutter (FPS) to enable both nadir and significant side and forward oblique imaging angles. The impact of FPS for differential image motion compensation is evaluated with the exterior orientation calibration parameters, which include the existing shutter velocity and flight dynamics from sample mapping applications. System requirements for GPS/INS are included with the effect of vertical error and side oblique angle impact of the digital elevation map (DEM) required to create the orthophoto. Results from the differentiated "collinearity equations" which relate the image coordinates to elements of interior and exterior orientation are combined with the DEM impact to provide useful guidelines for side oblique applications. The application of real-time orthophotography is described with the implications for system requirements for side oblique orthophoto capability.

  17. Real-time object tracking for moving target auto-focus in digital camera

    NASA Astrophysics Data System (ADS)

    Guan, Haike; Niinami, Norikatsu; Liu, Tong

    2015-02-01

Focusing accurately on a moving object is difficult, and important for successfully photographing the target with a digital camera. Because the object often moves randomly and changes its shape frequently, the position and distance of the target should be estimated in real time so as to focus on the object precisely. We propose a new method of real-time object tracking for auto-focus on moving targets in digital cameras. The video stream in the camera is used for tracking the moving target. A particle filter is used to deal with the target object's random movement and shape changes. Color and edge features are used as measurements of the object's state. A parallel processing algorithm is developed to realize real-time particle filter object tracking in the hardware environment of the digital camera. A movement prediction algorithm is also proposed to remove the focus error caused by the difference between the tracking result and the target object's real position when the photo is taken. Simulation and experimental results in a digital camera demonstrate the effectiveness of the proposed method. We embedded the real-time object tracking algorithm in the digital camera. The position and distance of the moving target are obtained accurately by object tracking from the video stream. A SIMD processor is applied to enable parallel real-time processing. Processing time of less than 60 ms per frame is achieved in the digital camera with a CPU of only 162 MHz.
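
A minimal 1D particle filter conveys the predict-update-resample loop the abstract describes; the motion and measurement models below are invented stand-ins for the camera's color/edge measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                                  # number of particles
particles = rng.normal(0.0, 1.0, N)      # initial position hypotheses
weights = np.ones(N) / N

true_pos = 0.0
for _ in range(20):
    true_pos += 1.0                      # target moves one unit per frame
    z = true_pos + rng.normal(0.0, 0.5)  # noisy measurement of target position

    # Predict: propagate particles through the motion model plus process noise
    particles += 1.0 + rng.normal(0.0, 0.3, N)
    # Update: weight particles by the measurement likelihood
    weights *= np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size degenerates
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, N, p=weights)
        particles, weights = particles[idx], np.ones(N) / N

estimate = np.sum(particles * weights)   # posterior mean = tracked position
print(round(estimate, 2), true_pos)      # estimate stays close to the target
```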

  18. Issues in implementing services for a wireless web-enabled digital camera

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas

    2001-05-01

The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-Appliance based Communities. In addition, the rapidly increasing need to upload images to the Internet for photo-finishing services as well as the need to download software upgrades to the camera is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper will evaluate the system implications in the deployment of recurring revenue services and enterprise connectivity of a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity, security, etc. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to the enterprise systems.

  19. Work starts on building world's largest digital camera

    NASA Astrophysics Data System (ADS)

    Kruesi, Liz

    2015-10-01

    The $473m Large Synoptic Survey Telescope (LSST) has moved one step closer to completion after the US Department of Energy (DOE) approved the start of construction for the telescope's $168m 3.2-gigapixel camera.

  20. Investigation of a consumer-grade digital stereo camera

    NASA Astrophysics Data System (ADS)

    Menna, Fabio; Nocerino, Erica; Remondino, Fabio; Shortis, Mark

    2013-04-01

The paper presents a metric investigation of the Fuji FinePix Real 3D W1 stereo photo-camera. The stereo-camera uses a synchronized twin lens-CCD system to acquire two images simultaneously, using two Fujinon 3x optical zoom lenses arranged in an aluminum die-cast frame integrated in a very compact body. The nominal baseline is 77 mm and the resolution of each CCD is 10 megapixels. Given the short baseline and the presence of two optical paths, the investigation aims to evaluate the accuracy of the 3D data that can be produced and the stability of the camera. From a photogrammetric point of view, the interest in this camera is its capability to acquire synchronized image pairs that contain important 3D metric information for many close-range applications (human body parts measurement, rapid prototyping, surveying of archeological artifacts, etc.). Calibration values - for the left and right cameras - at different focal lengths, derived with an in-house software application, are reported together with accuracy analyses. The object coordinates obtained from the bundle adjustment computation for each focal length were compared to reference coordinates of a test range by means of a similarity transformation. Additionally, the article reports on the investigation of the asymmetrical relative orientation between the left and right cameras.
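
With a synchronized stereo pair, depth follows from the classic triangulation relation Z = f·B/d. The baseline is the camera's nominal 77 mm; the focal length (in pixels) and the disparity below are hypothetical values for illustration:

```python
# Depth from a synchronized stereo pair via triangulation: Z = f * B / d.
f_px = 3800.0         # focal length expressed in pixels (hypothetical)
baseline_m = 0.077    # stereo baseline: the camera's nominal 77 mm
disparity_px = 29.26  # matched feature offset between left and right images (hypothetical)

depth_m = f_px * baseline_m / disparity_px
print(round(depth_m, 2))  # → 10.0
```

The short baseline is why accuracy degrades quickly with distance: depth error grows roughly with Z²/(f·B).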

  1. Fast measurement of temporal noise of digital camera's photosensors

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

Currently photo- and videocameras are widespread parts of both scientific experimental setups and consumer applications. They are used in optics, radiophysics, astrophotography, chemistry, and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and videocameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise comprises the random component, while spatial noise comprises the pattern component. The spatial component is usually several times lower in magnitude than the temporal one, so to a first approximation spatial noise may be neglected. Earlier we proposed a modification of the automatic segmentation of non-uniform targets (ASNT) method for measurement of the temporal noise of photo- and videocameras. Only two frames are sufficient for noise measurement with the modified method, so the proposed ASNT modification should allow fast and accurate measurement of temporal noise. In this paper, we estimated the light and dark temporal noise of four cameras of different types using the modified ASNT method with only several frames. These cameras are: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PLB781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. We also measured the elapsed time for processing the shots used for temporal noise estimation. The results demonstrate that the dependence of a camera's full temporal noise on signal value can be obtained quickly with the proposed ASNT modification.
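
A generic two-frame temporal-noise estimate (not necessarily the authors' ASNT procedure) illustrates why two frames suffice: the static scene, including any pattern noise, cancels in the frame difference, leaving √2 times the temporal noise:

```python
import numpy as np

rng = np.random.default_rng(42)
true_sigma = 3.0
scene = rng.uniform(100, 150, (480, 640))            # static scene signal
frame1 = scene + rng.normal(0, true_sigma, scene.shape)
frame2 = scene + rng.normal(0, true_sigma, scene.shape)

# The fixed scene (and any fixed-pattern noise) cancels in the difference;
# the difference of two independent noise fields has sqrt(2) the std of one.
sigma_est = np.std(frame1 - frame2) / np.sqrt(2)
print(round(sigma_est, 1))  # → 3.0
```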

  2. Two Methods for Self Calibration of Digital Camera

    NASA Astrophysics Data System (ADS)

    Sampath, A.; Moe, D.; Christopherson, J.

    2012-07-01

    Photogrammetric mapping using Commercial off-the-Shelf (COTS) cameras is becoming more popular, and their popularity is augmented by the increasing use of Unmanned Aerial Vehicles (UAVs) as mapping platforms. The mapping precision of these methods can be increased by using a calibrated camera. The USGS/EROS has developed an inexpensive, easy-to-use method, particularly suited to calibrating short-focal-length cameras. The method builds on a self-calibration procedure developed for the USGS EROS Data Center by Pictometry (and augmented by Dr. C. S. Fraser) that uses a series of coded targets. These coded targets form different patterns that are imaged from nine different locations with differing camera orientations. A free-network solution using the collinearity equations is used to determine the calibration parameters. For the smaller-focal-length COTS cameras, the USGS has developed a procedure that uses a small prototype box containing these coded targets. The design of the box is discussed, along with best practices for the calibration procedure. Calibration parameters obtained using the box are compared with parameters obtained using more established standard procedures.
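    Besides focal length and principal point, a self-calibration bundle adjustment of this kind typically estimates lens distortion coefficients. As a point of reference (not the paper's code; the coefficient values are illustrative), the radial part of the widely used Brown-Conrady model looks like this:

    ```python
    def radial_distortion(x, y, k1, k2):
        """Apply the radial terms of the Brown-Conrady lens model to
        normalized image coordinates (x, y). k1 and k2 are the kind of
        coefficients a self-calibration adjustment estimates alongside
        focal length and principal point."""
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        return x * scale, y * scale

    # A point far from the principal point is displaced noticeably
    # (barrel distortion for negative k1):
    print(radial_distortion(0.5, 0.5, -0.1, 0.01))   # approx (0.476, 0.476)
    ```

    Short-focal-length COTS lenses tend to have large distortion, which is one reason a dedicated calibration procedure pays off.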

  3. USGS QA Plan: Certification of digital airborne mapping products (1)

    USGS Publications Warehouse

    Christopherson, J.

    2007-01-01

    To facilitate acceptance of new digital technologies in aerial imaging and mapping, the US Geological Survey (USGS) and its partners have launched a Quality Assurance (QA) Plan for Digital Aerial Imagery. This should provide a foundation for the quality of digital aerial imagery and products. It introduces broader considerations regarding processes employed by aerial flyers in collecting, processing and delivering data, and provides training and information for US producers and users alike.

  4. DigiCam: fully digital compact camera for SST-1M telescope

    NASA Astrophysics Data System (ADS)

    Aguilar, J. A.; Bilnik, W.; Bogacz, L.; Bulik, T.; Christov, A.; della Volpe, D.; Dyrda, M.; Frankowski, A.; Grudzinska, M.; Grygorczuk, J.; Heller, M.; Idźkowski, B.; Janiak, M.; Jamrozy, M.; Karczewski, M.; Kasperek, J.; Lyard, E.; Marszałek, A.; Michałowski, J.; Moderski, R.; Montaruli, T.; Neronov, A.; Nicolau-Kukliński, J.; Niemiec, J.; Ostrowski, M.; Paśko, P.; Płatos, Ł.; Prandini, E.; Pruchniewicz, R.; Rafalski, J.; Rajda, P. J.; Rameez, M.; Rataj, M.; Rupiński, M.; Rutkowski, K.; Seweryn, K.; Sidz, M.; Stawarz, Ł.; Stodulska, M.; Stodulski, M.; Tokarz, M.; Toscano, S.; Troyano Pujadas, I.; Walter, R.; Wawer, P.; Wawrzaszek, R.; Wiśniewski, L.; Zietara, K.; Ziółkowski, P.; Żychowski, P.

    2014-08-01

    The single-mirror Small Size Telescope (SST-1M), being built by a sub-consortium of Polish and Swiss institutions of the CTA Consortium, will be equipped with a fully digital camera with a compact photodetector plane based on silicon photomultipliers. The internal trigger-signal transmission overhead will be kept at a low level by a high degree of integration, achieved by massively deploying state-of-the-art multi-gigabit transceivers: from the flash ADCs, through the internal data and trigger-signal transmission over backplanes and cables, to the camera's 10 Gb/s Ethernet links to the server. This approach allows the size and weight of the camera to be fitted exactly to the SST-1M's needs while retaining the flexibility of a fully digital design, and it offers low power consumption, high reliability and long lifetime. The concept of the camera is described, along with some construction details and performance results.

  5. Photogrammetry of a 5m Inflatable Space Antenna With Consumer Digital Cameras

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; Giersch, Louis R.; Quagliaroli, Jessica M.

    2000-01-01

    This paper discusses photogrammetric measurements of a 5m-diameter inflatable space antenna using four Kodak DC290 (2.1 megapixel) digital cameras. The study had two objectives: 1) Determine the photogrammetric measurement precision obtained using multiple consumer-grade digital cameras and 2) Gain experience with new commercial photogrammetry software packages, specifically PhotoModeler Pro from Eos Systems, Inc. The paper covers the eight steps required using this hardware/software combination. The baseline data set contained four images of the structure taken from various viewing directions. Each image came from a separate camera. This approach simulated the situation of using multiple time-synchronized cameras, which will be required in future tests of vibrating or deploying ultra-lightweight space structures. With four images, the average measurement precision for more than 500 points on the antenna surface was less than 0.020 inches in-plane and approximately 0.050 inches out-of-plane.

  6. Perspective Intensity Images for Co-Registration of Terrestrial Laser Scanner and Digital Camera

    NASA Astrophysics Data System (ADS)

    Liang, Yubin; Qiu, Yan; Cui, Tiejun

    2016-06-01

    Co-registration of a terrestrial laser scanner and a digital camera has been an important topic of research, since reconstruction of visually appealing and measurable models of the scanned objects can be achieved by using both point clouds and digital images. This paper presents an approach for co-registration of a terrestrial laser scanner and a digital camera. A perspective intensity image of the point cloud is first generated by using the collinearity equation. Then corner points are extracted from the generated perspective intensity image and the camera image. The fundamental matrix F is then estimated using several interactively selected tie points and used to obtain more matches with RANSAC. The 3D coordinates of all the matched tie points are directly obtained or estimated using the least squares method. The robustness and effectiveness of the presented methodology are demonstrated by the experimental results. Methods presented in this work may also be used for automatic registration of terrestrial laser scanning point clouds.
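    The first step, rendering the point cloud into a perspective intensity image through the collinearity (pinhole projection) equations, can be sketched as follows. This is our minimal illustration, not the paper's implementation; the function name, the brightest-wins splatting rule, and all parameters are assumptions.

    ```python
    import numpy as np

    def perspective_intensity_image(points, intensities, R, t, f, width, height):
        """Project scanner points into a virtual pinhole camera and splat
        their intensities (brightest return wins per pixel)."""
        img = np.zeros((height, width))
        cam = (R @ points.T).T + t                 # world -> camera coordinates
        keep = cam[:, 2] > 0                       # keep points in front of the camera
        cam, vals = cam[keep], intensities[keep]
        u = np.round(f * cam[:, 0] / cam[:, 2] + width / 2).astype(int)
        v = np.round(f * cam[:, 1] / cam[:, 2] + height / 2).astype(int)
        for ui, vi, Ii in zip(u, v, vals):
            if 0 <= ui < width and 0 <= vi < height:
                img[vi, ui] = max(img[vi, ui], Ii)
        return img

    pts = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 10.0]])
    inten = np.array([5.0, 7.0])
    img = perspective_intensity_image(pts, inten, np.eye(3), np.zeros(3), 100.0, 64, 64)
    print(img[32, 32], img[32, 42])   # 5.0 7.0
    ```

    Because the synthetic image shares the camera's perspective geometry, corners extracted from it can be matched directly against corners in the photograph.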

  7. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multiple spectral images is less than 2.5 pixels. PMID:26205264
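    The homography registration model at the center of this pipeline can be illustrated with a Direct Linear Transform (DLT) estimate from point correspondences. This is a generic textbook sketch under our own assumptions, not the paper's code; in the paper the correspondences come from SIFT matching filtered by RANSAC, while here four are hand-picked.

    ```python
    import numpy as np

    def homography_dlt(src, dst):
        """Estimate H (3x3, dst ~ H @ src in homogeneous coordinates)
        from >= 4 correspondences with the Direct Linear Transform."""
        A = []
        for (x, y), (u, v) in zip(src, dst):
            A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        H = Vt[-1].reshape(3, 3)        # null-space vector of the design matrix
        return H / H[2, 2]

    def warp(H, pt):
        p = H @ np.array([pt[0], pt[1], 1.0])
        return p[:2] / p[2]

    # Four correspondences standing in for SIFT+RANSAC matches;
    # here the two bands are related by a pure shift of (2, 3):
    src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    dst = [(2.0, 3.0), (3.0, 3.0), (2.0, 4.0), (3.0, 4.0)]
    H = homography_dlt(src, dst)
    print(warp(H, (0.5, 0.5)))   # approximately [2.5, 3.5]
    ```

    Once H is known for each band pair, every band can be resampled into a common frame, which is what the sub-2.5-pixel alignment figure measures.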

  8. 2010 A Digital Odyssey: Exploring Document Camera Technology and Computer Self-Efficacy in a Digital Era

    ERIC Educational Resources Information Center

    Hoge, Robert Joaquin

    2010-01-01

    Within the sphere of education, navigating throughout a digital world has become a matter of necessity for the developing professional, as with the advent of Document Camera Technology (DCT). This study explores the pedagogical implications of implementing DCT; to see if there is a relationship between teachers' comfort with DCT and to the…

  9. Applications of the Lambert W function to analyze digital camera sensors

    NASA Astrophysics Data System (ADS)

    Villegas, Daniel

    2014-05-01

    The Lambert W function is applied via Maple to analyze the operation of modern digital camera sensors. The Lambert W function has been applied previously to understand the functioning of diodes and solar cells, and the parallelism between the physics of solar cells and digital camera sensors is exploited here. Digital camera sensors use p-n photodiodes, and such photodiodes can be studied using the Lambert W function. In general, the bulk transformation of light into photocurrent is described by an equivalent circuit, which determines a dynamical equation to be solved using the Lambert W function. Specifically, in a camera sensor, the measurement of light intensity filtered through color filters creates a measurable photocurrent proportional to the image-point intensity, and this photocurrent is given in terms of the Lambert W function. It is claimed that the drift between neighboring photocells at long wavelengths affects the ability to resolve an image, and that this drift can be represented effectively using the Lambert W function. It is also conjectured that the recombination of charge carriers in the digital sensor is connected to the notion of "noise" in photography, and that such "noise" could be described by certain combinations of Lambert W functions. Finally, it is suggested that the bias, which varies the width of the depletion zone, is related to the ISO "speed" of the camera sensor, and that this relationship could be described using Lambert W functions.
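    The standard example of this technique, which the abstract points to via solar cells, is the explicit Lambert-W solution of the single-diode equation (the Jain & Kapoor form). The sketch below uses our own Newton-iteration W and illustrative solar-cell-scale parameters, not values from any sensor datasheet:

    ```python
    import math

    def lambert_w(x, tol=1e-12):
        """Principal branch W0(x) for x >= 0, by Newton iteration on w*e^w = x."""
        w = math.log1p(x)                    # reasonable starting guess for x >= 0
        for _ in range(100):
            ew = math.exp(w)
            step = (w * ew - x) / (ew * (w + 1.0))
            w -= step
            if abs(step) < tol:
                break
        return w

    def diode_current(V, Iph, I0, a, Rs, Rsh):
        """Explicit photocurrent I(V) solving the implicit single-diode model
        I = Iph - I0*(exp((V+I*Rs)/a) - 1) - (V+I*Rs)/Rsh  via Lambert W."""
        arg = (Rs * I0 * Rsh) / (a * (Rs + Rsh)) * math.exp(
            Rsh * (Rs * Iph + Rs * I0 + V) / (a * (Rs + Rsh)))
        return (Rsh * (Iph + I0) - V) / (Rs + Rsh) - (a / Rs) * lambert_w(arg)

    # Illustrative parameters (solar-cell scale):
    I = diode_current(V=0.4, Iph=5.0, I0=1e-10, a=0.05, Rs=0.1, Rsh=100.0)
    # Verify the explicit solution satisfies the implicit diode equation:
    resid = I - (5.0 - 1e-10 * (math.exp((0.4 + I * 0.1) / 0.05) - 1.0)
                 - (0.4 + I * 0.1) / 100.0)
    print(abs(resid) < 1e-6)   # True
    ```

    The same closed form is what makes the photocurrent "given in terms of the Lambert W function" rather than requiring iterative solution of the implicit circuit equation.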

  10. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.

  11. Estimation of spectral distribution of sky radiance using a commercial digital camera.

    PubMed

    Saito, Masanori; Iwabuchi, Hironobu; Murata, Isao

    2016-01-10

    Methods for estimating spectral distribution of sky radiance from images captured by a digital camera and for accurately estimating spectral responses of the camera are proposed. Spectral distribution of sky radiance is represented as a polynomial of the wavelength, with coefficients obtained from digital RGB counts by linear transformation. The spectral distribution of radiance as measured is consistent with that obtained by spectrometer and radiative transfer simulation for wavelengths of 430-680 nm, with standard deviation below 1%. Preliminary applications suggest this method is useful for detecting clouds and studying the relation between irradiance at the ground and cloud distribution.
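    The core idea, polynomial-in-wavelength radiance with coefficients obtained from RGB counts by a linear transformation, can be sketched with synthetic data (our illustration; in the paper the transform is calibrated against spectrometer measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    wavelengths = np.linspace(430.0, 680.0, 26)            # nm, the validated range
    powers = np.vander((wavelengths - 430.0) / 250.0, 4)   # cubic polynomial basis

    M_true = rng.normal(size=(3, 4))        # ground-truth RGB -> coefficients map
    rgb_train = rng.uniform(0.0, 1.0, (50, 3))
    coeffs_train = rgb_train @ M_true       # training "measurements"

    # Recover the linear transform by least squares:
    M_est, *_ = np.linalg.lstsq(rgb_train, coeffs_train, rcond=None)

    rgb_new = np.array([0.3, 0.5, 0.2])
    spectrum = powers @ (rgb_new @ M_est)   # estimated radiance on the grid
    print(np.allclose(spectrum, powers @ (rgb_new @ M_true)))   # True
    ```

    With real data the fit is of course not exact; the paper's reported agreement with spectrometer and radiative-transfer results is at the 1% level over 430-680 nm.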

  12. Accurate measurement of spatial noise portraits of photosensors of digital cameras

    NASA Astrophysics Data System (ADS)

    Cheremkhin, P. A.; Evtikhiev, N. N.; Krasnov, V. V.; Kulakov, M. N.; Starikov, R. S.

    2016-08-01

    A method for accurate measurement of light and dark spatial noise portraits of photosensors is described. The method consists of four steps: creation of spatially homogeneous illumination; shooting of light and dark frames; digital processing; and filtering. Unlike the standard technique, this method uses iterative creation of spatially homogeneous illumination with a display, compensation of the photosensor's dark spatial noise portrait, and an improved procedure for elimination of dark temporal noise. Portraits of light and dark spatial noise of the photosensor of a scientific digital camera were obtained. Characteristics of the measured portraits were compared with the photo-response and dark-signal non-uniformity values of the camera's photosensor.
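    The underlying principle of any spatial-noise "portrait" is that averaging many frames suppresses temporal noise (roughly as 1/sqrt(N)) while the fixed per-pixel pattern survives. A minimal synthetic illustration (ours; the paper adds homogeneous illumination, dark-frame compensation, and filtering on top of this):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    true_pattern = rng.normal(0.0, 5.0, (64, 64))          # fixed spatial noise
    frames = [100.0 + true_pattern + rng.normal(0.0, 2.0, (64, 64))
              for _ in range(400)]                          # temporal sigma = 2

    mean_frame = np.mean(frames, axis=0)
    portrait = mean_frame - mean_frame.mean()               # per-pixel deviation map
    residual = np.std(portrait - (true_pattern - true_pattern.mean()))
    print(residual < 0.15)   # True: temporal noise suppressed to ~2/sqrt(400) = 0.1
    ```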

  13. Estimation of spectral distribution of sky radiance using a commercial digital camera.

    PubMed

    Saito, Masanori; Iwabuchi, Hironobu; Murata, Isao

    2016-01-10

    Methods for estimating spectral distribution of sky radiance from images captured by a digital camera and for accurately estimating spectral responses of the camera are proposed. Spectral distribution of sky radiance is represented as a polynomial of the wavelength, with coefficients obtained from digital RGB counts by linear transformation. The spectral distribution of radiance as measured is consistent with that obtained by spectrometer and radiative transfer simulation for wavelengths of 430-680 nm, with standard deviation below 1%. Preliminary applications suggest this method is useful for detecting clouds and studying the relation between irradiance at the ground and cloud distribution. PMID:26835780

  14. In-plane displacement and strain measurements using a camera phone and digital image correlation

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2014-05-01

    In-plane displacement and strain measurements of planar objects by processing the digital images captured by a camera phone using digital image correlation (DIC) are performed in this paper. As a convenient communication tool for everyday use, the principal advantages of a camera phone are its low cost, easy accessibility, and compactness. However, when used as a two-dimensional DIC system for mechanical metrology, the assumed imaging model of a camera phone may be slightly altered during the measurement process due to camera misalignment, imperfect loading, sample deformation, and temperature variations of the camera phone, which can produce appreciable errors in the measured displacements. In order to obtain accurate DIC measurements using a camera phone, the virtual displacements caused by these issues are first identified using an unstrained compensating specimen and then corrected by means of a parametric model. The proposed technique is first verified using in-plane translation and out-of-plane translation tests. Then, it is validated through a determination of the tensile strains and elastic properties of an aluminum specimen. Results of the present study show that accurate DIC measurements can be conducted using a common camera phone provided that an adequate correction is employed.
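    At the core of DIC is a correlation criterion for matching subsets between the reference and deformed images. A common choice is zero-normalized cross-correlation (ZNCC), which is insensitive to the offset and scale changes in illumination that a handheld phone setup is prone to. A minimal sketch (ours, not the paper's implementation):

    ```python
    import numpy as np

    def zncc(a, b):
        """Zero-normalized cross-correlation between two equal-size subsets.
        Invariant to affine intensity changes b -> s*b + o."""
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

    rng = np.random.default_rng(4)
    subset = rng.uniform(0.0, 255.0, (21, 21))   # a reference speckle subset
    # Same subset under a brightness/contrast change still correlates perfectly:
    print(round(zncc(subset, 1.7 * subset + 30.0), 6))   # 1.0
    ```

    The virtual displacements the paper corrects for are a separate, geometric error source; the compensating-specimen step removes them after the correlation-based matching.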

  15. Lights, Camera, Reflection! Digital Movies: A Tool for Reflective Learning

    ERIC Educational Resources Information Center

    Genereux, Annie Prud'homme; Thompson, William A.

    2008-01-01

    At the end of a biology course entitled Ecology, Evolution, and Genetics, students were asked to consider how their learning experience had changed their perception of either ecology or genetics. Students were asked to express their thoughts in the form of a "digital story" using readily available software to create movies for the purpose of…

  16. The trustworthy digital camera: Restoring credibility to the photographic image

    NASA Astrophysics Data System (ADS)

    Friedman, Gary L.

    1994-02-01

    The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continues to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.
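    The signature scheme described works by signing a cryptographic hash of the image inside the camera. A toy RSA sketch of the idea (textbook-sized primes for illustration only; real systems use 2048-bit keys, padding schemes such as PSS, and the full hash, and this is our example, not Friedman's protocol):

    ```python
    import hashlib

    # Textbook RSA key (p=61, q=53); requires Python 3.8+ for pow(e, -1, phi).
    p, q = 61, 53
    n = p * q                 # modulus, 3233
    phi = (p - 1) * (q - 1)   # 3120
    e = 17                    # public exponent
    d = pow(e, -1, phi)       # private exponent

    def sign(data: bytes) -> int:
        """Camera-side: hash the image and sign the hash with the private key."""
        h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
        return pow(h, d, n)

    def verify(data: bytes, sig: int) -> bool:
        """Anyone can check the signature with the public key (e, n)."""
        h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
        return pow(sig, e, n) == h

    img = b"raw image bytes from the camera"
    s = sign(img)
    print(verify(img, s))          # True: intact image verifies
    print(verify(img + b"x", s))   # False: any alteration changes the hash
    ```

    Because only the camera holds the private key, a valid signature establishes both that the image is unaltered and which camera produced it.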

  17. The trustworthy digital camera: Restoring credibility to the photographic image

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L.

    1994-01-01

    The increasing sophistication of computers has made digital manipulation of photographic images, as well as other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continues to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from a concept called digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of 'electronic original' is no longer meaningful.

  18. Airborne multispectral identification of individual cotton plants using consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...

  19. Digital data from the Great Sand Dunes airborne gravity gradient survey, south-central Colorado

    USGS Publications Warehouse

    Drenth, B.J.; Abraham, J.D.; Grauch, V.J.S.; Labson, V.F.; Hodges, G.

    2013-01-01

    This report contains digital data and supporting explanatory files describing data types, data formats, and survey procedures for a high-resolution airborne gravity gradient (AGG) survey at Great Sand Dunes National Park, Alamosa and Saguache Counties, south-central Colorado. In the San Luis Valley, the Great Sand Dunes survey covers a large part of Great Sand Dunes National Park and Preserve. The data described were collected from a high-resolution AGG survey flown in February 2012, by Fugro Airborne Surveys Corp., on contract to the U.S. Geological Survey. Scientific objectives of the AGG survey are to investigate the subsurface structural framework that may influence groundwater hydrology and seismic hazards, and to investigate AGG methods and resolution using different flight specifications. Funding was provided by an airborne geophysics training program of the U.S. Department of Defense's Task Force for Business & Stability Operations.

  20. A powerful ethernet interface module for digital camera control

    NASA Astrophysics Data System (ADS)

    Amato, Stephen M.; Geary, John C.

    2012-09-01

    We have found a commercially available Ethernet interface module with sufficient on-board resources to handle the timing-generation tasks required by the digital imaging systems found in astronomy. In addition to providing a high-bandwidth Ethernet interface to the controller, it can largely replace the need for special-purpose timing circuitry. Examples for use with both CCD and CMOS imagers are provided.

  1. Film cameras or digital sensors? The challenge ahead for aerial imaging

    USGS Publications Warehouse

    Light, D.L.

    1996-01-01

    Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at an 11 µm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid-state charge-coupled-device linear and area arrays can yield quality resolution (7 to 12 µm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and assembled into a homogeneous scene acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems shows that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system of the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
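    The quoted pixel count follows from the frame geometry. A back-of-envelope check, assuming the standard 230 mm (9 x 9 inch) aerial film frame (the frame size is our assumption, not stated in the abstract):

    ```python
    # 230 mm frame side scanned at an 11-micrometer spot size:
    frame_mm = 230.0
    spot_um = 11.0
    pixels_per_side = frame_mm * 1000.0 / spot_um   # ~20,900 pixels per side
    total_mp = pixels_per_side ** 2 / 1e6
    print(round(total_mp))   # ~437 megapixels, consistent with the ~432 M quoted
    ```

    Note also that an 11 µm scanning spot slightly oversamples 39 lp/mm film, whose Nyquist-limited pixel pitch would be 1 / (2 x 39 lp/mm), i.e. about 12.8 µm.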

  2. 75 FR 7519 - In the Matter of Certain Digital Cameras; Notice of Commission Determination Not To Review an...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... ] on February 27, 2009 and March 11, 2009. 74 FR 12377-78 (Mar. 24, 2009). The complaint, as... COMMISSION In the Matter of Certain Digital Cameras; Notice of Commission Determination Not To Review an... importation, or the sale within the United States after importation of certain digital cameras by reason...

  3. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
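    The two corrections the study found decisive, flat-field division to remove vignetting followed by a normalized index to cancel overall illumination, can be sketched with synthetic bands (our illustration; the arrays stand in for the camera's red and NIR channels and the falloff profile is invented):

    ```python
    import numpy as np

    def correct_vignetting(band, flat):
        """Divide by the normalized flat-field (an image of a uniform target)."""
        return band * (flat.max() / flat)

    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    h, w = 64, 64
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2
    falloff = 1.0 - 0.3 * r2                 # radial brightness falloff (vignetting)

    red_true, nir_true = 0.2, 0.6            # uniform canopy reflectances
    red = red_true * falloff                 # what the camera records
    nir = nir_true * falloff
    index = ndvi(correct_vignetting(nir, falloff),
                 correct_vignetting(red, falloff))
    print(np.allclose(index, 0.5))   # True: (0.6-0.2)/(0.6+0.2) everywhere
    ```

    In this toy case the falloff is identical in both bands, so the ratio index alone would cancel it; in practice vignetting differs per band (the study found it strongest for the modified camera), which is why the flat-field step matters.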

  4. 77 FR 43858 - Certain Mobile Telephones and Wireless Communication Devices Featuring Digital Cameras, and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... COMMISSION Certain Mobile Telephones and Wireless Communication Devices Featuring Digital Cameras, and... 4, 2010. 75 FR 8112. The complaint alleged violations of section 337 of the Tariff Act of 1930 in... States after importation of certain mobile telephones and wireless communication devices...

  5. Estimating the infrared radiation wavelength emitted by a remote control device using a digital camera

    NASA Astrophysics Data System (ADS)

    Catelli, Francisco; Giovannini, Odilon; Dall Agnol Bolzan, Vicente

    2011-03-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made.
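    The estimate rests on the grating equation d sin(theta_m) = m * lambda: at small angles the first-order fringe offset on the sensor is proportional to wavelength, so with both patterns in a single photo the ratio of offsets gives the unknown wavelength. A worked sketch with illustrative (not measured) offsets:

    ```python
    # Offsets are in arbitrary image units; only their ratio matters.
    lambda_laser = 650e-9   # red reference laser (assumed wavelength)
    x_laser = 14.6          # first-order offset of the laser pattern
    x_remote = 21.2         # first-order offset of the remote-control pattern
    lambda_remote = lambda_laser * x_remote / x_laser
    print(round(lambda_remote * 1e9))   # ~944 nm, typical of IR remote LEDs
    ```

    Consumer camera sensors are sensitive well into the near infrared, which is what makes the remote control's pattern visible in the photo at all.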

  6. Developing Mental Imagery Using a Digital Camera: A Study of Adult Vocational Training

    ERIC Educational Resources Information Center

    Ryba, Ken; Selby, Linda; Brown, Roy

    2004-01-01

    This study was undertaken to explore the use of a digital camera for mental imagery training of a vocational task with two young adult men with Down syndrome. The results indicate that these particular men benefited from the use of a collaborative training process that involved mental imagery for learning a series of photocopying operations. An…

  7. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  8. Multipoint laser Doppler vibrometry using holographic optical elements and a CMOS digital camera.

    PubMed

    Connelly, Michael J; Szecówka, Przemyslaw M; Jallapuram, Raghavendra; Martin, Suzanne; Toal, Vincent; Whelan, Maurice P

    2008-02-15

    A laser Doppler vibrometer (LDV) is described in which holographic optical elements are used to provide the interferometer reference and object illumination beams. A complementary metal-oxide semiconductor camera, incorporating a digital signal processor, is used to carry out real-time signal processing of the interferometer output to allow multipoint LDV to be implemented.

  9. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  10. Estimating the Infrared Radiation Wavelength Emitted by a Remote Control Device Using a Digital Camera

    ERIC Educational Resources Information Center

    Catelli, Francisco; Giovannini, Odilon; Bolzan, Vicente Dall Agnol

    2011-01-01

    The interference fringes produced by a diffraction grating illuminated with radiation from a TV remote control and a red laser beam are, simultaneously, captured by a digital camera. Based on an image with two interference patterns, an estimate of the infrared radiation wavelength emitted by a TV remote control is made. (Contains 4 figures.)

  11. Color calibration of a CMOS digital camera for mobile imaging

    NASA Astrophysics Data System (ADS)

    Eliasson, Henrik

    2010-01-01

    As white balance algorithms employed in mobile phone cameras become increasingly sophisticated by using, e.g., elaborate white-point estimation methods, a proper color calibration is necessary. Without such a calibration, the estimation of the light source for a given situation may go wrong, giving rise to large color errors. At the same time, the demands for efficiency in the production environment require the calibration to be as simple as possible. Thus it is important to find the correct balance between image quality and production efficiency requirements. The purpose of this work is to investigate camera color variations using a simple model where the sensor and IR filter are specified in detail. As input to the model, spectral data of the 24-color Macbeth Colorchecker was used. This data was combined with the spectral irradiance of mainly three different light sources: CIE A, D65 and F11. The sensor variations were determined from a very large population from which 6 corner samples were picked out for further analysis. Furthermore, a set of 100 IR filters were picked out and measured. The resulting images generated by the model were then analyzed in the CIELAB space and color errors were calculated using the ΔE94 metric. The results of the analysis show that the maximum deviations from the typical values are small enough to suggest that a white balance calibration is sufficient. Furthermore, it is also demonstrated that the color temperature dependence is small enough to justify the use of only one light source in a production environment.
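    The calibration step this analysis feeds into is typically a color-correction matrix fitted over a chart's patches. A minimal sketch of that fit (our synthetic patch data and matrix, not values from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    target = rng.uniform(0.0, 1.0, (24, 3))          # 24 reference patch values
    ccm_true = np.array([[ 1.6, -0.4, -0.2],
                         [-0.3,  1.5, -0.2],
                         [-0.1, -0.5,  1.6]])        # invented ground-truth matrix
    raw = target @ np.linalg.inv(ccm_true)           # what the sensor "measured"

    # Fit the 3x3 matrix mapping raw sensor RGB to target RGB by least squares:
    ccm, *_ = np.linalg.lstsq(raw, target, rcond=None)
    print(np.allclose(raw @ ccm, target))   # True: exact recovery on clean data
    ```

    With real measurements the residuals after the fit would then be expressed as ΔE94 color errors, as in the study's analysis.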

  12. Development of an XYZ Digital Camera with Embedded Color Calibration System for Accurate Color Acquisition

    NASA Astrophysics Data System (ADS)

    Kretkowski, Maciej; Jablonski, Ryszard; Shimodaira, Yoshifumi

    Acquisition of accurate colors is important in the modern era of widespread exchange of electronic multimedia. The variety of device-dependent color spaces causes trouble with accurate color reproduction. In this paper we present an outline of our digital camera system with device-independent output formed from tristimulus XYZ values. The outstanding accuracy and fidelity of the acquired color is achieved in our system by employing an embedded color calibration system based on an emissive device that generates reference calibration colors with user-defined spectral distribution and chromaticity coordinates. The system was tested by calibrating the camera using 24 reference colors spectrally reproduced from the 24 color patches of the Macbeth Chart. The average color difference (CIEDE2000) was found to be ΔE = 0.83, which is an outstanding result compared to commercially available digital cameras.

  13. Digital image processing for the rectification of television camera distortions.

    NASA Technical Reports Server (NTRS)

    Rindfleisch, T. C.

    1971-01-01

    All television systems introduce distortions into the imagery they record which influence the results of quantitative photometric and geometric measurements. Digital computer techniques provide a powerful approach to the calibration and rectification of these systematic effects. Nonlinear as well as linear problems can be attacked with flexibility and precision. Methods which have been developed and applied for the removal of structured system noises and the correction of photometric, geometric, and resolution distortions in vidicon systems are briefly described. Examples are given of results derived primarily from the Mariner Mars 1969 television experiment.

  14. Metric Potential of a 3D Measurement System Based on Digital Compact Cameras

    PubMed Central

    Sanz-Ablanedo, Enoc; Rodríguez-Pérez, José Ramón; Arias-Sánchez, Pedro; Armesto, Julia

    2009-01-01

This paper presents an optical measuring system based on low-cost, high-resolution digital cameras. Once the cameras are synchronised, the portable and adjustable system can be used to observe living beings, bodies in motion, or deformations of very different sizes. Each of the cameras has been modelled individually and studied with regard to the photogrammetric potential of the system. We have investigated the photogrammetric precision obtained from the crossing of rays, the repeatability of results, and the accuracy of the coordinates obtained. Systematic and random errors are identified in assessing whether the precision of the system can validly be defined from the crossing of rays or from marking residuals in images. The results clearly demonstrate the capability of a low-cost multiple-camera system to measure with sub-millimetre precision. PMID:22408520

  15. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  16. Submersible digital holographic cameras and their application to marine science

    NASA Astrophysics Data System (ADS)

    Watson, John

    2011-09-01

Digital holography has been growing in importance for application to environmental studies in the oceans and lakes of the world. With ``classical'' photoholography achieving imaging resolutions of a few micrometres and recording volumes of up to a cubic metre, several ``holocameras'' were developed and deployed for underwater imaging of plankton and other marine particles. For in-water deployment, however, the weight and size of these instruments restricted their use on advanced observation platforms such as remotely operated vehicles, and limited operational depth to a few hundred meters. Advances made in digital recording on electronic sensors, coupled with numerical reconstruction, led to the development of smaller, rugged holocameras. This freed holography from many of its constraints and allowed rapid capture and storage of images and holographic video recording of moving objects. Although holography is not the only optical method applicable underwater, its ability to record full-field, high-resolution, distortion-free images in situ, from which particle dimensions, distribution and dynamics can be extracted, is hard to match. The current state-of-the-art in underwater holography is discussed, with an outline of some submersible holocameras. We describe one such system, eHoloCam, in more depth and present results from its deployment in the North Sea.

  17. Comparison of Kodak Professional Digital Camera System images to conventional film, still video, and freeze-frame images

    NASA Astrophysics Data System (ADS)

    Kent, Richard A.; McGlone, John T.; Zoltowski, Norbert W.

    1991-06-01

Electronic cameras provide near-real-time image evaluation with the benefits of digital storage for rapid transmission or computer processing and enhancement of images. But how does their image quality compare to that of conventional film? A standard Nikon F-3™ 35 mm SLR camera was transformed into an electro-optical camera by replacing the film back with Kodak's KAF-1400V (or KAF-1300L) megapixel CCD array detector back and a processing accessory. Images taken with these Kodak electronic cameras were compared to those taken on conventional films and with several still video cameras. Quantitative and qualitative methods were used to compare images from these camera systems. Images captured on conventional analog video systems provide a maximum of 450-500 TV lines of resolution, depending upon the camera resolution, storage method, and viewing-system resolution. The Kodak Professional Digital Camera System™ exceeded this resolution and more closely approached that of film.

  18. Fundamentals of in situ digital camera methodology for water quality monitoring of coast and ocean.

    PubMed

    Goddijn-Murphy, Lonneke; Dailloux, Damien; White, Martin; Bowers, Dave

    2009-01-01

Conventional digital cameras, the Nikon Coolpix 885® and the SeaLife ECOshot®, were used as in situ optical instruments for water quality monitoring. Measured response spectra showed that these digital cameras are essentially three-band radiometers. The response values in the red, green and blue bands, quantified by the RGB values of digital images of the water surface, were comparable to measurements of irradiance levels at red, green and cyan/blue wavelengths of water-leaving light. Different systems were deployed to capture upwelling light from below the surface while eliminating direct surface reflection. Relationships between RGB ratios of water surface images and water quality parameters were found to be consistent with previous measurements using more traditional narrow-band radiometers. This paper focuses on the method used to acquire digital images, derive RGB values and relate the measurements to water quality parameters. Field measurements were obtained in Galway Bay, Ireland, and in the Southern Rockall Trough in the North Atlantic, where both yellow substance and chlorophyll concentrations were successfully assessed using the digital camera method.

  19. Effect of camera temperature variations on stereo-digital image correlation measurements.

    PubMed

    Pan, Bing; Shi, Wentao; Lubineau, Gilles

    2015-12-01

    In laboratory and especially non-laboratory stereo-digital image correlation (stereo-DIC) applications, the extrinsic and intrinsic parameters of the cameras used in the system may change slightly due to the camera warm-up effect and possible variations in ambient temperature. Because these camera parameters are generally calibrated once prior to measurements and considered to be unaltered during the whole measurement period, the changes in these parameters unavoidably induce displacement/strain errors. In this study, the effect of temperature variations on stereo-DIC measurements is investigated experimentally. To quantify the errors associated with camera or ambient temperature changes, surface displacements and strains of a stationary optical quartz glass plate with near-zero thermal expansion were continuously measured using a regular stereo-DIC system. The results confirm that (1) temperature variations in the cameras and ambient environment have a considerable influence on the displacements and strains measured by stereo-DIC due to the slightly altered extrinsic and intrinsic camera parameters; and (2) the corresponding displacement and strain errors correlate with temperature changes. For the specific stereo-DIC configuration used in this work, the temperature-induced strain errors were estimated to be approximately 30-50 με/°C. To minimize the adverse effect of camera temperature variations on stereo-DIC measurements, two simple but effective solutions are suggested.
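The reported sensitivity of roughly 30-50 με/°C can be characterized for a specific rig by fitting a line to apparent strain versus camera temperature on a stationary target. The data below are hypothetical, constructed only to mimic the order of magnitude reported in the abstract:

```python
import numpy as np

# Hypothetical record: camera temperature rise (°C) and apparent strain (με)
# measured on a stationary target; values chosen to mimic the ~30-50 με/°C
# sensitivity reported above.
temp = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
strain = 40.0 * temp + np.array([2.0, -1.5, 1.0, -0.5, 1.5, -1.0, 0.5])

# Least-squares line: strain ≈ slope * temp + offset
slope, offset = np.polyfit(temp, strain, 1)
print(f"temperature-induced strain error ≈ {slope:.1f} με/°C")
```

Such a per-system slope could then be used to subtract the temperature-induced drift from subsequent measurements, one of the simple corrections the authors allude to.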

  20. Arthropod eye-inspired digital camera with unique imaging characteristics

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-06-01

In nature, arthropods have a remarkably sophisticated class of imaging systems, with a hemispherical geometry, a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. There is great interest in building systems with similar geometries and properties because of their numerous potential applications. However, established semiconductor sensor technologies and optics are essentially planar, which poses great challenges for building such systems with hemispherical, compound apposition layouts. With the recent advancement of stretchable optoelectronics, we have successfully developed strategies to build a fully functional artificial apposition compound eye camera by combining optics, materials and mechanics principles. The strategies start with fabricating stretchable arrays of thin silicon photodetectors and elastomeric optical elements in planar geometries, which are then precisely aligned, integrated, and elastically transformed into hemispherical shapes. This imaging device demonstrates a nearly full hemispherical shape (about 160 degrees), with densely packed artificial ommatidia. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. We have illustrated key features of the operation of compound eyes through experimental imaging results and quantitative ray-tracing-based simulations. The general strategies shown in this development could be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  1. Optical design of high resolution and large format CCD airborne remote sensing camera on unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Cheng, Xiaowei; Shao, Jie

    2010-11-01

Unmanned aerial vehicle remote sensing (UAVRS) is low in cost, flexible in task arrangement, and automatic and intelligent in application; it has been widely used for mapping, surveillance, reconnaissance and city planning. Airborne remote sensing missions require sensors with both high resolution and large fields of view, and large-format CCD digital airborne imaging systems are now a reality. A refractive system was designed with the aid of Code V software to meet these requirements. It has a focal length of 150 mm, an F-number of 5.6, a waveband of 0.45-0.7 μm, and a field of view of 20°. The modulation transfer function is higher than 0.5 at 55 lp/mm, distortion is less than 0.1%, and image quality reaches the diffraction limit. The system, with its large-format CCD and wide field, can satisfy the demands of wide ground coverage and high resolution. With its simple structure, small size and light weight, the optical system is well suited to airborne remote sensing.

  2. Airborne digital holographic system for cloud particle measurements.

    PubMed

    Fugal, Jacob P; Shaw, Raymond A; Saw, Ewe Wei; Sergeyev, Aleksandr V

    2004-11-10

An in-line holographic system for in situ detection of atmospheric cloud particles [Holographic Detector for Clouds (HOLODEC)] has been developed and flown on the National Center for Atmospheric Research C-130 research aircraft. Clear holograms are obtained in daylight conditions at typical aircraft speeds of 100 m/s. The instrument is fully digital and is interfaced to a control and data-acquisition system in the aircraft via optical fiber. It is operable at temperatures below -30 °C and at typical cloud humidities. Preliminary data from the experiment show its utility for studies of the three-dimensional spatial distribution of cloud particles and ice crystal shapes.

  3. Self-calibration of digital aerial camera using combined orthogonal models

    NASA Astrophysics Data System (ADS)

    Babapour, Hadi; Mokhtarzade, Mehdi; Valadan Zoej, Mohamad Javad

    2016-07-01

The emergence of new digital aerial cameras and the diverse designs and technologies used in this type of camera require in-situ calibration. Self-calibration methods, e.g. the Fourier model, are primarily used; however, the additional parameters employed in such methods have not yet proved able to model the complex, multiple distortions present in digital aerial cameras. The present study proposes the Chebyshev-Fourier (CHF) and Jacobi-Fourier (JF) combined orthogonal models. The models are evaluated against multiple distortions using both simulated and real data, the latter derived from an UltraCam digital camera. The results indicate that the JF model is superior to the other methods; in the UltraCam scenario, for example, it improves planimetric and vertical accuracy over the Fourier model by 18% and 22%, respectively. Furthermore, reductions of 30% in external and 16% in internal correlation are obtained with this approach, which is very promising.

  4. Illuminant spectrum estimation using a digital color camera and a color chart

    NASA Astrophysics Data System (ADS)

    Shi, Junsheng; Yu, Hongfei; Huang, Xiaoqiao; Chen, Zaiqing; Tai, Yonghang

    2014-10-01

Illumination estimation is the main step in color constancy processing, and an important prerequisite for digital color image reproduction and many computer vision applications. In this paper, a method for estimating the illuminant spectrum is investigated using a digital color camera and a color chart whose spectral reflectance is known. The method is based on measuring the CIEXYZ values of the chart using the camera. The first step of the method is to obtain the camera's color-correction matrix and gamma values by photographing the chart under a standard illuminant. The second step is to photograph the chart under the illuminant to be estimated; the camera's inherent RGB values are converted to standard sRGB values and further to the CIEXYZ values of the chart. Based on the measured CIEXYZ values and the known spectral reflectance of the chart, the spectral power distribution (SPD) of the illuminant is estimated using Wiener estimation and smoothing estimation. To evaluate the performance of the method quantitatively, the goodness-of-fit coefficient (GFC) was used to measure the spectral match, and the CIELAB color difference metric was used to evaluate the color match between color patches under the estimated and actual SPDs. A simulated experiment was carried out to estimate CIE standard illuminants D50 and C using the X-Rite ColorChecker 24-color chart, and an actual experiment was carried out to estimate daylight and illuminant A using two consumer-grade cameras and the chart. The experimental results verified the feasibility of the method.
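The Wiener estimation step mentioned above has a standard closed form, ŝ = K_s Aᵀ (A K_s Aᵀ + K_n)⁻¹ c, where A maps the SPD to the camera measurements, K_s is a smoothness prior on the spectrum, and K_n is the measurement-noise covariance. A minimal numerical sketch with synthetic sensitivities and a synthetic illuminant (not the authors' actual data):

```python
import numpy as np

rng = np.random.default_rng(1)
n_wl = 31                      # 400-700 nm at 10 nm steps
n_obs = 24 * 3                 # 24 patches x 3 channels

# Synthetic system matrix A: each row combines a channel sensitivity
# with a patch reflectance, integrated over wavelength.
A = rng.uniform(0, 1, size=(n_obs, n_wl))

# Smooth "daylight-like" test illuminant
wl = np.linspace(0.0, 1.0, n_wl)
s_true = 0.6 + 0.4 * np.sin(2 * np.pi * wl)

c = A @ s_true                 # noiseless camera measurements

# First-order Markov prior: neighbouring wavelengths correlate
rho = 0.95
idx = np.arange(n_wl)
K_s = rho ** np.abs(idx[:, None] - idx[None, :])
K_n = 1e-6 * np.eye(n_obs)     # small measurement-noise covariance

# Wiener estimate: s_hat = K_s A^T (A K_s A^T + K_n)^(-1) c
s_hat = K_s @ A.T @ np.linalg.solve(A @ K_s @ A.T + K_n, c)

# Goodness-of-fit coefficient, the spectral-match metric used in the paper
gfc = (s_hat @ s_true) / (np.linalg.norm(s_hat) * np.linalg.norm(s_true))
print("GFC:", float(gfc))
```

With noiseless synthetic data the GFC approaches 1; with real camera data the same machinery trades off the smoothness prior against measurement noise.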

  5. On-chip digital noise reduction for integrated CMOS Cameras

    NASA Astrophysics Data System (ADS)

    Rullmann, Markus; Schluessler, Jens-Uwe; Schueffny, Rene

    2003-06-01

We propose an on-line noise reduction system especially designed for noisy CMOS image sensors. Image sequences from CMOS sensors are in general corrupted by two types of noise: temporal noise and fixed pattern noise (FPN). It is shown how the FPN component can be estimated from a sequence. We studied the theoretical performance of two approaches, called direct and indirect FPN estimation, and show that indirect estimation gives superior performance, both theoretically and in simulations. The FPN estimates can be used to improve image quality by compensating for the FPN, and we assess the quality of the estimates by the achievable SNR gains. Using those results, a dedicated filtering scheme has been designed that accomplishes both temporal noise reduction and FPN correction with a single noise filter. It allows signal gains of up to 12 dB and provides high visual quality. We further analyzed and optimized the memory size and bandwidth requirements of our scheme and conclude that a hardware implementation is feasible: the required memory size is 288 kB and the memory access rate is 70 MHz. Our algorithm allows the integration of noisy CMOS sensors with digital noise reduction and other circuitry in a system-on-chip solution.
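The FPN-compensation idea can be illustrated with a toy simulation: over a static sequence, the temporal mean isolates the per-pixel offsets, which can then be subtracted from every frame. This sketch uses synthetic noise figures and a flat scene, not the sensor model or the direct/indirect estimators analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
H, W, T = 64, 64, 200

scene = np.full((H, W), 100.0)                  # flat, static test scene
fpn = rng.normal(0.0, 5.0, size=(H, W))         # fixed per-pixel offsets
frames = scene + fpn + rng.normal(0.0, 3.0, size=(T, H, W))  # + temporal noise

# Estimate the FPN as the temporal mean, re-centred to zero spatial mean
# (assumes the true scene does not change over the sequence).
fpn_est = frames.mean(axis=0)
fpn_est -= fpn_est.mean()

corrected = frames - fpn_est

rms_before = np.sqrt((frames - scene).var())
rms_after = np.sqrt((corrected - scene).var())
print("RMS error before/after:", round(float(rms_before), 2), round(float(rms_after), 2))
```

After compensation the residual error is dominated by the temporal noise alone, which is the point of separating the two components before filtering.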

  6. Temporal monitoring of groundcover change using digital cameras

    NASA Astrophysics Data System (ADS)

    Zerger, A.; Gobbett, D.; Crossman, C.; Valencia, P.; Wark, T.; Davies, M.; Handcock, R. N.; Stol, J.

    2012-10-01

This paper describes the development and testing of an automated method for detecting change in groundcover vegetation in response to kangaroo grazing using visible-wavelength digital photography. The research is seen as a precursor to the future deployment of autonomous vegetation monitoring systems (environmental sensor networks). The study was conducted over six months, with imagery captured every 90 min and post-processed using supervised image processing techniques. Synchronous manual assessments of groundcover change were also conducted to evaluate the effectiveness of the automated procedures. Results show that for particular cover classes, such as Live Vegetation and Bare Ground, there is excellent temporal concordance between automated and manual methods. However, litter classes were difficult to differentiate consistently. A limitation of the method is its inability to deal effectively with change in the vertical profile of groundcover. This indicates that the three-dimensional structure related to species composition and plant traits plays an important role in driving future experimental designs. The paper concludes by providing lessons for conducting future groundcover monitoring experiments.

  7. Feasibility of an airborne TV camera as a size spectrometer for cloud droplets in daylight.

    PubMed

    Roscoe, H K; Lachlan-Cope, T A; Roscoe, J

    1999-01-20

    Photographs of clouds taken with a camera with a large aperture ratio must have a short depth of focus to resolve small droplets. Hence the sampling volume is small, which limits the number of droplets and gives rise to a large statistical error on the number counted. However, useful signals can be obtained with a small aperture ratio, which allows for a sample volume large enough for counting cloud droplets at aircraft speeds with useful spatial resolution. The signal is sufficient to discriminate against noise from a sunlit cloud as background, provided the bandwidth of the light source and camera are restricted, and against readout noise. Hence, in principle, an instrument to sample the size distribution of cloud droplets from aircraft in daylight can be constructed from a simple TV camera and an array of laser diodes, without any components or screens external to the aircraft window.

  8. Combining laser scan and photogrammetry for 3D object modeling using a single digital camera

    NASA Astrophysics Data System (ADS)

    Xiong, Hanwei; Zhang, Hong; Zhang, Xiangwei

    2009-07-01

In the fields of industrial design, artistic design and heritage conservation, physical objects are usually digitized by reverse engineering through 3D scanning. Laser scanning and photogrammetry are the two main methods used: laser scanning requires a video camera and a laser source, while photogrammetry requires a high-resolution digital still camera. In some 3D modeling tasks, the two methods are integrated to obtain satisfactory results. Although much research has addressed how to combine the results of the two methods, no work has been reported on designing an integrated device at low cost. In this paper, a new 3D scanning system combining laser scanning and photogrammetry using a single consumer digital camera is proposed. Many consumer digital cameras, such as the Canon EOS 5D Mark II, offer still-photo capture at more than 10 megapixels as well as full 1080p HD movie recording, so an integrated scanning system can be designed around such a camera. A square plate bearing coded marks is used to hold the 3D object, and two straight wooden rulers, also bearing coded marks, can be laid on the plate freely. In the photogrammetry module, the coded marks on the plate define a world coordinate system and serve as a control network to calibrate the camera, and the planes of the two rulers can also be determined. The feature points of the object and a rough volume representation from the silhouettes are obtained in this module. In the laser scan module, a hand-held line laser is used to scan the object, with the two rulers serving as reference planes to determine the position of the laser. The laser scan produces a dense point cloud that can be aligned automatically through the calibrated camera parameters. The final complete digital model is obtained through a new patchwise energy-functional method by fusion of the feature points, rough volume and dense point cloud. The design…

  9. CMOS image sensor noise reduction method for image signal processor in digital cameras and camera phones

    NASA Astrophysics Data System (ADS)

    Yoo, Youngjin; Lee, SeongDeok; Choe, Wonhee; Kim, Chang-Yong

    2007-02-01

Digital images captured from CMOS image sensors suffer from Gaussian noise and impulsive noise. To reduce this noise efficiently in the Image Signal Processor (ISP), we analyze the noise characteristics of the ISP imaging pipeline where the noise reduction algorithm is performed. Gaussian and impulsive noise reduction methods are proposed for proper ISP implementation in the Bayer domain. The proposed method exploits the analyzed noise characteristics to calculate the noise reduction filter coefficients, so noise is reduced adaptively according to the scene. Since noise is amplified, and its characteristics change, as the image sensor signal undergoes successive processing steps, it is better to remove noise early in the pipeline; noise reduction is therefore carried out in the Bayer domain. The method was tested on the ISP imaging pipeline with images captured from a Samsung 2M CMOS image sensor test module. The experimental results show that the proposed method removes noise while effectively preserving edges.
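Filtering in the Bayer domain means treating the four CFA phases (R, Gr, Gb, B) as separate sub-images, so that samples of different colors are never mixed. A minimal sketch of that idea, using a plain 3×3 median for the impulsive component (the paper's adaptive filter coefficients are not reproduced here):

```python
import numpy as np

def median3x3(img):
    """3x3 median filter built from shifted copies (edge pixels use edge padding)."""
    H, W = img.shape
    p = np.pad(img, 1, mode="edge")
    stack = np.stack([p[i:i + H, j:j + W] for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def denoise_bayer(raw):
    """Median-filter each of the four Bayer phases separately, so the filter
    window only ever contains samples of one color channel."""
    out = raw.astype(float).copy()
    for dy in (0, 1):
        for dx in (0, 1):
            out[dy::2, dx::2] = median3x3(raw[dy::2, dx::2].astype(float))
    return out

rng = np.random.default_rng(3)
raw = np.full((64, 64), 128.0)            # flat synthetic Bayer frame
mask = rng.random(raw.shape) < 0.02       # impulsive ("salt") noise on ~2% of pixels
raw[mask] = 255.0

clean = denoise_bayer(raw)
print("mean |error| before:", float(np.abs(raw - 128.0).mean()))
print("mean |error| after: ", float(np.abs(clean - 128.0).mean()))
```

Processing each phase independently is what makes a Bayer-domain filter cheap: it runs before demosaicking, on one quarter-resolution plane at a time.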

  10. Quantitative Evaluation of Surface Color of Tomato Fruits Cultivated in Remote Farm Using Digital Camera Images

    NASA Astrophysics Data System (ADS)

    Hashimoto, Atsushi; Suehara, Ken-Ichiro; Kameoka, Takaharu

To measure quantitative surface color information of agricultural products, together with ambient information, during cultivation, a color calibration method for digital camera images and a Web-based remote color-imaging monitoring system were developed. Single-lens reflex and web digital cameras were used for image acquisition. Images of tomatoes through the post-ripening process were taken with the digital camera both in a standard image acquisition system and under field conditions from morning to evening. Several kinds of images were acquired with a standard RGB color chart set up just behind the tomato fruit on a black matte background, and a color calibration was carried out. The influence of sunlight could be eliminated experimentally, and the calibrated color information consistently agreed with the standard values acquired in the system through the post-ripening process. Furthermore, the surface color change of tomatoes on the tree in a greenhouse was remotely monitored during maturation using digital cameras equipped with the Field Server. The acquired digital color images were sent from the Farm Station to the BIFE Laboratory of Mie University via VPN. The time course of tomato surface color change during maturation could be measured using a color parameter calculated from the calibrated color images, along with the ambient atmospheric record. This study is an important step both in developing surface color analysis for simple, rapid evaluation of crop vigor in the field and in constructing an ambient, networked remote monitoring system for food security, precision agriculture, and agricultural research.

  11. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

Visibility and clarity of remotely sensed images acquired by consumer-grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog or gaseous smoke particles (caused, for example, by an active bushfire at the time of data capture) will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow bands in the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or to assist search and rescue or similar applications that require high-resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution, using a single modified DSLR camera in conjunction with image processing techniques that effectively improve the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200 nm (R, G and near infrared) instead of the standard RGB bands (400-700 nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Data processed with our proposed method show significant visibility improvements compared with other existing solutions.

  12. Implementation of Waveform Digitization In A Small Footprint, Airborne Lidar Topographic Mapping System

    NASA Astrophysics Data System (ADS)

    Gutierrez, R.; Crawford, M. M.; Liadsky, J.

    2004-12-01

Accurate mapping is critical for applications ranging from geodesy, geomorphology, and forestry to urban planning and natural hazards monitoring. While airborne lidar (Light Detection and Ranging) has had a revolutionary impact on three-dimensional imaging of the Earth's surface, there is great potential for developing new capability by replacing the laser range and backscatter intensity information recorded by conventional lidar systems with full waveform digitization. The University of Texas at Austin (UT) owns and operates an Optech ALTM 1225, a small-footprint lidar system. In response to an initiative from UT, Optech has developed a module which samples the analog waveform of a laser pulse and converts these samples into digital measurements. The waveform digitizer specifications include a 1-nanosecond sampling interval, 440 samples per return laser waveform (approximately 65 meters of vertical extent), and waveform digitization at the 25 kHz laser pulse repetition rate. The digitizer also records the initial T0 pulse that starts the timing cycle. The digitizer unit is an independent module supported by a Pentium-4 computer, two hard drives, and a high-speed data recording system. The digitizer is integrated into the ALTM system so that both the full waveform and the conventional first and last returns are recorded for each transmitted laser pulse. This unique capability allows conventional lidar data to be directly compared to the full waveform. We present examples of full waveform lidar mapping over different environments and discuss future applications.

  13. Application of phase matching autofocus in airborne long-range oblique photography camera

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, day and night. The front Ritchey-Chrétien optics are made of highly stable materials; however, the camera temperature varies considerably in flight. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease in the length of the structure. The temperature and humidity effects change the distance between the mirrors by just a few microns. The change is small, but it nevertheless alters the camera's infinity-focus setpoint significantly, especially in the EO band. To realize the resolution potential of the optics, optimal focus must be maintained constantly. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To achieve even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of the existing beamsharer prism FPA arrangement, in which aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly under various conditions of scene illumination and contrast.
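The phase-matching principle, that defocus is proportional to the image shift between the two sub-aperture views in the overlap region, reduces in practice to estimating a displacement between two image strips. A generic phase-correlation sketch of that displacement estimate (not the Condor2 implementation):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Integer-pixel shift between two equally sized image strips,
    estimated via FFT phase correlation."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    F /= np.abs(F) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(F).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map FFT indices to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(4)
scene = rng.random((64, 64))
# Simulated 3-pixel displacement between the two sub-aperture views,
# standing in for the shift a defocused optic would produce.
shifted = np.roll(scene, (0, 3), axis=(0, 1))

print(phase_correlation_shift(shifted, scene))
```

In a closed-loop autofocus, an estimate like this (low-pass filtered over many frames, as the abstract notes) would drive the focus actuator until the measured shift goes to zero.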

  14. Recording of essential ballistic data with a new generation of digital ballistic range camera

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.; Honour, Jo

    2007-01-01

Scientists and engineers still need to record essential parameters during the design and testing of new (or refined) munitions. Essential data, such as velocities, spin, pitch and yaw angles, sabot discard, impact angles, target penetration, behind-target effects and post-impact delays, must be recorded during dynamic, high-velocity, and dangerous firings. Traditionally these parameters have been recorded on high-speed film cameras. With the demise of film as a recording medium, a new generation of electronic digital recording cameras has become the accepted means of recording and analysing these parameters. Their obvious advantage over film is instant access to records and the ability to analyse them almost immediately. This paper details results obtained using a new, specially designed ballistic range camera manufactured by Specialised Imaging Ltd.

  15. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. The substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge of approximately 300 m³/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers who work in regulated rivers to quantify the behavior of aquatic fauna in response to a discharge disturbance.

  16. Measuring the Orbital Period of the Moon Using a Digital Camera

    ERIC Educational Resources Information Center

    Hughes, Stephen W.

    2006-01-01

    A method of measuring the orbital velocity of the Moon around the Earth using a digital camera is described. Separate images of the Moon and stars taken 24 hours apart were loaded into Microsoft PowerPoint and the centre of the Moon marked on each image. Four stars common to both images were connected together to form a "home-made" constellation.…
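
    The geometry behind this measurement can be sketched in a few lines. The code below uses entirely made-up pixel coordinates and an assumed catalogued star separation (none of it from the article), converts the Moon's 24-hour displacement against the star field into an orbital period, and assumes the two frames have already been aligned on the common stars.

    ```python
    import math

    # Hypothetical measured pixel coordinates (illustrative only).
    star_a, star_b = (120.0, 340.0), (980.0, 310.0)   # same star pair, one frame
    known_sep_deg = 9.0            # assumed catalogued separation of the pair
    moon_day1 = (400.0, 500.0)     # Moon centre, night 1
    moon_day2 = (1600.0, 760.0)    # Moon centre, night 2 (24 h later)

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    plate_scale = known_sep_deg / dist(star_a, star_b)     # degrees per pixel
    moved_deg = dist(moon_day1, moon_day2) * plate_scale   # motion in 24 h
    period_days = 360.0 / moved_deg                        # orbital period
    ```

    With real measurements the result should land near the Moon's sidereal period of about 27.3 days; the invented numbers above were chosen to give a value in that neighbourhood.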

  17. Use of a new high-speed digital data acquisition system in airborne ice-sounding

    USGS Publications Warehouse

    Wright, David L.; Bradley, Jerry A.; Hodge, Steven M.

    1989-01-01

    A high-speed digital data acquisition and signal averaging system for borehole, surface, and airborne radio-frequency geophysical measurements was designed and built by the US Geological Survey. The system permits signal averaging at rates high enough to achieve significant signal-to-noise enhancement in profiling, even in airborne applications. The first field use of the system took place in Greenland in 1987, recording data on a 150 by 150 km grid centered on the summit of the Greenland ice sheet. About 6000 line-km were flown and recorded using the new system. The data can be used to aid in siting a proposed scientific corehole through the ice sheet.

  18. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    NASA Astrophysics Data System (ADS)

    Song, Huaibo; Yang, Chenghai; Zhang, Jian; Hoffmann, Wesley Clint; He, Dongjian; Thomasson, J. Alex

    2016-01-01

    Images captured from airborne imaging systems can be mosaicked for diverse remote sensing applications. The objective of this study was to identify appropriate mosaicking techniques and software to generate mosaicked images for use by aerial applicators and other users. Three software packages (Photoshop CC, Autostitch, and Pix4Dmapper) were selected for mosaicking airborne images acquired from a large cropping area. Ground control points were collected for georeferencing the mosaicked images and for evaluating the accuracy of eight mosaicking techniques. Analysis and accuracy assessment showed that Pix4Dmapper can be the first choice if georeferenced imagery with high accuracy is required. The spherical method in Photoshop CC can be an alternative for cost considerations, and Autostitch can be used to quickly mosaic images at reduced spatial resolution. The results also showed that the accuracy of image mosaicking techniques can be greatly affected by the size of the imaging area or the number of images, and that accuracy is higher for a small area than for a large one. The results from this study will provide useful information for the selection of image mosaicking software and techniques for aerial applicators and other users.
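
    The accuracy assessment described here boils down to comparing coordinates read off the mosaic against surveyed ground control points. A minimal sketch, with invented check-point coordinates (not the study's data):

    ```python
    import math

    # Hypothetical check points (metres): surveyed GCPs vs. the same points
    # read off a georeferenced mosaic. Values are illustrative only.
    gcp    = [(0.0, 0.0), (100.0, 0.0), (100.0, 80.0), (0.0, 80.0)]
    mosaic = [(0.4, -0.3), (100.5, 0.2), (99.6, 80.4), (-0.2, 79.7)]

    def rmse(truth, observed):
        """Root-mean-square horizontal error over all check points."""
        sq = [(tx - ox) ** 2 + (ty - oy) ** 2
              for (tx, ty), (ox, oy) in zip(truth, observed)]
        return math.sqrt(sum(sq) / len(sq))

    error_m = rmse(gcp, mosaic)
    ```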

  19. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low-cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic real-time, long-term tracking of individual vehicles is presented. It combines the detection and tracking of the vehicle in low-frame-rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM uses the traffic conditions in the proximity of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500 m in a complex urban traffic situation. An artificial tracking outage of 240 m is simulated, which is handled by the CTM. For this, all vehicles in close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared, showing good correspondence even in congested traffic situations.
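
    The Cell Transmission Model at the heart of the outage handling propagates cell densities with a sending/receiving flow rule: the flow across each cell boundary is the minimum of demand, capacity, and downstream supply. A minimal illustrative sketch of one density update (parameters and densities are invented, not the paper's calibrated values, and the lagged variant is omitted):

    ```python
    # Minimal Cell Transmission Model step, illustrative parameters only.
    V, W = 15.0, 5.0        # free-flow and backward-wave speeds (m/s)
    Q_MAX = 0.5             # boundary capacity (veh/s)
    K_JAM = 0.15            # jam density (veh/m)
    DX, DT = 100.0, 5.0     # cell length (m), time step (s)

    def step(k):
        """Advance cell densities one time step (closed road ends)."""
        # Flow across each interior boundary: min(demand, capacity, supply).
        q = [min(V * k[i], Q_MAX, W * (K_JAM - k[i + 1]))
             for i in range(len(k) - 1)]
        q = [0.0] + q + [0.0]          # no in/outflow at the ends
        return [k[i] + DT / DX * (q[i] - q[i + 1]) for i in range(len(k))]

    densities = [0.02, 0.10, 0.01, 0.0]   # veh/m per cell
    densities = step(densities)
    ```

    A closed road conserves vehicles, which makes a convenient sanity check on the update rule.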

  20. Extension of the possibilities of a commercial digital camera in detecting spatial intensity distribution of laser radiation

    SciTech Connect

    Konnik, M V; Manykin, Eduard A; Starikov, S N

    2010-06-23

    Performance capabilities of commercial digital cameras are demonstrated by the example of a Canon EOS 400D camera in measuring and detecting spatial distributions of laser radiation intensity. It is shown that software extraction of linear data expands the linear dynamic range of the camera by a factor greater than 10, up to 58 dB. Basic measurement characteristics of the camera are obtained in the regime of linear data extraction: the radiometric function, deviation from linearity, dynamic range, temporal and spatial noises (both dark and those depending on the signal value). The parameters obtained correspond to those of technical measuring cameras. (measurement of laser radiation parameters)
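
    The quoted 58 dB figure is simply the ratio of the largest linearly recordable signal to the noise floor, expressed in decibels. A one-line sketch with illustrative numbers chosen to reproduce that ratio (not the camera's actual calibration data):

    ```python
    import math

    # Dynamic range in decibels from saturation level and noise floor.
    # Illustrative values: a 12-bit full scale (4095 counts) over a
    # ~5-count noise floor gives roughly the 58 dB quoted in the abstract.
    def dynamic_range_db(full_scale, noise_floor):
        return 20.0 * math.log10(full_scale / noise_floor)

    dr = dynamic_range_db(4095, 5.15)
    ```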

  1. Digital camera and smartphone as detectors in paper-based chemiluminometric genotyping of single nucleotide polymorphisms.

    PubMed

    Spyrou, Elena M; Kalogianni, Despina P; Tragoulias, Sotirios S; Ioannou, Penelope C; Christopoulos, Theodore K

    2016-10-01

    Chemi(bio)luminometric assays have contributed greatly to various areas of nucleic acid analysis due to their simplicity and detectability. In this work, we present the development of chemiluminometric genotyping methods in which (a) detection is performed by using either a conventional digital camera (at ambient temperature) or a smartphone and (b) a lateral flow assay configuration is employed for even higher simplicity and suitability for point-of-care or field testing. The genotyping of the C677T single nucleotide polymorphism (SNP) of the methylenetetrahydrofolate reductase (MTHFR) gene is chosen as a model. The interrogated DNA sequence is amplified by polymerase chain reaction (PCR) followed by a primer extension reaction. The reaction products are captured through hybridization on the sensing areas (spots) of the strip. A streptavidin-horseradish peroxidase conjugate is used as a reporter along with a chemiluminogenic substrate. Detection of the emerging chemiluminescence from the sensing areas of the strip is achieved by digital camera or smartphone. For this purpose, we constructed a 3D-printed smartphone attachment that houses inexpensive lenses and converts the smartphone into a portable chemiluminescence imager. The device enables spatial discrimination of the two alleles of a SNP in a single shot by imaging of the strip, thus avoiding the need for dual labeling. The method was applied successfully to the genotyping of real clinical samples. Graphical abstract: Paper-based genotyping assays using digital camera and smartphone as detectors.

  2. Monitoring of phenological control on ecosystem fluxes using digital cameras and eddy covariance data

    NASA Astrophysics Data System (ADS)

    Toomey, M. P.; Friedl, M. A.; Hufkens, K.; Sonnentag, O.; Milliman, T. E.; Frolking, S.; Richardson, A. D.

    2012-12-01

    Digital repeat photography is an emerging platform for monitoring land surface phenology. Despite the great potential of digital repeat photography to yield insights into phenological cycles, relatively few studies have compared digital repeat photography to in situ measures of ecosystem fluxes. We used 60 site years of concurrent camera and eddy covariance data at 13 sites, representing five distinct ecosystem types - temperate deciduous forest, temperate coniferous forest, boreal forest, grasslands and crops - to measure and model phenological controls on carbon and water exchange with the atmosphere. Camera-derived relative greenness was strongly correlated with estimated gross primary productivity among the five ecosystem types and was moderately correlated with water fluxes. Camera-derived canopy development was also compared with phenological phase as predicted by a generalized, bioclimatic phenology model and Moderate Resolution Imaging Spectroradiometer (MODIS) imagery to assess the potential for cross-biome phenological monitoring. This study demonstrates the potential of webcam networks such as Phenocam (phenocam.unh.edu) to conduct long-term, continental monitoring and modeling of ecosystem response to climate change.
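
    The "relative greenness" used in phenocam work is typically the green chromatic coordinate: the mean green digital number over the sum of all three channels, averaged over a region of interest. A minimal sketch with invented channel means:

    ```python
    # Green chromatic coordinate (GCC), the relative greenness commonly
    # derived from phenocam imagery.
    def gcc(r, g, b):
        """r, g, b: mean digital numbers over a region of interest."""
        return g / (r + g + b)

    spring = gcc(90.0, 130.0, 70.0)   # illustrative leaf-on values
    winter = gcc(95.0, 100.0, 95.0)   # illustrative leaf-off values
    ```

    Seasonal canopy development shows up as the difference between these two values.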

  3. Simulation of film media in motion picture production using a digital still camera

    NASA Astrophysics Data System (ADS)

    Bakke, Arne M.; Hardeberg, Jon Y.; Paul, Steffen

    2009-01-01

    The introduction of digital intermediate workflow in movie production has made visualization of the final image on the film set increasingly important. Images that have been color corrected on the set can also serve as a basis for color grading in the laboratory. In this paper we suggest and evaluate an approach that has been used to simulate the appearance of different film stocks. The GretagMacbeth Digital ColorChecker was captured using both a Canon EOS 20D camera as well as an analog camera. The film was scanned using an Arri film scanner. The images of the color chart were then used to perform a colorimetric characterization of these devices using models based on polynomial regression. By using the reverse model of the digital camera and the forward model of the analog film chain, the output of the film scanner was simulated. We also constructed a direct transformation using regression on the RGB values of the two devices. A different color chart was then used as a test set to evaluate the accuracy of the transformations, where the indirect model was found to provide the required performance for our purpose without compromising the flexibility of having an independent profile for each device.
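
    Colorimetric characterization by polynomial regression, as described here, amounts to a linear least-squares fit over polynomial terms of the device RGB values. A self-contained sketch using synthetic chart data and a made-up device model (nothing here comes from the paper's measurements):

    ```python
    import numpy as np

    def poly_terms(rgb):
        """Second-order polynomial expansion of a device RGB triple."""
        r, g, b = rgb
        return [1.0, r, g, b, r*g, r*b, g*b, r*r, g*g, b*b]

    def device_model(p):
        # A made-up linear device, used only to generate synthetic patches.
        return np.array([0.4*p[0] + 0.3*p[1],
                         0.2*p[1] + 0.5*p[2],
                         0.1*p[0] + 0.6*p[2]])

    rng = np.random.default_rng(0)
    rgb_patches = rng.uniform(0.0, 1.0, size=(24, 3))    # a 24-patch chart
    ref_patches = np.array([device_model(p) for p in rgb_patches])

    A = np.array([poly_terms(p) for p in rgb_patches])   # design matrix
    coeffs, *_ = np.linalg.lstsq(A, ref_patches, rcond=None)

    predicted = np.array(poly_terms(rgb_patches[0])) @ coeffs
    residual = float(np.abs(predicted - ref_patches[0]).max())
    ```

    Chaining the inverse of one such model with the forward model of another device is what produces the cross-device simulation the abstract describes.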

  4. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. 
As such, the

  5. Use of a Digital Camera to Monitor the Growth and Nitrogen Status of Cotton

    PubMed Central

    Jia, Biao; He, Haibing; Ma, Fuyu; Diao, Ming; Jiang, Guiying; Zheng, Zhong; Cui, Jin; Fan, Hua

    2014-01-01

    The main objective of this study was to develop a nondestructive method for monitoring cotton growth and N status using a digital camera. Digital images were taken of the cotton canopies between emergence and full bloom. The green and red values were extracted from the digital images and then used to calculate canopy cover. The values of canopy cover were closely correlated with the normalized difference vegetation index and the ratio vegetation index measured using a GreenSeeker handheld sensor. Models were calibrated to describe the relationship between canopy cover and three growth properties of the cotton crop (i.e., aboveground total N content, LAI, and aboveground biomass). There were close exponential relationships between canopy cover and the three growth properties. The relationship for estimating cotton aboveground total N content was the most precise, with a coefficient of determination (R2) of 0.978 and a root mean square error (RMSE) of 1.479 g m−2. Moreover, the models were validated in three fields of high-yield cotton. The results indicated that the best relationship, between canopy cover and aboveground total N content, had an R2 value of 0.926 and an RMSE value of 1.631 g m−2. In conclusion, as a near-ground remote assessment tool, digital cameras have good potential for monitoring cotton growth and N status. PMID:24723817
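
    An exponential model of the form N = a * exp(b * CC) can be calibrated with an ordinary log-linear least-squares fit. A sketch with invented calibration points (not the study's data):

    ```python
    import math

    # Log-linear fit of N = a * exp(b * CC) relating canopy cover to
    # aboveground N content. Data points are illustrative only.
    cc = [0.10, 0.25, 0.40, 0.60, 0.80]   # canopy cover (fraction)
    n  = [1.2, 2.1, 3.8, 7.5, 14.9]       # N content (g per m^2)

    x, y = cc, [math.log(v) for v in n]   # linearize: ln N = ln a + b * CC
    xbar, ybar = sum(x) / len(x), sum(y) / len(y)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    a = math.exp(ybar - b * xbar)
    ```

    The fitted a and b then predict N content from any measured canopy cover via a * exp(b * CC).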

  6. Greenness indices from digital cameras predict the timing and seasonal dynamics of canopy-scale photosynthesis.

    PubMed

    Toomey, Michael; Friedl, Mark A; Frolking, Steve; Hufkens, Koen; Klosterman, Stephen; Sonnentag, Oliver; Baldocchi, Dennis D; Bernacchi, Carl J; Biraud, Sebastien C; Bohrer, Gil; Brzostek, Edward; Burns, Sean P; Coursolle, Carole; Hollinger, David Y; Margolis, Hank A; Mccaughey, Harry; Monson, Russell K; Munger, J William; Pallardy, Stephen; Phillips, Richard P; Torn, Margaret S; Wharton, Sonia; Zeri, Marcelo; Richardson, Andrew D

    2015-01-01

    The proliferation of digital cameras co-located with eddy covariance instrumentation provides new opportunities to better understand the relationship between canopy phenology and the seasonality of canopy photosynthesis. In this paper we analyze the abilities and limitations of canopy color metrics measured by digital repeat photography to track seasonal canopy development and photosynthesis, determine phenological transition dates, and estimate intra-annual and interannual variability in canopy photosynthesis. We used 59 site-years of camera imagery and net ecosystem exchange measurements from 17 towers spanning three plant functional types (deciduous broadleaf forest, evergreen needleleaf forest, and grassland/crops) to derive color indices and estimate gross primary productivity (GPP). GPP was strongly correlated with greenness derived from camera imagery in all three plant functional types. Specifically, the beginning of the photosynthetic period in deciduous broadleaf forest and grassland/crops and the end of the photosynthetic period in grassland/crops were both correlated with changes in greenness; changes in redness were correlated with the end of the photosynthetic period in deciduous broadleaf forest. However, it was not possible to accurately identify the beginning or ending of the photosynthetic period using camera greenness in evergreen needleleaf forest. At deciduous broadleaf sites, anomalies in integrated greenness and total GPP were significantly correlated up to 60 days after the mean onset date for the start of spring. More generally, results from this work demonstrate that digital repeat photography can be used to quantify both the duration of the photosynthetically active period as well as total GPP in deciduous broadleaf forest and grassland/crops, but that new and different approaches are required before comparable results can be achieved in evergreen needleleaf forest. PMID:26255360

  7. High-speed radiometric imaging with a gated, intensified, digitally controlled camera

    NASA Astrophysics Data System (ADS)

    Ross, Charles C.; Sturz, Richard A.

    1997-05-01

    The development of an advanced instrument for real-time radiometric imaging of high-speed events is described. The Intensified Digitally-Controlled Gated (IDG) camera is a microprocessor-controlled instrument based on an intensified CCD that is specifically designed to provide radiometric optical data. The IDG supports a variety of camera- synchronous and camera-asynchronous imaging tasks in both passive imaging and active laser range-gated applications. It features both automatic and manual modes of operation, digital precision and repeatability, and ease of use. The IDG produces radiometric imagery by digitally controlling the instrument's optical gain and exposure duration, and by encoding and annotating the parameters necessary for radiometric analysis onto the resultant video signal. Additional inputs, such as date, time, GPS, IRIG-B timing, and other data can also be encoded and annotated. The IDG optical sensitivity can be readily calibrated, with calibration data tables stored in the camera's nonvolatile flash memory. The microprocessor then uses this data to provide a linear, calibrated output. The IDG possesses both synchronous and asynchronous imaging modes in order to allow internal or external control of exposure, timing, and direct interface to external equipment such as event triggers and frame grabbers. Support for laser range-gating is implemented by providing precise asynchronous CCD operation and nanosecond resolution of the intensifier photocathode gate duration and timing. Innovative methods used to control the CCD for asynchronous image capture, as well as other sensor and system considerations relevant to high-speed imaging are discussed in this paper.

  8. Greenness indices from digital cameras predict the timing and seasonal dynamics of canopy-scale photosynthesis.

    PubMed

    Toomey, Michael; Friedl, Mark A; Frolking, Steve; Hufkens, Koen; Klosterman, Stephen; Sonnentag, Oliver; Baldocchi, Dennis D; Bernacchi, Carl J; Biraud, Sebastien C; Bohrer, Gil; Brzostek, Edward; Burns, Sean P; Coursolle, Carole; Hollinger, David Y; Margolis, Hank A; Mccaughey, Harry; Monson, Russell K; Munger, J William; Pallardy, Stephen; Phillips, Richard P; Torn, Margaret S; Wharton, Sonia; Zeri, Marcelo; Richardson, Andrew D

    2015-01-01

    The proliferation of digital cameras co-located with eddy covariance instrumentation provides new opportunities to better understand the relationship between canopy phenology and the seasonality of canopy photosynthesis. In this paper we analyze the abilities and limitations of canopy color metrics measured by digital repeat photography to track seasonal canopy development and photosynthesis, determine phenological transition dates, and estimate intra-annual and interannual variability in canopy photosynthesis. We used 59 site-years of camera imagery and net ecosystem exchange measurements from 17 towers spanning three plant functional types (deciduous broadleaf forest, evergreen needleleaf forest, and grassland/crops) to derive color indices and estimate gross primary productivity (GPP). GPP was strongly correlated with greenness derived from camera imagery in all three plant functional types. Specifically, the beginning of the photosynthetic period in deciduous broadleaf forest and grassland/crops and the end of the photosynthetic period in grassland/crops were both correlated with changes in greenness; changes in redness were correlated with the end of the photosynthetic period in deciduous broadleaf forest. However, it was not possible to accurately identify the beginning or ending of the photosynthetic period using camera greenness in evergreen needleleaf forest. At deciduous broadleaf sites, anomalies in integrated greenness and total GPP were significantly correlated up to 60 days after the mean onset date for the start of spring. More generally, results from this work demonstrate that digital repeat photography can be used to quantify both the duration of the photosynthetically active period as well as total GPP in deciduous broadleaf forest and grassland/crops, but that new and different approaches are required before comparable results can be achieved in evergreen needleleaf forest.

  9. Digital Elevation Model from Non-Metric Camera in Uas Compared with LIDAR Technology

    NASA Astrophysics Data System (ADS)

    Dayamit, O. M.; Pedro, M. F.; Ernesto, R. R.; Fernando, B. L.

    2015-08-01

    Digital Elevation Model (DEM) data, as a representation of surface topography, are in high demand for spatial analysis and modelling. Many methods of acquiring and processing such data have been developed, from traditional surveying to modern technologies like LIDAR. On the other hand, over the past four years the development of Unmanned Aerial Systems (UAS) for geomatics has made it possible to acquire surface data with an on-board non-metric digital camera in a short time and with quality sufficient for many analyses. UAS have attracted tremendous attention because they enable the determination of volume changes over time, monitoring of breakwaters, and hydrological modelling including flood simulation and drainage networks, among other applications that rely on a DEM for proper analysis. DEM quality is considered a combination of DEM accuracy and DEM suitability, so this paper analyses the quality of a DEM from a non-metric digital camera on a UAS compared with a DEM from LIDAR covering the same 4 km² of geographic space in Artemisa province, Cuba. The area is the subject of urban planning, which requires knowledge of the topography in order to analyse hydrological behaviour and decide where best to build roads, buildings and other infrastructure. Because LIDAR remains the more accurate method, it offers a benchmark against which to test DEMs from non-metric digital cameras on UAS, which are much more flexible and provide a solution for the many applications that need detailed DEMs.

  10. Full-field dynamic deformation and strain measurements using high-speed digital cameras

    NASA Astrophysics Data System (ADS)

    Schmidt, Timothy E.; Tyson, John; Galanulis, Konstantin; Revilock, Duane M.; Melis, Matthew E.

    2005-03-01

    Digital cameras are rapidly supplanting film, even for very high speed and ultra high-speed applications. The benefits of these cameras, particularly CMOS versions, are well appreciated. This paper describes how a pair of synchronized digital high-speed cameras can provide full-field dynamic deformation, shape and strain information, through a process known as 3D image correlation photogrammetry. The data is equivalent to thousands of non-contact x-y-z extensometers and strain rosettes, as well as instant non-contact CMM shape measurement. A typical data acquisition rate is 27,000 frames per second, with displacement accuracy on the order of 25-50 microns, and strain accuracy of 250-500 microstrain. High-speed 3D image correlation is being used extensively at the NASA Glenn Ballistic Impact Research Lab, in support of Return to Flight activities. This leading edge work is playing an important role in validating and iterating LS-DYNA models of foam impact on reinforced carbon-carbon, including orbiter wing panel tests. The technique has also been applied to air blast effect studies and Kevlar ballistic impact testing. In these cases, full-field and time history analysis revealed the complexity of the dynamic buckling, including multiple lobes of out-of-plane and in-plane displacements, strain maxima shifts, and damping over time.

  11. Teaching with Technology: Step Back and Hand over the Cameras! Using Digital Cameras to Facilitate Mathematics Learning with Young Children in K-2 Classrooms

    ERIC Educational Resources Information Center

    Northcote, Maria

    2011-01-01

    Digital cameras are now commonplace in many classrooms and in the lives of many children in early childhood centres and primary schools. They are regularly used by adults and teachers for "saving special moments and documenting experiences." The use of previously expensive photographic and recording equipment has often remained in the domain of…

  12. Improvements in remote cardiopulmonary measurement using a five band digital camera.

    PubMed

    McDuff, Daniel; Gontarek, Sarah; Picard, Rosalind W

    2014-10-01

    Remote measurement of the blood volume pulse via photoplethysmography (PPG) using digital cameras and ambient light has great potential for healthcare and affective computing. However, traditional RGB cameras have limited frequency resolution. We present results of PPG measurements from a novel five-band camera and show that alternate frequency bands, in particular an orange band, allowed physiological measurements much more highly correlated with an FDA-approved contact PPG sensor. In a study with participants (n = 10) at rest and under stress, correlations of over 0.92 (p < 0.01) were obtained for heart rate, breathing rate, and heart rate variability measurements. In addition, the remotely measured heart rate variability spectrograms closely matched those from the contact approach. The best results were obtained using a combination of cyan, green, and orange (CGO) bands; incorporating red and blue channel observations did not improve performance. In short, RGB is not optimal for this problem: CGO is better. Incorporating alternative color channel sensors should not increase the cost of such cameras dramatically.
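
    Whatever the band set, the pulse rate is ultimately recovered as the dominant frequency of the mean pixel trace within the cardiac band. A sketch on a synthetic 72 bpm signal sampled at a camera-like 30 fps (the band limits and the signal are assumptions for illustration, not the paper's pipeline):

    ```python
    import numpy as np

    # Synthetic PPG trace: a 1.2 Hz (72 bpm) pulse plus noise, 30 s at 30 fps.
    fs = 30.0
    t = np.arange(0, 30, 1 / fs)
    ppg = (np.sin(2 * np.pi * 1.2 * t)
           + 0.3 * np.random.default_rng(1).normal(size=t.size))

    # Heart rate = dominant frequency in the cardiac band (0.75-4 Hz).
    spec = np.abs(np.fft.rfft(ppg - ppg.mean()))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    hr_bpm = 60.0 * freqs[band][np.argmax(spec[band])]
    ```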

  13. Quantification of atmospheric visibility with dual digital cameras during daytime and nighttime

    NASA Astrophysics Data System (ADS)

    Du, K.; Wang, K.; Shi, P.; Wang, Y.

    2013-08-01

    A digital optical method "DOM-Vis" was developed to measure atmospheric visibility. In this method, two digital pictures were taken of the same target at two different distances along the same straight line. The pictures were analyzed to determine the optical contrasts between the target and its sky background and, subsequently, visibility was calculated. A light transfer scheme for DOM-Vis was delineated, based upon which algorithms were developed for both daytime and nighttime scenarios. A series of field tests were carried out under different weather and meteorological conditions to study the impacts of such operational parameters as exposure, optical zoom, distance between the two camera locations, and distance of the target. This method was validated by comparing the DOM-Vis results with those measured using a co-located Vaisala® visibility meter. The visibility under which this study was carried out ranged from 1 to 20 km. This digital-photography-based method possesses a number of advantages compared with traditional methods. Pre-calibration of the detector with a visibility meter is not required. In addition, the application of DOM-Vis is independent of several factors like the exact distance of the target and several camera setting parameters. These features make DOM-Vis more adaptive under a variety of field conditions.
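
    The two-distance idea can be sketched with Koschmieder's relation: target/sky contrast decays exponentially with distance, so two contrast readings at two distances give the extinction coefficient, and the conventional 5% contrast threshold gives visibility as ln(1/0.05)/sigma ≈ 3.912/sigma. The contrasts and distances below are invented, and this is a simplification of the paper's daytime algorithm:

    ```python
    import math

    def visibility_km(c_near, c_far, d_near_km, d_far_km):
        """Visibility from target/sky contrast at two distances (Koschmieder)."""
        # Contrast decays as exp(-sigma * d); two readings isolate sigma.
        sigma = math.log(c_near / c_far) / (d_far_km - d_near_km)  # per km
        return 3.912 / sigma    # 5% contrast threshold

    v = visibility_km(c_near=0.60, c_far=0.40, d_near_km=1.0, d_far_km=2.0)
    ```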

  14. Quantification of atmospheric visibility with dual digital cameras during daytime and nighttime

    NASA Astrophysics Data System (ADS)

    Du, K.; Wang, K.; Shi, P.; Wang, Y.

    2013-01-01

    A digital optical method "DOM-Vis" was developed to measure atmospheric visibility. In this method, two digital pictures were taken of the same target at two different distances along the same straight line. The pictures were analyzed to determine the optical contrasts between the target and its sky background and, subsequently, visibility was calculated. A light transfer scheme for DOM-Vis was delineated, based upon which algorithms were developed for both daytime and nighttime scenarios. A series of field tests were carried out under different weather and meteorological conditions to study the impacts of such operational parameters as exposure, optical zoom, distance between the two camera locations, and distance of the target. This method was validated by comparing the DOM-Vis results with those measured using a co-located Vaisala® visibility meter. The visibility under which this study was carried out ranged from 1 to 20 km. This digital-photography-based method possesses a number of advantages compared with traditional methods. Pre-calibration of the detector with a visibility meter is not required. In addition, the application of DOM-Vis is independent of several factors like the exact distance of the target and several camera setting parameters. These features make DOM-Vis more adaptive under a variety of field conditions.

  15. Development of High Speed Digital Camera: EXILIM EX-F1

    NASA Astrophysics Data System (ADS)

    Nojima, Osamu

    The EX-F1 is a high-speed digital camera featuring a revolutionary improvement in burst shooting speed that is expected to create entirely new markets. This model incorporates a high-speed CMOS sensor and a high-speed LSI processor. With this model, CASIO has achieved an ultra-high-speed 60 frames per second (fps) burst rate for still images, together with 1,200 fps high-speed movie recording that captures movement too fast for the human eye to see. Moreover, this model can record movies in full high definition. After its launch, the camera was widely praised as an innovative product. We introduce the concept, features and technologies behind the EX-F1.

  16. Technique for improving the quality of images from digital cameras using ink-jet printers and smoothed RGB transfer curves

    NASA Astrophysics Data System (ADS)

    Sampat, Nitin; Grim, John F.; O'Hara, James E.

    1998-04-01

    The digital camera market is growing at an explosive rate. At the same time, the quality of photographs printed on ink-jet printers continues to improve. Most consumer cameras are designed with the monitor, not the printer, as the target output device. When printing images from a camera, the user needs to optimize the camera and printer combination in order to maximize image quality. We describe the details of one such method for improving image quality using an AGFA digital camera and an ink-jet printer combination. Using Adobe Photoshop, we generated optimum red, green and blue transfer curves that match the scene content to the printer's output capabilities. Application of these curves to the original digital image resulted in a print with more shadow detail, no loss of highlight detail, a smoother tone scale, and more saturated colors, yielding a visually more pleasing image than one captured and printed without any 'correction'. While we report the results for one camera-printer combination, we tested this technique on numerous digital camera and printer combinations, and in each case it produced a better-looking image. We also discuss the problems we encountered in implementing this technique.
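
    Transfer curves of this kind are applied in practice as per-channel lookup tables interpolated from a few control points. A sketch with invented control points (not the curves derived in the paper):

    ```python
    import numpy as np

    def apply_curve(channel, points):
        """Apply a transfer curve to an 8-bit channel via a lookup table.

        points: (input, output) control pairs in 0..255, interpolated.
        """
        xs, ys = zip(*points)
        lut = np.interp(np.arange(256), xs, ys)   # full 256-entry table
        return lut[channel].astype(np.uint8)

    img = np.array([[[10, 128, 245]]], dtype=np.uint8)   # 1x1 test image
    # A shadow-lifting curve: dark inputs are raised, highlights preserved.
    shadow_lift = [(0, 12), (64, 80), (128, 140), (255, 255)]
    out = img.copy()
    for c in range(3):
        out[..., c] = apply_curve(img[..., c], shadow_lift)
    ```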

  17. Retrieval of water quality algorithms from airborne HySpex camera for oxbow lakes in north-eastern Poland

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Berezowski, Tomasz; Frąk, Magdalena; Chormański, Jarosław

    2016-04-01

    The aim of this study was to retrieve empirical formulas for the water quality of oxbow lakes in the Lower Biebrza Basin (NE Poland) using the HySpex airborne imaging spectrometer. The Biebrza River is one of the biggest wetlands in Europe. It is characterised by a low contamination level and small human influence. Because of these characteristics, the Biebrza River can be treated as a reference area for other floodplain and fen ecosystems in Europe. Oxbow lakes are an important part of the Lower Biebrza Basin due to their retention and habitat functions. Hyperspectral remote sensing data were acquired by the HySpex sensor (which covers the range of 400-2500 nm) on 01-02.08.2015, with the ground measurement campaign conducted on 03-04.08.2015. The ground measurements consisted of two parts. The first part included spectral reflectance sampling with an ASD FieldSpec 3 spectroradiometer, which covers the wavelength range of 350-2500 nm at 1 nm intervals. In situ data were collected both for water and for specific objects within the area. The second part of the campaign included water parameters such as Secchi disc depth (SDD), electric conductivity (EC), pH, temperature and phytoplankton. The measured reflectance enabled an empirical line atmospheric correction, which was applied to the HySpex data. Our results indicated that proper atmospheric correction was very important for further data analysis. Empirical formulas for the water parameters were retrieved from the reflectance data. This study confirmed the applicability of the HySpex camera for retrieving water quality.
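
    Empirical line correction fits a per-band linear mapping from sensor digital number (DN) to ground-measured reflectance using (at least) a dark and a bright target, then inverts the whole scene with it. A sketch with invented target values (not the campaign's data):

    ```python
    import numpy as np

    # Empirical line atmospheric correction for one band:
    # reflectance = gain * DN + offset, fit from two ground targets.
    dn_targets   = np.array([820.0, 2400.0])   # dark and bright target DNs
    refl_targets = np.array([0.05, 0.45])      # field-measured reflectances

    gain = (refl_targets[1] - refl_targets[0]) / (dn_targets[1] - dn_targets[0])
    offset = refl_targets[0] - gain * dn_targets[0]

    # Apply the fitted line to scene pixels of the same band.
    scene_dn = np.array([820.0, 1610.0, 2400.0])
    scene_refl = gain * scene_dn + offset
    ```

    In practice the fit is repeated independently for every band, and more than two targets can be used with least squares.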

  18. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82, and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
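
    A bivariate linear regression of the kind used above can be sketched by solving the two-predictor normal equations directly; the band-ratio, color, and Chl a values below are synthetic (constructed to lie exactly on a plane), not the estuary data:

```python
def bivariate_fit(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 via centered normal equations."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((a - m2) ** 2 for a in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (b - my) for a, b in zip(x1, y))
    s2y = sum((a - m2) * (b - my) for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    return my - b1 * m1 - b2 * m2, b1, b2

ratio = [1.1, 1.3, 1.6, 1.8]        # ratioed waveband values (invented)
color = [10.0, 14.0, 12.0, 18.0]    # true-color covariate (invented units)
chl = [2.0 + 3.0 * r + 0.5 * c for r, c in zip(ratio, color)]  # exact plane
b0, b1, b2 = bivariate_fit(ratio, color, chl)
```

    Because the demo data lie exactly on a plane, the fit recovers the generating coefficients; with real estuary data the residuals quantify the unexplained variance.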

  19. A Novel Multi-Digital Camera System Based on Tilt-Shift Photography Technology

    PubMed Central

    Sun, Tao; Fang, Jun-yong; Zhao, Dong; Liu, Xue; Tong, Qing-xi

    2015-01-01

    Multi-digital camera systems (MDCSs) are constantly being improved to meet the increasing demand for high-resolution spatial data. This study identifies the insufficiencies of traditional MDCSs and proposes a new category of MDCS based on tilt-shift photography to improve the system's ability to acquire high-accuracy spatial data. A prototype system, including two or four tilt-shift cameras (TSC; camera model: Nikon D90), was developed to validate the feasibility and correctness of the proposed MDCS. As with the cameras of traditional MDCSs, calibration is essential for the TSCs of the new MDCS. The study constructs indoor control fields and proposes appropriate calibration methods for the TSC, including a digital distortion model (DDM) approach and a two-step calibration strategy. The characteristics of the TSC, for example its edge distortion, are analyzed in detail via a calibration experiment. Finally, the ability of the new MDCS to acquire high-accuracy spatial data is verified through flight experiments. The results illustrate that the geo-positioning accuracy of the prototype system reaches 0.3 m at a flight height of 800 m, with a spatial resolution of 0.15 m. In addition, a comparison between a traditional MDCS (MADC II) and the proposed system demonstrates that the latter (0.3 m) provides spatial data with higher accuracy than the former (only 0.6 m) under the same conditions. We also argue that using higher-accuracy TSCs in the new MDCS should further improve the accuracy of downstream photogrammetric products. PMID:25835187

  20. Assessment of digital camera-derived vegetation indices in quantitative monitoring of seasonal rice growth

    NASA Astrophysics Data System (ADS)

    Sakamoto, Toshihiro; Shibayama, Michio; Kimura, Akihiko; Takada, Eiji

    2011-11-01

    A commercially available digital camera can be used in a low-cost automatic observation system for monitoring crop growth in open-air fields. We developed a prototype Crop Phenology Recording System (CPRS) for monitoring rice growth, but the ready-made waterproof cases that we used produced shadows on the images. After modifying the waterproof cases, we repeated the fixed-point camera observations to resolve open questions regarding digital camera-derived vegetation indices (VIs), namely the visible atmospherically resistant index (VARI) based on daytime normal color (RGB) images and the nighttime relative brightness index (NRBI_NIR) based on nighttime near-infrared (NIR) images. We also took frequent measurements of agronomic data such as plant length, leaf area index (LAI), and aboveground dry matter weight to gain a detailed understanding of the temporal relationship between the VIs and the biophysical parameters of rice. In addition, we conducted another nighttime outdoor experiment to establish the link between NRBI_NIR and camera-to-object distance. The study produced the following findings. (1) The customized waterproof cases succeeded in preventing large shadows from being cast, especially on nighttime images, and it was confirmed that the brightness of the nighttime NIR images had spatial heterogeneity when a point light source (flashlight) was used, in contrast to the daytime RGB images. (2) The additional experiment using a forklift showed that both the ISO sensitivity and the calibrated digital number of the NIR band (cDN_NIR) had significant effects on the sensitivity of NRBI_NIR to the camera-to-object distance. (3) Detailed measurements of a reproductive stem were collected to investigate the connection between the morphological change caused by panicle sagging and the downtrend in NRBI_NIR during the reproductive stages; however, these agronomic data did not completely accord with NRBI_NIR in temporal pattern.

  1. Cataract screening by minimally trained remote observer with non-mydriatic digital fundus camera

    NASA Astrophysics Data System (ADS)

    Choi, Ann; Hjelmstad, David; Taibl, Jessica N.; Sayegh, Samir I.

    2013-03-01

    We propose a method that allows an inexperienced observer to determine, simply by examining the digital fundus image of a retina on a computer screen, the presence of a cataract and the need to refer the patient for further evaluation. To do so, fundus photos obtained with a non-mydriatic camera were presented to an inexperienced observer who had been briefly instructed on fundus imaging, the nature of cataracts and their probable effect on the image of the retina, and the use of a computer program presenting fundus image pairs. Preliminary results of pair testing indicate the method is very effective.

  2. Determination of the diffusion coefficient between corn syrup and distilled water using a digital camera

    NASA Astrophysics Data System (ADS)

    Ray, E.; Bunton, P.; Pojman, J. A.

    2007-10-01

    A simple technique for determining the diffusion coefficient between two miscible liquids is presented based on observing concentration-dependent ultraviolet-excited fluorescence using a digital camera. The ultraviolet-excited visible fluorescence of corn syrup is proportional to the concentration of the syrup. The variation of fluorescence with distance from the transition zone between the fluids is fit by the Fick's law solution to the diffusion equation. By monitoring the concentration at successive times, the diffusion coefficient can be determined in otherwise transparent materials. The technique is quantitative and makes measurement of diffusion accessible in the advanced undergraduate physics laboratory.
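
    The fitting step described above can be sketched as follows: the Fick's-law solution for interdiffusion of two semi-infinite liquids is C(x,t) = (1/2)[1 + erf(x / (2√(Dt)))], and D can be recovered by minimizing the misfit to the measured profile. The profile data below are synthetic, and the grid search is an illustrative stand-in for the paper's fitting procedure:

```python
import math

def profile(x, t, D):
    """Fick's-law concentration profile between two semi-infinite liquids."""
    return 0.5 * (1.0 + math.erf(x / (2.0 * math.sqrt(D * t))))

def fit_D(xs, cs, t, candidates):
    """Return the candidate D minimizing squared error to the measured profile."""
    return min(candidates,
               key=lambda D: sum((profile(x, t, D) - c) ** 2
                                 for x, c in zip(xs, cs)))

# Synthetic "fluorescence-derived" data generated with D = 5e-10 m^2/s
# after t = 3600 s, sampled every 0.1 mm across the transition zone.
t, D_true = 3600.0, 5e-10
xs = [i * 1e-4 for i in range(-5, 6)]
cs = [profile(x, t, D_true) for x in xs]
D_est = fit_D(xs, cs, t, [d * 1e-10 for d in range(1, 11)])
```

    Repeating the fit on profiles captured at successive times provides a consistency check on the recovered diffusion coefficient.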

  3. Erosion research with a digital camera: the structure from motion method used in gully monitoring - field experiments from southern Morocco

    NASA Astrophysics Data System (ADS)

    Kaiser, Andreas; Rock, Gilles; Neugirg, Fabian; Müller, Christoph; Ries, Johannes

    2014-05-01

    From a geoscientific view, arid and semiarid landscapes are often associated with soil-degrading erosion processes and thus active geomorphology. In this regard, gully incision represents one of the most important influences on surface dynamics. Established approaches to monitoring and quantifying soil loss require costly and labor-intensive measuring methods: terrestrial or airborne LiDAR scans to create digital elevation models, and unmanned aerial vehicles for image acquisition, provide adequate tools for geomorphological surveying. Despite their ever-advancing abilities, they are limited in their applicability to detailed recordings of complex surfaces. In particular, undercuttings and plunge pools in the headcut area of gully systems are invisible to them or cause shadowing effects. The presented work aims to apply and advance a tool that avoids the above-mentioned obstacles and weaknesses of the established methods. The emerging structure-from-motion-based high-resolution 3D visualisation has proved useful well beyond gully erosion: it also provides a solid basis for additional applications in the geosciences, such as surface roughness measurement, quantification of gravitational mass movements, and capturing stream connectivity. During field campaigns in semiarid southern Morocco, a commercial DSLR camera was used to produce images that served as input data for software-based point cloud and mesh generation. Complex land surfaces could thus be reconstructed entirely in high resolution by photographing the object from different perspectives. At different scales, the resulting 3D mesh represents a holistic reconstruction of the actual shape complexity, with its limits set only by computing capacity. Analysis and visualization of time series of different erosion-related events illustrate the additional benefit of the method. It opens new perspectives on process understanding that can be exploited by open-source and commercial software. Results depicted a soil loss of 5

  4. High-resolution image digitizing through 12x3-bit RGB-filtered CCD camera

    NASA Astrophysics Data System (ADS)

    Cheng, Andrew Y. S.; Pau, Michael C. Y.

    1996-09-01

    A high-resolution computer-controlled CCD image capturing system was developed using a 12-bit 1024 x 1024 pixel CCD camera and motorized RGB filters to capture images with a color depth of up to 36 bits. The filters separate the major color components and collect them individually, while the CCD camera maintains the spatial resolution and detector filling factor; the color separation is thus done optically rather than electronically. Operation is simple: the objects to be captured, such as color photos, slides, and even x-ray transparencies, are placed under the camera system, and the necessary parameters such as integration time, mixing level, and light intensity are adjusted automatically by an on-line expert system. This greatly reduces restrictions on what can be captured. This approach saves considerable time in adjusting image quality and gives much more flexibility in manipulating the captured object, even a 3D object, with minimal setup fixtures. In addition, the cross-sectional dimensions of a 3D object can be analyzed by adding a fiber-optic ring light source, which is particularly useful in non-contact metrology of 3D structures. The digitized information can be stored in an easily transferable format, and users can apply a special LUT mapping automatically or manually. Applications of the system include medical image archiving, printing quality control, and 3D machine vision.

  5. Application Of A 1024X1024 Pixel Digital Image Store, With Pulsed Progressive Readout Camera, For Gastro-Intestinal Radiology

    NASA Astrophysics Data System (ADS)

    Edmonds, E. W.; Rowlands, J. A.; Hynes, D. M.; Toth, B. D.; Porter, A. J.

    1986-06-01

    We discuss the applicability of intensified x-ray television systems to general digital radiography and the requirements necessary for physician acceptance. Television systems for videofluorography limited to conventional fluoroscopic exposure rates (25 µR/s at the x-ray image intensifier), with particular application to the gastro-intestinal system, all suffer from three problems that tend to degrade the image: (a) lack of resolution, (b) noise, and (c) patient movement. The system described in this paper addresses each of these problems. Resolution is provided by a 1024 x 1024 pixel frame store combined with a 1024-line video camera and a 10"/6" x-ray image intensifier. The problems of noise and sensitivity to patient movement are overcome by using a short but intense burst of radiation to produce the latent image, which is then read off the video camera progressively and placed in the digital store. Hard copy is produced on a high-resolution multiformat camera or a high-resolution digital laser camera. It is intended that this PPR system will replace the 100 mm spot-film camera in present use and will provide information in digital form for further processing and eventual digital archiving.

  6. Cloud Height Estimation with a Single Digital Camera and Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Carretas, Filipe; Janeiro, Fernando M.

    2014-05-01

    Clouds influence the local weather, the global climate, and are an important parameter in weather prediction models. Clouds are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as at most small aerodromes where it is not economically viable to install instruments for assisted flying. It is therefore important to develop low-cost, robust systems that can be easily deployed in the field, enabling large-scale acquisition of cloud parameters. Recently, the authors developed a low-cost system for measuring cloud base height using stereo vision and digital photography. However, the stereo nature of the system presented some challenges: the relative camera orientation requires calibration, and the two cameras need to be synchronized so that the photos from both cameras are acquired simultaneously. In this work we present a new system that estimates cloud height between 1000 and 5000 meters. The prototype is composed of one digital camera controlled by a Raspberry Pi and is installed at Centro de Geofísica de Évora (CGE) in Évora, Portugal. The camera is periodically triggered to acquire images of the overhead sky, and the photos are downloaded to the Raspberry Pi, which forwards them to a central computer that processes the images and estimates the cloud height in real time. Estimating the cloud height from just one image requires a computer model that is able to learn from previous experience and perform pattern recognition. The model proposed in this work is an Artificial Neural Network (ANN) that was previously trained with cloud features at different heights. The chosen network has three layers: six parameters in the input layer, 12 neurons in the hidden intermediate layer, and an output layer with a single output. The six input parameters are the average intensity values and the intensity standard deviation of each RGB channel. The output
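
    The 6-12-1 network described above can be sketched as a plain forward pass; the weights below are random placeholders (the real network would be trained on labeled cloud images), and the scaling of the output to the 1000-5000 m range is an assumption for illustration:

```python
import math
import random

random.seed(0)

def layer(inputs, weights, biases):
    """One fully connected layer with tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# 6 inputs (mean and std of each RGB channel) -> 12 hidden -> 1 output.
w1 = [[random.uniform(-1, 1) for _ in range(6)] for _ in range(12)]
b1 = [random.uniform(-1, 1) for _ in range(12)]
w2 = [random.uniform(-1, 1) for _ in range(12)]
b2 = 0.0

def predict(features):
    hidden = layer(features, w1, b1)
    raw = sum(w * h for w, h in zip(w2, hidden)) + b2
    # Map the bounded activation onto the prototype's 1000-5000 m range.
    return 3000.0 + 2000.0 * math.tanh(raw)

height = predict([0.5, 0.4, 0.6, 0.10, 0.12, 0.09])
```

    Training would adjust w1, b1, w2, and b2 (e.g. by backpropagation) so that predicted heights match reference measurements.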

  7. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, awarded for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO) and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation, which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  8. Statistical correction of lidar-derived digital elevation models with multispectral airborne imagery in tidal marshes

    USGS Publications Warehouse

    Buffington, Kevin J.; Dugger, Bruce D.; Thorne, Karen M.; Takekawa, John

    2016-01-01

    Airborne light detection and ranging (lidar) is a valuable tool for collecting large amounts of elevation data across large areas; however, the limited ability to penetrate dense vegetation with lidar hinders its usefulness for measuring tidal marsh platforms. Methods to correct lidar elevation data are available, but a reliable method that requires limited field work and maintains spatial resolution is lacking. We present a novel method, the Lidar Elevation Adjustment with NDVI (LEAN), to correct lidar digital elevation models (DEMs) with vegetation indices from readily available multispectral airborne imagery (NAIP) and RTK-GPS surveys. Using 17 study sites along the Pacific coast of the U.S., we achieved an average root mean squared error (RMSE) of 0.072 m, with a 40–75% improvement in accuracy from the lidar bare earth DEM. Results from our method compared favorably with results from three other methods (minimum-bin gridding, mean error correction, and vegetation correction factors), and a power analysis applying our extensive RTK-GPS dataset showed that on average 118 points were necessary to calibrate a site-specific correction model for tidal marshes along the Pacific coast. By using available imagery and with minimal field surveys, we showed that lidar-derived DEMs can be adjusted for greater accuracy while maintaining high (1 m) resolution.
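
    A LEAN-style adjustment can be sketched as a linear model of the lidar elevation error versus NDVI, calibrated against RTK-GPS points; the calibration values below are invented, and the single-predictor linear form is a simplification of the published method:

```python
def fit_linear(xs, ys):
    """Least-squares slope/intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Denser vegetation (higher NDVI) -> larger positive lidar bias (invented data).
ndvi_cal = [0.2, 0.4, 0.6, 0.8]
error_cal = [0.05, 0.12, 0.21, 0.30]      # metres, lidar DEM minus RTK-GPS
a, b = fit_linear(ndvi_cal, error_cal)

def lean_correct(dem_z, ndvi):
    """Subtract the NDVI-predicted vegetation bias from a DEM elevation."""
    return dem_z - (a * ndvi + b)
```

    Applied cell by cell with an NDVI raster from NAIP imagery, this lowers the DEM most where vegetation is densest while preserving the 1 m grid resolution.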

  9. First Results from an Airborne Ka-band SAR Using SweepSAR and Digital Beamforming

    NASA Technical Reports Server (NTRS)

    Sadowy, Gregory; Ghaemi, Hirad; Hensley, Scott

    2012-01-01

    NASA/JPL has developed the SweepSAR technique, which breaks the typical Synthetic Aperture Radar (SAR) trade space by using time-dependent multi-beam digital beamforming (DBF) on receive. A SweepSAR implementation using an array-fed reflector is being developed for the proposed DESDynI Earth Radar Mission concept. We performed a first-of-a-kind airborne demonstration of the SweepSAR concept at Ka-band (35.6 GHz) and validated calibration and antenna-pattern data sufficient for beamforming in elevation. The demonstration (1) provides evidence that the proposed Deformation, Ecosystem Structure and Dynamics of Ice (DESDynI) SAR architecture is sound and (2) shows that it functions well even with large variations in receiver gain and phase. Future plans include using prototype DESDynI SAR digital flight hardware to perform the beamforming in real time onboard the aircraft.

  10. Portable retinal imaging for eye disease screening using a consumer-grade digital camera

    NASA Astrophysics Data System (ADS)

    Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter

    2012-03-01

    The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease of use) to be distributed widely to low-volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body with a 14.8-megapixel CMOS sensor. A 50 mm focal-length lens gives a retinal field of view of 45 degrees, and the internal autofocus can compensate for about 2 diopters of focusing error. The light source is a Philips LED with a linear emitting area that is transformed using a light pipe into the optimal shape at the eye pupil, an annulus. To eliminate the corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and a very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.

  11. The feasibility of photo-based 3D modeling for the structures by using a common digital camera

    NASA Astrophysics Data System (ADS)

    Li, Ping; Zhang, Jin-quan; Li, Wan-heng; Lv, Jian-ming; Wang, Xin-zheng

    2011-12-01

    This article explores photo-based 3D modeling of arch bridge structures with an ordinary digital camera. First, a workflow was established covering camera calibration, data acquisition, data management, 3D orientation, scale setting, and texturing, after which a 3D model can be built from the photos. The model can be measured and edited and is close to the real structure. Taking an interior masonry arch bridge as an example, a 3D model was built through the above workflow using an HP CB350 camera. The model can be integrated with loading conditions and material properties to provide detailed data for structural analysis. This work has accumulated experience in data acquisition and modeling methods that can be applied to other structural analyses and other 3D modeling tasks, with the advantages of speed and economy.

  12. Estimating the spatial position of marine mammals based on digital camera recordings.

    PubMed

    Hoekendijk, Jeroen P A; de Vries, Jurre; van der Bolt, Krissy; Greinert, Jens; Brasseur, Sophie; Camphuysen, Kees C J; Aarts, Geert

    2015-02-01

    Estimating the spatial position of organisms is essential to quantify interactions between the organism and the characteristics of its surroundings, for example, predator-prey interactions, habitat selection, and social associations. Because marine mammals spend most of their time under water and may appear at the surface only briefly, determining their exact geographic location can be challenging. Here, we developed a photogrammetric method to accurately estimate the spatial position of marine mammals or birds at the sea surface. Digital recordings containing landscape features with known geographic coordinates can be used to estimate the distance and bearing of each sighting relative to the observation point. The method can correct for frame rotation, estimates pixel size based on the reference points, and can be applied to scenarios with and without a visible horizon. A set of R functions was written to process the images and obtain accurate geographic coordinates for each sighting. The method is applied to estimate the spatiotemporal fine-scale distribution of harbour porpoises in a tidal inlet. Video recordings of harbour porpoises were made from land, using a standard digital single-lens reflex (DSLR) camera, positioned at a height of 9.59 m above mean sea level. Porpoises were detected up to a distance of ∼3136 m (mean 596 m), with a mean location error of 12 m. The method presented here allows for multiple detections of different individuals within a single video frame and for tracking movements of individuals based on repeated sightings. In comparison with traditional methods, this method only requires a digital camera to provide accurate location estimates. It especially has great potential in regions with ample data on local (a)biotic conditions, to help resolve functional mechanisms underlying habitat selection and other behaviors in marine mammals in coastal areas. PMID:25691982
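
    The core geometry can be sketched under a flat-sea simplification: a camera at height h sees a sighting at angle θ below the horizontal at distance d = h/tan θ. The camera height is taken from the abstract; the per-pixel angular scale below is an assumed value, not the study's calibration:

```python
import math

CAMERA_HEIGHT_M = 9.59       # height above mean sea level, from the abstract
RAD_PER_PIXEL = 4.0e-5       # assumed angular size of one pixel (not calibrated)

def distance_to_sighting(pixels_below_horizon):
    """Flat-sea range estimate from the pixel offset below the horizon line."""
    theta = pixels_below_horizon * RAD_PER_PIXEL
    return CAMERA_HEIGHT_M / math.tan(theta)

d_near = distance_to_sighting(800)   # large angular offset -> nearby animal
d_far = distance_to_sighting(50)     # small angular offset -> distant animal
```

    The published method additionally corrects for frame rotation, calibrates pixel size from reference landmarks, and handles scenes without a visible horizon.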

  13. Estimating the spatial position of marine mammals based on digital camera recordings

    PubMed Central

    Hoekendijk, Jeroen P A; de Vries, Jurre; van der Bolt, Krissy; Greinert, Jens; Brasseur, Sophie; Camphuysen, Kees C J; Aarts, Geert

    2015-01-01

    Estimating the spatial position of organisms is essential to quantify interactions between the organism and the characteristics of its surroundings, for example, predator–prey interactions, habitat selection, and social associations. Because marine mammals spend most of their time under water and may appear at the surface only briefly, determining their exact geographic location can be challenging. Here, we developed a photogrammetric method to accurately estimate the spatial position of marine mammals or birds at the sea surface. Digital recordings containing landscape features with known geographic coordinates can be used to estimate the distance and bearing of each sighting relative to the observation point. The method can correct for frame rotation, estimates pixel size based on the reference points, and can be applied to scenarios with and without a visible horizon. A set of R functions was written to process the images and obtain accurate geographic coordinates for each sighting. The method is applied to estimate the spatiotemporal fine-scale distribution of harbour porpoises in a tidal inlet. Video recordings of harbour porpoises were made from land, using a standard digital single-lens reflex (DSLR) camera, positioned at a height of 9.59 m above mean sea level. Porpoises were detected up to a distance of ∼3136 m (mean 596 m), with a mean location error of 12 m. The method presented here allows for multiple detections of different individuals within a single video frame and for tracking movements of individuals based on repeated sightings. In comparison with traditional methods, this method only requires a digital camera to provide accurate location estimates. It especially has great potential in regions with ample data on local (a)biotic conditions, to help resolve functional mechanisms underlying habitat selection and other behaviors in marine mammals in coastal areas. PMID:25691982

  14. Realization of the FPGA based TDI algorithm in digital domain for CMOS cameras

    NASA Astrophysics Data System (ADS)

    Tao, Shuping; Jin, Guang; Zhang, Xuyan; Qu, Hongsong

    2012-10-01

    In order to make CMOS image sensors suitable for high-resolution space imaging applications, a new method realizing TDI in the digital domain on an FPGA is proposed in this paper, improving the imaging mode for area-array CMOS sensors. The TDI algorithm accumulates the corresponding pixels of adjoining frames in the digital domain, so the gray values increase by a factor of M, where M is the integration number, and image quality improves in signal-to-noise ratio. In addition, optimizations of the TDI algorithm are discussed. First, signal storage is optimized with two slices of external RAM, using memory-depth expansion and a ping-pong operation mechanism. Second, a FIFO operation mechanism reduces the read and write operations on memory by a factor of M×(M-1); this saves signal transfer time proportional to the square of the integration number (M²), so the frame frequency can increase greatly. Finally, a CMOS camera based on digital-domain TDI was developed, and the algorithm was validated by experiments on it.
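
    The digital-domain accumulation can be sketched in software (the paper implements it on an FPGA); toy one-dimensional frames with one-pixel-per-frame image motion stand in for the area-array readout:

```python
def digital_tdi(frames, shift_per_frame):
    """Sum M frames after realigning each for the known image motion."""
    M = len(frames)
    width = len(frames[0]) - shift_per_frame * (M - 1)
    acc = [0] * width
    for i, frame in enumerate(frames):
        offset = i * shift_per_frame   # realign frame i before summing
        for x in range(width):
            acc[x] += frame[x + offset]
    return acc

# A target moving one pixel per frame stays aligned and sums to M * value,
# while uncorrelated noise only grows as sqrt(M), improving SNR.
frames = [[0, 0, 10, 0, 0, 0],
          [0, 0, 0, 10, 0, 0],
          [0, 0, 0, 0, 10, 0]]
line = digital_tdi(frames, 1)
```

    On the FPGA the same accumulation runs in a pipeline, with the ping-pong RAM and FIFO schemes hiding the memory traffic.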

  15. Combining multi-spectral proximal sensors and digital cameras for monitoring grazed tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, R. N.; Gobbett, D. L.; González, L. A.; Bishop-Hurley, G. J.; McGavin, S. L.

    2015-11-01

    Timely and accurate monitoring of pasture biomass and ground cover is necessary in livestock production systems to ensure productive and sustainable management of forage. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since such sensors can return data in near real time and can be deployed on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. However, there are unresolved challenges in developing calibrations that convert raw sensor data to quantitative biophysical values, such as pasture biomass or vegetation ground cover, to allow meaningful interpretation of sensor data by livestock producers. We assessed the use of multiple proximal sensors for monitoring tropical pastures in a pilot deployment at two sites on Lansdown Research Station near Townsville, Australia. Each site was monitored by a Skye SKR four-band multi-spectral sensor (sampling every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each operated over 18 months. Raw data from each sensor were processed to calculate a number of multispectral vegetation indices. Visual observations of pasture characteristics, including above-ground standing biomass and ground cover, were made every 2 weeks. A methodology was developed to manage the sensor deployment and the quality control of the collected data. Data capture from the digital cameras was more reliable than from the multi-spectral sensors, which had up to 63 % of data discarded after data cleaning and quality control. We found a strong relationship between sensor and pasture measurements during the wet-season period of maximum pasture growth (January to April), especially when data from the multi-spectral sensors were combined with weather data. RatioNS34 (a simple band ratio between the near infrared (NIR) and lower shortwave infrared (SWIR) bands) and rainfall since 1

  16. Colorimetric characterization of digital cameras with unrestricted capture settings applicable for different illumination circumstances

    NASA Astrophysics Data System (ADS)

    Fang, Jingyu; Xu, Haisong; Wang, Zhehong; Wu, Xiaomin

    2016-05-01

    With colorimetric characterization, digital cameras can be used as image-based tristimulus colorimeters for color communication. To overcome the restriction to fixed capture settings in conventional colorimetric characterization procedures, a novel method is proposed that takes the capture settings into account. The method computes the colorimetric values of a measured image in five main steps. These include converting the RGB values to equivalent values at the training settings, via factors derived from an imaging-system model, so as to bridge different settings; and applying scaling factors in the preparatory steps of the transformation mapping, to avoid errors resulting from the nonlinearity of the polynomial mapping across different ranges of illumination level. The experimental results indicate that the prediction error of the proposed method, measured with the CIELAB color-difference formula, is less than 2 CIELAB units under different illumination levels and different correlated color temperatures. This prediction accuracy for varying capture settings matches that of the conventional method under a fixed lighting condition.
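
    The settings-conversion step can be sketched with a generic imaging model in which sensor response scales with exposure time and ISO gain and inversely with the square of the aperture f-number; this is an assumed standard model for illustration, not the paper's exact formulation:

```python
def to_training_settings(rgb, capture, training):
    """Scale RGB captured at arbitrary settings to equivalent values at the
    training settings.  capture/training: dicts with 't' (exposure time, s),
    'iso' (gain), and 'fnum' (aperture f-number)."""
    k = ((training['t'] / capture['t']) *
         (training['iso'] / capture['iso']) *
         (capture['fnum'] / training['fnum']) ** 2)
    return [c * k for c in rgb]

# Halving the exposure time at fixed ISO and aperture doubles the scale factor,
# so the converted values line up with the training-condition response.
rgb_equiv = to_training_settings([60, 80, 100],
                                 {'t': 1 / 200, 'iso': 100, 'fnum': 4.0},
                                 {'t': 1 / 100, 'iso': 100, 'fnum': 4.0})
```

    After this conversion, a single polynomial mapping trained at one setting can be reused across capture settings.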

  17. Color segmentation as an aid to white balancing for digital still cameras

    NASA Astrophysics Data System (ADS)

    Cooper, Ted J.

    2000-12-01

    Digital still cameras employ automatic white-balance techniques to adjust sensor amplifier gains so that white objects appear white in the image. A color-cast detection algorithm is presented that uses histogram and segmentation techniques to select near-neutral objects in the image. Once identified and classified, these objects permit determination of the scene illuminant and, implicitly, the respective amplifier gains. Under certain circumstances a scene may contain no near-neutral objects. By applying the segmentation operations to non-neutral image objects, memory colors from skin, sky, and foliage objects may be identified. If identified, these memory colors provide enough chromatic information to predict the scene illuminant. Combining the near-neutral-object approach with memory-color objects yields a reasonable automatic white balance over a wide range of scenes.
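
    The near-neutral selection idea can be sketched as follows; the low-chroma threshold and the toy pixels are illustrative assumptions, not the paper's histogram and segmentation algorithm:

```python
def near_neutral(pixels, tol=0.15):
    """Keep pixels whose R, G, B values are within tol of their own mean."""
    out = []
    for r, g, b in pixels:
        m = (r + g + b) / 3.0
        if m > 0 and max(abs(r - m), abs(g - m), abs(b - m)) / m < tol:
            out.append((r, g, b))
    return out

def wb_gains(pixels):
    """Per-channel gains that render the near-neutral objects gray."""
    neutrals = near_neutral(pixels)
    avg = [sum(p[i] for p in neutrals) / len(neutrals) for i in range(3)]
    gray = sum(avg) / 3.0
    return [gray / a for a in avg]

# A warm cast (R high, B low) on gray objects yields gains boosting blue;
# the saturated green pixel is rejected as non-neutral.
gains = wb_gains([(110, 100, 92), (220, 200, 185), (40, 200, 30)])
```

    When no near-neutral pixels survive the threshold, the paper falls back on memory colors (skin, sky, foliage) to constrain the illuminant estimate.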

  18. Noctilucent clouds: modern ground-based photographic observations by a digital camera network.

    PubMed

    Dubietis, Audrius; Dalin, Peter; Balčiūnas, Ričardas; Černis, Kazimieras; Pertsev, Nikolay; Sukhodoev, Vladimir; Perminov, Vladimir; Zalcik, Mark; Zadorozhny, Alexander; Connors, Martin; Schofield, Ian; McEwan, Tom; McEachran, Iain; Frandsen, Soeren; Hansen, Ole; Andersen, Holger; Grønne, Jesper; Melnikov, Dmitry; Manevich, Alexander; Romejko, Vitaly

    2011-10-01

    Noctilucent, or "night-shining," clouds (NLCs) are a spectacular optical nighttime phenomenon that is very often neglected in the context of atmospheric optics. This paper gives a brief overview of current understanding of NLCs by providing a simple physical picture of their formation, relevant observational characteristics, and scientific challenges of NLC research. Modern ground-based photographic NLC observations, carried out in the framework of automated digital camera networks around the globe, are outlined. In particular, the obtained results refer to studies of single quasi-stationary waves in the NLC field. These waves exhibit specific propagation properties--high localization, robustness, and long lifetime--that are the essential requisites of solitary waves.

  19. Measuring the kinetic parameters of saltating sand grains using a high-speed digital camera

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Wang, Yuan; Jia, Pan

    2014-06-01

    A high-speed digital camera is used to record the saltation of three sand samples (diameter ranges: 300-500, 200-300 and 100-125 μm). An overlapping particle tracking algorithm is then used to reconstruct the saltating trajectories, and a differential scheme to extract the kinetic parameters of the saltating grains. The velocity results confirm the propagating character of saltation in maintaining near-surface aeolian sand transport. Moreover, the acceleration of the saltating sand grains was obtained directly from the reconstructed trajectories, and the results reveal that the climbing stage of the saltating trajectory represents a critical process of energy transfer while the sand grains travel through the air.
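
    The differential scheme for extracting kinetic parameters from a reconstructed trajectory can be sketched with central differences (a generic approach; the paper does not specify its exact scheme):

```python
import numpy as np

def kinematics_from_trajectory(positions, dt):
    """Finite-difference velocity and acceleration from a sampled trajectory.

    positions: (N, 2) array of (x, y) positions in metres, sampled at the
    camera frame interval dt (seconds). Interior points use central
    differences; the endpoints fall back to one-sided differences."""
    positions = np.asarray(positions, dtype=float)
    vel = np.gradient(positions, dt, axis=0)   # first derivative: velocity
    acc = np.gradient(vel, dt, axis=0)         # second derivative: acceleration
    return vel, acc
```

    With a high-speed camera, dt is the reciprocal of the frame rate, so the accuracy of the derivatives improves directly with recording speed.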

  20. Estimating information from image colors: an application to digital cameras and natural scenes.

    PubMed

    Marín-Franch, Iván; Foster, David H

    2013-01-01

    The colors present in an image of a scene provide information about its constituent elements. But the amount of information depends on the imaging conditions and on how information is calculated. This work had two aims. The first was to derive explicitly estimators of the information available and the information retrieved from the color values at each point in images of a scene under different illuminations. The second was to apply these estimators to simulations of images obtained with five sets of sensors used in digital cameras and with the cone photoreceptors of the human eye. Estimates were obtained for 50 hyperspectral images of natural scenes under daylight illuminants with correlated color temperatures 4,000, 6,500, and 25,000 K. Depending on the sensor set, the mean estimated information available across images with the largest illumination difference varied from 15.5 to 18.0 bits and the mean estimated information retrieved after optimal linear processing varied from 13.2 to 15.5 bits (each about 85 percent of the corresponding information available). With the best sensor set, 390 percent more points could be identified per scene than with the worst. Capturing scene information from image colors depends crucially on the choice of camera sensors.

  2. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC)and Submillemeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  3. A simple method for evaluating image quality of screen-film system using a high-performance digital camera

    NASA Astrophysics Data System (ADS)

    Fujita, Naotoshi; Yamazaki, Asumi; Ichikawa, Katsuhiro; Kodera, Yoshie

    2009-02-01

    Screen-film systems are still used in mammography, so it remains important to measure their physical properties, such as the modulation transfer function (MTF) and the noise power spectrum (NPS). The MTF and NPS of screen-film systems are usually measured with a microdensitometer. However, since microdensitometers are not commonly available in general hospitals, it is difficult to carry out these measurements regularly. Previously, Ichikawa et al. measured and evaluated the physical properties of medical liquid crystal displays using a high-performance digital camera; with this approach, the physical properties of screen-film systems can be measured easily without a microdensitometer. We have therefore proposed a simple method for measuring the MTF and NPS of screen-film systems using a high-performance digital camera, based on the edge method (for evaluating the MTF) and the one-dimensional fast Fourier transform (FFT) method (for evaluating the NPS). The MTF and NPS evaluated using the high-performance digital camera corresponded approximately with those evaluated using a microdensitometer, so the camera-based calculation of the MTF and NPS can substitute for the microdensitometer-based one. The method also simplifies the evaluation of the physical properties of screen-film systems.
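
    The edge method named above can be sketched as: differentiate the measured edge spread function (ESF) to obtain the line spread function (LSF), then take the normalized modulus of its Fourier transform. This is a minimal illustration that ignores edge-angle estimation, oversampling, and windowing:

```python
import numpy as np

def mtf_from_edge(esf, pixel_pitch_mm):
    """Edge-method MTF: ESF -> LSF (derivative) -> |FFT|, normalized to 1 at DC."""
    lsf = np.gradient(np.asarray(esf, dtype=float))
    lsf /= lsf.sum()                                    # normalize so MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)  # spatial freq, cycles/mm
    return freqs, mtf
```

    In practice the ESF comes from a scan across a slightly slanted edge image of the screen-film system captured by the camera.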

  4. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    NASA Astrophysics Data System (ADS)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather and the global climate, and they are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models, which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. Such instruments should also be automated and robust, since they may be deployed in remote places and be subject to adverse weather conditions. Besides their importance in environmental systems, clouds are also an essential component of airplane safety when visual flight rules (VFR) are enforced, as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, the cloud cover, and the atmospheric visibility that ensure the safety of pilots and planes. Although instruments to measure those parameters are available in the market, their relatively high cost makes them unavailable to many local aerodromes. In this work we present a new prototype, which has recently been developed and deployed in a local aerodrome as a proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of cloud height from the parallax effect. The new developments consist of a new geometry that allows the simultaneous measurement of cloud base height, wind speed at cloud base height, and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry for measuring the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after measuring the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
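
    Under the simplest geometry (two parallel, zenith-pointing cameras, which is deliberately not the prototype's more complex orientation), the parallax height estimate and the Lambert-Beer (Koschmieder) visibility estimate reduce to two short formulas. All parameter values below are hypothetical:

```python
import math

def cloud_base_height(baseline_m, focal_mm, pixel_um, disparity_px):
    """Stereo parallax: height = f * B / d for two parallel zenith-pointing
    cameras separated by a horizontal baseline B, with image disparity d."""
    d = disparity_px * pixel_um * 1e-6          # disparity in metres on the sensor
    return (focal_mm * 1e-3) * baseline_m / d

def visibility_koschmieder(contrast, distance_m, threshold=0.02):
    """Visibility from the apparent contrast of a dark object against the sky.

    Lambert-Beer attenuation C(x) = exp(-beta * x); visibility is the range
    at which contrast falls to the conventional 2% perception threshold."""
    extinction = -math.log(contrast) / distance_m
    return -math.log(threshold) / extinction
```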

  5. Analysis of airborne LiDAR as a basis for digital soil mapping in Alpine areas

    NASA Astrophysics Data System (ADS)

    Kringer, K.; Tusch, M.; Geitner, C.; Meißl, G.; Rutzinger, M.

    2009-04-01

    Especially in mountainous regions like the Alps, the formation of soil is highly influenced by relief characteristics. Among the factors included in Jenny's (1941) model of soil development, relief is the one most commonly used in approaches to create digital soil maps and to derive soil properties from secondary data sources (McBratney et al. 2003). Elevation data, first-order derivatives (slope, aspect) and second-order derivatives (plan, profile and cross-sectional curvature), as well as complex morphometric parameters (various landform classifications, e.g., Wood 1996) and compound indices (e.g., topographic wetness indices, vertical distance to the drainage network, insolation), can be calculated from digital elevation models (DEMs). However, while being an important source of information for digital soil mapping at small map scales, "conventional" DEMs are of limited use for the design of large-scale conceptual soil maps of small areas due to their rather coarse raster resolutions, with cell sizes ranging from 20 to 100 meters. Slight variations in elevation and small landform features might not be discernible even though they might have a significant effect on soil formation, e.g., regarding the influence of groundwater in alluvial soils or the extent of alluvial fans. Nowadays, airborne LiDAR (Light Detection And Ranging) provides highly accurate data for the elaboration of high-resolution digital terrain models (DTMs), even in forested areas. In the project LASBO (Laserscanning in der Bodenkartierung) the applicability of DTMs derived from LiDAR to the identification of soil-relevant geomorphometric parameters is investigated. Various algorithms initially designed for coarser raster data are applied to high-resolution DTMs. Test areas for LASBO are located in the region of Bruneck (Italy) and near the municipality of Kramsach in the Inn Valley (Austria). The freely available DTM for Bruneck has a raster resolution of 2.5 meters while in Kramsach a DTM with

  6. Characterizing arid region alluvial fan surface roughness with airborne laser swath mapping digital topographic data

    NASA Astrophysics Data System (ADS)

    Frankel, Kurt L.; Dolan, James F.

    2007-06-01

    Range-front alluvial fan deposition in arid environments is episodic and results in multiple fan surfaces and ages. These distinct landforms are often defined by descriptions of their surface morphology, desert varnish accumulation, clast rubification, desert pavement formation, soil development, and stratigraphy. Although quantifying surface roughness differences between alluvial fan units has proven to be difficult in the past, high-resolution airborne laser swath mapping (ALSM) digital topographic data are now providing researchers with an opportunity to study topography in unprecedented detail. Here we use ALSM data to calculate surface roughness on two alluvial fans in northern Death Valley, California. We define surface roughness as the standard deviation of slope in a 5-m by 5-m moving window. Comparison of surface roughness values between mapped fan surfaces shows that each unit is statistically unique at the 99% confidence level. Furthermore, there is an obvious smoothing trend from the presently active channel to a deposit with cosmogenic ¹⁰Be and ³⁶Cl surface exposure ages of ∼70 ka. Beyond 70 ka, alluvial landforms become progressively rougher with age. These data suggest that alluvial fans in arid regions smooth out with time until a threshold is crossed where roughness increases at greater wavelength with age as a result of surface runoff and headward tributary incision into the oldest surfaces.
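
    The roughness metric is stated explicitly: the standard deviation of slope in a 5-m by 5-m moving window. A direct (unoptimized) sketch for a gridded DEM, assuming 1-m cells so a 5 × 5 pixel window matches the 5-m window:

```python
import numpy as np

def surface_roughness(dem, cell_size, window=5):
    """Roughness as the standard deviation of slope (degrees) in a square
    moving window over a gridded DEM (2-D array of elevations)."""
    gy, gx = np.gradient(dem, cell_size)               # elevation gradients
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))    # slope in degrees
    pad = window // 2
    s = np.pad(slope, pad, mode='edge')                # replicate edges
    out = np.empty_like(slope)
    for i in range(slope.shape[0]):
        for j in range(slope.shape[1]):
            out[i, j] = s[i:i + window, j:j + window].std()
    return out
```

    A perfectly planar surface has constant slope, hence zero roughness; rough bar-and-swale fan surfaces produce high local slope variability.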

  7. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications have not been well documented in related ...

  8. New long-zoom lens for 4K super 35mm digital cameras

    NASA Astrophysics Data System (ADS)

    Thorpe, Laurence J.; Usui, Fumiaki; Kamata, Ryuhei

    2015-05-01

    The world of television production is beginning to adopt 4K Super 35 mm (S35) image capture for a widening range of program genres that seek both the unique imaging properties of that large image format and the protection of their program assets in a world anticipating future 4K services. Documentary and natural history production in particular are transitioning to this form of production. The nature of their shooting demands long zoom lenses. In their traditional world of 2/3-inch digital HDTV cameras they have a broad choice in portable lenses - with zoom ranges as high as 40:1. In the world of Super 35mm the longest zoom lens is limited to 12:1 offering a telephoto of 400mm. Canon was requested to consider a significantly longer focal range lens while severely curtailing its size and weight. Extensive computer simulation explored countless combinations of optical and optomechanical systems in a quest to ensure that all operational requests and full 4K performance could be met. The final lens design is anticipated to have applications beyond entertainment production, including a variety of security systems.

  9. A new lunar digital elevation model from the Lunar Orbiter Laser Altimeter and SELENE Terrain Camera

    NASA Astrophysics Data System (ADS)

    Barker, M. K.; Mazarico, E.; Neumann, G. A.; Zuber, M. T.; Haruyama, J.; Smith, D. E.

    2016-07-01

    We present an improved lunar digital elevation model (DEM) covering latitudes within ±60°, at a horizontal resolution of 512 pixels per degree (∼60 m at the equator) and a typical vertical accuracy ∼3 to 4 m. This DEM is constructed from ∼4.5 × 10⁹ geodetically-accurate topographic heights from the Lunar Orbiter Laser Altimeter (LOLA) onboard the Lunar Reconnaissance Orbiter, to which we co-registered 43,200 stereo-derived DEMs (each 1° × 1°) from the SELENE Terrain Camera (TC) (∼10¹⁰ pixels total). After co-registration, approximately 90% of the TC DEMs show root-mean-square vertical residuals with the LOLA data of <5 m compared to ∼50% prior to co-registration. We use the co-registered TC data to estimate and correct orbital and pointing geolocation errors from the LOLA altimetric profiles (typically amounting to <10 m horizontally and <1 m vertically). By combining both co-registered datasets, we obtain a near-global DEM with high geodetic accuracy, and without the need for surface interpolation. We evaluate the resulting LOLA + TC merged DEM (designated as "SLDEM2015") with particular attention to quantifying seams and crossover errors.
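
    The co-registration quality metric quoted (root-mean-square vertical residuals between TC tiles and LOLA heights) is straightforward to compute for a pair of gridded, already-aligned tiles:

```python
import numpy as np

def rms_vertical_residual(dem_a, dem_b):
    """RMS of elevation differences between two co-registered DEM tiles
    sampled on the same grid (a simple sketch; real tiles need masking
    of data gaps before the comparison)."""
    diff = np.asarray(dem_a, dtype=float) - np.asarray(dem_b, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))
```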

  10. Examination of the semi-automatic calculation technique of vegetation cover rate by digital camera images.

    NASA Astrophysics Data System (ADS)

    Takemine, S.; Rikimaru, A.; Takahashi, K.

    Rice is one of the staple foods in the world. High-quality rice production requires periodically collecting rice growth data to control the growth of the rice. The height of the plant, the number of stems, and the color of the leaves are well-known parameters indicating rice growth, and a rice growth diagnosis method based on these parameters is used operationally in Japan, although collecting these parameters by field survey requires a lot of labor and time. Recently, a laborsaving method for rice growth diagnosis was proposed, based on the vegetation cover rate of rice. The vegetation cover rate is calculated by discriminating rice plant areas in a digital camera image photographed in the nadir direction. Discrimination of rice plant areas in the image was done by automatic binarization processing. However, when the vegetation cover rate calculation depends on automatic binarization alone, there is a possibility that the calculated cover rate decreases even as the rice grows. In this paper, a calculation method for vegetation cover rate is proposed that is based on automatic binarization and refers to growth hysteresis information. For several images obtained by field survey during the rice growing season, the vegetation cover rate was calculated by the conventional automatic binarization processing and by the proposed method, and the cover rates from both methods were compared with reference values obtained by visual interpretation. The comparison shows that the accuracy of discriminating rice plant areas was increased by the proposed
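
    The automatic binarization step that the conventional method relies on can be sketched with Otsu's method on an excess-green index (an assumption for illustration: the paper does not state which vegetation index or binarization algorithm is used):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Classic Otsu threshold: maximize between-class variance of a histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                      # class-0 probability up to each bin
    w1 = 1.0 - w0
    mu = np.cumsum(p * centers)            # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

def vegetation_cover_rate(rgb):
    """Fraction of pixels classified as plant, from the excess-green index
    ExG = 2G - R - B binarized with Otsu's method."""
    r, g, b = (rgb[..., k].astype(float) for k in range(3))
    exg = 2 * g - r - b
    t = otsu_threshold(exg.ravel())
    return float((exg > t).mean())
```

    The proposed method would additionally constrain this rate with growth-hysteresis information so that it cannot drop as the canopy closes.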

  11. Hard color-shrinkage for color-image processing of a digital color camera

    NASA Astrophysics Data System (ADS)

    Saito, Takahiro; Ueda, Yasutaka; Fujii, Nobuhiro; Komatsu, Takashi

    2010-01-01

    The classic shrinkage works well for monochrome-image denoising. To utilize inter-channel color correlations, a noisy image undergoes the color-transformation from the RGB to the luminance-and-chrominance color space, and the luminance and the chrominance components are separately denoised. However, this approach cannot cope with the signal-dependent noise of a digital color camera. To utilize the noise's signal-dependencies, we previously proposed the soft color-shrinkage, where the inter-channel color correlations are directly utilized in the RGB color space. The soft color-shrinkage works well but involves a large amount of computation. To alleviate this drawback, taking up the l0-l2 optimization problem whose solution yields the hard shrinkage, we introduce the l0 norms of color differences and the l0 norms of color sums into the model, and derive the hard color-shrinkage as its solution. For each triplet of three primary colors, the hard color-shrinkage has 24 feasible solutions, and from among them it selects the optimal feasible solution giving the minimal energy. We propose a method to control its shrinkage parameters spatially-adaptively according to both the local image statistics and the noise's signal-dependencies, and apply the spatially-adaptive hard color-shrinkage to the removal of signal-dependent noise in a shift-invariant wavelet transform domain. The hard color-shrinkage performs better than the soft color-shrinkage in most cases, from both objective and subjective viewpoints.
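
    For orientation, the scalar hard shrinkage that the proposed hard color-shrinkage generalizes is the closed-form solution of the elementwise l0-l2 problem min_x (x - y)^2 + lam^2 * ||x||_0 (this is the standard monochrome baseline, not the paper's 24-solution color version):

```python
import numpy as np

def hard_shrinkage(coeffs, lam):
    """Elementwise hard shrinkage: keep a coefficient y when y^2 > lam^2
    (i.e. |y| > lam), otherwise set it to zero."""
    coeffs = np.asarray(coeffs, dtype=float)
    return np.where(np.abs(coeffs) > lam, coeffs, 0.0)
```

    The color version replaces the scalar penalty with l0 norms of color differences and color sums over each RGB triplet, which is what produces the 24 feasible solutions mentioned above.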

  12. Analysis of chemiluminescence measurements by grey-scale ICCD and colour digital cameras

    NASA Astrophysics Data System (ADS)

    Migliorini, F.; Maffi, S.; De Iuliis, S.; Zizak, G.

    2014-05-01

    Spectral, grey-scale and colour chemiluminescence measurements of C2* and CH* radicals' emission are carried out on the flame front of a methane-air premixed flame at different equivalence ratios. To this purpose, properly spatially resolved optical equipment has been implemented in order to reduce the background emission from other burned gas regions. The grey-scale (ICCD + interference filters) and RGB colour (commercial digital camera) approaches have been compared in order to find a correspondence between the C2* and the green component, as well as the CH* and the blue component, of the emission intensities. The C2*/CH* chemiluminescence ratio has been investigated at different equivalence ratios and a good correlation has been obtained, showing the possibility of sensing the equivalence ratio in practical systems. The grey-scale and colour chemiluminescence analysis has then been applied to a meso-scale non-premixed swirl combustor fuelled with a methane-air mixture and operating at 0.3 MPa. 2D results are presented and discussed in this work.
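
    The correspondence the authors exploit (C2* with the green channel, CH* with the blue channel) suggests a simple proxy for the chemiluminescence ratio from a colour image. This is an illustration only, with no background subtraction or spectral calibration, both of which the paper's optical setup addresses:

```python
import numpy as np

def chemiluminescence_ratio(rgb_flame):
    """Proxy for the C2*/CH* ratio over a flame-front region of interest:
    mean green channel intensity over mean blue channel intensity."""
    g = rgb_flame[..., 1].astype(float)
    b = rgb_flame[..., 2].astype(float)
    return g.mean() / b.mean()
```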

  13. Product quality-based eco-efficiency applied to digital cameras.

    PubMed

    Park, Pil-Ju; Tahara, Kiyotaka; Inaba, Atsushi

    2007-04-01

    When calculating eco-efficiency, there are considerable confusion and controversy about what the product value is and how it should be quantified. We have proposed here a quantification method for eco-efficiency that derives the ratio of the product quality multiplied by the life span of a product to its whole environmental impact based on Life Cycle Assessment (LCA). In this study, product quality was used as the product value and quantified by the following three steps: (1) normalization based on a value function, (2) determination of the subjective weighting factors of the attributes, and (3) calculation of the product quality of the chosen products. The applicability of the proposed method to an actual product was evaluated using digital cameras. The results show that the eco-efficiency values of products equipped with rechargeable batteries were higher than those of products that use alkaline batteries, because of higher quality values and lower environmental impacts. The sensitivity analysis shows that the proposed method was superior to the existing methods, because it enables identification of the quality level of the chosen products by considering all products that have the same functions in the market and because, when adding a new product, the calculated quality values in the proposed method do not have to be changed.
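
    The headline ratio is simple once the three quantities are available; the hard work in the paper is quantifying the quality term via normalization and subjective weighting. A sketch, with units and weighting left to the user:

```python
def eco_efficiency(quality, lifespan_years, environmental_impact):
    """Eco-efficiency as proposed: (product quality x life span) divided by
    the whole life-cycle environmental impact from LCA. The quality value
    is assumed to be the normalized, weighted score from the three steps
    described in the abstract."""
    return quality * lifespan_years / environmental_impact
```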

  14. Integral estimation of number of resolvable signal levels of digital cameras

    NASA Astrophysics Data System (ADS)

    Cheremkhin, P. A.; Evtikhiev, N. N.; Krasnov, V. V.; Kurbatova, E. A.; Starikov, R. S.; Starikov, S. N.

    2016-08-01

    The number of signal levels of modern photo and video cameras reaches thousands or tens of thousands. However, because of temporal and spatial pixel noise and the limited linear dynamic range, the number of resolvable signal levels is significantly lower. Earlier, an iterative method for estimating the number of resolvable signal levels of cameras was proposed. In this paper, an integral method for estimating the number of resolvable signal levels is proposed and applied to a consumer camera.
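
    A back-of-the-envelope version of the underlying idea (not the paper's iterative or integral method) is to divide the linear dynamic range by some multiple k of the noise standard deviation per distinguishable step:

```python
def resolvable_levels(full_range, noise_sigma, k=1.0):
    """Rough estimate of resolvable signal levels: linear dynamic range in
    digital numbers divided by k noise standard deviations per step.
    The choice of k (how far apart two levels must be to count as
    resolvable) is an assumption, not the paper's criterion."""
    return int(full_range / (k * noise_sigma))
```

    For example, a nominally 12-bit camera (4096 levels) with 2 DN of temporal noise resolves on the order of 2048 levels under this crude criterion; the paper's methods account for the signal dependence of the noise across the range.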

  15. Performance analysis of digital cameras versus chromatic white light (CWL) sensors for the localization of latent fingerprints in crime scenes

    NASA Astrophysics Data System (ADS)

    Jankow, Mathias; Hildebrandt, Mario; Sturm, Jennifer; Kiltz, Stefan; Vielhauer, Claus

    2012-06-01

    In future applications of contactless acquisition techniques for latent fingerprints, the automatic localization of potential fingerprint traces in crime scenes is required. Our goal is to study the application of a camera-based approach, comparing it with the performance of chromatic white light (CWL) techniques for coarse latent fingerprint localization and the resulting acquisition using detailed scans. Furthermore, we briefly evaluate the suitability of the camera-based acquisition for the detection of malicious fingerprint traces using an extended camera setup in comparison to Kiltz et al. Our experimental setup includes a Canon EOS 550D digital single-lens reflex (DSLR) camera and an FRT MicroProf200 surface measurement device with a CWL600 sensor. We apply at least two fingerprints to each surface in our test set of 8 smooth, textured and structured surfaces to evaluate the detection performance of the two localization techniques using different pre-processing and feature extraction techniques. Printed fingerprint patterns, as reproducible but potentially malicious traces, are additionally acquired and analyzed on foil and compact discs. Our results indicate a positive tendency towards fast localization using the camera-based technique: all fingerprints that are located using the CWL sensor are found using the camera. However, the disadvantage of the camera-based technique is that the size of the region of interest for the detailed scan of each potential latent fingerprint is usually slightly larger compared to the CWL-based localization. Furthermore, this technique does not acquire 3D data, and the resulting images are distorted due to the necessary angle between the camera and the surface. When applying the camera-based approach, it is necessary to optimize the feature extraction and classification. Furthermore, the required acquisition time for each potential fingerprint needs to be estimated to determine the time-savings of the

  16. Airborne nanoparticle characterization with a digital ion trap-reflectron time of flight mass spectrometer

    NASA Astrophysics Data System (ADS)

    Wang, Shenyi; Johnston, Murray V.

    2006-12-01

    A digital ion trap-reflectron time of flight mass spectrometer is described for airborne nanoparticle characterization. Charged particles sampled into this nanoaerosol mass spectrometer (NAMS) are captured in the ion trap and ablated with a high fluence laser pulse to reach the "complete ionization limit". Atomic ions produced from the trapped particle(s) are mass analyzed by time of flight, and the elemental composition is determined from the relative signal intensities in the mass spectrum. The particle size range captured in the ion trap is selected by the frequency applied to the ring electrode. Size selection is based on the mass normalized particle diameter, defined as the diameter of a spherical particle with unit density that has the same mass as the particle being analyzed. For the current instrument configuration, ring electrode frequencies between 5 and 140 kHz allow selective trapping of particles with a mass normalized diameter between 7 and 25 nm with a geometric standard deviation of about 1.1. The particle detection efficiency, defined as the fraction of charged particles entering the mass spectrometer that are subsequently captured and analyzed, is between 1 × 10⁻⁴ and 3 × 10⁻⁴ over this size range. The effective particle density can be determined from simultaneous measurement of the mobility and mass normalized diameters. Test nanoparticles composed of sucrose, polyethylene glycol, polypropylene glycol, sodium chloride, ammonium sulfate and copper(II) chloride are investigated. In most cases, the measured elemental compositions match the expected elemental compositions within ±5% or less and the measured compositions do not change with particle size. The one exception is copper chloride, which does not yield a well-developed plasma when it is irradiated by the laser pulse.
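
    The mass-normalized diameter is defined in the abstract: the diameter of a unit-density sphere with the same mass as the analyzed particle. In SI units (unit density = 1 g/cm³ = 1000 kg/m³), inverting the sphere mass formula m = (π/6)·ρ·d³ gives:

```python
import math

def mass_normalized_diameter(mass_kg, unit_density=1000.0):
    """Diameter (m) of a sphere of unit density (1 g/cm^3, i.e. 1000 kg/m^3)
    having the same mass as the analyzed particle: d = (6m / (pi*rho))^(1/3)."""
    return (6.0 * mass_kg / (math.pi * unit_density)) ** (1.0 / 3.0)
```

    Comparing this diameter with the mobility diameter from a differential mobility analyzer yields the effective particle density mentioned above.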

  17. Digital holographic PTV for complicated flow in a water by two cameras and refractive index-matching method

    NASA Astrophysics Data System (ADS)

    Kuniyasu, Masataka; Aoyagi, Yusuke; Unno, Noriyuki; Satake, Shin-ichi; Yuki, Kazuhisa; Seki, Yohji

    2016-06-01

    A basic heat transfer promoter, such as a packed bed of spheres, is one of the technologies for promoting heat transfer through turbulent mixing. We carried out 3-D visualization by digital holographic PTV to understand the complicated flow in a sphere-packed pipe (SPP), using a refractive index-matching method with water as the working fluid; the spheres were made of MEXFLON, whose refractive index is the same as that of water. To visualize the detailed flow structure around the spheres in water, we performed three-dimensional simultaneous measurements of the velocity field in the water flow in the SPP with our proposed holography technique using two cameras. The velocity field obtained with two cameras resolved finer flow structures than that obtained with one camera.

  18. Estimates of the error caused by atmospheric turbulence in determining object's motion speed using a digital camera

    NASA Astrophysics Data System (ADS)

    Valley, M. T.; Dudorov, V. V.; Kolosov, V. V.; Filimonov, G. A.

    2006-11-01

    The paper considers the error, caused by atmospheric turbulence, in determining the motion speed of an object from successive images recorded on the sensor matrix of a digital camera. Numerical modeling of the image of a moving object at successive time moments is performed. The variance of the fluctuations of the image mass center, which affects the measurement error, is calculated. Error dependences on the distance to the object and the path slope angle are obtained for different turbulence models. Both situations are considered: when the angular displacement of the object between two successive shots of the digital camera is greater than the isoplanatism angle, and when the angular displacement is smaller than this angle.
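
    For a two-frame speed estimate v = (x2 - x1) / dt, independent turbulence-induced jitter of the image mass center with standard deviation sigma in each frame propagates to a speed error of sqrt(2)·sigma/dt. This is a simplified model of the effect studied (it ignores the correlation between frames that arises when the displacement is smaller than the isoplanatism angle):

```python
import math

def speed_error_from_jitter(sigma_centroid_m, dt_s):
    """Standard deviation of a two-frame speed estimate v = (x2 - x1)/dt
    when each image mass center jitters independently with standard
    deviation sigma_centroid_m (object-plane metres)."""
    return math.sqrt(2.0) * sigma_centroid_m / dt_s
```

    When the two shots fall within the same isoplanatic patch, the jitter is partially common to both frames and cancels in the difference, which is why the paper treats the two angular-displacement regimes separately.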

  19. Comparative spectral analysis between the functionality of the human eye and of the optical part of a digital camera

    NASA Astrophysics Data System (ADS)

    Toadere, Florin

    2015-02-01

    Software that comparatively analyzes the spectral functionality of the optical part of the human eye and of the optical image acquisition system of a digital camera is presented. Comparisons are made using demonstrative images that show the spectral color transformations of an image considered as the test object. To perform the simulations, the spectral models are presented and their effects on the colors of the spectral image are computed, during the propagation of D48 sunlight through the eye and through the optics of the digital camera. The simulations use a spectral image processing algorithm that converts the spectral image into the XYZ color space, the CIECAM02 color appearance model, and then into the RGB color space.

  20. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitkin Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature

  1. Digital X-ray camera for quality evaluation of three-dimensional topographic reconstruction of single crystals of biological macromolecules

    NASA Technical Reports Server (NTRS)

    Borgstahl, Gloria (Inventor); Lovelace, Jeff (Inventor); Snell, Edward Holmes (Inventor); Bellamy, Henry (Inventor)

    2008-01-01

    The present invention provides a digital topography imaging system for determining the crystalline structure of a biological macromolecule, wherein the system employs a charge coupled device (CCD) camera with antiblooming circuitry to directly convert x-ray signals to electrical signals without the use of phosphor and measures reflection profiles from the x-ray emitting source after x-rays are passed through a sample. Methods for using said system are also provided.

  2. A New Towed Digital DeepSea Camera and Multi-Rock Coring System: The WHOI TowCam

    NASA Astrophysics Data System (ADS)

    Billings, A.; Fornari, D.

    2002-12-01

    This year, a team of engineers at the Woods Hole Oceanographic Institution (WHOI) developed and successfully tested a new, digital deep-sea camera system as part of an NSF equipment development grant. The system has been used during two expeditions, one to the Galapagos Rift, and the most recent one to the Hess Deep. To date it has acquired nearly 20,000 digital seafloor images. The new WHOI Towed Digital Camera and Multi-Rock Coring System (TowCam) is an internally recording digital deep-sea camera system that also permits acquisition of volcanic glass samples using up to six rock corers in conjunction with CTD water properties data. The TowCam is towed on a standard UNOLS coaxial CTD sea cable, thereby permitting real-time acquisition of digital depth and altitude data that can be used to help quantify objects in the digital images. The use of the conducting sea cable and CTD system also permits triggering of six rock core units on the sled so that discrete samples of volcanic glass can be collected during a lowering. By operating either at night in between Alvin dives, or during other seagoing programs, photographic information of the seafloor can be recorded for near real-time analysis and for planning subsequent Alvin dives or other sampling and surveying programs. The new WHOI TowCam is a self-recording, deep-sea towed camera system rated to 6000 m. It is capable of remotely taking 1000 high-resolution color digital photographs on each lowering at intervals of 10-60 sec, while being towed 5-7 m above the bottom at speeds of up to 1/2 knot. The digital camera (DigiSeaCam) was developed by DeepSea Power and Light of San Diego, CA and uses a 3.3 Megapixel Nikon 995. The onboard CTD (SeaBird 25) permits real-time display and recording of digital depth, altitude and other standard CTD sensors (e.g. conductivity, temperature, turbidity), and provides connectivity to the pylon that permits triggering of the rock corers.
A strobe monitor connected to a spare serial port in
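
The abstract notes that real-time depth and altitude data help quantify objects in the images. As a minimal sketch, the ground footprint of a downward-looking frame camera can be estimated from towing altitude and field of view under a simple pinhole model (the 60° field of view and 6 m altitude below are illustrative assumptions, not TowCam specifications):

```python
import math

def image_footprint(altitude_m, fov_deg):
    """Ground footprint (one side, in meters) of a downward-looking camera
    modeled as a pinhole: footprint = 2 * h * tan(FOV / 2)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

# Hypothetical numbers: towed 6 m above bottom with a 60-degree field of view.
width_m = image_footprint(6.0, 60.0)   # about 6.9 m across
```

Dividing this width by the sensor's pixel count then gives a ground sample size for scaling features seen in the photographs.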

  3. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera.

    PubMed

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-03-04

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we take into account the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several times, or even hundreds of times, without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera.
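
The per-pixel exposure coding idea can be illustrated with a toy simulation: each pixel in a 2×2 tile is exposed during a different sub-frame, so one camera frame multiplexes four time samples that can be demultiplexed at reduced spatial resolution. This is a hedged sketch of the general technique only, not the authors' DMD prototype or their median-based reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 4, 8, 8                     # 4 sub-frames per camera frame
scene = rng.random((T, H, W))         # fast-changing scene, unknown to the camera

# Per-pixel exposure code: within each 2x2 tile, pixel (i, j) is exposed
# only during sub-frame 2*i + j (a simple interleaved code a DMD could apply).
code = np.zeros((T, H, W), dtype=bool)
for i in range(2):
    for j in range(2):
        code[2 * i + j, i::2, j::2] = True

frame = (scene * code).sum(axis=0)    # what the sensor integrates in one frame

# Demultiplexing: same-code pixels form one low-resolution image per
# sub-frame, trading spatial resolution for temporal resolution.
recon = [frame[i::2, j::2] for i in range(2) for j in range(2)]
```

With this code, a 25 fps readout carries four distinct 12.5 ms time slices per frame, which is the sense in which temporal resolution grows without extra bandwidth.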

  4. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    PubMed Central

    Feng, Wei; Zhang, Fumin; Qu, Xinghua; Zheng, Shiwei

    2016-01-01

    High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture rapid phenomena at high speed and high resolution. In this paper, we take into account the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze the per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can rapidly increase the temporal resolution several times, or even hundreds of times, without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera. PMID:26959023

  5. Fault dislocation modeled structure of lobate scarps from Lunar Reconnaissance Orbiter Camera digital terrain models

    NASA Astrophysics Data System (ADS)

    Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.

    2013-02-01

    Before the launch of the Lunar Reconnaissance Orbiter, known characteristics of lobate scarps on the Moon were limited to studies of only a few dozen scarps revealed in Apollo-era photographs within ~20° of the equator. The Lunar Reconnaissance Orbiter Camera now provides meter-scale images of more than 100 lobate scarps, as well as stereo-derived topography of about a dozen scarps. High-resolution digital terrain models (DTMs) provide unprecedented insight into scarp morphology and dimensions. Here, we analyze images and DTMs of the Slipher, Racah X-1, Mandel'shtam-1, Feoktistov, Simpelius-1, and Oppenheimer F lobate scarps. Parameters in fault dislocation models are iteratively varied to provide best fits to DTM topographic profiles to test previous interpretations that the observed landforms are the result of shallow, low-angle thrust faults. Results suggest that these faults occur from the surface down to depths of hundreds of meters, have dip angles of 35-40°, and have typical maximum slips of tens of meters. These lunar scarp models are comparable to modeled geometries of lobate scarps on Mercury, Mars, and asteroid 433 Eros, but are shallower and ~10° steeper than geometries determined in studies with limited Apollo-era data. Frictional and rock mass strength criteria constrain the state of global differential stress between 3.5 and 18.6 MPa at the modeled maximum depths of faulting. Our results are consistent with thermal history models that predict relatively small compressional stresses that likely arise from cooling of a magma ocean.

  6. Snow process monitoring in mountain forest environments with a digital camera network

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu; Menzel, Lucas

    2016-04-01

    Snow processes are important components of the hydrologic cycle in mountainous areas and at high latitudes. Sparse observations in remote regions, in combination with complex topography, local climate specifics and the impact of heterogeneous vegetation cover, complicate a detailed investigation of snow related processes. In this study, a camera network is applied to monitor complex snow processes with high temporal resolution in montane forest environments (800-1200 m a.s.l.) in southwestern Germany. A typical feature of this region is the high temporal variability of weather conditions, with frequent snow accumulation and ablation processes and recurrent snow interception on conifers. We developed a semi-automatic procedure to interpret snow depths from the digital images, which shows high consistency with manual readings and station-based measurements. To extract the snow canopy interception dynamics from the pictures, six binary classification methods are compared. The MaxEntropy classifier performs markedly better than the others under various illumination conditions and is therefore selected to quantify snow interception. The snow accumulation and ablation processes on the ground, as well as the snow loading and unloading in forest canopies, are investigated based on the snow parameters derived from the time-lapse photography. In addition, the influences of meteorological conditions, forest cover and elevation on snow processes are considered. Further, our investigations serve to improve the snow and interception modules of a hydrological model. We found that time-lapse photography is an effective and low-cost approach to collect snow-related information that supports our understanding of snow processes and the further development of hydrological models. We will present selected results from our investigations over two consecutive winters.
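
The abstract does not spell out its MaxEntropy classifier; one classical maximum-entropy formulation is Kapur's thresholding, which picks the gray level maximizing the combined entropy of the two classes. The sketch below uses that formulation on a synthetic two-tone snow/canopy image and is an assumption about the method, not the authors' exact classifier:

```python
import numpy as np

def max_entropy_threshold(gray):
    """Kapur's maximum-entropy threshold for an 8-bit grayscale image:
    choose t maximizing the sum of the entropies of the two classes."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    cum = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        w0, w1 = cum[t], 1.0 - cum[t]
        if w0 <= 0 or w1 <= 0:
            continue
        p0 = p[: t + 1] / w0            # class 0 distribution
        p1 = p[t + 1 :] / w1            # class 1 distribution
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Toy image: dark canopy (gray ~40) with a bright snow patch (gray ~210).
img = np.full((20, 20), 40, dtype=np.uint8)
img[5:15, 5:15] = 210
t = max_entropy_threshold(img)
snow_mask = img > t
```

In practice the threshold would be computed per image, which is what makes entropy-based methods attractive under changing illumination.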

  7. Single-camera stereo-digital image correlation with a four-mirror adapter: optimized design and validation

    NASA Astrophysics Data System (ADS)

    Yu, Liping; Pan, Bing

    2016-12-01

    A low-cost, easy-to-implement but practical single-camera stereo-digital image correlation (DIC) system using a four-mirror adapter is established for accurate shape and three-dimensional (3D) deformation measurements. The mirror-assisted pseudo-stereo imaging system converts a single camera into two virtual cameras, which view a specimen from different angles and record the surface images of the test object onto the two halves of the camera sensor. To enable deformation measurement in non-laboratory conditions or extremely high-temperature environments, an active imaging optical design, combining an actively illuminated monochromatic source with a coupled band-pass optical filter, is compactly integrated into the pseudo-stereo DIC system. The optical design, basic principles and implementation procedures of the established system for 3D profile and deformation measurements are described in detail. The effectiveness and accuracy of the established system are verified by measuring the profile of a regular cylinder surface and the displacements of a translated planar plate. As an application example, the established system is used to determine the tensile strains and Poisson's ratio of a composite solid propellant specimen during a stress relaxation test. Since the established single-camera stereo-DIC system needs only a single camera and is strongly robust against variations in ambient light or the thermal radiation of a hot object, it demonstrates great potential for determining transient deformation in non-laboratory or high-temperature environments with the aid of a single high-speed camera.

  8. A pilot project combining multispectral proximal sensors and digital cameras for monitoring tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, Rebecca N.; Gobbett, D. L.; González, Luciano A.; Bishop-Hurley, Greg J.; McGavin, Sharon L.

    2016-08-01

    Timely and accurate monitoring of pasture biomass and ground cover is necessary in livestock production systems to ensure productive and sustainable management. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since data can be returned in near real time. Proximal sensors have the potential for deployment on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. There are unresolved challenges in gathering reliable sensor data and in calibrating raw sensor data to values such as pasture biomass or vegetation ground cover, which allow meaningful interpretation of sensor data by livestock producers. Our goal was to assess whether a combination of proximal sensors could be reliably deployed to monitor tropical pasture status in an operational beef production system, as a precursor to designing a full sensor deployment. We use this pilot project to (1) illustrate practical issues around sensor deployment, (2) develop the methods necessary for the quality control of the sensor data, and (3) assess the strength of the relationships between vegetation indices derived from the proximal sensors and field observations across the wet and dry seasons. Proximal sensors were deployed at two sites in a tropical pasture on a beef production property near Townsville, Australia. Each site was monitored by a Skye SKR-four-band multispectral sensor (every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each of which were operated over 18 months. Raw data from each sensor was processed to calculate multispectral vegetation indices. The data capture from the digital cameras was more reliable than the multispectral sensors, which had up to 67 % of data discarded after data cleaning and quality control for technical issues related to the sensor design, as well as environmental issues such as water incursion and insect infestations. 
We recommend
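
The "multispectral vegetation indices" computed from the four-band sensor data typically include ratios such as the NDVI, which contrasts near-infrared and red reflectance. A minimal sketch with hypothetical reflectance readings (the specific values below are illustrative, not measurements from this deployment):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Values near 1 indicate dense green vegetation; near 0, soil or senescence."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical band readings: green wet-season pasture vs. dry-season soil.
ndvi_pasture = ndvi(0.50, 0.08)   # high NIR, low red: vigorous growth
ndvi_soil = ndvi(0.30, 0.25)      # similar NIR and red: bare or dry cover
```

Because the index is a ratio, it partly cancels overall illumination changes, which matters for unattended proximal sensors logging through varying sky conditions.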

  9. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  10. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.; Simpson, A. D. (Technical Monitor)

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC "Pop-Up" Detectors (PUDs) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar® suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC-developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  11. Comparison of Target- and Mutual Information Based Calibration of Terrestrial Laser Scanner and Digital Camera for Deformation Monitoring

    NASA Astrophysics Data System (ADS)

    Omidalizarandi, M.; Neumann, I.

    2015-12-01

    In the current state of the art, geodetic deformation analysis of natural and artificial objects (e.g. dams, bridges, ...) is an ongoing research topic in both static and kinematic mode and has received considerable interest from researchers and geodetic engineers. In this work, to increase the accuracy of geodetic deformation analysis, a terrestrial laser scanner (TLS; here the Zoller+Fröhlich IMAGER 5006) and a high resolution digital camera (Nikon D750) are integrated to complement each other. In order to optimally combine the acquired data of the hybrid sensor system, a highly accurate estimation of the extrinsic calibration parameters between the TLS and the digital camera is a vital preliminary step. Thus, the calibration of the aforementioned hybrid sensor system can be separated into three single calibrations: calibration of the camera, calibration of the TLS, and extrinsic calibration between the TLS and the digital camera. In this research, we focus on highly accurate estimation of the extrinsic parameters between the fused sensors; both target-based and targetless (mutual information based) methods are applied. In target-based calibration, different types of observations (image coordinates, TLS measurements, and laser tracker measurements for validation) are utilized, and variance component estimation is applied to assign adequate weights to the observations. Space resection bundle adjustment based on the collinearity equations is solved using the Gauss-Markov and Gauss-Helmert models. Statistical tests are performed to discard outliers and large residuals in the adjustment procedure. Finally, the two aforementioned approaches are compared, their advantages and disadvantages are investigated, and numerical results are presented and discussed.

  12. Co-Registration Airborne LIDAR Point Cloud Data and Synchronous Digital Image Registration Based on Combined Adjustment

    NASA Astrophysics Data System (ADS)

    Yang, Z. H.; Zhang, Y. S.; Zheng, T.; Lai, W. B.; Zou, Z. R.; Zou, B.

    2016-06-01

    Aiming at the problem of co-registering airborne laser point cloud data with synchronous digital images, this paper proposes a registration method based on combined adjustment. By integrating tie points and point cloud data with elevation-constraint pseudo observations, and using the principle of least-squares adjustment to solve for the corrections of the exterior orientation elements of each image, high-precision registration results can be obtained. In order to ensure the reliability of the tie points and the effectiveness of the pseudo observations, this paper proposes a point-cloud-constrained SIFT matching and optimization method that ensures the tie points are located on flat terrain. In experiments with airborne laser point cloud data and its synchronous digital images, the original POS data yield an error of about 43 pixels in image space. If only the bore-sight of the POS system is considered, an error of 1.3 pixels remains. The proposed method treats the corrections of the exterior orientation elements of each image as unknowns and reduces the error to 0.15 pixels.
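
The combined adjustment with pseudo observations reduces to weighted least squares in the Gauss-Markov model: constraint rows simply enter the normal equations alongside the ordinary observations, with their own (usually larger) weights. A toy numeric sketch of that mechanism (the design matrix, observations, and weights below are illustrative, not the paper's actual photogrammetric model):

```python
import numpy as np

# Observation equations l = A x + v with weight matrix P. The last row is a
# pseudo-observation (e.g. an elevation constraint) weighted more heavily.
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],    # tie-point observation linking both unknowns
              [1.0, -1.0]])   # constraint row (pseudo observation)
l = np.array([2.0, 3.0, 5.1, -1.0])
P = np.diag([1.0, 1.0, 1.0, 10.0])

N = A.T @ P @ A                       # normal matrix
x = np.linalg.solve(N, A.T @ P @ l)   # least-squares corrections
residuals = A @ x - l
```

The heavy weight drives the solution to honor the constraint almost exactly, which is how elevation pseudo observations keep the adjusted orientations consistent with the point cloud.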

  13. Small Field of View Scintimammography Gamma Camera Integrated to a Stereotactic Core Biopsy Digital X-ray System

    SciTech Connect

    Andrew Weisenberger; Fernando Barbosa; T. D. Green; R. Hoefer; Cynthia Keppel; Brian Kross; Stanislaw Majewski; Vladimir Popov; Randolph Wojcik

    2002-10-01

    A small field of view gamma camera has been developed for integration with a commercial stereotactic core biopsy system. The goal is to develop and implement a dual-modality imaging system utilizing scintimammography and digital radiography to evaluate the reliability of scintimammography in predicting the malignancy of suspected breast lesions from conventional X-ray mammography. The scintimammography gamma camera is a custom-built mini gamma camera with an active area of 5.3 cm × 5.3 cm and is based on a 2 × 2 array of Hamamatsu R7600-C8 position-sensitive photomultiplier tubes. The spatial resolution of the gamma camera at the collimator surface is < 4 mm full-width at half-maximum, and its sensitivity is ~4000 Hz/mCi. The system is also capable of acquiring dynamic scintimammographic data to allow for dynamic uptake studies. Sample images of preliminary clinical results are presented to demonstrate the performance of the system.

  14. Active hyperspectral imaging using a quantum cascade laser (QCL) array and digital-pixel focal plane array (DFPA) camera.

    PubMed

    Goyal, Anish; Myers, Travis; Wang, Christine A; Kelly, Michael; Tyrrell, Brian; Gokden, B; Sanchez, Antonio; Turner, George; Capasso, Federico

    2014-06-16

    We demonstrate active hyperspectral imaging using a quantum-cascade laser (QCL) array as the illumination source and a digital-pixel focal-plane-array (DFPA) camera as the receiver. The multi-wavelength QCL array used in this work comprises 15 individually addressable QCLs in which the beams from all lasers are spatially overlapped using wavelength beam combining (WBC). The DFPA camera was configured to integrate the laser light reflected from the sample and to perform on-chip subtraction of the passive thermal background. A 27-frame hyperspectral image was acquired of a liquid contaminant on a diffuse gold surface at a range of 5 meters. The measured spectral reflectance closely matches the calculated reflectance. Furthermore, the high-speed capabilities of the system were demonstrated by capturing differential reflectance images of sand and KClO3 particles that were moving at speeds of up to 10 m/s.

  15. Evaluation of a novel laparoscopic camera for characterization of renal ischemia in a porcine model using digital light processing (DLP) hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Olweny, Ephrem O.; Tan, Yung K.; Faddegon, Stephen; Jackson, Neil; Wehner, Eleanor F.; Best, Sara L.; Park, Samuel K.; Thapa, Abhas; Cadeddu, Jeffrey A.; Zuzak, Karel J.

    2012-03-01

    Digital light processing hyperspectral imaging (DLP® HSI) was adapted for use during laparoscopic surgery by coupling a conventional laparoscopic light guide with a DLP-based Agile Light source (OL 490, Optronic Laboratories, Orlando, FL), incorporating a 0° laparoscope, and a customized digital CCD camera (DVC, Austin, TX). The system was used to characterize renal ischemia in a porcine model.

  16. Increasing signal-to-noise ratio of reconstructed digital holograms by using light spatial noise portrait of camera's photosensor

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Sergey N.

    2015-01-01

    Digital holography is a technique that includes recording of an interference pattern with a digital photosensor, processing of the obtained holographic data, and reconstruction of the object wavefront. Increasing the signal-to-noise ratio (SNR) of reconstructed digital holograms is especially important in such fields as image encryption, pattern recognition, and static and dynamic display of 3D scenes. In this paper, compensation of the photosensor light spatial noise portrait (LSNP) to increase the SNR of reconstructed digital holograms is proposed. To verify the proposed method, numerical experiments with computer generated Fresnel holograms with resolution equal to 512×512 elements were performed. Registration of shots with the digital camera Canon EOS 400D was simulated. It is shown that use of the averaging over frames method alone increases SNR only up to 4 times, and further increase of SNR is limited by spatial noise. Application of the LSNP compensation method in conjunction with the averaging over frames method allows a 10 times SNR increase. This value was obtained for an LSNP measured with 20 % error. With a more accurately measured LSNP, the SNR can be increased up to 20 times.
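
The underlying idea, that frame averaging suppresses only temporal noise while subtracting the measured spatial-noise portrait removes the remaining fixed-pattern floor, can be sketched with synthetic data (the noise levels and array sizes are illustrative, not the paper's simulation parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, N = 64, 64, 32
signal = rng.random((H, W))              # "true" recorded intensity
lsnp = 0.2 * rng.random((H, W))          # fixed spatial noise portrait
# N shots: signal + fixed spatial noise + independent temporal noise per frame
shots = signal + lsnp + 0.1 * rng.standard_normal((N, H, W))

avg = shots.mean(axis=0)        # averaging suppresses temporal noise only
corrected = avg - lsnp          # LSNP compensation removes the spatial floor

def snr(est):
    """Ratio of signal variation to residual error variation."""
    return signal.std() / (est - signal).std()
```

With these numbers, averaging alone plateaus at the spatial-noise floor, while the LSNP-compensated estimate improves several times further, mirroring the qualitative behavior reported in the abstract.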

  17. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and ImageJ software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in the gradient of 0 N to 500 N, which simulated the double-feet standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. Digital 8-bit images were processed with ImageJ, a digital image processing program freely available from the National Institutes of Health. The procedure includes the recognition of digital markers, image inversion, sub-pixel reconstruction, image segmentation, and a center of mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and the micro-angular rotation of the sacroiliac joint in the lateral view were calculated according to the marker movement. The results of the digital image measurement were as follows: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels and the relative error was about 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 Newtons and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system
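
The center-of-mass step, a weighted average of pixel gray values, is what yields sub-pixel marker positions. A minimal sketch using a synthetic Gaussian marker (the marker model and grid size are assumptions for illustration):

```python
import numpy as np

def weighted_centroid(patch):
    """Sub-pixel marker position as the intensity-weighted center of mass."""
    patch = np.asarray(patch, dtype=float)
    total = patch.sum()
    ys, xs = np.indices(patch.shape)
    return (ys * patch).sum() / total, (xs * patch).sum() / total

# Synthetic marker: a 2-D Gaussian blob centered at (10.3, 12.7) on a 21x25 grid.
ys, xs = np.indices((21, 25))
marker = np.exp(-((ys - 10.3) ** 2 + (xs - 12.7) ** 2) / 4.0)
cy, cx = weighted_centroid(marker)    # recovers the center to well under a pixel
```

Tracking such centroids frame by frame is what allows displacement precision of a small fraction of a pixel, as reported above.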

  18. First Results from an Airborne Ka-Band SAR Using SweepSAR and Digital Beamforming

    NASA Technical Reports Server (NTRS)

    Sadowy, Gregory A.; Ghaemi, Hirad; Hensley, Scott C.

    2012-01-01

    SweepSAR is a wide-swath synthetic aperture radar technique that is being studied for application on future Earth science radar missions. This paper describes the design of an airborne radar demonstration that simulates an 11-m L-band (1.2-1.3 GHz) reflector geometry at Ka-band (35.6 GHz) using a 40-cm reflector. The Ka-band SweepSAR demonstration system was flown on the NASA DC-8 airborne laboratory and used to study engineering performance trades and array calibration for SweepSAR configurations. We present an instrument and experiment overview, instrument calibration, and first results.

  19. Reading Out Single-Molecule Digital RNA and DNA Isothermal Amplification in Nanoliter Volumes with Unmodified Camera Phones.

    PubMed

    Rodriguez-Manzano, Jesus; Karymov, Mikhail A; Begolo, Stefano; Selck, David A; Zhukov, Dmitriy V; Jue, Erik; Ismagilov, Rustem F

    2016-03-22

    Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing approach for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests.
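
The ratiometric idea, comparing color channels against each other rather than against absolute intensity so that a call survives changes in lighting and camera hardware, can be sketched as follows (the dye response, channel choice, and 1.2 threshold are hypothetical, not the paper's calibrated values):

```python
import numpy as np

def ratiometric_call(rgb_well, threshold=1.2):
    """Classify a reaction well from a phone photo as positive/negative
    using a green/red channel ratio, which cancels overall brightness."""
    r = rgb_well[..., 0].astype(float).mean()
    g = rgb_well[..., 1].astype(float).mean()
    ratio = g / r          # assumed dye: green rises on amplification
    return ratio > threshold, ratio

# Two simulated 5x5-pixel wells; the negative one is photographed under a
# 0.7x dimmer light, yet its channel ratio is essentially unchanged.
positive = np.tile(np.array([[[80, 160, 60]]], dtype=np.uint8), (5, 5, 1))
negative = (0.7 * np.tile(np.array([[[120, 110, 60]]]), (5, 5, 1))).round().astype(np.uint8)
pos_call, _ = ratiometric_call(positive)
neg_call, _ = ratiometric_call(negative)
```

Because only the ratio is thresholded, the same decision rule can be reused across phone models with different exposure behavior, which is the hardware-agnostic property the abstract emphasizes.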

  20. Reading Out Single-Molecule Digital RNA and DNA Isothermal Amplification in Nanoliter Volumes with Unmodified Camera Phones.

    PubMed

    Rodriguez-Manzano, Jesus; Karymov, Mikhail A; Begolo, Stefano; Selck, David A; Zhukov, Dmitriy V; Jue, Erik; Ismagilov, Rustem F

    2016-03-22

    Digital single-molecule technologies are expanding diagnostic capabilities, enabling the ultrasensitive quantification of targets, such as viral load in HIV and hepatitis C infections, by directly counting single molecules. Replacing fluorescent readout with a robust visual readout that can be captured by any unmodified cell phone camera will facilitate the global distribution of diagnostic tests, including in limited-resource settings where the need is greatest. This paper describes a methodology for developing a visual readout system for digital single-molecule amplification of RNA and DNA by (i) selecting colorimetric amplification-indicator dyes that are compatible with the spectral sensitivity of standard mobile phones, and (ii) identifying an optimal ratiometric image-processing approach for a selected dye to achieve a readout that is robust to lighting conditions and camera hardware and provides unambiguous quantitative results, even for colorblind users. We also include an analysis of the limitations of this methodology, and provide a microfluidic approach that can be applied to expand dynamic range and improve reaction performance, allowing ultrasensitive, quantitative measurements at volumes as low as 5 nL. We validate this methodology using SlipChip-based digital single-molecule isothermal amplification with λDNA as a model and hepatitis C viral RNA as a clinically relevant target. The innovative combination of isothermal amplification chemistry in the presence of a judiciously chosen indicator dye and ratiometric image processing with SlipChip technology allowed the sequence-specific visual readout of single nucleic acid molecules in nanoliter volumes with an unmodified cell phone camera. When paired with devices that integrate sample preparation and nucleic acid amplification, this hardware-agnostic approach will increase the affordability and the distribution of quantitative diagnostic and environmental tests. PMID:26900709

  1. New measuring concepts using integrated online analysis of color and monochrome digital high-speed camera sequences

    NASA Astrophysics Data System (ADS)

    Renz, Harald

    1997-05-01

High-speed sequences allow a subjective assessment of very fast processes and serve as an important basis for the quantitative analysis of movements. Computer systems help to acquire, handle, display and store digital image sequences as well as to perform measurement tasks automatically. High-speed cameras have been used for several years for safety tests, material testing and production optimization. To achieve frame rates of 1000 or more images per second, 16 mm film cameras have mainly been used, as they could provide excellent image resolution and the required time resolution. Up to now, however, most results have only been judged by viewing. For some special applications, such as safety tests using crash or high-g sled tests in the automobile industry, image-analysis techniques have been used to also measure the motion characteristics of objects inside the images. High-speed films, shot during the short impact, allow judgement of the dynamic scene. Additionally, they serve as an important basis for the quantitative analysis of the very fast movements; thus exact values of the velocity and acceleration that the dummies or vehicles are exposed to can be derived. For analysis of the sequences, the positions of signalized points--mostly markers, which are fixed by the test engineers before a test--have to be measured frame by frame. The trajectories show the temporal sequence of the test objects and are the basis for calibrated diagrams of distance, velocity and acceleration. Today, 16 mm film cameras are increasingly being replaced by electronic high-speed cameras. The development of high-speed recording systems is very far advanced, and the prices of these systems are becoming comparable to those of traditional film cameras. The resolution has also increased greatly. The new cameras are `crashproof' and can be used for similar tasks as the 16 mm film cameras at similar sizes. High-speed video cameras now offer an easy setup and direct access to
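The trajectory analysis described above, measuring marker positions frame by frame and deriving velocity and acceleration diagrams, amounts to numerical differentiation of the position traces. A minimal sketch (function name and the synthetic constant-acceleration example are illustrative, not from the paper):

```python
import numpy as np

def kinematics(positions, fps=1000.0):
    """Central-difference velocity and acceleration from per-frame
    2-D marker positions (N x 2 array), e.g. from a 1000 fps sequence."""
    dt = 1.0 / fps
    v = np.gradient(positions, dt, axis=0)  # velocity, m/s
    a = np.gradient(v, dt, axis=0)          # acceleration, m/s^2
    return v, a

# Marker under constant acceleration along x: x(t) = 0.5 * 2.0 * t^2
t = np.arange(0, 0.1, 1e-3)                 # 100 frames at 1000 fps
pos = np.stack([0.5 * 2.0 * t**2, np.zeros_like(t)], axis=1)
vel, acc = kinematics(pos)
```

In the interior of the sequence the central differences recover the true velocity 2.0·t and the constant acceleration of 2.0 m/s² exactly; in practice marker positions are noisy, so some smoothing would precede the differentiation.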

  2. An image compression algorithm for a high-resolution digital still camera

    NASA Technical Reports Server (NTRS)

    Nerheim, Rosalee

    1989-01-01

The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to Earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.

  3. Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E.M.; Gaddis, L.R.; Johnson, J. R.; Soderblom, L.A.; Ward, A.W.; Smith, P.H.; Britt, D.T.

    1999-01-01

This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ~10^3 features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ~3 × 10^5 closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used. Copyright 1999 by the American Geophysical Union.

  4. Low-cost camera modifications and methodologies for very-high-resolution digital images

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Aerial color and color-infrared photography are usually acquired at high altitude so the ground resolution of the photographs is < 1 m. Moreover, current color-infrared cameras and manned aircraft flight time are expensive, so the objective is the development of alternative methods for obtaining ve...

  5. Greenness indices from digital cameras predict the timing and seasonal dynamics of canopy-scale photosynthesis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The proliferation of tower-mounted cameras co-located with eddy covariance instrumentation provides a novel opportunity to better understand the relationship between canopy phenology and the seasonality of canopy photosynthesis. In this paper, we describe the abilities and limitations of webcams to ...

  6. Levee crest elevation profiles derived from airborne lidar-based high resolution digital elevation models in south Louisiana

    USGS Publications Warehouse

    Palaseanu-Lovejoy, Monica; Thatcher, Cindy A.; Barras, John A.

    2014-01-01

    This study explores the feasibility of using airborne lidar surveys to derive high-resolution digital elevation models (DEMs) and develop an automated procedure to extract levee longitudinal elevation profiles for both federal levees in Atchafalaya Basin and local levees in Lafourche Parish. Generally, the use of traditional manual surveying methods to map levees is a costly and time consuming process that typically produces cross-levee profiles every few hundred meters, at best. The purpose of our paper is to describe and test methods for extracting levee crest elevations in an efficient, comprehensive manner using high resolution lidar generated DEMs. In addition, the vertical uncertainty in the elevation data and its effect on the resultant estimate of levee crest heights is addressed in an assessment of whether the federal levees in our study meet the USACE minimum height design criteria.

  7. Applying emerging digital video interface standards to airborne avionics sensor and digital map integrations: benefits outweigh the initial costs

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    1996-06-01

    Video signal system performance can be compromised in a military aircraft cockpit management system (CMS) with the tailoring of vintage Electronics Industries Association (EIA) RS170 and RS343A video interface standards. Video analog interfaces degrade when induced system noise is present. Further signal degradation has been traditionally associated with signal data conversions between avionics sensor outputs and the cockpit display system. If the CMS engineering process is not carefully applied during the avionics video and computing architecture development, extensive and costly redesign will occur when visual sensor technology upgrades are incorporated. Close monitoring and technical involvement in video standards groups provides the knowledge-base necessary for avionic systems engineering organizations to architect adaptable and extendible cockpit management systems. With the Federal Communications Commission (FCC) in the process of adopting the Digital HDTV Grand Alliance System standard proposed by the Advanced Television Systems Committee (ATSC), the entertainment and telecommunications industries are adopting and supporting the emergence of new serial/parallel digital video interfaces and data compression standards that will drastically alter present NTSC-M video processing architectures. The re-engineering of the U.S. Broadcasting system must initially preserve the electronic equipment wiring networks within broadcast facilities to make the transition to HDTV affordable. International committee activities in technical forums like ITU-R (former CCIR), ANSI/SMPTE, IEEE, and ISO/IEC are establishing global consensus on video signal parameterizations that support a smooth transition from existing analog based broadcasting facilities to fully digital computerized systems. An opportunity exists for implementing these new video interface standards over existing video coax/triax cabling in military aircraft cockpit management systems. Reductions in signal

  8. A Digital Readout System For The CSO Microwave Kinetic Inductance Camera

    NASA Astrophysics Data System (ADS)

    Max-Moerbeck, Walter; Mazin, B. A.; Zmuidzinas, J.

    2007-12-01

Submillimeter galaxies are important to the understanding of galaxy formation and evolution. Determination of the spectral energy distribution in the millimeter and submillimeter regimes allows important and powerful diagnostics. Our group is developing a camera for the Caltech Submillimeter Observatory (CSO) using Microwave Kinetic Inductance Detectors (MKIDs). MKIDs are superconducting devices whose impedance changes with the absorption of photons. The camera will have 600 spatial pixels and 4 bands at 750 μm, 850 μm, 1.1 mm and 1.3 mm. For each spatial pixel of the camera the radiation is coupled to the MKIDs using phased-array antennas. This signal is split into 4 different bands using filters and detected using the superconductor as part of a MKID's resonant circuit. The detection process consists of measurement of the changes in the transmission through the resonator when it is illuminated. By designing resonant circuits to have different resonant frequencies and high transmission out of resonance, MKIDs can be frequency-domain multiplexed. This allows the simultaneous readout of many detectors through a single coaxial cable. The readout system makes use of microwave IQ modulation and is based on commercial electronics components operating at room temperature. The basic readout has been demonstrated on the CSO. We are working on the implementation of an improved design to be tested on a prototype system with 6x6 pixels and 4 colors next April on the CSO.

  9. Single-camera microscopic stereo digital image correlation using a diffraction grating.

    PubMed

    Pan, Bing; Wang, Qiong

    2013-10-21

A simple, cost-effective but practical microscopic 3D-DIC method using a single camera and a transmission diffraction grating is proposed for surface profile and deformation measurement of small-scale objects. By illuminating a test sample with a quasi-monochromatic source, the transmission diffraction grating placed in front of the camera produces two laterally spaced first-order diffraction views of the sample surface onto the two halves of the camera target. The single image comprising the negative and positive first-order diffraction views can be used to reconstruct the profile of the test sample, while two single images acquired before and after deformation can be employed to determine the 3D displacements and strains of the sample surface. The basic principles and implementation procedures of the proposed technique for microscopic 3D profile and deformation measurement are described in detail. The effectiveness and accuracy of the presented microscopic 3D-DIC method are verified by measuring the profile and 3D displacements of a regular cylinder surface.

  10. Digital multi-focusing from a single photograph taken with an uncalibrated conventional camera.

    PubMed

    Cao, Yang; Fang, Shuai; Wang, Zengfu

    2013-09-01

The demand to restore all-in-focus images from defocused images and produce photographs focused at different depths is emerging in more and more cases, such as low-end hand-held cameras and surveillance cameras. In this paper, we solve this challenging multi-focusing problem with a single image taken with an uncalibrated conventional camera. Different from all existing multi-focusing approaches, our method does not need to include a deconvolution process, which is quite time-consuming and causes ringing artifacts in the focused region and a low depth of field. This paper proposes a novel systematic approach to realize multi-focusing from a single photograph. First of all, with the optical explanation for the local smooth assumption, we present a new point-to-point defocus model. Next, the blur map of the input image, which reflects the amount of defocus blur at each pixel in the image, is estimated in two steps: 1) with the sharp edge prior, a rough blur map is obtained by estimating the blur amount at the edge regions; 2) the guided image filter is applied to propagate the blur value from the edge regions to the whole image, by which a refined blur map is obtained. Thus far, we can restore the all-in-focus photograph from a defocused input. To further produce photographs focused at different depths, the depth map must be derived from the blur map. To eliminate the ambiguity over the focal plane, user interaction is introduced and a binary graph cut algorithm is used. Coupled with the camera parameters, this approach produces images focused at different depths. The performance of this new multi-focusing algorithm is evaluated both objectively and subjectively on various test images. Both results demonstrate that this algorithm produces high quality depth maps and multi-focusing results, outperforming the previous approaches. PMID

  11. Homogeneous nucleation rates from the piston-expansion tube using a digital camera

    NASA Astrophysics Data System (ADS)

    Peters, Franz; Graßmann, Arne

    2000-08-01

Homogeneous nucleation rates of n-pentanol in nitrogen are obtained from a piston-expansion tube (pex-tube) involving the nucleation pulse method which generates a limited number of nuclei that grow into droplets. The detection of the droplets is achieved by a new counting method developed on the basis of a CCD camera in combination with a laser light sheet. Nucleation rates between 10^4 and 10^9 cm^-3 s^-1 are covered for the nucleation temperature 260 K. The rates are plotted as isotherms versus supersaturation. An influence of the initial expansion temperature on the nucleation rate is identified. Literature data from other expansion experiments agree with our finding.

  12. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r^2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r^2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r^2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than
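The intuition behind the autocorrelation technique can be sketched as follows (a simplified stand-in for Rubin's 2004 algorithm, not its exact formulation; the synthetic textures are hypothetical): images of coarse sediment remain correlated over larger pixel shifts than images of fine sediment, so the decay of the correlation curve encodes grain size.

```python
import numpy as np

def autocorrelation_curve(img, max_offset=10):
    """Correlation coefficient between an image and itself shifted
    horizontally by n pixels, for n = 1..max_offset."""
    img = img.astype(float)
    curve = []
    for n in range(1, max_offset + 1):
        a, b = img[:, :-n].ravel(), img[:, n:].ravel()
        curve.append(np.corrcoef(a, b)[0, 1])
    return np.array(curve)

# Fine texture: pixel-scale noise. Coarse texture: 4x4-pixel "grains".
rng = np.random.default_rng(0)
fine = rng.random((64, 64))
coarse = np.repeat(np.repeat(rng.random((16, 16)), 4, axis=0), 4, axis=1)
fine_curve = autocorrelation_curve(fine)
coarse_curve = autocorrelation_curve(coarse)
```

At a one-pixel shift the coarse texture stays strongly correlated while the fine texture decorrelates almost completely; calibrating this decay against sieved samples is what turns the curve into a grain-size estimate.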

  13. Online rate control in digital cameras for near-constant distortion based on minimum/maximum criterion

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Yong; Ortega, Antonio

    2000-04-01

We address the problem of online rate control in digital cameras, where the goal is to achieve near-constant distortion for each image. Digital cameras usually have a pre-determined number of images that can be stored for the given memory size and require limited time delay and constant quality for each image. Due to time delay restrictions, each image should be stored before the next image is received. Therefore, we need to define an online rate control that is based on the amount of memory used by previously stored images, the current image, and the estimated rate of future images. In this paper, we propose an algorithm for online rate control, in which an adaptive reference, a 'buffer-like' constraint, and a minimax criterion (as a distortion metric to achieve near-constant quality) are used. The adaptive reference is used to estimate future images and the 'buffer-like' constraint is required to keep enough memory for future images. We show that using our algorithm to select online bit allocation for each image in a randomly given set of images provides near-constant quality. Also, we show that our result is near optimal when a minimax criterion is used, i.e., it achieves a performance close to that obtained by applying an off-line rate control that assumes exact knowledge of the images. Suboptimal behavior is only observed in situations where the distribution of images is not truly random (e.g., if most of the 'complex' images are captured at the end of the sequence). Finally, we propose a T-step delay rate control algorithm and, using the result of the 1-step delay rate control algorithm, show that it removes the suboptimal behavior.
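The flavor of online bit allocation under a buffer-like constraint can be illustrated with a minimal sketch (this is an illustrative simplification, not the paper's minimax algorithm; the function name, arguments, and the 0.5 reserve factor are all hypothetical):

```python
def online_bit_budget(remaining_bits, images_left, complexity, avg_complexity):
    """Give the current image its fair share of remaining memory, scaled
    by its complexity relative to the running average (the adaptive
    reference), while reserving a minimum share for future images."""
    fair_share = remaining_bits / images_left
    budget = fair_share * (complexity / avg_complexity)
    # buffer-like constraint: never starve the images still to come
    reserve = 0.5 * fair_share * (images_left - 1)
    return min(budget, remaining_bits - reserve)
```

With 1,000,000 bits left for 10 images, an image 1.5x as complex as average gets 150,000 bits; but a very complex image near the end of the sequence is capped so later images still fit, which is exactly the situation where one-step-delay schemes become suboptimal.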

  14. A digital architecture for striping noise compensation in push-broom hyperspectral cameras

    NASA Astrophysics Data System (ADS)

    Valenzuela, Wladimir E.; Figueroa, Miguel; Pezoa, Jorge E.; Meza, Pablo

    2015-09-01

We present a striping noise compensation architecture for hyperspectral push-broom cameras, implemented on a Field-Programmable Gate Array (FPGA). The circuit is fast, compact, and low power, and is capable of eliminating the striping noise in-line during the image acquisition process. The architecture implements a multidimensional neural network (MDNN) algorithm for striping noise compensation previously reported by our group. The algorithm relies on the assumption that the amount of light impinging on neighboring photo-detectors is approximately the same in the spatial and spectral dimensions. Under this assumption, two striping noise parameters are estimated using spatial and spectral information from the raw data. We implemented the circuit on a Xilinx ZYNQ XC7Z2010 FPGA and tested it with images obtained from a NIR N17E push-broom camera, with a frame rate of 25 fps and a band-pixel rate of 1.888 MHz. The setup consists of a loop of 320 samples of 320 spatial lines and 236 spectral bands between 900 and 1700 nanometers, under laboratory conditions, captured with a rigid push-broom controller. The noise compensation core can run at more than 100 MHz and consumes less than 30 mW of dynamic power, using less than 10% of the logic resources available on the chip. It also uses one of the two ARM processors available on the FPGA for data acquisition and communication purposes.
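What striping-noise compensation accomplishes can be illustrated with a simple moment-matching sketch (this is not the paper's MDNN algorithm, only a common baseline: each cross-track detector's gain and offset are normalized to its band's scene statistics):

```python
import numpy as np

def destripe(cube):
    """Simplified destriping for a push-broom cube of shape
    (lines, pixels, bands): match each detector column's mean and
    standard deviation to the band-wide statistics."""
    cube = cube.astype(float)
    col_mu = cube.mean(axis=0, keepdims=True)        # per-detector mean
    col_sd = cube.std(axis=0, keepdims=True) + 1e-9  # per-detector spread
    band_mu = cube.mean(axis=(0, 1), keepdims=True)  # per-band mean
    band_sd = cube.std(axis=(0, 1), keepdims=True)   # per-band spread
    return (cube - col_mu) / col_sd * band_sd + band_mu

# Synthetic cube with per-detector gain/offset stripes:
rng = np.random.default_rng(1)
clean = rng.normal(100, 10, (200, 32, 4))
gain = rng.uniform(0.8, 1.2, (1, 32, 4))
offset = rng.normal(0, 5, (1, 32, 4))
striped = clean * gain + offset
out = destripe(striped)
```

After compensation the column means within each band are equalized, removing the vertical stripes; the MDNN approach of the paper refines this by exploiting both spatial and spectral neighbor information.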

  15. Definition and trade-off study of reconfigurable airborne digital computer system organizations

    NASA Technical Reports Server (NTRS)

    Conn, R. B.

    1974-01-01

A highly reliable, fault-tolerant, reconfigurable computer system for aircraft applications was developed. The development and application of reliability and fault-tolerance assessment techniques are described. Particular emphasis is placed on the needs of an all-digital, fly-by-wire control system appropriate for a passenger-carrying airplane.

  16. Study on key techniques for camera-based hydrological record image digitization

    NASA Astrophysics Data System (ADS)

    Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping

    2015-10-01

With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record the observations from sensors. However, these paper drawings may be destroyed or contaminated by improper preservation or overuse. Further, manually transcribing these data into the computer is a heavy workload and prone to error. Hence, digitizing these drawings and establishing a corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques, i.e., image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water level sheet image is proposed, based on the adaptive fusion of gradient and color information. Second, a fast search strategy for cross point location is devised, so that point-by-point processing is avoided, with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes the domain knowledge of hydrology. The processing speed is accelerated while the accuracy remains satisfactory. Experiments on several real water level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.
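The first step, binarizing the recorded curve by fusing gradient and color cues, can be illustrated with a simplified sketch (the fixed fusion weight, scoring, and threshold here are hypothetical, not the paper's adaptive method): pixels score high when they are close to the curve's ink color and lie on a strong edge.

```python
import numpy as np

def extract_curve_mask(img, curve_rgb, color_w=0.7):
    """Binarize a curve on a grid sheet by combining a color-similarity
    score with an edge-strength score, then thresholding."""
    img = img.astype(float)
    color_dist = np.linalg.norm(img - np.array(curve_rgb), axis=-1)
    color_score = 1.0 - color_dist / color_dist.max()
    gray = img.mean(axis=-1)
    gy, gx = np.gradient(gray)
    grad_score = np.hypot(gx, gy) / (np.hypot(gx, gy).max() + 1e-9)
    score = color_w * color_score + (1 - color_w) * grad_score
    return score > score.mean() + score.std()

# White sheet with one horizontal blue ink line across row 16:
img = np.full((32, 32, 3), 255.0)
img[16, :, :] = [0.0, 0.0, 200.0]
mask = extract_curve_mask(img, (0, 0, 200))
```

The fusion matters because color alone fails where the ink has faded and gradients alone also respond to the printed grid lines; the paper's contribution is making this fusion adaptive.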

  17. High precision digital control LED spot light source used to calibrate camera

    NASA Astrophysics Data System (ADS)

    Du, Boyu; Xu, Xiping; Liu, Yang

    2015-04-01

This paper introduces a method of using an LED point light source as the camera calibration light. According to the characteristics of the LED point light source, a constant current source is used to provide the necessary current and an illuminometer is used to measure the luminance of the LED point light source. The constant current source is controlled by an ARM MCU and exchanges data with the host computer via serial communication. A PC is used as the host computer; it adjusts the current according to the luminance of the LED point light source until the luminance reaches the anticipated value. Experimental analysis showed that the LED point light source can meet the requirements of a calibration light source, that its accuracy is sufficient to achieve the desired effect, and that it can adaptively control the LED luminance well. The system is convenient and flexible, and its performance is stable and reliable.

  18. Noncontact imaging of plethysmographic pulsation and spontaneous low-frequency oscillation in skin perfusion with a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Hoshi, Akira; Aoki, Yuta; Nakano, Kazuya; Niizeki, Kyuichi; Aizu, Yoshihisa

    2016-03-01

A non-contact imaging method with a digital RGB camera is proposed to evaluate the plethysmogram and spontaneous low-frequency oscillations. In vivo experiments with human skin during mental stress induced by the Stroop color-word test demonstrated the feasibility of the method to evaluate the activities of the autonomic nervous system.
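The core of such a camera-based plethysmogram can be sketched in a few lines (an illustrative reduction, assuming the green channel carries the strongest pulsatile component; this is not the authors' full method, and the synthetic 60 bpm clip is hypothetical):

```python
import numpy as np

def plethysmogram(frames):
    """Non-contact pulse trace from an RGB video of skin: spatial mean
    of the green channel per frame, with the baseline removed."""
    g = frames[..., 1].mean(axis=(1, 2))   # one green value per frame
    return g - g.mean()

# Synthetic 30 fps clip: a 1.0 Hz (60 bpm) pulse on a constant skin tone
fps = 30.0
t = np.arange(90) / fps
frames = np.ones((90, 8, 8, 3)) * 120.0
frames[..., 1] += 2.0 * np.sin(2 * np.pi * 1.0 * t)[:, None, None]
sig = plethysmogram(frames)
freqs = np.fft.rfftfreq(sig.size, 1 / fps)
peak_hz = freqs[np.abs(np.fft.rfft(sig)).argmax()]
```

The dominant spectral peak of the trace recovers the pulse rate; in real footage, motion compensation and the slow (low-frequency) baseline components the paper studies would also have to be handled.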

  19. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  20. Digital Intermediate Frequency Receiver Module For Use In Airborne Sar Applications

    DOEpatents

    Tise, Bertice L.; Dubbert, Dale F.

    2005-03-08

A digital IF receiver (DRX) module directly compatible with advanced radar systems such as synthetic aperture radar (SAR) systems. The DRX can combine a 1 G-Sample/sec 8-bit ADC with high-speed digital signal processing, such as high gate-count FPGA technology or ASICs, to realize a wideband IF receiver. DSP operations implemented in the DRX can include quadrature demodulation and multi-rate, variable-bandwidth IF filtering. Pulse-to-pulse (Doppler domain) filtering can also be implemented in the form of a presummer (accumulator) and an azimuth prefilter. An out-of-band noise source can be employed to provide a dither signal to the ADC, which is later removed by digital signal processing. Both the range and Doppler domain filtering operations can be implemented using a unique pane architecture which allows on-the-fly selection of the filter decimation factor and, hence, the filter bandwidth. The DRX module can include a standard VME-64 interface for control, status, and programming. An interface can provide phase history data to the real-time image formation processors. A third front-panel data port (FPDP) interface can send wide-bandwidth, raw phase histories to a real-time phase history recorder for ground processing.
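The quadrature demodulation and multi-rate filtering named above can be sketched in software (an illustration of the DSP operations, not the patented FPGA implementation; the boxcar average stands in for the variable-bandwidth filter chain, and the 250 MHz tone is a hypothetical test signal):

```python
import numpy as np

def iq_demodulate(x, f_if, fs, decim=16):
    """Digital quadrature demodulation: mix the sampled IF signal to
    baseband with a complex exponential, then low-pass filter and
    decimate by averaging non-overlapping blocks."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_if / fs * n)
    trimmed = baseband[: len(baseband) // decim * decim]
    return trimmed.reshape(-1, decim).mean(axis=1)

# A pure IF tone demodulates to a constant complex amplitude
fs, f_if = 1e9, 250e6                       # 1 GS/s ADC, 250 MHz IF
n = np.arange(4096)
tone = np.cos(2 * np.pi * f_if / fs * n)
iq = iq_demodulate(tone, f_if, fs)
```

Mixing splits the real tone into a DC term of amplitude 0.5 and an image at twice the IF; the averaging filter rejects the image, so the output is a constant 0.5 + 0j at 1/16 the input rate, which is the essence of a decimating IF receiver.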

  1. Validity and reliability of a dietary assessment method: the application of a digital camera with a mobile phone card attachment.

    PubMed

    Wang, Da-Hong; Kogashiwa, Michiko; Ohta, Sachiko; Kira, Shohei

    2002-12-01

This study aimed to evaluate the validity and reliability of an alternative dietary measurement method to assist epidemiologic studies. We validated a handheld personal digital assistant with a camera and mobile phone card, called Wellnavi, with a 1-d weighed diet record employed as the reference method. Twenty college students majoring in food and nutrition participated in this study. They were asked to keep a diet record, to take digital photos of all the recorded foods at the same time, and to send them to the dietitians by the mobile phone card. In the reliability study, another twenty students from the same college were asked to take digital photos of the same meals during a day with two identical instruments under the same circumstances and to send these photos to different dietitians electronically. With respect to validity, the median nutrient intakes estimated by the Wellnavi method and the diet record method were comparable. Correlation coefficients between the median nutrient intakes estimated from the two methods ranged from 0.46 for monounsaturated fatty acid to 0.93 for vitamin B12 and copper (median r = 0.77). With respect to reliability, our data show good agreement between the two Wellnavi instruments for most of the nutrients. Correlation coefficients between the nutrient intakes estimated from the two instruments ranged from 0.55 for vitamin B1 and water-insoluble dietary fiber to 0.92 for vitamin B12 (median r = 0.78). In conclusion, the results indicate that this dietary assessment instrument can usefully measure individual dietary intakes for a variety of nutrients in an epidemiologic study. PMID:12775117

  2. Measuring stress intensity factors with a camera: Integrated digital image correlation (I-DIC)

    NASA Astrophysics Data System (ADS)

    Hild, François; Roux, Stéphane

    2006-01-01

A novel 'integrated' approach coupling image correlation and elastic displacement field identification provides a powerful and accurate tool to evaluate mode I and II stress intensity factors. This technique is applied to silicon carbide subjected to a sandwiched three-point bend test, using digital pictures obtained in optical microscopy where the pixel physical scale is about 2 μm. A crack whose maximum opening is 500 nm can be detected and its geometry identified. The toughness is determined well within a 10% uncertainty. To cite this article: F. Hild, S. Roux, C. R. Mecanique 334 (2006).

  3. The Sensitivity of a Volcanic Flow Model to Digital Elevation Models From Diverse Sources: Digitized Map Contours and Airborne Interferometric Radar

    NASA Astrophysics Data System (ADS)

    Stevens, N. F.; Manville, V.; Heron, D. W.

    2001-12-01

A growing trend in the field of volcanic hazard assessment is the use of computer models of a variety of flows to predict potential areas of devastation. The accuracy of these computer models depends on two factors, the nature and veracity of the flow model itself, and the accuracy of the topographic data set over which it is run. All digital elevation models (DEMs) contain innate errors. The nature of these depends on the accuracy of the original measurements of the terrain, and on the method used to build the DEM. We investigate the effect that these errors have on the performance of a simple volcanic flow model designed to delineate areas at risk from lahar inundation. The volcanic flow model was run over two DEMs of southern Ruapehu volcano derived from (1) digitized 1:50,000 topographic maps, and (2) airborne C-band synthetic aperture radar interferometry obtained using the NASA AIRSAR system. On steep slopes (exceeding 4 degrees), drainage channels are more likely to be incised deeply, and flow paths predicted by the model are generally in agreement for both DEMs despite the differing nature of the source data. Over shallow slopes (approx. 4 degrees and less), where channels are less deep and are more likely to meander, problems were encountered with flow path prediction in both DEMs due to interpolation errors and forestry. The predicted lateral and longitudinal extent of deposit inundation was also sensitive to the type of DEM used, most likely in response to the differing degrees of surface texture preserved in the DEMs. A technique to refine contour-derived DEMs and reduce the error in predicted flow paths was tested to improve the reliability of the modeled flow path predictions. The suitability of forthcoming topographic measurements acquired by a single-pass space-borne instrument, the NASA Shuttle Radar Topography Mission (SRTM), is also tested.

  4. Coherent digital demodulation of single-camera N-projections for 3D-object shape measurement: co-phased profilometry.

    PubMed

    Servin, M; Garnica, G; Estrada, J C; Quiroga, A

    2013-10-21

    Fringe projection profilometry is a well-known technique to digitize 3-dimensional (3D) objects and it is widely used in robotic vision and industrial inspection. Probably the single most important problem in single-camera, single-projection profilometry is the shadows and specular reflections generated by the 3D object under analysis. Here a single camera along with N fringe projections is coherently demodulated in the digital domain in a single step, solving the shadows and specular reflections problem. Co-phased profilometry coherently phase-demodulates a whole set of N fringe-pattern perspectives in a single demodulation and unwrapping process. The mathematical theory behind digitally co-phasing N fringe patterns is similar to co-phasing a segmented N-mirror telescope.
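
    The demodulation of N phase-shifted fringe patterns can be sketched with the standard synchronous-detection estimator, phi = arg(sum_n I_n * exp(i*2*pi*n/N)). This is the textbook N-step formula, a minimal sketch rather than the paper's full co-phasing pipeline.

```python
import numpy as np

def demodulate_phase(frames):
    """Phase demodulation of N phase-shifted fringe patterns (N >= 3).

    frames[n] = a + b*cos(phi - 2*pi*n/N) per pixel.
    Returns the wrapped phase phi in (-pi, pi].
    """
    n = len(frames)
    deltas = 2 * np.pi * np.arange(n) / n
    # Complex synchronous detection: sum_n I_n * exp(i*delta_n) is
    # proportional to exp(i*phi); offset and modulation terms cancel.
    analytic = sum(f * np.exp(1j * d) for f, d in zip(frames, deltas))
    return np.angle(analytic)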

  5. Referenced dual pressure- and temperature-sensitive paint for digital color camera read out.

    PubMed

    Fischer, Lorenz H; Karakus, Cüneyt; Meier, Robert J; Risch, Nikolaus; Wolfbeis, Otto S; Holder, Elisabeth; Schäferling, Michael

    2012-12-01

    The first fluorescent material for the referenced simultaneous RGB (red green blue) imaging of barometric pressure (oxygen partial pressure) and temperature is presented. This sensitive coating consists of two platinum(II) complexes as indicators and a reference dye, each of which is incorporated in appropriate polymer nanoparticles. These particles are dispersed in a polyurethane hydrogel and spread onto a solid support. The emission of the (oxygen) pressure indicator, PtTFPP, matches the red channel of a RGB color camera, whilst the emission of the temperature indicator [Pt(II) (Br-thq)(acac)] matches the green channel. The reference dye, 9,10-diphenylanthracene, emits in the blue channel. In contrast to other dual-sensitive materials, this new coating allows for the simultaneous imaging of both indicator signals, as well as the reference signal, in one RGB color picture without having to separate the signals with additional optical filters. All of these dyes are excitable with a 405 nm light-emitting diode (LED). With this new composite material, barometric pressure can be determined with a resolution of 22 mbar; the temperature can be determined with a resolution of 4.3 °C.

  6. Single camera absolute motion based digital elevation mapping for a next generation planetary lander

    NASA Astrophysics Data System (ADS)

    Feetham, Luke M.; Aouf, Nabil; Bourdarias, Clement; Voirin, Thomas

    2014-05-01

    Robotic planetary surface exploration missions are becoming much more ambitious in their science goals as they attempt to answer the bigger questions relating to the possibility of life elsewhere in our solar system. Answering these questions will require scientifically rich landing sites. Such sites are unlikely to be located in relatively flat regions that are free from hazards, therefore there is a growing need for next generation entry descent and landing systems to possess highly sophisticated navigation capabilities coupled with active hazard avoidance that can enable a pin-point landing. As a first step towards achieving these goals, a multi-source, multi-rate data fusion algorithm is presented that combines single camera recursive feature-based structure from motion (SfM) estimates with measurements from an inertial measurement unit in order to overcome the scale ambiguity problem by directly estimating the unknown scale factor. This paper focuses on accurate estimation of absolute motion parameters, as well as the estimation of sparse landing site structure to provide a starting point for hazard detection. We assume no prior knowledge of the landing site terrain structure or of the landing craft motion in order to fully assess the capabilities of the proposed algorithm to allow a pin-point landing on distant solar system bodies where accurate knowledge of the desired landing site may be limited. We present results using representative synthetic images of deliberately challenging landing scenarios, which demonstrates that the proposed method has great potential.

  7. Design of a fault tolerant airborne digital computer. Volume 1: Architecture

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Levitt, K. N.; Green, M. W.; Goldberg, J.; Neumann, P. G.

    1973-01-01

    This volume is concerned with the architecture of a fault tolerant digital computer for an advanced commercial aircraft. All of the computations of the aircraft, including those presently carried out by analogue techniques, are to be carried out in this digital computer. Among the important qualities of the computer are the following: (1) The capacity is to be matched to the aircraft environment. (2) The reliability is to be selectively matched to the criticality and deadline requirements of each of the computations. (3) The system is to be readily expandable. contractible, and (4) The design is to appropriate to post 1975 technology. Three candidate architectures are discussed and assessed in terms of the above qualities. Of the three candidates, a newly conceived architecture, Software Implemented Fault Tolerance (SIFT), provides the best match to the above qualities. In addition SIFT is particularly simple and believable. The other candidates, Bus Checker System (BUCS), also newly conceived in this project, and the Hopkins multiprocessor are potentially more efficient than SIFT in the use of redundancy, but otherwise are not as attractive.

  8. Airborne digital-image data for monitoring the Colorado River corridor below Glen Canyon Dam, Arizona, 2009 - Image-mosaic production and comparison with 2002 and 2005 image mosaics

    USGS Publications Warehouse

    Davis, Philip A.

    2012-01-01

    Airborne digital-image data were collected for the Arizona part of the Colorado River ecosystem below Glen Canyon Dam in 2009. These four-band image data are similar in wavelength band (blue, green, red, and near infrared) and spatial resolution (20 centimeters) to image collections of the river corridor in 2002 and 2005. These periodic image collections are used by the Grand Canyon Monitoring and Research Center (GCMRC) of the U.S. Geological Survey to monitor the effects of Glen Canyon Dam operations on the downstream ecosystem. The 2009 collection used the latest model of the Leica ADS40 airborne digital sensor (the SH52), which uses a single optic for all four bands and collects and stores band radiance in 12-bits, unlike the image sensors that GCMRC used in 2002 and 2005. This study examined the performance of the SH52 sensor, on the basis of the collected image data, and determined that the SH52 sensor provided superior data relative to the previously employed sensors (that is, an early ADS40 model and Zeiss Imaging's Digital Mapping Camera) in terms of band-image registration, dynamic range, saturation, linearity to ground reflectance, and noise level. The 2009 image data were provided as orthorectified segments of each flightline to constrain the size of the image files; each river segment was covered by 5 to 6 overlapping, linear flightlines. Most flightline images for each river segment had some surface-smear defects and some river segments had cloud shadows, but these two conditions did not generally coincide in the majority of the overlapping flightlines for a particular river segment. Therefore, the final image mosaic for the 450-kilometer (km)-long river corridor required careful selection and editing of numerous flightline segments (a total of 513 segments, each 3.2 km long) to minimize surface defects and cloud shadows. The final image mosaic has a total of only 3 km of surface defects. 
The final image mosaic for the western end of the corridor has

  9. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computation task 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Also considered was module interconnection possibilities so as to minimize fault propagation.

  10. Noninvasive imaging of human skin hemodynamics using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Tanaka, Noriyuki; Kawase, Tatsuya; Maeda, Takaaki; Yuasa, Tomonori; Aizu, Yoshihisa; Yuasa, Tetsuya; Niizeki, Kyuichi

    2011-08-01

    In order to visualize human skin hemodynamics, we investigated a method that is specifically developed for the visualization of concentrations of oxygenated blood, deoxygenated blood, and melanin in skin tissue from digital RGB color images. Images of total blood concentration and oxygen saturation can also be reconstructed from the results of oxygenated and deoxygenated blood. Experiments using tissue-like agar gel phantoms demonstrated the ability of the developed method to quantitatively visualize the transition from an oxygenated blood to a deoxygenated blood in dermis. In vivo imaging of the chromophore concentrations and tissue oxygen saturation in the skin of the human hand are performed for 14 subjects during upper limb occlusion at 50 and 250 mm Hg. The response of the total blood concentration in the skin acquired by this method and forearm volume changes obtained from the conventional strain-gauge plethysmograph were comparable during the upper arm occlusion at pressures of both 50 and 250 mm Hg. The results presented in the present paper indicate the possibility of visualizing the hemodynamics of subsurface skin tissue.

  11. Beyond photography: Evaluation of the consumer digital camera to identify strabismus and anisometropia by analyzing the Bruckner's reflex

    PubMed Central

    Bani, Sadat A. O.; Amitava, Abadan K.; Sharma, Richa; Danish, Alam

    2013-01-01

    Amblyopia screening is often either costly or laborious. We evaluated the Canon Powershot TX1 (CPTX1) digital camera as an efficient screener for amblyogenic risk factors (ARF). We included 138 subjects: 84-amblyopes and 54-normal. With the red-eye-reduction feature off, we obtained Bruckner reflex photographs of different sized crescents which suggested anisometropia, while asymmetrical brightness indicated strabismus; symmetry implied normalcy. Eight sets of randomly arranged 138 photographs were made. After training, 8 personnel, marked each as normal or abnormal. Of the 84 amblyopes, 42 were strabismus alone (SA), 36 had anisometropia alone (AA) while six were mixed amblyopes (MA). Overall mean sensitivity for amblyopes was 0.86 (95% CI: 0.83-0.89) and specificity 0.85 (95% CI: 0.77-0.93). Sub-group analyses on SA, AA and MA returned sensitivities of 0.86, 0.89 and 0.69, while specificities were 0.85 for all three. Overall Cohen's Kappa was 0.66 (95% CI: 0.62-0.71). The CPTX1 appears to be a feasible option to screen for ARF, although results need to be validated on appropriate age groups. PMID:24212318

  12. Solar-Powered Airplane with Cameras and WLAN

    NASA Technical Reports Server (NTRS)

    Higgins, Robert G.; Dunagan, Steve E.; Sullivan, Don; Slye, Robert; Brass, James; Leung, Joe G.; Gallmeyer, Bruce; Aoyagi, Michio; Wei, Mei Y.; Herwitz, Stanley R.; Johnson, Lee; Arvesen, John C.

    2004-01-01

    An experimental airborne remote sensing system includes a remotely controlled, lightweight, solar-powered airplane (see figure) that carries two digital-output electronic cameras and communicates with a nearby ground control and monitoring station via a wireless local-area network (WLAN). The speed of the airplane -- typically <50 km/h -- is low enough to enable loitering over farm fields, disaster scenes, or other areas of interest to collect high-resolution digital imagery that could be delivered to end users (e.g., farm managers or disaster-relief coordinators) in nearly real time.

  13. Using digital time-lapse cameras to monitor species-specific understorey and overstorey phenology in support of wildlife habitat assessment.

    PubMed

    Bater, Christopher W; Coops, Nicholas C; Wulder, Michael A; Hilker, Thomas; Nielsen, Scott E; McDermid, Greg; Stenhouse, Gordon B

    2011-09-01

    Critical to habitat management is the understanding of not only the location of animal food resources, but also the timing of their availability. Grizzly bear (Ursus arctos) diets, for example, shift seasonally as different vegetation species enter key phenological phases. In this paper, we describe the use of a network of seven ground-based digital camera systems to monitor understorey and overstorey vegetation within species-specific regions of interest. Established across an elevation gradient in western Alberta, Canada, the cameras collected true-colour (RGB) images daily from 13 April 2009 to 27 October 2009. Fourth-order polynomials were fit to an RGB-derived index, which was then compared to field-based observations of phenological phases. Using linear regression to statistically relate the camera and field data, results indicated that 61% (r (2) = 0.61, df = 1, F = 14.3, p = 0.0043) of the variance observed in the field phenological phase data is captured by the cameras for the start of the growing season and 72% (r (2) = 0.72, df = 1, F = 23.09, p = 0.0009) of the variance in length of growing season. Based on the linear regression models, the mean absolute differences in residuals between predicted and observed start of growing season and length of growing season were 4 and 6 days, respectively. This work extends upon previous research by demonstrating that specific understorey and overstorey species can be targeted for phenological monitoring in a forested environment, using readily available digital camera technology and RGB-based vegetation indices. PMID:21082343

  14. How to optimize radiological images captured from digital cameras, using the Adobe Photoshop 6.0 program.

    PubMed

    Chalazonitis, A N; Koumarianos, D; Tzovara, J; Chronopoulos, P

    2003-06-01

    Over the past decade, the technology that permits images to be digitized and the reduction in the cost of digital equipment allows quick digital transfer of any conventional radiological film. Images then can be transferred to a personal computer, and several software programs are available that can manipulate their digital appearance. In this article, the fundamentals of digital imaging are discussed, as well as the wide variety of optional adjustments that the Adobe Photoshop 6.0 (Adobe Systems, San Jose, CA) program can offer to present radiological images with satisfactory digital imaging quality.

  15. A comparative study of reflectance spectral indices and digital camera imagery to quantify in-vivo foliar chlorophyll concentration in common New England forest species

    NASA Astrophysics Data System (ADS)

    Gagnon, Michael T.

    Quantifying foliar chlorophyll content is an important procedure in ecosystem studies. Established extraction techniques for quantifying chlorophyll concentration were compared to hyperspectral reflectance indices collected with a GER2600 spectroradiometer, and digital image indices derived from digital camera imagery. Findings show REIP:(R 2=0.52) is a strong indicator reflectance changes associated with plant stress, but RE3/RE2:(R2=0.72), FD715/FD705:(R2=0.77) and CRI red-edge(d) :(R2=0.73) predicted differences in chlorophyll concentration across a range of species more accurately. Many spectral indices predict chlorophyll concentrations more accurately than the REIP, but fail to document the blue-shift associated with foliar stress. Camera imagery results show gray card normalized percent red (R-GC)/( R+GC):(R2=0.63) and percent green (G-GC)/(G+GC):(R 2=0.68) to be strong predictors of chlorophyll concentration across multiple species. For individual species (%Red-%Blue)/(%Red+%Blue) or ( RvB:%R) is a reliable camera index that tracks phenological changes in chlorophyll accurately. Pearson's r across the 2008 growing season for black oak (N=40) was (R2=-0.95), and sugar maple (N=33) was (R2=-0.64).

  16. Product Accuracy Effect of Oblique and Vertical Non-Metric Digital Camera Utilization in Uav-Photogrammetry to Determine Fault Plane

    NASA Astrophysics Data System (ADS)

    Amrullah, C.; Suwardhi, D.; Meilano, I.

    2016-06-01

    This study aims to see the effect of non-metric oblique and vertical camera combination along with the configuration of the ground control points to improve the precision and accuracy in UAV-Photogrammetry project. The field observation method is used for data acquisition with aerial photographs and ground control points. All data are processed by digital photogrammetric process with some scenarios in camera combination and ground control point configuration. The model indicates that the value of precision and accuracy increases with the combination of oblique and vertical camera at all control point configuration. The best products of the UAV-Photogrammetry model are produced in the form of Digital Elevation Model (DEM) compared to the LiDAR DEM. Furthermore, DEM from UAV-Photogrammetry and LiDAR are used to define the fault plane by using cross-section on the model and interpretation to determine the point at the extreme height of terrain changes. The result of the defined fault planes indicate that two models do not show any significant difference.

  17. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) Large Format Camera color infrared photography and black and white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data was tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques is not the basis for this study. Instead, the discussion exposes resultant spectral findings from the image-to-image registrations. Problems associated with the AIS-1 TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, research, described herein, provides a formulation for future Earth Observation Station related multisensor studies.

  18. Ground-based detection of nighttime clouds above Manila Observatory (14.64°N, 121.07°E) using a digital camera.

    PubMed

    Gacal, Glenn Franco B; Antioquia, Carlo; Lagrosas, Nofel

    2016-08-01

    Ground-based cloud detection at nighttime is achieved by using cameras, lidars, and ceilometers. Despite these numerous instruments gathering cloud data, there is still an acknowledged scarcity of information on quantified local cloud cover, especially at nighttime. In this study, a digital camera is used to continuously collect images near the sky zenith at nighttime in an urban environment. An algorithm is developed to analyze pixel values of images of nighttime clouds. A minimum threshold pixel value of 17 is assigned to determine cloud occurrence. The algorithm uses temporal averaging to estimate the cloud fraction based on the results within the limited field of view. The analysis of the data from the months of January, February, and March 2015 shows that cloud occurrence is low during the months with relatively lower minimum temperature (January and February), while cloud occurrence during the warmer month (March) increases. PMID:27505386

  19. Ground-based detection of nighttime clouds above Manila Observatory (14.64°N, 121.07°E) using a digital camera.

    PubMed

    Gacal, Glenn Franco B; Antioquia, Carlo; Lagrosas, Nofel

    2016-08-01

    Ground-based cloud detection at nighttime is achieved by using cameras, lidars, and ceilometers. Despite these numerous instruments gathering cloud data, there is still an acknowledged scarcity of information on quantified local cloud cover, especially at nighttime. In this study, a digital camera is used to continuously collect images near the sky zenith at nighttime in an urban environment. An algorithm is developed to analyze pixel values of images of nighttime clouds. A minimum threshold pixel value of 17 is assigned to determine cloud occurrence. The algorithm uses temporal averaging to estimate the cloud fraction based on the results within the limited field of view. The analysis of the data from the months of January, February, and March 2015 shows that cloud occurrence is low during the months with relatively lower minimum temperature (January and February), while cloud occurrence during the warmer month (March) increases.

  20. Using a slit lamp-mounted digital high-speed camera for dynamic observation of phakic lenses during eye movements: a pilot study

    PubMed Central

    Leitritz, Martin Alexander; Ziemssen, Focke; Bartz-Schmidt, Karl Ulrich; Voykov, Bogomil

    2014-01-01

    Purpose To evaluate a digital high-speed camera combined with digital morphometry software for dynamic measurements of phakic intraocular lens movements to observe kinetic influences, particularly in fast direction changes and at lateral end points. Materials and methods A high-speed camera taking 300 frames per second observed movements of eight iris-claw intraocular lenses and two angle-supported intraocular lenses. Standardized saccades were performed by the patients to trigger mass inertia with lens position changes. Freeze images with maximum deviation were used for digital software-based morphometry analysis with ImageJ. Results Two eyes from each of five patients (median age 32 years, range 28–45 years) without findings other than refractive errors were included. The high-speed images showed sufficient usability for further morphometric processing. In the primary eye position, the median decentrations downward and in a lateral direction were −0.32 mm (range −0.69 to 0.024) and 0.175 mm (range −0.37 to 0.45), respectively. Despite the small sample size of asymptomatic patients, we found a considerable amount of lens dislocation. The median distance amplitude during eye movements was 0.158 mm (range 0.02–0.84). There was a slight positive correlation (r=0.39, P<0.001) between the grade of deviation in the primary position and the distance increase triggered by movements. Conclusion With the use of a slit lamp-mounted high-speed camera system and morphometry software, observation and objective measurements of iris-claw intraocular lenses and angle-supported intraocular lenses movements seem to be possible. Slight decentration in the primary position might be an indicator of increased lens mobility during kinetic stress during eye movements. Long-term assessment by high-speed analysis with higher case numbers has to clarify the relationship between progressing motility and endothelial cell damage. PMID:25071365

  1. A 3.3-to-25V all-digital charge pump based system with temperature and load compensation for avalanche photodiode cameras with fixed sensitivity

    NASA Astrophysics Data System (ADS)

    Mandai, S.; Charbon, E.

    2013-03-01

    This paper presents a digitally controlled charge pump (DCP) to supply high voltages, while ensuring temperature and load current independence of excess bias in cameras based on avalanche photodiodes. This is achieved through a single-photon avalanche diode (SPAD) based monitoring mechanism that continuously reconfigures the DCP using a feedback loop to compensate breakdown voltage variations by temperature and load current in real time. The sensitivity of the SPADs, or photon detection probability (PDP), is maintained to within 1.9% when the temperature shifts from 28°C to 65°C and the load current changes from 0 μA to 100 μA.

  2. Measurement of Young’s modulus and Poisson’s ratio of metals by means of ESPI using a digital camera

    NASA Astrophysics Data System (ADS)

    Francisco, J. B. Pascual; Michtchenko, A.; Barragán Pérez, O.; Susarrey Huerta, O.

    2016-09-01

    In this paper, mechanical experiments with a low-cost interferometry set-up are presented. The set-up is suitable for an undergraduate laboratory where optical equipment is absent. The arrangement consists of two planes of illumination, allowing the measurement of the two perpendicular in-plane displacement directions. An axial load was applied on three different metals, and the longitudinal and transversal displacements were measured sequentially. A digital camera was used to acquire the images of the different states of load of the illuminated area. A personal computer was used to perform the digital subtraction of the images to obtain the fringe correlations, which are needed to calculate the displacements. Finally, Young’s modulus and Poisson’s ratio of the metals were calculated using the displacement data.

  3. A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table

    NASA Technical Reports Server (NTRS)

    Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

    1989-01-01

    The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel and overall error in RMS was estimated at less than eight meters. Such approaches offer time and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.

  4. The Laser Vegetation Imaging Sensor (LVIS): A Medium-Altitude, Digitization-Only, Airborne Laser Altimeter for Mapping Vegetation and Topography

    NASA Technical Reports Server (NTRS)

    Blair, J. Bryan; Rabine, David L.; Hofton, Michelle A.

    1999-01-01

    The Laser Vegetation Imaging Sensor (LVIS) is an airborne, scanning laser altimeter designed and developed at NASA's Goddard Space Flight Center. LVIS operates at altitudes up to 10 km above ground, and is capable of producing a data swath up to 1000 m wide nominally with 25 m wide footprints. The entire time history of the outgoing and return pulses is digitized, allowing unambiguous determination of range and return pulse structure. Combined with aircraft position and attitude knowledge, this instrument produces topographic maps with decimeter accuracy and vertical height and structure measurements of vegetation. The laser transmitter is a diode-pumped Nd:YAG oscillator producing 1064 nm, 10 nsec, 5 mJ pulses at repetition rates up to 500 Hz. LVIS has recently demonstrated its ability to determine topography (including sub-canopy) and vegetation height and structure on flight missions to various forested regions in the U.S. and Central America. The LVIS system is the airborne simulator for the Vegetation Canopy Lidar (VCL) mission (a NASA Earth remote sensing satellite due for launch in 2000), providing simulated data sets and a platform for instrument proof-of-concept studies. The topography maps and return waveforms produced by LVIS provide Earth scientists with a unique data set allowing studies of topography, hydrology, and vegetation with unmatched accuracy and coverage.

  5. Getting the Picture: Using the Digital Camera as a Tool to Support Reflective Practice and Responsive Care

    ERIC Educational Resources Information Center

    Luckenbill, Julia

    2012-01-01

    Many early childhood educators use cameras to share the charming things that children do and the artwork they make. Programs often bind these photographs into portfolios and give them to children and their families as mementos at the end of the year. In the author's classrooms, they use photography on a daily basis to document children's…

  6. Anger Camera Firmware

    2010-11-19

    The firmware is responsible for the operation of Anger Camera Electronics, calculation of position, time of flight and digital communications. It provides a first stage analysis of 48 signals from 48 analog signals that have been converted to digital values using A/D convertors.

  7. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  8. Estimation of the Spectral Sensitivity Functions of Un-Modified and Modified Commercial Off-The Digital Cameras to Enable Their Use as a Multispectral Imaging System for Uavs

    NASA Astrophysics Data System (ADS)

    Berra, E.; Gibson-Poole, S.; MacArthur, A.; Gaulton, R.; Hamilton, A.

    2015-08-01

    Commercial off-the-shelf (COTS) digital cameras on-board unmanned aerial vehicles (UAVs) have the potential to be used as multispectral imaging systems; however, their spectral sensitivity is usually unknown and needs to be either measured or estimated. This paper details a step by step methodology for identifying the spectral sensitivity of modified (to be response to near infra-red wavelengths) and un-modified COTS digital cameras, showing the results of its application for three different models of camera. Six digital still cameras, which are being used as imaging systems on-board different UAVs, were selected to have their spectral sensitivities measured by a monochromator. Each camera was exposed to monochromatic light ranging from 370 nm to 1100 nm in 10 nm steps, with images of each step recorded in RAW format. The RAW images were converted linearly into TIFF images using DCRaw, an open-source program, before being batch processed through ImageJ (also open-source), which calculated the mean and standard deviation values from each of the red-green-blue (RGB) channels over a fixed central region within each image. These mean values were then related to the relative spectral radiance from the monochromator and its integrating sphere, in order to obtain the relative spectral response (RSR) for each of the cameras colour channels. It was found that different un-modified camera models present very different RSR in some channels, and one of the modified cameras showed a response that was unexpected. This highlights the need to determine the RSR of a camera before using it for any quantitative studies.

  9. Integration of airborne Thematic Mapper Simulator (TMS) data and digitized aerial photography via an ISH transformation. [Intensity Saturation Hue

    NASA Technical Reports Server (NTRS)

    Ambrosia, Vincent G.; Myers, Jeffrey S.; Ekstrand, Robert E.; Fitzgerald, Michael T.

    1991-01-01

    A simple method for enhancing the spatial and spectral resolution of disparate data sets is presented. Two data sets, digitized aerial photography at a nominal spatial resolution of 3.7 meters and TMS digital data at 24.6 meters, were coregistered through a bilinear interpolation to solve the problem of blocky pixel groups resulting from rectification expansion. The two data sets were then subjected to intensity-saturation-hue (ISH) transformations in order to 'blend' the high-spatial-resolution (3.7 m) digitized RC-10 photography with the high spectral (12 bands) and lower spatial (24.6 m) resolution TMS digital data. The resultant merged products ease photointerpretation, make large-scale mapping possible, and can be derived for any of the 12 available TMS spectral bands.
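    The 'blend' step can be illustrated with a minimal intensity-substitution sketch. The abstract does not give the authors' exact ISH formulation, so this uses the simple average-intensity model I = (R + G + B)/3; all names are hypothetical:

    ```python
    # Sketch: IHS-style fusion by intensity substitution. The coarse RGB pixel's
    # intensity is replaced with the co-registered high-resolution intensity,
    # while the R:G:B ratios (hue/saturation) are preserved.

    def ihs_substitute(rgb_low, intensity_high):
        r, g, b = rgb_low
        i_low = (r + g + b) / 3.0
        if i_low == 0.0:                 # black pixel: no chromatic ratios to keep
            return (intensity_high,) * 3
        scale = intensity_high / i_low
        return (r * scale, g * scale, b * scale)

    def fuse(rgb_image, pan_image):
        """Pixel-wise fusion of a resampled multispectral image with a
        high-resolution intensity image of the same dimensions."""
        return [[ihs_substitute(rgb, pan)
                 for rgb, pan in zip(rgb_row, pan_row)]
                for rgb_row, pan_row in zip(rgb_image, pan_image)]
    ```

    The multispectral image must first be resampled to the photography's grid (the bilinear coregistration step above) so the pixel-wise substitution is meaningful.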

  10. An aerial composite imaging method with multiple upright cameras based on axis-shift theory

    NASA Astrophysics Data System (ADS)

    Fang, Junyong; Liu, Xue; Xue, Yongqi; Tong, Qingxi

    2010-11-01

    Several composite camera systems have been built to obtain wide coverage by combining 3 or 4 oblique cameras, using a virtual projection center and virtual image for the geometric correction and mosaicking of images with differing projection angles and spatial resolutions caused by the oblique views. Here, an imaging method based on axis-shift theory is proposed to acquire wide-coverage images with several upright (nadir-pointing) cameras. Four upright camera lenses share the same wide angle of view; the optical axis of each lens is offset from the center of its CCD, so each CCD covers only one part of the whole focal plane. The oblique deformation caused by tilted cameras is avoided by this axis-shift imaging method. The principle and parameters are given and discussed. A prototype camera system was constructed from common DSLR (digital single-lens reflex) cameras. The angle of view can exceed 80 degrees along the flight direction when the focal length is 24 mm, and the ratio of base line to height can exceed 0.7 when the longitudinal overlap is 60%. Original and mosaicked images captured by this prototype system in ground and airborne experiments are presented. Experimental results show that the upright imaging method effectively avoids oblique deformation and meets the geometric precision requirements of image mosaicking.
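    The quoted figures can be checked with two one-line formulas. The composite along-track focal-plane extent (~40.3 mm) is an assumed value, chosen here only to be consistent with the quoted 24 mm focal length and >80 degree angle of view:

    ```python
    import math

    def angle_of_view_deg(sensor_extent_mm, focal_mm):
        """Full angle of view of a central-perspective camera: 2*atan(d / 2f)."""
        return math.degrees(2 * math.atan(sensor_extent_mm / (2 * focal_mm)))

    def base_height_ratio(sensor_extent_mm, focal_mm, overlap):
        """B/H for successive exposures: baseline = (1 - overlap) * footprint,
        and footprint / height = sensor_extent / focal, so height cancels."""
        return (1 - overlap) * sensor_extent_mm / focal_mm
    ```

    With a ~40 mm composite extent and f = 24 mm, the angle of view is just over 80 degrees and the base-to-height ratio at 60% overlap is about 0.67, matching the abstract's ">80 degrees" and "~0.7" claims.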

  11. Target-Tracking Camera for a Metrology System

    NASA Technical Reports Server (NTRS)

    Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David

    2009-01-01

    An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrology system is used to determine the varying relative positions of radiating elements of an airborne synthetic-aperture radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
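    The reason no computer is needed for centroiding is that a 1-D PSD reduces the spot position to a ratio of two end-contact photocurrents. A sketch for one axis (variable names are illustrative):

    ```python
    def psd_position(i_a, i_b, length_mm):
        """Light-spot position along a 1-D position-sensitive detector (PSD).

        i_a and i_b are the photocurrents at the two end contacts. The
        photocurrent divides in inverse proportion to the spot's distance from
        each contact, so the normalised difference gives the position relative
        to the PSD centre: negative toward contact A, positive toward B."""
        return (length_mm / 2.0) * (i_b - i_a) / (i_a + i_b)
    ```

    In the analog camera this ratio is formed directly in circuitry, which is why the update rate is set by analog bandwidth rather than by pixel readout and digitization.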

  12. Hierarchical object-based classification of ultra-high-resolution digital mapping camera (DMC) imagery for rangeland mapping and assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...

  13. Low altitude remote-sensing method to monitor marine and beach litter of various colors using a balloon equipped with a digital camera.

    PubMed

    Kako, Shin'ichiro; Isobe, Atsuhiko; Magome, Shinya

    2012-06-01

    This study aims to establish a low-altitude remote sensing system for surveying litter on a beach or the ocean using a remote-controlled digital camera suspended from a balloon filled with helium gas. The resultant images are processed to identify the litter using a projective transformation method and color difference in the CIELUV color space. Low-altitude remote sensing experimental observations were conducted at two locations in Japan. Although the sizes of the litter and the areas covered are distorted in the original photographs, taken at various angles and heights, the proposed image processing system is capable of identifying object positions with a high degree of accuracy (1-3 m). Furthermore, the color difference approach in the CIELUV color space used in this study is capable of extracting pixels of litter objects of various colors, allowing us to estimate the number of objects from the photographs.
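    The colour-difference step reduces to a Euclidean distance in CIELUV. A minimal sketch, assuming pixels have already been converted to (L*, u*, v*); the threshold value and background colour are hypothetical:

    ```python
    import math

    def delta_e_luv(c1, c2):
        """Euclidean colour difference between two (L*, u*, v*) triples."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

    def litter_pixels(pixels_luv, background_luv, threshold):
        """Indices of pixels whose CIELUV distance from the background colour
        (e.g. the mean sand or sea colour) exceeds the threshold."""
        return [i for i, p in enumerate(pixels_luv)
                if delta_e_luv(p, background_luv) > threshold]
    ```

    Because the distance is taken against a background reference rather than against a fixed litter colour, the same rule extracts litter of many different colours, as the abstract reports.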

  14. A microcontroller-based system for automated and continuous sky glow measurements with the use of digital single-lens reflex cameras

    NASA Astrophysics Data System (ADS)

    Solano Lamphar, Héctor Antonio; Kundracik, Frantisek

    2014-02-01

    In recent years, the scientific community has shown an increased interest in sky glow research. This has revealed an increased need for automated technology that enables continuous evaluation of sky glow. As a result, a reliable low-cost platform has been developed and constructed for automating sky glow measurement. The core of the system is embedded software and hardware managed by a microcontroller with ARM architecture. A monolithic photodiode transimpedance amplifier is used to allow linear light measurement. Data from the diode are collected and used to arrange the exposure time of every image captured by the digital single-lens reflex camera. This proposal supports experimenters by providing a low-cost system to analyse sky glow variations overnight without a human interface.

  15. In vivo multispectral imaging of the absorption and scattering properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Ishizuka, Tomohiro; Mizushima, Chiharu; Nishidate, Izumi; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-04-01

    To evaluate multi-spectral images of the absorption and scattering properties in the cerebral cortex of rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital red-green-blue camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters. The spectral images of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters. We performed in vivo experiments on exposed rat brain to confirm the feasibility of this method. The estimated images of the absorption coefficients were dominated by hemoglobin spectra. The estimated images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature.
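    A Wiener estimation matrix of the kind used above can be trained from samples with known spectra as W = R V^T (V V^T)^-1, mapping a camera RGB vector to an estimated spectrum. A pure-Python sketch under that simplified (noise-free) formulation, with illustrative names and tiny matrix helpers:

    ```python
    # Sketch: train a Wiener-style estimation matrix from (spectrum, RGB) pairs.

    def _mat_mul(a, b):
        return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
                for row in a]

    def _transpose(a):
        return [list(row) for row in zip(*a)]

    def _inv3(m):
        # Inverse of a 3x3 matrix via the adjugate.
        (a, b, c), (d, e, f), (g, h, i) = m
        det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
        adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
               [f*g - d*i, a*i - c*g, c*d - a*f],
               [d*h - e*g, b*g - a*h, a*e - b*d]]
        return [[x / det for x in row] for row in adj]

    def wiener_matrix(spectra, rgb):
        """spectra: n_samples x n_wavelengths; rgb: n_samples x 3.
        Returns W (n_wavelengths x 3) with  spectrum ~= W @ rgb_vector."""
        R, V = _transpose(spectra), _transpose(rgb)   # columns = samples
        Vt = _transpose(V)
        return _mat_mul(_mat_mul(R, Vt), _inv3(_mat_mul(V, Vt)))

    def estimate_spectrum(W, rgb_vec):
        return [sum(w * x for w, x in zip(row, rgb_vec)) for row in W]
    ```

    In the paper the estimated reflectance spectra then feed a Monte Carlo-based regression for absorption and scattering; the sketch covers only the first, spectral-estimation stage.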

  16. Assessment of the Spatial Co-registration of Multitemporal Imagery from Large Format Digital Cameras in the Context of Detailed Change Detection

    PubMed Central

    Coulter, Lloyd L.; Stow, Douglas A.

    2008-01-01

    Large format digital camera (LFDC) systems are becoming more broadly available and regularly collect image data over large areas. Spectral and radiometric attributes of imagery from LFDC systems make this type of image data appropriate for semi-automated change detection. However, achieving accurate spatial co-registration between multitemporal image sets is necessary for semi-automated change detection. This study investigates the accuracy of co-registration between multitemporal image sets acquired using the Leica Geosystems ADS40, Intergraph Z/I Imaging® DMC, and Vexcel UltraCam-D sensors in areas of gentle, moderate, and extreme terrain relief. Custom image sets were collected and orthorectified by imagery vendors, with guidance from the authors. Results indicate that imagery acquired by vendors operating LFDC systems may be co-registered with pixel or sub-pixel level accuracy, even for environments with high terrain relief. Specific image acquisition and processing procedures facilitating this level of co-registration are discussed.

  17. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  18. Increasing Electrochemiluminescence Intensity of a Wireless Electrode Array Chip by Thousands of Times Using a Diode for Sensitive Visual Detection by a Digital Camera.

    PubMed

    Qi, Liming; Xia, Yong; Qi, Wenjing; Gao, Wenyue; Wu, Fengxia; Xu, Guobao

    2016-01-19

    Both a wireless electrochemiluminescence (ECL) electrode microarray chip and the dramatic increase in ECL by embedding a diode in an electromagnetic receiver coil have been first reported. The newly designed device consists of a chip and a transmitter. The chip has an electromagnetic receiver coil, a mini-diode, and a gold electrode array. The mini-diode can rectify alternating current into direct current and thus enhance ECL intensities by 18 thousand times, enabling a sensitive visual detection using common cameras or smart phones as low cost detectors. The detection limit of hydrogen peroxide using a digital camera is comparable to that using photomultiplier tube (PMT)-based detectors. Coupled with a PMT-based detector, the device can detect luminol with higher sensitivity with linear ranges from 10 nM to 1 mM. Because of the advantages including high sensitivity, high throughput, low cost, high portability, and simplicity, it is promising in point of care testing, drug screening, and high throughput analysis. PMID:26669809

  19. ProgRes 3000: a digital color camera with a 2-D array CCD sensor and programmable resolution up to 2994 x 2320 picture elements

    NASA Astrophysics Data System (ADS)

    Lenz, Reimar K.; Lenz, Udo

    1990-11-01

    A newly developed imaging principle, two-dimensional microscanning with Piezo-controlled Aperture Displacement (PAD), allows for high image resolutions. The advantages of line scanners (high resolution) are combined with those of CCD area sensors (high light sensitivity, geometrical accuracy and stability, easy focussing, illumination control, and selection of the field of view by means of TV real-time imaging). A custom-designed sensor, optimized for small sensor-element apertures and color fidelity, eliminates the need for color filter revolvers or mechanical shutters and guarantees good color convergence. By altering the computer-controlled microscan patterns, spatial and temporal resolution become interchangeable, their product being a constant. The highest temporal resolution is TV real-time (50 fields/sec); the highest spatial resolution is 2994 x 2320 picture elements (Pels) for each of the three color channels (28 MBytes of raw image data in 8 sec). Thus, for the first time, it becomes possible to take 35mm-slide-quality still color images of natural 3D scenes by purely electronic means. Nearly "square" Pels as well as hexagonal sampling schemes are possible. Excellent geometrical accuracy and low noise are guaranteed by sensor-element (Sel) synchronous analog-to-digital conversion within the camera head. The camera's principle of operation and the procedure to calibrate the two-dimensional piezo-mechanical motion with an accuracy of better than 0.2 μm RMSE in image space are explained. The remaining positioning inaccuracy may be further

  20. Camera-based measurement for transverse vibrations of moving catenaries in mine hoists using digital image processing techniques

    NASA Astrophysics Data System (ADS)

    Yao, Jiannan; Xiao, Xingming; Liu, Yao

    2016-03-01

    This paper proposes a novel, non-contact, sensing method to measure the transverse vibrations of hoisting catenaries in mine hoists. Hoisting catenaries are typically moving cables and it is not feasible to use traditional methods to measure their transverse vibrations. In order to obtain the transverse displacements of an arbitrary point in a moving catenary, by superposing a mask image having the predefined reference line perpendicular to the hoisting catenaries on each frame of the processed image sequence, the dynamic intersecting points with a grey value of 0 in the image sequence could be identified. Subsequently, by traversing the coordinates of the pixel with a grey value of 0 and calculating the distance between the identified dynamic points from the reference, the transverse displacements of the selected arbitrary point in the hoisting catenary can be obtained. Furthermore, based on a theoretical model, the reasonability and applicability of the proposed camera-based method were confirmed. Additionally, a laboratory experiment was also carried out, which then validated the accuracy of the proposed method. The research results indicate that the proposed camera-based method is suitable for the measurement of the transverse vibrations of moving cables.
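    The displacement extraction described (locate the grey-value-0 catenary pixels, measure their offset from the reference line) can be sketched per image row. The names and the convention of taking the centre of the dark cross-section are illustrative, not the authors' exact implementation:

    ```python
    def transverse_displacement(row, ref_col):
        """Signed transverse offset of the catenary in one image row.

        `row` is a list of grey values from the processed (binarised) frame;
        catenary pixels have grey value 0. Returns the offset of the centre of
        the dark cross-section from the reference column, or None if the
        catenary does not cross this row."""
        cols = [c for c, v in enumerate(row) if v == 0]
        if not cols:
            return None
        centre = sum(cols) / len(cols)
        return centre - ref_col
    ```

    Applying this to the same row of every frame in the sequence yields the transverse-vibration time series of the chosen point on the moving catenary.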

  1. Quantitative single-particle digital autoradiography with α-particle emitters for targeted radionuclide therapy using the iQID camera

    SciTech Connect

    Miller, Brian W.; Frost, Sofia H. L.; Frayo, Shani L.; Kenoyer, Aimee L.; Santos, Erlinda; Jones, Jon C.; Orozco, Johnnie J.; Green, Damian J.; Press, Oliver W.; Pagel, John M.; Sandmaier, Brenda M.; Hamlin, Donald K.; Wilbur, D. Scott; Fisher, Darrell R.

    2015-07-15

    Purpose: Alpha-emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm), causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with α emitters may thus inactivate targeted cells with minimal radiation damage to surrounding tissues. Tools are needed to visualize and quantify the radioactivity distribution and absorbed doses to targeted and nontargeted cells for accurate dosimetry of all treatment regimens utilizing α particles, including RIT and others (e.g., Ra-223), especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, the ionizing-radiation quantum imaging detector (iQID) camera, for use in α-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection system that images and identifies charged-particle and gamma-ray/x-ray emissions spatially and temporally on an event-by-event basis. It employs CCD-CMOS cameras and high-performance computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, the authors evaluated its characteristics for α-particle imaging, including measurements of intrinsic detector spatial resolutions and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (211At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ∼20 μm full width at half maximum and the α-particle background was measured at a rate as low as (2.6 ± 0.5) × 10−4 cpm/cm2 (40 mm diameter detector area

  2. Quantitative single-particle digital autoradiography with α-particle emitters for targeted radionuclide therapy using the iQID camera

    PubMed Central

    Miller, Brian W.; Frost, Sofia H. L.; Frayo, Shani L.; Kenoyer, Aimee L.; Santos, Erlinda; Jones, Jon C.; Green, Damian J.; Hamlin, Donald K.; Wilbur, D. Scott; Orozco, Johnnie J.; Press, Oliver W.; Pagel, John M.; Sandmaier, Brenda M.

    2015-01-01

    Purpose: Alpha-emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm), causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with α emitters may thus inactivate targeted cells with minimal radiation damage to surrounding tissues. Tools are needed to visualize and quantify the radioactivity distribution and absorbed doses to targeted and nontargeted cells for accurate dosimetry of all treatment regimens utilizing α particles, including RIT and others (e.g., Ra-223), especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, the ionizing-radiation quantum imaging detector (iQID) camera, for use in α-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection system that images and identifies charged-particle and gamma-ray/x-ray emissions spatially and temporally on an event-by-event basis. It employs CCD-CMOS cameras and high-performance computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, the authors evaluated its characteristics for α-particle imaging, including measurements of intrinsic detector spatial resolutions and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (211At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ∼20 μm full width at half maximum and the α-particle background was measured at a rate as low as (2.6 ± 0.5) × 10−4 cpm/cm2 (40 mm diameter detector area). Simultaneous imaging of

  3. Accuracy Potential and Applications of MIDAS Aerial Oblique Camera System

    NASA Astrophysics Data System (ADS)

    Madani, M.

    2012-07-01

    Airborne oblique cameras such as the Fairchild T-3A were initially used for military reconnaissance in the 1930s. A modern professional digital oblique camera system such as MIDAS (Multi-camera Integrated Digital Acquisition System) is used to generate lifelike three-dimensional views for visualization, GIS applications, architectural modeling, city modeling, games, simulators, etc. Oblique imagery provides the best vantage for assessing and reviewing changes to the local government tax base and property valuation assessments, and supports better, more timely decisions in the buying and selling of residential and commercial property. Oblique imagery is also used for infrastructure monitoring, helping ensure the safe operation of transportation, utilities, and facilities. Sanborn Mapping Company acquired one MIDAS from TrackAir in 2011. This system consists of four tilted (45 degrees) cameras and one vertical camera connected to a dedicated data acquisition computer system. The 5 digital cameras are based on the Canon EOS 1DS Mark3 with Zeiss lenses. The CCD size is 5,616 by 3,744 (21 MPixels) with a pixel size of 6.4 microns. Multiple flights using different camera configurations (nadir/oblique (28 mm/50 mm) and (50 mm/50 mm)) were flown over downtown Colorado Springs, Colorado. Boresight flights for the 28 mm nadir camera were flown at 600 m and 1,200 m, and for the 50 mm nadir camera at 750 m and 1,500 m. Cameras were calibrated by using a 3D cage and multiple convergent images utilizing the Australis model. In this paper, the MIDAS system is described; a number of real data sets collected during the aforementioned flights are presented together with their associated flight configurations; data processing, system calibration, and quality control workflows are highlighted; and the achievable accuracy is presented in some detail. This study revealed that an expected accuracy of about 1 to 1.5 GSD (Ground Sample Distance) for planimetry and about 2 to 2.5 GSD for vertical can be achieved. Remaining systematic
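    The accuracy figures above are expressed in multiples of the GSD, which follows directly from the pixel pitch, focal length, and flying height quoted in the abstract (6.4 μm pixels, 28 mm nadir lens, 600 m flight). A small sketch of that arithmetic; the helper names are illustrative:

    ```python
    def ground_sample_distance(pixel_pitch_um, focal_mm, height_m):
        """GSD in metres: the pixel pitch projected to the ground, p * H / f."""
        return pixel_pitch_um * 1e-6 * height_m / (focal_mm * 1e-3)

    def expected_accuracy_m(gsd_m):
        """The rule-of-thumb ranges reported above: 1-1.5 GSD planimetric,
        2-2.5 GSD vertical."""
        return {'planimetric': (1.0 * gsd_m, 1.5 * gsd_m),
                'vertical': (2.0 * gsd_m, 2.5 * gsd_m)}
    ```

    At 600 m with the 28 mm lens this gives a GSD of roughly 0.14 m, so the reported accuracy corresponds to about 14-21 cm planimetric and 27-34 cm vertical.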

  4. Oblique Multi-Camera Systems - Orientation and Dense Matching Issues

    NASA Astrophysics Data System (ADS)

    Rupnik, E.; Nex, F.; Remondino, F.

    2014-03-01

    The use of oblique imagery has become a standard for many civil and mapping applications, thanks to the development of airborne digital multi-camera systems, as proposed by many companies (Blomoblique, IGI, Leica, Midas, Pictometry, Vexcel/Microsoft, VisionMap, etc.). The indisputable virtue of oblique photography lies in its simplicity of interpretation and understanding for inexperienced users, allowing the use of oblique images in very different applications, such as building detection and reconstruction, building structural damage classification, road land updating, and administration services. The paper reports an overview of current commercial oblique camera systems and presents a workflow for the automated orientation and dense matching of large image blocks. Perspectives, potentialities, pitfalls and suggestions for achieving satisfactory results are given. Tests performed on two datasets acquired with two multi-camera systems over urban areas are also reported.

  5. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.

  6. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images from digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse sciences such as computing, mechanics, thermal physics, optics, electricity, and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  7. Single chip camera active pixel sensor

    NASA Technical Reports Server (NTRS)

    Shaw, Timothy (Inventor); Pain, Bedabrata (Inventor); Olson, Brita (Inventor); Nixon, Robert H. (Inventor); Fossum, Eric R. (Inventor); Panicacci, Roger A. (Inventor); Mansoorian, Barmak (Inventor)

    2003-01-01

    A totally digital single chip camera includes communications circuitry that operates most of its structure in serial communication mode. The digital single chip camera includes a D/A converter for converting an input digital word into an analog reference signal. The chip includes all of the circuitry necessary for operating the chip using a single pin.

  8. Validating NASA's Airborne Multikilohertz Microlaser Altimeter (Microaltimeter) by Direct Comparison of Data Taken Over Ocean City, Maryland Against an Existing Digital Elevation Model

    NASA Technical Reports Server (NTRS)

    Abel, Peter

    2003-01-01

    NASA's Airborne Multikilohertz Microlaser Altimeter (Microaltimeter) is a scanning, photon-counting laser altimeter, which uses a low energy (less than 10 microjoules), high repetition rate (approximately 10 kHz) laser, transmitting at 532 nm. A 14 cm diameter telescope images the ground return onto a segmented anode photomultiplier, which provides up to 16 range returns for each fire. Multiple engineering flights were made during 2001 and 2002 over the Maryland and Virginia coastal area, all during daylight hours. Post-processing of the data to geolocate the laser footprint and determine the terrain height requires post-detection Poisson filtering techniques to extract the actual ground returns from the noise. Validation of the instrument's ability to produce accurate terrain heights will be accomplished by direct comparison of data taken over Ocean City, Maryland with a Digital Elevation Model (DEM) of the region produced at Ohio State University (OSU) from other laser altimeter and photographic sources. The techniques employed to produce terrain heights from the Microaltimeter ranges will be shown, along with some preliminary comparisons with the OSU DEM.
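    One simple form of the post-detection Poisson filtering idea is to histogram the photon ranges and keep only the bins whose counts exceed the expected noise level. This sketch is a generic illustration of the principle, not NASA's actual algorithm; the mean + k*sqrt(mean) cut and the bin size are assumptions:

    ```python
    def poisson_filter(ranges, bin_size, k=3.0):
        """Return the centres of range bins whose photon count exceeds the
        Poisson noise expectation by k standard deviations (mean + k*sqrt(mean)).
        Solar-background photons spread roughly uniformly over the range gate,
        while true ground returns pile up in a few bins."""
        if not ranges:
            return []
        lo = min(ranges)
        nbins = int((max(ranges) - lo) / bin_size) + 1
        counts = [0] * nbins
        for r in ranges:
            counts[int((r - lo) / bin_size)] += 1
        mean = len(ranges) / nbins          # expected count if all were noise
        cut = mean + k * mean ** 0.5
        return [lo + (i + 0.5) * bin_size
                for i, c in enumerate(counts) if c > cut]
    ```

    Such a cut is what makes daylight operation with a sub-10-microjoule laser workable: individual returns are indistinguishable from noise, but their range-correlated accumulation is not.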

  9. Development and Application of a new DACOM Airborne Trace Gas Instrument based on Room-Temperature Laser and Detector Technology and all-Digital Control and Data Processing

    NASA Astrophysics Data System (ADS)

    Diskin, G. S.; Sachse, G. W.; DiGangi, J. P.; Pusede, S. E.; Slate, T. A.; Rana, M.

    2014-12-01

    The DACOM (Differential Absorption Carbon monOxide Measurements) instrument has been used for airborne measurements of carbon monoxide, methane, and nitrous oxide for nearly four decades. Over the years, the instrument has undergone a nearly continuous series of modifications, taking advantage of improvements in available technology and the benefits of experience, but always utilizing cryogenically cooled lasers and detectors. More recently, though, the availability of room-temperature, higher-power single-mode lasers at the mid-infrared wavelengths used by DACOM has made it possible to replace both the cryogenic lasers and detectors with thermoelectrically cooled versions. And the relative stability of these lasers has allowed us to incorporate an all-digital wavelength stabilization technique developed previously for the Diode Laser Hygrometer (DLH) instrument. The new DACOM flew first in the summer 2013 SEAC4RS campaign, measuring CO from the DC-8 aircraft, and more recently measuring all three gases from the NASA P-3B aircraft in support of the summer 2014 DISCOVER-AQ campaign. We will present relevant aspects of the new instrument design and operation as well as selected data from recent campaigns illustrating instrument performance and some preliminary science.

  10. New approach to color calibration of high fidelity color digital camera by using unique wide gamut color generator based on LED diodes

    NASA Astrophysics Data System (ADS)

    Kretkowski, M.; Shimodaira, Y.; Jabłoński, R.

    2008-11-01

    Development of a high accuracy color reproduction system requires certain instrumentation and a reference for color calibration. Our research led to the development of a high fidelity color digital camera with implemented filters that realize the color matching functions. The output signal returns XYZ values, which provide an absolute description of color. In order to produce XYZ output, a mathematical conversion must be applied to the CCD output values, introducing a conversion matrix. The conversion matrix coefficients are calculated by using a color reference with known XYZ values and the corresponding output signals from the CCD sensor under each filter, acquired from a number of color samples. The most important feature of the camera is its ability to acquire colors from the complete theoretically visible color gamut, due to the implemented filters. However, market-available color references such as various color checkers are enclosed within the HDTV gamut, which is insufficient for calibration over the whole operating color range. This led to the development of a unique color reference based on LEDs, called the LED Color Generator (LED CG). It is capable of displaying colors in a wide color gamut estimated by the chromaticity coordinates of 12 primary colors. The total number of colors possible to produce is 255^12. The biggest advantage is the possibility of displaying colors with a desired spectral distribution (with certain approximations) due to the multiple primary colors it comprises. The average color difference obtained for test colors was found to be ΔE ≈ 0.78 for calibration with the LED CG. The result is much better and more repeatable in comparison with the Macbeth ColorChecker™, which typically gives ΔE ≈ 1.2 and, in the best case, ΔE ≈ 0.83 with specially developed techniques.
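    The conversion-matrix fit described above can be sketched as an ordinary least-squares solve over the reference colours, M = X S^T (S S^T)^-1 via the normal equations. This is a generic illustration of the fitting step, not the authors' exact procedure, and the data layout is hypothetical:

    ```python
    # Sketch: fit a 3x3 matrix M so that  XYZ ~= M @ sensor_output  over a set
    # of calibration colours with known XYZ values.

    def _mul(a, b):
        return [[sum(x * y for x, y in zip(r, c)) for c in zip(*b)] for r in a]

    def _inv3(m):
        # Inverse of a 3x3 matrix via the adjugate.
        (a, b, c), (d, e, f), (g, h, i) = m
        det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
        adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
               [f*g - d*i, a*i - c*g, c*d - a*f],
               [d*h - e*g, b*g - a*h, a*e - b*d]]
        return [[x / det for x in row] for row in adj]

    def conversion_matrix(xyz_refs, sensor_outs):
        """Least-squares fit M = X S^T (S S^T)^-1.
        xyz_refs and sensor_outs are lists of per-sample 3-vectors."""
        X = [list(col) for col in zip(*xyz_refs)]      # 3 x n
        S = [list(col) for col in zip(*sensor_outs)]   # 3 x n
        St = [list(r) for r in zip(*S)]                # n x 3
        return _mul(_mul(X, St), _inv3(_mul(S, St)))
    ```

    The wide-gamut LED CG matters here precisely because the fitted M is only trustworthy inside the gamut spanned by the calibration samples S.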

  11. Quantitative Single-Particle Digital Autoradiography with α-Particle Emitters for Targeted Radionuclide Therapy using the iQID Camera

    SciTech Connect

    Miller, Brian W.; Frost, Sophia; Frayo, Shani; Kenoyer, Aimee L.; Santos, E. B.; Jones, Jon C.; Green, Damian J.; Hamlin, Donald K.; Wilbur, D. Scott; Fisher, Darrell R.; Orozco, Johnnie J.; Press, Oliver W.; Pagel, John M.; Sandmaier, B. M.

    2015-07-01

    Alpha-emitting radionuclides exhibit a potential advantage for cancer treatments because they release large amounts of ionizing energy over a few cell diameters (50–80 μm), causing localized, irreparable double-strand DNA breaks that lead to cell death. Radioimmunotherapy (RIT) approaches using monoclonal antibodies labeled with alpha emitters may inactivate targeted cells with minimal radiation damage to surrounding tissues. For accurate dosimetry in alpha-RIT, tools are needed to visualize and quantify the radioactivity distribution and absorbed dose to targeted and non-targeted cells, especially for organs and tumors with heterogeneous radionuclide distributions. The aim of this study was to evaluate and characterize a novel single-particle digital autoradiography imager, iQID (ionizing-radiation Quantum Imaging Detector), for use in alpha-RIT experiments. Methods: The iQID camera is a scintillator-based radiation detection technology that images and identifies charged-particle and gamma-ray/X-ray emissions spatially and temporally on an event-by-event basis. It employs recent advances in CCD/CMOS cameras and computing hardware for real-time imaging and activity quantification of tissue sections, approaching cellular resolutions. In this work, we evaluated this system's characteristics for alpha particle imaging, including measurements of spatial resolution and background count rates at various detector configurations and quantification of activity distributions. The technique was assessed for quantitative imaging of astatine-211 (211At) activity distributions in cryosections of murine and canine tissue samples. Results: The highest spatial resolution was measured at ~20 μm full width at half maximum (FWHM) and the alpha particle background was measured at a rate of (2.6 ± 0.5) × 10−4 cpm/cm2 (40 mm diameter detector area). Simultaneous imaging of multiple tissue sections was performed using a large-area iQID configuration (ø 11.5 cm

  12. High-resolution digital elevation model of lower Cowlitz and Toutle Rivers, adjacent to Mount St. Helens, Washington, based on an airborne lidar survey of October 2007

    USGS Publications Warehouse

    Mosbrucker, Adam

    2015-01-01

    The lateral blast, debris avalanche, and lahars of the May 18th, 1980, eruption of Mount St. Helens, Washington, dramatically altered the surrounding landscape. Lava domes were extruded during the subsequent eruptive periods of 1980–1986 and 2004–2008. More than three decades after the emplacement of the 1980 debris avalanche, high sediment production persists in the Toutle River basin, which drains the northern and western flanks of the volcano. Because this sediment increases the risk of flooding to downstream communities on the Toutle and lower Cowlitz Rivers, the U.S. Army Corps of Engineers (USACE), under the direction of Congress to maintain an authorized level of flood protection, continues to monitor and mitigate excess sediment in North and South Fork Toutle River basins to help reduce this risk and to prevent sediment from clogging the shipping channel of the Columbia River. From October 22–27, 2007, Watershed Sciences, Inc., under contract to USACE, collected high-precision airborne lidar (light detection and ranging) data that cover 273 square kilometers (105 square miles) of lower Cowlitz and Toutle River tributaries from the Columbia River at Kelso, Washington, to upper North Fork Toutle River (below the volcano's edifice), including lower South Fork Toutle River. These data provide a digital dataset of the ground surface, including beneath forest cover. Such remotely sensed data can be used to develop sediment budgets and models of sediment erosion, transport, and deposition. The U.S. Geological Survey (USGS) used these lidar data to develop digital elevation models (DEMs) of the study area. DEMs are fundamental to monitoring natural hazards and studying volcanic landforms, fluvial and glacial geomorphology, and surface geology. Watershed Sciences, Inc., provided files in the LASer (LAS) format containing laser returns that had been filtered, classified, and georeferenced. The USGS produced a hydro-flattened DEM from ground-classified points at

  13. High-resolution digital elevation model of Mount St. Helens crater and upper North Fork Toutle River basin, Washington, based on an airborne lidar survey of September 2009

    USGS Publications Warehouse

    Mosbrucker, Adam

    2014-01-01

    The lateral blast, debris avalanche, and lahars of the May 18th, 1980, eruption of Mount St. Helens, Washington, dramatically altered the surrounding landscape. Lava domes were extruded during the subsequent eruptive periods of 1980–1986 and 2004–2008. More than three decades after the emplacement of the 1980 debris avalanche, high sediment production persists in the North Fork Toutle River basin, which drains the northern flank of the volcano. Because this sediment increases the risk of flooding to downstream communities on the Toutle and Cowlitz Rivers, the U.S. Army Corps of Engineers (USACE), under the direction of Congress to maintain an authorized level of flood protection, built a sediment retention structure on the North Fork Toutle River in 1989 to help reduce this risk and to prevent sediment from clogging the shipping channel of the Columbia River. From September 16–20, 2009, Watershed Sciences, Inc., under contract to USACE, collected high-precision airborne lidar (light detection and ranging) data that cover 214 square kilometers (83 square miles) of Mount St. Helens and the upper North Fork Toutle River basin from the sediment retention structure to the volcano's crater. These data provide a digital dataset of the ground surface, including beneath forest cover. Such remotely sensed data can be used to develop sediment budgets and models of sediment erosion, transport, and deposition. The U.S. Geological Survey (USGS) used these lidar data to develop digital elevation models (DEMs) of the study area. DEMs are fundamental to monitoring natural hazards and studying volcanic landforms, fluvial and glacial geomorphology, and surface geology. Watershed Sciences, Inc., provided files in the LASer (LAS) format containing laser returns that had been filtered, classified, and georeferenced. The USGS produced a hydro-flattened DEM from ground-classified points at Castle, Coldwater, and Spirit Lakes. Final results averaged about five laser last

  14. A Low Noise, Microprocessor-Controlled, Internally Digitizing Rotating-Vane Electric Field Mill for Airborne Platforms

    NASA Technical Reports Server (NTRS)

    Bateman, M. G.; Stewart, M. F.; Blakeslee, R. J.; Podgorny, S. J.; Christian, H. J.; Mach, D. M.; Bailey, J. C.; Daskar, D.

    2006-01-01

    This paper reports on a new generation of aircraft-based rotating-vane style electric field mills designed and built at NASA's Marshall Spaceflight Center. The mills have individual microprocessors that digitize the electric field signal at the mill and respond to commands from the data system computer. The mills are very sensitive (1 V/m per bit), have a wide dynamic range (115 dB), and are very low noise (+/-1 LSB). Mounted on an aircraft, these mills can measure fields from +/-1 V/m to +/-500 kV/m. Once-per-second commanding from the data collection computer to each mill allows for precise timing and synchronization. The mills can also be commanded to execute a self-calibration in flight, which is done periodically to monitor the status and health of each mill.
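    The quoted specifications imply a simple count-to-field scaling. A minimal sketch (the constant and function names are hypothetical, with the scale factors taken from the abstract) showing that a 1 V/m least-significant bit over a ±500 kV/m range is consistent with the stated ~115 dB dynamic range:

    ```python
    import math

    # Scaling implied by the stated specs: 1 V/m per bit (LSB)
    # over a +/-500 kV/m measurement range.
    LSB_V_PER_M = 1.0           # field increment per count
    FULL_SCALE_V_PER_M = 500e3  # maximum measurable field magnitude

    def counts_to_field(counts: int) -> float:
        """Convert a raw mill reading (signed counts) to an electric field in V/m."""
        return counts * LSB_V_PER_M

    # Dynamic range in dB = 20*log10(largest / smallest resolvable field).
    dynamic_range_db = 20 * math.log10(FULL_SCALE_V_PER_M / LSB_V_PER_M)
    print(round(dynamic_range_db))  # → 114, consistent with the quoted 115 dB
    ```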

  15. Use of a digital camera onboard an unmanned aerial vehicle to monitor spring phenology at individual tree level

    NASA Astrophysics Data System (ADS)

    Berra, Elias; Gaulton, Rachel; Barr, Stuart

    2016-04-01

    The monitoring of forest phenology, in a cost-effective manner, at a fine spatial scale and over relatively large areas remains a significant challenge. To address this issue, unmanned aerial vehicles (UAVs) appear as a potential new option for forest phenology monitoring. The aim of this study is to assess the potential of imagery acquired from a UAV to track seasonal changes in the leaf canopy at the individual tree level. UAV flights, deploying consumer-grade standard and near-infrared-modified cameras, were carried out over a deciduous woodland during the spring season of 2015, from which a temporal series of calibrated and georeferenced 5 cm spatial resolution orthophotos was generated. Initial results from a subset of trees are presented in this paper. Four trees with different observed Start of Season (SOS) dates were selected to monitor UAV-derived Green Chromatic Coordinate (GCC), as a measure of canopy greenness. Mean GCC values were extracted from within the four individual tree crowns and were plotted against the day of year (DOY) when the data were acquired. The temporal GCC trajectory of each tree was associated with the visual observations of leaf canopy phenology (SOS) and also with the development of understory vegetation. The chronological order in which sudden increases of GCC values occurred matched the chronological order of observed SOS: the first sudden increase in GCC was detected in the tree which first reached SOS; 18.5 days later (on average) the last sudden increase of GCC was detected in the tree which last reached SOS (18 days later than the first one). Trees with later observed SOS presented GCC values increasing slowly over time, which were associated with development of understory vegetation. Ongoing work addresses: 1) testing different indices; 2) radiometric calibration (retrieval of spectral reflectance); 3) expanding the analysis to more individual trees, more tree species, and larger forest areas; and 4) deriving
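    The Green Chromatic Coordinate used here is conventionally computed as G/(R+G+B). A minimal sketch of extracting a mean GCC over a crown mask, assuming NumPy arrays; the toy pixel values are illustrative, not data from the study:

    ```python
    import numpy as np

    def green_chromatic_coordinate(rgb: np.ndarray, mask: np.ndarray) -> float:
        """Mean GCC = G / (R + G + B) over the pixels selected by `mask`
        (e.g. a delineated tree crown). `rgb` is an H x W x 3 array."""
        r, g, b = (rgb[..., i].astype(float) for i in range(3))
        total = r + g + b
        # Guard against division by zero on black pixels.
        gcc = np.where(total > 0, g / np.where(total > 0, total, 1), 0.0)
        return float(gcc[mask].mean())

    # Toy example: a 2 x 2 "crown" that is mostly green.
    img = np.array([[[50, 150, 50], [60, 180, 60]],
                    [[40, 120, 40], [55, 165, 55]]], dtype=np.uint8)
    crown = np.ones((2, 2), dtype=bool)
    print(round(green_chromatic_coordinate(img, crown), 2))  # → 0.6
    ```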

  16. Airborne system for testing multispectral reconnaissance technologies

    NASA Astrophysics Data System (ADS)

    Schmitt, Dirk-Roger; Doergeloh, Heinrich; Keil, Heiko; Wetjen, Wilfried

    1999-07-01

    There is an increasing demand for future airborne reconnaissance systems that obtain aerial images for tactical or peacekeeping operations. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensor systems and real-time, jam-resistant data transmission capabilities are of particular interest. An airborne experimental platform has been developed as a testbed to investigate different concepts of reconnaissance systems before their application in UAVs. It is based on a Dornier DO 228 aircraft, which serves as the flying platform. Great care has been taken to allow testing of different kinds of multispectral sensors: the platform can be equipped with an IR sensor head, high-resolution aerial cameras covering the whole optical spectrum, and radar systems. The onboard equipment further includes systems for digital image processing, compression, coding, and storage. The data are RF-transmitted to the ground station using technologies with high jam resistance. The images, after merging with enhanced-vision components, are delivered to the observer, who has an uplink data channel available to control flight and imaging parameters.

  17. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has eight-bit resolution to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  18. Introduction to an airborne remote sensing system equipped onboard the Chinese marine surveillance plane

    NASA Astrophysics Data System (ADS)

    Gong, Fang; Wang, Difeng; Pan, Delu; Hao, Zengzhou

    2008-10-01

    The airborne remote sensing system onboard the Chinese Marine Surveillance Plane currently comprises three instruments: the marine airborne multi-spectrum scanner (MAMS), the airborne hyperspectral system (AISA+), and the optical-electric platform (MOP). MAMS, developed by the Shanghai Institute of Technology and Physics, CAS, has 11 bands from ultraviolet to infrared and is mainly used for inversion of the main oceanic factors and pollution information, such as chlorophyll, sea surface temperature, and red tide. The AISA+, made by the Finnish company Specim, is a push-broom system consisting of a hyperspectral scanner head, a miniature GPS/INS sensor, and a data-collecting PC. It is an aviation imaging spectrometer with the ability to image ground targets and measure their spectral characteristics. The MOP mainly supports object watching, recording, and tracking. It includes three devices: a Sony DXC-390 digital CCD, a Canon EOS film camera, and a Sony F717 digital camera. This paper introduces these three remote sensing instruments as well as the ground information processing system, covering the system's hardware and software design, related algorithm research, etc.

  19. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.

  20. Assessing modern ground survey methods and airborne laser scanning for digital terrain modelling: A case study from the Lake District, England

    NASA Astrophysics Data System (ADS)

    Gallay, Michal; Lloyd, Christopher D.; McKinley, Jennifer; Barry, Lorraine

    2013-02-01

    This paper compares the applicability of three ground survey methods for modelling terrain: one-man electronic tachymetry (TPS), real-time kinematic GPS (GPS), and terrestrial laser scanning (TLS). Vertical accuracy of digital terrain models (DTMs) derived from GPS, TLS, and airborne laser scanning (ALS) data is assessed. Point elevations acquired by the four methods represent two sections of a mountainous area in Cumbria, England, chosen to minimize the presence of non-terrain features. The vertical accuracy of the DTMs was addressed by subtracting each DTM from TPS point elevations. The error was assessed using exploratory measures including statistics, histograms, and normal probability plots. The results showed that the internal measurement accuracy of TPS, GPS, and TLS was below a centimetre. TPS and GPS can be considered equally applicable alternatives for sampling the terrain in areas accessible on foot. The highest DTM vertical accuracy was achieved with GPS data, both on sloped terrain (RMSE 0.16 m) and flat terrain (RMSE 0.02 m). TLS surveying was the most efficient overall, but the fidelity of terrain representation suffered under dense vegetation cover. Therefore, the DTM accuracy was the lowest for the sloped area with dense bracken (RMSE 0.52 m), although it was the second highest on the flat unobscured terrain (RMSE 0.07 m). ALS data represented the sloped terrain more realistically (RMSE 0.23 m) than the TLS. However, due to a systematic bias identified on the flat terrain, the DTM accuracy there was the lowest (RMSE 0.29 m), which exceeded the level stated by the data provider. The error distribution was more closely approximated by a normal distribution defined using the median and normalized median absolute deviation, which supports the use of robust measures in DEM error modelling and its propagation.
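    The robust spread measure mentioned here, the normalized median absolute deviation (NMAD = 1.4826 × MAD, which matches the standard deviation for Gaussian errors), can be contrasted with RMSE. A short sketch with hypothetical DTM residuals, illustrating why robust measures are preferred when gross outliers are present:

    ```python
    import numpy as np

    def rmse(errors):
        """Root-mean-square error of a set of elevation residuals."""
        e = np.asarray(errors, dtype=float)
        return float(np.sqrt(np.mean(e ** 2)))

    def nmad(errors):
        """Normalized median absolute deviation: a robust spread estimate that
        equals the standard deviation for normally distributed errors."""
        e = np.asarray(errors, dtype=float)
        return float(1.4826 * np.median(np.abs(e - np.median(e))))

    # DTM-minus-checkpoint residuals (m) with one gross outlier; values are
    # hypothetical, not from the study.
    residuals = [0.05, -0.03, 0.04, -0.06, 0.02, 1.50]
    print(round(rmse(residuals), 3))  # → 0.614, inflated by the outlier
    print(round(nmad(residuals), 3))  # → 0.059, barely affected by it
    ```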

  1. Development of an airborne laser bathymeter

    NASA Technical Reports Server (NTRS)

    Kim, H. H.; Cervenka, P. O.; Lankford, C. B.

    1975-01-01

    An airborne laser depth-sounding system was built and taken through a complete series of field tests. Two green laser sources were tried: a pulsed neon laser at 540 nm and a frequency-doubled Nd:YAG transmitter at 532 nm. To obtain a depth resolution of better than 20 cm, the pulses had a duration of 5 to 7 nanoseconds and could be fired at rates of up to 50 pulses per second. In the receiver, the signal was detected by a photomultiplier tube connected to a 28 cm diameter Cassegrainian telescope that was aimed vertically downward. Oscilloscopic traces of the signal reflected from the sea surface and the ocean floor could either be recorded by a movie camera on 35 mm film or digitized into 500 discrete channels of information and stored on magnetic tape, from which depth information could be extracted. An aerial color movie camera recorded the geographic footprint while a boat crew of oceanographers measured depth and other relevant water parameters. About two hundred hours of flight time on the NASA C-54 airplane in the area of Chincoteague, Virginia, the Chesapeake Bay, and Key West, Florida, have yielded information on the actual operating conditions of such a system and helped to optimize the design. One can predict the maximum depth attainable in a mission by measuring the effective attenuation coefficient in flight. This quantity is four times smaller than the usual narrow-beam attenuation coefficient. Several square miles of a varied underwater landscape were also mapped.
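    The depth measurement rests on the two-way travel time between the surface and bottom returns of each pulse. A sketch of the standard conversion, depth = c·Δt/(2n); the refractive-index value is a generic assumption for seawater, not a figure from the paper:

    ```python
    C = 299_792_458.0   # speed of light in vacuum, m/s
    N_WATER = 1.33      # approximate refractive index of seawater (assumed)

    def depth_from_return_separation(dt_seconds: float) -> float:
        """Water depth from the time separation between the surface and bottom
        returns of a bathymetric laser pulse (two-way path, slowed by water)."""
        return C * dt_seconds / (2 * N_WATER)

    # A 1 ns separation corresponds to roughly 11 cm of water depth, which is
    # why nanosecond-scale pulses are needed for ~20 cm depth resolution.
    print(round(depth_from_return_separation(1e-9), 3))  # → 0.113
    ```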

  2. New portable system for dental plaque measurement using a digital single-lens reflex camera and image analysis: Study of reliability and validation

    PubMed Central

    Rosa, Guillermo Martin; Elizondo, Maria Lidia

    2015-01-01

    Background: The quantification of dental plaque (DP) by indices has limitations: it depends on the operator's subjective evaluation and is measured on an ordinal scale. The purpose of this study was to develop and evaluate a method to measure DP on a proportional scale. Materials and Methods: A portable photographic positioning device (PPPD) was designed and added to a digital single-lens reflex camera. Seventeen subjects participated in this study; after DP disclosure with erythrosine, their incisors and a calibration scale were photographed by two operators in duplicate, repositioning the PPPD between acquisitions. A third operator registered the Quigley-Hein modified by Turesky DP index (Q-H/TPI). After tooth brushing, the same operators repeated the photographs and the Q-H/TPI. The image analysis system (IAS) technique allowed measurement in mm² of the total vestibular tooth area and the area with DP. Results: Reliability was determined with the intra-class correlation coefficient, which was 0.9936 (P < 0.05) for intra-operator repeatability and 0.9931 (P < 0.05) for inter-operator reproducibility. Validity was assessed using Spearman's correlation coefficient, which indicated a strong positive correlation with the Q-H/TPI, rs = 0.84 (P < 0.01). The sensitivity of the IAS was evaluated with two sample sizes; only the IAS was able to detect significant differences (P < 0.05) with the smaller sample (n = 8). Conclusions: The image analysis system was shown to be a reliable and valid method to measure the quantity of DP on a proportional scale, allowing more powerful statistical analysis and thus facilitating trials with a smaller sample size. PMID:26229267

  3. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated and deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method for evaluating the pathophysiological conditions of in vivo brain due to the loss of tissue viability.
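    The power-law approximation of the reduced scattering spectrum mentioned here, mus'(lambda) = a·lambda^(-b), can be fitted by linear regression in log-log space. A sketch with synthetic values (the a and b below are illustrative, not the study's estimates):

    ```python
    import numpy as np

    def fit_power_law(wavelengths_nm, mus_prime):
        """Least-squares fit of the reduced scattering spectrum
        mus'(lambda) = a * lambda**(-b), done linearly in log-log space."""
        x = np.log(np.asarray(wavelengths_nm, dtype=float))
        y = np.log(np.asarray(mus_prime, dtype=float))
        slope, intercept = np.polyfit(x, y, 1)
        return float(np.exp(intercept)), float(-slope)  # a, b

    # Synthetic noiseless spectrum generated with a = 4000, b = 1.2
    # (illustrative numbers only).
    lam = np.array([500.0, 540.0, 580.0, 630.0, 700.0])
    mus = 4000.0 * lam ** -1.2
    a, b = fit_power_law(lam, mus)
    print(round(a), round(b, 2))  # → 4000 1.2, the generating parameters
    ```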

  4. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters.

    PubMed

    Cambraia Lopes, Patricia; Clementel, Enrico; Crespo, Paulo; Henrotin, Sebastien; Huizenga, Jan; Janssens, Guillaume; Parodi, Katia; Prieels, Damien; Roellinghoff, Frauke; Smeets, Julien; Stichelbaut, Frederic; Schaart, Dennis R

    2015-08-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digital photon counters (DPCs). PG profiles emitted from a PMMA target upon irradiation with a 160 MeV proton pencil beam (about 6.5 × 10⁹ protons delivered in total) were measured using detector modules equipped with four DPC arrays coupled to BGO or LYSO : Ce crystal matrices. The knife-edge slit collimator and detector module were placed at 15 cm and 30 cm from the beam axis, respectively, in all cases. The use of LYSO : Ce enabled time-of-flight (TOF) rejection of background events, by synchronizing the DPC readout electronics with the 106 MHz radiofrequency signal of the cyclotron. The signal-to-background (S/B) ratio of 1.6 obtained with a 1.5 ns TOF window and a 3 MeV-7 MeV energy window was about 3 times higher than that obtained with the same detector module without TOF discrimination and 2 times higher than the S/B ratio obtained with the BGO module. Even 1 mm shifts of the Bragg peak position translated into clear and consistent shifts of the PG profile if TOF discrimination was applied, for a total number of protons as low as about 6.5 × 10⁸ and a detector surface of 6.6 cm × 6.6 cm.

  5. Airborne Transparencies.

    ERIC Educational Resources Information Center

    Horne, Lois Thommason

    1984-01-01

    Starting from a science project on flight, art students discussed and investigated various means of moving in space. Then they made acetate illustrations which could be used as transparencies. The projection phenomenon made the illustrations look airborne. (CS)

  6. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, Patricia; Clementel, Enrico; Crespo, Paulo; Henrotin, Sebastien; Huizenga, Jan; Janssens, Guillaume; Parodi, Katia; Prieels, Damien; Roellinghoff, Frauke; Smeets, Julien; Stichelbaut, Frederic; Schaart, Dennis R.

    2015-08-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digital photon counters (DPCs). PG profiles emitted from a PMMA target upon irradiation with a 160 MeV proton pencil beam (about 6.5 × 10⁹ protons delivered in total) were measured using detector modules equipped with four DPC arrays coupled to BGO or LYSO : Ce crystal matrices. The knife-edge slit collimator and detector module were placed at 15 cm and 30 cm from the beam axis, respectively, in all cases. The use of LYSO : Ce enabled time-of-flight (TOF) rejection of background events, by synchronizing the DPC readout electronics with the 106 MHz radiofrequency signal of the cyclotron. The signal-to-background (S/B) ratio of 1.6 obtained with a 1.5 ns TOF window and a 3 MeV-7 MeV energy window was about 3 times higher than that obtained with the same detector module without TOF discrimination and 2 times higher than the S/B ratio obtained with the BGO module. Even 1 mm shifts of the Bragg peak position translated into clear and consistent shifts of the PG profile if TOF discrimination was applied, for a total number of protons as low as about 6.5 × 10⁸ and a detector surface of 6.6 cm × 6.6 cm.

  7. Expected accuracy of tilt measurements on a novel hexapod-based digital zenith camera system: a Monte-Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Hirt, Christian; Papp, Gábor; Pál, András; Benedek, Judit; Szűcs, Eszter

    2014-08-01

    Digital zenith camera systems (DZCS) are dedicated astronomical-geodetic measurement systems for the observation of the direction of the plumb line. A DZCS key component is a pair of tilt meters for the determination of the instrumental tilt with respect to the plumb line. Highest accuracy (i.e., 0.1 arc-seconds or better) is achieved in practice through observation with precision tilt meters in opposite faces (180° instrumental rotation), and application of rigorous tilt reduction models. A novel concept proposes the development of a hexapod (Stewart platform)-based DZCS. However, hexapod-based total rotations are limited to about 30°-60° in azimuth (equivalent to ±15° to ±30° yaw rotation), which raises the question of the impact of the rotation angle between the two faces on the accuracy of the tilt measurement. The goal of the present study is the investigation of the expected accuracy of tilt measurements to be carried out on future hexapod-based DZCS, with special focus placed on the role of the limited rotation angle. A Monte-Carlo simulation study is carried out in order to derive accuracy estimates for the tilt determination as a function of several input parameters, and the results are validated against analytical error propagation. As the main result of the study, limitation of the instrumental rotation to 60° (30°) deteriorates the tilt accuracy by a factor of about 2 (4) compared to a 180° rotation between the faces. Nonetheless, a tilt accuracy at the 0.1 arc-second level is expected when the rotation is at least 45°, and 0.05 arc-second (about 0.25 microradian) accurate tilt meters are deployed. As such, a hexapod-based DZCS can be expected to allow sufficiently accurate determination of the instrumental tilt. This provides supporting evidence for the feasibility of such a novel instrumentation. The outcomes of our study are not only relevant to the field of DZCS, but also to all other types of instruments where the instrumental tilt
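    The factor-of-2 (factor-of-4) degradation quoted for 60° (30°) rotation can be reproduced with a small Monte-Carlo sketch. The observation model below (a tilt-meter pair with unknown per-axis zero offsets, read in two faces separated by the rotation angle; all names and the numeric truth values are hypothetical) is a simplified stand-in for the rigorous reduction models used with real DZCS:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def tilt_std(phi_deg, sigma=1.0, trials=20000):
        """Monte-Carlo standard deviation of the estimated x-tilt from a
        tilt-meter pair read in two faces separated by phi_deg, with unknown
        per-axis zero offsets (unknowns: tx, ty, cx, cy)."""
        phi = np.radians(phi_deg)
        c, s = np.cos(phi), np.sin(phi)
        # Design matrix mapping [tx, ty, cx, cy] to the four meter readings.
        A = np.array([[1.0, 0.0, 1.0, 0.0],
                      [0.0, 1.0, 0.0, 1.0],
                      [c,   s,   1.0, 0.0],
                      [-s,  c,   0.0, 1.0]])
        truth = np.array([5.0, -3.0, 0.7, -0.4])  # arbitrary tilt and offsets
        obs = A @ truth + rng.normal(0.0, sigma, size=(trials, 4))
        est = np.linalg.lstsq(A, obs.T, rcond=None)[0]  # solve every trial
        return est[0].std()

    ratio_60 = tilt_std(60) / tilt_std(180)
    ratio_30 = tilt_std(30) / tilt_std(180)
    print(ratio_60, ratio_30)  # close to 2 and 4 (analytically 2.00 and 3.86)
    ```

    Analytically, the x-tilt variance of this model is sigma²/(1 − cos phi), so relative to a 180° rotation the standard deviation grows by 1/sin(phi/2) · 1/√2 · √2 = 2 at 60° and ≈ 3.86 at 30°, matching the abstract's factors of about 2 and 4.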

  8. Airborne Imagery Collections Barrow 2013

    DOE Data Explorer

    Cherry, Jessica; Crowder, Kerri

    2015-07-20

    The data here are orthomosaics, digital surface models (DSMs), and individual frames captured during low altitude airborne flights in 2013 at the Barrow Environmental Observatory. The orthomosaics, thermal IR mosaics, and DSMs were generated from the individual frames using Structure from Motion techniques.

  9. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near-infrared (NIR) sensitive sensor technologies for use in a high-speed shuttered/intensified digital video camera system for range-gated imaging at "eye-safe" wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible-range light for detection with generic S-20 photocathodes. The intensifiers are "stripline"-geometry 18-mm-diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  10. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light-staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance, and multispectral military systems.
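    Dynamic range in dB maps to a linear brightness ratio via 20·log10. A minimal sketch (function name hypothetical) relating the quoted figures:

    ```python
    import math

    def dynamic_range_db(i_max: float, i_min: float) -> float:
        """Optical dynamic range in decibels: 20*log10(brightest / dimmest)."""
        return 20 * math.log10(i_max / i_min)

    # The quoted 82.06 dB corresponds to a brightest-to-dimmest irradiance
    # ratio of about 12,700:1, versus about 367:1 for the 51.3 dB CMOS-only mode.
    print(round(10 ** (82.06 / 20)))  # → 12677
    print(round(10 ** (51.3 / 20)))   # → 367
    ```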

  12. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  13. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  14. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have diffused dramatically. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends in the consumer camera market and technology is given, providing also some details about the recent past (from the digital still camera up to today) and forthcoming key issues.

  15. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  16. The Geospectral Camera: a Compact and Geometrically Precise Hyperspectral and High Spatial Resolution Imager

    NASA Astrophysics Data System (ADS)

    Delauré, B.; Michiels, B.; Biesemans, J.; Livens, S.; Van Achteren, T.

    2013-04-01

    Small unmanned aerial vehicles are increasingly being employed for environmental monitoring at the local scale, which drives the demand for compact and lightweight spectral imagers. This paper describes the geospectral camera, a novel compact imager concept. The camera is built around an innovative detector which has two sensor elements on a single chip and therefore offers the functionality of two cameras within the volume of a single one. The two sensor elements allow the camera to derive both spectral information and geometric information (high spatial resolution imagery and a digital surface model) of the scene of interest. A first geospectral camera prototype has been developed. It uses a linear variable optical filter installed in front of one of the two sensors of the MEDUSA CMOS imager chip. An accompanying software approach has been developed which exploits the simultaneous information from the two sensors in order to extract an accurate spectral image product. This method has been functionally demonstrated by applying it to image data acquired during an airborne acquisition.

  17. Airborne Imagery

    NASA Technical Reports Server (NTRS)

    1983-01-01

    ATM (Airborne Thematic Mapper) was developed for NSTL (National Space Technology Laboratories) by the Daedalus Company. It offers expanded capabilities for timely, accurate and cost-effective identification of areas with prospecting potential. A related system is TIMS, the Thermal Infrared Multispectral Scanner. Originating from Landsat 4, it is also used for agricultural studies, etc.

  18. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays of infrared photodetectors known as quantum-well infrared photodetectors (QWIPs). QWIPs were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  19. Using Digital Imaging in Classroom and Outdoor Activities.

    ERIC Educational Resources Information Center

    Thomasson, Joseph R.

    2002-01-01

    Explains how to use digital cameras and related basic equipment during indoor and outdoor activities. Uses digital imaging in general botany class to identify unknown fungus samples. Explains how to select a digital camera and other necessary equipment. (YDS)

  20. Estimation of the Atmospheric Refraction Effect in Airborne Images Using Radiosonde Data

    NASA Astrophysics Data System (ADS)

    Beisl, U.; Tempelmann, U.

    2016-06-01

    The influence of atmospheric refraction on the geometric accuracy of airborne photogrammetric images was already considered in the days of analogue photography. The effect is a function of the varying refractive index along the path from the ground to the image sensor, and therefore depends on the height above ground, the view zenith angle and the atmospheric constituents. It leads to a gradual increase of scale towards the borders of the image, i.e. a magnification takes place. Textbooks list a shift of several pixels at the borders of standard wide-angle images. Since, as was unavoidable at the time, images could only be acquired in good weather conditions, the effect was calculated using standard atmospheres for good atmospheric conditions, leading to simple empirical formulas. Often the pixel shift caused by refraction was approximated as linear with height and compensated by an adjustment of the focal length. With the advent of sensitive digital cameras, the image dynamics allow capturing images in adverse weather conditions, so the influence of atmospheric profiles on the geometric accuracy of the images has to be investigated and the validity of the standard correction formulas has to be checked. This paper compares the results from the standard formulas by Saastamoinen with results calculated from a broad selection of atmospheres obtained from radiosonde profile data. The geometric deviation is calculated by numerical integration of the refractive index as a function of height, using the refractive index formula by Ciddor. It turns out that the effect of different atmospheric profiles (including inversion situations) is generally small compared to the overall effect, except at low camera heights, where the absolute deviation is in any case small.
    Since the necessary atmospheric profile data are often not readily available for airborne images a formula proposed by Saastamoinen is verified that uses only camera height, the pressure
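The numerical integration described in the abstract can be sketched by tracing a ray through a horizontally stratified atmosphere with Snell's law. This is a toy model only: it uses a simple exponential refractivity profile instead of the Ciddor equation and radiosonde data, and the constants and function names are illustrative assumptions:

```python
import math

def refractivity(h_m: float, n0_minus_1: float = 2.77e-4,
                 scale_m: float = 8000.0) -> float:
    """Toy exponential model of (n - 1) versus height; NOT the Ciddor formula."""
    return n0_minus_1 * math.exp(-h_m / scale_m)

def horizontal_shift(camera_h: float, zenith_deg: float, steps: int = 2000) -> float:
    """Horizontal ground displacement (m) of a refracted ray relative to a
    straight ray, traced downward from the camera through thin horizontal
    layers using Snell's law (n * sin(z) = const in a stratified atmosphere)."""
    dz = camera_h / steps
    n_cam = 1.0 + refractivity(camera_h)
    snell = n_cam * math.sin(math.radians(zenith_deg))  # invariant along the ray
    x_refr, h = 0.0, camera_h
    for _ in range(steps):
        n = 1.0 + refractivity(h - dz / 2.0)       # index at the layer midpoint
        sin_z = snell / n                          # Snell's law in this layer
        x_refr += (sin_z / math.sqrt(1.0 - sin_z * sin_z)) * dz
        h -= dz
    x_straight = camera_h * math.tan(math.radians(zenith_deg))
    return x_refr - x_straight                     # negative: ray bends to nadir
```

For a camera at 5000 m and a 45° zenith angle this model gives a shift on the order of decimetres, consistent with the "several pixels at the image borders" quoted from textbooks.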

  1. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  2. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown, wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  3. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has a simplification feature derived from cameras designed for easy yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  4. SITHON: An Airborne Fire Detection System Compliant with Operational Tactical Requirements

    PubMed Central

    Kontoes, Charalabos; Keramitsoglou, Iphigenia; Sifakis, Nicolaos; Konstantinidis, Pavlos

    2009-01-01

    In response to the urgent need of fire managers for timely information on fire location and extent, the SITHON system was developed. SITHON is a fully digital thermal imaging system, integrating an INS/GPS and a digital camera, designed to provide timely positioned and projected thermal images and video data streams rapidly integrated into the GIS operated by Crisis Control Centres. This article presents in detail the hardware and software components of SITHON, and demonstrates the first encouraging results of test flights over the Sithonia Peninsula in Northern Greece. It is envisaged that the SITHON system will soon be operated onboard various airborne platforms, including fire brigade airplanes and helicopters as well as UAV platforms owned and operated by the Greek Air Force. PMID:22399963

  5. SITHON: An Airborne Fire Detection System Compliant with Operational Tactical Requirements.

    PubMed

    Kontoes, Charalabos; Keramitsoglou, Iphigenia; Sifakis, Nicolaos; Konstantinidis, Pavlos

    2009-01-01

    In response to the urgent need of fire managers for timely information on fire location and extent, the SITHON system was developed. SITHON is a fully digital thermal imaging system, integrating an INS/GPS and a digital camera, designed to provide timely positioned and projected thermal images and video data streams rapidly integrated into the GIS operated by Crisis Control Centres. This article presents in detail the hardware and software components of SITHON, and demonstrates the first encouraging results of test flights over the Sithonia Peninsula in Northern Greece. It is envisaged that the SITHON system will soon be operated onboard various airborne platforms, including fire brigade airplanes and helicopters as well as UAV platforms owned and operated by the Greek Air Force.

  6. Microprocessor-Based Airborne Spectrometer System

    NASA Astrophysics Data System (ADS)

    Kates, John C.

    1980-08-01

    A system for airborne infrared spectral signature measurements has been developed using a Fourier transform spectrometer interfaced to a microprocessor data acquisition, control and display system. The microprocessor is a DEC LSI-11 with 20KW RAM, 4KW EPROM, a DMA spectrometer interface, digital magnetic tape, and a dot-matrix video graphic display. A real-time executive tailored to the requirements and resources available allows concurrent data acquisition, recording, reduction and display. Using multiple buffers, acquisition of spectrometer data via DMA is overlapped with magnetic tape output. A background task selects the most recent spectrometer data and processes it using an FFT into a raw spectrum. A reference background spectrum is subtracted to isolate the data component, then a reference instrument response function is applied to obtain a calibrated absolute irradiance spectrum. The irradiance spectrum is displayed on the video graphic display and mixed with boresight camera video to show the target spectrum superimposed on the target image. Extensive self-test facilities are incorporated for testing all system components and compatibility with data reduction systems. System calibration is supported by selection of reference blackbody temperatures, apertures, and distances. The instrument response curve obtained during calibration is displayed for verification of correct spectrometer operation or diagnosis of faults.
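The processing chain this abstract describes (FFT of the spectrometer data into a raw spectrum, background subtraction, then application of the instrument response to obtain calibrated irradiance) can be sketched in a few lines. The function and array names are hypothetical, not from the original system:

```python
import numpy as np

def calibrated_irradiance(interferogram: np.ndarray,
                          background: np.ndarray,
                          response: np.ndarray) -> np.ndarray:
    """Sketch of the described pipeline: FFT the interferogram to a raw
    magnitude spectrum, subtract a reference background spectrum, then
    divide by the instrument response function."""
    raw = np.abs(np.fft.rfft(interferogram))   # raw (uncalibrated) spectrum
    signal = raw - background                  # isolate the data component
    return signal / response                   # calibrated absolute irradiance

# Example: a pure 50-cycle tone should appear at bin 50 of the spectrum
# (trivial background and flat unit response assumed for illustration).
tone = np.cos(2 * np.pi * 50 * np.arange(1024) / 1024)
spectrum = calibrated_irradiance(tone, np.zeros(513), np.ones(513))
```

Dividing by the response function is the usual way an instrument response is "applied" in calibration; whether the original system divided or multiplied by a stored correction curve is not stated in the abstract.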

  7. Cameras Monitor Spacecraft Integrity to Prevent Failures

    NASA Technical Reports Server (NTRS)

    2014-01-01

    The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained while working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

  8. An electronic multiband camera film viewer.

    NASA Technical Reports Server (NTRS)

    Roberts, L. H.

    1972-01-01

    An electronic viewer for real-time viewing and processing of multiband camera imagery is described. The Multiband Camera Film Viewer (MCFV) is a high-resolution, 1000-line system scanning three channels of multiband imagery. The MCFV provides a calibrated output from each of the three channels for viewing in composite true color, analog false-color, and digitized, enhanced false color.

  9. Making Connections with Digital Data

    ERIC Educational Resources Information Center

    Leonard, William; Bassett, Rick; Clinger, Alicia; Edmondson, Elizabeth; Horton, Robert

    2004-01-01

    State-of-the-art digital cameras open up enormous possibilities in the science classroom, especially when used as data collectors. Because most high school students are not fully formal thinkers, the digital camera can provide a much richer learning experience than traditional observation. Data taken through digital images can make the…

  10. Generation of countrywide reference digital terrain model from airborne laser scanning in ISOK project. (Polish Title: Generowanie referencyjnego numerycznego modelu terenu o zasięgu krajowym w oparciu o lotnicze skanowanie laserowe w projekcie ISOK)

    NASA Astrophysics Data System (ADS)

    Kurczyński, Z.; Bakuła, K.

    2013-12-01

    The paper analyzes the selection of the main parameters of an ALS system that have an impact on the acquisition of digital elevation models with the assumed, demanding qualitative characteristics, on the basis of the airborne laser scanning project organized as part of the recommendations included in the Floods Directive in Poland. Such an analysis was the basis for determining the conditions for a huge project, whose implementation is still ongoing, as well as its organizational and economic possibilities. Another subject for consideration was the scope of the planned works and the time limits for their implementation imposed by the schedule of the project with respect to the efficiency of the work. Among the technical characteristics, the authors investigated the density of the LiDAR points, the georeferencing of point clouds, the meteorological conditions during data acquisition, and the parameters describing the accuracy of the final products, namely digital elevation models. In such a complex project, the division of the whole project into subareas, the many contractors, the several final products, and the many stages of delivery, product acceptance and quality control are key issues, whose effectiveness depends on good organization, their scope and the adopted criteria.

  11. Extracting Roof Parameters and Heat Bridges Over the City of Oldenburg from Hyperspectral, Thermal, and Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Bannehr, L.; Luhmann, Th.; Piechel, J.; Roelfs, T.; Schmidt, An.

    2011-09-01

    Remote sensing methods are used to obtain different kinds of information about the state of the environment. Within the cooperative research project HiReSens, funded by the German BMBF, a hyperspectral scanner, an airborne laser scanner, a thermal camera, and an RGB camera are employed on a small aircraft to determine roof material parameters and heat bridges of rooftops over the city of Oldenburg, Lower Saxony. HiReSens aims to combine various geometrically highly resolved data in order to achieve relevant evidence about the state of the city's buildings. Thermal data are used to obtain the energy distribution of single buildings. The use of hyperspectral data yields information about the material consistence of roofs. From airborne laser scanning (ALS) data, digital surface models are inferred. These form the basis for locating the best orientations for solar panels on the city's buildings. The combination of the different data sets offers the opportunity to capitalize on synergies between differently working systems. Central goals are the development of tools for the collection of heat bridges by means of thermal data, the spectral collection of roof parameters on the basis of hyperspectral data, and the 3D capture of buildings from airborne laser scanner data. Collecting, analyzing and merging the data are not trivial, especially when resolution and accuracy are aimed at in the domain of a few decimetres. The results achieved need to be regarded as preliminary; further investigations are still required to prove the accuracy in detail.

  12. The Continuous wavelet in airborne gravimetry

    NASA Astrophysics Data System (ADS)

    Liang, X.; Liu, L.

    2013-12-01

    Airborne gravimetry is an efficient method to recover the medium- and high-frequency band of the earth's gravity field over any region, especially inaccessible areas. It can measure gravity data with high accuracy, high resolution and broad coverage in a rapid and economical way, and will play an important role in geoid determination and geophysical exploration. Filtering to reduce high-frequency errors is critical to the success of airborne gravimetry, because aircraft acceleration is determined from GPS. Traditional filters used in airborne gravimetry include FIR and IIR filters. This study recommends an improved continuous wavelet to process airborne gravity data. Here we focus on how to construct the continuous wavelet filters and show their working principle. In particular, the technical parameters (window width parameter and scale parameter) of the filters are tested. The raw airborne gravity data from the first Chinese airborne gravimetry campaign are then filtered using an FIR low-pass filter and continuous wavelet filters to remove the noise. A comparison to reference data is performed to determine the external accuracy, which shows that the continuous wavelet filters applied to airborne gravity in this thesis perform well. The advantages of the continuous wavelet filters over digital filters are also introduced. The effectiveness of the continuous wavelet filters for airborne gravimetry is demonstrated through real data computation.
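The FIR low-pass filtering used here as the baseline against the wavelet filters can be sketched with a standard windowed-sinc design. The signal, sampling rate and cutoff below are illustrative assumptions, not values from the Chinese campaign:

```python
import numpy as np

def lowpass_fir(cutoff_hz: float, fs_hz: float, ntaps: int = 101) -> np.ndarray:
    """Windowed-sinc FIR low-pass design (Hamming window)."""
    fc = cutoff_hz / fs_hz                      # normalized cutoff, cycles/sample
    n = np.arange(ntaps) - (ntaps - 1) / 2.0
    h = 2 * fc * np.sinc(2 * fc * n)            # ideal low-pass impulse response
    h *= np.hamming(ntaps)                      # taper to reduce passband ripple
    return h / h.sum()                          # normalize to unity DC gain

# Filter a synthetic "gravity" record: slow anomaly plus high-frequency
# GPS-acceleration-like noise (1 Hz sampling assumed for illustration).
fs = 1.0
t = np.arange(2000) / fs
signal = np.sin(2 * np.pi * 0.002 * t)          # low-frequency gravity anomaly
noisy = signal + 0.5 * np.random.randn(t.size)  # broadband noise
smoothed = np.convolve(noisy, lowpass_fir(0.01, fs), mode="same")
```

Because aircraft-acceleration noise sits well above the gravity signal band, even this simple filter recovers most of the anomaly; the paper's point is that wavelet filters can do better near the band edge.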

  13. Digitized Photography: What You Can Do with It.

    ERIC Educational Resources Information Center

    Kriss, Jack

    1997-01-01

    Discusses benefits of digital cameras which allow users to take a picture, store it on a digital disk, and manipulate/export these photos to a print document, Web page, or multimedia presentation. Details features of digital cameras and discusses educational uses. A sidebar presents prices and other information for 12 digital cameras. (AEF)

  14. Benchmarking High Density Image Matching for Oblique Airborne Imagery

    NASA Astrophysics Data System (ADS)

    Cavegn, S.; Haala, N.; Nebiker, S.; Rothermel, M.; Tutzauer, P.

    2014-08-01

    Both improvements in camera technology and new pixel-wise matching approaches have triggered the further development of software tools for image-based 3D reconstruction. Meanwhile, research groups as well as commercial vendors provide photogrammetric software to generate dense, reliable and accurate 3D point clouds and Digital Surface Models (DSM) from highly overlapping aerial images. In order to evaluate the potential of these algorithms in view of the ongoing software developments, a suitable test bed is provided by the ISPRS/EuroSDR initiative Benchmark on High Density Image Matching for DSM Computation. This paper discusses the proposed test scenario to investigate the potential of dense matching approaches for 3D data capture from oblique airborne imagery. For this purpose, an oblique aerial image block captured at a GSD of 6 cm in the west of Zürich by a Leica RCD30 Oblique Penta camera is used. Within this paper, the potential test scenario is demonstrated using matching results from two software packages, Agisoft PhotoScan and SURE from the University of Stuttgart. As oblique images are frequently used for data capture at building facades, the 3D point clouds are mainly investigated in such areas. Reference data from terrestrial laser scanning are used to evaluate the quality of the dense image matching results for several facade patches with respect to accuracy, density and reliability.
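The pixel-wise matching idea at the heart of such tools can be illustrated with a minimal normalized cross-correlation (NCC) patch matcher along an image row. This is a toy sketch, not the algorithm used by PhotoScan or SURE (which rely on far more sophisticated multi-view methods), and all names are hypothetical:

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_row(left: np.ndarray, right: np.ndarray, x: int, y: int,
              half: int = 3, max_disp: int = 20) -> int:
    """Best disparity for left-image pixel (x, y), found by scanning the
    same row of the right image and maximizing NCC over square patches."""
    ref = left[y - half:y + half + 1, x - half:x + half + 1]
    scores = []
    for d in range(max_disp + 1):
        xr = x - d                      # candidate column in the right image
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1]
        scores.append(ncc(ref, cand))
    return int(np.argmax(scores))
```

Dense matchers repeat this kind of per-pixel cost evaluation for every pixel and regularize the result (e.g. semi-globally) to obtain the point clouds evaluated in the benchmark.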

  15. Enhancing Positioning Accuracy in Urban Terrain by Fusing Data from a GPS Receiver, Inertial Sensors, Stereo-Camera and Digital Maps for Pedestrian Navigation

    PubMed Central

    Przemyslaw, Baranski; Pawel, Strumillo

    2012-01-01

    The paper presents an algorithm for estimating a pedestrian's location in an urban environment. The algorithm is based on the particle filter and uses different data sources: a GPS receiver, inertial sensors, probability maps and a stereo camera. Inertial sensors are used to estimate the relative displacement of the pedestrian: a gyroscope estimates changes in the heading direction, and an accelerometer is used to count the pedestrian's steps and estimate their lengths. The so-called probability maps help to limit GPS inaccuracy by imposing constraints on pedestrian kinematics, e.g., it is assumed that a pedestrian cannot cross buildings, fences, etc. This limits position inaccuracy to ca. 10 m. Incorporating depth estimates derived from a stereo camera, compared against a 3D model of the environment, enabled a further reduction of positioning errors. As a result, for 90% of the time, the algorithm is able to estimate the pedestrian's location with an error smaller than 2 m, compared to an error of 6.5 m for navigation based solely on GPS. PMID:22969321
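The described fusion (inertial dead reckoning for propagation, map constraints, GPS likelihood weighting) follows the standard particle-filter recipe, which can be sketched in one dimension. All numbers and names below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def particle_filter_step(particles, weights, step, gps,
                         gps_sigma=10.0, blocked=(50.0, 60.0)):
    """One 1-D update: propagate by the dead-reckoned step, zero the weight
    of particles violating the map constraint (inside a 'building'),
    reweight by the GPS likelihood, then resample."""
    # Propagate with step-length noise (inertial dead reckoning).
    particles = particles + step + rng.normal(0.0, 0.3, particles.size)
    # Map constraint: a pedestrian cannot be inside a building.
    inside = (particles > blocked[0]) & (particles < blocked[1])
    weights = np.where(inside, 0.0, weights)
    # GPS likelihood (Gaussian, ca. 10 m uncertainty).
    weights = weights * np.exp(-0.5 * ((particles - gps) / gps_sigma) ** 2)
    weights = weights / weights.sum()
    # Resampling keeps the particle set concentrated on likely positions.
    idx = rng.choice(particles.size, size=particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

# Simulate a pedestrian walking 1 m per step along a line.
true = 0.0
particles = rng.uniform(0.0, 100.0, 500)
weights = np.full(500, 1.0 / 500)
for _ in range(30):
    true += 1.0
    gps = true + rng.normal(0.0, 10.0)          # noisy GPS fix
    particles, weights = particle_filter_step(particles, weights, 1.0, gps)
```

After a few dozen steps the particle mean tracks the true position far more tightly than the raw GPS noise, which is the same mechanism that lets the paper's 2-D filter beat GPS-only navigation.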

  16. Depth estimation using a lightfield camera

    NASA Astrophysics Data System (ADS)

    Roper, Carissa

    The latest innovation to camera design has come in the form of the lightfield, or plenoptic, camera that captures 4-D radiance data rather than just the 2-D scene image via microlens arrays. With the spatial and angular light ray data now recorded on the camera sensor, it is feasible to construct algorithms that can estimate depth of field in different portions of a given scene. There are limitations to the precision due to hardware structure and the sheer number of scene variations that can occur. In this thesis, the potential of digital image analysis and spatial filtering to extract depth information is tested on the commercially available plenoptic camera.

  17. Solid state television camera has no imaging tube

    NASA Technical Reports Server (NTRS)

    Huggins, C. T.

    1972-01-01

    Camera with characteristics of vidicon camera and greater resolution than home TV receiver uses mosaic of phototransistors. Because of low power and small size, camera has many applications. Mosaics can be used as cathode ray tubes and analog-to-digital converters.

  18. Imaging Emission Spectra with Handheld and Cellphone Cameras

    ERIC Educational Resources Information Center

    Sitar, David

    2012-01-01

    As point-and-shoot digital camera technology advances it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1 megapixel (MP) digital Cannon…

  19. The development of airborne video system for monitoring of river environments

    SciTech Connect

    Yoshikawa, Shigeya; Mizutani, Nobuyuki; Mizukami, Masumi; Koyano, Toshirou

    1996-11-01

    Recently, airborne videography has been widely used for monitoring environmental resources such as rivers, forests and the ocean. Although airborne videography has a lower resolution than aerial photographs, it can effectively reduce the cost of continuous monitoring of wide areas, and video images can easily be processed with a personal computer. This paper introduces an airborne video system for monitoring Class A river environments. The system consists of two sub-systems. One is the data collection system, composed of a video camera, a Global Positioning System (GPS) and a personal computer. This sub-system records information on rivers as video images with their corresponding location data; the GPS is used for calculating location data and navigating the airplane to the monitoring site. The other is a simplified digital video editing system that runs on a personal computer with Microsoft Windows 3.1. The system can also be used for management and planning of road environments, marine resources and forest resources, and for disaster prevention. 7 refs., 4 figs.

  20. Extracting dynamic spatial data from airborne imaging sensors to support traffic flow estimation

    NASA Astrophysics Data System (ADS)

    Toth, C. K.; Grejner-Brzezinska, D.

    The recent transition from analog to fully digital data acquisition and processing techniques in airborne surveying represents a major milestone in the evolution of spatial information science and practice. On one hand, the improved quality of the primary sensor data can provide the foundation for better automation of the information extraction processes. This phenomenon is also strongly supported by continuously expanding computer technology, which offers almost unlimited processing power. On the other hand, the variety of the data, including rich information content and better temporal characteristics, acquired by the new digital sensors and coupled with rapidly advancing processing techniques, is broadening the applications of airborne surveying. One of these new application areas is traffic flow extraction aimed at supporting better traffic monitoring and management. Transportation mapping has always represented a significant segment of civilian mapping and is mainly concerned with road corridor mapping for design and engineering purposes, infrastructure mapping and facility management, and more recently, environmental mapping. In all these cases, the objective of the mapping is to extract the static features of the object space, such as man-made and natural objects, typically along the road network. In contrast, the traffic moving in the transportation network represents a very dynamic environment, which complicates the spatial data extraction processes, as the signals of moving vehicles should be identified and removed. Rather than removing and discarding these signals, however, they can be turned into traffic flow information. This paper reviews initial research efforts to extract traffic flow information from laser scanner and digital camera sensors installed on airborne platforms.

  1. NASA IceBridge: Airborne surveys of the polar sea ice covers

    NASA Astrophysics Data System (ADS)

    Richter-Menge, J.; Farrell, S. L.

    2014-12-01

    The NASA Operation IceBridge (OIB) airborne sea ice surveys are designed to continue a valuable series of sea ice thickness measurements by bridging the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat), which operated from 2003 to 2009, and ICESat-2, which is scheduled for launch in 2017. Initiated in 2009, OIB has conducted campaigns over the western Arctic Ocean (March/April) and Southern Ocean (October/November) on an annual basis. Primary OIB sensors used for sea ice observations include the Airborne Topographic Mapper laser altimeter, the Digital Mapping System digital camera, a Ku-band radar altimeter, a frequency-modulated continuous-wave (FMCW) snow radar, and a KT-19 infrared radiation pyrometer. Data from the campaigns are available to the research community at: http://nsidc.org/data/icebridge/. This presentation will summarize the spatial and temporal extent of the campaigns and highlight key scientific accomplishments, which include:
    • Documented changes in the Arctic marine cryosphere since the dramatic sea ice loss of 2007
    • Novel snow depth measurements over sea ice in the Arctic
    • Improved skill of April-to-September sea ice predictions via numerical ice/ocean models
    • Validation of satellite altimetry measurements (ICESat, CryoSat-2, and ICESat-2/MABEL)

  2. NASA IceBridge: Scientific Insights from Airborne Surveys of the Polar Sea Ice Covers

    NASA Astrophysics Data System (ADS)

    Richter-Menge, J.; Farrell, S. L.

    2015-12-01

    The NASA Operation IceBridge (OIB) airborne sea ice surveys are designed to continue a valuable series of sea ice thickness measurements by bridging the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat), which operated from 2003 to 2009, and ICESat-2, which is scheduled for launch in 2017. Initiated in 2009, OIB has conducted campaigns over the western Arctic Ocean (March/April) and Southern Oceans (October/November) on an annual basis when the thickness of sea ice cover is nearing its maximum. More recently, a series of Arctic surveys have also collected observations in the late summer, at the end of the melt season. The Airborne Topographic Mapper (ATM) laser altimeter is one of OIB's primary sensors, in combination with the Digital Mapping System digital camera, a Ku-band radar altimeter, a frequency-modulated continuous-wave (FMCW) snow radar, and a KT-19 infrared radiation pyrometer. Data from the campaigns are available to the research community at: http://nsidc.org/data/icebridge/. This presentation will summarize the spatial and temporal extent of the OIB campaigns and their complementary role in linking in situ and satellite measurements, advancing observations of sea ice processes across all length scales. Key scientific insights gained on the state of the sea ice cover will be highlighted, including snow depth, ice thickness, surface roughness and morphology, and melt pond evolution.

  3. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor Active Pixel Sensor (CMOS APS), establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS active-pixel digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  4. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  5. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera.

    PubMed

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft®'s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for imaging sensor properties, data collection, and processing of UAV image data to ensure accurate point cloud generation.

  8. Using Google Earth for Rapid Dissemination of Airborne Remote Sensing Lidar and Photography

    NASA Astrophysics Data System (ADS)

    Wright, C. W.; Nayegandhi, A.; Brock, J. C.

    2006-12-01

    In order to visualize and disseminate vast amounts of lidar and digital photography data, we present a unique method that makes these data layers available via the Google Earth interface. The NASA Experimental Advanced Airborne Research Lidar (EAARL) provides unprecedented capabilities to survey coral reefs, nearshore benthic habitats, coastal vegetation, and sandy beaches. The EAARL sensor suite includes a water-penetrating lidar that provides high-resolution topographic information, a down-looking color digital camera, a down-looking high-resolution color-infrared (CIR) digital camera, and precision kinematic GPS receivers which provide for sub-meter geo-referencing of each laser and multispectral sample. Google Earth "kml" files are created for each EAARL multispectral and processed lidar image. A hierarchical structure of network links allows the user to download high-resolution images within the region of interest. The first network link (kmz file) downloaded by the user contains a color-coded flight path and "minute marker" icons along the flight path. Each "minute" icon provides access to the image overlays, additional network links for each second along the flight path, and flight navigation information. Layers of false-color-coded lidar Digital Elevation Model (DEM) data are made available in 2 km by 2 km tiles. These layers include canopy-top, bare-Earth, and submerged topography, plus links to any other lidar products. The user has the option to download the x,y,z ASCII point data or a DEM in the GeoTIFF file format for each tile. The NASA EAARL project captured roughly 250,000 digital photographs in five flights conducted a few days after Hurricane Katrina made landfall along the Gulf Coast in 2005. All of the photos and DEM layers are georeferenced and viewable online using Google Earth.
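    The hierarchical network-link structure described above can be sketched as a small generator. This is a minimal illustration of the general KML NetworkLink pattern, not EAARL's actual file layout; the tile names and paths are hypothetical.

```python
# Sketch of a top-level KML document whose NetworkLinks pull per-tile
# overlays on demand.  Tile names and hrefs below are illustrative only.
import xml.etree.ElementTree as ET


def network_link(name, href):
    """One <NetworkLink> entry pointing at a child kml/kmz file."""
    return ("  <NetworkLink>\n"
            f"    <name>{name}</name>\n"
            f"    <Link><href>{href}</href></Link>\n"
            "  </NetworkLink>")


def top_level_kml(tile_links):
    """Build the top-level document from (name, href) pairs."""
    body = "\n".join(network_link(n, h) for n, h in tile_links)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            '<Document>\n'
            f'{body}\n'
            '</Document>\n'
            '</kml>')


doc = top_level_kml([("tile_2km_001", "tiles/001.kmz"),
                     ("tile_2km_002", "tiles/002.kmz")])
ET.fromstring(doc)  # sanity check: the generated document is well-formed XML
```

Because each tile sits behind its own link, Google Earth fetches only the overlays the user actually opens, which is what makes serving hundreds of thousands of photographs practical.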

  9. Spherical Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Developed largely through a Small Business Innovation Research contract through Langley Research Center, Interactive Picture Corporation's IPIX technology provides spherical photography with a panoramic 360-degree field of view. NASA found the technology appropriate for use in guiding space robots, in the space shuttle and space station programs, as well as in research in cryogenic wind tunnels and for remote docking of spacecraft. Images of any location are captured in their entirety in a 360-degree immersive digital representation. The viewer can navigate to any desired direction within the image. Several car manufacturers already use IPIX to give viewers a look at their latest line-up of automobiles. Another application is for non-invasive surgeries. By using OmniScope, surgeons can look more closely at various parts of an organ with medical viewing instruments now in use. Potential applications of IPIX technology include viewing of homes for sale, hotel accommodations, museum sites, news events, and sports stadiums.

  10. Lights, camera, action research: The effects of didactic digital movie making on students' twenty-first century learning skills and science content in the middle school classroom

    NASA Astrophysics Data System (ADS)

    Ochsner, Karl

    Students are moving away from content consumption to content production. Short movies are uploaded onto video social networking sites and shared around the world. Unfortunately they usually contain little to no educational value, lack a narrative and are rarely created in the science classroom. According to new Arizona Technology standards and ISTE NET*S, along with the framework from the Partnership for 21st Century Learning Standards, our society demands students not only to learn curriculum, but to think critically, problem solve effectively, and become adept at communicating and collaborating. Didactic digital movie making in the science classroom may be one way that these twenty-first century learning skills may be implemented. An action research study using a mixed-methods approach to collect data was used to investigate if didactic moviemaking can help eighth grade students learn physical science content while incorporating 21st century learning skills of collaboration, communication, problem solving and critical thinking skills through their group production. Over a five week period, students researched lessons, wrote scripts, acted, video recorded and edited a didactic movie that contained a narrative plot to teach a science strand from the Arizona State Standards in physical science. A pretest/posttest science content test and KWL chart was given before and after the innovation to measure content learned by the students. Students then took a 21st Century Learning Skills Student Survey to measure how much they perceived that communication, collaboration, problem solving and critical thinking were taking place during the production. An open ended survey and a focus group of four students were used for qualitative analysis. Three science teachers used a project evaluation rubric to measure science content and production values from the movies. Triangulating the science content test, KWL chart, open ended questions and the project evaluation rubric, it

  11. Key performance requirements for military low-light television cameras

    NASA Astrophysics Data System (ADS)

    Shimer, Steven; Heim, Gerald

    2007-04-01

    Low-light-level video cameras have benefited from rapid advances in digital technology during the past two decades. In legacy cameras, the video signal was processed using analog electronics which made real-time, nonlinear processing of the video signal very difficult. In state-of-the-art cameras, the analog signal is digitized directly from the sensor and processed entirely in the digital domain, enabling the application of advanced processing techniques to the video signal in real time. In fact, all aspects of modern low-light television cameras are controlled via digital technology, resulting in various enhancements that surpass analog electronics. In addition to video processing, large-scale digital integration in these low-light level cameras enables precise control of the image intensifier and image sensor, facilitating large inter-scene dynamic range capability, extended intra-scene dynamic range and blooming control. Digital video processing and digital camera control are used to provide improved system-level performance, including nearly perfect pixel response uniformity, correction of blemishes, and electronic boresight. Compact digital electronics also enable comprehensive camera built-in-test (BIT) capability which provides coverage for the entire camera--from photons into the sensor to the processed video signal going out the connector. Individuals involved in the procurement of present and future low-light-level cameras need to understand these advanced camera capabilities in order to write accurate specifications for their advanced video system requirements. This paper provides an overview of these modern video system capabilities along with example specification text.

  12. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  13. Imaging Emission Spectra with Handheld and Cellphone Cameras

    NASA Astrophysics Data System (ADS)

    Sitar, David

    2012-12-01

    As point-and-shoot digital camera technology advances, it is becoming easier to image spectra in a laboratory setting on a shoestring budget and get immediate results. With this in mind, I wanted to test three cameras to see how their results would differ. Two undergraduate physics students and I used one handheld 7.1-megapixel (MP) auto-focusing Canon point-and-shoot digital camera and two different cellphone cameras: one at 6.1 MP and the other at 5.1 MP.

  14. Video indirect ophthalmoscopy using a hand-held video camera.

    PubMed

    Shanmugam, Mahesh P

    2011-01-01

    Fundus photography in adults and cooperative children is possible with a fundus camera or by using a slit lamp-mounted digital camera. A RetCam™ or a video indirect ophthalmoscope is necessary for fundus imaging in infants and young children under anesthesia. Herein, a technique for converting a hand-held digital video camera into a video indirect ophthalmoscope for fundus imaging is described. This device will allow anyone with a hand-held video camera to obtain fundus images. Limitations of this technique involve a learning curve and an inability to perform scleral depression.

  15. Determining camera parameters for round glassware measurements

    NASA Astrophysics Data System (ADS)

    Baldner, F. O.; Costa, P. B.; Gomes, J. F. S.; Filho, D. M. E. S.; Leta, F. R.

    2015-01-01

    Nowadays there are many types of accessible cameras, including digital single-lens reflex ones. Although these cameras are not usually employed in machine vision applications, they can be an interesting choice. However, these cameras have many parameters to be chosen by the user, and it may be difficult to select the best of these in order to acquire images with the needed metrological quality. This paper proposes a methodology to select a set of parameters that will supply a machine vision system with images of the needed quality, considering the measurements required of laboratory glassware.

  16. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature, this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation, and a test of this method.
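    The kind of adjustment described can be sketched with the photon-transfer (mean/variance) relation. This is a minimal illustration assuming two flat-field frames and two dark frames taken under identical settings; it is not Cogliati's exact formulation.

```python
# Dark-current-corrected photon-transfer gain estimate (sketch).
import numpy as np


def camera_gain(flat_a, flat_b, dark_a, dark_b):
    """Estimate gain in electrons per digital number (e-/DN).

    Differencing two like frames cancels fixed-pattern noise; subtracting
    the dark-frame statistics removes the read noise and the dark-current
    contribution that matters for room-temperature sensors.
    """
    flat_a, flat_b, dark_a, dark_b = (np.asarray(f, dtype=np.float64)
                                      for f in (flat_a, flat_b, dark_a, dark_b))
    # Mean photon signal with the dark level removed.
    signal = (0.5 * (flat_a.mean() + flat_b.mean())
              - 0.5 * (dark_a.mean() + dark_b.mean()))
    # The variance of a frame difference is twice the per-frame variance;
    # the dark difference removes read + dark-current noise terms.
    var_flat = np.var(flat_a - flat_b) / 2.0
    var_dark = np.var(dark_a - dark_b) / 2.0
    return signal / (var_flat - var_dark)  # e-/DN
```

For Poisson-distributed photoelectrons, the dark-corrected mean divided by the dark-corrected shot-noise variance recovers the gain directly, which is why the dark terms cannot simply be ignored at room temperature.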

  17. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    The Experimental Advanced Airborne Research Lidar (EAARL) is an example of a Light Detection and Ranging (Lidar) system that utilizes a blue-green wavelength (532 nanometers) to determine the distance to an object. The distance is determined by recording the travel time of a transmitted pulse at the speed of light (fig. 1). This system uses raster laser scanning with full-waveform (multi-peak) resolving capabilities to measure submerged topography and adjacent coastal land elevations simultaneously (Nayegandhi and others, 2009). This document reviews procedures for the post-processing of EAARL data using the custom-built Airborne Lidar Processing System (ALPS). ALPS software was developed in an open-source programming environment operated on a Linux platform. It has the ability to combine the laser return backscatter digitized at 1-nanosecond intervals with aircraft positioning information. This solution enables the exploration and processing of the EAARL data in an interactive or batch mode. ALPS also includes modules for the creation of bare earth, canopy-top, and submerged topography Digital Elevation Models (DEMs). The EAARL system uses an Earth-centered coordinate and reference system that removes the necessity to reference submerged topography data relative to water level or tide gages (Nayegandhi and others, 2006). The EAARL system can be mounted in an array of small twin-engine aircraft that operate at 300 meters above ground level (AGL) at a speed of 60 meters per second (117 knots). While other systems strive to maximize operational depth limits, EAARL has a narrow transmit beam and receiver field of view (1.5 to 2 milliradians), which improves the depth-measurement accuracy in shallow, clear water but limits the maximum depth to about 1.5 Secchi disk depth (~20 meters) in clear water. The laser transmitter [Continuum EPO-5000 yttrium aluminum garnet (YAG)] produces up to 5,000 short-duration (1.2 nanosecond), low-power (70 microjoules) pulses each second.
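    The travel-time-to-range conversion above is straightforward. A minimal sketch, using the vacuum speed of light and ignoring the atmospheric and in-water refraction corrections an operational bathymetric lidar would apply:

```python
# Pulse travel time to one-way range (sketch; vacuum speed of light).
C = 299_792_458.0  # speed of light, m/s


def range_from_travel_time(t_round_trip_s):
    """One-way range from a round-trip pulse travel time: d = c * t / 2."""
    return C * t_round_trip_s / 2.0


# The 1-nanosecond digitizing interval mentioned above corresponds to
# roughly 15 cm of range per waveform sample.
sample_spacing_m = range_from_travel_time(1e-9)
```

This is why digitizing the return waveform at 1-nanosecond intervals yields the sub-decimeter vertical sampling the manual relies on.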

  18. Performance Evaluation of Thermographic Cameras for Photogrammetric Measurements

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Guler, E.

    2013-05-01

    The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in deformation analyses caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and an 18 mm focal length lens was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a 20 mm focal length lens was used as reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera, and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, position of the principal point, and radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustments. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The standard deviations of the measured image coordinates were 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The measured image points in the Flir A320 thermographic camera images thus reached almost the same accuracy level as the digital camera, despite a pixel size four times larger. Based on the results of this research, the interior geometry of the thermographic camera and the lens distortion were modelled efficiently

  19. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F.; Jorge, Jorge M.

    1997-12-01

    The early evaluation of the visual status of human infants is of critical importance. It is of utmost importance to the development of the child's visual system that she perceives clear, focused retinal images. Furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool that is rather convenient for application to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors, and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using new technological breakthroughs in the fields of imaging devices, image processing, and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes. It is refracted by the ocular media, strikes the retina, focusing or not, reflects off, and is collected by a camera. The system is formed by one CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation. An optomechanical system also allows eccentric illumination. The light source is of the flash type and is synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  20. Digital photorefraction

    NASA Astrophysics Data System (ADS)

    Costa, Manuel F. M.; Jorge, Jorge M.

    1998-01-01

    The early evaluation of the visual status of human infants is of critical importance. It is of utmost importance to the development of the child's visual system that she perceives clear, focused retinal images. Furthermore, if refractive problems are not corrected in due time, amblyopia may occur. Photorefraction is a non-invasive clinical tool that is rather convenient for application to this kind of population. Qualitative or semi-quantitative information about refractive errors, accommodation, strabismus, amblyogenic factors, and some pathologies (cataracts) can then be easily obtained. The photorefraction experimental setup we established, using new technological breakthroughs in the fields of imaging devices, image processing, and fiber optics, allows the implementation of both the isotropic and eccentric photorefraction approaches. Essentially, both methods consist of delivering a light beam into the eyes. It is refracted by the ocular media, strikes the retina, focusing or not, reflects off, and is collected by a camera. The system is formed by one CCD color camera and a light source. A beam splitter in front of the camera's objective allows coaxial illumination and observation. An optomechanical system also allows eccentric illumination. The light source is of the flash type and is synchronized with the camera's image acquisition. The camera's image is digitized and displayed in real time. Image processing routines are applied for image enhancement and feature extraction.

  1. Method for out-of-focus camera calibration.

    PubMed

    Bell, Tyler; Xu, Jing; Zhang, Song

    2016-03-20

    State-of-the-art camera calibration methods assume that the camera is at least nearly in focus and thus fail if the camera is substantially defocused. This paper presents a method which enables the accurate calibration of an out-of-focus camera. Specifically, the proposed method uses a digital display (e.g., liquid crystal display monitor) to generate fringe patterns that encode feature points into the carrier phase; these feature points can be accurately recovered, even if the fringe patterns are substantially blurred (i.e., the camera is substantially defocused). Experiments demonstrated that the proposed method can accurately calibrate a camera regardless of the amount of defocusing: the focal length difference is approximately 0.2% when the camera is focused compared to when the camera is substantially defocused.
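    The robustness of carrier-phase encoding to defocus can be illustrated with standard N-step phase shifting. This 1-D sketch (an assumption-laden illustration, not the authors' implementation) models defocus as a box blur: blurring a sinusoidal fringe with a symmetric kernel attenuates its amplitude but leaves its phase intact, so the encoded feature positions survive.

```python
# Carrier-phase recovery from phase-shifted fringes, before and after blur.
import numpy as np


def phase_from_fringes(images):
    """Wrapped phase from N equally phase-shifted fringe images
    I_k = A + B*cos(phi + 2*pi*k/N), via the standard N-step formula."""
    n = len(images)
    deltas = 2 * np.pi * np.arange(n) / n
    num = sum(img * np.sin(d) for img, d in zip(images, deltas))
    den = sum(img * np.cos(d) for img, d in zip(images, deltas))
    return -np.arctan2(num, den)


x = np.arange(2048)
phi = 2 * np.pi * x / 64.0                            # encoded carrier phase
shots = [128 + 100 * np.cos(phi + 2 * np.pi * k / 4) for k in range(4)]

kernel = np.ones(31) / 31.0                           # defocus as a box blur
blurred = [np.convolve(s, kernel, mode='same') for s in shots]

sharp_phase = phase_from_fringes(shots)
blur_phase = phase_from_fringes(blurred)              # matches, except at edges
```

Away from the convolution boundaries, the recovered phase is essentially unchanged by the blur, which is the property that lets feature points be located accurately even when the camera is substantially defocused.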

  2. Long-distance eye-safe laser TOF camera design

    NASA Astrophysics Data System (ADS)

    Kovalev, Anton V.; Polyakov, Vadim M.; Buchenkov, Vyacheslav A.

    2016-04-01

    We present a new TOF camera design based on a compact, actively Q-switched, diode-pumped solid-state laser operating in the 1.5 μm range and a receiver system based on a short-wave infrared InGaAs PIN-diode focal plane array with an image intensifier and a special readout integration circuit. The compact camera is capable of depth imaging up to 4 kilometers at 10 frames/s with 1.2 m range error. The camera could be applied to airborne and space geodesy, location, and navigation.

  3. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir camera and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern were determined by Pix4Dmapper and were independently adjusted and analyzed with the program system BLUH. Dense matching by Pix4Dmapper provided 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values of the Student test (T-test) for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration by IGI, but there are still radial symmetric distortions also for the inclined cameras, with a size exceeding 5 μm, even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding

  4. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
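    The center-of-gravity calculation can be sketched as a simple amplitude-weighted average of photomultiplier positions, in the spirit of classic Anger logic; the patent's specific algorithm is not reproduced here, and the function and its inputs are illustrative.

```python
# Center-of-gravity (Anger-logic style) position estimate for one event.
import numpy as np


def event_centroid(pmt_signals, pmt_x, pmt_y):
    """Amplitude-weighted mean of PMT positions for one gamma event."""
    w = np.asarray(pmt_signals, dtype=float)
    total = w.sum()
    return float(np.dot(w, pmt_x)) / total, float(np.dot(w, pmt_y)) / total
```

With equal signals on a symmetric tube layout the estimate falls at the array center; stronger signals on one side pull the estimate toward that side, localizing the scintillation event between tubes.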

  5. Lightweight Electronic Camera for Research on Clouds

    NASA Technical Reports Server (NTRS)

    Lawson, Paul

    2006-01-01

    "Micro-CPI" (wherein "CPI" signifies "cloud-particle imager") is the name of a small, lightweight electronic camera that has been proposed for use in research on clouds. It would acquire and digitize high-resolution (3- m-pixel) images of ice particles and water drops at a rate up to 1,000 particles (and/or drops) per second.

  6. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  7. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring inexpensive video of moving subjects had been impossible in a vacuum environment because of camera overheating, which is brought on by the lack of a cooling medium in vacuum. A water-jacketed camera-cooler enclosure, machined and assembled from copper plate and tube, has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy" type, is not designed to work in a vacuum, but with some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. Thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of the video camera at operating temperature, allowing video recording of an in-progress test within a vacuum environment.

  8. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  9. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  10. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  11. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining both performance and operational characteristics. The cameras comprise three combinations of gating and data recording: microchannel plate/CCD, image diode/CCD, and image diode/film. The advantages and disadvantages of each device will be discussed.

  12. HST High Gain Antennae photographed by Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This close-up view of one of the two High Gain Antennae (HGA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC), and downlinked to ground controllers soon afterward. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

  13. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") cameras, the specifications were different from those for the cameras of the present-day system, and cameras using all kinds of components, having different arrangements of components, and having different appearances were developed into products, with time spent on experimentation, design, fabrication, adjustment, and inspection. But recently the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated in camera fabrication, making it possible to make HDTV cameras by methods similar to those of the present system. In addition, more-efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanism parts, and software for both HDTV cameras and cameras that operate by the present system.

  14. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04", for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high-energy-density plasmas, for a variety of military, industrial, and medical imaging applications.

  15. Aviation spectral camera infinity target simulation system

    NASA Astrophysics Data System (ADS)

    Liu, Xinyue; Ming, Xing; Liu, Jiu; Guo, Wenji; Lv, Gunbo

    2014-11-01

    With the development of science and technology, applications of aviation spectral cameras are becoming more widespread, making the development of a dynamic-target test system increasingly important. An aviation spectral camera infinity target simulation system can be used to test the resolution and the modulation transfer function of a camera. The construction and working principle of the infinity target simulation system are introduced in detail. A dynamic target generator based on a digital micromirror device (DMD) and the required performance of the collimation system are analyzed and reported. The DMD-based dynamic target generator has the advantages of convenient image replacement, small size, and flexibility. According to the requirements of the camera under test, a full-field infinity dynamic target test plan was completed by rotating and moving a mirror.
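    One common way such a test yields the modulation transfer function is to Fourier-transform a measured line-spread function (LSF). A minimal sketch follows; the Gaussian blur widths are illustrative assumptions, not data from this system.

```python
import numpy as np

def mtf_from_lsf(lsf):
    """Modulation transfer function as the normalized magnitude of the
    Fourier transform of a line-spread function (LSF)."""
    lsf = np.asarray(lsf, dtype=float)
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]  # normalize so MTF(0) = 1

# Gaussian LSFs: a broader blur rolls off faster in spatial frequency
x = np.arange(-32, 32)
narrow = np.exp(-x**2 / (2 * 1.5**2))   # sharper optics
broad = np.exp(-x**2 / (2 * 4.0**2))    # more blur
mtf_n, mtf_b = mtf_from_lsf(narrow), mtf_from_lsf(broad)
# the sharper system retains more contrast at mid frequencies: mtf_n[5] > mtf_b[5]
```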

  16. Airborne remote sensing of spatiotemporal change (1955-2004) in indigenous and exotic forest cover in the Taita Hills, Kenya

    NASA Astrophysics Data System (ADS)

    Pellikka, Petri K. E.; Lötjönen, Milla; Siljander, Mika; Lens, Luc

    2009-08-01

    We studied changes in the area and species composition of six indigenous forest fragments in the Taita Hills, Kenya, using 1955 and 1995 aerial photography together with 2004 airborne digital camera mosaics. The study area is part of the Eastern Arc Mountains, a global biodiversity hot spot that boasts an outstanding diversity of flora and fauna and a high level of endemism. While a total of 260 ha (50%) of indigenous tropical cloud forest was lost to agriculture and bushland between 1955 and 2004, large-scale planting of exotic pines, eucalyptus, grevillea, black wattle and cypress on barren land during the same period resulted in a balanced total forest area. In the Taita Hills, as in other Afrotropical forests, indigenous forest loss may adversely affect ecosystem services.

  17. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  18. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, and to give some indication of present weather. Similarly, during springtime, the camera images show the changes in ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  1. Qualification Tests of Micro-camera Modules for Space Applications

    NASA Astrophysics Data System (ADS)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  2. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
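    The 3D-to-2D mapping that such a camera model encodes can be sketched with a simple pinhole projection; the intrinsic values and fiducial coordinates below are illustrative, not ACAL's actual parameters.

```python
import numpy as np

def project(points_3d, K):
    """Project 3-D camera-frame points to 2-D pixels with a pinhole model.
    K is the 3x3 intrinsic matrix (focal lengths and principal point)."""
    p = np.asarray(points_3d, dtype=float)
    uvw = p @ K.T                    # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide

# Hypothetical intrinsics: 800 px focal length, principal point (320, 240)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
fiducials = np.array([[0.0, 0.0, 2.0],   # on the optical axis, 2 m away
                      [0.1, 0.0, 2.0]])  # 10 cm to the right
pix = project(fiducials, K)  # -> [[320., 240.], [360., 240.]]
```

Calibration inverts this relationship: given many (3D, 2D) pairs, solve for K (and lens distortion) that best reproduces the measured pixel locations.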

  3. Digital In, Digital Out: Digital Editing with Firewire.

    ERIC Educational Resources Information Center

    Doyle, Bob; Sauer, Jeff

    1997-01-01

    Reviews linear and nonlinear digital video (DV) editing equipment and software, using the IEEE 1394 (FireWire) connector. Includes a chart listing specifications and rating eight DV editing systems, reviews two DV still-photo cameras, and previews beta DV products. (PEN)

  4. Reflectance Mechanism and Biophysical Characteristics of a Boreal Forest through Analyses of Airborne Spectral Radiance Observations

    NASA Astrophysics Data System (ADS)

    Dim, J. R.; Kajiwara, K.; Honda, Y.

    2006-12-01

    Hyperspectral radiance data were recorded from airborne observations, simultaneously with measurements of a white reference board, in order to identify the reflectance patterns of the vegetation of a boreal forest located in the northern part of Japan. Because the reflectance of a leaf depends on the leaf surface properties and internal structure as well as its water content and biochemical composition, the canopy reflectance signature may be used to understand vegetation growing conditions and influencing factors. In this study a radio-controlled helicopter flying just above the trees and carrying a portable spectral radiometer, a digital camera, a video camera and a laser scanner was used to obtain the vegetation spectral reflectance data and biophysical characteristics of the forest. Spectral reflectance discrimination analyses show that the vegetation types of the study field can be well distinguished. The amount of vegetation reflectance tends to decrease with the complexity of the canopy structure, as a result of increased radiation scattering by these surfaces. A mechanism of multiple reflection is suggested to explain the relation between reflectance and irregularities of the canopy structure.
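    The whiteboard (white-reference) measurements convert at-sensor radiance to a reflectance factor by ratioing each band against the reference under the same illumination. A minimal per-band sketch; all radiance values are hypothetical:

```python
import numpy as np

def reflectance_factor(target_radiance, panel_radiance, panel_reflectance=1.0):
    """Per-band reflectance factor from target and white-reference radiance,
    assuming both are measured under the same illumination."""
    target = np.asarray(target_radiance, dtype=float)
    panel = np.asarray(panel_radiance, dtype=float)
    return panel_reflectance * target / panel

# Hypothetical red and near-infrared radiances for a conifer canopy:
# low red (strong chlorophyll absorption), high NIR (leaf scattering)
canopy = np.array([12.0, 45.0])
whiteboard = np.array([240.0, 150.0])
rho = reflectance_factor(canopy, whiteboard)  # -> [0.05, 0.3]
```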

  5. Measuring Positions of Objects using Two or More Cameras

    NASA Technical Reports Server (NTRS)

    Klinko, Steve; Lane, John; Nelson, Christopher

    2008-01-01

    An improved method of computing positions of objects from digitized images acquired by two or more cameras (see figure) has been developed for use in tracking debris shed by a spacecraft during and shortly after launch. The method is also readily adaptable to such applications as (1) tracking moving and possibly interacting objects in other settings in order to determine causes of accidents and (2) measuring positions of stationary objects, as in surveying. Images acquired by cameras fixed to the ground and/or cameras mounted on tracking telescopes can be used in this method. In this method, processing of image data starts with creation of detailed computer-aided design (CAD) models of the objects to be tracked. By rotating, translating, resizing, and overlaying the models with digitized camera images, parameters that characterize the position and orientation of the camera can be determined. The final position error depends on how well the centroids of the objects in the images are measured; how accurately the centroids are interpolated for synchronization of cameras; and how effectively matches are made to determine rotation, scaling, and translation parameters. The method involves use of the perspective camera model (also denoted the point camera model), which is one of several mathematical models developed over the years to represent the relationships between external coordinates of objects and the coordinates of the objects as they appear on the image plane in a camera. The method also involves extensive use of the affine camera model, in which the distance from the camera to an object (or to a small feature on an object) is assumed to be much greater than the size of the object (or feature), resulting in a truly two-dimensional image. The affine camera model does not require advance knowledge of the positions and orientations of the cameras.
This is because, ultimately, the positions and orientations of the cameras and of all objects are computed in a common coordinate system.
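    Given calibrated projection matrices for two cameras, a 3D position can be recovered by linear (DLT) triangulation, the basic operation underlying multi-camera position measurement of this kind. The camera geometry below is an invented example, not the system described above.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point seen by two cameras with
    3x4 projection matrices P1, P2 and pixel observations uv1, uv2."""
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]

# Two hypothetical cameras: identical intrinsics, 1 m stereo baseline
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Synthesize pixel observations of a known point, then recover it
X_true = np.array([0.2, -0.1, 5.0])
h1 = P1 @ np.append(X_true, 1.0)
h2 = P2 @ np.append(X_true, 1.0)
uv1, uv2 = h1[:2] / h1[2], h2[:2] / h2[2]
X = triangulate(P1, P2, uv1, uv2)  # recovers [0.2, -0.1, 5.0]
```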

  6. Camera Calibration with Radial Variance Component Estimation

    NASA Astrophysics Data System (ADS)

    Mélykuti, B.; Kruck, E. J.

    2014-11-01

    Camera calibration plays an increasingly important role. Besides true digital aerial survey cameras, the photogrammetric market now includes a large number of non-metric digital cameras mounted on UAVs or other lightweight flying platforms, and in-flight calibration of these systems considerably enhances the geometric accuracy of survey photos. Photo measurements are expected to be more precise near the center of the image than along the edges or in the corners. Using statistical methods, the accuracy of photo measurements was analyzed as a function of the distance of points from the image center, yielding a curve of measurement precision versus photo radius. A large number of camera types were tested with well-distributed point measurements in image space. The tests demonstrate a functional connection between accuracy and radial distance and suggest a method to check and enhance the geometric capability of cameras in light of these results.
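    The precision-versus-radius curve described can be obtained by fitting binned measurement residuals against radial distance. A sketch with invented numbers (the paper's actual data are not given):

```python
import numpy as np

# Hypothetical image-measurement precision (microns) binned by radial
# distance from the principal point (mm); precision degrades outward.
radius = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
sigma = np.array([1.0, 1.3, 1.8, 2.5, 3.4])

# Fit a quadratic precision curve sigma(r) = a*r^2 + b*r + c
a, b, c = np.polyfit(radius, sigma, 2)

# Predicted precision near the image edge (r = 50 mm)
predicted_edge = a * 50.0**2 + b * 50.0 + c  # ~3.4 microns for this data
```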

  7. Situational Awareness from a Low-Cost Camera System

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using enough cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data; digitizing and channeling those data to a central computer and processing them in real time is difficult with low-cost, commercially available components. A newly developed system places cameras on a combined power-and-data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP protocols, and the cameras more closely resemble cell-phone cameras than traditional security cameras. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost, and the low power requirements of each camera allow a single imaging system comprising over 100 cameras. Each camera has built-in processing to detect events and cooperatively shares this information with neighboring cameras; the location of an event is reported to the host computer in Cartesian coordinates computed from data correlated across multiple cameras. In this way, events in the field of view present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly generated by the cameras. This approach offers greater flexibility than conventional systems without compromising performance: many small, low-cost cameras with overlapping fields of view provide significantly increased coverage without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, because a single cable carries both power and data, installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications; security systems and environmental/vehicular monitoring are also potential applications.

  8. Assessing the Photogrammetric Potential of Cameras in Portable Devices

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Kokkas, N.

    2012-07-01

    In recent years, an increasing number of portable devices, tablets, and smartphones have employed high-resolution digital cameras to satisfy consumer demand. In most cases these cameras are designed primarily for capturing visually pleasing images, and the potential of using smartphone and tablet cameras for metric applications remains uncertain. The compact nature of the host devices leads to very small cameras and therefore small geometric characteristics; it also makes them extremely portable, and their integration into a multi-function device as part of the basic unit cost makes them readily available. Many application specialists may find them an attractive proposition where some modest photogrammetric capability would be useful. This paper investigates the geometric potential of these cameras for close-range photogrammetric applications by: • investigating their geometric characteristics using the self-calibration method of camera calibration, comparing results with those from a state-of-the-art digital SLR camera; • investigating their capability for 3D building modelling, again comparing the findings with results obtained from a digital SLR camera. The early results presented show that the iPhone has greater potential for photogrammetric use than the iPad.

  9. Calibration of Low Cost RGB and NIR Uav Cameras

    NASA Astrophysics Data System (ADS)

    Fryskowska, A.; Kedzierski, M.; Grochala, A.; Braula, A.

    2016-06-01

    Non-metric digital cameras are widely used for photogrammetric studies. The increase in the resolution and quality of images obtained by non-metric cameras allows their use in low-cost UAV and terrestrial photogrammetry. Imagery acquired with non-metric cameras can be used in 3D modeling of objects or landscapes, reconstruction of historical sites, generation of digital terrain models (DTM) and orthophotos, or in the assessment of accidents. Non-metric digital cameras are characterized by unstable and unknown interior orientation parameters, so their use requires prior calibration. The calibration research was conducted using a non-metric camera, different calibration test fields, and various software. The first part of the paper contains a brief theoretical introduction, including basic definitions such as the construction of non-metric cameras and descriptions of different optical distortions. The second part describes the camera calibration process and the details of the calibration methods and models used. A Sony NEX-5 camera was calibrated using Image Master Calib, the Matlab Camera Calibrator application, and Agisoft Lens. 2D test fields were used for the study, and a comparative analysis of the results was performed.
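    The optical distortions mentioned are commonly modeled with Brown-Conrady radial terms whose coefficients are estimated during calibration. A minimal sketch of applying such a model; the coefficients and point are illustrative, not results from this study.

```python
import numpy as np

def radial_distort(xy, k1, k2):
    """Apply Brown-Conrady radial distortion to normalized image
    coordinates xy (origin at the principal point)."""
    xy = np.asarray(xy, dtype=float)
    r2 = (xy ** 2).sum(axis=-1, keepdims=True)  # squared radial distance
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

# Hypothetical coefficients: barrel distortion (k1 < 0) pulls points inward
pt = np.array([[0.5, 0.0]])
distorted = radial_distort(pt, k1=-0.2, k2=0.05)
# r^2 = 0.25 -> scale = 1 - 0.2*0.25 + 0.05*0.0625 = 0.953125
# -> [[0.4765625, 0.0]]
```

Calibration software estimates k1, k2 (and often tangential terms) by minimizing reprojection error over a test field, then inverts the model to undistort measurements.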

  10. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the brands and models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG-compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of the Y and Cb components of JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four directional Markov processes applied to the difference JPEG 2-D arrays are used to identify statistical differences caused by the image-formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are used directly as features for classification. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of the proposed statistical model is demonstrated by large-scale experimental results.
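    The feature construction described can be sketched for a single (horizontal) Markov direction. The toy difference array and threshold T = 3 below follow common practice in this line of work but are assumptions here, not the paper's exact settings.

```python
import numpy as np

def transition_matrix(diff_array, T=3):
    """Horizontal Markov transition probability matrix of a thresholded
    difference 2-D array; the (2T+1)^2 entries serve as features."""
    d = np.clip(np.asarray(diff_array, dtype=int), -T, T)  # threshold to [-T, T]
    current, nxt = d[:, :-1].ravel(), d[:, 1:].ravel()     # horizontal pairs
    m = np.zeros((2 * T + 1, 2 * T + 1))
    for i, j in zip(current + T, nxt + T):                 # shift values to indices
        m[i, j] += 1
    row_sums = m.sum(axis=1, keepdims=True)
    return np.divide(m, row_sums, out=np.zeros_like(m), where=row_sums > 0)

# Toy difference array of a JPEG 2-D array; the value 5 gets clipped to 3
diff = np.array([[0, 1, 0, -1],
                 [2, 0, 0, 5]])
probs = transition_matrix(diff)      # 7x7 feature matrix for one direction
row_from_zero = probs[0 + 3]         # transition probabilities leaving value 0
```

Repeating this for the other three directions and for both Y and Cb components, then concatenating the matrices, yields the SVM feature vector.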

  11. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access and global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  12. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  13. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  14. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  15. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.
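    The inter-camera quaternion analyzed above is the relative rotation between the attitude quaternions reported by two star cameras. A minimal sketch of computing it; the Hamilton convention with [w, x, y, z] ordering and the example rotations are assumptions for illustration.

```python
import numpy as np

def quat_mult(q, p):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def inter_camera(q1, q2):
    """Relative attitude between two star cameras: conj(q1) * q2.
    For noise-free data this should equal the fixed mounting rotation;
    its time series exposes the per-rev errors discussed above."""
    conj = q1 * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mult(conj, q2)

# Two cameras differing by a 90-degree rotation about the z axis
q1 = np.array([1.0, 0.0, 0.0, 0.0])                        # identity attitude
q2 = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])
q_rel = inter_camera(q1, q2)  # equals q2 here, since q1 is the identity
```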

  16. Camera-enabled techniques for organic synthesis

    PubMed Central

    Ingham, Richard J; O’Brien, Matthew; Browne, Duncan L

    2013-01-01

    Summary A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future. PMID:23766820

  17. An evaluation of onshore digital elevation models for modelling tsunami inundation zones

    NASA Astrophysics Data System (ADS)

    Griffin, Jonathan; Latief, Hamzah; Kongko, Widjo; Harig, Sven; Horspool, Nick; Hanung, Raditya; Rojali, Aditia; Maher, Nicola; Fuchs, Annika; Hossen, Jakir; Upi, Supriyati; Edi, Dewanto; Rakowsky, Natalja; Cummins, Phil

    2015-06-01

    A sensitivity study is undertaken to assess the utility of different onshore digital elevation models (DEM) for simulating the extent of tsunami inundation, using case studies from two locations in Indonesia. We compare airborne IFSAR, ASTER and SRTM against high-resolution LiDAR and stereo-camera data in locations with different coastal morphologies. Tsunami inundation extents modelled with airborne IFSAR DEMs are comparable with those modelled with the higher-resolution datasets and are also consistent with historical run-up data, where available. Large vertical errors and poor resolution of the coastline in the ASTER and SRTM elevation datasets cause the modelled inundation extent to be much less than with the other datasets and observations. Therefore, ASTER and SRTM should not be used to underpin tsunami inundation models. A model mesh resolution of 25 m was sufficient for estimating the inundated area when using elevation data with high vertical accuracy in the case studies presented here. Differences in modelled inundation between digital terrain models (DTM) and digital surface models (DSM) for LiDAR and IFSAR are greater than differences between the two data types. Models using DTM may overestimate inundation while those using DSM may underestimate inundation when a constant Manning’s roughness value is used. We recommend using DTM for modelling tsunami inundation extent, with further work needed to resolve the scale at which surface roughness should be parameterised.

  18. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  19. Digital security technology simplified.

    PubMed

    Scaglione, Bernard J

    2007-01-01

    Digital security technology is making great strides in replacing analog and other traditional security systems, including CCTV, card access, personal identification and alarm monitoring applications. Like any new technology, the author says, it is important to understand its benefits and limitations before purchasing and installing it, to ensure its proper operation and effectiveness. This article is a primer for security directors on how digital technology works. It provides an understanding of the key components which make up the foundation of digital security systems, focusing on three key aspects of the digital security world: the security network, IP cameras and IP recorders.

  20. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  1. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  2. Airborne Laser Polar Nephelometer

    NASA Technical Reports Server (NTRS)

    Grams, Gerald W.

    1973-01-01

    A polar nephelometer has been developed at NCAR to measure the angular variation of the intensity of light scattered by air molecules and particles. The system has been designed for airborne measurements using outside air ducted through a 5-cm diameter airflow tube; the sample volume is that which is common to the intersection of a collimated source beam and the detector field of view within the airflow tube. The source is a linearly polarized helium-neon laser beam. The optical system defines a collimated field of view (0.5° half-angle) through a series of diaphragms located behind a 172-mm focal length objective lens. A photomultiplier tube is located immediately behind an aperture in the focal plane of the objective lens. The laser beam is mechanically chopped (on-off) at a rate of 5 Hz; a two-channel pulse counter, synchronized to the laser output, measures the photomultiplier pulse rate with the light beam both on and off. The difference in these measured pulse rates is directly proportional to the intensity of the scattered light from the volume common to the intersection of the laser beam and the detector field of view. Measurements can be made at scattering angles from 15° to 165° with reference to the direction of propagation of the light beam. Intermediate angles are obtained by selecting the angular increments desired between these extreme angles (any multiple of 0.1° can be selected for the angular increment; 5° is used in normal operation). Pulses provided by digital circuits control a stepping motor which sequentially rotates the detector by pre-selected angular increments. The synchronous photon-counting system automatically begins measurement of the scattered-light intensity immediately after the rotation to a new angle has been completed. The instrument has been flown on the NASA Convair 990 airborne laboratory to obtain data on the complex index of refraction of atmospheric aerosols.
A particle impaction device is operated simultaneously
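    The on/off synchronous counting described above reduces to a per-angle difference of pulse rates; a minimal sketch, where the angles match the instrument's 15° to 165° range in 5° steps but the count rates and their cosine-shaped angular dependence are hypothetical, for illustration only:

    ```python
    import numpy as np

    def scattered_signal(counts_on, counts_off):
        """Synchronous photon counting for a chopped-beam nephelometer:
        the background-corrected scattered-light signal is the difference
        between the pulse rates measured with the laser beam on and off."""
        return np.asarray(counts_on, dtype=float) - np.asarray(counts_off, dtype=float)

    # Hypothetical pulse rates at scattering angles 15..165 deg in 5 deg steps
    angles = np.arange(15, 166, 5)
    on = 1200 + 50 * np.cos(np.radians(angles))   # beam on: scatter plus background
    off = np.full_like(angles, 1000.0)            # beam off: background only
    signal = scattered_signal(on, off)            # proportional to scattered intensity
    ```

    Differencing the two synchronized channels is what lets the instrument reject ambient background light without a lock-in amplifier.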

  3. Presence capture cameras - a new challenge to the image quality

    NASA Astrophysics Data System (ADS)

    Peltoketo, Veli-Tapani

    2016-04-01

    Commercial presence capture cameras are coming to market, and a new era of visual entertainment is starting to take shape. Since true presence capture is still a very new technology, technical solutions have only just passed the prototyping phase and vary widely. Presence capture cameras still face the same quality issues as earlier generations of digital imaging, but also numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system that can record 3D audio-visual reality as it is has to have several camera modules, several microphones and, especially, technology that can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features remain valid for presence capture cameras. Features like color fidelity, noise removal, resolution and dynamic range form the basis of virtual reality stream quality. However, the co-operation of several cameras adds a new dimension to these quality factors, and new quality features must also be validated. For example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how can the stitching be validated? The work describes the quality factors that remain valid for presence capture cameras and assesses their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work also considers how well current measurement methods can be applied to presence capture cameras.

  4. Mars Airborne Prospecting Spectrometer

    NASA Astrophysics Data System (ADS)

    Steinkraus, J. M.; Wright, M. W.; Rheingans, B. E.; Steinkraus, D. E.; George, W. P.; Aljabri, A.; Hall, J. L.; Scott, D. C.

    2012-06-01

    One novel approach towards addressing the need for innovative instrumentation and investigation approaches is the integration of a suite of four spectrometer systems to form the Mars Airborne Prospecting Spectrometers (MAPS) for prospecting on Mars.

  5. 3D astigmatic depth sensing camera

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.; Tyo, J. Scott; Schwiegerling, Jim

    2011-10-01

    Three-dimensional displays have become increasingly present in consumer markets. However, the ability to capture three-dimensional images inexpensively and without major modifications to current cameras is uncommon. Our goal is to create a modification to a common commercial camera that allows three-dimensional reconstruction. We desire such an imaging system to be inexpensive and easy to use. Furthermore, we require that any three-dimensional modification to a camera does not reduce its resolution. Here we present a possible solution to this problem. A commercial digital camera is used with a projector system with astigmatic focus to capture images of a scene. By using an astigmatically projected pattern we can create two different focus depths for horizontal and vertical features of the projected pattern, thereby encoding depth. This projector could be integrated into the flash unit of the camera. By carefully choosing a pattern we are able to exploit this differential focus in image processing. Wavelet transforms are performed on the image to pick out the projected pattern. By taking ratios of certain wavelet coefficients we are able to correlate the contrast ratios with the distance from the camera of an object at a particular transverse position. We present details of the construction, calibration, and images produced by this system. The link between projected pattern design and image processing algorithms is also discussed.

  6. Seeing the trees yet not missing the forest: an airborne lidar approach

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Li, W.; Flanagan, J.

    2011-12-01

    Light Detection and Ranging (lidar) is an optical remote sensing technology that measures properties of scattered light to find the range and/or other information about a distant object. Due to its ability to generate 3-dimensional data with high spatial resolution and accuracy, lidar technology is increasingly used in ecology, geography, geology, geomorphology, seismology, remote sensing, and atmospheric physics. In this study, we acquire airborne lidar data for the study of hydrologic, geomorphologic, and geochemical processes at six Critical Zone Observatories: Southern Sierra, Boulder Creek, Shale Hills, Luquillo, Jemez, and Christina River Basin. Each site will have two lidar flights (leaf on/off, or snow on/off). From the lidar data, we derive various products, including a high resolution Digital Elevation Model (DEM), Digital Surface Model (DSM), Canopy Height Model (CHM), canopy cover and closure, tree height, DBH, canopy base height, canopy bulk density, biomass, LAI, etc. A novel approach is also developed to map individual trees based on segmentation of the lidar point clouds, and a virtual forest is simulated using the locations of individual trees as well as tree structure information. The simulated image is then compared to a camera photo taken at the same location. The two images look very similar; moreover, our simulated image provides not only a visually impressive visualization of the landscape, but also contains all the detailed information about individual tree locations and forest structure properties.
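    Among the products listed, the Canopy Height Model is conventionally the difference between the surface and terrain models; a minimal sketch, assuming co-registered DSM and DTM rasters on the same grid (the toy elevation values are hypothetical):

    ```python
    import numpy as np

    def canopy_height_model(dsm, dtm):
        """Canopy Height Model from gridded lidar products:
        CHM = DSM (top of canopy) - DTM (bare earth),
        clipped at zero to suppress small interpolation artefacts."""
        chm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
        return np.clip(chm, 0.0, None)

    # Toy 2x2 example: ground near 100 m, one 18 m tree in the corner cell
    dsm = np.array([[118.0, 100.2], [100.0, 100.1]])
    dtm = np.array([[100.0, 100.0], [100.0, 100.3]])
    chm = canopy_height_model(dsm, dtm)  # tree cell -> 18.0, negative diff -> 0.0
    ```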

  7. Smart Camera Technology Increases Quality

    NASA Technical Reports Server (NTRS)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates of more than a few frames per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full; subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

  8. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors allow photometry to be performed in three filters simultaneously, which gives a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics: read noise (e^{-}/pix), thermal noise (e^{-}/pix/sec) and electronic gain (e^{-}/ADU) for the commercial digital camera Canon 5D Mark III are presented. We give the same characteristics for the scientific high-performance cooled CCD camera system ALTA E47. Comparison of the test results for the Canon 5D Mark III and the CCD ALTA E47 shows that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
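    One standard way to estimate the electronic gain (e^{-}/ADU) listed among the basic characteristics is the mean-variance (photon transfer) method; a minimal sketch using simulated flat-field frames (the gain value and exposure level are hypothetical, and this is not necessarily the procedure the authors used):

    ```python
    import numpy as np

    def gain_from_flat_pair(flat1, flat2, bias=0.0):
        """Estimate electronic gain (e-/ADU) from a pair of flat-field frames.
        Differencing two flats cancels fixed-pattern noise, so
        var(f1 - f2) = 2 * var(single frame). For a shot-noise-limited
        signal S in ADU, gain = S / var."""
        f1 = np.asarray(flat1, dtype=float) - bias
        f2 = np.asarray(flat2, dtype=float) - bias
        signal = 0.5 * (f1.mean() + f2.mean())   # mean flat level in ADU
        var = np.var(f1 - f2) / 2.0              # per-frame shot-noise variance
        return signal / var

    # Simulated Poisson-limited flats with a hypothetical gain of 2 e-/ADU
    rng = np.random.default_rng(1)
    true_gain = 2.0
    electrons = rng.poisson(10000.0, size=(2, 256, 256))
    flat1, flat2 = electrons / true_gain         # convert electrons to ADU
    g = gain_from_flat_pair(flat1, flat2)        # recovers ~2.0 e-/ADU
    ```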

  9. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  10. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  11. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  12. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed, where luminescence is typically found.

  13. Target detection algorithm for airborne thermal hyperspectral data

    NASA Astrophysics Data System (ADS)

    Marwaha, R.; Kumar, A.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    Airborne hyperspectral imaging is routinely used for classification purposes, but airborne thermal hyperspectral imagery is usually a challenge for conventional classification approaches. The Telops Hyper-Cam sensor is an interferometer-based imaging system that enables spatial and spectral analysis of targets with a single sensor. It is based on Fourier-transform technology, which yields high spectral resolution and enables high-accuracy radiometric calibration. The Hyper-Cam instrument has 84 spectral bands in the 868 cm-1 to 1280 cm-1 region (7.8 μm to 11.5 μm), at a spectral resolution of 6 cm-1 (full-width-half-maximum) in the LWIR (long-wave infrared) range. Due to the Hughes effect, only a few classifiers are able to handle such high-dimensional classification tasks. MNF (Minimum Noise Fraction) rotation is a dimensionality-reducing approach that segregates noise in the data. In this study, the component selection of the MNF rotation transformation was analyzed in terms of classification accuracy, using the constrained energy minimization (CEM) algorithm as a classifier, for the airborne thermal hyperspectral image alone and for the combination of the airborne LWIR hyperspectral image and a color digital photograph. On comparing the accuracy of all the classified images, it was found that accuracy was highest with twenty MNF components, and that accuracy increased when the airborne LWIR hyperspectral image was combined with the color digital photograph rather than using the LWIR data alone.
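    The CEM detector referenced above has a simple closed form: the filter w = R⁻¹d / (dᵀR⁻¹d) minimizes the average output energy over the scene subject to a unit response to the known target signature d. A minimal sketch with synthetic data (the band count matches the Hyper-Cam's 84 bands, but the spectra themselves are hypothetical):

    ```python
    import numpy as np

    def cem_detector(pixels, target):
        """Constrained Energy Minimization (CEM) detector.

        pixels: (N, B) array of N spectra with B bands
        target: (B,) known target signature d
        Returns per-pixel scores y = w^T x, where
        w = R^{-1} d / (d^T R^{-1} d) and R is the sample correlation matrix.
        """
        R = pixels.T @ pixels / pixels.shape[0]   # B x B sample correlation matrix
        Rinv_d = np.linalg.solve(R, target)       # R^{-1} d without explicit inverse
        w = Rinv_d / (target @ Rinv_d)            # CEM filter, normalized so w^T d = 1
        return pixels @ w                         # detector scores

    # Synthetic scene: noisy background spectra plus one pixel equal to the target
    rng = np.random.default_rng(0)
    bg = rng.normal(1.0, 0.1, size=(500, 84))     # 84 bands, as in the Hyper-Cam
    d = np.linspace(1.0, 2.0, 84)                 # hypothetical target signature
    scores = cem_detector(np.vstack([bg, d]), d)
    ```

    By construction a pixel exactly matching d scores 1, while background pixels are suppressed toward zero, which is why CEM copes with the high-dimensional data that trips up many conventional classifiers.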

  14. Optical Communications Link to Airborne Transceiver

    NASA Technical Reports Server (NTRS)

    Regehr, Martin W.; Kovalik, Joseph M.; Biswas, Abhijit

    2011-01-01

    An optical link from Earth to an aircraft demonstrates the ability to establish a link from a ground platform to a transceiver moving overhead. An airplane presents a challenging disturbance environment, including airframe vibrations and occasional abrupt changes in attitude during flight. These disturbances make it difficult to maintain pointing lock in an optical transceiver in an airplane. Acquisition can also be challenging: in the case of the aircraft link, the ground station initially has no precise knowledge of the aircraft's location. An airborne pointing system has been designed, built, and demonstrated using direct-drive brushless DC motors for passive isolation of pointing disturbances and for high-bandwidth control feedback. The airborne transceiver uses a GPS-INS system to determine the aircraft's position and attitude, and then to illuminate the ground station initially for acquisition. The ground transceiver participates in link-pointing acquisition by first using a wide-field camera to detect initial illumination from the airborne beacon and to perform coarse pointing. It then transfers control to a high-precision pointing detector. Using this scheme, live video was successfully streamed from the ground to the aircraft at 270 Mb/s while simultaneously downlinking a 50 kb/s data stream from the aircraft to the ground.

  15. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  16. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  17. Miniaturized Autonomous Extravehicular Robotic Camera (Mini AERCam)

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2001-01-01

    The NASA Johnson Space Center (JSC) Engineering Directorate is developing the Autonomous Extravehicular Robotic Camera (AERCam), a low-volume, low-mass free-flying camera system. AERCam project team personnel recently initiated development of a miniaturized version of AERCam known as Mini AERCam. The Mini AERCam target design is a spherical "nanosatellite" free-flyer 7.5 inches in diameter and weighing 10 pounds. Mini AERCam builds on the success of the AERCam Sprint STS-87 flight experiment by adding new on-board sensing and processing capabilities while simultaneously reducing volume by 80%. Achieving enhanced capability in a smaller package depends on applying miniaturization technology across virtually all subsystems. Technology innovations being incorporated include micro-electromechanical system (MEMS) gyros, "camera-on-a-chip" CMOS imagers, a rechargeable xenon gas propulsion system, a rechargeable lithium ion battery, custom avionics based on the PowerPC 740 microprocessor, GPS relative navigation, digital radio frequency communications and tracking, micropatch antennas, digital instrumentation, and dense mechanical packaging. The Mini AERCam free-flyer will initially be integrated into an approximately flight-like configuration for demonstration on an air-bearing table. A pilot-in-the-loop and hardware-in-the-loop simulation of on-orbit navigation and dynamics will complement the air-bearing table demonstration. The Mini AERCam lab demonstration is intended to form the basis for future development of an AERCam flight system that provides beneficial on-orbit views unobtainable from fixed cameras, cameras on robotic manipulators, or cameras carried by EVA crewmembers.

  18. Determination of the spatial structure of vegetation on the repository of the mine "Fryderyk" in Tarnowskie Góry, based on airborne laser scanning from the ISOK project and digital orthophotomaps

    NASA Astrophysics Data System (ADS)

    Szostak, Marta; Wężyk, Piotr; Pająk, Marek; Haryło, Paweł; Lisańczuk, Marek

    2015-06-01

    The purpose of this study was to determine the spatial structure of vegetation on the repository of the "Fryderyk" mine in Tarnowskie Góry. The test area was located in the Upper Silesian Industrial Region, a large industrial region in Poland, and is a unique refuge habitat (Natura 2000 site PLH240008). The main aim of this work was to investigate the possible use of geotechniques and generally available geodata for mapping LULC changes and determining the spatial structure of vegetation. The presented study focuses on the analysis of the spatial structure of vegetation in the research area. This exploration was based on aerial images and orthophotomaps from 1947, 1998, 2003, 2009 and 2011, and on airborne laser scanning data (2011, ISOK project). Forest succession changes which occurred between 1947 and 2011 were analysed, and selected features of the vegetation overgrowing the "Fryderyk" spoil heap were determined. The results demonstrated a gradual succession of greenery on the spoil heap. In 1947, 84% of the area was covered by low vegetation. Tree expansion proceeded in westerly and north-westerly directions, and by 2011 this canopy layer covered almost 50% of the research area. Parameters such as vegetation height, crown length and cover density were calculated from the airborne laser scanning data. These analyses indicated significant diversity in the vertical and horizontal structure of the vegetation. The study demonstrates the capacity of airborne laser scanning for objective evaluation of vegetation structure.

  19. Integrating TV/digital data spectrograph system

    NASA Technical Reports Server (NTRS)

    Duncan, B. J.; Fay, T. D.; Miller, E. R.; Wamsteker, W.; Brown, R. M.; Neely, P. L.

    1975-01-01

    A 25-mm vidicon camera was previously modified to allow operation in an integration mode for low-light-level astronomical work. The camera was then mated to a low-dispersion spectrograph for obtaining spectral information in the 400 to 750 nm range. A high speed digital video image system was utilized to digitize the analog video signal, place the information directly into computer-type memory, and record data on digital magnetic tape for permanent storage and subsequent analysis.

  20. Novel computer-based endoscopic camera

    NASA Astrophysics Data System (ADS)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization, by reducing over-exposed glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and the patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to an external host medium via network. The patient data included with every image describes essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for a stored image by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be displayed over the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  1. Measurement of the nonuniformity of first responder thermal imaging cameras

    NASA Astrophysics Data System (ADS)

    Lock, Andrew; Amon, Francine

    2008-04-01

    Police, firefighters, and emergency medical personnel are examples of first responders who use thermal imaging cameras in a very practical way every day. However, few performance metrics have been developed to assist first responders in evaluating the performance of thermal imaging technology. This paper describes one possible metric for evaluating the nonuniformity of thermal imaging cameras. Several commercially available uncooled focal plane array cameras were examined. Because of proprietary issues, each camera was considered a 'black box'. In these experiments, an extended-area blackbody (18 cm square) was placed very close to the objective lens of the thermal imaging camera. The resultant video output from the camera was digitized at a resolution of 640x480 pixels and a grayscale depth of 10 bits. The nonuniformity was calculated as the standard deviation of the digitized image pixel intensities divided by the mean of those pixel intensities. This procedure was repeated for each camera at several blackbody temperatures in the range from 30 °C to 260 °C. It was observed that the nonuniformity initially increases with temperature, then asymptotically approaches a maximum value. Nonuniformity is also applied to the calculation of the spatial frequency response, as well as providing a noise floor. The testing procedures described herein are being developed as part of a suite of tests to be incorporated into a performance standard covering thermal imaging cameras for first responders.
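    The metric described above is simply the coefficient of variation of a flat-field frame; a minimal sketch (the simulated 10-bit frame and its noise level are hypothetical):

    ```python
    import numpy as np

    def nonuniformity(frame):
        """Nonuniformity metric as defined in the paper: standard deviation of
        the digitized pixel intensities divided by their mean, computed on a
        frame imaging a uniform extended-area blackbody."""
        frame = np.asarray(frame, dtype=float)
        return frame.std() / frame.mean()

    # Simulated 10-bit 640x480 frame of a uniform blackbody with mild
    # pixel-to-pixel response variation (values clipped to the 10-bit range)
    rng = np.random.default_rng(0)
    frame = np.clip(rng.normal(512, 8, size=(480, 640)), 0, 1023)
    nu = nonuniformity(frame)   # roughly 8/512, i.e. about 1.6%
    ```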

  2. Airborne and Ground-Based Platforms for Data Collection in Small Vineyards: Examples from the UK and Switzerland

    NASA Astrophysics Data System (ADS)

    Green, David R.; Gómez, Cristina; Fahrentrapp, Johannes

    2015-04-01

    This paper presents an overview of some of the low-cost ground and airborne platforms and technologies now becoming available for data collection in small-area vineyards. Low-cost UAV or UAS platforms and cameras are now widely available as a means to collect both vertical and oblique aerial still photography and airborne videography in vineyards. Examples of small aerial platforms include the AR Parrot Drone, the DJI Phantom (1 and 2), and the 3D Robotics IRIS+. Both fixed-wing and rotary-wing platforms offer numerous advantages for aerial image acquisition, including the freedom to obtain high resolution imagery at any time required. Imagery captured can be stored on mobile devices such as an Apple iPad and shared, written directly to a memory stick or card, or saved to the Cloud. The imagery can either be visually interpreted or subjected to semi-automated analysis using digital image processing (DIP) software to extract information about vine status or the vineyard environment. At the ground level, a radio-controlled 'rugged' model 4x4 vehicle can also be used as a mobile platform to carry a number of sensors (e.g. a Go-Pro camera) around a vineyard, thereby facilitating quick and easy field data collection from both within the vine canopy and rows. For the small vineyard owner/manager with limited financial resources, this technology has a number of distinct advantages as an aid to vineyard management practices: it is relatively cheap to purchase; it has a short learning curve; it can make use of autonomous ground control units for repetitive coverage, enabling reliable monitoring; and information can easily be analysed and integrated within a GIS with minimal expertise. In addition, these platforms make widespread use of familiar, everyday, off-the-shelf technologies such as WiFi, Go-Pro cameras, Cloud computing, and smartphones or tablets as the control interface, all with a large and well-established end-user support base.
Whilst there are

  3. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  4. Gamma camera purchasing.

    PubMed

    Wells, C P; Buxton-Thomas, M

    1995-03-01

    The purchase of a new gamma camera is a major undertaking and represents a long-term commitment for most nuclear medicine departments. The purpose of tendering for gamma cameras is to assess the best match between the requirements of the clinical department and the equipment available and not necessarily to buy the 'best camera' [1-3]. After many years of drawing up tender specifications, this paper tries to outline some of the traps and pitfalls of this potentially perilous, although largely rewarding, exercise. PMID:7770241

  5. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  6. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and the isoplanatic patch of the atmosphere.

  7. Airborne thermography for condition monitoring of a public baths building

    NASA Astrophysics Data System (ADS)

    Mattsson, Mats; Hellman, Erik; Ljungberg, Sven-Ake

    2001-03-01

    Airborne and ground-based thermography surveys have been performed in order to detect moisture and energy related problems in the construction of a public swimming bath building. This paper describes the information potential and the advantages and limitations using a standard IR-camera and traditional inspection methods to gather information for retrofit priorities. The damage conditions indicated in the thermal images are confirmed by field inspections and photographic documentation.

  8. Generalized phase-shifting color digital holography

    NASA Astrophysics Data System (ADS)

    Nomura, Takanori; Kawakami, Takaaki; Shinomura, Kazuma

    2016-06-01

    Two methods to apply the generalized phase-shifting digital holography to color digital holography are proposed. One is wave-splitting generalized phase-shifting color digital holography. This is realized by using a color Bayer camera. Another is multiple exposure generalized phase-shifting color digital holography. This is realized by the wavelength-dependent phase-shifting devices. Experimental results for both generalized phase-shifting color digital holography are presented to confirm the proposed methods.

  9. Airborne data acquisition techniques

    SciTech Connect

    Arro, A.A.

    1980-01-01

    The introduction of standards on acceptable procedures for assessing building heat loss has created a dilemma for the contractor performing airborne thermographic surveys. These standards impose specifications on instrumentation, data acquisition, recording, interpretation, and presentation. Under the standard, the contractor has both the obligation of compliance and the requirement of offering his services at a reasonable price. This paper discusses the various aspects of data acquisition for airborne thermographic surveys and various techniques to reduce the costs of this operation. These techniques include the calculation of flight parameters for economical data acquisition, the selection and use of maps for mission planning, and the use of meteorological forecasts for flight scheduling and the actual execution of the mission. The proper consideration of these factors will result in a cost effective data acquisition and will place the contractor in a very competitive position in offering airborne thermographic survey services.

  10. Computer vision camera with embedded FPGA processing

    NASA Astrophysics Data System (ADS)

    Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel

    2000-03-01

Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a Computer Vision Camera based on an open architecture implemented in an FPGA. The system is targeted to real-time computer vision tasks where low-level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA device is a medium-size part equivalent to 25,000 logic gates. The device is connected to two high-speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a Hardware Description Language (such as VHDL), simulated, and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
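As a software point of reference, the multi-scale Laplacian-of-Gaussian edge detection that this system maps into FPGA hardware can be sketched in NumPy. This is a minimal illustration under our own assumptions (kernel radius, scale set, and zero-crossing rule), not a description of the camera's actual gateware:

```python
import numpy as np

def gaussian_kernel1d(sigma):
    """Normalized 1-D Gaussian kernel truncated at 3 sigma."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(image, sigma):
    """Separable Gaussian blur: filter rows, then columns."""
    k = gaussian_kernel1d(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def log_edges(image, sigmas=(1.0, 2.0)):
    """Multi-scale LoG edge map: blur, apply a discrete Laplacian,
    and mark zero crossings of the response at each scale."""
    edges = np.zeros(image.shape, dtype=bool)
    for sigma in sigmas:
        g = blur(image.astype(float), sigma)
        lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0) +
               np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4 * g)
        sign = lap > 0
        zc = np.zeros_like(edges)
        # A zero crossing occurs where the response sign flips between
        # horizontally or vertically adjacent pixels.
        zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]
        zc[:-1, :] |= sign[:-1, :] != sign[1:, :]
        edges |= zc
    return edges
```

A vertical step edge in a test image produces zero crossings near the step, which is the behavior an FPGA pipeline of this kind would reproduce with fixed-point convolution kernels.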

  11. Managing a large database of camera fingerprints

    NASA Astrophysics Data System (ADS)

    Goljan, Miroslav; Fridrich, Jessica; Filler, Tomáš

    2010-01-01

    Sensor fingerprint is a unique noise-like pattern caused by slightly varying pixel dimensions and inhomogeneity of the silicon wafer from which the sensor is made. The fingerprint can be used to prove that an image came from a specific digital camera. The presence of a camera fingerprint in an image is usually established using a detector that evaluates cross-correlation between the fingerprint and image noise. The complexity of the detector is thus proportional to the number of pixels in the image. Although computing the detector statistic for a few megapixel image takes several seconds on a single-processor PC, the processing time becomes impractically large if a sizeable database of camera fingerprints needs to be searched through. In this paper, we present a fast searching algorithm that utilizes special "fingerprint digests" and sparse data structures to address several tasks that forensic analysts will find useful when deploying camera identification from fingerprints in practice. In particular, we develop fast algorithms for finding if a given fingerprint already resides in the database and for determining whether a given image was taken by a camera whose fingerprint is in the database.
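The cross-correlation detector described above can be sketched as follows. This is a hedged illustration: the function names and threshold are ours, and the paper's "fingerprint digest" and sparse-data-structure speedups are not shown:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_camera(noise_residual, fingerprint_db, threshold=0.01):
    """Return the camera ID whose fingerprint best correlates with the
    image noise residual, or None if no correlation exceeds the
    detection threshold. Complexity is linear in pixels per fingerprint,
    which is why the paper introduces digests for large databases."""
    best_id, best_rho = None, threshold
    for cam_id, fp in fingerprint_db.items():
        rho = ncc(noise_residual, fp)
        if rho > best_rho:
            best_id, best_rho = cam_id, rho
    return best_id
```

A residual containing a known fingerprint plus noise correlates strongly with that fingerprint and only weakly with others, which is the basis of the detection decision.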

  12. Study on airborne multispectral imaging fusion detection technology

    NASA Astrophysics Data System (ADS)

    Ding, Na; Gao, Jiaobo; Wang, Jun; Cheng, Juan; Gao, Meng; Gao, Fei; Fan, Zhe; Sun, Kefeng; Wu, Jun; Li, Junna; Gao, Zedong; Cheng, Gang

    2014-11-01

The airborne multispectral imaging fusion detection technology is proposed in this paper. In this design scheme, the airborne multispectral imaging system consists of the multispectral camera, the image processing unit, and the stabilized platform. The multispectral camera operates in the spectral region from the visible to the near infrared (0.4-1.0 um); it has four identical, independent imaging channels and sixteen different typical wavelengths that can be selected according to the targets and backgrounds of interest. Experiments were conducted with the airborne multispectral imaging system. In particular, camouflaged targets were fused and detected in different complex environments, such as land vegetation backgrounds, hot desert backgrounds, and underwater. In the spectral region from 0.4 um to 1.0 um, three characteristic wavelengths are selected from the sixteen typical spectral bands and combined according to the background and targets. The spectral images corresponding to the three characteristic wavelengths are registered and fused in real time by the image processing unit, and a fusion video carrying the typical target's properties is output. In these fusion images, the contrast between target and background is greatly increased. Experimental results confirm that the airborne multispectral imaging fusion detection technology can acquire multispectral fusion images with high contrast in real time, and that it can detect and identify camouflaged objects against complex backgrounds and underwater.
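A minimal sketch of the final step, mapping three registered spectral bands into a false-color composite, is shown below. The per-band contrast stretch is our illustrative assumption, not the system's actual fusion algorithm:

```python
import numpy as np

def false_color_fuse(band_a, band_b, band_c):
    """Map three co-registered spectral bands to an RGB composite,
    stretching each band to its full range to maximize contrast
    between target and background."""
    def stretch(b):
        b = b.astype(float)
        lo, hi = b.min(), b.max()
        return (b - lo) / (hi - lo + 1e-12)  # avoid divide-by-zero
    return np.dstack([stretch(band_a), stretch(band_b), stretch(band_c)])
```

A target that reflects strongly in only one of the three chosen wavelengths then appears in a distinct color against the background in the composite.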

  13. Light microscopy digital imaging.

    PubMed

    Joubert, James; Sharma, Deepak

    2011-10-01

    This unit presents an overview of digital imaging hardware used in light microscopy. CMOS, CCD, and EMCCDs are the primary sensors used. The strengths and weaknesses of each define the primary applications for these sensors. Sensor architecture and formats are also reviewed. Color camera design strategies and sensor window cleaning are also described in the unit.

  14. The Digital Divide

    ERIC Educational Resources Information Center

    Hudson, Hannah Trierweiler

    2011-01-01

    Megan is a 14-year-old from Nebraska who just started ninth grade. She has her own digital camera, cell phone, Nintendo DS, and laptop, and one or more of these devices is usually by her side. Compared to the interactions and exploration she's engaged in at home, Megan finds the technology in her classroom falls a little flat. Most of the…

  15. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

Two charge coupled device (CCD) camera systems are introduced, with brief descriptions of the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development continues on advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  16. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  17. The Complementary Pinhole Camera.

    ERIC Educational Resources Information Center

    Bissonnette, D.; And Others

    1991-01-01

    Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

  18. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  19. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Specifications and preliminary design of an Airborne Oceanographic Lidar (AOL) system, which is to be constructed for installation and used on a NASA Wallops Flight Center (WFC) C-54 research aircraft, are reported. The AOL system is to provide an airborne facility for use by various government agencies to demonstrate the utility and practicality of hardware of this type in the wide area collection of oceanographic data on an operational basis. System measurement and performance requirements are presented, followed by a description of the conceptual system approach and the considerations attendant to its development. System performance calculations are addressed, and the system specifications and preliminary design are presented and discussed.

  20. Airborne rain mapping radar

    NASA Technical Reports Server (NTRS)

    Wilson, W. J.; Parks, G. S.; Li, F. K.; Im, K. E.; Howard, R. J.

    1988-01-01

An airborne scanning radar system for remote rain mapping is described. The airborne rain mapping radar is composed of two radar frequency channels at 13.8 and 24.1 GHz. The radar is proposed to scan its antenna beam over ±20° from the antenna boresight, giving a swath width of 7 km, a horizontal spatial resolution at nadir of about 500 m, and a range resolution of 120 m. The radar is designed to be applicable for retrieving rainfall rates from 0.1-60 mm/hr at the earth's surface, and for measuring linear polarization signatures and raindrops' fall velocities.
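The abstract does not give the retrieval algorithm. As a generic illustration of converting radar reflectivity to rain rate, the textbook Marshall-Palmer Z-R relation can be inverted as follows; the coefficients a and b are standard illustrative values, not this radar's, and a dual-frequency instrument like the one described would typically exploit differential attenuation as well:

```python
import math

def rain_rate_from_reflectivity(z_dbz, a=200.0, b=1.6):
    """Invert the empirical Z-R relation Z = a * R**b.

    z_dbz: reflectivity factor in dBZ
    returns: rain rate R in mm/h
    """
    z_linear = 10.0 ** (z_dbz / 10.0)   # dBZ -> linear units (mm^6 / m^3)
    return (z_linear / a) ** (1.0 / b)
```

For example, a reflectivity of 10*log10(200) ≈ 23 dBZ corresponds to a rain rate of 1 mm/h under these coefficients.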

  1. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

An Anger gamma ray camera is improved by the substitution of a gamma ray sensitive, proximity type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded flat output phosphor display screen, all of the same dimension (unity image magnification) and all within a grounded metallic tube envelope, and having a metallic, inwardly concaved input window between the scintillator screen and the collimator.

  2. NASA Airborne Lidar July 1991

    Atmospheric Science Data Center

    2016-05-26

Data from the 1991 NASA Langley Airborne Lidar flights following the eruption of Pinatubo in July … and Osborn [1992a, 1992b]. Project Title: NASA Airborne Lidar. Discipline: Field Campaigns.

  3. NASA Airborne Lidar May 1992

    Atmospheric Science Data Center

    2016-05-26

An airborne Nd:YAG (532 nm) lidar was operated by the NASA Langley Research Center about a year following the June 1991 eruption of … Osborn [1992a, 1992b]. Project Title: NASA Airborne Lidar. Discipline: Field Campaigns.

  4. HST High Gain Antennae photographed by Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This view of one of the two High Gain Antennae (HGA) on the Hubble Space Telescope (HST) was photographed with an Electronic Still Camera (ESC). The scene was downlinked to ground controllers soon after the Shuttle Endeavour caught up to the orbiting telescope. Electronic still photography is a technology which provides the means for a handheld camera to electronically capture and digitize an image with resolution approaching film quality.

  5. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  7. Digital Photography and Its Impact on Instruction.

    ERIC Educational Resources Information Center

    Lantz, Chris

    Today the chemical processing of film is being replaced by a virtual digital darkroom. Digital image storage makes new levels of consistency possible because its nature is less volatile and more mutable than traditional photography. The potential of digital imaging is great, but issues of disk storage, computer speed, camera sensor resolution,…

  8. Application of Optical Measurement Techniques During Stages of Pregnancy: Use of Phantom High Speed Cameras for Digital Image Correlation (D.I.C.) During Baby Kicking and Abdomen Movements

    NASA Technical Reports Server (NTRS)

    Gradl, Paul

    2016-01-01

Paired images were collected using a projected pattern instead of the standard painted speckle pattern on the abdomen. High-speed cameras were post-triggered after movements were felt. Data were collected at 120 fps, limited by the 60 Hz refresh rate of the projector. To ensure that the kick and movement data were real, a background test was conducted with no baby movement (to correct for breathing and body motion).

  9. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator.
A low-volume array of such penetrator cameras could be deployed from an

  10. Scientific Objectives of Small Carry-on Impactor (SCI) and Deployable Camera 3 Digital (DCAM3-D): Observation of an Ejecta Curtain and a Crater Formed on the Surface of Ryugu by an Artificial High-Velocity Impact

    NASA Astrophysics Data System (ADS)

    Arakawa, M.; Wada, K.; Saiki, T.; Kadono, T.; Takagi, Y.; Shirai, K.; Okamoto, C.; Yano, H.; Hayakawa, M.; Nakazawa, S.; Hirata, N.; Kobayashi, M.; Michel, P.; Jutzi, M.; Imamura, H.; Ogawa, K.; Sakatani, N.; Iijima, Y.; Honda, R.; Ishibashi, K.; Hayakawa, H.; Sawada, H.

    2016-10-01

The Small Carry-on Impactor (SCI) equipped on Hayabusa2 was developed to produce an artificial impact crater on the primitive Near-Earth Asteroid (NEA) 162173 Ryugu (Ryugu) in order to explore asteroid subsurface material unaffected by space weathering and thermal alteration by solar radiation. A fresh surface exposed by the impactor and/or the ejecta deposit excavated from the crater will be observed by remote sensing instruments, and a fresh subsurface sample of the asteroid will be collected there. The SCI impact experiment will be observed by a Deployable CAMera 3-D (DCAM3-D) at a distance of ˜1 km from the impact point, and the time evolution of the ejecta curtain will be observed by this camera to confirm the impact point on the asteroid surface. From the observation of the ejecta curtain by DCAM3-D and of the crater morphology by onboard cameras, the subsurface structure and the physical properties of the constituent materials will be derived using crater scaling laws. Moreover, the SCI experiment on Ryugu gives us a precious opportunity to clarify effects of microgravity on the cratering process and to validate numerical simulations and models of the cratering process.

  11. NV-CMOS HD camera for day/night imaging

    NASA Astrophysics Data System (ADS)

    Vogelsong, T.; Tower, J.; Sudol, Thomas; Senko, T.; Chodelka, D.

    2014-06-01

SRI International (SRI) has developed a new multi-purpose day/night video camera with low-light imaging performance comparable to an image intensifier, while offering the size, weight, ruggedness, and cost advantages enabled by the use of SRI's NV-CMOS HD digital image sensor chip. The digital video output is ideal for image enhancement, sharing with others through networking, video capture for data analysis, or fusion with thermal cameras. The camera provides Camera Link output with HD/WUXGA resolution of 1920 x 1200 pixels operating at 60 Hz. Windowing to smaller sizes enables operation at higher frame rates. High sensitivity is achieved through use of backside illumination, providing high Quantum Efficiency (QE) across the visible and near infrared (NIR) bands (peak QE >90%), as well as projected low-noise (<2 e-) readout. Power consumption is minimized in the camera, which operates from a single 5V supply. The NV-CMOS HD camera provides a substantial reduction in size, weight, and power (SWaP), making it ideal for SWaP-constrained day/night imaging platforms such as UAVs, ground vehicles, and fixed-mount surveillance, and it may be reconfigured for mobile soldier operations such as night vision goggles and weapon sights. In addition, the camera with the NV-CMOS HD imager is suitable for high-performance digital cinematography/broadcast systems, biofluorescence/microscopy imaging, day/night security and surveillance, and other high-end applications which require HD video imaging with high sensitivity and wide dynamic range. The camera comes with an array of lens mounts including C-mount and F-mount. The latest test data from the NV-CMOS HD camera will be presented.

  12. Airborne Fraunhofer Line Discriminator

    NASA Technical Reports Server (NTRS)

    Gabriel, F. C.; Markle, D. A.

    1969-01-01

The Airborne Fraunhofer Line Discriminator enables prospecting for fluorescent materials, hydrography with fluorescent dyes, and plant studies based on the fluorescence of chlorophyll. The optical unit design exploits the coincidence of Fraunhofer lines in the solar spectrum with the characteristic wavelengths of some fluorescent materials.

  13. Recognizing Airborne Hazards.

    ERIC Educational Resources Information Center

    Schneider, Christian M.

    1990-01-01

    The heating, ventilating, and air conditioning (HVAC) systems in older buildings often do not adequately handle air-borne contaminants. Outlines a three-stage Indoor Air Quality (IAQ) assessment and describes a case in point at a Pittsburgh, Pennsylvania, school. (MLF)

  14. Airborne asbestos in buildings.

    PubMed

    Lee, R J; Van Orden, D R

    2008-03-01

The concentration of airborne asbestos in buildings nationwide is reported in this study. A total of 3978 indoor samples from 752 buildings, representing nearly 32 man-years of sampling, have been analyzed by transmission electron microscopy. The buildings that were surveyed were the subject of litigation related to suits alleging that the general building occupants were exposed to a potential health hazard as a result of the presence of asbestos-containing materials (ACM). The average concentration of all airborne asbestos structures was 0.01 structures/ml (s/ml), and the average concentration of airborne asbestos structures ≥5 µm long was 0.00012 fibers/ml (f/ml). For fibers longer than 5 µm, 99.9% of the samples were <0.01 f/ml, and no building averaged above 0.004 f/ml. No asbestos was detected in 27% of the buildings, and in 90% of the buildings no asbestos was detected that would have been seen optically (≥5 µm long and ≥0.25 µm wide). Background outdoor concentrations have been reported at 0.0003 f/ml for fibers ≥5 µm. These results indicate that in-place ACM does not produce airborne asbestos concentrations in building atmospheres approaching regulatory levels, and that it does not result in a significantly increased risk to building occupants.

  15. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to lead to a color and exposure level for that pixel. To speed processing three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  16. Identification and extraction of the seaward edge of terrestrial vegetation using digital aerial photography

    USGS Publications Warehouse

    Harris, Melanie; Brock, John C.; Nayegandhi, A.; Duffy, M.; Wright, C.W.

    2006-01-01

    This report is created as part of the Aerial Data Collection and Creation of Products for Park Vital Signs Monitoring within the Northeast Region Coastal and Barrier Network project, which is a joint project between the National Park Service Inventory and Monitoring Program (NPS-IM), the National Aeronautics and Space Administration (NASA) Observational Sciences Branch, and the U.S. Geological Survey (USGS) Center for Coastal and Watershed Studies (CCWS). This report is one of a series that discusses methods for extracting topographic features from aerial survey data. It details step-by-step methods used to extract a spatially referenced digital line from aerial photography that represents the seaward edge of terrestrial vegetation along the coast of Assateague Island National Seashore (ASIS). One component of the NPS-IM/USGS/NASA project includes the collection of NASA aerial surveys over various NPS barrier islands and coastal parks throughout the National Park Service's Northeast Region. These aerial surveys consist of collecting optical remote sensing data from a variety of sensors, including the NASA Airborne Topographic Mapper (ATM), the NASA Experimental Advanced Airborne Research Lidar (EAARL), and down-looking digital mapping cameras.

  17. International Symposium on Airborne Geophysics

    NASA Astrophysics Data System (ADS)

    Mogi, Toru; Ito, Hisatoshi; Kaieda, Hideshi; Kusunoki, Kenichiro; Saltus, Richard W.; Fitterman, David V.; Okuma, Shigeo; Nakatsuka, Tadashi

    2006-05-01

    Airborne geophysics can be defined as the measurement of Earth properties from sensors in the sky. The airborne measurement platform is usually a traditional fixed-wing airplane or helicopter, but could also include lighter-than-air craft, unmanned drones, or other specialty craft. The earliest history of airborne geophysics includes kite and hot-air balloon experiments. However, modern airborne geophysics dates from the mid-1940s when military submarine-hunting magnetometers were first used to map variations in the Earth's magnetic field. The current gamut of airborne geophysical techniques spans a broad range, including potential fields (both gravity and magnetics), electromagnetics (EM), radiometrics, spectral imaging, and thermal imaging.

  18. Specific Analysis of Web Camera and High Resolution Planetary Imaging

    NASA Astrophysics Data System (ADS)

    Park, Youngsik; Lee, Dongju; Jin, Ho; Han, Wonyong; Park, Jang-Hyun

    2006-12-01

Web cameras are usually used for video communication between PCs; they have small sensing areas and cannot take long exposures, which makes them insufficient for most astronomical applications. However, a web camera is suitable for bright targets such as planets and the Moon, which do not require long exposure times, so many amateur astronomers use web cameras for planetary imaging. We used a Philips ToUcam for planetary imaging and the commercial program Registax for combining video frames. We then measured properties of the web camera, such as linearity and gain, that are usually used to analyze CCD performance. Because the technique combines only high-quality frames selected from the video, it can produce higher resolution planetary images than a single shot on film, a digital camera, or a CCD. We describe a planetary observing method and a video frame combination method.
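The select-and-stack technique described here (often called lucky imaging) can be sketched as follows. The sharpness metric and keep fraction are our illustrative choices, and frame registration, which Registax also performs before stacking, is omitted:

```python
import numpy as np

def sharpness(frame):
    """Simple sharpness metric: variance of a discrete Laplacian.
    Blurred frames score low; crisp frames score high."""
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
           np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4 * frame)
    return float(lap.var())

def stack_best(frames, keep_fraction=0.1):
    """Average the sharpest fraction of video frames to beat the
    seeing limit of any single exposure."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    ranked = sorted(frames, key=sharpness, reverse=True)
    n = max(1, int(len(ranked) * keep_fraction))
    return np.mean(ranked[:n], axis=0)
```

Averaging the selected frames raises the signal-to-noise ratio while discarding the frames most degraded by atmospheric seeing, which is why the combined result outresolves any single exposure.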

  19. Positron emission particle tracking using the new Birmingham positron camera

    NASA Astrophysics Data System (ADS)

    Parker, D. J.; Forster, R. N.; Fowles, P.; Takhar, P. S.

    2002-01-01

    Since 1985 a positron camera consisting of a pair of multi-wire proportional chambers has been used at Birmingham for engineering studies involving positron emitting radioactive tracers. The technique of positron emission particle tracking (PEPT), developed at Birmingham, whereby a single tracer particle can be tracked at high speed, has proved particularly powerful. The main limitation of the original positron camera was its low sensitivity and correspondingly low data rate. A new positron camera has recently been installed; it consists of a pair of NaI (Tl) gamma camera heads with fully digital readout and offers an enormous improvement in data rate and data quality. The performance of this camera, and in particular the improved capabilities it brings to the PEPT technique, are summarised.
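A single PEPT location step can be illustrated as a least-squares intersection of the detected gamma-ray lines of response (LORs). This is a minimal sketch under our own formulation; the Birmingham algorithm additionally iterates, discarding corrupted events (scattered or random coincidences) before re-fitting:

```python
import numpy as np

def locate_particle(points, directions):
    """Least-squares position closest to a set of lines of response.

    points:     (N, 3) array, one point on each detected gamma-ray line
    directions: (N, 3) array of unit vectors along each line

    Minimizes the sum of squared perpendicular distances to the lines by
    solving the 3x3 normal equations A x = b.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        # Projector onto the plane perpendicular to the line direction:
        # distance to the line is ||P (x - p)||.
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

Because each event contributes only a rank-2 projector, at least two non-parallel LORs are needed for a unique solution; with the new camera's much higher data rate, many events per time slice make this fit well conditioned.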

  20. Photoreactivation in Airborne Mycobacterium parafortuitum

    PubMed Central

    Peccia, Jordan; Hernandez, Mark

    2001-01-01

    Photoreactivation was observed in airborne Mycobacterium parafortuitum exposed concurrently to UV radiation (254 nm) and visible light. Photoreactivation rates of airborne cells increased with increasing relative humidity (RH) and decreased with increasing UV dose. Under a constant UV dose with visible light absent, the UV inactivation rate of airborne M. parafortuitum cells decreased by a factor of 4 as RH increased from 40 to 95%; however, under identical conditions with visible light present, the UV inactivation rate of airborne cells decreased only by a factor of 2. When irradiated in the absence of visible light, cellular cyclobutane thymine dimer content of UV-irradiated airborne M. parafortuitum and Serratia marcescens increased in response to RH increases. Results suggest that, unlike in waterborne bacteria, cyclobutane thymine dimers are not the most significant form of UV-induced DNA damage incurred by airborne bacteria and that the distribution of DNA photoproducts incorporated into UV-irradiated airborne cells is a function of RH. PMID:11526027

  1. THE DARK ENERGY CAMERA

    SciTech Connect

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J.; Honscheid, K.; Abbott, T. M. C.; Bonati, M.; Antonik, M.; Brooks, D.; Ballester, O.; Cardiel-Sas, L.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Boprie, D.; Campa, J.; Castander, F. J.; Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
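
    As a quick sanity check, the quoted pixel pitch and plate scale imply an effective focal length consistent with a prime-focus corrector on a 4 m telescope; the sketch below derives it with the small-angle relation (the derived focal length and focal ratio are our inference, not figures from the abstract):

```python
import math

# Quoted DECam numbers from the abstract.
pixel_size_m = 15e-6          # 15 um pixels
plate_scale_arcsec = 0.263    # arcsec per pixel

ARCSEC_PER_RAD = 180 / math.pi * 3600
plate_scale_rad = plate_scale_arcsec / ARCSEC_PER_RAD

# Small-angle relation: pixel_size = focal_length * plate_scale (radians).
focal_length_m = pixel_size_m / plate_scale_rad
print(f"effective focal length ~ {focal_length_m:.2f} m")  # ~ 11.76 m

# Implied focal ratio at the Blanco 4 m prime focus.
print(f"focal ratio ~ f/{focal_length_m / 4.0:.2f}")       # ~ f/2.94
```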

  2. The CAMCAO infrared camera

    NASA Astrophysics Data System (ADS)

    Amorim, Antonio; Melo, Antonio; Alves, Joao; Rebordao, Jose; Pinhao, Jose; Bonfait, Gregoire; Lima, Jorge; Barros, Rui; Fernandes, Rui; Catarino, Isabel; Carvalho, Marta; Marques, Rui; Poncet, Jean-Marc; Duarte Santos, Filipe; Finger, Gert; Hubin, Norbert; Huster, Gotthard; Koch, Franz; Lizon, Jean-Louis; Marchetti, Enrico

    2004-09-01

    The CAMCAO instrument is a high-resolution near-infrared (NIR) camera conceived to operate together with the new ESO Multi-Conjugate Adaptive Optics Demonstrator (MAD), with the goal of evaluating the feasibility of Multi-Conjugate Adaptive Optics (MCAO) techniques on the sky. It is a high-resolution, wide field of view (FoV) camera optimized to use the extended correction of atmospheric turbulence provided by MCAO. While the first purpose of this camera is on-sky observation in the MAD setup, to validate the MCAO technology, in a second phase the CAMCAO camera is planned to attach directly to the VLT for scientific astrophysical studies. The camera is based on the 2k x 2k HAWAII2 infrared detector controlled by an ESO external IRACE system and includes standard IR band filters mounted on a positional filter wheel. The CAMCAO design requires that the optical components and the IR detector be kept at low temperatures in order to avoid thermal emission and to lower detector noise in the spectral region under analysis. The cryogenic system includes an LN2 tank and a specially developed pulse tube cryocooler. Field and pupil cold stops are implemented to reduce the infrared background and stray light. The CAMCAO optics provide diffraction-limited performance down to the J band, while the detector sampling fulfills the Nyquist criterion for the K band (2.2 μm).

  3. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  4. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  5. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  6. A Coordinated Ice-based and Airborne Snow and Ice Thickness Measurement Campaign on Arctic Sea Ice

    NASA Astrophysics Data System (ADS)

    Richter-Menge, J.; Farrell, S.; Elder, B. C.; Gardner, J. M.; Brozena, J. M.

    2011-12-01

    A rare opportunity presented itself in March 2011 when the Naval Research Laboratory (NRL) and NASA IceBridge teamed with scientists from the U.S. Army Corps of Engineers Cold Regions Research and Engineering Laboratory (CRREL) to coordinate a multi-scale approach to mapping snow depth and sea ice thickness distribution in the Arctic. Ground-truth information for calibration/validation of airborne and CryoSat-2 satellite data was collected near a manned camp deployed in support of the US Navy's Ice Expedition 2011 (ICEX 2011). The ice camp was established at a location approximately 230 km north of Prudhoe Bay, Alaska, at the edge of the perennial ice zone. The suite of measurements was strategically organized around a 9-km-long survey line that covered a wide range of ice types, including refrozen leads, deformed and undeformed first year ice, and multiyear ice. A highly concentrated set of in situ measurements of snow depth and ice thickness was taken along the survey line. Once the survey line was in place, NASA IceBridge flew a dedicated mission along the survey line, collecting data with an instrument suite that included the Airborne Topographic Mapper (ATM), a high precision, airborne scanning laser altimeter; the Digital Mapping System (DMS), a nadir-viewing digital camera; and the University of Kansas ultra-wideband Frequency Modulated Continuous Wave (FMCW) snow radar. NRL also flew a dedicated mission over the survey line with complementary airborne radar, laser and photogrammetric sensors (see Brozena et al., this session). These measurements were further leveraged by a series of CryoSat-2 under flights made in the region by the instrumented NRL and NASA planes, as well as US Navy submarine underpasses of the 9-km-long survey line to collect ice draft measurements. This comprehensive suite of data provides the full spectrum of sampling resolutions from satellite, to airborne, to ground-based, to submarine and will allow for a careful determination of

  7. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  8. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. The camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, and an image recording surface. The combination of the rotating mirror and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that a single-sided rotating mirror would impart, making a camera with such a short resolution time possible.

  9. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.
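
    The temperature-to-band association the paper relies on follows from standard blackbody physics rather than anything camera-specific; a minimal sketch using Planck's law and Wien's displacement law (the example temperatures for fire and smoke are illustrative, not taken from the paper):

```python
import math

# Physical constants (standard values, not from the paper).
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * KB * temp_k)) - 1)

def wien_peak_um(temp_k):
    """Wavelength of peak thermal emission (Wien's displacement law), in um."""
    return 2898.0 / temp_k

# A flame (~1400 K) peaks in the SWIR; warm smoke (~320 K) peaks in the LWIR,
# which is why a selective-imaging camera needs different FEM bands per source.
for t in (1400, 320):
    peak_um = wien_peak_um(t)
    print(f"{t} K: peak at {peak_um:.1f} um, "
          f"radiance {planck(peak_um * 1e-6, t):.3e}")
```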

  10. An airborne real-time hyperspectral target detection system

    NASA Astrophysics Data System (ADS)

    Skauli, Torbjorn; Haavardsholm, Trym V.; Kåsen, Ingebjørg; Arisholm, Gunnar; Kavara, Amela; Opsahl, Thomas Olsvik; Skaugen, Atle

    2010-04-01

    An airborne system for hyperspectral target detection is described. The main sensor is a HySpex pushbroom hyperspectral imager for the visible and near-infrared spectral range with 1600 pixels across track, supplemented by a panchromatic line imager. An optional third sensor can be added, either a SWIR hyperspectral camera or a thermal camera. In real time, the system performs radiometric calibration and georeferencing of the images, followed by image processing for target detection and visualization. The current version of the system implements only spectral anomaly detection, based on normal mixture models. Image processing runs on a PC with a multicore Intel processor and an Nvidia graphics processing unit (GPU). The processing runs in a software framework optimized for large sustained data rates. The platform is a Cessna 172 aircraft based close to FFI, modified with a camera port in the floor.
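
    The system's spectral anomaly detection is based on normal mixture models; a single-Gaussian simplification of that idea is the classic RX detector, which scores each pixel by its Mahalanobis distance from the background statistics. The sketch below runs it on synthetic data (the array shapes, values, and single-component background model are assumptions, not details from the paper):

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Per-pixel Mahalanobis-distance anomaly score for a hyperspectral cube.

    cube: (rows, cols, bands) array. A single-Gaussian simplification of the
    normal-mixture detection described in the abstract.
    """
    rows, cols, bands = cube.shape
    x = cube.reshape(-1, bands).astype(float)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(bands))  # regularized inverse
    d = x - mu
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)     # d_i^T C^-1 d_i
    return scores.reshape(rows, cols)

# Synthetic example: flat background plus one bright anomalous pixel.
rng = np.random.default_rng(0)
cube = rng.normal(0.2, 0.01, size=(32, 32, 10))
cube[5, 7] += 0.3
scores = rx_anomaly_scores(cube)
print(np.unravel_index(scores.argmax(), scores.shape))  # (5, 7)
```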

  11. Lights, Camera, Learning!

    ERIC Educational Resources Information Center

    Bull, Glen; Bell, Lynn

    2009-01-01

    The shift from analog to digital video transformed the system from a unidirectional analog broadcast to a two-way conversation, resulting in the birth of participatory media. Digital video offers new opportunities for teaching science, social studies, mathematics, and English language arts. The professional education associations for each content…

  12. GROT in NICMOS Cameras

    NASA Astrophysics Data System (ADS)

    Sosey, M.; Bergeron, E.

    1999-09-01

    Grot is exhibited as small areas of reduced sensitivity, most likely due to flecks of antireflective paint scraped off the optical baffles as they were forced against each other. This paper characterizes grot associated with all three cameras. Flat field images taken from March 1997 through January 1999 have been investigated for changes in the grot, including possible wavelength dependency and throughput characteristics. The main products of this analysis are grot masks for each of the cameras which may also contain any new cold or dead pixels not specified in the data quality arrays.

  13. Wide angle pinhole camera

    NASA Technical Reports Server (NTRS)

    Franke, J. M.

    1978-01-01

    Hemispherical refracting element gives pinhole camera 180 degree field-of-view without compromising its simplicity and depth-of-field. Refracting element, located just behind pinhole, bends light coming in from sides so that it falls within image area of film. In contrast to earlier pinhole cameras that used water or other transparent fluids to widen field, this model is not subject to leakage and is easily loaded and unloaded with film. Moreover, by selecting glass with different indices of refraction, field at film plane can be widened or reduced.
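
    The field compression provided by the refracting element follows directly from Snell's law: a ray arriving at angle θ off-axis is bent to asin(sin θ / n) inside the glass. A minimal sketch, assuming a refractive index of 1.5 (the summary does not state one):

```python
import math

def refracted_angle_deg(theta_deg, n=1.5):
    """Angle of a ray inside the refracting element (Snell's law),
    for an incoming ray theta_deg off-axis. n = 1.5 is an assumed index."""
    return math.degrees(math.asin(math.sin(math.radians(theta_deg)) / n))

# A ray from the extreme edge of a 180-degree field (90 deg off-axis)
# is bent to ~41.8 deg, so it can still land within the film area.
print(f"{refracted_angle_deg(90):.1f} deg")  # 41.8 deg
```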

  14. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to inspire from human vision bio-mechanics to improve robotic capabilities for tasks such as objects detection and tracking. This work describes first the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and associated bio-inspired attention model.

  15. NETD test of high-sensitivity infrared camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Pan, Debin; Yang, Changcheng; Luo, Yan

    2007-12-01

    Infrared cameras are increasingly used in military, judicial, rescue, industrial, medical, and scientific applications. The NETD (Noise Equivalent Temperature Difference) of a modern high-sensitivity cooled infrared camera is now below 10 mK. If the NETD is measured from the camera's analog video output using an 8-bit or 10-bit ADC frame grabber, the achievable NETD accuracy is 7.81 mK or 2.76 mK, corresponding to relative errors of 78.1% and 27.6% for a 10 mK NETD camera. Such accuracy is clearly inadequate for evaluating a high-sensitivity camera with an NETD below 10 mK: the quantization error of the frame grabber's ADC becomes the dominant contribution to the NETD measurement error, making it difficult to evaluate the electro-optical performance of the camera through its analog video. The test accuracy can be improved by reducing the linear temperature range or increasing the effective bit depth of the ADC, but under analog video test conditions this is still insufficient. Under 14-bit digital video test conditions, an NETD test accuracy of 0.24 mK can be achieved with a 1 K linear range, and 0.488 mK with a 2 K linear range; the latter corresponds to a relative error of 4.9% for a 10 mK NETD camera, which meets the requirement. Measurements taken through the digital video port of an infrared camera match its nominal value. A digital video interface is therefore necessary for accurate NETD testing of high-sensitivity infrared cameras.
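
    The 8-bit figure quoted above is consistent with a one-LSB quantization model, where the temperature step per ADC code is the linear range divided by 2^bits; a sketch under that assumed error model (the abstract's other figures include additional factors it does not spell out):

```python
# Quantization step of the frame-grabber ADC over a linear temperature range.
# Treating one LSB as the achievable accuracy is an assumption; the abstract's
# 8-bit figure (7.81 mK over a 2 K range) matches it exactly.

def adc_step_mk(linear_range_k, bits):
    """Temperature span represented by one ADC code, in mK."""
    return linear_range_k * 1000.0 / (2 ** bits)

print(adc_step_mk(2.0, 8))    # 7.8125 mK -> ~78% of a 10 mK NETD
print(adc_step_mk(2.0, 10))   # ~1.95 mK
print(adc_step_mk(1.0, 14))   # ~0.061 mK -> well below a 10 mK NETD
```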

  16. Autofocus method for scanning remote sensing cameras.

    PubMed

    Lv, Hengyi; Han, Chengshan; Xue, Xucheng; Hu, Changhong; Yao, Cheng

    2015-07-10

    Autofocus methods are conventionally based on capturing the same scene from a series of focal plane positions. As a result, it has been difficult to apply this technique to scanning remote sensing cameras, where the scene changes continuously. To realize autofocus in scanning remote sensing cameras, a novel autofocus method is investigated in this paper. Instead of introducing additional mechanisms or optics, the overlapped pixels of adjacent CCD sensors on the focal plane are employed. Two images corresponding to the same scene on the ground can be captured at different times. Further, one focusing step is performed during the time interval, so that the two images are obtained at different focal plane positions. Subsequently, the direction of the next focusing step is calculated from the two images. The analysis shows that the method operates without restriction on the time consumption of the algorithm and allows general focus measures and algorithms to be carried over from digital still cameras to scanning remote sensing cameras. The experimental results show that the proposed method is applicable to the entire focus measure family; the error ratio is, on average, no more than 0.2%, and drops to 0% with a reliability improvement, which is lower than that of prevalent approaches (12%). The proposed method is demonstrated to be effective and has potential in other scanning imaging applications.
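
    The decision step described above can be sketched with a generic focus measure: compute the sharpness of the two images of the same ground scene, taken one focusing step apart, and keep or reverse the focusing direction accordingly. The gradient-energy measure and toy images below are illustrative, not the paper's:

```python
import numpy as np

def focus_measure(img):
    """Gradient-energy sharpness (sum of squared horizontal differences).
    One of the 'general focus measures' the method supports; the specific
    choice here is illustrative."""
    gx = np.diff(img.astype(float), axis=1)
    return float((gx ** 2).sum())

def next_step_direction(img_before, img_after):
    """Direction of the next focusing step, from two images of the same
    ground scene captured via the overlapped CCD pixels.
    Returns +1 to keep moving in the same direction, -1 to reverse."""
    return 1 if focus_measure(img_after) > focus_measure(img_before) else -1

# Toy example: a blurred copy vs. a sharp copy of the same stripe pattern.
sharp = np.tile(np.array([0., 1.] * 16), (8, 1))
blurred = (sharp + np.roll(sharp, 1, axis=1)) / 2  # averages to flat gray
print(next_step_direction(blurred, sharp))  # 1: the last step improved focus
```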

  17. Development of a multispectral camera system

    NASA Astrophysics Data System (ADS)

    Sugiura, Hiroaki; Kuno, Tetsuya; Watanabe, Norihiro; Matoba, Narihiro; Hayashi, Junichiro; Miyake, Yoichi

    2000-05-01

    A highly accurate multispectral camera and its application software have been developed as a practical system to capture digital images of artworks stored in galleries and museums. Instead of recording color data in the conventional three RGB primary colors, the newly developed camera and software carry out a pixel-wise estimation of spectral reflectance, the color data specific to the object, to enable practical multispectral imaging. To realize accurate multispectral imaging, the dynamic range of the camera is set to 14 bits or more and the output to 14 bits, so as to allow capture even when the difference in light quantity between channels is large. Further, a small rotary color filter was developed in parallel to keep the camera at a practical size. We have also developed software capable of selecting the optimum combination of color filters available on the market. Using this software, n color filters can be selected from m candidate filters so as to minimize the Euclidean distance between actual and estimated spectral reflectance, or the color difference in CIELAB color space, over 147 oil paint samples.
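
    The filter-selection step can be sketched as an exhaustive search over n-of-m filter combinations, scoring each by least-squares spectral reconstruction error. All spectra below are random synthetic stand-ins for the measured filter transmittances and the 147 paint reflectances, and the linear estimator is an illustrative choice, not necessarily the paper's:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
wavelengths = 31                 # e.g. 400-700 nm at 10 nm steps (assumed)
m, n = 6, 3                      # choose n filters out of m candidates
filters = rng.uniform(0, 1, (m, wavelengths))   # candidate transmittances
samples = rng.uniform(0, 1, (40, wavelengths))  # sample reflectances

def reconstruction_error(idx):
    """Total error of least-squares reflectance estimates from n filters."""
    A = filters[list(idx)]                # (n, wavelengths) sensing matrix
    responses = samples @ A.T             # simulated camera signals
    est = responses @ np.linalg.pinv(A.T) # linear least-squares estimate
    return float(np.linalg.norm(est - samples))

best = min(itertools.combinations(range(m), n), key=reconstruction_error)
print("best filter combination:", best)
```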

  18. The NRL 2011 Airborne Sea-Ice Thickness Campaign

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Gardner, J. M.; Liang, R.; Ball, D.; Richter-Menge, J.

    2011-12-01

    In March of 2011, the US Naval Research Laboratory (NRL) performed a study focused on the estimation of sea-ice thickness from airborne radar, laser and photogrammetric sensors. The study was funded by ONR to take advantage of the Navy's ICEX2011 ice-camp/submarine exercise, and to serve as a lead-in year for NRL's five-year basic research program on the measurement and modeling of sea ice scheduled to take place from 2012-2017. Researchers from the Army Cold Regions Research and Engineering Laboratory (CRREL) and NRL worked with the Navy Arctic Submarine Lab (ASL) to emplace a 9 km-long ground-truth line near the ice-camp (see Richter-Menge et al., this session) along which ice and snow thickness were directly measured. Additionally, US Navy submarines collected ice draft measurements under the ground-truth line. Repeat passes directly over the ground-truth line were flown, and a grid surrounding the line was also flown to collect altimeter, LiDAR and photogrammetric data. Five CryoSat-2 satellite tracks were underflown as well, coincident with satellite passage. Estimates of sea ice thickness are calculated assuming local hydrostatic balance, and require the densities of water, ice and snow, snow depth, and freeboard (defined as the elevation of sea ice, plus accumulated snow, above local sea level). Snow thickness is estimated from the difference between LiDAR and radar altimeter profiles, the latter of which is assumed to penetrate any snow cover. The concepts we used to estimate ice thickness are similar to those employed in NASA IceBridge sea-ice thickness estimation. Airborne sensors used for our experiment were a Riegl Q-560 scanning topographic LiDAR, a pulse-limited (2 ns), 10 GHz radar altimeter and an Applanix DSS-439 digital photogrammetric camera (for lead identification). Flights were conducted on a Twin Otter aircraft from Pt. Barrow, AK, and averaged ~ 5 hours in duration.
It is challenging to directly compare results from the swath LiDAR with the
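
    The hydrostatic-balance thickness estimate described above can be written out directly: with total freeboard F (snow surface above sea level, from LiDAR) and snow depth h_s (from the LiDAR-radar difference), buoyancy gives h_i = (ρ_w(F − h_s) + ρ_s h_s) / (ρ_w − ρ_i). A minimal sketch with typical literature densities (the specific values are assumptions, not from the abstract):

```python
# Typical densities (kg/m^3); illustrative values, not from the campaign.
RHO_W = 1024.0   # sea water
RHO_I = 915.0    # sea ice
RHO_S = 320.0    # snow

def ice_thickness(total_freeboard_m, snow_depth_m):
    """Sea-ice thickness assuming local hydrostatic balance.

    total_freeboard_m: snow-surface elevation above local sea level (LiDAR);
    snow_depth_m: from the LiDAR-radar altimeter difference, since the radar
    is assumed to penetrate the snow cover.
    """
    ice_freeboard = total_freeboard_m - snow_depth_m
    return (RHO_W * ice_freeboard + RHO_S * snow_depth_m) / (RHO_W - RHO_I)

# Example: 40 cm of total freeboard with 20 cm of snow.
print(round(ice_thickness(0.40, 0.20), 2), "m")  # 2.47 m
```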

  19. [Air-borne disease].

    PubMed

    Lameiro Vilariño, Carmen; del Campo Pérez, Victor M; Alonso Bürger, Susana; Felpeto Nodar, Irene; Guimarey Pérez, Rosa; Pérez Alvarellos, Alberto

    2003-11-01

    Respiratory protection is a growing concern for nursing professionals who care for patients capable of transmitting microorganisms through the air. This type of protection ranges from the use of surgical or hygienic masks against infection transmitted by airborne droplets, to the use of highly effective masks or respirators against airborne diseases such as tuberculosis or SARS, a recently discovered disease. The proper choice of protective device and its correct use are fundamental to the effective protection of exposed personnel. The authors summarize the main respiratory protective devices used by health workers, their characteristics and degree of effectiveness, and the circumstances under which each device is indicated. PMID:14705591

  20. Everything Digital: Converting the world in 2 Exabytes

    SciTech Connect

    Lesk, Michael

    2003-11-05

    Nearly everything created today is in digital format: music is on digital CDs, documents come from word processing, still photography is switching to digital cameras and even movies are now edited digitally. What about the past? We have projects like the Million Book Project scanning one million books, and we know technically how to convert everything: the problems are legal, economic and organizational.

  1. MLS airborne antenna research

    NASA Technical Reports Server (NTRS)

    Yu, C. L.; Burnside, W. D.

    1975-01-01

    The geometrical theory of diffraction was used to analyze the elevation plane pattern of on-aircraft antennas. The radiation patterns for basic elements (infinitesimal dipole, circumferential and axial slots) mounted on the fuselages of various aircraft, with and without radome, were calculated and compared well with experimental results. Phase error plots were also presented. The effects of the radiation patterns and phase error plots on polarization selection for the MLS airborne antenna are discussed.

  2. Airborne forest fire research

    NASA Technical Reports Server (NTRS)

    Mattingly, G. S.

    1974-01-01

    The research relating to airborne fire fighting systems is reviewed to provide NASA/Langley Research Center with current information on the use of aircraft in forest fire operations, and to identify research requirements for future operations. A literature survey, interview of forest fire service personnel, analysis and synthesis of data from research reports and independent conclusions, and recommendations for future NASA-LRC programs are included.

  3. Snapshot polarimeter fundus camera.

    PubMed

    DeHoog, Edward; Luo, Haitao; Oka, Kazuhiko; Dereniak, Eustace; Schwiegerling, James

    2009-03-20

    A snapshot imaging polarimeter utilizing Savart plates is integrated into a fundus camera for retinal imaging. Acquired retinal images can be processed to reconstruct Stokes vector images, giving insight into the polarization properties of the retina. Results for images from a normal healthy retina and retinas with pathology are examined and compared. PMID:19305463

  4. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  5. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve the desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates the detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate five filters, each 75 cm in diameter, for rapid exchange without external intervention.

  6. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  7. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. © 1984.

  8. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  9. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  10. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to prove their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms rises from 20 to 700 kW. This increase is expected to introduce higher order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam, constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wave front over a photodetector with pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the input and output of the interferometer arms and the third one is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.

  11. Imaging phoswich anger camera

    NASA Astrophysics Data System (ADS)

    Manchanda, R. K.; Sood, R. K.

    1991-08-01

    High angular resolution and low background are the primary requisites for detectors for future astronomy experiments in the low energy gamma-ray region. Scintillation counters are still the only available large area detectors for studies in this energy range. Preliminary details of a large area phoswich Anger camera designed for coded aperture imaging are described, and its background and position characteristics are discussed.

  12. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, Mark; McCurnin, Thomas W.; Stradling, Gary L.

    1993-01-01

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 x 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.
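    The need for 16 parallel output ports follows from the stated format and readout time. As a back-of-envelope check (not part of the record), splitting a 1024 x 256 frame across 16 ports leaves 16,384 pixels per port, so a 1 ms readout implies roughly 16.4 Mpix/s per port, a modest rate per channel compared with the ~262 Mpix/s a single-port design would need:

    ```python
    PIXELS = 1024 * 256   # sensor format, total pixels per frame
    PORTS = 16            # parallel output ports on the custom CCD
    READOUT_S = 1e-3      # target readout time: 1 ms

    per_port = PIXELS // PORTS            # pixels each port must read out
    port_rate_hz = per_port / READOUT_S   # required per-port pixel rate
    total_rate_hz = PIXELS / READOUT_S    # aggregate rate across all ports

    print(f"{per_port} pixels/port, {port_rate_hz / 1e6:.1f} Mpix/s per port, "
          f"{total_rate_hz / 1e6:.0f} Mpix/s aggregate")
    ```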

  13. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, M.; McCurnin, T. W.; Stradling, G.

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 x 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.

  14. Mutagenicity of airborne particles.

    PubMed

    Chrisp, C E; Fisher, G L

    1980-09-01

    The physical and chemical properties of airborne particles are important for the interpretation of their potential biologic significance as genotoxic hazards. For polydisperse particle size distributions, the smallest, most respirable particles are generally the most mutagenic. Particulate collection for testing purposes should be designed to reduce artifact formation and allow condensation of mutagenic compounds. Other critical factors such as UV irradiation, wind direction, chemical reactivity, humidity, sample storage, and temperature of combustion are important. Application of chemical extraction methods and subsequent class fractionation techniques influence the observed mutagenic activity. Particles from urban air, coal fly ash, automobile and diesel exhaust, agricultural burning and welding fumes contain primarily direct-acting mutagens. Cigarette smoke condensate, smoke from charred meat and protein pyrolysates, kerosene soot and cigarette smoke condensates contain primarily mutagens which require metabolic activation. Fractionation coupled with mutagenicity testing indicates that the most potent mutagens are found in the acidic fractions of urban air, coal fly ash, and automobile diesel exhaust, whereas mutagens in rice straw smoke and cigarette smoke condensate are found primarily in the basic fractions. The interaction of the many chemical compounds in complex mixtures from airborne particles is likely to be important in determining mutagenic or comutagenic potentials. Because the mode of exposure is generally frequent and prolonged, the presence of tumor-promoting agents in complex mixtures may be a major factor in evaluation of the carcinogenic potential of airborne particles.

  15. Mammalian airborne allergens.

    PubMed

    Aalberse, Rob C

    2014-01-01

    Historically, horse dandruff was a favorite allergen source material. Today, however, allergic symptoms due to airborne mammalian allergens are mostly a result of indoor exposure, be it at home, at work or even at school. The relevance of mammalian allergens in relation to the allergenic activity of house dust extract is briefly discussed in the historical context of two other proposed sources of house dust allergenic activity: mites and Maillard-type lysine-sugar conjugates. Mammalian proteins involved in allergic reactions to airborne dust are largely found in only 2 protein families: lipocalins and secretoglobins (Fel d 1-like proteins), with a relatively minor contribution of serum albumins, cystatins and latherins. Both the lipocalin and the secretoglobin family are very complex. In some instances this results in a blurred separation between important and less important allergenic family members. The past 50 years have provided us with much detailed information on the genomic organization and protein structure of many of these allergens. However, the complex family relations, combined with the wide range of post-translational enzymatic and non-enzymatic modifications, make a proper qualitative and quantitative description of the important mammalian indoor airborne allergens still a significant proteomic challenge. PMID:24925404

  16. Airborne laser scanning for high-resolution mapping of Antarctica

    NASA Astrophysics Data System (ADS)

    Csatho, Bea; Schenk, Toni; Krabill, William; Wilson, Terry; Lyons, William; McKenzie, Garry; Hallam, Cheryl; Manizade, Serdar; Paulsen, Timothy

    In order to evaluate the potential of airborne laser scanning for topographic mapping in Antarctica and to establish calibration/validation sites for NASA's Ice, Cloud and land Elevation Satellite (ICESat) altimeter mission, NASA, the U.S. National Science Foundation (NSF), and the U.S. Geological Survey (USGS) joined forces to collect high-resolution airborne laser scanning data. In a two-week campaign during the 2001-2002 austral summer, NASA's Airborne Topographic Mapper (ATM) system was used to collect data over several sites in the McMurdo Sound area of Antarctica (Figure 1a). From the recorded signals, NASA computed laser points and The Ohio State University (OSU) completed the elaborate computation/verification of high-resolution Digital Elevation Models (DEMs) in 2003. This article reports on the DEM generation and some exemplary results from scientists using the geomorphologic information from the DEMs during the 2003-2004 field season.

  17. Mars Cameras Make Panoramic Photography a Snap

    NASA Technical Reports Server (NTRS)

    2008-01-01

    If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape. The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels. Gigapixel images are more than 200 times the size captured by today's standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.
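    The "more than 200 times" comparison is simple arithmetic. As a back-of-envelope check (not part of the record), a 1-billion-pixel gigapixel image is exactly 250 times the size of a 4-megapixel frame:

    ```python
    GIGAPIXEL = 1_000_000_000   # pixels in a gigapixel image
    CONSUMER_PIXELS = 4_000_000 # the era's "standard" 4-megapixel camera

    ratio = GIGAPIXEL / CONSUMER_PIXELS
    print(f"{ratio:.0f}x larger")  # 250x, i.e. "more than 200 times"
    ```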

  18. Airborne wireless communication systems, airborne communication methods, and communication methods

    DOEpatents

    Deaton, Juan D.; Schmitt, Michael J.; Jones, Warren F.

    2011-12-13

    An airborne wireless communication system includes circuitry configured to access information describing a configuration of a terrestrial wireless communication base station that has become disabled. The terrestrial base station is configured to implement wireless communication between wireless devices located within a geographical area and a network when the terrestrial base station is not disabled. The circuitry is further configured, based on the information, to configure the airborne station to have the configuration of the terrestrial base station. An airborne communication method includes answering a 911 call from a terrestrial cellular wireless phone using an airborne wireless communication system.

  19. Detection of soil properties with airborne hyperspectral measurements of bare fields.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Airborne remote sensing data, using a hyperspectral (HSI) camera, were collected for a flight over two fields with a total of 128 ha of recently seeded and nearly bare soil. The within-field spatial distribution of several soil properties was found by using multiple linear regression to select the ...

  20. An evolution of image source camera attribution approaches.

    PubMed

    Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul

    2016-05-01

    Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of the digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured under uncontrolled conditions and subjected to a variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of the digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by the experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamental to practice, in particular, with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of the source camera attribution more comprehensively in the domain of the image forensics in conjunction with the presentation of classifying ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on the specific parameters, such as colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics