Science.gov

Sample records for video ccd camera

  1. Flexible method to obtain high sensitivity, low-cost CCD cameras for video microscopy.

    PubMed

    Cinelli, A R

    1998-11-01

    A simple method is described to extend image exposure times in video-rate CCD cameras and thereby increase their sensitivity and reduce the noise level of low-light images. Most commercial video cameras cannot extend image exposures because they operate at fixed television timing formats. The technique described here controls the exposure time by selectively gating the image readout from the CCD sensor. This prevents the cyclic clearing of photo-charges that occurs at regular video rates, allowing image integration beyond the duration of a single video field period. Image readout is controlled by the duration of external gating pulses, giving the camera efficient operational versatility under different light conditions. The technique is applicable to standard monochrome and color CCD cameras. The evaluations described here show that the light sensitivity of a standard video-rate CCD camera can be significantly improved, generating high-quality images at low light levels comparable to those obtained with image intensifiers or intensified video cameras. The cameras remain compatible with regular video equipment, since the technique preserves the normal TV synchronization signals. Results in simulated and real experimental situations confirmed that this technique enables the use of affordable video-rate CCD cameras for a variety of fluorescence microscopy and optical recording applications. PMID:9874139
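
    The gating idea can be illustrated with a toy signal-to-noise model: integrating on-chip for N field periods before a single gated readout accumulates signal and dark current N times but incurs read noise only once. All constants below (read noise, dark rate, signal level) are invented for illustration; this is not the paper's analysis.

```python
import math

def snr_single_readout(signal_per_field, n_fields, read_noise=10.0,
                       dark_per_field=2.0):
    """SNR (units of electrons) when charge integrates on-chip for
    n_fields video field periods and is read out once via an external
    gate. Toy model: shot noise on the accumulated signal + dark
    charge, plus a single dose of read noise. Illustrative constants."""
    s = signal_per_field * n_fields          # signal keeps accumulating
    d = dark_per_field * n_fields            # so does dark current
    return s / math.sqrt(s + d + read_noise ** 2)

# Extending the exposure from 1 to 16 field periods at a fixed faint
# light level raises the SNR several-fold:
# snr_single_readout(50, 1) ~ 4, snr_single_readout(50, 16) ~ 26
```

    The gain is largest in the read-noise-limited regime the abstract targets (low-light microscopy), where a single long integration beats many short reads.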

  2. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown, wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  3. Research on anti-blooming driving timing for CCD video camera

    NASA Astrophysics Data System (ADS)

    Mo, Site; Chen, Qiuxia; Liu, Qi

    2010-10-01

    The mechanism by which a CCD camera produces vertical blooming was investigated, and a technique based on modified vertical driving timing that clears the blooming above high-intensity pixels is presented. Charge from a photodiode saturated by a high-intensity pixel during the line exposure time spills into the vertical shift register; under the normal vertical driving timing this saturation charge is added to every pixel in the same column, producing a vertical bloom in the image. The blooming above the high-intensity pixel is produced by saturation charge left over from the previous frame. Based on this charge-transfer behavior, a special driving timing that transfers out the residual charge is used to clear the blooming above the high-intensity pixel. The timing was tested on an ICX205 CCD demo board, and the tests showed that the vertical blooming above the high-intensity pixel is removed. The method has been applied to intelligent traffic monitoring.
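
    The column-wise charge spill described above can be mimicked with a toy model: overflow from a saturated photodiode lands in the vertical shift register and contaminates every pixel clocked through it afterwards, while a clearing cycle (standing in for the paper's special drive timing) dumps the residue. The full-well value, coupling fraction, and pixel values are all invented.

```python
import numpy as np

FULL_WELL = 1000.0  # hypothetical photodiode capacity, electrons

def transfer_column(column, clear_register=False):
    """Toy model of vertical blooming in one interline-CCD column.

    Charge above FULL_WELL overflows into the vertical shift register
    and couples (5% per step here, an invented figure) onto every pixel
    subsequently clocked through it -- a bright vertical stripe.
    clear_register=True stands in for a charge-dumping drive timing."""
    register = 0.0
    out = np.empty_like(column, dtype=float)
    for i, px in enumerate(column):
        register += max(px - FULL_WELL, 0.0)       # overflow leaks in
        out[i] = min(px, FULL_WELL) + register * 0.05
        if clear_register:
            register = 0.0                         # dump residual charge
    return out

col = np.full(8, 100.0)
col[2] = 5000.0                                    # one over-exposed pixel
bloomed = transfer_column(col)                     # stripe below pixel 2
clean = transfer_column(col, clear_register=True)  # stripe removed
```

    In the toy run, pixels after the saturated one read high without the clearing cycle and return to their true level with it, which is the qualitative effect the paper demonstrates on the ICX205.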

  4. Biofeedback control analysis using a synchronized system of two CCD video cameras and a force-plate sensor

    NASA Astrophysics Data System (ADS)

    Tsuruoka, Masako; Shibasaki, Ryosuke; Murai, Shunji

    1999-01-01

    Biofeedback control analysis of human movement has become increasingly important in rehabilitation, sports medicine and physical fitness. In this study, a synchronized system was developed for acquiring sequential data of a person's movement. The setup employs a video recorder system linked with two CCD video cameras and a force-plate sensor system, configured to start and stop simultaneously. Feedback-controlled movement for postural stability was selected as the subject of analysis. The person's center of gravity (COG) was calculated from the measured 3-D coordinates of the major joints, obtained by videometry with bundle adjustment and self-calibration. The raw serial data of the COG and of the foot pressure measured by the force-plate sensor are difficult to analyze directly because of their complex fluctuations. Auto-regressive modeling, which yields the power spectrum and the impulse response of the movement factors, enables analysis of their dynamic relations. This new biomedical engineering approach provides efficient information for medical evaluation of a person's stability.
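
    As a sketch of the analysis step, a sway time series (COG or foot-pressure fluctuations) can be fitted with an auto-regressive model and its power spectrum computed from the fitted coefficients. The Yule-Walker estimator and the AR order below are conventional choices; the paper does not state which estimator it used.

```python
import numpy as np

def yule_walker(x, order=4):
    """Fit an AR(p) model x_t = sum_k a_k x_{t-k} + e_t by solving the
    Yule-Walker equations on the sample autocovariances."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])      # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])   # innovation variance
    return a, sigma2

def ar_spectrum(a, sigma2, freqs):
    """AR power spectrum P(f) = sigma2 / |1 - sum_k a_k e^{-2 pi i f k}|^2
    at normalized frequencies freqs (cycles/sample)."""
    k = np.arange(1, len(a) + 1)
    denom = 1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a
    return sigma2 / np.abs(denom) ** 2

# Usage: fit the COG series (e.g. sampled at video rate, 30 Hz), then
# inspect ar_spectrum(a, sigma2, freqs) for dominant sway frequencies.
```

    The impulse response the abstract mentions follows from the same coefficients by driving the fitted difference equation with a unit impulse.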

  5. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  6. Calibration Tests of Industrial and Scientific CCD Cameras

    NASA Technical Reports Server (NTRS)

    Shortis, M. R.; Burner, A. W.; Snow, W. L.; Goad, W. K.

    1991-01-01

    Small-format, medium-resolution CCD cameras are at present widely used for industrial metrology applications. Large-format, high-resolution CCD cameras are primarily in use for scientific applications, but in due course should increase both the range of applications and the object-space accuracy achievable by close-range measurement. Slow-scan, cooled scientific CCD cameras provide the further benefit of more quantisation levels, which enables improved radiometric resolution. Calibration of all types of CCD camera is necessary in order to characterize the geometry of the sensors and lenses. A number of different types of CCD cameras have been calibrated at the NASA Langley Research Center using self-calibration and a small test object. The results of these calibration tests will be described, with particular emphasis on the differences between standard CCD video cameras and scientific slow-scan CCD cameras.

  7. Wide Dynamic Range CCD Camera

    NASA Astrophysics Data System (ADS)

    Younse, J. M.; Gove, R. J.; Penz, P. A.; Russell, D. E.

    1984-11-01

    A liquid crystal attenuator (LCA) operated as a variable neutral density filter has been attached to a charge-coupled device (CCD) imager to extend the dynamic range of a solid-state TV camera by an order of magnitude. Many applications are best served by a camera with a dynamic range of several thousand. For example, outside security systems must operate unattended with "dawn-to-dusk" lighting conditions. Although this can be achieved with available auto-iris lens assemblies, more elegant solutions which provide the small size, low power, high reliability advantages of solid state technology are now available. This paper will describe one such unique way of achieving these dynamic ranges using standard optics by making the CCD imager's glass cover a controllable neutral density filter. The liquid crystal attenuator's structure and theoretical properties for this application will be described along with measured transmittance. A small integrated TV camera which utilizes a "virtual-phase" CCD sensor coupled to a LCA will be described and test results for a number of the camera's optical and electrical parameters will be given. These include the following camera parameters: dynamic range, Modulation Transfer Function (MTF), spectral response, and uniformity. Also described will be circuitry which senses the ambient scene illuminance and automatically provides feedback signals to appropriately adjust the transmittance of the LCA. Finally, image photographs using this camera, under various scene illuminations, will be shown.
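
    The illuminance-feedback circuitry described above can be sketched as a simple proportional control loop that adjusts the LCA transmittance until the sensed signal sits at a target level. The transmittance range (about 100:1, roughly the order of magnitude of extension the paper reports), the loop gain, and the target are invented for illustration; this is not the paper's circuit.

```python
def lca_feedback(scene, target=100.0, k=1e-5, steps=300):
    """Toy proportional loop for a liquid-crystal attenuator (LCA).

    Sensor signal = scene * t, where t is the LCA transmittance,
    clamped to [0.01, 1.0] (a hypothetical ~100:1 attenuation range).
    Each step nudges t to drive the signal toward `target`."""
    t = 1.0
    for _ in range(steps):
        signal = scene * t
        t -= k * (signal - target)        # proportional correction
        t = min(max(t, 0.01), 1.0)        # physical transmittance limits
    return t, scene * t

# Bright scene: the attenuator closes down until the sensor signal sits
# at the target; dim scene: it opens fully and the signal passes as-is.
```

    In effect the LCA plays the role of an auto-iris, but with no moving parts, which is the solid-state advantage the paper emphasizes.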

  8. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
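
    The two historical measurements quoted above are enough for a small worked calculation: converting each (separation, position angle) pair to east/north offsets shows the companion shifted by roughly 2.5 arcsec between 1906 and 1918, about 0.2 arcsec per year of relative motion, taking the old measurements at face value.

```python
import math

def pa_sep_to_xy(sep_arcsec, pa_deg):
    """Convert separation / position angle to (east, north) offsets in
    arcsec. Position angle is measured from north (0 deg) through east
    (90 deg), the standard double-star convention."""
    pa = math.radians(pa_deg)
    return sep_arcsec * math.sin(pa), sep_arcsec * math.cos(pa)

x1, y1 = pa_sep_to_xy(4.8, 186.2)   # Vatican astrograph epoch, 1906
x2, y2 = pa_sep_to_xy(6.0, 162.4)   # Espin's measurement, 1918
rate = math.hypot(x2 - x1, y2 - y1) / (1918 - 1906)  # ~0.21 arcsec/yr
```

    Later epochs (2MASS in 1998, the author's 2007 CCD images) can be reduced the same way to test whether the points fall on a straight line or a curve.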

  9. An auto-focusing CCD camera mount

    NASA Astrophysics Data System (ADS)

    Arbour, R. W.

    1994-08-01

    The traditional methods of focusing a CCD camera are either time consuming, difficult or, more importantly, indecisive. This paper describes a device designed to allow the observer to be confident that the camera will always be properly focused by sensing a selected star image and automatically adjusting the camera's focal position.

  10. CCD Video Imaging Of Cardiac Activity

    NASA Astrophysics Data System (ADS)

    Nassif, G.; Fillette, F.; Lascault, Aouate G.; Grosgogeat, Y.

    1988-06-01

    Helium-neon laser spectrometry of the fluorescent dye WW 781 bound to heart tissue makes it possible to collect optical signals representative of the electrical and electromechanical activity. It is also possible to image the electrical activity of a myocardial surface stained with WW 781 and illuminated with a direct or an unfocused helium-neon laser beam using a charge-coupled device (CCD) video camera. Sheep ventricular fragments and right mouse atria were stained with Tyrode solutions containing 0.4 to 1 g/l WW 781. Focused or unfocused illumination was performed with a 2 mW helium-neon laser through a lens or an optical fiber. Direct illumination was performed with nine 5 mW helium-neon lasers, allowing the observed surface to be mapped. The CCD video camera was fitted with a 70-220 zoom telelens and placed behind a 665 nm high-pass optical filter. Video signals were amplified, recorded on an NTSC B.V.U. professional video recorder, and studied frame by frame. Fluorescent emission from the illuminated areas was monitored with a 200 µm diameter optical-fiber optrode using a monochromator-photomultiplier set. Direct illumination allowed nine-point mapping and made it possible to follow the propagation of electrical activity on sheep ventricular epicardium, revealing a 2/1 conduction block in a limited area; fluorescent signals representative of the electrical activity were recorded simultaneously. Unfocused lighting made it possible to follow the depolarization of a 1.2 mm spot on sheep ventricular endocardium and the propagation of fluorescence over a 2 mm diameter area of mouse atrium. This technique appears to be of great interest for the study of arrhythmias in experimental models, with foreseeable pharmacological applications.

  11. Compact CCD Guider Camera for Magellan

    NASA Astrophysics Data System (ADS)

    Burley, G.; Thompson, I.; Hull, C.

    The Magellan guider camera uses a low-noise frame-transfer CCD with a digital-signal-processor-based controller. The electronics feature a compact, simple design optimized for fast settling times and a rapid readout rate. The camera operates nominally at -20 °C with thermoelectric cooling. Multiple operating modes are supported, with software-selectable binning, exposure times, and subrastering.

  12. Application of the CCD camera in medical imaging

    NASA Astrophysics Data System (ADS)

    Chu, Wei-Kom; Smith, Chuck; Bunting, Ralph; Knoll, Paul; Wobig, Randy; Thacker, Rod

    1999-04-01

    Medical fluoroscopy comprises radiological procedures used for functional and dynamic studies of the digestive system. The major components of the imaging chain are an image intensifier, which converts x-ray information into an intensity pattern on its output screen, and a CCTV camera, which converts that pattern into video information displayed on a TV monitor. Responding properly to such a wide dynamic range in real time, as a fluoroscopy procedure requires, is very challenging. As in all other medical imaging studies, detail resolution is of great importance, and without proper contrast, spatial resolution is compromised. The many inherent advantages of the CCD make it a suitable choice for dynamic studies, and CCD cameras have recently been introduced as the camera of choice for medical fluoroscopy imaging systems. The objective of our project was to evaluate a newly installed CCD fluoroscopy system in terms of contrast resolution, detail, and radiation dose.

  13. Solid state television camera (CCD-buried channel)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The development of an all solid state television camera, which uses a buried channel charge coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array is utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control (i.e., ALC and AGC) techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

  14. Solid state television camera (CCD-buried channel), revision 1

    NASA Technical Reports Server (NTRS)

    1977-01-01

    An all solid state television camera was designed which uses a buried channel charge coupled device (CCD) as the image sensor. A 380 x 488 element CCD array is utilized to ensure compatibility with 525-line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (1) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (2) techniques for the elimination or suppression of CCD blemish effects, and (3) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a deliverable solid state TV camera which addressed the program requirements for a prototype qualifiable to space environment conditions.

  15. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, versatile, and capable of operating both in and out of a vacuum environment. It uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows close (0.04", for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high-energy-density plasmas, for a variety of military, industrial, and medical imaging applications.

  16. CCD Camera Systems for the GTC

    NASA Astrophysics Data System (ADS)

    Kohley, R.; Suarez, M.; Martin, J. M.; Burley, G.; Cavaller, L.; Vilela, R.

    The construction of the GTC (Gran Telescopio Canarias) on La Palma, Canary Islands, is well under way, with first light planned for October 2003. All subsystems have been developed and are now in the fabrication or commissioning phase. Apart from the suite of instruments, this paper discusses the design and procurement of the CCD camera systems, which are intended for the closed-loop active optics and the acquisition and guiding functions of the telescope (wavefront sensors and guiding cameras) and for in-house-developed scientific instrumentation. Furthermore, we describe our test-bench facility for CCD evaluation up to surface sizes equivalent to a 4K × 4K, 15 μm detector.

  17. CCD camera-based diagnostics of optically dense pulsed plasma with account of self-absorption

    NASA Astrophysics Data System (ADS)

    Nikonchuk, I. S.; Chumakov, A. N.

    2016-01-01

    We present a system developed for optically dense pulsed-plasma diagnostics, based on a digital video camera with a CCD matrix, which provides determination of spectral brightness and optical density while taking self-absorption into account.

  18. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  20. Typical effects of laser dazzling CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhang, Jianmin; Shao, Bibo; Cheng, Deyan; Ye, Xisheng; Feng, Guobin

    2015-05-01

    This article gives an overview of the laser dazzling effect on buried-channel CCD cameras. CCDs can be sorted into staring types (frame-transfer and interline-transfer) and scanning types (linear and time-delay-integration). Every CCD performs four primary tasks in generating an image: charge generation, charge collection, charge transfer and charge measurement. In a camera, a lens delivers the optical signal to the CCD sensor, with techniques applied to suppress stray light, and electronic circuits process the CCD output signal. The dazzling effects are the combined result of light-distribution distortion and charge-distribution distortion, which derive from the lens and the sensor respectively. Strictly speaking, the lens does not distort the light distribution: lenses are in general so well designed and fabricated that their stray light can be neglected, but a laser is intense enough to make its stray light obvious. In the CCD image sensor, the laser can induce very large charge generation; charge-transfer inefficiency and charge blooming then distort the charge distribution. Normally, the largest signal output by the CCD sensor is restricted by the capacity of its collection wells and cannot exceed the dynamic range over which the subsequent electronic circuits operate normally, so the signal is not further distorted in the post-processing circuits. Some techniques in those circuits can, however, make the dazzling effects appear differently in the final image.

  1. High-performance digital color video camera

    NASA Astrophysics Data System (ADS)

    Parulski, Kenneth A.; D'Luna, Lionel J.; Benamati, Brian L.; Shelley, Paul R.

    1992-01-01

    Typical one-chip color cameras use analog video-processing circuits. An improved digital camera architecture has been developed using a dual-slope A/D conversion technique and two full-custom CMOS digital video-processing integrated circuits: the color filter array (CFA) processor and the RGB postprocessor. The system used a 768 × 484 active-element interline-transfer CCD with a new field-staggered 3G color filter pattern and a lenslet overlay, which doubles the sensitivity of the camera. The industrial-quality digital camera design offers improved image quality, reliability, and manufacturability while meeting aggressive size, power, and cost constraints. The CFA-processor digital VLSI chip performs color-filter interpolation, an optical black clamp, defect correction, white balance, and gain control. The RGB-postprocessor digital integrated circuit includes a color-correction matrix, gamma correction, 2-D edge enhancement, and circuits to control the black balance, lens aperture, and focus.
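
    The CFA-interpolation step can be illustrated generically: in a mosaic where only some sites carry a green sample, the missing green values are estimated from neighbouring green sites. The bilinear 4-neighbour average below is the textbook version of this step, not the algorithm of the custom CFA-processor chip, which is not disclosed in the abstract.

```python
import numpy as np

def interp_green(raw, is_green):
    """Estimate green at non-green sites of a color-filter mosaic as the
    mean of the in-bounds 4-neighbours that carry a green sample
    (bilinear CFA interpolation, the simplest standard scheme)."""
    h, w = raw.shape
    out = raw.astype(float).copy()
    for y in range(h):
        for x in range(w):
            if is_green[y, x]:
                continue  # measured green sample, keep as-is
            vals = [raw[j, i]
                    for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= j < h and 0 <= i < w and is_green[j, i]]
            if vals:
                out[y, x] = sum(vals) / len(vals)
    return out

# On a checkerboard green mosaic of a flat scene, interpolation simply
# reproduces the constant green level at the missing sites.
```

    The chip's other stages (black clamp, white balance, gain) are per-pixel affine operations and are straightforward by comparison; interpolation is the step that determines resolution and color artifacts.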

  2. Event Pileup in AXAF's ACIS CCD Camera

    NASA Technical Reports Server (NTRS)

    McNamara, Brian R.

    1998-01-01

    AXAF's high-resolution mirrors will focus a point source near the optical axis to a spot that is contained within a radius of about two pixels on the ACIS Charge Coupled Device (CCD) camera. Because of the small spot size, the accuracy with which fluxes and spectral energy distributions of bright point sources can be measured will be degraded by event pileup. Event pileup occurs when two or more X-ray photons arrive simultaneously in a single detection cell on a CCD readout frame. When pileup occurs, ACIS's event-detection algorithm registers the photons as a single X-ray event whose pulse-height channel corresponds to an energy E ≈ E1 + E2 + ... + En, where n is the number of photons registered per detection cell per readout frame. As a result, pileup artificially hardens the observed spectral energy distribution. I will discuss the effort at the AXAF Science Center to calibrate pileup in ACIS using a focused, nearly monochromatic X-ray source, along with techniques for modeling and correcting pileup effects in polychromatic spectra.
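
    The per-cell statistics behind pileup follow a standard Poisson model: if photons arrive with mean λ per detection cell per readout frame, the fraction of detected events that actually contain two or more photons is readily computed. The model is the usual textbook one and the rate is illustrative; the abstract itself quotes no numbers.

```python
import math

def pileup_fraction(lam):
    """Fraction of detected events containing >= 2 photons, assuming
    Poisson photon arrivals with mean `lam` per detection cell per
    readout frame (standard assumption, not a figure from the talk)."""
    p_at_least_one = 1.0 - math.exp(-lam)   # any event is detected
    p_exactly_one = lam * math.exp(-lam)    # a clean, unpiled event
    return (p_at_least_one - p_exactly_one) / p_at_least_one

# Even a modest per-frame rate produces noticeable pileup:
# pileup_fraction(0.1) is about 0.049, i.e. ~5% of events are piled up.
```

    Each piled event reports the summed energy of its photons, which is exactly the spectral hardening mechanism described above.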

  3. Ultrahigh-speed, high-sensitivity color camera with 300,000-pixel single CCD

    NASA Astrophysics Data System (ADS)

    Kitamura, K.; Arai, T.; Yonai, J.; Hayashida, T.; Ohtake, H.; Kurita, T.; Tanioka, K.; Maruyama, H.; Namiki, J.; Yanagi, T.; Yoshida, T.; van Kuijk, H.; Bosiers, Jan T.; Etoh, T. G.

    2007-01-01

    We have developed an ultrahigh-speed, high-sensitivity portable color camera with a new 300,000-pixel single CCD. The 300,000-pixel CCD, which has four times the number of pixels of our initial model, was developed by seamlessly joining two 150,000-pixel CCDs. A green-red-green-blue (GRGB) Bayer filter is used to realize a color camera with the single-chip CCD. The camera is capable of ultrahigh-speed video recording at up to 1,000,000 frames/sec and is small enough to be handheld. We also developed a technology for dividing the CCD output signal to enable parallel, high-speed readout and recording in external memory; this makes possible long, continuous shots up to 1,000 frames/second. In an experiment, video footage was shot at an athletics meet; because of the high-speed shooting, even detailed movements of the athletes' muscles were captured. This camera can capture clear slow-motion videos, enabling previously impossible live footage to be imaged for various TV broadcasting programs.

  4. Streak Camera Performance with Large-Format CCD Readout

    SciTech Connect

    Lerche, R A; Andrews, D S; Bell, P M; Griffith, R L; McDonald, J W; Torres, P III; Vergel de Dios, G

    2003-07-08

    The ICF program at Livermore has a large inventory of optical streak cameras that were built in the 1970s and 1980s. The cameras include micro-channel plate image-intensifier tubes (IIT) that provide signal amplification and early lens-coupled CCD readouts. Today, these cameras are still very functional, but some replacement parts such as the original streak tube, CCD, and IIT are scarce and obsolete. This article describes recent efforts to improve the performance of these cameras using today's advanced CCD readout technologies. Very sensitive, large-format CCD arrays with efficient fiber-optic input faceplates are now available for direct coupling with the streak tube. Measurements of camera performance characteristics including linearity, spatial and temporal resolution, line-spread function, contrast transfer ratio (CTR), and dynamic range have been made for several different camera configurations: CCD coupled directly to the streak tube, CCD directly coupled to the IIT, and the original configuration with a smaller CCD lens coupled to the IIT output. Spatial resolution (limiting visual) with and without the IIT is 8 and 20 lp/mm, respectively, for photocathode current density up to 25% of the Child-Langmuir (C-L) space-charge limit. Temporal resolution (fwhm) deteriorates by about 20% when the cathode current density reaches 10% of the C-L space charge limit. Streak tube operation with large average tube current was observed by illuminating the entire slit region through a Ronchi ruling and measuring the CTR. Sensitivity (CCD electrons per streak tube photoelectron) for the various configurations ranged from 7.5 to 2,700 with read noise of 7.5 to 10.5 electrons. Optimum spatial resolution is achieved when the IIT is removed. Maximum dynamic range requires a configuration where a single photoelectron from the photocathode produces a signal that is 3 to 5 times the read noise.

  5. Solid state, CCD-buried channel, television camera study and design

    NASA Technical Reports Server (NTRS)

    Hoagland, K. A.; Balopole, H.

    1976-01-01

    An investigation of an all solid state television camera design, which uses a buried channel charge-coupled device (CCD) as the image sensor, was undertaken. A 380 x 488 element CCD array was utilized to ensure compatibility with 525 line transmission and display monitor equipment. Specific camera design approaches selected for study and analysis included (a) optional clocking modes for either fast (1/60 second) or normal (1/30 second) frame readout, (b) techniques for the elimination or suppression of CCD blemish effects, and (c) automatic light control and video gain control techniques to eliminate or minimize sensor overload due to bright objects in the scene. Preferred approaches were determined and integrated into a design which addresses the program requirements for a deliverable solid state TV camera.

  6. Printed circuit board for a CCD camera head

    DOEpatents

    Conder, Alan D.

    2002-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, versatile, and capable of operating both in and out of a vacuum environment. It uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows close (0.04", for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high-energy-density plasmas, for a variety of military, industrial, and medical imaging applications.

  7. High-resolution CCD camera family with a PC host

    NASA Astrophysics Data System (ADS)

    Raanes, Chris A.; Bottenberg, Les

    1993-05-01

    EG&G Reticon and Adaptive Optics Associates have developed a family of high-resolution CCD cameras with a PC/AT host to fulfill imaging applications from medical science to industrial inspection. The MC4000 family of CCD cameras encompasses resolutions of 512 × 512, 1024 × 1024, and 2048 × 2048 pixels. All three of these high-performance cameras interface to the SB4000 PC/AT controller, which serves as a frame buffer with up to 64 MBytes of storage and provides all the required control and setup parameters, while the camera head can be located remotely at distances of up to 100 ft. All of the MC4000 high-resolution cameras employ MPP clocking to achieve high dynamic range without cooling the CCD sensor. The use of this low-power clocking technique, surface-mount components, an electronic shutter, and clever packaging has allowed Reticon to deliver the MC4000 cameras in convenient, rugged, small housings. The MC4000 family provides users with a total imaging solution, from leading-edge sensors and electronics in ruggedized housings to cables, power supplies, and a PC/AT frame-buffer and controller card. All the components are designed to function together as a turn-key, self-contained system, or individual components can become part of a user's larger system. The MC4000 CCD camera family makes high-resolution electronic imaging an accessible tool for a wide range of applications.

  8. Solid-State Video Camera for the Accelerator Environment

    SciTech Connect

    Brown, R

    2004-05-27

    Solid-State video cameras employing CMOS technology have been developed and tested for several years in the SLAC accelerator; notably the PEPII (BaBar) injection lines. They have proven much more robust than their CCD counterparts in radiation areas. Repair is simple, inexpensive, and generates very little radioactive waste.

  9. A simple transputer-based CCD camera controller

    NASA Astrophysics Data System (ADS)

    Waltham, N. R.; van Breda, I. G.; Newton, G. M.

    1990-07-01

    A CCD camera controller based on a transputer chip is described which is used for sequencing programmable waveform patterns, including windowed and pixel binning formats. A key feature is the significant reduction in component count over previous systems made possible by the versatility and unique features of the transputer. The outcome of this is that a complete controller for a single-chip CCD camera may be accommodated in a small enclosure on the cryostat used to cool the chip. Alternatively, extra drive cards may be added to the waveform sequencer to drive a multichip camera. The use of the transputer allows the controller to be incorporated naturally into a parallel processing system whereby several cameras can be operated within the telescope environment using simple serial control and data transfer links.

  10. Driving techniques for high frame rate CCD camera

    NASA Astrophysics Data System (ADS)

    Guo, Weiqiang; Jin, Longxu; Xiong, Jingwu

    2008-03-01

    This paper describes a high-frame-rate CCD camera capable of operating at 100 frames/s. The camera utilizes the Kodak KAI-0340, an interline-transfer CCD with 640 (horizontal) × 480 (vertical) pixels. Two output ports are used to read out the CCD data, with pixel rates approaching 30 MHz. Because the vertical charge-transfer registers of an interline-transfer CCD are not fully opaque, the device can produce undesired image artifacts such as random white spots and smear generated in the registers. To increase the frame rate, a speed-up structure has been incorporated inside the KAI-0340, which makes it vulnerable to a vertical-stripe artifact. These artifacts can severely impair image quality, so several electronic methods of eliminating them are adopted. A special clocking mode dumps the unwanted charge quickly, and fast readout of the smear-free image follows immediately. An amplifier senses and corrects the delay mismatch between the dual-phase vertical clock pulses so that the transition edges become nearly coincident and the vertical stripes disappear. Results obtained with the CCD camera are shown.

  11. Color measurements using a colorimeter and a CCD camera

    SciTech Connect

    Spratlin, T.L.; Simpson, M.L.

    1992-02-01

    Two new techniques are introduced for measuring the color content of printed graphic images, with applications to web inspection such as the detection of color flaws and the measurement of color quality. The techniques involve algorithms for combining the information obtained from commercially available CCD color cameras and colorimeters to produce a colorimeter system with pixel resolution. 9 refs.

  12. High frame rate CCD cameras with fast optical shutters for military and medical imaging applications

    SciTech Connect

    King, N.S.P.; Albright, K.; Jaramillo, S.A.; McDonald, T.E.; Yates, G.J.; Turko, B.T.

    1994-09-01

    Los Alamos National Laboratory has designed and prototyped high-frame rate intensified/shuttered Charge-Coupled-Device (CCD) cameras capable of operating at kilohertz frame rates (non-interlaced mode) with optical shutters capable of acquiring nanosecond-to-microsecond exposures each frame. These cameras utilize an Interline Transfer CCD, the Loral Fairchild CCD-222 with 244 × 380 pixels, operated at pixel rates approaching 100 MHz. Initial prototype designs demonstrated single-port serial readout rates exceeding 3.97 kHz with greater than 5 lp/mm spatial resolution at shutter speeds as short as 5 ns. Readout was achieved by using a truncated format of 128 × 128 pixels, obtained by partially masking the CCD and then subclocking the array at approximately a 65 MHz pixel rate. Shuttering was accomplished with a proximity-focused microchannel plate (MCP) image intensifier (MCPII) that incorporated a high-strip-current MCP and a design modification for high-speed stripline gating geometry to provide both fast shuttering and high repetition rate capabilities. Later camera designs use a close-packed quadruple-head geometry fabricated from an array of four separate CCDs (a pseudo 4-port device). This design provides four video outputs with optional parallel or time-phased sequential readout modes. The quad-head format was designed for flexible coupling to various image intensifier configurations: individual intensifiers for each CCD imager; a single intensifier with fiber-optic or lens/prism-coupled fanout of the input image shared by the four CCD imagers; or the large-diameter phosphor screen of a gateable framing-type intensifier for time-sequential relaying of a complete new input image to each CCD imager. Camera designs and their potential use in ongoing military and medical time-resolved imaging applications are discussed.

  13. Series of CCD cameras for low-light-level applications

    NASA Astrophysics Data System (ADS)

    Peri, Michal L.; Weaver, Daniel W.; Ambrose, Tom P.; Hirpara, Dan; Gallagher, Susan; Hall, Andrew M.; Bone, Gregg

    1996-03-01

    We describe a series of five CCD cameras designed by Gordian for low-light-level applications. The first device is a low-cost non-imaging astronomical autoguiding tracker based on the Texas Instruments TC255 CCD chip and an MC6811 microcontroller. Mounted off-axis, it provides standardized tracking-motor signals for any telescope with a dual-axis drive corrector, automatically compensating for the mechanical peculiarities of the drive, set-up factors, and pointing errors. The tracker can guide to +/- 1 arcsec on an 8th magnitude star when used with an 8-inch aperture, f/10 telescope. The basic autoguider design has been extended to produce self-contained 8-bit and 16-bit imaging cameras with autoguiding functionality. Images are buffered in PSRAM, then relayed to a host PC via an RS-232 serial connection. The addition of regulated thermoelectric cooling reduces CCD thermal noise and alleviates dark-current saturation. Gordian has also designed two high-resolution cameras based on the Kodak KAF-0400 and KAF-1600 CCDs. The cameras produce 16-bit images with 768 × 512 pixels or 1536 × 1024 pixels, respectively; pixel size is 9 micrometers square. The camera head contains the CCD, thermoelectric cooling mechanism, analog electronics, and a custom-designed electromechanical shutter based on Flexinol(TM) actuator wire. A separate base unit houses a Motorola 68306 microprocessor and associated electronics for telescope control and on-board image processing. A stepper-motor-based filter wheel can be attached directly to the camera head. The camera communicates with a personal computer via a SCSI or serial connection. Software for the host PC provides additional control options, data storage, and image-processing capability.

  14. Developments in the EM-CCD camera for OGRE

    NASA Astrophysics Data System (ADS)

    Tutt, James H.; McEntaffer, Randall L.; DeRoo, Casey; Schultz, Ted; Miles, Drew M.; Zhang, William; Murray, Neil J.; Holland, Andrew D.; Cash, Webster; Rogers, Thomas; O'Dell, Steve; Gaskin, Jessica; Kolodziejczak, Jeff; Evagora, Anthony M.; Holland, Karen; Colebrook, David

    2014-07-01

    The Off-plane Grating Rocket Experiment (OGRE) is a sub-orbital rocket payload designed to advance the development of several emerging technologies for use on space missions. The payload consists of a high resolution soft X-ray spectrometer based around an optic made from precision cut and ground, single crystal silicon mirrors, a module of off-plane gratings and a camera array based around Electron Multiplying CCD (EM-CCD) technology. This paper gives an overview of OGRE with emphasis on the detector array; specifically this paper will address the reasons that EM-CCDs are the detector of choice and the advantages and disadvantages that this technology offers.

  15. The development of large-aperture test system of infrared camera and visible CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment and applications. Testing such a system with a traditional infrared camera test system and a separate visible CCD test system requires two rounds of installation and alignment. The large-aperture test system for infrared and visible CCD cameras shares a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces both the cost and the time spent on installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design reduces the shift of the collimator's focal position as the environmental temperature changes, improving both the image quality of the wide-field collimator and the test accuracy. Its performance matches that of comparable foreign systems at a much lower cost, giving it good market prospects.

  16. Design and application of TEC controller Using in CCD camera

    NASA Astrophysics Data System (ADS)

    Gan, Yu-quan; Ge, Wei; Qiao, Wei-dong; Lu, Di; Lv, Juan

    2011-08-01

    A thermoelectric cooler (TEC) is a solid-state heat pump based on the Peltier effect; it is small, light, and noiseless. When the temperature difference between the hot side and the cold side is stable, the cooling capacity is proportional to the TEC working current, and both heating and cooling can be controlled by changing the magnitude and direction of the current through the device. Thermoelectric cooling is therefore well suited to cooling CCD devices. The E2V scientific image sensor CCD47-20 integrates the TEC and CCD in one package, which simplifies the electrical design. The hardware and software of the TEC controller were designed around the CCD47-20, which is packaged with an integral solid-state Peltier cooler. In the hardware, an 80C51 MCU serves as the CPU, and an 8-bit ADC and an 8-bit DAC form a closed-loop control system. The control output is computed from the temperature sampled by a thermistor inside the CCD, and the TEC is driven by a MOSFET in a constant-current driving circuit. In the software, good control precision and convergence speed are obtained with a PID control algorithm whose proportional, integral, and derivative coefficients are tuned. The results show that, provided the heat dissipation on the hot side of the TEC keeps its temperature stable, with a sampling period of 2 seconds the temperature control rate is 5°C/min, the achievable temperature difference reaches -40°C, and the control precision is 0.3°C. When the hot-side temperature is stable at °C, the CCD temperature can reach -°C, and the thermal noise of the CCD is less than 1 e-/pix/s. The control system suppresses the dark-current noise of the CCD and increases the SNR of the camera system.
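
    A minimal sketch of the PID loop described above. The gains, the toy thermal plant, and its constants are illustrative assumptions, not the paper's tuned values; only the 2 s sampling period and -40 °C target come from the abstract:

```python
class PID:
    """Discrete PID controller producing a signed TEC drive value."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive an invented first-order thermal model toward a -40 °C setpoint,
# sampling every 2 s as in the paper.
pid = PID(kp=0.8, ki=0.05, kd=0.1, dt=2.0)
temp = 20.0
for _ in range(600):
    drive = pid.update(-40.0, temp)                       # TEC drive (a.u.)
    temp += (0.5 * drive - 0.01 * (temp - 20.0)) * 2.0    # toy plant response
```

    The integral term is what lets the loop hold the setpoint against the constant heat leak; the derivative term damps the approach.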

  17. Frame transfer CCD driving circuit design for space camera

    NASA Astrophysics Data System (ADS)

    Chen, Xin-hua; Zhou, Jian-kang; Zhou, Wang; Shen, Wei-min

    2008-03-01

    Micro-satellites are characterized by miniaturized structure and low cost, so an area-array CCD space camera is a good choice for the imaging system on a micro-satellite. The FT-18 is a monochrome frame transfer image sensor offering 1024 × 1024 pixels with excellent antiblooming and variable electronic shuttering. The main components of the FT-18 driving circuit are the power supply unit, the microcontroller unit, the clock signal generator unit, and the analog-to-digital (A/D) converter. The microcontroller unit controls the startup sequence of all voltages, the exposure time of the CCD, and the working status of the A/D converter; the clock signal generator unit produces the timing signals for the CCD and the A/D converter; and the A/D converter converts the output of the FT-18 into a 12-bit digital output. Special attention must be paid to the reliability of this camera, since it will operate under conditions very different from those on the ground: it may experience vacuum discharge, particle radiation, strong shock, hypergravity, and so on. All of these must be considered in the design of a space camera, and sufficient environmental testing is needed to ensure it works normally in space.

  18. CCD camera for dual-energy digital subtraction angiography.

    PubMed

    Molloi, S; Ersahin, A; Qian, Y J

    1995-01-01

    A motion-immune dual-energy subtraction technique, in which the X-ray tube voltage and beam filtration were switched at 30 Hz between 60 kVp (2.0 mm Al filter) and 120 kVp (2.0 mm Al + 2.5 mm Cu filter), was previously reported. In this study the effects of camera lag on the dual-energy iodine signal are investigated. The temporal lag of the lead oxide vidicon tested reduced the dual-energy iodine signal by a factor of 2.3, as compared to a mode that included 4 scrub frames between low- and high-energy images, for an iodine phantom with thicknesses of 0-86.0 mg/cm², imaged over a 15 cm thick Lucite phantom. The Charge-Coupled Device (CCD) camera, on the other hand, has inherently no temporal lag, and its versatile scanning characteristics make it nearly ideal for dual-energy DSA. The CCD camera eliminates the reduction of the dual-energy iodine signal, since it does not mix low- and high-energy image data. Another benefit of the CCD camera is that the separation time between low- and high-energy images is not limited to the frame period, as it is for the lead oxide vidicon; a time difference as small as 5 msec is possible. The short time interval between low- and high-energy images minimizes motion misregistration artifacts. Due to these advantages, the CCD camera significantly improves the utility of dual-energy DSA. PMID:18215878
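
    Dual-energy subtraction combines log-transformed low- and high-energy images with a weight chosen to cancel the background (Lucite/tissue) term while the iodine term survives. A numerical sketch of that principle, with invented attenuation coefficients standing in for the system's calibration values:

```python
import math

# Illustrative mass attenuation coefficients (cm^2/g) for the two beams;
# real values come from calibration, not these placeholders.
MU = {"iodine": {"low": 30.0, "high": 4.0},
      "tissue": {"low": 0.25, "high": 0.16}}

def dual_energy_iodine(i_low, i_high):
    """Weighted log subtraction with the weight chosen to null tissue."""
    w = MU["tissue"]["low"] / MU["tissue"]["high"]
    return -math.log(i_low) + w * math.log(i_high)

def transmitted(energy, tissue_gcm2, iodine_gcm2):
    """Transmission through tissue plus iodine (unit incident intensity)."""
    mu = MU["tissue"][energy] * tissue_gcm2 + MU["iodine"][energy] * iodine_gcm2
    return math.exp(-mu)

# 15 g/cm^2 of background, with and without 50 mg/cm^2 of iodine.
signal_bg = dual_energy_iodine(transmitted("low", 15.0, 0.0),
                               transmitted("high", 15.0, 0.0))
signal_io = dual_energy_iodine(transmitted("low", 15.0, 0.05),
                               transmitted("high", 15.0, 0.05))
# signal_bg ~ 0: background cancels; signal_io > 0: iodine survives.
```

    Camera lag corrupts this subtraction precisely because it mixes residual low-energy intensity into the high-energy frame before the logarithms are taken.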

  19. Development of an all-in-one gamma camera/CCD system for safeguard verification

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Il; An, Su Jung; Chung, Yong Hyun; Kwak, Sung-Woo

    2014-12-01

    For the purpose of monitoring and verifying efforts at safeguarding radioactive materials in various fields, a new all-in-one gamma camera/charge-coupled device (CCD) system was developed. This combined system consists of a gamma camera, which gathers energy and position information on gamma-ray sources, and a CCD camera, which identifies the specific location in a monitored area. Therefore, 2-D image information and quantitative information regarding gamma-ray sources can be obtained from the fused images. The gamma camera consists of a diverging collimator, a 22 × 22 array of CsI(Na) pixelated scintillation crystals with a pixel size of 2 × 2 × 6 mm³, and a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). The Basler scA640-70gc CCD camera, which delivers 70 frames per second at video graphics array (VGA) resolution, was employed. Performance testing was performed using a Co-57 point source placed 30 cm from the detector. The measured spatial resolution and sensitivity were 4.77 mm full width at half maximum (FWHM) and 7.78 cps/MBq, respectively, and the energy resolution was 18% at 122 keV. These results demonstrate that the combined system has considerable potential for radiation monitoring.

  20. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
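
    The ~136-year repeat period is consistent with a seconds counter 32 bits wide. That interpretation is an assumption here (the abstract does not specify the Geo-TimeCode field layout), but the arithmetic is easy to check:

```python
SECONDS_PER_YEAR = 365.25 * 86400          # Julian year in seconds
rollover_years = 2**32 / SECONDS_PER_YEAR
# ~136.1 years before a 32-bit seconds counter wraps, matching the
# "slightly more than 136 years" quoted, versus the 24-hour repeat
# of conventional video timecode.
```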

  1. Suppression of multiple scattering with a CCD camera detection scheme

    NASA Astrophysics Data System (ADS)

    Zakharov, Pavel; Schurtenberger, Peter; Scheffold, Frank

    2005-06-01

    We introduce a CCD camera detection scheme in dynamic light scattering that provides information on the singly scattered auto-correlation function even for fairly turbid samples. Our approach gives access to the extensive range of systems that show low-order scattering, by selective detection of the singly scattered light. Model experiments on slowly relaxing suspensions of latex spheres in glycerol were carried out to verify the validity range of our approach.
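
    In a CCD-based (multi-speckle) dynamic light scattering scheme, each pixel records an independent speckle, so the intensity autocorrelation g₂(τ) can be averaged over pixels as well as time. A sketch of that estimator on synthetic data; the AR(1) fluctuation model and its parameters are illustrative, not the paper's samples:

```python
import numpy as np

def g2(intensity, max_lag):
    """Multi-speckle estimate of g2(tau): correlate each pixel's time
    trace with itself at lag tau, then average over pixels and times."""
    n = intensity.shape[0]
    mean_sq = intensity.mean() ** 2
    return np.array([(intensity[: n - lag] * intensity[lag:]).mean() / mean_sq
                     for lag in range(max_lag)])

# Synthetic stack: 2000 frames x 500 pixels of AR(1) fluctuations about a
# unit mean, so the correlation decays exponentially with lag.
rng = np.random.default_rng(1)
a, frames, pixels = 0.9, 2000, 500
x = np.zeros((frames, pixels))
for t in range(1, frames):
    x[t] = a * x[t - 1] + 0.05 * rng.standard_normal(pixels)
g2_vals = g2(1.0 + x, max_lag=40)   # decays from g2(0) > 1 toward 1
```

    Averaging over many pixels is what makes slow relaxations measurable: the ensemble average replaces the impractically long time average a single-detector experiment would need.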

  2. The SXI: CCD camera onboard the NeXT mission

    NASA Astrophysics Data System (ADS)

    Tsunemi, Hiroshi; Tsuru, Takeshi Go; Dotani, Tadayasu; Hayashida, Kiyoshi; Bautz, Marshall W.

    2008-07-01

    The Soft X-ray Imager (SXI) is the X-ray CCD camera on board the NeXT mission, which is to be launched around 2013. We are going to employ CCD chips developed at Hamamatsu Photonics, K.K. We have been developing two types of CCD: an N-channel chip and a P-channel chip. The effective area of the detector system will be 5-6 cm square, with a depletion layer of 100-200 μm. The P-channel chip will have a thicker depletion layer, which makes it easier to develop into a back-illuminated CCD. It will take a year or so to reach a final conclusion on which type will be adopted. Based on the Suzaku experience, we will incorporate a charge injection gate so that we can reduce the proton damage. Furthermore, we will employ a mechanical cooler to keep the CCD working temperature down to -120°C even though NeXT will be in low Earth orbit, so we expect the radiation damage to our system to be very small. The CCD will have an Al coating on the chip to prevent optical photons from entering; this also eliminates the need for a vacuum-tight chamber and a door-opening mechanism. We are planning to employ a custom-made analog ASIC that will reduce the power consumption and the size; the ASIC may also shorten the frame time if we can use a multi-node CCD. With a focal length of 6 m, the SXI will function fully with optics of around 20" resolution. We report the current plan for the SXI in detail.

  3. Use Of A C.C.D. Array In An X-Ray Pinhole Camera

    NASA Astrophysics Data System (ADS)

    Cavailler, C.; Henry, Ph.; Launspach, J.; Mens, A.; Rostaing, M.; Sauneuf, R.

    1985-02-01

    X-ray imaging adapted to laser-matter interaction experiments consists in recording plasma images from their X-ray emission; these phenomena last between 100 ps and a few nanoseconds. When only spatial information on the 1-10 keV X-ray emission is needed, the simplest imaging device is the pinhole camera; the two-dimensional image of the plasma is temporally integrated by an X-ray-sensitive detector. Until now, X-ray film was used, but its operation and processing were long and tedious, so we replaced it with a television camera built around a Charge Coupled Device (C.C.D.). This camera is directly integrated into the pinhole camera. X-ray detection is performed by the silicon substrate of a windowless C.C.D. operating in the vacuum of the experiment chamber; a compact camera head (40 mm diameter, 120 mm length) located 1 to 2 cm from the C.C.D. performs the charge/voltage conversion and signal amplification. Immediate exploitation of the images is done by an image acquisition and processing unit after digitizing the video signal on 8 bits. From measurements made on a continuous X-ray source (5.4 keV), we found that a THOMSON-CSF THX 31135 CCD is 10 times more sensitive than the Kodak SB2 X-ray film that we use in pinhole cameras. The dynamic range measured under these conditions was about 300. The first experimental results obtained on a pulsed X-ray source are presented.

  4. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube; this assures that a normalized video signal is provided over a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit; this assures that the normalized video signal is maintained over a second illumination range.
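
    The two-range behaviour can be sketched as a single normalization servo: the iris loop handles bright scenes, and once the iris is fully open the gain loop takes over. This is a schematic illustration of the control idea only, not the patented circuit; all names and limits are invented:

```python
def video_level_servo(scene_lum, target=1.0, iris_max=1.0, gain_max=8.0):
    """Return (iris, gain) so that scene_lum * iris * gain ~ target.
    The iris covers the first illumination range; the AGC loop covers
    the second range once the iris saturates fully open."""
    iris = min(iris_max, target / scene_lum) if scene_lum > 0 else iris_max
    residual = scene_lum * iris
    gain = min(gain_max, target / residual) if residual > 0 else gain_max
    return iris, gain

iris, gain = video_level_servo(4.0)      # bright: iris stops down, gain = 1
iris2, gain2 = video_level_servo(0.25)   # dim: iris wide open, gain boosts
```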

  5. High frame rate CCD camera with fast optical shutter

    SciTech Connect

    Yates, G.J.; McDonald, T.E. Jr.; Turko, B.T.

    1998-09-01

    A high frame rate CCD camera coupled with a fast optical shutter has been designed for high repetition rate imaging applications. The design uses state-of-the-art microchannel plate image intensifier (MCPII) technology fostered/developed by Los Alamos National Laboratory to support nuclear, military, and medical research requiring high-speed imagery. Key design features include asynchronous resetting of the camera to acquire random transient images; patented real-time analog signal processing with 10-bit digitization at 40-75 MHz pixel rates; synchronized shutter exposures as short as 200 ps; and sustained continuous readout of 512 × 512 pixels per frame at 1-5 Hz rates via parallel multiport (16-port CCD) data transfer. Salient characterization/performance test data for the prototype camera are presented, and temporally and spatially resolved images obtained from range-gated LADAR field testing are included. An alternative system configuration using several cameras sequenced to deliver discrete numbers of consecutive frames at effective burst rates up to 5 GHz (accomplished by time-phasing of consecutive MCPII shutter gates without overlap) is discussed, along with potential applications including dynamic radiography and optical correlation.

  6. The U. H. Institute for Astronomy CCD Camera Control System

    NASA Astrophysics Data System (ADS)

    Jim, K. T. C.; Yamada, H. T.; Luppino, G. A.; Hlivak, R. J.

    1993-01-01

    The U. H. CCD Camera Control System consists of a NeXT workstation, a graphical user interface, and a fiber optics interface which is connected to a San Diego State University CCD controller. The U. H. system employs the NeXT-resident Motorola DSP 56001 as a real-time hardware controller, interfaced to the Mach-based UNIX of the NeXT workstation by DMA. Since the SDSU controller also uses the DSP 56001, the NeXT is used as a development platform for the embedded control software. The fiber optic interface links the two DSP 56001s through their Synchronous Serial Interfaces. The user interface is based on the NeXTStep windowing system; it is easy to use and features real-time display of image data and control over all camera functions. Both Loral and Tektronix 2048 × 2048 CCDs have been driven at full readout speeds, and the system is designed to read out four such CCDs simultaneously. The total hardware package is compact and portable, and has been used on five different telescopes on Mauna Kea. The complete CCD control system can be assembled at very low cost. The hardware and software of the control system have proven to be reliable, well adapted to the needs of astronomers, and extensible to increasingly complicated control requirements.

  7. Two-wavelength microscopic speckle interferometry using colour CCD camera

    NASA Astrophysics Data System (ADS)

    Upputuri, Paul K.; Pramanik, Manojit; Kothiyal, Mahendra P.; Nandigana, Krishna M.

    2015-03-01

    Single-wavelength microscopic speckle interferometry is widely used for deformation, shape, and non-destructive testing (NDT) of engineering structures. However, the single-wavelength configuration fails to quantify large deformations because of overcrowding of fringes, and it cannot provide the shape of a specimen under test. In this paper, we discuss two-wavelength microscopic speckle interferometry using a single-chip colour CCD camera for the characterization of microsamples. The use of a colour CCD allows simultaneous acquisition of speckle patterns at two different wavelengths, making data acquisition as simple as in the single-wavelength case. For quantitative measurement, an error-compensating 8-step phase-shifting algorithm is used. The system allows quantification of large deformations and of the shape of specimens with rough surfaces. The design of the system, along with experimental results on small-scale rough specimens, is presented.
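
    Two-wavelength interferometry extends the unambiguous range to the synthetic wavelength Λ = λ₁λ₂/|λ₁ − λ₂|, which is why it can handle deformations that overcrowd single-wavelength fringes. A quick illustration with an assumed wavelength pair (633 nm and 532 nm; the paper's actual wavelengths are not given in the abstract):

```python
def synthetic_wavelength(lam1_nm, lam2_nm):
    """Equivalent (synthetic) wavelength of a two-wavelength interferometer."""
    return lam1_nm * lam2_nm / abs(lam1_nm - lam2_nm)

lam_eq = synthetic_wavelength(633.0, 532.0)
# ~3334 nm: phase evaluated at the synthetic wavelength tolerates
# deformations several times larger than either wavelength alone.
```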

  8. Design of 300 frames per second 16-port CCD video processing circuit

    NASA Astrophysics Data System (ADS)

    Yang, Shao-hua; Guo, Ming-an; Li, Bin-kang; Xia, Jing-tao; Wang, Qunshu

    2011-08-01

    It is hard to achieve speeds of hundreds of frames per second in high-resolution charge coupled device (CCD) cameras, because the pixel charges must be read out one by one in serial mode, which takes considerable time. Multiple-port CCD technology is an efficient way to realize high frame rate, high-resolution solid-state imaging systems: the pixel charge is read out through several ports in parallel, which shortens the readout time. However, the video processing circuit for a multiple-port CCD is difficult to design, and real-time high-speed image data acquisition is also a knotty problem. A 16-port high frame rate CCD video processing circuit based on a Complex Programmable Logic Device (CPLD) and the VSP5010 has been developed around a specialized back-illuminated, 512 × 512 pixel, 400 fps (frames per second) frame transfer CCD sensor from Sarnoff Ltd. The CPLD produces a high-precision sample clock and timing, and accurate sampling of the CCD video voltage is achieved with Correlated Double Sampling (CDS). Eight VSP5010 chips with CDS capability sample and digitize the CCD analog signals into 12-bit image data, so the 16 analog CCD outputs are digitized into 192-bit-wide, 6.67 MHz parallel image data. The CPLD and Time Division Multiplexing (TDM) then encode the 192-bit-wide data into two 640 MHz serial streams transmitted to a remote data acquisition module via two fibers. The acquisition module decodes the serial data back into the original image data and stores it in a frame cache, from which software reads the data via USB 2.0 and stores it on a hard disk. The digital image data, with 12 bits per pixel, were collected and displayed with the system software. The results show that the 16-port, 300 fps CCD output signals can be digitized and transmitted with this video processing circuit, and that remote data acquisition has been realized.
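
    Correlated Double Sampling suppresses the reset (kTC) noise that is common to the reset and signal levels of each pixel: the circuit samples both levels and digitizes only the difference. A numerical sketch of the idea, with invented noise amplitudes in ADU:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_signal = 250.0                            # photo-charge contribution (ADU)
reset_noise = 40.0 * rng.standard_normal(n)    # kTC noise, shared by both samples
white = 2.0                                    # uncorrelated amplifier noise (ADU)

reset_level = 1000.0 + reset_noise + white * rng.standard_normal(n)
signal_level = 1000.0 + reset_noise + true_signal + white * rng.standard_normal(n)

cds = signal_level - reset_level
# The shared reset noise cancels exactly, leaving only sqrt(2) x the white
# noise (~2.8 ADU) instead of the ~40 ADU of a single uncorrelated sample.
```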

  9. Calibration tool for a CCD-camera-based vision system

    NASA Astrophysics Data System (ADS)

    Xu, Gan; Tan, Siew Leng; Low, Siok Pheng; Heng, Yee S.; Lai, Weng C.; Du, Xianhe

    2000-11-01

    A special calibration tool has been developed for a CCD-camera-based vision system in an automatic assembly machine. The machine is used to attach orifice plates onto a silicon wafer in a production process. The center locations of the positioning circular holes on the plate must be controlled accurately to coincide with those on the wafer die before the two are attached by UV curing. Although CCD-camera-based vision systems are widely used for accurate positioning and dimensional measurements in the precision engineering, electronics, and semiconductor industries, they are normally calibrated with artefacts bearing plane patterns, which restricts calibration to two-dimensional measurements. The calibration tool we developed checks the positioning accuracy of circular objects in a two-layered structure. It can also be used to determine parallax errors, non-linearity and spatial non-uniformity errors, as well as the repeatability of the vision system, with an uncertainty at the sub-micrometer level. The design, calibration, and performance of the tool are described in detail in this paper.

  10. Optical system based on a CCD camera for ethanol detection

    NASA Astrophysics Data System (ADS)

    Martínez-Hipatl, C.; Muñoz-Aguirre, S.; Muñoz-Guerrero, R.; Castillo-Mixcóatl, J.; Beltrán-Pérez, G.; Gutiérrez-Salgado, J. M.

    2013-10-01

    This work reports the optimization of an optical system used to detect and quantify volatile organic compounds (VOC). The sensor consisted of a polydimethylsiloxane (PDMS) sensing film deposited on a glass substrate by the spin-coating technique. PDMS has the property of swelling and/or changing its refractive index when it interacts with VOC molecules in the vapor phase. To measure the PDMS swelling, a charge-coupled device (CCD) camera was employed to evaluate the interference fringe shift in a Pohl interferometric arrangement. With this approach, each pixel of the CCD camera can be used as a single photodetector in the arrangement. In addition, computer algorithms were developed to acquire and process the data; the improvements in the system allowed the acquisition and plotting of one data point per second. The steady-state responses of the PDMS sensors in the presence of ethanol vapor were analyzed, and the results showed that the noise level was reduced by approximately a factor of three after data processing.

  11. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  12. Research of fiber position measurement by multi CCD cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Zengxiang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru; Liu, Zhigang

    2014-07-01

    The parallel-controlled fiber positioner, an efficient observation system, has been used in LAMOST for four years and has been proposed for ngCFHT and the rebuilt Mayall telescope. The fiber positioner research group at USTC has designed a new-generation prototype built from close-packed modules of robotic positioner mechanisms. The prototype includes about 150 fiber positioning modules plugged into a 1-meter-diameter honeycombed focal plane, each module carrying 37 fiber positioners of 12 mm diameter. Furthermore, the new system tightens the positioning accuracy from 40 um in LAMOST to 10 um in MSDESI, which poses a new challenge for measurement. A closed-loop control system is used: the CCD cameras capture images of the fiber tips across the focal plane, and the precise position information is calculated and fed back to the control system. After the positioners have iterated through several correction loops, the accuracy of all positioners is confined to less than 10 um. We report our component development and the performance measurement program of the new measuring system using multiple CCD cameras. With stereo vision and image processing methods, we precisely measure the 3-dimensional position of the fiber tip carried by each fiber positioner. Finally, we present baseline parameters for fiber positioner measurement as a reference for next-generation survey telescope design.
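
    Micrometre-level positions are recovered from the CCD images by sub-pixel centroiding of each back-illuminated fiber-tip spot before the stereo reconstruction. A minimal sketch of the centroiding step; the Gaussian spot model and its parameters are illustrative assumptions:

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (row, col) with sub-pixel precision."""
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total

# Synthetic fiber-tip spot: Gaussian centred at (20.3, 17.8) pixels.
rows, cols = np.indices((40, 40))
spot = np.exp(-(((rows - 20.3) ** 2) + (cols - 17.8) ** 2) / (2 * 3.0 ** 2))
r, c = centroid(spot)   # recovers the centre to well under 0.1 pixel
```

    With a well-sampled spot the centroid is accurate to a small fraction of a pixel, which is how a camera with coarse pixel scale can close the loop to 10 um on the focal plane.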

  13. Close-Range Photogrammetry with Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  14. Close-range photogrammetry with video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  15. The European Photon Imaging Camera on XMM-Newton: The pn-CCD camera

    NASA Astrophysics Data System (ADS)

    Strüder, L.; Briel, U.; Dennerl, K.; Hartmann, R.; Kendziorra, E.; Meidinger, N.; Pfeffermann, E.; Reppin, C.; Aschenbach, B.; Bornemann, W.; Bräuninger, H.; Burkert, W.; Elender, M.; Freyberg, M.; Haberl, F.; Hartner, G.; Heuschmann, F.; Hippmann, H.; Kastelic, E.; Kemmer, S.; Kettenring, G.; Kink, W.; Krause, N.; Müller, S.; Oppitz, A.; Pietsch, W.; Popp, M.; Predehl, P.; Read, A.; Stephan, K. H.; Stötter, D.; Trümper, J.; Holl, P.; Kemmer, J.; Soltau, H.; Stötter, R.; Weber, U.; Weichert, U.; von Zanthier, C.; Carathanassis, D.; Lutz, G.; Richter, R. H.; Solc, P.; Böttcher, H.; Kuster, M.; Staubert, R.; Abbey, A.; Holland, A.; Turner, M.; Balasini, M.; Bignami, G. F.; La Palombara, N.; Villa, G.; Buttler, W.; Gianini, F.; Lainé, R.; Lumb, D.; Dhez, P.

    2001-01-01

    The European Photon Imaging Camera (EPIC) consortium has provided the focal plane instruments for the three X-ray mirror systems on XMM-Newton. Two cameras with a reflecting grating spectrometer in the optical path are equipped with MOS-type CCDs as focal plane detectors (Turner \cite{mturner}); the telescope with the full photon flux operates the novel pn-CCD as an imaging X-ray spectrometer. The pn-CCD camera system was developed under the leadership of the Max-Planck-Institut für extraterrestrische Physik (MPE), Garching. The concept of the pn-CCD is described, as well as the different operational modes of the camera system. The electrical, mechanical, and thermal design of the focal plane and camera is briefly treated. The in-orbit performance is described in terms of energy resolution, quantum efficiency, time resolution, long-term stability, and charged particle background. Special emphasis is given to the radiation hardening of the devices and the measured and expected degradation due to radiation damage from ionizing particles in the first 9 months of in-orbit operation. Based on observations with XMM-Newton, an ESA Science Mission with instruments and contributions directly funded by ESA Member States and the USA (NASA).

  16. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R&D activities on high-speed video cameras carried out at Kinki University over more than ten years, currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been conducted, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and interviews, and (2) on the current availability of cameras of this sort, by searching journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors, using the same sensor developed for the previous camera, followed in 1996; its frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, design of a prototype ISIS is underway, and it will hopefully be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  17. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  18. Video imagers with low speed CCD and LC based on temporal compressed

    NASA Astrophysics Data System (ADS)

    Zhong, Xiaoming; Li, Huan; Zhao, Haibo; Liu, Yanli

    2015-08-01

    Traditional video imagers require a high-speed CCD. We present a new method to implement a video imager with a low-speed CCD detector, based on temporal video compression. Using a low-speed CCD detector and a transmissive liquid crystal (LC) modulator in place of a high-speed CCD, the system acquires a data cube; a data-processing method then reconstructs the compressed video data with high precision. Theoretical analysis and experimental results show that the approach not only preserves video imaging quality but also greatly reduces the frame rate required of the detector and the complexity of the video imaging system.
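    A toy sketch of the forward model behind this kind of temporal compressive imaging, under the assumption of binary LC masks: T fast frames are modulated per pixel and summed into a single slow CCD exposure. The back-projection estimate below is only a stand-in for the paper's reconstruction, which would use a proper compressed-sensing solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward model: y = sum_t m_t * x_t, one slow CCD frame from T fast frames.
T, H, W = 8, 4, 4
x = rng.random((T, H, W))                        # "high-speed" scene (unknown)
masks = rng.integers(0, 2, size=(T, H, W)).astype(float)  # binary LC masks
y = (masks * x).sum(axis=0)                      # single low-speed CCD exposure

# Minimum-norm back-projection estimate of the fast frames (illustration
# only; a real reconstruction would exploit sparsity priors).
weight = (masks ** 2).sum(axis=0)
x_hat = masks * (y / np.maximum(weight, 1.0))

# Sanity check: the estimate reproduces the measurement exactly.
consistent = np.allclose((masks * x_hat).sum(axis=0), y)
```

The per-pixel problem is underdetermined (T unknowns, one measurement), which is why the quality of the reconstruction hinges on the solver and priors, not on the forward model itself.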

  19. Video Analysis with a Web Camera

    ERIC Educational Resources Information Center

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as…

  20. Laboratory calibration and characterization of video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  1. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  2. Photogrammetric Applications of Immersive Video Cameras

    NASA Astrophysics Data System (ADS)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This offset causes problems when stitching together individual frames of video separated from particular cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on a Ladybug®3 and a GPS device is discussed. The number of panoramas is far higher than photogrammetric purposes require, as the baseline between spherical panoramas is around 1 metre. More than 92,000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurement. The second approach is the generation of 3D video-based reconstructions of heritage sites from immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record an interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft Photoscan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and provides promising features for mobile mapping systems.

  3. A CCD CAMERA-BASED HYPERSPECTRAL IMAGING SYSTEM FOR STATIONARY AND AIRBORNE APPLICATIONS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes a charge coupled device (CCD) camera-based hyperspectral imaging system designed for both stationary and airborne remote sensing applications. The system consists of a high performance digital CCD camera, an imaging spectrograph, an optional focal plane scanner, and a PC comput...

  4. Video Analysis with a Web Camera

    NASA Astrophysics Data System (ADS)

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as Videopoint and Tracker, by Doug Brown, which is freely downloadable, could also be used. I purchased Logitech's QuickCam Pro 4000 web camera for $99 after Rick Sorensen at Vernier Software and Technology recommended it for computers using a Windows platform. Once I had mounted the web camera on a mobile computer with Velcro and installed the software, I was ready to capture motion video and analyze it.

  5. Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Hardware

    NASA Astrophysics Data System (ADS)

    Kang, Y.-W.; Byun, Y. I.; Rhee, J. H.; Oh, S. H.; Kim, D. K.

    2007-12-01

    We designed and developed a multi-purpose CCD camera system for three kinds of CCDs made by KODAK: KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). The system supports a fast USB port as well as a parallel port for data I/O and control signals. The packaging is based on two-stage circuit boards for size reduction and contains a built-in filter wheel. Basic hardware components include the clock pattern circuit, A/D conversion circuit, CCD data flow control circuit, and CCD temperature control unit. The CCD temperature can be controlled with an accuracy of approximately 0.4°C over a maximum temperature range of Δ33°C. The camera system has a readout noise of 6 e^{-} and a system gain of 5 e^{-}/ADU. A total of 10 CCD camera systems were produced, and our tests show that all of them perform acceptably.

  6. Video metrology using a single camera.

    PubMed

    Guo, Feng; Chellappa, Rama

    2010-07-01

    This paper presents a video metrology approach using an uncalibrated single camera that is either stationary or in planar motion. Although theoretically simple, measuring the length of even a line segment in a given video is often a difficult problem. Most existing techniques for this task are extensions of single image-based techniques and do not achieve the desired accuracy especially in noisy environments. In contrast, the proposed algorithm moves line segments on the reference plane to share a common endpoint using the vanishing line information followed by fitting multiple concentric circles on the image plane. A fully automated real-time system based on this algorithm has been developed to measure vehicle wheelbases using an uncalibrated stationary camera. The system estimates the vanishing line using invariant lengths on the reference plane from multiple frames rather than the given parallel lines, which may not exist in videos. It is further extended to a camera undergoing a planar motion by automatically selecting frames with similar vanishing lines from the video. Experimental results show that the measurement results are accurate enough to classify moving vehicles based on their size. PMID:20489235

  7. Vision system using linear CCD cameras in fluorescent magnetic particle inspection of axles of railway wheelsets

    NASA Astrophysics Data System (ADS)

    Hao, Hongwei; Li, Luming; Deng, Yuanhui

    2005-05-01

    Automatic magnetic particle inspection based on a CCD camera vision system is a new development in magnetic particle inspection. A vision system using linear CCD cameras for semiautomatic fluorescent magnetic particle inspection of axles of railway wheelsets is presented in this paper. The system includes four linear CCD cameras, a PCI data acquisition and logic control card, and an industrial computer. The characteristic striations induced by UV light flicker in the scanned images acquired by the linear CCD cameras are investigated, and digital image processing methods for images of magnetic particle indications are designed to identify the cracks, including wavelet-based image pre-processing and connected-region edge detection using the Canny operator with double thresholds. The experimental results show that the system detects axle cracks effectively and can substantially improve inspection quality and productivity.

  8. Auto-measuring system of aero-camera lens focus using linear CCD

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-ye; Zhao, Yu-liang; Wang, Shu-juan

    2014-09-01

    Automatic and accurate focal length measurement of aviation camera lenses is of great significance and practical value. The traditional measurement method depends on the human eye reading the scribed lines on the focal plane of a collimator through a reading microscope; the method is inefficient, and the results are easily influenced by human factors. Our method uses a linear solid-state image sensor in place of the reading microscope, converting the image size of a specific target into an electrical pulse width, and uses a computer to measure the focal length automatically. During measurement, the lens under test is placed in front of the objective lens of the collimator. A pair of scribed lines on the collimator's focal plane is imaged onto the focal plane of the lens under test. With the linear CCD and its drive circuit placed at the image plane, the linear CCD converts the one-dimensional light intensity distribution into a time series of electrical signals. One signal path is routed directly to a video monitor through an image acquisition card for optical path adjustment and focusing. The other path is processed by an electrical circuit to obtain the pulse width corresponding to the scribed lines. The computer processes the pulse width and outputs the focal length measurement. Practical measurements showed a relative error of about 0.10%, in good agreement with theory.
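    The geometry above reduces to one proportion: the scribed-line pair at the collimator focal plane subtends an angle y_reticle / f_collimator, its image on the test lens's focal plane spans N pixels of the linear CCD, so f_test = (N × pixel pitch) × f_collimator / y_reticle. A sketch with invented illustration values (none of the numbers are from the paper):

```python
# All values below are assumed for illustration only.
f_collimator_mm = 1000.0   # collimator focal length
y_reticle_mm = 2.0         # separation of the scribed lines on the reticle
pixel_pitch_um = 7.0       # linear CCD pixel pitch
n_pixels = 143             # measured pulse width, converted to a pixel count

# Image size on the test lens's focal plane, from the CCD pulse width.
y_image_mm = n_pixels * pixel_pitch_um / 1000.0

# Focal length of the lens under test.
f_test_mm = y_image_mm * f_collimator_mm / y_reticle_mm
print(f"measured focal length: {f_test_mm:.2f} mm")  # 500.50 mm
```

The 0.10% relative error quoted in the abstract then corresponds directly to the ±half-pixel uncertainty in the pulse-width measurement scaled by the same proportion.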

  9. Nios II implementation in CCD camera for Pi of the Sky experiment

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Rybka, Dominik; Romaniuk, Ryszard S.; Pozniak, Krzysztof T.

    2008-01-01

    The concept of an Altera Nios II embedded processor implemented inside the Field Programmable Gate Array (FPGA) of the CCD camera for the "Pi of the Sky" experiment is presented. The digital board of the CCD camera, its most important components, the current firmware implementation (VHDL) inside the FPGA, and the role of the external 8051 microcontroller are briefly described. The main goal of the presented work is to eliminate the external microcontroller and to design a new system with the Nios II processor built into the FPGA chip. Constraints on implementing the design in the existing camera boards are discussed. New possibilities offered by a larger FPGA for the next generation of cameras are considered.

  10. The image pretreatment based on the FPGA inside digital CCD camera

    NASA Astrophysics Data System (ADS)

    Tian, Rui; Liu, Yan-ying

    2009-07-01

    For a space project, a digital CCD camera that can image clearly in a 1 lux light environment was required. The ICX285AL CCD sensor produced by SONY is used in the camera, and an FPGA (Field Programmable Gate Array) chip, the XQR2V1000, serves as the timing generator and signal processor. In the low-light environment, however, two kinds of random noise become apparent as the camera's variable gain is raised: dark current noise in the image background, and vertical transfer noise. A real-time method for eliminating this noise in the FPGA inside the CCD camera is introduced, and the causes and characteristics of the noise are analyzed. First, several candidate algorithms for eliminating dark current noise were proposed; they were then simulated in VC++ to compare their speed and effect, and a Gaussian filter was chosen for its filtering performance. The vertical transfer noise has the property that the noise points fall at fixed column coordinates in the image, and its magnitude is consistent: the gray values of the noise points are 16-20 less than those of the surrounding pixels. Based on these characteristics, a local median filter is used to remove the vertical noise. Finally, the algorithms were ported to the FPGA chip inside the CCD camera. Extensive experiments proved that the pretreatment has good real-time performance and improves the camera's signal-to-noise ratio by 3-5 dB in the low-light environment.
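    A software sketch of the second stage described above: repairing fixed-column vertical-transfer noise with a local median over neighbouring columns. The column indices, background level, and neighbourhood width are assumptions; only the 16-20 DN offset comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic frame: flat background plus dark-current noise (assumed values).
img = np.full((64, 64), 100.0) + rng.normal(0.0, 2.0, (64, 64))
bad_cols = [10, 30, 50]            # hypothetical fixed vertical-noise columns
img[:, bad_cols] -= 18.0           # noise points read ~16-20 DN low

# Local median repair: replace each bad column with the row-wise median
# of its nearest good neighbours.
repaired = img.copy()
for c in bad_cols:
    neigh = img[:, [c - 2, c - 1, c + 1, c + 2]]
    repaired[:, c] = np.median(neigh, axis=1)
```

Because the noisy columns are at known, fixed coordinates, the filter touches only those columns, which is what makes a hardware implementation in the FPGA pipeline cheap.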

  11. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods to acquire multispectral images, but a dynamically band-selective, area-scan multispectral camera had not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and an electric circuit for parallel image signal processing; firmware and application software were also developed. The notable improvements over a conventional 3CCD camera are the redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the ray paths inside the prism when redesigning the image splitter; the dimensions of the splitter were then determined by simulation assuming BK7 glass and no dichroic coating, these properties being chosen so that rays of all wavelengths reach all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without displacing the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruises on Fuji apples. The results show that the filter-exchangeable 3CCD camera can provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.

  12. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
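    The core of the calibration step above is building a response curve from known input brightnesses and then inverting it. A minimal sketch, assuming a made-up saturating response in place of a real camera's measured nonlinearity:

```python
import numpy as np

# Calibration data: the artificial variable star supplies known input
# brightnesses; the integrated video signal is measured for each.
# The exponential saturation below is an invented stand-in response.
brightness = np.linspace(0.0, 10.0, 50)               # known source brightness
response = 255.0 * (1.0 - np.exp(-0.4 * brightness))  # measured integrated signal

def calibrated_brightness(signal):
    """Invert the measured response curve (monotone, so np.interp suffices)."""
    return np.interp(signal, response, brightness)

# A later measurement is mapped back through the curve to a brightness.
signal_obs = 255.0 * (1.0 - np.exp(-0.4 * 3.7))
b = calibrated_brightness(signal_obs)   # close to 3.7
```

Extending the effective range beyond saturation, as the abstract describes, additionally relies on the fixed neutral-density filters: a saturated source is re-measured through a known attenuation and the result scaled back up through the same curve.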

  13. An RS-170 to 700 frame per second CCD camera

    SciTech Connect

    Albright, K.L.; King, N.S.P.; Yates, G.J.; McDonald, T.E.; Turko, B.T.

    1993-08-01

    A versatile new camera, the Los Alamos National Laboratory (LANL) model GY6, is described. It operates at a wide variety of frame rates, from RS-170 to 700 frames per second. The camera operates as an NTSC compatible black and white camera when operating at RS-170 rates. When used for variable high-frame rates, a simple substitution is made of the RS-170 sync/clock generator circuit card with a high speed emitter-coupled logic (ECL) circuit card.

  14. Radiation damage of the PCO Pixelfly VGA CCD camera of the BES system on KSTAR tokamak

    NASA Astrophysics Data System (ADS)

    Náfrádi, Gábor; Kovácsik, Ákos; Pór, Gábor; Lampert, Máté; Un Nam, Yong; Zoletnik, Sándor

    2015-01-01

    A PCO Pixelfly VGA CCD camera, which is a part of the Beam Emission Spectroscopy (BES) diagnostic system of the Korea Superconducting Tokamak Advanced Research (KSTAR) device used for spatial calibrations, suffered serious radiation damage: white pixel defects were generated in it. The main goal of this work was to identify the origin of the radiation damage and to propose solutions to avoid it. A Monte Carlo N-Particle eXtended (MCNPX) model was built using the Monte Carlo Modeling Interface Program (MCAM), and calculations were carried out to predict the neutron and gamma-ray fields at the camera position. In addition to the MCNPX calculations, pure gamma-ray irradiations of the CCD camera were carried out in the Training Reactor of BME. Before, during, and after the irradiations, numerous frames were taken with the camera with 5 s exposure times. The evaluation of these frames showed that at the applied high gamma-ray dose (1.7 Gy) and dose-rate levels (up to 2 Gy/h), the number of white pixels did not increase. We found that the origin of the white pixel generation was neutron-induced thermal hopping of the electrons, which means that in the future only neutron shielding is necessary around the CCD camera. Another solution could be to replace the CCD camera with a more radiation-tolerant one, for example a suitable CMOS camera, or to apply both solutions simultaneously.
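    The frame evaluation described above amounts to simple white-pixel bookkeeping: count the pixels in each long-exposure dark frame that exceed a threshold, and compare counts across irradiation stages. A sketch with invented threshold and pixel values (the paper does not publish its counting criterion):

```python
import numpy as np

rng = np.random.default_rng(2)

def white_pixel_count(dark_frame, threshold=50.0):
    """Number of pixels in a dark frame reading above the (assumed) threshold."""
    return int((dark_frame > threshold).sum())

# Synthetic 5 s dark frames: a healthy frame, then the same frame with
# 200 hypothetical radiation-induced hot pixels added.
before = rng.normal(10.0, 3.0, (480, 640))
after = before.copy()
hot = rng.choice(before.size, 200, replace=False)
after.flat[hot] = 200.0

print(white_pixel_count(before), white_pixel_count(after))  # 0 200
```

Tracking this count before, during, and after exposure is what let the authors separate the gamma-ray contribution (no increase) from the neutron-induced damage.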

  15. Sensors for 3D Imaging: Metric Evaluation and Calibration of a CCD/CMOS Time-of-Flight Camera

    PubMed Central

    Chiabrando, Filiberto; Chiabrando, Roberto; Piatti, Dario; Rinaudo, Fulvio

    2009-01-01

    3D imaging with Time-of-Flight (ToF) cameras is a promising recent technique which allows 3D point clouds to be acquired at video frame rates. However, the distance measurements of these devices are often affected by systematic errors which decrease the quality of the acquired data. In order to evaluate these errors, experimental tests on a CCD/CMOS ToF camera sensor, the SwissRanger (SR)-4000 camera, were performed and are reported in this paper. Two main aspects are treated: first, the calibration of the distance measurements of the SR-4000 camera, covering the evaluation of the camera warm-up period, the distance measurement error, and the influence on distance measurements of the camera orientation with respect to the observed object; second, the photogrammetric calibration of the amplitude images delivered by the camera, using a purpose-built multi-resolution field of high-contrast targets. PMID:22303163

  16. Video inpainting under constrained camera motion.

    PubMed

    Patwardhan, Kedar A; Sapiro, Guillermo; Bertalmío, Marcelo

    2007-02-01

    A framework for inpainting missing parts of a video sequence recorded with a moving or stationary camera is presented in this work. The region to be inpainted is general: it may be still or moving, in the background or in the foreground, it may occlude one object and be occluded by some other object. The algorithm consists of a simple preprocessing stage and two steps of video inpainting. In the preprocessing stage, we roughly segment each frame into foreground and background. We use this segmentation to build three image mosaics that help to produce time consistent results and also improve the performance of the algorithm by reducing the search space. In the first video inpainting step, we reconstruct moving objects in the foreground that are "occluded" by the region to be inpainted. To this end, we fill the gap as much as possible by copying information from the moving foreground in other frames, using a priority-based scheme. In the second step, we inpaint the remaining hole with the background. To accomplish this, we first align the frames and directly copy when possible. The remaining pixels are filled in by extending spatial texture synthesis techniques to the spatiotemporal domain. The proposed framework has several advantages over state-of-the-art algorithms that deal with similar types of data and constraints. It permits some camera motion, is simple to implement, fast, does not require statistical models of background nor foreground, works well in the presence of rich and cluttered backgrounds, and the results show that there is no visible blurring or motion artifacts. A number of real examples taken with a consumer hand-held camera are shown supporting these findings. PMID:17269646

  17. EEV CCD39 wavefront sensor cameras for AO and interferometry

    NASA Astrophysics Data System (ADS)

    DuVarney, Raymond C.; Bleau, Charles A.; Motter, Garry T.; Shaklan, Stuart B.; Kuhnert, Andreas C.; Brack, Gary; Palmer, Dean; Troy, Mitchell; Kieu, Thangh; Dekany, Richard G.

    2000-07-01

    SciMeasure, in collaboration with Emory University and the Jet Propulsion Laboratory (JPL), has developed an extremely versatile CCD controller for use in adaptive optics, optical interferometry, and other applications requiring high-speed readout rates and/or low read noise. The overall architecture of this controller system is discussed and its performance using both EEV CCD39 and MIT/LL CCID-19 detectors is presented. Initially developed for adaptive optics applications, this controller is used in the Palomar Adaptive Optics program (PALAO), the AO system developed by JPL for the 200-inch Hale telescope at Palomar Mountain. An overview of the PALAO system is given and diffraction-limited science results are shown. Recently modified under NASA SBIR Phase II funding for use in the Space Interferometry Mission testbeds, this controller is currently in use on the Micro-Arcsecond Metrology testbed at JPL. Details of a new vacuum-compatible remote CCD enclosure and specialized readout sequence programming are also presented.

  18. Characterization of a CCD-camera-based system for measurement of the solar radial energy distribution

    NASA Astrophysics Data System (ADS)

    Gambardella, A.; Galleano, R.

    2011-10-01

    Charge-coupled device (CCD)-camera-based measurement systems offer the possibility to gather information on the solar radial energy distribution (sunshape). Sunshape measurements are very useful in designing high concentration photovoltaic systems and heliostats as they collect light only within a narrow field of view, the dimension of which has to be defined in the context of several different system design parameters. However, in this regard the CCD camera response needs to be adequately characterized. In this paper, uncertainty components for optical and other CCD-specific sources have been evaluated using indoor test procedures. We have considered CCD linearity and background noise, blooming, lens aberration, exposure time linearity and quantization error. Uncertainty calculation showed that a 0.94% (k = 2) combined expanded uncertainty on the solar radial energy distribution can be assumed.
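    As an illustration of how such uncertainty components might combine into the quoted expanded uncertainty (the component values below are hypothetical, not the paper's):

```python
import math

# Hypothetical standard-uncertainty components (percent), one per effect
# evaluated in the paper: CCD linearity and background noise, blooming,
# lens aberration, exposure-time linearity, quantization.
components = {
    "ccd_linearity_and_noise": 0.30,
    "blooming": 0.10,
    "lens_aberration": 0.25,
    "exposure_time_linearity": 0.15,
    "quantization": 0.05,
}

# Combine uncorrelated components in quadrature (root sum of squares),
# then expand with coverage factor k = 2 (~95% confidence).
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined
print(f"combined expanded uncertainty (k=2): {U_expanded:.2f}%")  # → 0.87%
```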

  19. Interline Transfer CCD Camera for Gated Broadband Coherent Anti-Stokes Raman-Scattering Measurements.

    PubMed

    Roy, S; Ray, G; Lucht, R P

    2001-11-20

    Use of an interline transfer CCD camera for the acquisition of broadband coherent anti-Stokes Raman-scattering (CARS) spectra is demonstrated. The interline transfer CCD has alternating columns of imaging and storage pixels that allow one to acquire two successive images by shifting the first image in the storage pixels and immediately acquiring the second image. We have used this dual-image mode for gated CARS measurements by acquiring a CARS spectral image and shifting it rapidly from the imaging pixel columns to the storage pixel columns. We have demonstrated the use of this dual-image mode for gated single-laser-shot measurement of hydrogen and nitrogen CARS spectra at room temperature and in atmospheric pressure flames. The performance of the interline transfer CCD for these CARS measurements is compared directly with the performance of a back-illuminated unintensified CCD camera. PMID:18364895

  20. The In-flight Spectroscopic Performance of the Swift XRT CCD Camera During 2006-2007

    NASA Technical Reports Server (NTRS)

    Godet, O.; Beardmore, A.P.; Abbey, A.F.; Osborne, J.P.; Page, K.L.; Evans, P.; Starling, R.; Wells, A.A.; Angelini, L.; Burrows, D.N.; Kennea, J.; Campana, S.; Chincarini, G.; Citterio, O.; Cusumano, G.; LaParola, V.; Mangano, V.; Mineo, T.; Giommi, P.; Perri, M.; Capalbi, M.; Tamburelli, F.

    2007-01-01

    The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the Oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since the launch and its effect on the CCD spectral resolution and the gain.

  1. Automated CCD camera characterization. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics: Student research reports

    SciTech Connect

    Silbermann, J.

    1999-03-01

    The OMEGA system uses CCD cameras for a broad range of applications. Over 100 video rate CCD cameras are used for such purposes as targeting, aligning, and monitoring areas such as the target chamber, laser bay, and viewing gallery. There are approximately 14 scientific grade CCD cameras on the system which are used to obtain precise photometric results from the laser beam as well as target diagnostics. It is very important that these scientific grade CCDs are properly characterized so that the results received from them can be evaluated appropriately. Currently characterization is a tedious process done by hand: the operator must operate the camera and light source simultaneously. Because more exposures mean more accurate information on the camera, the characterization tests can become very lengthy affairs; sometimes it takes an entire day to complete just a single plot. Characterization requires the testing of many aspects of the camera's operation, including the following: variance vs. mean signal level, which should be proportional due to Poisson statistics of the incident photon flux; linearity, the ability of the CCD to produce signals proportional to the light it receives; signal-to-noise ratio, the relative magnitude of the signal vs. the uncertainty in that signal; and dark current, the amount of noise due to thermal generation of electrons (cooling lowers this noise contribution significantly). These tests, as well as many others, must be conducted in order to properly understand a CCD camera. The goal of this project was to construct an apparatus that could characterize a camera automatically.
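    The variance-vs-mean (photon transfer) test described above can be sketched with synthetic data; the gain value, illumination levels, and sample counts here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate flat fields at increasing illumination: photon arrival is
# Poisson, so shot-noise variance in electrons equals the mean, and in
# digital numbers variance = mean / gain  (gain in e-/DN).
gain = 2.0  # hypothetical camera gain, e-/DN
means, variances = [], []
for electrons in [200, 500, 1000, 2000, 5000, 10000]:
    frame = rng.poisson(electrons, size=100_000) / gain  # signal in DN
    means.append(frame.mean())
    variances.append(frame.var())

# Photon-transfer fit: the slope of variance vs. mean signal is 1/gain,
# confirming the proportionality expected from Poisson statistics.
slope, _ = np.polyfit(means, variances, 1)
print(f"estimated gain: {1.0 / slope:.2f} e-/DN")
```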

  2. Development of CCD Cameras for Soft X-ray Imaging at the National Ignition Facility

    SciTech Connect

    Teruya, A. T.; Palmer, N. E.; Schneider, M. B.; Bell, P. M.; Sims, G.; Toerne, K.; Rodenburg, K.; Croft, M.; Haugh, M. J.; Charest, M. R.; Romano, E. D.; Jacoby, K. D.

    2013-09-01

    The Static X-Ray Imager (SXI) is a National Ignition Facility (NIF) diagnostic that uses a CCD camera to record time-integrated X-ray images of target features such as the laser entrance hole of hohlraums. SXI has two dedicated positioners on the NIF target chamber for viewing the target from above and below, and the X-ray energies of interest are 870 eV for the “soft” channel and 3 – 5 keV for the “hard” channels. The original cameras utilize a large format back-illuminated 2048 x 2048 CCD sensor with 24 micron pixels. Since the original sensor is no longer available, an effort was recently undertaken to build replacement cameras with suitable new sensors. Three of the new cameras use a commercially available front-illuminated CCD of similar size to the original, which has adequate sensitivity for the hard X-ray channels but not for the soft. For sensitivity below 1 keV, Lawrence Livermore National Laboratory (LLNL) had additional CCDs back-thinned and converted to back-illumination for use in the other two new cameras. In this paper we describe the characteristics of the new cameras and present performance data (quantum efficiency, flat field, and dynamic range) for the front- and back-illuminated cameras, with comparisons to the original cameras.

  3. Wilbur: A low-cost CCD camera system for MDM Observatory

    NASA Technical Reports Server (NTRS)

    Metzger, M. R.; Luppino, G. A.; Tonry, J. L.

    1992-01-01

    The recent availability of several 'off-the-shelf' components, particularly CCD control electronics from SDSU, has made it possible to put together a flexible CCD camera system at relatively low cost and effort. The authors describe Wilbur, a complete CCD camera system constructed for the Michigan-Dartmouth-MIT Observatory. The hardware consists of a Loral 2048 x 2048 CCD controlled by the SDSU electronics, an existing dewar design modified for use at MDM, a Sun Sparcstation 2 with a commercial high-speed parallel controller, and a simple custom interface between the controller and the SDSU electronics. The camera is controlled from the Sparcstation by software that provides low-level I/O in real time, collection of additional information from the telescope, and a simple command interface for use by an observer. Readout of the 2048 x 2048 array is complete in under two minutes at 5 e- read noise, and readout time can be decreased at the cost of increased noise. The system can be easily expanded to handle multiple CCD's/multiple readouts, and can control other dewars/CCD's using the same host software.

  4. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure the azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral density (ND) filters in front of the theodolite's telescope. One could then safely view and measure the laser's boresight looking through the theodolite's telescope without great risk to one's eyes. This method for a Class II visible-wavelength laser is not acceptable even to attempt for a Class IV laser, and is not applicable for an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available. It is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera are power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. Again, an additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.

  5. Sports video categorizing method using camera motion parameters

    NASA Astrophysics Data System (ADS)

    Takagi, Shinichi; Hattori, Shinobu; Yokoyama, Kazumasa; Kodate, Akihisa; Tominaga, Hideyoshi

    2003-06-01

    In this paper, we propose a content-based video categorizing method for broadcast sports videos using camera motion parameters. We define and introduce two new features in the proposed method: "camera motion extraction ratio" and "camera motion transition". Camera motion parameters in the video sequence carry very significant information for categorization of broadcast sports video, because in most sports video, camera motions are closely related to the actions taken in the sport, which largely follow conventions that depend on the type of sport. Based on these characteristics, we design a sports video categorization algorithm for identifying six major sports types. In our algorithm, the features automatically extracted from videos are analysed statistically. The experimental results show a clear tendency and demonstrate the applicability of the proposed method for sports genre identification.

  6. Optical synthesizer for a large quadrant-array CCD camera: Center director's discretionary fund

    NASA Technical Reports Server (NTRS)

    Hagyard, Mona J.

    1992-01-01

    The objective of this program was to design and develop an optical device, an optical synthesizer, that focuses four contiguous quadrants of a solar image on four spatially separated CCD arrays that are part of a unique CCD camera system. This camera and the optical synthesizer will be part of the new NASA-Marshall Experimental Vector Magnetograph, an instrument developed to measure the Sun's magnetic field as accurately as present technology allows. The tasks undertaken in the program are outlined and the final detailed optical design is presented.

  7. Video-Based Point Cloud Generation Using Multiple Action Cameras

    NASA Astrophysics Data System (ADS)

    Teo, T.

    2015-05-01

    Due to the development of action cameras, the use of video technology for collecting geo-spatial data has become an important trend. The objective of this study is to compare the image mode and video mode of multiple action cameras for 3D point cloud generation. Frame images are acquired from discrete camera stations while videos are taken along continuous trajectories. The proposed method includes five major parts: (1) camera calibration, (2) video conversion and alignment, (3) orientation modelling, (4) dense matching, and (5) evaluation. As action cameras usually have a large FOV in wide viewing mode, camera calibration plays an important role in correcting the effect of lens distortion before image matching. Once the cameras had been calibrated, the author used them to take video in an indoor environment. The videos were then converted into multiple frame images based on the frame rates. In order to overcome time-synchronization issues between videos from different viewpoints, an additional timer app is used to determine the time-shift factor between cameras for time alignment. A structure-from-motion (SfM) technique is utilized to obtain the image orientations. Then, the semi-global matching (SGM) algorithm is adopted to obtain dense 3D point clouds. The preliminary results indicated that the 3D points from 4K video are similar to those from 12MP images, but the data acquisition performance of 4K video is more efficient than that of 12MP digital images.
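    The time-alignment step between cameras might look like the following sketch; the frame rates and the measured shift are hypothetical values, and the pairing function is illustrative rather than the study's exact procedure:

```python
# Given per-camera frame rates and a time-shift factor measured with a
# timer app (hypothetical values), map each frame index of camera B to
# the temporally nearest frame of camera A.
def nearest_frame_pairs(fps_a, fps_b, n_frames_b, shift_s):
    """For every frame of camera B, return the index of the camera-A frame
    closest in time, after removing the measured shift between the clocks."""
    pairs = []
    for j in range(n_frames_b):
        t = j / fps_b + shift_s          # frame-B timestamp on A's clock
        i = round(t * fps_a)             # nearest frame index of camera A
        pairs.append((i, j))
    return pairs

# 30 fps vs 25 fps, camera B started 0.2 s after camera A.
pairs = nearest_frame_pairs(30.0, 25.0, 5, 0.2)
print(pairs)  # → [(6, 0), (7, 1), (8, 2), (10, 3), (11, 4)]
```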

  8. Research on detecting heterogeneous fibre from cotton based on linear CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Xian-bin; Cao, Bing; Zhang, Xin-peng; Shi, Wei

    2009-07-01

    Heterogeneous fibre in cotton has a great impact on the production of cotton textiles: it degrades product quality and thereby affects a corporation's economic benefits and market competitiveness. The detection and elimination of heterogeneous fibre is therefore particularly important for improving cotton processing, raising the quality of cotton textiles, and reducing production cost, and the technology has favorable market value and development prospects. Optical detecting systems have obtained widespread application. In this system, we use a linear CCD camera to scan the running cotton; the video signals are then fed into a computer and processed according to differences in grayscale. If there is heterogeneous fibre in the cotton, the computer sends an order to drive the gas nozzle to eliminate it. In this paper, we adopt a monochrome LED array as the new detecting light source; its flicker, luminous-intensity stability, lumen depreciation, and useful life are all superior to fluorescent light. We first analyse the reflection spectra of cotton and various heterogeneous fibres and then select an appropriate frequency for the light source, finally adopting a violet LED array as the new detecting light source. The whole hardware structure and software design are introduced in this paper.
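    The grayscale-difference processing could be sketched as follows; the threshold and signal levels are hypothetical, and this stands in for whatever exact decision rule the system uses:

```python
import numpy as np

# The line-scan image of clean cotton is bright and fairly uniform;
# heterogeneous fibre shows up as pixels that deviate strongly from the
# background grayscale level.
def detect_foreign_fibre(scan_line, threshold=40):
    """Return indices of pixels whose grayscale differs from the median
    background by more than `threshold` (hypothetical 8-bit levels)."""
    background = np.median(scan_line)
    return np.flatnonzero(np.abs(scan_line.astype(int) - background) > threshold)

# Simulated line from the linear CCD: cotton near level 200, one dark fibre.
line = np.full(1024, 200, dtype=np.uint8)
line[500:504] = 90                      # dark foreign fibre
hits = detect_foreign_fibre(line)
print(hits)  # pixels that would trigger the ejection nozzle → [500 501 502 503]
```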

  9. Auto-measurement system of aerial camera lens' resolution based on orthogonal linear CCD

    NASA Astrophysics Data System (ADS)

    Zhao, Yu-liang; Zhang, Yu-ye; Ding, Hong-yi

    2010-10-01

    The resolution of an aerial camera lens is one of the camera's most important performance indexes, and the measurement and calibration of resolution are important test items in camera maintenance. The traditional method, observing the resolution panel of a collimator by eye through a reading microscope and doing some computing, is of low efficiency and susceptible to human factors, and its measurement results are unstable. An auto-measurement system for aerial camera lens resolution, which uses an orthogonal linear CCD sensor as the detector in place of the reading microscope, is introduced. The system measures automatically and shows results in real time. In order to measure the smallest identifiable diameter on the resolution panel, two orthogonal linear CCDs are laid on the imaging plane of the measured lens, forming four intersection points on the orthogonal linear CCDs. A coordinate system is determined by the origin point of the linear CCD, and a circle is determined by the four intersection points. To obtain the circle's radius, the image of the resolution panel is first transformed into the pulse width of an electric signal, which is sent to the computer through an amplifying circuit, threshold comparator, and counter. Secondly, the smallest circle is extracted for measurement; the circle extraction makes use of the wavelet transform, which is localized in both time and frequency and is capable of multi-scale analysis. Lastly, according to the solution formula for lens resolution, we obtain the resolution of the measured lens. The measuring precision in practical measurement is analyzed, and the result indicates that precision is improved when using the linear CCD instead of the reading microscope. Moreover, the system error is determined by the CCD's pixel size; as CCD technology develops and pixels become smaller, the system error will be reduced greatly as well. So the auto-measuring system has high practical value and wide application prospects.

  10. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    NASA Astrophysics Data System (ADS)

    Oh, S. H.; Kang, Y. W.; Byun, Y. I.

    2007-12-01

    We present software which we developed for the multi-purpose CCD camera. This software can be used with all three types of CCD made by KODAK Co.: KAF-0401E (768×512), KAF-1602E (1536×1024), and KAF-3200E (2184×1472). For efficient CCD camera control, the software is operated as two independent processes: the CCD control program and the temperature/shutter operation program. The software is designed for fully automatic as well as manual operation under the LINUX system, and is controlled by the LINUX user-signal procedure. We plan to use this software for an all-sky survey system and also for night sky monitoring and sky observation. The read-out times of the CCDs are about 15 sec, 64 sec, and 134 sec for the KAF-0401E, KAF-1602E, and KAF-3200E respectively, because these times are limited by the data transmission speed of the parallel port. Larger format CCDs require a higher data transmission speed, so we are considering changing this control software to use the USB port for high-speed data transmission.

  11. A CCD Camera with Electron Decelerator for Intermediate Voltage Electron Microscopy

    SciTech Connect

    Downing, Kenneth H; Downing, Kenneth H.; Mooney, Paul E.

    2008-03-17

    Electron microscopists are increasingly turning to Intermediate Voltage Electron Microscopes (IVEMs) operating at 300 - 400 kV for a wide range of studies. They are also increasingly taking advantage of slow-scan charge coupled device (CCD) cameras, which have become widely used on electron microscopes. Under some conditions CCDs provide an improvement in data quality over photographic film, as well as the many advantages of direct digital readout. However, CCD performance is seriously degraded on IVEMs compared to the more conventional 100 kV microscopes. In order to increase the efficiency and quality of data recording on IVEMs, we have developed a CCD camera system in which the electrons are decelerated to below 100 kV before impacting the camera, resulting in greatly improved performance in both signal quality and resolution compared to other CCDs used in electron microscopy. These improvements will allow high-quality image and diffraction data to be collected directly with the CCD, enabling improvements in data collection for applications including high-resolution electron crystallography, single-particle reconstruction of protein structures, tomographic studies of cell ultrastructure and remote microscope operation. This approach will enable us to use even larger format CCD chips that are being developed with smaller pixels.

  12. Pixel correspondence calibration method of a 2CCD camera based on absolute phase calculation

    NASA Astrophysics Data System (ADS)

    Zhang, Zonghua; Zheng, Guoquan; Huang, Shujun

    2014-11-01

    This paper presents a novel calibration method to build up pixel correspondence between the IR CCD sensor and the visible CCD sensor of a 2CCD camera by using absolute phase calculation. Vertical and horizontal sinusoidal fringe patterns are projected onto a white plate surface through the visible and infrared (IR) channels of a DLP projector. The visible and IR fringe patterns are captured by the IR sensor and visible sensor respectively. The absolute phase of each pixel in the IR and visible channels is calculated by using the optimum three-fringe number selection method. The precise pixel relationship between the two channels can then be determined from the obtained absolute phase data. Experimental results show the effectiveness and validity of the proposed 2CCD calibration method. Because it uses continuous phase information, this method can accurately give pixel-to-pixel correspondence.
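    The wrapped-phase building block behind such absolute-phase methods can be sketched on synthetic data; this is a generic four-step phase-shifting example, not the paper's optimum three-fringe-number algorithm:

```python
import numpy as np

# Four patterns I_k = A + B*cos(phi + k*pi/2): the wrapped phase of each
# pixel is recovered by an arctangent of pattern differences.
def wrapped_phase(i0, i1, i2, i3):
    # i3 - i1 = 2B*sin(phi), i0 - i2 = 2B*cos(phi)
    return np.arctan2(i3 - i1, i0 - i2)  # phase in (-pi, pi]

# Simulate one row of a sinusoidal fringe pattern (hypothetical A, B).
phi_true = np.linspace(-np.pi + 0.01, np.pi - 0.01, 256)
A, B = 128.0, 100.0
frames = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi = wrapped_phase(*frames)
print(float(np.max(np.abs(phi - phi_true))))  # ≈ 0 (exact up to rounding)
```

Absolute-phase methods then unwrap this result across several fringe frequencies to remove the 2π ambiguity.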

  13. Color video camera capable of 1,000,000 fps with triple ultrahigh-speed image sensors

    NASA Astrophysics Data System (ADS)

    Maruyama, Hirotaka; Ohtake, Hiroshi; Hayashida, Tetsuya; Yamada, Masato; Kitamura, Kazuya; Arai, Toshiki; Tanioka, Kenkichi; Etoh, Takeharu G.; Namiki, Jun; Yoshida, Tetsuo; Maruno, Hiromasa; Kondo, Yasushi; Ozaki, Takao; Kanayama, Shigehiro

    2005-03-01

    We developed an ultrahigh-speed, high-sensitivity, color camera that captures moving images of phenomena too fast to be perceived by the human eye. The camera operates well even under restricted lighting conditions. It incorporates a special CCD device that is capable of ultrahigh-speed shots while retaining its high sensitivity. Its ultrahigh-speed shooting capability is made possible by directly connecting CCD storages, which record video images, to photodiodes of individual pixels. Its large photodiode area together with the low-noise characteristic of the CCD contributes to its high sensitivity. The camera can clearly capture events even under poor light conditions, such as during a baseball game at night. Our camera can record the very moment the bat hits the ball.

  14. Time-Resolved Spectra of Dense Plasma Focus Using Spectrometer, Streak Camera, CCD Combination

    SciTech Connect

    F. J. Goldin, B. T. Meehan, E. C. Hagen, P. R. Wilkins

    2010-10-01

    A time-resolving spectrographic instrument has been assembled with the primary components of a spectrometer, image-converting streak camera, and CCD recording camera, for the primary purpose of diagnosing highly dynamic plasmas. A collection lens defines the sampled region and couples light from the plasma into a step index, multimode fiber which leads to the spectrometer. The output spectrum is focused onto the photocathode of the streak camera, the output of which is proximity-coupled to the CCD. The spectrometer configuration is essentially Czerny–Turner, but off-the-shelf Nikon refraction lenses, rather than mirrors, are used for practicality and flexibility. Only recently assembled, the instrument requires significant refinement, but has now taken data on both bridge wire and dense plasma focus experiments.

  16. A Simple Approach of CCD Camera Calibration for Optical Diagnostics Instrumentation

    NASA Technical Reports Server (NTRS)

    Cha, Soyoung Stephen; Leslie, Fred W.; Ramachandran, Narayanan; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    Solid-state array sensors are ubiquitous nowadays for obtaining gross field images in numerous scientific and engineering applications including optical diagnostics and instrumentation. Linear responses of these sensors are often required, as in interferometry, light scattering and attenuation measurements, and photometry. In most applications, the linearity is usually taken for granted without thorough quantitative assessment or correction through calibration. Upper-grade CCD cameras of high price may offer better linearity; however, they also require linearity checking and correction if necessary. Intermediate- or low-grade CCD cameras are more likely to need calibration for linearity. Here, we present two very simple approaches: one for quickly checking camera linearity without any additional setup and one for precisely correcting nonlinear sensor responses. It is believed that after calibration, sensors of intermediate or low grade can function as effectively as their expensive counterparts.
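    The correction idea might be sketched as follows, assuming a hypothetical quadratic sensor response; this illustrates inverse-mapping linearization in general, not the paper's exact procedure:

```python
import numpy as np

# Expose the sensor at known relative light doses, fit the (slightly
# nonlinear) response, then invert the fit so the corrected output is
# proportional to the dose. All values are hypothetical.
dose = np.linspace(0.05, 1.0, 12)                 # relative exposure
response = 4000 * dose - 600 * dose ** 2          # simulated nonlinear DN

# Model the response as a polynomial of dose, then build the inverse
# mapping by evaluating the fit on a dense grid and interpolating
# dose(response); the fitted curve is monotonic over this range.
coeffs = np.polyfit(dose, response, 2)
grid = np.linspace(0.0, 1.0, 2001)
corrected = np.interp(response, np.polyval(coeffs, grid), grid)

# After correction the output tracks the true dose to high accuracy.
print(float(np.max(np.abs(corrected - dose))))  # ≈ 0
```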

  18. The University of Hawaii Institute for Astronomy CCD camera control system

    NASA Technical Reports Server (NTRS)

    Jim, K. T. C.; Yamada, H. T.; Luppino, G. A.; Hlivak, R. J.

    1992-01-01

    The University of Hawaii Institute for Astronomy CCD Camera Control System consists of a NeXT workstation, a graphical user interface, and a fiber optics communications interface which is connected to a San Diego State University CCD controller. The UH system employs the NeXT-resident Motorola DSP 56001 as a real time hardware controller. The DSP 56001 is interfaced to the Mach-based UNIX of the NeXT workstation by DMA and multithreading. Since the SDSU controller also uses the DSP 56001, the NeXT is used as a development platform for the embedded control software. The fiber optic interface links the two DSP 56001's through their Synchronous Serial Interfaces. The user interface is based on the NeXTStep windowing system. It is easy to use and features real-time display of image data and control over all camera functions. Both Loral and Tektronix 2048 x 2048 CCD's have been driven at full readout speeds, and the system is intended to be capable of simultaneous readout of four such CCD's. The total hardware package is compact enough to be quite portable and has been used on five different telescopes on Mauna Kea. The complete CCD control system can be assembled for a very low cost. The hardware and software of the control system has proven to be quite reliable, well adapted to the needs of astronomers, and extensible to increasingly complicated control requirements.

  19. Applications of visible CCD cameras on the Alcator C-Mod tokamak

    NASA Astrophysics Data System (ADS)

    Boswell, C. J.; Terry, J. L.; Lipschultz, B.; Stillerman, J.

    2001-01-01

    Five 7 mm diameter remote-head visible charge-coupled device (CCD) cameras are being used on Alcator C-Mod for several different diagnostic purposes. All of the cameras' detectors and optics are placed inside a magnetic field of up to 4 T. Images from the cameras are recorded simultaneously using two three-channel color framegrabber cards. Two CCD cameras are typically used to generate two-dimensional emissivity profiles of deuterium line radiation from the divertor. Interference filters are used to select the spectral line to be measured. The local emissivity is obtained by inverting the measured brightnesses assuming toroidal symmetry of the emission. Another use of the cameras is the identification and localization of impurity sources generated by the ion cyclotron radio frequency (ICRF) antennas, which supply the auxiliary heating on Alcator C-Mod. The impurities generated by the antennas are identified by correlating in time the injections seen by the cameras with measurements made with core diagnostics. Fibers whose views are aligned with the camera views and whose outputs are coupled to a visible spectrometer are also used to identify the species of the injected impurities.

  20. Multi-spectral CCD camera system for ocean water color and seacoast observation

    NASA Astrophysics Data System (ADS)

    Zhu, Min; Chen, Shiping; Wu, Yanlin; Huang, Qiaolin; Jin, Weiqi

    2001-10-01

    One of the earth observing instruments on HY-1 Satellite which will be launched in 2001, the multi-spectral CCD camera system, is developed by Beijing Institute of Space Mechanics & Electricity (BISME), Chinese Academy of Space Technology (CAST). In 798 km orbit, the system can provide images with 250 m ground resolution and a swath of 500 km. It is mainly used for coast zone dynamic mapping and oceanic watercolor monitoring, which include the pollution of offshore and coast zone, plant cover, watercolor, ice, terrain underwater, suspended sediment, mudflat, soil and vapor gross. The multi-spectral camera system is composed of four monocolor CCD cameras, which are line array-based, 'push-broom' scanning cameras, each responding to one of four spectral bands. The camera system adopts view-field registration; that is, each camera scans the same region at the same moment. Each of them contains optics, focal plane assembly, electrical circuit, installation structure, calibration system, thermal control and so on. The primary features of the camera system are: (1) Offset of the central wavelength is better than 5 nm; (2) Degree of polarization is less than 0.5%; (3) Signal-to-noise ratio is about 1000; (4) Dynamic range is better than 2000:1; (5) Registration precision is better than 0.3 pixel; (6) Quantization value is 12 bit.

  1. Demonstrations of Optical Spectra with a Video Camera

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2012-01-01

    The use of a video camera may markedly improve demonstrations of optical spectra. First, the output electrical signal from the camera, which provides full information about a picture to be transmitted, can be used for observing the radiant power spectrum on the screen of a common oscilloscope. Second, increasing the magnification by the camera

  2. Calibration of CCD-Cameras for Machine Vision and Robotics

    NASA Astrophysics Data System (ADS)

    Beyer, Horst A.

    1990-02-01

    The basic mathematical formulation of a general solution to the extraction of three-dimensional information from images and camera calibration is presented. Standard photogrammetric algorithms for the least-squares estimation of the relevant parameters are outlined, together with terms and principal aspects of calibration and quality assessment. A second-generation prototype system for "Real-Time Photogrammetry," developed as part of the "Digital Photogrammetric Station" of the Institute of Geodesy and Photogrammetry of ETH-Zurich, is described. Two calibration tests with three-dimensional testfields and independently determined reference coordinates for quality assessment are presented. In a laboratory calibration with off-the-shelf equipment, an accuracy of 1/20th and 1/50th of the pixel spacing in the row and column directions, respectively, has been achieved. Problems with the hardware used in the test are outlined. The calibration of the vision system of a ping-pong-playing high-speed robot led to an improvement in the accuracy of object coordinates by a factor of over 8. The vision system tracks table-tennis balls at a 50 Hz rate.
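
    The least-squares estimation outlined above can be illustrated with a minimal Direct Linear Transform (DLT) sketch, a standard photogrammetric formulation (not necessarily the exact algorithm used in the paper); the function names and the synthetic testfield below are ours:

```python
import numpy as np

# Hypothetical sketch: estimate a 3x4 pinhole projection matrix from known
# 3-D testfield points and their observed 2-D image coordinates via the
# Direct Linear Transform, a linear least-squares calibration.

def dlt_projection_matrix(points_3d, points_2d):
    """Solve P (3x4) such that [u, v, 1]^T ~ P [X, Y, Z, 1]^T in least squares."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.array(rows, float)
    # Homogeneous least-squares solution: right singular vector for the
    # smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Project a 3-D point with matrix P and dehomogenize."""
    X = np.append(np.asarray(point_3d, float), 1.0)
    u, v, w = P @ X
    return u / w, v / w
```

    With exact synthetic correspondences the recovered matrix reproduces the true projection to numerical precision; real calibrations add lens-distortion terms on top of this linear core.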

  3. Design and realization of an AEC&AGC system for the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Haiying; Feng, Bing; Wang, Peng; Li, Yan; Wei, Haoyun

    2015-08-01

    An AEC and AGC (Automatic Exposure Control and Automatic Gain Control) system was designed for a CCD aerial camera with a fixed aperture and an electronic shutter. Conventional AEC and AGC algorithms are not suitable for an aerial camera, since the camera always takes high-resolution photographs while moving at high speed. The AEC and AGC system adjusts the electronic shutter and camera gain automatically according to the target brightness and the moving speed of the aircraft. An automatic gamma correction is applied before the image is output, so that the image is better suited to viewing and analysis by human eyes. The AEC and AGC system avoids underexposure, overexposure, and image blurring caused by fast motion or environmental vibration. A series of tests proved that the system meets the requirements of the camera system, with fast adjustment speed, high adaptability, and high reliability in severe, complex environments.
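
    A toy sketch of what one step of such exposure/gain control might look like. The flight algorithm is not given in the abstract; the motion-blur limit, parameter names, and gain bounds below are all assumptions:

```python
import numpy as np

# Illustrative sketch (not the authors' implementation): drive the mean image
# level toward a target by first lengthening the electronic shutter, capping
# exposure so ground motion during the exposure stays under one ground-sample
# pixel, then making up any remaining deficit with analog gain.

def aec_agc_step(mean_level, target_level, exposure_s, gain,
                 ground_speed_mps, gsd_m, max_gain=8.0):
    """Return (new_exposure_s, new_gain) driving mean_level toward target_level."""
    correction = target_level / max(mean_level, 1e-6)
    # Longest exposure keeping motion blur below one ground-sample distance.
    blur_limit_s = gsd_m / ground_speed_mps
    new_exposure = min(exposure_s * correction, blur_limit_s)
    # Whatever correction the shutter could not provide goes to gain.
    residual = correction * exposure_s / new_exposure
    new_gain = float(np.clip(gain * residual, 1.0, max_gain))
    return new_exposure, new_gain

def gamma_lut(gamma=0.45, levels=256):
    """8-bit gamma-correction lookup table for display output."""
    x = np.arange(levels) / (levels - 1)
    return np.round(255 * x ** gamma).astype(np.uint8)
```

    For example, at 200 m/s ground speed and 0.5 m ground-sample distance the blur limit is 2.5 ms, so a scene needing a 4 ms exposure gets 2.5 ms of shutter and the remainder as gain.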

  4. Initial laboratory evaluation of color video cameras: Phase 2

    SciTech Connect

    Terry, P.L.

    1993-07-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

  5. CCD camera and data acquisition system of the scientific instrument ELMER for the GTC 10-m telescope

    NASA Astrophysics Data System (ADS)

    Kohley, Ralf; Martin-Fleitas, Juan Manuel; Cavaller-Marques, Lluis; Hammersley, Peter L.; Suarez-Valles, Marcos; Vilela, Rafael; Beigbeder, Francis

    2004-09-01

    ELMER is a multi-purpose instrument for the GTC designed for both imaging and spectroscopy in the visible range. The CCD camera employs an e2v technologies CCD44-82 detector mounted in a high-performance LN2 bath cryostat based on an ESO design, and an SDSU-II CCD controller with a parallel interface. The design, including the low-noise fan-out electronics, has been kept flexible to allow the alternative use of MIT/LL CCID-20 detectors. We present the design of the CCD camera and data acquisition system and the first performance test results.

  6. Measurement of nanosecond time-resolved fluorescence with a directly gated interline CCD camera.

    PubMed

    Mitchell, A C; Wall, J E; Murray, J G; Morgan, C G

    2002-06-01

    CCD cameras coupled optically to gated image intensifiers have been used for fast time-resolved measurements for some years. Image intensifiers have disadvantages, however, and for some applications it would be better if the image sensor could be gated directly at high speed. Control of the 'charge drain' function on an interline-transfer CCD allows the sensor to be switched rapidly from an insensitive state. The temporal and spatial properties of the charge drain are explored in the present paper and it is shown that nanosecond time resolution with acceptable spatial uniformity can be achieved for a small commercial sensor. A fluorescence lifetime imaging system is demonstrated, based on a repetitively pulsed laser excitation source synchronized to the CCD control circuitry via a programmable delay unit. PMID:12067368
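
    As an illustration of the lifetime measurement such a gated sensor enables (this is not the authors' code), the classic rapid lifetime determination (RLD) formula estimates a mono-exponential lifetime per pixel from just two gated images:

```python
import numpy as np

# Illustrative of the lifetime-imaging principle: two images acquired at
# different gate delays after the excitation pulse give a per-pixel
# mono-exponential lifetime via tau = (t2 - t1) / ln(I1 / I2).

def rld_lifetime(img_t1, img_t2, delay_t1, delay_t2):
    """Per-pixel mono-exponential lifetime from two gated intensity images."""
    return (delay_t2 - delay_t1) / np.log(img_t1 / img_t2)
```

    A full system would also background-subtract each gated image and mask pixels with too few counts before applying the formula.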

  7. Grayscale adjustment method for CCD mosaic camera in surface defect detection system

    NASA Astrophysics Data System (ADS)

    Yan, Lu; Yang, Yongying; Wang, Xiaodan; Wang, Shitong; Cao, Pin; Li, Lu; Liu, Dong

    2014-09-01

    Based on microscopic imaging and sub-aperture stitching technology, the surface defect detection system realizes automatic quantitative detection of submicron defects on the macroscopic surface of optical components, and solves quality control problems for the numerous large-aperture precision optical components in the ICF (Inertial Confinement Fusion) system. In order to improve testing efficiency and reduce the number of sub-aperture images, a large-format CCD (charge-coupled device) camera is employed to expand the field of view of the system. Large-format CCD cameras are usually mosaicked from multi-channel CCD chips, but differences among the intensity-grayscale functions of the channels lead to an obvious gray gap among different regions of the image. This gap can cause shortening and fracture of defects during image binarization, and thereby lead to misjudgment of defects. This paper analyzes the gray characteristics of unbalanced images, establishes a gray matrix model of image pixels, and proposes a new method to correct the gray gap of the CCD self-adaptively. Firstly, the background threshold is set by solving the inflection point of the pixel-level curve in the gray histogram of the original image, and the background of the image is obtained; secondly, pixels are sampled from the background and used to calculate the gray gap among different regions of the image; ultimately, the gray gap is compensated. With this method, an experiment was carried out to adjust 96 dual-channel images from testing a fused silica sample with aperture 180 mm × 120 mm. The results show that the gray gap between the images on different channels is reduced from 3.64 to 0.70 gray levels on average. The method can also be applied to other CCD mosaic cameras.
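
    The three-step correction described above can be sketched under simplifying assumptions: a fixed background threshold stands in for the histogram inflection point, and the channel gap is treated as purely additive. All names are ours:

```python
import numpy as np

# Simplified sketch of the idea: estimate the dark background level of each
# CCD channel, then offset one half of the mosaicked image so both halves
# share a common background gray level.

def channel_background(channel, background_threshold):
    """Mean gray level of pixels at or below the background threshold."""
    bg = channel[channel <= background_threshold]
    return float(bg.mean())

def balance_two_channel(image, background_threshold=50):
    """Compensate the left/right gray gap of a dual-channel mosaic image."""
    h, w = image.shape
    left, right = image[:, :w // 2], image[:, w // 2:]
    gap = channel_background(left, background_threshold) - \
          channel_background(right, background_threshold)
    balanced = image.astype(float).copy()
    balanced[:, w // 2:] += gap  # lift the darker channel's background
    return balanced, gap
```

    Sampling only sub-threshold pixels keeps bright defect pixels from biasing the background estimate, which is the point of thresholding before computing the gap.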

  8. The L3Vision CCD220 with its OCam test camera for AO applications in Europe

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Stadler, Eric; Magnard, Yves; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Skegg, Michael; Pool, Peter; Bell, Ray; Burt, David; Reyes, Javier; Meyer, Manfred; Hubin, Norbert; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz Garcia, Jose Javier

    2008-07-01

    ESO and JRA2 OPTICON have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics Wave Front Sensor (AO WFS) applications. The device, called CCD220, is a compact Peltier-cooled 240×240-pixel frame-transfer 8-output back-illuminated sensor. Using the electron-multiplying technology of L3Vision detectors, the device is designed to achieve sub-electron read noise at frame rates from 25 Hz to 1,500 Hz and dark current lower than 0.01 e-/pixel/frame. The development has many unique features. To obtain high frame rates, multiple EMCCD gain registers and metal buttressing of row clock lines are used. The baseline device is built in standard silicon. In addition, two speculative variants have been built: deep-depletion silicon devices to improve red response, and devices with an electronic shutter to extend use to Rayleigh and Pulsed Laser Guide Star applications. These are all firsts for L3Vision CCDs. These CCD220 detectors have now been fabricated by e2v technologies. This paper describes the design of the device, technology trade-offs, and progress to date. A test camera, called "OCam", has been specially designed and built for these sensors. The main features of the OCam camera are extensively described in this paper, together with first-light images obtained with the CCD220.

  9. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  11. Deflection Measurements of a Thermally Simulated Nuclear Core Using a High-Resolution CCD-Camera

    NASA Technical Reports Server (NTRS)

    Stanojev, B. J.; Houts, M.

    2004-01-01

    Space fission systems under consideration for near-term missions all use compact, fast-spectrum reactor cores. Reactor dimensional change with increasing temperature, which affects neutron leakage, is the dominant source of reactivity feedback in these systems. Accurately measuring core dimensional changes during realistic non-nuclear testing is therefore necessary for predicting the system's nuclear-equivalent behavior. This paper discusses one key technique being evaluated for measuring such changes. The proposed technique uses a Charge-Coupled Device (CCD) sensor to obtain deformation readings of an electrically heated, prototypic reactor core geometry. This paper introduces a technique by which a single high-spatial-resolution CCD camera is used to measure core deformation in Real Time (RT). Initial system checkout results are presented, along with a discussion of how additional cameras could be used to achieve a three-dimensional deformation profile of the core during testing.

  12. White light single-shot interferometry with colour CCD camera for optical inspection of microsystems

    NASA Astrophysics Data System (ADS)

    Upputuri, Paul Kumar; Pramanik, Manojit; Nandigana, Krishna Mohan; Kothiyal, Mahendra Prasad

    2015-07-01

    White light interferometry is a well-established optical tool for surface metrology of reflective samples. In this work, we discuss a single-shot white light interferometer based on a single-chip colour CCD camera and Hilbert transformation. The approach makes the measurement dynamic, faster, easier and cost-effective for industrial applications. We acquire only one white light interferogram with the colour CCD camera and then decompose it into its individual colour components in software. We present a simple Hilbert transformation approach to remove the non-uniform bias associated with the interference signal, and the phases at the individual wavelengths are calculated using the Hilbert transformation. The use of the Hilbert transformation introduces a phase error that depends on the number of fringe cycles; we discuss these errors. Experimental results on reflective micro-scale samples for surface profiling are presented.
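
    The core idea of recovering fringe phase from a single interferogram with the Hilbert transform can be sketched as follows. Plain mean subtraction stands in here for the paper's non-uniform bias-removal scheme, and the FFT-based analytic signal is the standard construction (equivalent to SciPy's `hilbert`):

```python
import numpy as np

# Sketch: recover the phase of a real fringe signal via its analytic signal.

def analytic_signal(x):
    """Analytic signal via a one-sided spectrum (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0  # Nyquist bin kept once for even-length signals
    return np.fft.ifft(X * h)

def fringe_phase(signal):
    """Unwrapped phase of a real fringe signal."""
    ac = signal - np.mean(signal)  # crude removal of a uniform bias
    return np.unwrap(np.angle(analytic_signal(ac)))
```

    For a clean cosine fringe the recovered phase slope equals the fringe frequency; the edge and bias errors the abstract mentions appear when the bias is non-uniform or the record holds few fringe cycles.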

  13. The development of a high-speed 100 fps CCD camera

    SciTech Connect

    Hoffberg, M.; Laird, R.; Lenkzsus, F.; Liu, Chuande; Rodricks, B.; Gelbart, A.

    1996-09-01

    This paper describes the development of a high-speed CCD digital camera system. The system has been designed to use CCDs from various manufacturers with minimal modifications. The first camera built on this design utilizes a Thomson 512x512-pixel CCD as its sensor, which is read out from two parallel outputs at a speed of 15 MHz/pixel/output. The data undergo correlated double sampling, after which they are digitized to 12 bits. The throughput of the system translates into 60 MB/second, which is either stored directly in a PC or transferred to a custom-designed VXI module. The PC data acquisition version of the camera can collect sustained data in real time, limited only by the memory installed in the PC. The VXI version of the camera, also controlled by a PC, stores 512 MB of real-time data before it must be read out to PC disk storage. The uncooled CCD can be used either with lenses for visible-light imaging or with a phosphor screen for x-ray imaging. This camera has been tested with a phosphor screen coupled to a fiber-optic face plate for high-resolution, high-speed x-ray imaging. The camera is controlled through a custom event-driven, user-friendly Windows package. The pixel clock speed can be changed from 1 MHz to 15 MHz. The noise was measured to be 1.05 bits at a 13.3 MHz pixel clock. This paper describes the electronics, software, and characterizations that have been performed using both visible and x-ray photons.

  14. Outer planet investigations using a CCD camera system. [Saturn disk photometry]

    NASA Technical Reports Server (NTRS)

    Price, M. J.

    1980-01-01

    Problems related to analog noise, data transfer from the camera buffer to the storage computer, and loss of sensitivity of a two dimensional charge coupled device imaging system are reported. To calibrate the CCD system, calibrated UBV pinhole scans of the Saturn disk were obtained with a photoelectric area scanning photometer. Atmospheric point spread functions were also obtained. The UBV observations and models of the Saturn atmosphere are analyzed.

  15. Subtractive imaging in confocal scanning microscopy using a CCD camera as a detector.

    PubMed

    Sánchez-Ortiga, Emilio; Sheppard, Colin J R; Saavedra, Genaro; Martínez-Corral, Manuel; Doblas, Ana; Calatayud, Arnau

    2012-04-01

    We report a scheme for the detector system of confocal microscopes in which the pinhole and a large-area detector are substituted by a CCD camera. The numerical integration of the intensities acquired by the active pixels emulates the signal passing through the pinhole. We demonstrate the imaging capability and the optical sectioning of the system. Subtractive-imaging confocal microscopy can be implemented in a simple manner, providing superresolution and improving optical sectioning. PMID:22466221
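
    A minimal sketch of the detection scheme, with our own parameter names: the pinhole is emulated by integrating the CCD pixels inside a chosen radius around the detection spot, and a subtractive signal is formed from two emulated pinhole sizes:

```python
import numpy as np

# Sketch: software pinhole on a CCD frame, plus a subtractive-imaging signal
# (inner pinhole minus a weighted annulus), as an illustration of the scheme.

def pinhole_signal(frame, center, radius_px):
    """Integrate intensity over pixels within radius_px of center (row, col)."""
    yy, xx = np.indices(frame.shape)
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius_px ** 2
    return float(frame[mask].sum())

def subtractive_signal(frame, center, r_inner, r_outer, weight=0.5):
    """Inner-pinhole signal minus a weighted outer-annulus signal."""
    inner = pinhole_signal(frame, center, r_inner)
    outer = pinhole_signal(frame, center, r_outer)
    return inner - weight * (outer - inner)
```

    Because the pinhole size and the subtraction weight are just parameters of the integration, they can be tuned after acquisition, which is the flexibility the CCD-as-detector scheme buys.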

  16. Development and characterization of a CCD camera system for use on six-inch manipulator systems

    SciTech Connect

    Logory, L.M.; Bell, P.M.; Conder, A.D.; Lee, F.D.

    1996-05-03

    The Lawrence Livermore National Laboratory has designed, constructed, and fielded a compact CCD camera system for use on the Six Inch Manipulator (SIM) at the Nova laser facility. The camera system has been designed to directly replace the 35 mm film packages on all active SIM-based diagnostics. The unit's electronic package is constructed for small size and high thermal conductivity using proprietary printed circuit board technology, thus reducing the size of the overall camera and improving its performance when operated within the vacuum environment of the Nova laser target chamber. The camera has been calibrated and found to yield a linear response, with superior dynamic range and signal-to-noise levels as compared to T-Max 3200 film, while providing real-time access to the data. Limiting factors related to fielding such devices on Nova will be discussed, in addition to planned improvements of the current design.

  17. High-resolution image digitizing through 12x3-bit RGB-filtered CCD camera

    NASA Astrophysics Data System (ADS)

    Cheng, Andrew Y. S.; Pau, Michael C. Y.

    1996-09-01

    A high-resolution computer-controlled CCD image capturing system is developed using a 12-bit 1024 × 1024-pixel CCD camera and motorized RGB filters to capture images with color depth up to 36 bits. The filters distinguish the major components of color and collect them separately, while the CCD camera maintains the spatial resolution and detector filling factor; the color separation is thus done optically rather than electronically. Operation is simple: objects such as color photos, slides and even x-ray transparencies are placed under the camera system, and the necessary parameters such as integration time, mixing level and light intensity are adjusted automatically by an on-line expert system. This greatly reduces restrictions on the objects that can be captured. This unique approach can save considerable time in adjusting image quality and gives much more flexibility in manipulating the captured object, even a 3D object, with minimal setup fixtures. In addition, the cross-sectional dimensions of a 3D object can be analyzed by adopting a fiber-optic ring light source, which is particularly useful in non-contact metrology of 3D structures. The digitized information can be stored in an easily transferable format, and users can also perform a special LUT mapping automatically or manually. Applications of the system include medical image archiving, printing quality control, and 3D machine vision.

  18. Analysis of unstructured video based on camera motion

    NASA Astrophysics Data System (ADS)

    Abdollahian, Golnaz; Delp, Edward J.

    2007-01-01

    Although considerable work has been done in the management of "structured" video such as movies, sports, and television programs, which have known scene structures, "unstructured" video analysis is still a challenging problem due to its unrestricted nature. The purpose of this paper is to address issues in the analysis of unstructured video, in particular video shot by a typical unprofessional user (i.e., home video). We describe how camera motion information can be used for unstructured video analysis. A new concept, "camera viewing direction," is introduced as the building block of home video analysis. Motion displacement vectors are employed to temporally segment the video based on this concept. We then relate camera behavior to the subjective importance of the information in each segment and describe how different patterns in the camera motion can indicate levels of interest in a particular object or scene. By extracting these patterns, the most representative frames, or keyframes, for the scenes are determined and aggregated to summarize the video sequence.
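
    A toy sketch of motion-based temporal segmentation and keyframe selection. The paper's "camera viewing direction" analysis is more elaborate; splitting on a global motion-magnitude threshold and picking the steadiest frame of each segment is our simplification:

```python
import numpy as np

# Sketch: split a frame sequence wherever the per-frame global motion
# magnitude crosses a threshold, and take the steadiest frame of each
# resulting segment as its keyframe.

def segment_and_keyframes(motion_mags, threshold):
    """Return a list of (start, end, keyframe_index) over frame indices."""
    segments, start = [], 0
    for i in range(1, len(motion_mags)):
        # A boundary is a frame where motion crosses the threshold.
        if (motion_mags[i] >= threshold) != (motion_mags[i - 1] >= threshold):
            segments.append((start, i))
            start = i
    segments.append((start, len(motion_mags)))
    out = []
    for s, e in segments:
        k = s + int(np.argmin(motion_mags[s:e]))  # steadiest frame
        out.append((s, e, k))
    return out
```

    In a real pipeline the per-frame magnitudes would come from block-matching or optical-flow displacement vectors rather than being given directly.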

  19. DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF A VIDEO CAMERA POSITIONED ALONG THE PERIMETER OF THE MLP - Cape Canaveral Air Force Station, Launch Complex 39, Mobile Launcher Platforms, Launcher Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  20. DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM ESOUTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF VIDEO CAMERA, MAIN FLOOR LEVEL, PLATFORM E-SOUTH, HB-3, FACING SOUTHWEST - Cape Canaveral Air Force Station, Launch Complex 39, Vehicle Assembly Building, VAB Road, East of Kennedy Parkway North, Cape Canaveral, Brevard County, FL

  1. Are Video Cameras the Key to School Safety?

    ERIC Educational Resources Information Center

    Maranzano, Chuck

    1998-01-01

    Describes one high school's use of video cameras as a preventive tool in stemming theft and violent episodes within schools. The top 10 design tips for preventing crime on campus are highlighted. (GR)

  2. Research on simulation and verification system of satellite remote sensing camera video processor based on dual-FPGA

    NASA Astrophysics Data System (ADS)

    Ma, Fei; Liu, Qi; Cui, Xuenan

    2014-09-01

    To satisfy the need to test the video processors of satellite remote sensing cameras, a design is presented for a simulation and verification system based on dual FPGAs. The correctness of the video processor FPGA logic can be verified even without CCD signals or an analog-to-digital converter. Two Xilinx Virtex FPGAs form the central unit, and the logic for A/D data generation and data processing is developed in VHDL. An RS-232 interface is used to receive commands from the host computer, and different types of data are generated and output depending on the commands. Experimental results show that the simulation and verification system is flexible and works well, and that it meets the requirements for testing the video processors of several different types of satellite remote sensing cameras.

  3. Video camera system for locating bullet holes in targets at a ballistics tunnel

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Rummler, D. R.; Goad, W. K.

    1990-01-01

    A system consisting of a single charge-coupled device (CCD) video camera, a computer-controlled video digitizer, and software to automate the measurement was developed to measure the location of bullet holes in targets at the International Shooters Development Fund (ISDF)/NASA Ballistics Tunnel. The camera/digitizer system is a crucial component of a highly instrumented indoor 50 meter rifle range which is being constructed to support development of wind-resistant, ultra-match ammunition. The system was designed to take data rapidly (10 s between shots) and automatically, with little operator intervention. The system description, measurement concept, and procedure are presented along with laboratory tests of repeatability and bias error. The long-term (1 hour) repeatability of the system was found to be 4 microns (one standard deviation) at the target, and the bias error was found to be less than 50 microns. An analysis of potential errors and a technique for calibration of the system are presented.
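
    The abstract does not describe the hole-location algorithm itself. A common approach such a camera/digitizer system could use, offered here purely as an assumption, is thresholding followed by an intensity-weighted centroid:

```python
import numpy as np

# Hypothetical sketch: locate a dark bullet hole on a bright target by
# thresholding the digitized frame and taking the intensity-weighted
# centroid of the below-threshold pixels.

def hole_centroid(frame, threshold):
    """(row, col) centroid of pixels darker than threshold."""
    mask = frame < threshold
    weights = (threshold - frame) * mask  # darker pixels weigh more
    total = weights.sum()
    yy, xx = np.indices(frame.shape)
    return (float((yy * weights).sum() / total),
            float((xx * weights).sum() / total))
```

    Because the centroid averages over many pixels, it can resolve the hole position to a small fraction of a pixel, consistent with the micron-level repeatability quoted above once the pixel scale is calibrated.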

  4. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  5. Traceability of a CCD-Camera System for High-Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Anhalt, K.; Taubert, R. D.; Krüger, U.; Schmidt, F.

    2015-08-01

    A CCD camera, which has been specially equipped with narrow-band interference filters in the visible spectral range for temperature measurements above 1200 K, was characterized with respect to its temperature response traceable to ITS-90 and with respect to absolute spectral radiance responsivity. The calibration traceable to ITS-90 was performed at a high-temperature blackbody source using a radiation thermometer as a transfer standard. Use of Planck's law and the absolute spectral radiance responsivity of the camera system allows the determination of thermodynamic temperature. For the determination of the absolute spectral radiance responsivity, a monochromator-based setup with a supercontinuum white-light laser source was developed. The CCD camera system was characterized with respect to dark-signal non-uniformity, photo-response non-uniformity, non-linearity, and the size-of-source effect. The influence of these parameters on the calibration and measurement was evaluated and is considered in the uncertainty budget. The results of the two different calibration schemes for the investigated temperature range from 1200 K to 1800 K are in good agreement within the expanded uncertainty. The uncertainty for the absolute spectral responsivity of the camera is 0.56%.
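
    Determining thermodynamic temperature from an absolutely calibrated radiance signal amounts to inverting Planck's law at the filter wavelength. A single-wavelength sketch (ignoring the finite filter bandwidth, which a real evaluation must integrate over):

```python
import numpy as np

# Planck's law for blackbody spectral radiance and its single-wavelength
# inversion for temperature. CODATA-based radiation constants.
C1L = 1.191042972e-16  # 2*h*c^2, W * m^2 / sr
C2 = 1.438776877e-2    # h*c/k, m * K

def planck_radiance(wavelength_m, temperature_K):
    """Blackbody spectral radiance L(lambda, T) in W / (m^3 * sr)."""
    return C1L / wavelength_m ** 5 / np.expm1(C2 / (wavelength_m * temperature_K))

def temperature_from_radiance(wavelength_m, radiance):
    """Invert Planck's law for T at a single wavelength."""
    return C2 / (wavelength_m * np.log1p(C1L / (wavelength_m ** 5 * radiance)))
```

    Using `expm1`/`log1p` keeps the round trip numerically exact, which matters in the Wien region where the exponential term dominates.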

  6. Portable CCD Cameras for Introductory Astronomy Lab:The First Year

    NASA Astrophysics Data System (ADS)

    Meisel, D.; Showers, D.; Lang, M.; Emerling, B.; Greenfield, M.; Abe, W.; Lane, D.

    1993-12-01

    Thirteen portable (battery-operated) CCD camera units have been assembled from commercially available equipment. Each kit consists of a Macintosh Powerbook 145, an SBIG ST-4 CCD array with 8mm f/1.8 TV lens, and camera tripod packed into photographic "suitcases" that students borrow and take outside. Unguided exposures of 50 seconds or less are all that are required to obtain useful images of the moon, planets, the Milky Way, the Andromeda galaxy, and stars (to roughly 5th magnitude in the red) directly from the exasperatingly bright SUNY-Geneseo campus. In addition to unfiltered images, these cameras were also able to take filtered images and low dispersion grating spectra. The images were successfully processed and reduced by the students themselves using a special version of the well-known MAC shareware program MAIA written by Tim DeBenedictis. In the fall of 1993, approximately 130 undergraduate students working in pairs were able to obtain up to 14 images with each camera during two one-hour periods. Images of fields containing bright variable stars were used by the students to prepare short "research" reports for oral presentation. Student reaction was, without exception, enthusiastic despite the usual bouts of astronomical reality ...poor weather, moonlight, and equipment failures. This work is supported by NSF ILI Grant USE9250493 and grants from SUNY-Geneseo.

  8. CCD camera baseline calibration and its effects on imaging processing and laser beam analysis

    NASA Astrophysics Data System (ADS)

    Roundy, Carlos B.

    1997-09-01

    CCD cameras are commonly used for many imaging applications, as well as in optical instrumentation applications. These cameras have many excellent characteristics for both scene imaging and laser beam analysis. However, CCD cameras have two characteristics that limit their potential performance. The first limiting factor is the baseline drift of the camera. If the baseline drifts below the digitizer zero, data in the background are lost and unrecoverable. If the baseline drifts above the digitizer zero, then a false background is introduced into the scene. This false background is partially correctable by taking a background frame with no input image and subtracting it from each imaged frame. ('Partially correctable' is explained in detail later.) The second characteristic that limits CCD cameras is their high level of random noise. A typical CCD camera used with an 8-bit digitizer yielding 256 counts has 2 to 6 counts of random noise in the baseline. The noise is typically Gaussian and goes both positive and negative about a mean or average baseline level. When normal baseline subtraction occurs, the negative noise components are truncated, leaving only the positive components. These lost negative noise components can distort measurements that rely on a low-intensity background. Situations exist in which the baseline offset and the lost negative noise components are very significant. For example, in image processing, when attempting to distinguish data with very low contrast between objects, the contrast is compromised by the loss of the negative noise. Secondly, the measurement of laser beam widths requires analysis of very low intensity signals far out in the wings of the beam. The intensity is low, but the area is large, so even small distortions can create significant errors in measuring beam width. The effect of baseline error is particularly significant for the measurement of a laser beam width. This measurement is very important because it gives the size of the beam at the measurement point, it is used in laser divergence measurement, and it is critical for realistic measurement of M2, the ultimate criterion for the quality of a laser beam. One measure of laser beam width, called second moment or D4(sigma), which is the ISO definition of the true laser beam width, is especially sensitive to noise in the baseline. The D4(sigma) method integrates all signal far out into the wings of the beam and gives particular weight to the noise and signal in the wings. It is impossible to make this measurement accurately without the negative noise components and without other special algorithms to limit the effect of noise in the wings.
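
    The D4(sigma) second-moment width can be computed directly from a background-corrected intensity image. A minimal sketch (with none of the wing-noise suppression the text says is needed in practice; note the image must be kept in a signed type so negative noise survives):

```python
import numpy as np

# Second-moment (D4-sigma) beam widths from a 2-D intensity distribution:
# D4sigma = 4 * sqrt(second central moment) along each axis.

def d4sigma_widths(intensity, pixel_pitch=1.0):
    """Return (D4sigma_x, D4sigma_y) of a 2-D intensity image."""
    I = intensity.astype(float)  # signed: keep negative noise values
    yy, xx = np.indices(I.shape)
    total = I.sum()
    cx = (xx * I).sum() / total  # intensity centroid
    cy = (yy * I).sum() / total
    var_x = (((xx - cx) ** 2) * I).sum() / total
    var_y = (((yy - cy) ** 2) * I).sum() / total
    return 4 * pixel_pitch * np.sqrt(var_x), 4 * pixel_pitch * np.sqrt(var_y)
```

    For a Gaussian beam this returns 4σ, i.e. twice the 1/e² diameter convention agrees; the (x - cx)² weighting is exactly why baseline offsets and truncated noise in the wings corrupt the result so strongly.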

  9. Design of an Event-Driven Random-Access-Windowing CCD-Based Camera

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P.; Lam, Raymond K.; Portillo, Angel A.; Ortiz, Gerardo G.

    2003-01-01

    Commercially available cameras are not designed for the combination of single-frame and high-speed streaming digital video with real-time control of the size and location of multiple regions of interest (ROIs). A new control paradigm is defined to eliminate the tight coupling between the camera logic and the host controller. This functionality is achieved by defining the indivisible pixel readout operation on a per-ROI basis with in-camera timekeeping capability. This methodology provides a Random-Access, Real-Time, Event-driven (RARE) camera for adaptive camera control and is well suited for target-tracking applications requiring autonomous control of multiple ROIs. It additionally provides reduced ROI readout time and higher frame rates compared to the original architecture by avoiding external control intervention during the ROI readout process.
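
    The per-ROI readout idea can be sketched in Python (a hypothetical illustration only; `ROI` and `read_rois` are invented names, not the camera's actual interface, and the real operation happens in camera hardware rather than on the host):

```python
from dataclasses import dataclass

@dataclass
class ROI:
    x: int  # left column of the window
    y: int  # top row of the window
    w: int  # window width in pixels
    h: int  # window height in pixels

def read_rois(frame, rois):
    """Read each region of interest from `frame` (a list of pixel rows) as
    one indivisible operation, so the host controller never has to
    micromanage row/column addressing during readout."""
    return [[row[r.x:r.x + r.w] for row in frame[r.y:r.y + r.h]] for r in rois]
```

    Because each ROI is read as a unit, the host only issues ROI descriptors and timestamps, which is what decouples the camera logic from the controller.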

  10. Optical readout of a two phase liquid argon TPC using CCD camera and THGEMs

    NASA Astrophysics Data System (ADS)

    Mavrokoridis, K.; Ball, F.; Carroll, J.; Lazos, M.; McCormick, K. J.; Smith, N. A.; Touramanis, C.; Walker, J.

    2014-02-01

    This paper presents a preliminary study into the use of CCDs to image secondary scintillation light generated by THick Gas Electron Multipliers (THGEMs) in a two phase LAr TPC. A Sony ICX285AL CCD chip was mounted above a double THGEM in the gas phase of a 40 litre two-phase LAr TPC, with the majority of the camera electronics positioned externally via a feedthrough. An Am-241 source was mounted on a rotatable motion feedthrough, allowing the alpha source to be positioned either inside or outside of the field cage. A novel high-voltage feedthrough featuring LAr insulation was developed for and incorporated into the TPC design. Furthermore, a range of webcams was tested for operation at cryogenic temperatures as an internal detector-monitoring tool. Of the webcams tested, the Microsoft HD-3000 (model no. 1456) was found to be superior in terms of noise and lowest operating temperature. In 1 ppm-purity argon gas at ambient temperature and atmospheric pressure, the THGEM gain was ≈ 1000, and using a 1 ms exposure the CCD captured single alpha tracks. Successful operation of the CCD camera in two-phase cryogenic mode was also achieved. Using a 10 s exposure, a photograph of secondary scintillation light induced by the Am-241 source in LAr was captured for the first time.

  11. OCam with CCD220, the Fastest and Most Sensitive Camera to Date for AO Wavefront Sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz Garcia, José Javier

    2011-03-01

    For the first time, subelectron readout noise has been achieved with a camera dedicated to astronomical wavefront-sensing applications. The OCam system demonstrated this performance at a 1300 Hz frame rate and with a 240 × 240 pixel frame size. ESO and JRA2 OPTICON jointly funded e2v Technologies to develop a custom CCD for adaptive optics (AO) wavefront-sensing applications. The device, called CCD220, is a compact Peltier-cooled 240 × 240 pixel frame-transfer eight-output back-illuminated sensor using EMCCD technology. This article demonstrates, for the first time, subelectron readout noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the quantitative performance characterization of OCam and the CCD220, including readout noise, dark current, multiplication gain, quantum efficiency, and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks, and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, has been designed to offer enhanced performance, a completely sealed camera package, and an additional Peltier stage to facilitate operation on a telescope or in environmentally challenging applications. New features of OCam2 are presented in this article. This instrumental development will strongly impact the performance of the most advanced AO systems to come.

  12. Video Cameras in the Ondrejov Flare Spectrograph Results and Prospects

    NASA Astrophysics Data System (ADS)

    Kotrc, P.

    Since 1991, video cameras have been widely used both in the image and in the spectral data acquisition of the Ondrejov Multichannel Flare Spectrograph. In addition to classical photographic data registration, this kind of detector brought new possibilities, especially for observations of dynamical solar phenomena, and put new requirements on digitization, archiving, and data-processing techniques. The unique complex video system, consisting of four video cameras and auxiliary equipment, was mostly developed, implemented, and used at the Ondrejov observatory. The main advantages and limitations of the system are briefly described from the points of view of its scientific philosophy, intents, and outputs. Some results obtained, accumulated experience, and future prospects are discussed.

  13. Proton radiation damage experiment on P-Channel CCD for an X-ray CCD camera onboard the ASTRO-H satellite

    NASA Astrophysics Data System (ADS)

    Mori, Koji; Nishioka, Yusuke; Ohura, Satoshi; Koura, Yoshiaki; Yamauchi, Makoto; Nakajima, Hiroshi; Ueda, Shutaro; Kan, Hiroaki; Anabuki, Naohisa; Nagino, Ryo; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Kohmura, Takayoshi; Ikeda, Shoma; Murakami, Hiroshi; Ozaki, Masanobu; Dotani, Tadayasu; Maeda, Yukie; Sagara, Kenshi

    2013-12-01

    We report on a proton radiation damage experiment on a P-channel CCD newly developed for an X-ray CCD camera onboard the ASTRO-H satellite. The device was exposed to up to 10^9 protons cm^-2 at 6.7 MeV. The charge transfer inefficiency (CTI) was measured as a function of radiation dose. In comparison with the CTI measured in the CCD camera that has operated onboard the Suzaku satellite for 6 years, we confirmed that the new type of P-channel CCD is sufficiently radiation tolerant for use in space. We also confirmed that a charge-injection technique and lowering the operating temperature work effectively to reduce the CTI of our device. A comparison with other P-channel CCD experiments is also discussed.
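
    The CTI quoted in such measurements follows the standard definition of fractional charge loss per pixel transfer; a minimal helper (an illustrative sketch, not the authors' analysis code) inverts the usual survival model:

```python
def cti_from_charge_loss(q_injected, q_measured, n_transfers):
    """Charge transfer inefficiency (CTI) from the charge surviving
    n_transfers pixel transfers, using the standard model
    q_measured = q_injected * (1 - CTI) ** n_transfers."""
    return 1.0 - (q_measured / q_injected) ** (1.0 / n_transfers)
```

    This is why charge injection helps: injected charge fills traps ahead of the signal packet, raising q_measured and thus lowering the effective CTI.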

  14. Soft-x-ray Gabor holography by use of a backilluminated CCD camera

    NASA Astrophysics Data System (ADS)

    Watanabe, N.; Sakurai, K.; Takeuchi, A.; Aoki, S.

    1997-10-01

    A soft-x-ray Gabor hologram was recorded directly by means of a cooled back-illuminated CCD camera, and numerical reconstruction was performed. Synchrotron radiation from a bending magnet at beamline 11A of the Photon Factory, National Laboratory for High Energy Physics, Japan, was used. X rays were monochromatized to a wavelength of 2.34 nm with a grasshopper monochromator and focused onto a pinhole of diameter 1.0 µm with a zone plate. A specimen was illuminated with the soft x rays transmitted through the pinhole, and a shadowgraph with a magnification ratio of 33 at the CCD plane was digitized directly. The transverse resolution of the reconstructed image was estimated to be 1.5 µm, which was in good agreement with the theoretical value.

  15. Design and realization of an image mosaic system on the CCD aerial camera

    NASA Astrophysics Data System (ADS)

    Liu, Hai ying; Wang, Peng; Zhu, Hai bin; Li, Yan; Zhang, Shao jun

    2015-08-01

    It has long been difficult in aerial photography to stitch multi-route images into a panoramic image in real time for a multi-route flight framing CCD camera, owing to the very large amount of data and the high accuracy requirements. An automatic aerial image mosaic system based on a GPU development platform is described in this paper. Parallel computing of the SIFT feature extraction and matching algorithm module is achieved by using CUDA technology for motion-model parameter estimation on the platform, which makes it possible to stitch multiple CCD images in real time. Aerial tests proved that the mosaic system meets the user's requirements, with 99% accuracy and a 30- to 50-fold speed improvement over a conventional mosaic system.

  16. A pnCCD-based, fast direct single electron imaging camera for TEM and STEM

    NASA Astrophysics Data System (ADS)

    Ryll, H.; Simson, M.; Hartmann, R.; Holl, P.; Huth, M.; Ihle, S.; Kondo, Y.; Kotula, P.; Liebel, A.; Müller-Caspary, K.; Rosenauer, A.; Sagawa, R.; Schmidt, J.; Soltau, H.; Strüder, L.

    2016-04-01

    We report on a new camera that is based on a pnCCD sensor for applications in scanning transmission electron microscopy. Emerging microscopy techniques demand improved detectors with regard to readout rate, sensitivity, and radiation hardness, especially in scanning mode. The pnCCD is a 2D imaging sensor that meets these requirements. Its intrinsic radiation hardness permits direct detection of electrons. The pnCCD is read out at a rate of 1,150 frames per second with an image area of 264 x 264 pixels. In binning or windowing modes, the readout rate increases almost linearly, for example to 4,000 frames per second at 4x binning (264 x 66 pixels). Single electrons with energies from 300 keV down to 5 keV can be distinguished due to the high sensitivity of the detector. Three applications in scanning transmission electron microscopy are highlighted to demonstrate that the pnCCD satisfies experimental requirements, especially fast recording of 2D images. In the first application, 65,536 2D diffraction patterns were recorded in 70 s. STEM images corresponding to intensities of various diffraction peaks were reconstructed. For the second application, the microscope was operated in a Lorentz-like mode. Magnetic domains were imaged over an area of 256 x 256 sample points in less than 37 seconds, for a total of 65,536 images, each with 264 x 132 pixels. Due to the information provided by the two-dimensional images, not only the amplitude but also the direction of the magnetic field could be determined. In the third application, millisecond images of a semiconductor nanostructure were recorded to determine the lattice strain in the sample. A speed-up in measurement time by a factor of 200 was achieved compared to a previously used camera system.

  17. Experimental research on femto-second laser damaging array CCD cameras

    NASA Astrophysics Data System (ADS)

    Shao, Junfeng; Guo, Jin; Wang, Ting-feng; Wang, Ming

    2013-05-01

    Charge-coupled devices (CCDs) are widely used in military and security applications, such as airborne and ship-based surveillance, satellite reconnaissance, and so on. Homeland security requires effective means to negate these advanced overseeing systems. Research shows that CCD-based EO systems can be significantly dazzled or even damaged by high-repetition-rate pulsed lasers. Here, we report on femtosecond laser interaction with a CCD camera, which is likely to be of great importance in the future. Femtosecond lasers are a comparatively new class of lasers with unique characteristics, such as extremely short pulse width (1 fs = 10^-15 s), extremely high peak power (1 TW = 10^12 W), and distinctive behavior when interacting with matter. Research on femtosecond laser interaction with materials (metals, dielectrics) clearly indicates that non-thermal effects dominate the process, in marked contrast to the interaction of long pulses with matter. First, damage threshold tests were performed with a femtosecond laser acting on the CCD camera. An 800 nm, 500 μJ, 100 fs laser pulse was used to irradiate an interline CCD solid-state image sensor in the experiment. In order to focus the laser energy onto the tiny CCD active cells, an optical system of F/5.6 was used. Sony production CCDs were chosen as typical targets. The damage threshold was evaluated from multiple test data. Point damage, line damage, and full-array damage were observed as the irradiated pulse energy was continuously increased during the experiment. The point damage threshold was found to be 151.2 mJ/cm2, the line damage threshold 508.2 mJ/cm2, and the full-array damage threshold 5.91 J/cm2. Although the phenomena are almost the same as those of nanosecond laser interaction with CCDs, these damage thresholds are substantially lower than the data obtained from nanosecond laser interaction with CCDs.
Next, the electrical characteristics after different degrees of damage were tested with an electronic multimeter. The resistance values between clock signal lines were measured. Comparing the resistance values of the CCD before and after damage, it was found that the resistances between the vertical transfer clock signal lines decreased significantly. The same result was found between the vertical transfer clock signal line and the earth electrode (ground). Finally, the damage position and the damage mechanism were analyzed using the above results together with SEM morphological experiments. Point damage results from the laser destroying material and shows no macroscopic electrical influence. Line damage is quite different from point damage, showing a deeper material-corroding effect; more importantly, short circuits were found between vertical clock lines. Full-array damage appears even more severe than line damage under SEM, while no electrical features obviously different from those of line damage were found. Further research on the mechanism of femtosecond-laser-induced CCD damage with more advanced tools is anticipated. This research is valuable for EO countermeasure and/or laser shielding applications.

  18. An intensified/shuttered cooled CCD camera for dynamic proton radiography

    SciTech Connect

    Yates, G.J.; Albright, K.L.; Alrick, K.R.

    1998-12-31

    An intensified/shuttered cooled PC-based CCD camera system was designed and successfully fielded on proton radiography experiments at the Los Alamos National Laboratory LANSCE facility using 800-MeV protons. The four camera detector system used front-illuminated full-frame CCD arrays (two 1,024 x 1,024 pixels and two 512 x 512 pixels) fiber optically coupled to either 25-mm diameter planar diode or microchannel plate image intensifiers which provided optical shuttering for time resolved imaging of shock propagation in high explosives. The intensifiers also provided wavelength shifting and optical gain. Typical sequences consisting of four images corresponding to consecutive exposures of about 500 ns duration for 40-ns proton burst images (from a fast scintillating fiber array) separated by approximately 1 microsecond were taken during the radiography experiments. Camera design goals and measured performance characteristics including resolution, dynamic range, responsivity, system detection quantum efficiency (DQE), and signal-to-noise will be discussed.

  19. Upwelling radiance at 976 nm measured from space using the OPALS CCD camera on the ISS

    NASA Astrophysics Data System (ADS)

    Biswas, Abhijit; Kovalik, Joseph M.; Oaida, Bogdan V.; Abrahamson, Matthew; Wright, Malcolm W.

    2015-03-01

    The Optical Payload for Lasercomm Science (OPALS) Flight System on-board the International Space Station uses a charge coupled device (CCD) camera to detect a beacon laser from Earth. Relative measurements of the background contributed by upwelling radiance under diverse illumination conditions and varying surface terrain are presented. In some cases clouds in the field-of-view allowed a comparison of terrestrial and cloud-top upwelling radiance. In this paper we report these measurements and examine the extent of agreement with atmospheric model predictions.

  20. Evidence for micrometeoroid damage in the pn-CCD camera system aboard XMM-Newton

    NASA Astrophysics Data System (ADS)

    Strüder, L.; Aschenbach, B.; Bräuninger, H.; Drolshagen, G.; Englhauser, J.; Hartmann, R.; Hartner, G.; Holl, P.; Kemmer, J.; Meidinger, N.; Stübig, M.; Trümper, J.

    2001-08-01

    The mirror systems of the X-ray observatory XMM-Newton were designed to image X-rays up to 15 keV by grazing incidence reflection onto a focal plane, equipped with Charge Coupled Devices (CCDs). In orbit # 156 we have observed a sudden increase of about 35 ``bright'' pixels spread over 15 cm2 in the pn-CCD camera system. The amount of locally generated leakage current cannot be explained by ionizing particles. We suggest that a micrometeoroid scattered under a small angle off the X-ray telescope mirror surface finally reached the focal plane detector and produced the damage.

  1. The Laboratory Radiometric Calibration of the CCD Stereo Camera for the Optical Payload of the Lunar Explorer Project

    NASA Astrophysics Data System (ADS)

    Wang, Jue; Li, Chun-Lai; Zhao, Bao-Chang

    2007-03-01

    The optical payload system for the Lunar Explorer includes a CCD stereo camera and an imaging interferometer. The former is designed to obtain stereoscopic images of the lunar surface together with a laser altimeter. The camera's working principle, the purpose and content of the calibration, bare-chip testing, and the process of relative and absolute calibration in the laboratory are introduced.

  2. HERSCHEL/SCORE, imaging the solar corona in visible and EUV light: CCD camera characterization.

    PubMed

    Pancrazzi, M; Focardi, M; Landini, F; Romoli, M; Fineschi, S; Gherardi, A; Pace, E; Massone, G; Antonucci, E; Moses, D; Newmark, J; Wang, D; Rossi, G

    2010-07-01

    The HERSCHEL (helium resonant scattering in the corona and heliosphere) experiment is a rocket mission that was successfully launched last September from White Sands Missile Range, New Mexico, USA. HERSCHEL was conceived to investigate the solar corona in the extreme UV (EUV) and in the visible broadband polarized brightness and provided, for the first time, a global map of helium in the solar environment. The HERSCHEL payload consisted of a telescope, HERSCHEL EUV Imaging Telescope (HEIT), and two coronagraphs, HECOR (helium coronagraph) and SCORE (sounding coronagraph experiment). The SCORE instrument was designed and developed mainly by Italian research institutes and it is an imaging coronagraph to observe the solar corona from 1.4 to 4 solar radii. SCORE has two detectors for the EUV lines at 121.6 nm (HI) and 30.4 nm (HeII) and the visible broadband polarized brightness. The SCORE UV detector is an intensified CCD with a microchannel plate coupled to a CCD through a fiber-optic bundle. The SCORE visible light detector is a frame-transfer CCD coupled to a polarimeter based on a liquid crystal variable retarder plate. The SCORE coronagraph is described together with the performances of the cameras for imaging the solar corona. PMID:20428852

  3. LAIWO: a new wide-field CCD camera for Wise Observatory

    NASA Astrophysics Data System (ADS)

    Baumeister, Harald; Afonso, Cristina; Marien, Karl-Heinz; Klein, Ralf

    2006-06-01

    LAIWO is a new CCD wide-field camera for the 40-inch Ritchey-Chretien telescope at Wise Observatory in Mitzpe Ramon, Israel. The telescope is identical to the 40-in. telescope at Las Campanas Observatory, Chile, which is described in [2]. LAIWO was designed and built at the Max-Planck-Institute for Astronomy in Heidelberg, Germany. The scientific aim of the instrument is to detect Jupiter-sized extra-solar planets around I=14-15 magnitude stars with the transit method, which relies on the temporary drop in brightness of the parent star harboring the planet. LAIWO can observe a 1.4 x 1.4 degree field-of-view and has four CCDs with 4096 x 4096 pixels each. The Fairchild Imaging CCDs have a pixel size of 15 microns. Since they are not 2-side buttable, they are arranged with spacings between the chips equal to the size of a single CCD minus a small overlap. The CCDs are cooled by liquid nitrogen to a temperature of about -100 °C. The four science CCDs and the guider CCD are mounted on a common cryogenic plate which can be adjusted in three degrees of freedom. Each of these detectors can also be adjusted independently by a similar mechanism. The instrument contains large shutter and filter mechanisms, both designed in a modular way for fast exchange and easy maintenance.

  4. Synchronizing Light Pulses With Video Camera

    NASA Technical Reports Server (NTRS)

    Kalshoven, James E., Jr.; Tierney, Michael; Dabney, Philip

    1993-01-01

    Interface circuit triggers laser or other external source of light to flash in proper frame and field (at proper time) for video recording and playback in "pause" mode. Also increases speed of electronic shutter (if any) during affected frame to reduce visibility of background illumination relative to that of laser illumination.

  5. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

    Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebrovasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, caused by blockage of an artery in the brain through thrombosis or arterial embolism. Hence, the development of an imaging technique to image or monitor cerebral ischemia and the effect of anti-stroke therapy is more than necessary. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) for imaging embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. As compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for preclinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We are able to obtain good reconstructed images with two recently developed algorithms: (1) a depth compensation algorithm (DCA) and (2) a globally convergent method (GCM). We demonstrate volumetric tomographic reconstruction results from a tissue phantom; the approach has great potential for determining and monitoring the effect of anti-stroke therapies.

  6. A toolkit for the characterization of CCD cameras for transmission electron microscopy.

    PubMed

    Vulovic, M; Rieger, B; van Vliet, L J; Koster, A J; Ravelli, R B G

    2010-01-01

    Charge-coupled devices (CCD) are nowadays commonly utilized in transmission electron microscopy (TEM) for applications in life sciences. Direct access to digitized images has revolutionized the use of electron microscopy, sparking developments such as automated collection of tomographic data, focal series, random conical tilt pairs and ultralarge single-particle data sets. Nevertheless, for ultrahigh-resolution work photographic plates are often still preferred. In the ideal case, the quality of the recorded image of a vitrified biological sample would solely be determined by the counting statistics of the limited electron dose the sample can withstand before beam-induced alterations dominate. Unfortunately, the image is degraded by the non-ideal point-spread function of the detector, as a result of a scintillator coupled by fibre optics to a CCD, and the addition of several inherent noise components. Different detector manufacturers provide different types of figures of merit when advertising the quality of their detector. It is hard for most laboratories to verify whether all of the anticipated specifications are met. In this report, a set of algorithms is presented to characterize on-axis slow-scan large-area CCD-based TEM detectors. These tools have been added to a publicly available image-processing toolbox for MATLAB. Three in-house CCD cameras were carefully characterized, yielding, among others, statistics for hot and bad pixels, the modulation transfer function, the conversion factor, the effective gain and the detective quantum efficiency. These statistics will aid data-collection strategy programs and provide prior information for quantitative imaging. The relative performance of the characterized detectors is discussed and a comparison is made with similar detectors that are used in the field of X-ray crystallography. PMID:20057054
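
    The conversion factor (effective gain) characterized by such toolkits is conventionally estimated with the photon-transfer (mean-variance) method; a simplified sketch follows (assuming shot-noise-limited flat fields and a Gaussian approximation to Poisson statistics; the function names are illustrative, not from the MATLAB toolbox described):

```python
import random

def conversion_factor(mean_dn, var_dn, read_var_dn=0.0):
    """Photon-transfer estimate of the conversion factor g (e-/DN): for a
    shot-noise-limited signal, var(e-) = mean(e-), which implies
    g = mean_DN / (var_DN - read_var_DN)."""
    return mean_dn / (var_dn - read_var_dn)

def simulate_flat_field(electrons, gain, n_pixels, seed=0):
    """Simulate a uniform exposure: each pixel collects roughly
    N(electrons, sqrt(electrons)) electrons (Gaussian approximation to
    Poisson shot noise), digitized at `gain` electrons per DN."""
    rng = random.Random(seed)
    return [rng.gauss(electrons, electrons ** 0.5) / gain for _ in range(n_pixels)]
```

    In practice the read-noise variance is measured from bias frames and subtracted, and the mean-variance slope is fitted over many exposure levels rather than a single one.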

  7. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument currently being developed by NASA's Marshall Space Flight Center (MSFC), the National Astronomical Observatory of Japan (NAOJ), and other partners. The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6 nm) light emitted by the Sun's chromosphere. The polarized spectrum imaged by the CCD cameras will capture information about the local magnetic field, allowing for measurements of magnetic strength and structure. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, read noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of 2.0 ± 0.5 e-/DN, a read noise level of less than 25 e-, a dark current of less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
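
    The numeric requirements stated above can be captured in a simple pass/fail check (an illustrative helper using the requirement values quoted in the abstract; the function name is invented):

```python
def meets_clasp_requirements(gain, read_noise, dark_current, nonlinearity):
    """Check measured camera characteristics against the stated CLASP
    requirements: gain 2.0 +/- 0.5 e-/DN, read noise < 25 e-,
    dark current < 10 e-/pixel/s, residual non-linearity < 1%."""
    return (abs(gain - 2.0) <= 0.5
            and read_noise < 25.0
            and dark_current < 10.0
            and nonlinearity < 0.01)
```
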

  8. Toward Dietary Assessment via Mobile Phone Video Cameras

    PubMed Central

    Chen, Nicholas; Lee, Yun Young; Rabb, Maurice; Schatz, Bruce

    2010-01-01

    Reliable dietary assessment is a challenging yet essential task for determining general health. Existing efforts are manual, require considerable effort, and are prone to underestimation and misrepresentation of food intake. We propose leveraging mobile phones to make this process faster, easier and automatic. Using mobile phones with built-in video cameras, individuals capture short videos of their meals; our software then automatically analyzes the videos to recognize dishes and estimate calories. Preliminary experiments on 20 typical dishes from a local cafeteria show promising results. Our approach complements existing dietary assessment methods to help individuals better manage their diet to prevent obesity and other diet-related diseases. PMID:21346950

  9. Soft x-ray response of the x-ray CCD camera directly coated with optical blocking layer

    NASA Astrophysics Data System (ADS)

    Ikeda, S.; Kohmura, T.; Kawai, K.; Kaneko, K.; watanabe, T.; Tsunemi, H.; Hayashida, K.; Anabuki, N.; Nakajima, H.; Ueda, S.; Tsuru, T. G.; Dotani, T.; Ozaki, M.; Matsuta, K.; Fujinaga, T.; Kitamoto, S.; Murakami, H.; Hiraga, J.; Mori, K.; ASTRO-H SXI Team

    2012-03-01

    We have developed a back-illuminated X-ray CCD camera (BI-CCD) for X-ray observations in space. Because an X-ray CCD is sensitive not only to X-rays but also to optical and UV light, it has to be equipped with a filter to cut off optical as well as UV light. The X-ray Imaging Spectrometer (XIS) onboard the Suzaku satellite was equipped with a thin film (OBF: Optical Blocking Filter) to cut off optical and UV light. The OBF is always in danger of tearing from acoustic noise or vibration during launch, and it is difficult to handle on the ground because of its thinness. Instead of an OBF, we have newly developed and produced an OBL (Optical Blocking Layer), which is coated directly onto the X-ray CCD surface.

  10. CTK-II & RTK: The CCD-cameras operated at the auxiliary telescopes of the University Observatory Jena

    NASA Astrophysics Data System (ADS)

    Mugrauer, M.

    2016-02-01

    The Cassegrain-Teleskop-Kamera (CTK-II) and the Refraktor-Teleskop-Kamera (RTK) are two CCD-imagers which are operated at the 25 cm Cassegrain and 20 cm refractor auxiliary telescopes of the University Observatory Jena. This article describes the main characteristics of these instruments. The properties of the CCD-detectors, the astrometry, the image quality, and the detection limits of both CCD-cameras, as well as some results of ongoing observing projects, carried out with these instruments, are presented. Based on observations obtained with telescopes of the University Observatory Jena, which is operated by the Astrophysical Institute of the Friedrich-Schiller-University.

  11. Using a Digital Video Camera to Study Motion

    ERIC Educational Resources Information Center

    Abisdris, Gil; Phaneuf, Alain

    2007-01-01

    To illustrate how a digital video camera can be used to analyze various types of motion, this simple activity analyzes the motion and measures the acceleration due to gravity of a basketball in free fall. Although many excellent commercially available data loggers and software can accomplish this task, this activity requires almost no financial…
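
    The kind of analysis described, extracting the acceleration due to gravity from per-frame positions of a falling ball, can be sketched as follows (an illustrative example under the stated free-fall assumption, not the activity's actual software):

```python
def acceleration_from_frames(positions, fps):
    """Estimate acceleration from per-frame positions (in metres) using
    central second differences; for free fall this recovers g ~ 9.8 m/s^2."""
    dt = 1.0 / fps
    accels = [(positions[i + 1] - 2 * positions[i] + positions[i - 1]) / dt ** 2
              for i in range(1, len(positions) - 1)]
    return sum(accels) / len(accels)
```

    With positions digitized from a 30 fps video clip, each triple of consecutive frames gives one acceleration estimate, and averaging them suppresses digitization noise.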

  12. 67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    67. DETAIL OF VIDEO CAMERA CONTROL PANEL LOCATED IMMEDIATELY WEST OF ASSISTANT LAUNCH CONDUCTOR PANEL SHOWN IN CA-133-1-A-66 - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  13. Lights, Camera, Action! Using Video Recordings to Evaluate Teachers

    ERIC Educational Resources Information Center

    Petrilli, Michael J.

    2011-01-01

    Teachers and their unions do not want test scores to count for everything; classroom observations are key, too. But planning a couple of visits from the principal is hardly sufficient. These visits may "change the teacher's behavior"; furthermore, principals may not be the best judges of effective teaching. So why not put video cameras in…


  15. Charge-coupled device (CCD) television camera for NASA's Galileo mission to Jupiter

    NASA Technical Reports Server (NTRS)

    Klaasen, K. P.; Clary, M. C.; Janesick, J. R.

    1982-01-01

    The CCD detector under construction for use in the slow-scan television camera for the NASA Galileo Jupiter orbiter to be launched in 1985 is presented. The science objectives and the design constraints imposed by the earth telemetry link, platform residual motion, and the Jovian radiation environment are discussed. Camera optics are inherited from Voyager; filter wavelengths are chosen to enable discrimination of Galilean-satellite surface chemical composition. The CCD design, an 800 by 800-element 'virtual-phase' solid-state silicon image-sensor array with supporting electronics, is described with detailed discussion of the thermally generated dark current, quantum efficiency, signal-to-noise ratio, and resolution. Tests of the effect of ionizing radiation were performed and are analyzed statistically. An imaging mode using a 2-1/3-sec frame time and on-chip summation of the signal in 2 x 2 blocks of adjacent pixels is designed to limit the effects of the most extreme Jovian radiation. Smearing due to spacecraft/target relative velocity and platform instability will be corrected for via an algorithm maximizing spatial resolution at a given signal-to-noise level. The camera is expected to produce 40,000 images of Jupiter and its satellites during the 20-month mission.

  16. Stereo Imaging Velocimetry Technique Using Standard Off-the-Shelf CCD Cameras

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth

    2004-01-01

    Stereo imaging velocimetry is a fluid physics technique for measuring three-dimensional (3D) velocities at a plurality of points. This technique provides full-field 3D analysis of any optically clear fluid or gas experiment seeded with tracer particles. Unlike current 3D particle imaging velocimetry systems that rely primarily on laser-based systems, stereo imaging velocimetry uses standard off-the-shelf charge-coupled device (CCD) cameras to provide accurate and reproducible 3D velocity profiles for experiments that require 3D analysis. Using two cameras aligned orthogonally, we present a closed mathematical solution resulting in an accurate 3D approximation of the observation volume. The stereo imaging velocimetry technique is divided into four phases: 3D camera calibration, particle overlap decomposition, particle tracking, and stereo matching. Each phase is explained in detail. In addition to being utilized for space shuttle experiments, stereo imaging velocimetry has been applied to the fields of fluid physics, bioscience, and colloidal microscopy.
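    The closed-form reconstruction from two orthogonal views can be illustrated with a toy model, assuming idealized cameras: one looking along the z axis (reporting x, y) and one along the x axis (reporting y, z). The function names and the averaging of the shared y coordinate are illustrative assumptions, not the paper's actual formulation, which includes a full camera calibration:

```python
import numpy as np

def reconstruct_3d(xy_cam, yz_cam):
    """Combine pixel coordinates from two orthogonal views into a 3D point.

    Hypothetical simplification: camera A looks along +z and reports (x, y);
    camera B looks along +x and reports (y, z). The y coordinate is seen by
    both cameras, so averaging it reduces measurement noise.
    """
    x = xy_cam[0]
    z = yz_cam[1]
    y = 0.5 * (xy_cam[1] + yz_cam[0])
    return np.array([x, y, z])

def velocity(p0, p1, dt):
    """Finite-difference 3D velocity between two tracked particle positions."""
    return (p1 - p0) / dt

# a tracer particle seen by both cameras in two successive frames
p0 = reconstruct_3d((1.0, 2.0), (2.0, 0.0))
p1 = reconstruct_3d((1.5, 2.2), (2.2, 0.3))
print(velocity(p0, p1, dt=0.1))
```

In a real system the particle tracking and stereo matching phases supply the per-frame correspondences that feed this step.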

  17. Benchmarking of Back Thinned 512x512 X-ray CCD Camera Measurements with DEF X-ray film

    NASA Astrophysics Data System (ADS)

    Shambo, N. A.; Workman, J.; Kyrala, G.; Hurry, T.; Gonzales, R.; Evans, S. C.

    1999-11-01

    Using the Trident Laser Facility at Los Alamos National Laboratory, 25-micron-thick, 2-mm-diameter titanium disks were shot with 527 nm (green) laser light to measure X-ray yield. 1.0-mil and 0.5-mil aluminum steps were used to test the linearity of the CCD camera, and DEF X-ray film was used to test the calibration of the CCD camera response at 4.75 keV. Both the laser spot size and the incident laser intensity were held fixed to keep the experimental data consistent. This poster will discuss both the experimental design and the results.

  18. CameraCast: flexible access to remote video sensors

    NASA Astrophysics Data System (ADS)

    Kong, Jiantao; Ganev, Ivan; Schwan, Karsten; Widener, Patrick

    2007-01-01

    New applications like remote surveillance and online environmental or traffic monitoring are making it increasingly important to provide flexible and protected access to remote video sensor devices. Current systems use application-level codes like web-based solutions to provide such access. This requires adherence to user-level APIs provided by such services, access to remote video information through given application-specific service and server topologies, and that the data being captured and distributed is manipulated by third party service codes. CameraCast is a simple, easily used system-level solution to remote video access. It provides a logical device API so that an application can identically operate on local vs. remote video sensor devices, using its own service and server topologies. In addition, the application can take advantage of API enhancements to protect remote video information, using a capability-based model for differential data protection that offers fine grain control over the information made available to specific codes or machines, thereby limiting their ability to violate privacy or security constraints. Experimental evaluations of CameraCast show that the performance of accessing remote video information approximates that of accesses to local devices, given sufficient networking resources. High performance is also attained when protection restrictions are enforced, due to an efficient kernel-level realization of differential data protection.

  19. Performance of front-end mixed-signal ASIC for onboard CCD cameras

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Inoue, Shota; Nagino, Ryo; Anabuki, Naohisa; Hayashida, Kiyoshi; Tsunemi, Hiroshi; Doty, John P.; Ikeda, Hirokazu

    2014-07-01

    We report on the development status of the readout ASIC for an onboard X-ray CCD camera. Quick, low-noise readout is essential for pile-up-free imaging spectroscopy with future highly sensitive telescopes. The dedicated ASIC for ASTRO-H/SXI has sufficient noise performance only at a slow pixel rate of 68 kHz. We have therefore been developing an upgraded ASIC with fourth-order ΔΣ modulators. Raising the order of the modulator allows us to oversample the CCD signals fewer times, enabling a higher pixel rate. The digitized pulse height is a serial bit stream that is decoded with a decimation filter. The weighting coefficients of the filter are optimized by simulation to maximize the signal-to-noise ratio. We present performance figures such as the input equivalent noise (IEN), gain, and effective signal range. The digitized pulse height data are successfully obtained in the first functional test up to 625 kHz. IEN is almost the same as that obtained with the chip for ASTRO-H/SXI. The residuals from the gain function are about 0.1%, better than those of the conventional ASIC by a factor of two. Assuming that the gain of the CCD is the same as that for ASTRO-H, the effective range is 30 keV at the maximum gain setting. By changing the gain, the ASIC can handle signal charges of up to 100 ke-. These results will be fed back into the optimization of the pulse-height decoding filter.
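    The decoding of a modulator bit stream with a decimation filter can be sketched with a first-order modulator (the ASIC described above uses higher-order modulators; first order keeps the example short). A plain averaging filter recovers the encoded level from the serial bit stream; the real filter uses optimized weighting coefficients:

```python
import numpy as np

def delta_sigma_bits(x, n):
    """First-order delta-sigma modulator: encode a constant input x in [0, 1)
    as a serial stream of n bits whose density approximates x."""
    acc, bits = 0.0, []
    for _ in range(n):
        acc += x
        bit = 1 if acc >= 1.0 else 0
        acc -= bit           # feedback: subtract the quantized output
        bits.append(bit)
    return np.array(bits)

def decimate(bits, weights=None):
    """Decimation filter: weighted sum of the bit stream estimates x.
    Plain averaging here; an optimized weight vector would trade off
    noise shaping against bandwidth."""
    if weights is None:
        weights = np.ones(len(bits)) / len(bits)
    return float(np.dot(bits, weights))

bits = delta_sigma_bits(0.625, 1000)
print(decimate(bits))  # ≈ 0.625
```

Oversampling fewer times per pixel (a shorter bit stream) raises the pixel rate at the cost of quantization noise, which is the trade-off the higher-order modulator relaxes.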

  20. 0.25mm-thick CCD packaging for the Dark Energy Survey Camera array

    SciTech Connect

    Derylo, Greg; Diehl, H.Thomas; Estrada, Juan; /Fermilab

    2006-06-01

    The Dark Energy Survey Camera focal plane array will consist of 62 2k x 4k CCDs with a pixel size of 15 microns and a silicon thickness of 250 microns for use at wavelengths between 400 and 1000 nm. Bare CCD die will be received from the Lawrence Berkeley National Laboratory (LBNL). At the Fermi National Accelerator Laboratory, the bare die will be packaged into a custom back-side-illuminated module design. Cold probe data from LBNL will be used to select the CCDs to be packaged. The module design utilizes an aluminum nitride readout board and spacer and an Invar foot. A module flatness of 3 microns over small (1 sq cm) areas and less than 10 microns over neighboring areas on a CCD is required for uniform images over the focal plane. A confocal chromatic inspection system is being developed to precisely measure flatness over a grid up to 300 x 300 mm. This system will be utilized to inspect not only room-temperature modules, but also cold individual modules and partial arrays through flat dewar windows.

  1. Improvement of relief algorithm to prevent inpatient's downfall accident with night-vision CCD camera

    NASA Astrophysics Data System (ADS)

    Matsuda, Noriyuki; Yamamoto, Takeshi; Miwa, Masafumi; Nukumi, Shinobu; Mori, Kumiko; Kuinose, Yuko; Maeda, Etuko; Miura, Hirokazu; Taki, Hirokazu; Hori, Satoshi; Abe, Norihiro

    2005-12-01

    "ROSAI" hospital, Wakayama City in Japan, reported that inpatient's bed-downfall is one of the most serious accidents in hospital at night. Many inpatients have been having serious damages from downfall accidents from a bed. To prevent accidents, the hospital tested several sensors in a sickroom to send warning-signal of inpatient's downfall accidents to a nurse. However, it sent too much inadequate wrong warning about inpatients' sleeping situation. To send a nurse useful information, precise automatic detection for an inpatient's sleeping situation is necessary. In this paper, we focus on a clustering-algorithm which evaluates inpatient's situation from multiple angles by several kinds of sensor including night-vision CCD camera. This paper indicates new relief algorithm to improve the weakness about exceptional cases.

  2. ULTRACAM: an ultrafast, triple-beam CCD camera for high-speed astrophysics

    NASA Astrophysics Data System (ADS)

    Dhillon, V. S.; Marsh, T. R.; Stevenson, M. J.; Atkinson, D. C.; Kerry, P.; Peacocke, P. T.; Vick, A. J. A.; Beard, S. M.; Ives, D. J.; Lunney, D. W.; McLay, S. A.; Tierney, C. J.; Kelly, J.; Littlefair, S. P.; Nicholson, R.; Pashley, R.; Harlaftis, E. T.; O'Brien, K.

    2007-07-01

    ULTRACAM is a portable, high-speed imaging photometer designed to study faint astronomical objects at high temporal resolutions. ULTRACAM employs two dichroic beamsplitters and three frame-transfer CCD cameras to provide three-colour optical imaging at frame rates of up to 500 Hz. The instrument has been mounted on both the 4.2-m William Herschel Telescope on La Palma and the 8.2-m Very Large Telescope in Chile, and has been used to study white dwarfs, brown dwarfs, pulsars, black hole/neutron star X-ray binaries, gamma-ray bursts, cataclysmic variables, eclipsing binary stars, extrasolar planets, flare stars, ultracompact binaries, active galactic nuclei, asteroseismology and occultations by Solar System objects (Titan, Pluto and Kuiper Belt objects). In this paper we describe the scientific motivation behind ULTRACAM, present an outline of its design and report on its measured performance.

  3. Retrieval of the optical depth using an all-sky CCD camera.

    PubMed

    Olmo, Francisco J; Cazorla, Alberto; Alados-Arboledas, Lucas; López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier

    2008-12-01

    A new method is presented for retrieval of the aerosol and cloud optical depth using a CCD camera equipped with a fish-eye lens (all-sky imager system). In a first step, the proposed method retrieves the spectral radiance from sky images acquired by the all-sky imager system using a linear pseudoinverse algorithm. Then, the aerosol or cloud optical depth at 500 nm is obtained as that which minimizes the residuals between the zenith spectral radiance retrieved from the sky images and that estimated by the radiative transfer code. The method is tested under extreme situations including the presence of nonspherical aerosol particles. The comparison of optical depths derived from the all-sky imager with those retrieved with a sunphotometer operated side by side shows differences similar to the nominal error claimed in the aerosol optical depth retrievals from sunphotometer networks. PMID:19037341

  4. High resolution three-dimensional photoacoustic tomography with CCD-camera based ultrasound detection

    PubMed Central

    Nuster, Robert; Slezak, Paul; Paltauf, Guenther

    2014-01-01

    A photoacoustic tomograph based on optical ultrasound detection is demonstrated, which is capable of high resolution real-time projection imaging and fast three-dimensional (3D) imaging. Snapshots of the pressure field outside the imaged object are taken at defined delay times after photoacoustic excitation by use of a charge coupled device (CCD) camera in combination with an optical phase contrast method. From the obtained wave patterns photoacoustic projection images are reconstructed using a back propagation Fourier domain reconstruction algorithm. Applying the inverse Radon transform to a set of projections recorded over a half rotation of the sample provides 3D photoacoustic tomography images in less than one minute with a resolution below 100 µm. The sensitivity of the device was experimentally determined to be 5.1 kPa over a projection length of 1 mm. In vivo images of the vasculature of a mouse demonstrate the potential of the developed method for biomedical applications. PMID:25136491

  5. A reflectance model for non-contact mapping of venous oxygen saturation using a CCD camera

    NASA Astrophysics Data System (ADS)

    Li, Jun; Dunmire, Barbrina; Beach, Kirk W.; Leotta, Daniel F.

    2013-11-01

    A method of non-contact mapping of venous oxygen saturation (SvO2) is presented. A CCD camera is used to image skin tissue illuminated alternately by a red (660 nm) and an infrared (800 nm) LED light source. Low cuff pressures of 30-40 mmHg are applied to induce a venous blood volume change with negligible change in the arterial blood volume. A hybrid model combining the Beer-Lambert law and the light diffusion model is developed and used to convert the change in the light intensity to the change in skin tissue absorption coefficient. A simulation study incorporating the full light diffusion model is used to verify the hybrid model and to correct a calculation bias. SvO2 in the fingers, palm, and forearm for five volunteers are presented and compared with results in the published literature. Two-dimensional maps of venous oxygen saturation are given for the three anatomical regions.
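    The two-wavelength conversion from absorbance change to saturation can be sketched with standard Beer-Lambert algebra; the extinction coefficients below are approximate commonly tabulated values, and the function is an illustration rather than the paper's hybrid diffusion-corrected model:

```python
# Approximate molar extinction coefficients (cm^-1 / M) of hemoglobin;
# illustrative textbook-style values, not the paper's calibration.
EPS = {
    660: {"Hb": 3227.0, "HbO2": 320.0},
    800: {"Hb": 762.0, "HbO2": 816.0},
}

def svo2_from_absorbance(dA_660, dA_800):
    """Two-wavelength Beer-Lambert estimate of venous oxygen saturation.

    dA_* are the cuff-induced absorbance changes at each wavelength; the
    unknown path length and blood-volume change cancel in the ratio
    R = dA_660 / dA_800, leaving saturation as the only unknown.
    """
    R = dA_660 / dA_800
    num = R * EPS[800]["Hb"] - EPS[660]["Hb"]
    den = EPS[660]["HbO2"] - EPS[660]["Hb"] - R * (EPS[800]["HbO2"] - EPS[800]["Hb"])
    return num / den

# synthetic check: absorbance changes generated for 70% saturation
dA660 = 0.7 * EPS[660]["HbO2"] + 0.3 * EPS[660]["Hb"]
dA800 = 0.7 * EPS[800]["HbO2"] + 0.3 * EPS[800]["Hb"]
print(round(svo2_from_absorbance(dA660, dA800), 3))  # 0.7
```

Applying this per pixel to the red/infrared image pair yields the two-dimensional saturation maps described above.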

  6. Picosecond Raman spectroscopy with a fast intensified CCD camera for depth analysis of diffusely scattering media.

    PubMed

    Ariese, Freek; Meuzelaar, Heleen; Kerssens, Marleen M; Buijs, Joost B; Gooijer, Cees

    2009-06-01

    A spectroscopic depth profiling approach is demonstrated for layers of non-transparent, diffusely scattering materials. The technique is based on the temporal discrimination between Raman photons emitted from the surface and Raman photons originating from a deeper layer. Excitation was carried out with a frequency-doubled, 3 ps Ti:sapphire laser system (398 nm; 76 MHz repetition rate). Time-resolved detection was carried out with an intensified CCD camera that can be gated with a 250 ps gate width. The performance of the system was assessed using 1 mm and 2 mm pathlength cuvettes with powdered PMMA and trans-stilbene (TS) crystals, respectively, or solid white polymer blocks: Arnite (polyethylene terephthalate), Delrin (polyoxymethylene), polythene (polyethylene) and Teflon (polytetrafluoroethylene). These samples were pressed together in different configurations and Raman photons were collected in backscatter mode in order to study the time difference in such media corresponding with several mm of extra net photon migration distance. We also studied the lateral contrast between two different second layers. The results demonstrate that by means of a picosecond laser system and the time discrimination of a gated intensified CCD camera, molecular spectroscopic information can be obtained through a turbid surface layer. In the case of the PMMA/TS two-layer system, time-resolved detection with a 400 ps delay improved the relative intensity of the Raman bands of the second layer with a factor of 124 in comparison with the spectrum recorded with a 100 ps delay (which is more selective for the first layer) and with a factor of 14 in comparison with a non-gated setup. Possible applications will be discussed, as well as advantages/disadvantages over other Raman techniques for diffusely scattering media. PMID:19475147

  7. Unmanned Vehicle Guidance Using Video Camera/Vehicle Model

    NASA Technical Reports Server (NTRS)

    Sutherland, T.

    1999-01-01

    A video guidance sensor (VGS) system has flown on both STS-87 and STS-95 to validate a single camera/target concept for vehicle navigation. The main part of the image algorithm was the subtraction of two consecutive images using software. For a nominal size image of 256 x 256 pixels this subtraction can take a large portion of the time between successive frames in standard rate video leaving very little time for other computations. The purpose of this project was to integrate the software subtraction into hardware to speed up the subtraction process and allow for more complex algorithms to be performed, both in hardware and software.
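    The software subtraction step can be sketched in a few lines; the 256 x 256 frame size matches the abstract, but the threshold value and the illuminated/unilluminated frame convention are assumptions for illustration:

```python
import numpy as np

def frame_difference(frame_on, frame_off, threshold=10):
    """Subtract an unilluminated frame from an illuminated one and keep
    only pixels whose brightness change exceeds a threshold.

    Sketch of a VGS-style consecutive-image subtraction; widening to
    int16 avoids uint8 wraparound on the subtraction.
    """
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    mask = diff > threshold
    return np.where(mask, diff, 0).astype(np.uint8)

# 256 x 256 frames as in the abstract: a bright target spot on frame_on
off = np.full((256, 256), 20, dtype=np.uint8)
on = off.copy()
on[100:110, 100:110] = 200
result = frame_difference(on, off)
print(int((result > 0).sum()))  # 100 target pixels survive
```

Doing this per-pixel pass in hardware, as the project proposes, frees the inter-frame interval for the more complex tracking computations.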

  8. Masking a CCD camera allows multichord charge exchange spectroscopy measurements at high speed on the DIII-D tokamak

    NASA Astrophysics Data System (ADS)

    Meyer, O.; Burrell, K. H.; Chavez, J. A.; Kaplan, D. H.; Chrystal, C.; Pablant, N. A.; Solomon, W. M.

    2011-02-01

    Charge exchange spectroscopy is one of the standard plasma diagnostic techniques used in tokamak research to determine ion temperature, rotation speed, particle density, and radial electric field. Configuring a charge coupled device (CCD) camera to serve as a detector in such a system requires a trade-off between the competing desires to detect light from as many independent spatial views as possible while still obtaining the best possible time resolution. High time resolution is essential, for example, for studying transient phenomena such as edge localized modes. By installing a mask in front of a camera with a 1024 × 1024 pixel CCD chip, we are able to acquire spectra from eight separate views while still achieving a minimum time resolution of 0.2 ms. The mask separates the light from the eight spectra, preventing spatial and temporal cross talk. A key part of the design was devising a compact translation stage which attaches to the front of the camera and allows adjustment of the position of the mask openings relative to the CCD surface. The stage is thin enough to fit into the restricted space between the CCD camera and the spectrometer endplate.

  9. Real-time synchronous CCD camera observation and reflectance measurement of evaporation-induced polystyrene colloidal self-assembly.

    PubMed

    Lin, Dongfeng; Wang, Jinze; Yang, Lei; Luo, Yanhong; Li, Dongmei; Meng, Qingbo

    2014-04-15

    A new monitoring technique, which combines real-time in-situ CCD camera observation and reflectance spectra measurement, has been developed to study the growing and drying processes of evaporation-induced self-assembly (EISA). Evolutions of the reflectance spectrum and CCD camera images both reveal that the entire process of polystyrene (PS) EISA contains three stages: crack-initiation stage (T1), crack-propagation stage (T2), and crack-remained stage (T3). A new phenomenon, the red-shift of stop-band, is observed when the crack begins to propagate in the monitored window of CCD camera. Deformation of colloidal spheres, which mainly results in the increase of volume fraction of spheres, is applied to explain the phenomenon. Moreover, the modified scalar wave approximation (SWA) is utilized to analyze the reflectance spectra, and the fitting results are in good agreement with the evolution of CCD camera images. This new monitoring technique and the analysis method provide a good way to get insight into the growing and drying processes of PS colloidal self-assembly, especially the crack propagation. PMID:24650361

  10. Scientific CCD technology at JPL

    NASA Technical Reports Server (NTRS)

    Janesick, J.; Collins, S. A.; Fossum, E. R.

    1991-01-01

    Charge-coupled devices (CCD's) were recognized for their potential as an imaging technology almost immediately following their conception in 1970. Twenty years later, they are firmly established as the technology of choice for visible imaging. While consumer applications of CCD's, especially the emerging home video camera market, dominated manufacturing activity, the scientific market for CCD imagers has become significant. Activity of the Jet Propulsion Laboratory and its industrial partners in the area of CCD imagers for space scientific instruments is described. Requirements for scientific imagers are significantly different from those needed for home video cameras, and are described. An imager for an instrument on the CRAF/Cassini mission is described in detail to highlight achieved levels of performance.

  11. Design of Digital Controller for a CCD Camera with Dual-Speed Tracking Imaging on Same Frame

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Juan; Li, Bin-Hua; Li, Yong-Ming; He, Chun

    2007-12-01

    High-performance CCD cameras have been widely used in astronomical observations, and the techniques for observing moving objects or still objects independently are mature. However, when both moving objects (such as satellites, debris, and asteroids) and still objects (such as stars) are observed at the same time with the same CCD camera, the images of one of the two object types are elongated most of the time. To solve this problem, the authors developed a novel imaging technique and a corresponding observation method. The photosensitive areas in some CCD arrays are physically divided into two or more zones. Based on these CCD arrays, the new idea can be implemented: one half of the photosensitive area is used to image the still objects in stare mode, and the other half to image the moving objects in drift-scan mode. This means that both moving and still objects can be tracked at the same time without elongation of their images on the same CCD frame; the new technique is therefore called Dual-Speed Tracking Imaging on Same Frame (DSTIS). This paper briefly introduces the operating principle of the DSTIS CCD camera. After discussing the requirements of a digital controller for the camera, the design philosophy and basic structure of the controller are presented. Simulation and testing results are then shown, and the problems encountered during simulation and testing are analyzed in detail and solved. The software simulation and hardware testing results verify the correctness of the design.

  12. In-flight Video Captured by External Tank Camera System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In this July 26, 2005 video, Earth slowly fades into the background as the STS-114 Space Shuttle Discovery climbs into space until the External Tank (ET) separates from the orbiter. An External Tank ET Camera System featuring a Sony XC-999 model camera provided never-before-seen footage of the launch and tank separation. The camera was installed in the ET LO2 Feedline Fairing. From this position, the camera had a 40% field of view with a 3.5 mm lens. The field of view showed some of the Bipod area, a portion of the LH2 tank and Intertank flange area, and some of the bottom of the shuttle orbiter. Contained in an electronic box, the battery pack and transmitter were mounted on top of the Solid Rocket Booster (SRB) crossbeam inside the ET. The battery pack included 20 Nickel-Metal Hydride batteries (similar to cordless phone battery packs) totaling 28 volts DC and could supply about 70 minutes of video. Located 95 degrees apart on the exterior of the Intertank opposite orbiter side, there were 2 blade S-Band antennas about 2 1/2 inches long that transmitted a 10 watt signal to the ground stations. The camera turned on approximately 10 minutes prior to launch and operated for 15 minutes following liftoff. The complete camera system weighs about 32 pounds. Marshall Space Flight Center (MSFC), Johnson Space Center (JSC), Goddard Space Flight Center (GSFC), and Kennedy Space Center (KSC) participated in the design, development, and testing of the ET camera system.

  13. Fast roadway detection using car cabin video camera

    NASA Astrophysics Data System (ADS)

    Krokhina, Daria; Blinov, Veniamin; Gladilin, Sergey; Tarhanov, Ivan; Postnikov, Vassili

    2015-12-01

    We describe a fast method for road detection in images from a vehicle cabin camera. A straight section of roadway is detected using the Fast Hough Transform and dynamic programming. We assume that the location of the horizon line in the image and the road pattern are known. The developed method is fast enough to detect the roadway on each frame of the video stream in real time and may be further accelerated by the use of tracking.
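    The line-detection core can be illustrated with a plain Hough transform accumulator (the paper uses the Fast Hough Transform, which reduces the voting cost, but the rho-theta voting principle is the same); the function name and demo geometry are illustrative:

```python
import numpy as np

def hough_lines(points, shape, n_theta=180):
    """Vote in a (rho, theta) accumulator and return the strongest line.

    Each edge point (y, x) votes for every line rho = x cos(theta) +
    y sin(theta) passing through it; collinear points pile votes into
    one accumulator cell.
    """
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    for y, x in points:
        rhos = (x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
    return rho_idx - diag, np.rad2deg(thetas[theta_idx])

# edge points on the horizontal line y = 50 vote together at theta = 90 deg
pts = [(50, x) for x in range(0, 100, 2)]
rho, theta = hough_lines(pts, (100, 100))
print(int(rho), round(float(theta)))  # 50 90
```

Restricting the vote to edge pixels near the assumed road pattern, as the method does, keeps the per-frame cost low enough for real time.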

  14. Robust camera calibration for sport videos using court models

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; Krabbe, Susanne; de With, Peter H. N.; Effelsberg, Wolfgang

    2003-12-01

    We propose an automatic camera calibration algorithm for court sports. The obtained camera calibration parameters are required for applications that need to convert positions in the video frame to real-world coordinates or vice versa. Our algorithm uses a model of the arrangement of court lines for calibration. Since the court model can be specified by the user, the algorithm can be applied to a variety of different sports. The algorithm starts with a model initialization step which locates the court in the image without any user assistance or a-priori knowledge about the most probable position. Image pixels are classified as court line pixels if they pass several tests including color and local texture constraints. A Hough transform is applied to extract line elements, forming a set of court line candidates. The subsequent combinatorial search establishes correspondences between lines in the input image and lines from the court model. For the succeeding input frames, an abbreviated calibration algorithm is used, which predicts the camera parameters for the new image and optimizes the parameters using a gradient-descent algorithm. We have conducted experiments on a variety of sport videos (tennis, volleyball, and goal area sequences of soccer games). Video scenes with considerable difficulties were selected to test the robustness of the algorithm. Results show that the algorithm is very robust to occlusions, partial court views, bad lighting conditions, or shadows.

  15. Real-time road traffic classification using mobile video cameras

    NASA Astrophysics Data System (ADS)

    Lapeyronnie, A.; Parisot, C.; Meessen, J.; Desurmont, X.; Delaigle, J.-F.

    2008-02-01

    On board video analysis has attracted a lot of interest over the two last decades with as main goal to improve safety by detecting obstacles or assisting the driver. Our study aims at providing a real-time understanding of the urban road traffic. Considering a video camera fixed on the front of a public bus, we propose a cost-effective approach to estimate the speed of the vehicles on the adjacent lanes when the bus operates on a dedicated lane. We work on 1-D segments drawn in the image space, aligned with the road lanes. The relative speed of the vehicles is computed by detecting and tracking features along each of these segments. The absolute speed can be estimated from the relative speed if the camera speed is known, e.g. thanks to an odometer and/or GPS. Using pre-defined speed thresholds, the traffic can be classified into different categories such as 'fluid', 'congestion' etc. The solution offers both good performances and low computing complexity and is compatible with cheap video cameras, which allows its adoption by city traffic management authorities.

  16. Identifying sports videos using replay, text, and camera motion features

    NASA Astrophysics Data System (ADS)

    Kobla, Vikrant; DeMenthon, Daniel; Doermann, David S.

    1999-12-01

    Automated classification of digital video is emerging as an important piece of the puzzle in the design of content management systems for digital libraries. The ability to classify videos into various classes such as sports, news, movies, or documentaries increases the efficiency of indexing, browsing, and retrieval of video in large databases. In this paper, we discuss the extraction of features that enable identification of sports videos directly from the compressed domain of MPEG video. These features include detecting the presence of action replays, determining the amount of scene text in video, and calculating various statistics on camera and/or object motion. The features are derived from the macroblock, motion, and bit-rate information that is readily accessible from MPEG video with very minimal decoding, leading to substantial gains in processing speed. Full decoding of selective frames is required only for text analysis. A decision tree classifier built using these features is able to identify sports clips with an accuracy of about 93 percent.

  17. Stereo mosaicing from a single moving video camera

    NASA Astrophysics Data System (ADS)

    Peleg, Shmuel; Ben-Ezra, Michael; Pritch, Yael

    2001-06-01

    Panoramic stereo pictures are created by stitching together frames taken from a single moving video camera. Stereo panoramas can be created up to a full 360 degrees. The mosaicing process is robust and fast, and can be performed in real time. Mosaicing starts by computing the motion between the video frames. The video frames, together with the motion between frames computed in the previous step, are used to generate two panoramic pictures: one for the left eye and one for the right eye. Since the camera is moving, each object is viewed from different directions in different frames. Stitching together strips from the different video frames, selected to have the correct viewing directions for stereo perception, generates the panoramic stereo pictures. The stereo mosaicing process allows several features that were not available before: (1) the creation of stereo panoramic images in 360 degrees; (2) automatic disparity control: increasing stereo disparity for far-away objects and reducing stereo disparity for close objects, to give optimal stereo viewing in all directions and at all distances; (3) the creation of multiple pictures from multiple views, not limited to two views. This enables viewing the panoramic stereo pictures using lenticular technology.

  18. Pixel-to-pixel correspondence alignment method of a 2CCD camera by using absolute phase map

    NASA Astrophysics Data System (ADS)

    Huang, Shujun; Liu, Yue; Bai, Xuefei; Wang, Zhangying; Zhang, Zonghua

    2015-06-01

    An alignment method of a 2CCD camera to build pixel-to-pixel correspondence between the infrared (IR) CCD sensor and the visible CCD sensor by using the absolute phase data is presented. Vertical and horizontal sinusoidal fringe patterns are generated by software and displayed on a liquid crystal display screen. The displayed fringe patterns are captured simultaneously by the IR sensor and the visible sensor of the 2CCD camera. The absolute phase values of each pixel at IR and visible channels are calculated from the captured fringe pattern images by using Fourier transform and the optimum three-fringe number selection method. The accurate pixel corresponding relationship between the two sensors can be determined along the vertical and the horizontal directions by comparing the obtained absolute phase data in IR and visible channels. Experimental results show the high accuracy, effectiveness, and validity of the proposed 2CCD alignment method. By using the continuous absolute phase information, this method can determine the pixel-to-pixel correspondence with high resolution.
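    The Fourier-transform phase retrieval applied to the captured fringe images can be sketched for a single image row; this toy version keeps only the positive fundamental frequency lobe and returns the wrapped phase, omitting the optimum three-fringe unwrapping that produces the absolute phase map:

```python
import numpy as np

def fringe_phase(row):
    """Wrapped phase of a sinusoidal fringe row via the Fourier method:
    remove the DC term, isolate the dominant positive-frequency bin,
    inverse-transform, and take the complex angle."""
    F = np.fft.fft(row - row.mean())
    n = len(row)
    k = np.argmax(np.abs(F[1 : n // 2])) + 1   # dominant positive frequency
    H = np.zeros_like(F)
    H[k] = F[k]                                # keep only the fundamental
    return np.angle(np.fft.ifft(H))

# synthetic fringe row: 8 periods across 256 pixels with phase offset 0.5 rad
x = np.arange(256)
row = 128 + 100 * np.cos(2 * np.pi * 8 * x / 256 + 0.5)
phi = fringe_phase(row)
```

Comparing such phase values pixel by pixel between the IR and visible channels is what establishes the sub-pixel correspondence.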

  19. OP09O-OP404-9 Wide Field Camera 3 CCD Quantum Efficiency Hysteresis

    NASA Technical Reports Server (NTRS)

    Collins, Nick

    2009-01-01

    The HST/Wide Field Camera (WFC) 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. At the nominal operating temperature of -83C, the QEH feature contrast was typically 0.1-0.2% or less. The behavior was replicated using flight spare detectors. A visible light flat-field (540nm) with a several times full-well signal level can pin the detectors at both optical (600nm) and near-UV (230nm) wavelengths, suppressing the QEH behavior. We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. The HST/Wide Field Camera 3 UV/visible channel CCD detectors have exhibited an unanticipated quantum efficiency hysteresis (QEH) behavior. The first observed manifestation of QEH was the presence in a small percentage of flat-field images of a bowtie-shaped contrast that spanned the width of each chip. At the nominal operating temperature of -83C, the contrast observed for this feature was typically 0.1-0.2% or less, though at warmer temperatures contrasts up to 5% (at -50C) have been observed. The bowtie morphology was replicated using flight spare detectors in tests at the GSFC Detector Characterization Laboratory by power cycling the detector while cold. Continued investigation revealed that a clearly-related global QE suppression at the approximately 5% level can be produced by cooling the detector in the dark; subsequent flat-field exposures at a constant illumination show asymptotically increasing response. This QE "pinning" can be achieved with a single high signal flat-field or a series of lower signal flats; a visible light (500-580nm) flat-field with a signal level of several hundred thousand electrons per pixel is sufficient for QE pinning at both optical (600nm) and near-UV (230nm) wavelengths. 
We are characterizing the timescale for the detectors to become unpinned and developing a protocol for flashing the WFC3 CCDs with the instrument's internal calibration system in flight. A preliminary estimate of the decay timescale for one detector is that a drop of 0.1-0.2% occurs over a ten day period, indicating that relatively infrequent cal lamp exposures can mitigate the behavior to extremely low levels.

  20. Measuring the Flatness of Focal Plane for Very Large Mosaic CCD Camera

    SciTech Connect

    Hao, Jiangang; Estrada, Juan; Cease, Herman; Diehl, H.Thomas; Flaugher, Brenna L.; Kubik, Donna; Kuk, Keivin; Kuropatkine, Nickolai; Lin, Huan; Montes, Jorge; Scarpine, Vic

    2010-06-08

Large mosaic multi-CCD cameras are the key instruments for modern digital sky surveys. DECam is an extremely red-sensitive 520-megapixel camera designed for the upcoming Dark Energy Survey (DES). It consists of sixty-two 4k x 2k and twelve 2k x 2k 250-micron-thick fully depleted CCDs, with a focal plane 44 cm in diameter and a field of view of 2.2 square degrees. It will be attached to the Blanco 4-meter telescope at CTIO. The DES will cover 5000 square degrees of the southern galactic cap in 5 color bands (g, r, i, z, Y) over 5 years starting in 2011. To achieve the science goal of constraining the evolution of Dark Energy, stringent requirements are placed on the design of DECam. Among them, the flatness of the focal plane must be controlled within a 60-micron envelope in order to meet the specified PSF variation limit. It is very challenging to measure the flatness of the focal plane to such precision when it is placed in a high-vacuum dewar at 173 K. We developed two image-based techniques to measure the flatness of the focal plane. By imaging a regular grid of dots on the focal plane, the CCD offset along the optical axis is converted into a variation of the grid spacings at different positions on the focal plane. After extracting the patterns and comparing the change in spacings, we can measure the flatness to high precision. In method 1, the dot grid is held to sub-micron precision and covers the whole focal plane. In method 2, no high precision is required for the grid; instead, a precise XY stage moves the pattern across the whole focal plane, and we compare the variations in spacing when it is imaged by different CCDs. Simulation and real measurements show that the two methods work very well for our purpose, and are in good agreement with direct optical measurements.
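The spacing-to-height conversion behind both methods can be sketched as follows. A CCD displaced along the optical axis images the fixed grid at a slightly different magnification, so the local dot spacing changes in proportion to the displacement; the `lever_arm_um` proportionality constant is a hypothetical calibration value, not a DECam number:

```python
import numpy as np

def height_offsets(spacings_px, ref_spacing_px, lever_arm_um):
    """Convert measured dot-grid spacings to axial CCD offsets.

    The fractional change in local grid spacing relative to the
    reference spacing is scaled by a calibrated lever arm (microns of
    axial offset per unit fractional spacing change).
    """
    spacings = np.asarray(spacings_px, dtype=float)
    return (spacings - ref_spacing_px) / ref_spacing_px * lever_arm_um
```

With the lever arm calibrated, a map of local spacings across the mosaic becomes a height map of the focal plane directly.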

  1. Characterization and field use of a CCD camera system for retrieval of bidirectional reflectance distribution function

    NASA Astrophysics Data System (ADS)

    Nandy, P.; Thome, K.; Biggar, S.

    2001-06-01

Vicarious calibration and field validation are critical aspects of NASA's Earth Observing System program. As part of calibration and validation research related to this project, the Remote Sensing Group (RSG) of the Optical Science Center at the University of Arizona has developed an imaging radiometer for ground-based measurements of directional reflectance. The system relies on a commercially available 1024×1024 pixel, silicon CCD array. Angular measurements are accomplished using a fish-eye lens that has a full 180° field of view with each pixel on the CCD array having a nominal 0.2° field of view. Spectral selection is through four interference filters centered at 470, 575, 660, and 835 nm. The system is designed such that the entire 180° field is collected at one time with a complete multispectral data set collected in under 2 min. The results of laboratory experiments have been used to determine the gain and offset of each detector element as well as the effects of the lens on the system response. Measurements of a stable source using multiple integration times and at multiple distances for a set integration time indicate the system is linear to better than 0.5% over the upper 88% of the dynamic range of the system. The point spread function (PSF) of the lens system was measured for several field angles, and the signal level was found to fall to less than 1% of the peak signal within 1.5° for the on-axis case. The effect of this PSF on the retrieval of modeled BRDFs is shown to be less than 0.2% out to view angles of 70°. The degree of polarization of the system is shown to be negligible for on-axis imaging but to have up to a 20% effect at a field angle of 70°. The effect of the system polarization on the retrieval of modeled BRDFs is shown to be up to 3% for field angles of 70° off nadir and with a solar zenith angle of 70°. Field measurements are made by mounting the camera on a boom attached to a large tripod aligned toward the south. 
This tripod obstructs sampling of the surface reflectance past 25° off nadir northward. The system is typically operated at a height of 1.5 m to view over a large sampling of surface features, such as cracks. To evaluate the surface BRDF, measurements are collected throughout the morning as a function of Sun angle. A single measurement consists of all four bands and a dark-current measurement. Data sets have been collected over several vicarious calibration sites and calibration tarpaulins. Comparisons with measurements made by a simple goniometer-based system indicate that the camera system is as accurate as the goniometer. Scattering phase function values derived from the camera system are fit to a modified Pinty-Verstraete equation. This function is shown to fit the data to better than 0.3% for data collected during an example RSG vicarious calibration experiment. Bidirectional reflectance data derived from the camera system also compare well to those predicted from the Walthall model. These BRDF models are critical for determining the applicability of measurements taken over small areas to represent the BRDF properties of an entire site, which in some cases is of the order of several kilometers in size.
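As a rough illustration of how such a fish-eye system maps pixels to viewing directions, the following sketch assumes an ideal equidistant (f-theta) projection with the quoted nominal 0.2° per-pixel scale; the real instrument uses a laboratory-calibrated lens response instead:

```python
import numpy as np

def pixel_to_angles(x, y, cx, cy, deg_per_pixel=0.2):
    """Map a pixel to view zenith/azimuth for an equidistant fish-eye.

    (cx, cy) is the optical center on the array.  Under an equidistant
    projection the off-axis angle grows linearly with radial distance
    from the center, so zenith = radius * plate scale.
    """
    dx, dy = x - cx, y - cy
    r = np.hypot(dx, dy)                       # radial distance in pixels
    zenith = r * deg_per_pixel                 # degrees off-axis
    azimuth = np.degrees(np.arctan2(dy, dx)) % 360.0
    return zenith, azimuth
```

Each BRDF sample is then the calibrated radiance at a pixel tagged with the (zenith, azimuth) pair this mapping returns.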

  2. Multiport backside-illuminated CCD imagers for high-frame-rate camera applications

    NASA Astrophysics Data System (ADS)

    Levine, Peter A.; Sauer, Donald J.; Hseuh, Fu-Lung; Shallcross, Frank V.; Taylor, Gordon C.; Meray, Grazyna M.; Tower, John R.; Harrison, Lorna J.; Lawler, William B.

    1994-05-01

Two multiport, second-generation CCD imager designs have been fabricated and successfully tested. They are a 16-port 512 X 512 array and a 32-port 1024 X 1024 array. Both designs are back illuminated, have on-chip CDS, lateral blooming control, and use a split vertical frame transfer architecture with full frame storage. The 512 X 512 device has been operated at rates over 800 frames per second. The 1024 X 1024 device has been operated at rates over 300 frames per second. The major changes incorporated in the second-generation design are a reduced gate length in the output area for improved high-clock-rate performance, modified on-chip CDS circuitry for reduced noise, and optimized implants that improve blooming control at lower clock amplitudes. This paper discusses the imager design improvements and presents measured performance results at high and moderate frame rates. The design and performance of three moderate frame rate cameras are discussed.

  3. On the performance of optical filters for the XMM focal plane CCD-camera EPIC

    NASA Astrophysics Data System (ADS)

    Stephan, K.-H.; Reppin, C.; Maier, H. J.; Frischke, D.; Fuchs, D.; Müller, P.; Moeller, S.; Gürtler, P.

    1995-02-01

Optical filters have been developed for the X-ray astronomy project XMM (X-ray Multi Mirror Mission) [1] of ESA, where specific CCDs will serve as focal plane cameras on board the observatory. These detectors are sensitive from the X-ray to the NIR (near infrared) spectral region. For observations in X-ray astronomy an optical filter must be placed in front of the CCD, suppressing visible and UV (ultraviolet) radiation of stars by more than 6 orders of magnitude while being highly transparent at photon energies above 100 eV. The flight model filter is designed to have an effective area of 73 mm diameter without making use of a supporting grid. Efforts have been made to utilize plastic foils to tailor filters meeting these specific requirements. It was found that a typical filter could be composed, e.g., of a polypropylene foil of 20 μg/cm2 thickness serving as a carrier, coated with metallic films of Al or Al and Sn of about 20-25 μg/cm2 thickness. Other possible carriers are polycarbonate (Lexan, Macrolon) and poly-para-xylylene (Parylene N) films of similar thicknesses. The preparation and characterization of these three types of carrier foils as well as of two sample filters is described, including mechanical tests as well as optical transmission measurements in the photon energy range from 1 eV to 2 keV.

  4. Development of proton CT imaging system using plastic scintillator and CCD camera.

    PubMed

    Tanaka, Sodai; Nishio, Teiji; Matsushita, Keiichiro; Tsuneda, Masato; Kabuki, Shigeto; Uesaka, Mitsuru

    2016-06-01

A proton computed tomography (pCT) imaging system was constructed for evaluation of the error of the x-ray CT (xCT)-to-WEL (water-equivalent length) conversion in treatment planning for proton therapy. In this system, the scintillation light integrated along the beam direction is recorded by photographing the scintillator with the CCD camera, which enables fast and easy data acquisition. The light intensity is converted to the range of the proton beam using a light-to-range conversion table made beforehand, and a pCT image is reconstructed. An experiment demonstrating the pCT system was performed using a 70 MeV proton beam provided by the AVF930 cyclotron at the National Institute of Radiological Sciences. Three-dimensional pCT images were reconstructed from the experimental data. A thin structure of approximately 1 mm was clearly observed, with the spatial resolution of the pCT images at the same level as that of the xCT images. pCT images of various substances were reconstructed to evaluate their pixel values. The image quality was investigated with regard to sources of deterioration, including multiple Coulomb scattering. PMID:27191962
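The light-to-range conversion step can be illustrated with a minimal table lookup; the calibration values below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical calibration table: integrated scintillation light
# (arbitrary units) versus residual proton range in water (mm),
# measured beforehand with absorbers of known thickness.
light_cal = np.array([0.0, 120.0, 260.0, 420.0, 600.0])
range_cal = np.array([0.0, 10.0, 20.0, 30.0, 40.0])

def light_to_range(light):
    """Convert integrated scintillation light to residual proton range.

    Linear interpolation between calibration points; `light` may be a
    scalar or an array of per-ray intensities taken from the CCD image.
    """
    return np.interp(light, light_cal, range_cal)
```

The resulting per-ray ranges are what the tomographic reconstruction then back-projects into the pCT image.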

  5. Field-programmable gate array-based hardware architecture for high-speed camera with KAI-0340 CCD image sensor

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Yan, Su; Zhou, Zuofeng; Cao, Jianzhong; Yan, Aqi; Tang, Linao; Lei, Yangjie

    2013-08-01

We present a field-programmable gate array (FPGA)-based hardware architecture for a high-speed camera featuring fast auto-exposure control and color filter array (CFA) demosaicing. The proposed hardware architecture includes the design of the charge-coupled device (CCD) drive circuits, image processing circuits, and power supply circuits. The CCD drive circuits translate the TTL (transistor-transistor logic) timing sequences produced by the image processing circuits into the timing sequences under which the CCD image sensor can output analog image signals. The image processing circuits convert the analog signals to digital signals for subsequent processing; TTL timing generation, auto-exposure control, CFA demosaicing, and gamma correction are all accomplished in this module. The power supply circuits power the whole system, which is very important for image quality: power noise affects image quality directly, and we reduce it effectively in hardware. In this system the CCD is a KAI-0340, which can output 210 full-resolution frames per second, and our camera works well in this mode. Because traditional auto-exposure control algorithms are too slow to reach a proper exposure level, a fast auto-exposure control method is needed; we present a new auto-exposure algorithm suited to high-speed cameras. Color demosaicing is critical for digital cameras because it converts the Bayer mosaic output of the sensor into a full-color image, which determines the output image quality of the camera. Complex algorithms can achieve high quality but cannot be implemented in hardware, so we present a low-complexity demosaicing method that can be implemented in hardware while satisfying the quality requirements. Experimental results are given at the end of this paper.
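As a minimal sketch of a fast auto-exposure scheme, the update below scales the exposure by the ratio of target to measured mean gray level, reaching the target in roughly one step under a linear sensor response. This is a generic scheme for illustration; the paper's actual algorithm is not reproduced here, and the limits are hypothetical:

```python
def update_exposure(exposure_us, mean_level, target_level=128.0,
                    lo_us=10.0, hi_us=5000.0):
    """One-step proportional auto-exposure update.

    Assuming the sensor responds roughly linearly to exposure time,
    scaling by target/current brings the mean gray level to the target
    in a single frame, instead of stepping toward it over many frames
    as slower incremental algorithms do.
    """
    mean_level = max(mean_level, 1e-6)          # guard against divide-by-zero
    new_exposure = exposure_us * target_level / mean_level
    return min(max(new_exposure, lo_us), hi_us)  # clamp to sensor limits
```

At 210 frames per second, converging in one or two frames rather than tens of frames is what makes such a proportional update attractive.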

  6. A new paradigm for video cameras: optical sensors

    NASA Astrophysics Data System (ADS)

    Grottle, Kevin; Nathan, Anoo; Smith, Catherine

    2007-04-01

This paper presents a new paradigm for the utilization of video surveillance cameras as optical sensors to augment and significantly improve the reliability and responsiveness of chemical monitoring systems. Incorporated into a hierarchical tiered sensing architecture, cameras serve as 'Tier 1' or 'trigger' sensors monitoring for visible indications after a release of warfare or industrial toxic chemical agents. No single sensor today detects the full range of these agents, but exposure is harmful and yields visible 'duress' behaviors. Duress behaviors range from simple to complex types of observable signatures. By incorporating optical sensors in a tiered sensing architecture, the resulting alarm signals based on these behavioral signatures increase the range of detectable toxic chemical agent releases and allow timely confirmation of an agent release. Given the rapid onset of duress-type symptoms, an optical sensor can detect the presence of a release almost immediately. This provides cues for a monitoring system to send air samples to a higher-tiered chemical sensor, quickly launch protective mitigation steps, and notify an operator to inspect the area using the camera's video signal well before the chemical agent can disperse widely throughout a building.

  7. Fast auto-acquisition tomography tilt series by using HD video camera in ultra-high voltage electron microscope.

    PubMed

    Nishi, Ryuji; Cao, Meng; Kanaji, Atsuko; Nishida, Tomoki; Yoshida, Kiyokazu; Isakozawa, Shigeto

    2014-11-01

The ultra-high voltage electron microscope (UHVEM) H-3000, with the world's highest acceleration voltage of 3 MV, can observe remarkable three-dimensional microstructures of micron-thick samples[1]. Acquiring a tilt series for electron tomography is laborious work, and an automatic technique is thus highly desired. We proposed the Auto-Focus system using image Sharpness (AFS)[2,3] for UHVEM tomography tilt series acquisition. In this method, five images with different defocus values are first acquired and their image sharpness is calculated. The sharpness values are then fitted to a quasi-Gaussian function to determine the best focus value[3]. Defocused images acquired by the slow-scan CCD (SS-CCD) camera (Hitachi F486BK) are of high quality, but one minute is needed to acquire five defocused images. In this study, we introduce a high-definition video camera (HD video camera; Hamamatsu Photonics K.K. C9721S) for fast acquisition of images[4]. It is an analog camera, but the camera image is captured by a PC with an effective resolution of 1280×1023 pixels. This resolution is lower than the 4096×4096 pixels of the SS-CCD camera, but the HD video camera captures one image in only 1/30 second. In exchange for the faster acquisition, the S/N of the images is low. To improve the S/N, 22 captured frames are integrated so that each sharpness value has a sufficiently low fitting error. As a countermeasure against the low resolution, we selected a large defocus step, typically five times the manual defocus step, to discriminate the differently defocused images. By using the HD video camera for the autofocus process, the time consumed by each autofocus procedure was reduced to about six seconds. Correction of the image position took one second, for a total correction time of seven seconds, an order of magnitude shorter than with the SS-CCD camera. When we used the SS-CCD camera for final image capture, it took 30 seconds to record one tilt image. 
We can obtain a tilt series of 61 images within 30 minutes. Accuracy and repeatability were good enough for practical use (Figure 1). We successfully reduced the total acquisition time of a tomography tilt series to half of what it was before. Fig. 1. Objective lens current change with tilt angle during acquisition of a tomography series (sample: a rat hepatocyte, thickness: 2 μm, magnification: 4k, acc. voltage: 2 MV). The tilt angle range is ±60 degrees with a 2-degree step angle. Two series were acquired in the same area. Both data sets were almost the same and the deviation was smaller than the minimum manual step, so the auto-focus worked well. We also developed computer-aided three-dimensional (3D) visualization and analysis software for electron tomography, "HawkC", which can sectionalize the 3D data semi-automatically[5,6]. If this auto-acquisition system is used with the IMOD reconstruction software[7] and the HawkC software, we will be able to do on-line UHVEM tomography. The system could help pathology examination in the future. This work was supported by the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under a Grant-in-Aid for Scientific Research (Grant No. 23560024, 23560786), and SENTAN, Japan Science and Technology Agency, Japan. PMID:25359822
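The core of the AFS step, recovering the best focus from a handful of (defocus, sharpness) samples, can be sketched as follows. Fitting a parabola to log-sharpness is one standard closed-form way to locate the peak of a Gaussian-like curve, used here in place of the paper's quasi-Gaussian fit:

```python
import numpy as np

def best_focus(defocus, sharpness):
    """Estimate the best-focus position from sharpness samples.

    For a Gaussian-like sharpness curve, log(sharpness) is close to a
    parabola in defocus, so the vertex of a quadratic fit gives the
    peak (best focus) position directly.
    """
    z = np.asarray(defocus, dtype=float)
    s = np.log(np.asarray(sharpness, dtype=float))
    a, b, c = np.polyfit(z, s, 2)   # log s ~= a z^2 + b z + c
    return -b / (2.0 * a)           # vertex of the parabola
```

With five defocused frames per autofocus step, this is one polyfit per tilt angle, so the fit itself adds negligible time to the six-second procedure.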

  8. Simultaneous Camera Path Optimization and Distraction Removal for Improving Amateur Video.

    PubMed

    Zhang, Fang-Lue; Wang, Jue; Zhao, Han; Martin, Ralph R; Hu, Shi-Min

    2015-12-01

A major difference between amateur and professional video lies in the quality of camera paths. Previous work on video stabilization has considered how to improve amateur video by smoothing the camera path. In this paper, we show that additional changes to the camera path can further improve video aesthetics. Our new optimization method achieves multiple simultaneous goals: 1) stabilizing video content over short time scales; 2) ensuring simple and consistent camera paths over longer time scales; and 3) improving scene composition by automatically removing distractions, a common occurrence in amateur video. Our approach uses an L1 camera path optimization framework, extended to handle multiple constraints. Two passes of optimization are used to address both low-level and high-level constraints on the camera path. The experimental and user study results show that our approach outputs video that is perceptually better than the input, or the results of using stabilization only. PMID:26513791

  9. Study of pixel damages in CCD cameras irradiated at the neutron tomography facility of IPEN-CNEN/SP

    NASA Astrophysics Data System (ADS)

    Pugliesi, R.; Andrade, M. L. G.; Dias, M. S.; Siqueira, P. T. D.; Pereira, M. A. S.

    2015-12-01

A methodology to investigate damage in CCD sensors caused by the radiation beams of neutron tomography facilities is proposed. This methodology has been developed at the facility installed at the nuclear research reactor of IPEN-CNEN/SP, and the damage was evaluated by counting white spots in images. The damage production rate at the main camera position was evaluated to be in the range between 0.008 and 0.040 damages per second. For this range, only 4 to 20 CCD pixels are damaged per tomogram, assuring high-quality images for hundreds of tomograms. Since the present methodology is capable of quantifying the damage production rate for each type of radiation, it can also be used at other facilities to improve the radiation shielding close to the CCD sensors.
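The white-spot count itself reduces to thresholding a dark frame; a minimal sketch (simplified: a real analysis would also reject transient cosmic-ray hits, e.g. by re-imaging and keeping only persistent spots):

```python
import numpy as np

def count_white_spots(dark_frame, threshold):
    """Count hot ('white spot') pixels in a dark frame.

    A pixel whose dark-frame value exceeds `threshold` is taken as
    radiation-damaged.  Comparing counts before and after a known
    exposure time gives the damage production rate in spots/second.
    """
    return int(np.count_nonzero(np.asarray(dark_frame) > threshold))
```
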

  10. Social Justice through Literacy: Integrating Digital Video Cameras in Reading Summaries and Responses

    ERIC Educational Resources Information Center

    Liu, Rong; Unger, John A.; Scullion, Vicki A.

    2014-01-01

    Drawing data from an action-oriented research project for integrating digital video cameras into the reading process in pre-college courses, this study proposes using digital video cameras in reading summaries and responses to promote critical thinking and to teach social justice concepts. The digital video research project is founded on…

  11. Automatic radial distortion correction in zoom lens video camera

    NASA Astrophysics Data System (ADS)

    Kim, Daehyun; Shin, Hyoungchul; Oh, Juhyun; Sohn, Kwanghoon

    2010-10-01

    We present a novel method for automatically correcting the radial lens distortion in a zoom lens video camera system. We first define the zoom lens distortion model using an inherent characteristic of the zoom lens. Next, we sample some video frames with different focal lengths and estimate their radial distortion parameters and focal lengths. We then optimize the zoom lens distortion model with preestimated parameter pairs using the least-squares method. For more robust optimization, we divide the sample images into two groups according to distortion types (i.e., barrel and pincushion) and then separately optimize the zoom lens distortion models with respect to divided groups. Our results show that the zoom lens distortion model can accurately represent the radial distortion of a zoom lens.
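A minimal sketch of single-parameter radial undistortion is given below. The paper's zoom-lens model makes the distortion coefficient a function of focal length and is estimated by least squares; here the coefficient is a constant, for illustration only:

```python
import numpy as np

def undistort(points, k1, center):
    """Correct radial distortion with a one-parameter polynomial model.

    r_undistorted = r * (1 + k1 * r^2), applied about the distortion
    center.  k1 > 0 pushes points outward (correcting barrel
    distortion); k1 < 0 pulls them inward (correcting pincushion).
    """
    pts = np.asarray(points, dtype=float) - center
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)  # squared radius per point
    return pts * (1.0 + k1 * r2) + center
```

In a zoom-lens system one would evaluate k1 from the fitted distortion-vs-focal-length model for the current frame before applying this correction.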

  12. Photometric correction and reflectance calculation for lunar images from the Chang'E-1 CCD stereo camera.

    PubMed

    Chen, Chao; Qin, Qiming; Chen, Li; Zheng, Hong; Fa, Wenzhe; Ghulam, Abduwasit; Zhang, Chengye

    2015-12-01

Photometric correction and reflectance calculation are two important processes in the scientific analysis and application of Chang'E-1 (CE-1) charge-coupled device (CCD) stereo camera data. In this paper, methods for photometric correction and reflectance calculation were developed. First, in view of the specific character of the datasets acquired by the CE-1 CCD stereo camera, photometric correction was applied directly to the digital number values using a revised Lommel-Seeliger factor. Second, on the basis of laboratory-measured bidirectional reflectances, the relative reflectance was calculated using an empirical linear model. The presented approach can be used to identify landing sites, obtain global images, and produce topographic maps of the lunar surface. PMID:26831395
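The normalization step can be illustrated with the classic Lommel-Seeliger factor; the paper uses a revised form, which is not reproduced here, and the standard geometry (i=30°, e=0°) is a common convention rather than a value quoted in the abstract:

```python
import numpy as np

def lommel_seeliger(inc_deg, emi_deg):
    """Classic Lommel-Seeliger limb-darkening factor 2*mu0/(mu0 + mu)."""
    mu0 = np.cos(np.radians(inc_deg))   # cosine of incidence angle
    mu = np.cos(np.radians(emi_deg))    # cosine of emission angle
    return 2.0 * mu0 / (mu0 + mu)

def photometric_correct(dn, inc_deg, emi_deg, std_inc=30.0, std_emi=0.0):
    """Normalize an observed DN value to a standard viewing geometry.

    The observed DN is divided by the Lommel-Seeliger factor at the
    observed geometry and rescaled to the factor at the standard
    geometry, removing the illumination/viewing dependence.
    """
    return dn * lommel_seeliger(std_inc, std_emi) / lommel_seeliger(inc_deg, emi_deg)
```
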

  13. Non-mydriatic, wide field, fundus video camera

    NASA Astrophysics Data System (ADS)

    Hoeher, Bernhard; Voigtmann, Peter; Michelson, Georg; Schmauss, Bernhard

    2014-02-01

We describe a method we call "stripe field imaging" that is capable of capturing wide field color fundus videos and images of the human eye at pupil sizes of 2mm. This means that it can be used with a non-dilated pupil even with bright ambient light. We realized a mobile demonstrator to prove the method and successfully acquired color fundus videos of subjects. We designed the demonstrator as a low-cost device consisting of mass market components to show that no major additional technical outlay is needed to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry in the optical design that is present in many conventional fundus cameras. By this measure we could extend the possible field of view (FOV) at a pupil size of 2mm from a circular field 20° in diameter to a rectangular field of 68° by 18°. We acquired a fundus video while the subject was slightly touching and releasing the eyelid. The resulting video showed changes at vessels in the region of the papilla and a change in the paleness of the papilla.

  14. Scientists Behind the Camera - Increasing Video Documentation in the Field

    NASA Astrophysics Data System (ADS)

    Thomson, S.; Wolfe, J.

    2013-12-01

Over the last two years, Skypunch Creative has designed and implemented a number of pilot projects to increase the amount of video captured by scientists in the field. The major barrier to success that we tackled with the pilot projects was the conflict between the time, space, and storage constraints of scientists in the field and the demands of shooting high-quality video. Our pilots involved providing scientists with equipment, varying levels of instruction on shooting in the field, and post-production resources (editing and motion graphics). In each project, the scientific team was provided with cameras (or additional equipment if they owned their own), tripods, and sometimes sound equipment, as well as an external hard drive to return the footage to us. Upon receiving the footage we professionally filmed follow-up interviews and created animations and motion graphics to illustrate their points. We also helped with the distribution of the final product (http://climatescience.tv/2012/05/the-story-of-a-flying-hippo-the-hiaper-pole-to-pole-observation-project/ and http://climatescience.tv/2013/01/bogged-down-in-alaska/). The pilot projects were a success. Most of the scientists returned asking for additional gear and support for future field work. Moving out of the pilot phase, to continue the project, we have produced a 14-page guide for scientists shooting in the field based on lessons learned - it contains key tips and best-practice techniques for shooting high-quality footage in the field. We have also expanded the project and are now testing the use of video cameras that can be synced with sensors so that the footage is useful both scientifically and artistically. Extract from A Scientist's Guide to Shooting Video in the Field

  15. Preliminary Performance Measurements for a Streak Camera with a Large-Format Direct-Coupled CCD Readout

    SciTech Connect

    Lerche, R A; McDonald, J W; Griffith, R L; de Dios, G V; Andrews, D S; Huey, A W; Bell, P M; Landen, O L; Jaanimagi, P A; Boni, R

    2004-04-13

Livermore's ICF Program has a large inventory of optical streak cameras built in the 1970s and 1980s. The cameras are still very functional, but difficult to maintain because many of their parts are obsolete, including the original streak tube and image-intensifier tube. The University of Rochester's Laboratory for Laser Energetics is leading an effort to develop a fully automated, large-format streak camera that incorporates modern technology. Preliminary characterization of a prototype camera shows spatial resolution better than 20 lp/mm, temporal resolution of 12 ps, line-spread function of 40 μm (FWHM), contrast transfer ratio (CTR) of 60% at 10 lp/mm, and system sensitivity of 16 CCD electrons per photoelectron. A dynamic range of 60 for a 2 ns window is determined from system noise, linearity and sensitivity measurements.

  16. Miniature, vacuum compatible 1024 {times} 1024 CCD camera for x-ray, ultra-violet, or optical imaging

    SciTech Connect

    Conder, A.D.; Dunn, J.; Young, B.K.F.

    1994-05-01

We have developed a very compact (60 × 60 × 75 mm³), vacuum-compatible, large-format (25 × 25 mm², 1024 × 1024 pixels) CCD camera for digital imaging of visible and ultraviolet radiation, soft to penetrating x-rays (≤20 keV), and charged particles. This camera provides a suitable replacement for film, with a linear response, a dynamic range and intrinsic signal-to-noise response superior to current x-ray film, and real-time access to the data. The spatial resolution of the camera (<25 μm) is similar to typical digitization slit or step sizes used in processing film data. This new large-format CCD camera has immediate applications as the recording device for streak cameras or gated microchannel-plate diagnostics, or when used directly as the detector for x-ray, XUV, or optical signals. This is especially important in studying high-energy plasmas produced in pulsed-power, ICF, and high-powered laser-plasma experiments, as well as in other medical and industrial applications.

  17. A Large Panel Two-CCD Camera Coordinate System with an Alternate-Eight-Matrix Look-Up Table Algorithm

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Lu, An-Tsung; Hsu, Yuen-Chang; Tien, Chuen-Lin; Chen, Der-Chin

In this study, a novel positioning model for a double-CCD camera calibration system with an Alternate-Eight-Matrix (AEM) Look-Up Table (LUT) was proposed. Two CCD cameras were fixed on either side of a large-scale screen to overcome field-of-view (FOV) problems. The first to fourth AEM LUTs were used to compute the corresponding positions of intermediate blocks on the screen captured by the right-side camera: the coordinate mapping data of the target in a specific space were stored in two matrices, while the gray-level threshold values for different positions were stored in the others. Similarly, the fifth to eighth AEM LUTs were used to compute the corresponding positions of intermediate blocks on the screen captured by the left-side camera. Experimental results showed that the problems of dead angles and non-uniform light fields were solved. In addition, rapid and precise positioning results can be obtained with the proposed method.

  18. Classification of volcanic ash particles from Sakurajima volcano using CCD camera image and cluster analysis

    NASA Astrophysics Data System (ADS)

    Miwa, T.; Shimano, T.; Nishimura, T.

    2012-12-01

Quantitative and speedy characterization of volcanic ash particles is needed to conduct petrologic monitoring of an ongoing eruption. We developed a new simple system that uses CCD camera images to quantitatively characterize ash properties, and we apply it to volcanic ash collected at Sakurajima. Our method characterizes volcanic ash particles by 1) apparent luminance through RGB filters and 2) a quasi-fractal dimension of the particle shape. Using a monochromatic CCD camera (Starshoot by Orion Co. LTD.) attached to a stereoscopic microscope, we capture digital images of ash particles set on a glass plate under which white paper or a polarizing plate is placed. Images of 1390 x 1080 pixels are taken through three color filters (red, green and blue) under incident light and under light transmitted through the polarizing plate. The brightness of the light sources is held constant, and luminance is calibrated with white and black papers. About fifteen ash particles are set on the plate at a time, and their images are saved in bitmap format. We first extract the outlines of the particles from the image taken under light transmitted through the polarizing plate. Luminances for each color are then represented by 256 tones at each pixel within the particles, and the average and its standard deviation are calculated for each ash particle. We also measure the quasi-fractal dimension (qfd) of the ash particles: we perform box counting with boxes of 1×1 and 128×128 pixels covering the area of the ash particle, and the qfd is estimated from the ratio of the former count to the latter. These parameters are calculated using the software R. We characterize volcanic ash from the Showa crater of Sakurajima collected on two days (Feb 09, 2009, and Jan 13, 2010), and apply cluster analyses. 
Dendrograms are formed from the qfd and the following four parameters calculated from the luminance: Rf=R/(R+G+B), G=G/(R+G+B), B=B/(R+G+B), and total luminance=(R+G+B)/665. We classify the volcanic ash particles from the dendrograms into three groups based on Euclidean distance. The groups are named Group A, B and C in order of increasing average total luminance. The classification shows that the numbers of particles belonging to Groups A, B and C are 77, 25 and 6 in the Feb 09, 2009 sample, and 102, 19 and 6 in the Jan 13, 2010 sample, respectively. Examination under the stereoscopic microscope suggests that Groups A, B and C mainly correspond to juvenile, altered and free-crystal particles, respectively, so the classification by the present method demonstrates a difference in the contribution of juvenile material between the two days. To evaluate the reliability of our classification, we classify pseudo-samples in which errors of 10% are added to the measured parameters. Applying our method to one thousand pseudo-samples, we find that the numbers of particles classified into the three groups vary by less than 20% of the total of 235 particles. Our system can classify 120 particles within 6 minutes, so we can easily increase the number of ash particles, which enables us to improve the reliability and resolution of the classification and to rapidly capture temporal changes in the properties of ash particles from active volcanoes.
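The two-scale box-counting step behind the quasi-fractal dimension can be sketched as follows (illustrative only; the paper computes its parameters in R):

```python
import numpy as np

def quasi_fractal_dimension(mask, big=128):
    """Two-scale box-counting dimension of a binary particle mask.

    Counts occupied 1x1 boxes (i.e. particle pixels) and occupied
    big x big boxes, then takes log(N_small / N_large) / log(big) --
    the two-scale version of the box-counting slope described in the
    abstract.  A filled 2-D shape gives a value near 2.
    """
    mask = np.asarray(mask, dtype=bool)
    n_small = np.count_nonzero(mask)
    h, w = mask.shape
    n_large = 0
    for i in range(0, h, big):
        for j in range(0, w, big):
            if mask[i:i + big, j:j + big].any():
                n_large += 1
    return np.log(n_small / n_large) / np.log(big)
```

Lower values indicate more ragged, less space-filling outlines, which is why the qfd separates particle shapes in the cluster analysis.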

  19. Developing a CCD camera with high spatial resolution for RIXS in the soft X-ray range

    NASA Astrophysics Data System (ADS)

    Soman, M. R.; Hall, D. J.; Tutt, J. H.; Murray, N. J.; Holland, A. D.; Schmitt, T.; Raabe, J.; Schmitt, B.

    2013-12-01

    The Super Advanced X-ray Emission Spectrometer (SAXES) at the Swiss Light Source contains a high-resolution Charge-Coupled Device (CCD) camera used for Resonant Inelastic X-ray Scattering (RIXS). Using the current CCD-based camera system, the energy-dispersive spectrometer has an energy resolution (E/ΔE) of approximately 12,000 at 930 eV. A recent study predicted that through an upgrade to the grating and camera system, the energy resolution could be improved by a factor of 2. To achieve this goal in the spectral domain, the spatial resolution of the CCD must be improved from the current 24 μm (FWHM) to better than 5 μm. The 400 eV-1600 eV X-rays detected by this spectrometer interact primarily within the field-free region of the CCD, producing electron clouds which diffuse isotropically until they reach the depleted region and buried channel. This diffusion of charge leads to events that are split across several pixels. By analysing the charge distribution across the pixels, various centroiding techniques can be used to pinpoint the spatial location of the X-ray interaction to the sub-pixel level, greatly improving the spatial resolution achieved. Using the PolLux soft X-ray microspectroscopy endstation at the Swiss Light Source, a beam of X-rays of energies from 200 eV to 1400 eV can be focused down to a spot size of approximately 20 nm. Scanning this spot across the 16 μm square pixels allows the sub-pixel response to be investigated. Previous work has demonstrated the potential improvement in spatial resolution achievable by centroiding events in a standard CCD. An Electron-Multiplying CCD (EM-CCD) has been used to improve the ratio of signal to effective readout noise, resulting in worst-case spatial resolution measurements of 4.5±0.2 μm and 3.9±0.1 μm at 530 eV and 680 eV, respectively. 
A method is described that allows the contribution of the X-ray spot size to be deconvolved from these worst-case resolution measurements, yielding estimated spatial resolutions of approximately 3.5 μm and 3.0 μm at 530 eV and 680 eV, well below the 5 μm limit required to improve the spectral resolution by a factor of 2.
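Assuming the spot-size contribution and the intrinsic response add in quadrature as Gaussian FWHMs (an assumption on my part; the paper describes its own deconvolution method), the correction step looks like:

```python
import math

def deconvolve_fwhm(measured_um, contribution_um):
    """Remove a Gaussian contribution, in quadrature, from a measured FWHM."""
    return math.sqrt(measured_um**2 - contribution_um**2)

# With the abstract's numbers, an effective ~2.8 um contribution would
# take the 4.5 um worst-case measurement down to roughly 3.5 um:
print(round(deconvolve_fwhm(4.5, 2.83), 2))  # prints 3.5
```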

  20. Time-variable camera separation for compression of stereoscopic video

    NASA Astrophysics Data System (ADS)

    Ji, Maosheng; Hannuksela, Miska M.; Gabbouj, Moncef; Li, Houqiang

    2010-07-01

    This paper presents a hypothesis that stereoscopic perception requires a short adjustment period after a scene change before it is fully effective. A compression method based on this hypothesis is proposed - instead of coding pictures from the left and right views conventionally, a view in the middle of the left and right view is coded for a limited period after a scene change. The coded middle view can be utilized in two alternative ways in rendering. First, it can be rendered as such, which causes an abrupt change from conventional monoscopic video to stereoscopic video. Second, the layered depth video (LDV) coding scheme can be used to associate depth, background texture, and background depth to the middle view, enabling view synthesis and gradual view disparity increase in rendering. Subjective experiments were conducted to evaluate and validate the presented hypothesis and compare the two rendering methods. The results indicate that when the maximum disparity between the left and right views was relatively small, the presented time-variable camera separation method was imperceptible. A compression gain, the magnitude of which depended on the scene duration, was achieved with half of the sequences having a suitable disparity for the presented coding method.

  1. Flat Field Anomalies in an X-ray CCD Camera Measured Using a Manson X-ray Source

    SciTech Connect

    M. J. Haugh and M. B. Schneider

    2008-10-31

    The Static X-ray Imager (SXI) is a diagnostic used at the National Ignition Facility (NIF) to measure the position of the X-rays produced by lasers hitting a gold foil target. The intensity distribution taken by the SXI camera during a NIF shot is used to determine how accurately NIF can aim laser beams. This is critical to proper NIF operation. Imagers are located at the top and the bottom of the NIF target chamber. The CCD chip is an X-ray sensitive silicon sensor, with a large-format array (2k x 2k), 24 μm square pixels, and a 15 μm thick active layer. A multi-anode Manson X-ray source, operating up to 10 kV and 10 W, was used to characterize and calibrate the imagers. The output beam is heavily filtered to narrow the spectral beam width, giving a typical resolution E/ΔE≈10. The X-ray beam intensity was measured using an absolute photodiode with an accuracy better than 1% up to the Si K edge and better than 5% at higher energies. The X-ray beam provides full CCD illumination and is flat to within ±1% from maximum to minimum. The spectral efficiency was measured in 10 energy bands ranging from 930 eV to 8470 eV. We observed an energy-dependent pixel sensitivity variation that showed continuous change over a large portion of the CCD. The maximum sensitivity variation occurred at 8470 eV. The geometric pattern did not change at lower energies, but the maximum contrast decreased and was not observable below 4 keV. We were also able to observe debris, damage, and surface defects on the CCD chip. The Manson source is a powerful tool for characterizing the imaging errors of an X-ray CCD imager. These errors are quite different from those found in a visible-light CCD imager.
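One common way to turn such flat-field measurements into a per-pixel relative sensitivity map is simple normalization; the sketch below is a generic illustration, not the NIF calibration procedure:

```python
import numpy as np

def gain_map(flat_frame, dark_frame):
    """Per-pixel relative sensitivity from a flat-illumination exposure,
    normalized so the mean response is 1.0."""
    signal = flat_frame.astype(float) - dark_frame
    return signal / signal.mean()

# synthetic 4x4 example with a single pixel that is 2% more sensitive
flat = np.full((4, 4), 1000.0)
flat[0, 0] *= 1.02
dark = np.zeros((4, 4))
g = gain_map(flat, dark)
print(round(float(g.max() / g.min()), 3))  # prints 1.02
```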

  2. Characterization of OCam and CCD220: the fastest and most sensitive camera to date for AO wavefront sensing

    NASA Astrophysics Data System (ADS)

    Feautrier, Philippe; Gach, Jean-Luc; Balard, Philippe; Guillaume, Christian; Downing, Mark; Hubin, Norbert; Stadler, Eric; Magnard, Yves; Skegg, Michael; Robbins, Mark; Denney, Sandy; Suske, Wolfgang; Jorden, Paul; Wheeler, Patrick; Pool, Peter; Bell, Ray; Burt, David; Davies, Ian; Reyes, Javier; Meyer, Manfred; Baade, Dietrich; Kasper, Markus; Arsenault, Robin; Fusco, Thierry; Diaz-Garcia, José Javier

    2010-07-01

    For the first time, sub-electron read noise has been achieved with a camera suitable for astronomical wavefront-sensing (WFS) applications. The OCam system has demonstrated this performance at a 1300 Hz frame rate with a 240×240-pixel frame size. ESO and JRA2 OPTICON2 have jointly funded e2v technologies to develop a custom CCD for Adaptive Optics (AO) wavefront sensing applications. The device, called CCD220, is a compact Peltier-cooled 240×240-pixel frame-transfer 8-output back-illuminated sensor using EMCCD technology. This paper demonstrates sub-electron read noise at frame rates from 25 Hz to 1300 Hz and dark current lower than 0.01 e-/pixel/frame. It reports on the comprehensive, quantitative performance characterization of OCam and the CCD220, including readout noise, dark current, multiplication gain, quantum efficiency, and charge transfer efficiency. OCam includes a low-noise preamplifier stage, a digital board to generate the clocks and a microcontroller. The data acquisition system includes a user-friendly timer file editor to generate any type of clocking scheme. A second version of OCam, called OCam2, was designed offering enhanced performance, a completely sealed camera package and an additional Peltier stage to facilitate operation on a telescope or in environmentally rugged applications. OCam2 offers two types of built-in data link to the Real Time Computer: the CameraLink industry-standard interface and various fiber link options such as the sFPDP interface. OCam2 also includes a modified mechanical design to ease the integration of microlens arrays for use of this camera in all types of wavefront-sensing AO systems. The front cover of OCam2 can be customized to include a microlens exchange mechanism.
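The reason EM multiplication yields sub-electron read noise is that the output amplifier's noise, referred back to the input, is divided by the multiplication gain; a minimal sketch with assumed (not measured) numbers:

```python
def effective_read_noise(readout_noise_e, em_gain):
    """Input-referred read noise of an EM register: amplifier noise / gain."""
    return readout_noise_e / em_gain

# e.g. a hypothetical 50 e- amplifier noise with an EM gain of 100
# appears as 0.5 e- effective read noise at the input
print(effective_read_noise(50.0, 100.0))  # prints 0.5
```

In practice the multiplication process also adds an excess noise factor on the signal itself, which is why EMCCD gain settings are a trade-off rather than a free win.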

  3. Nighttime Near Infrared Observations of Augustine Volcano Jan-Apr, 2006 Recorded With a Small Astronomical CCD Camera

    NASA Astrophysics Data System (ADS)

    Sentman, D.; McNutt, S.; Reyes, C.; Stenbaek-Nielsen, H.; Deroin, N.

    2006-12-01

    Nighttime observations of Augustine Volcano were made during Jan-Apr, 2006 using a small, unfiltered, astronomical CCD camera operating from Homer, Alaska. Time-lapse images of the volcano were made looking across the open water of the Cook Inlet over a slant range of ~105 km. A variety of volcanic activities were observed that originated in near-infrared (NIR) 0.9-1.1 micron emissions, which were detectable at the upper limit of the camera passband but were otherwise invisible to the naked eye. These activities included various types of steam releases, pyroclastic flows, rockfalls and debris flows that correlated very closely with seismic measurements made from instruments located within 4 km on the volcanic island. Specifically, flow events to the east (towards the camera) produced high amplitudes on the eastern seismic stations, and events presumably to the west were stronger on western stations. The ability to detect nighttime volcanic emissions in the NIR over large horizontal distances using standard silicon CCD technology, even in the presence of weak intervening fog, came as a surprise, and is due to a confluence of several mutually reinforcing factors: (1) the thermal emissions from the volcano are hot enough (~1000 K) that the short-wavelength portion of the Planck radiation curve overlaps the upper portion (0.9-1.1 micron) of the sensitivity range of the silicon CCD detectors, and can thus be detected; (2) several atmospheric transmission windows exist within the NIR passband of the camera, allowing the emissions to propagate with relatively small attenuation through more than 10 atmospheres; and (3) in the case of fog, forward Mie scattering.
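As a quick plausibility check on point (1), Planck's law shows how strongly a ~1000 K source radiates at 1 μm compared with cooler surroundings; this is an illustrative calculation, not taken from the paper:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

# at 1.0 um, a 1000 K source out-shines ~600 K terrain by orders of magnitude,
# which is why the hot vents stand out so clearly in the NIR images
ratio = planck(1e-6, 1000.0) / planck(1e-6, 600.0)
print(ratio > 1e3)  # prints True
```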

  4. Video-Camera-Based Position-Measuring System

    NASA Technical Reports Server (NTRS)

    Lane, John; Immer, Christopher; Brink, Jeffrey; Youngquist, Robert

    2005-01-01

    A prototype optoelectronic system measures the three-dimensional relative coordinates of objects of interest, or of targets affixed to objects of interest, in a workspace. The system includes a charge-coupled-device video camera mounted in a known position and orientation in the workspace, a frame grabber, and a personal computer running image-data-processing software. Relative to conventional optical surveying equipment, this system can be built and operated at much lower cost; however, it is less accurate. It is also much easier to operate than conventional instrumentation systems, and there is no need to establish a coordinate system through cooperative action by a team of surveyors. The system operates in real time at around 30 frames per second (limited mostly by the frame rate of the camera) and continuously tracks targets as long as they remain in the field of view. In this respect, it emulates more expensive, elaborate laser tracking equipment that costs on the order of 100 times as much. Unlike laser tracking equipment, this system does not pose a hazard of laser exposure. Images acquired by the camera are digitized and processed to extract all valid targets in the field of view. The three-dimensional coordinates (x, y, and z) of each target are computed from the pixel coordinates of the targets in the images, to an accuracy of the order of millimeters over distances of the order of meters. The system was originally intended specifically for real-time position measurement of payload transfers from payload canisters into the payload bay of the Space Shuttle Orbiters (see Figure 1). The system may be easily adapted to other applications that involve similar coordinate-measuring requirements. Examples of such applications include manufacturing, construction, preliminary approximate land surveying, and aerial surveying. 
For some applications with rectangular symmetry, it is feasible and desirable to attach a target composed of black and white squares to an object of interest (see Figure 2). For other situations, where circular symmetry is more desirable, circular targets can also be created. Such a target can readily be generated and modified with commercially available software and printed on a standard office printer. All three relative coordinates (x, y, and z) of each target can be determined by processing the video image of the target. Because of the matched design of the image-processing filters and targets, the vision-based position-measurement system is extremely robust and tolerant of widely varying fields of view, lighting conditions, and background imagery.
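The pixel-to-coordinate step can be sketched with a simple pinhole-camera model; the focal length, image center and target size below are hypothetical values for illustration, not the system's actual calibration:

```python
def target_position(cx_px, cy_px, size_px, target_size_m, focal_px,
                    image_center=(960, 540)):
    """Pinhole-model estimate of a target's camera-frame coordinates.
    cx_px, cy_px: target centroid in pixels; size_px: apparent target size
    in pixels; focal_px: focal length expressed in pixels (hypothetical)."""
    z = focal_px * target_size_m / size_px          # range from apparent size
    x = (cx_px - image_center[0]) * z / focal_px    # lateral offsets
    y = (cy_px - image_center[1]) * z / focal_px
    return x, y, z

# a 0.10 m target imaged at 100 px with a 1000 px focal length sits 1 m away,
# 0.1 m to the right of the optical axis
x, y, z = target_position(1060, 540, 100, 0.10, 1000.0)
print(round(z, 6), round(x, 6))  # prints 1.0 0.1
```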

  5. Deep-Sea Video Cameras Without Pressure Housings

    NASA Technical Reports Server (NTRS)

    Cunningham, Thomas

    2004-01-01

    Underwater video cameras of a proposed type (and, optionally, their light sources) would not be housed in pressure vessels. Conventional underwater cameras and their light sources are housed in pods that keep the contents dry and maintain interior pressures of about 1 atmosphere (~0.1 MPa). Pods strong enough to withstand the pressures at great ocean depths are bulky, heavy, and expensive. Elimination of the pods would make it possible to build camera/light-source units that would be significantly smaller, lighter, and less expensive. The depth ratings of the proposed camera/light-source units would be essentially unlimited because the strengths of their housings would no longer be an issue. A camera according to the proposal would contain an active-pixel image sensor and readout circuits, all in the form of a single silicon-based complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. As long as none of the circuitry and none of the electrical leads were exposed to seawater, which is electrically conductive, silicon integrated-circuit chips could withstand the hydrostatic pressure of even the deepest ocean. The pressure would change the semiconductor band gap by only a slight amount, not enough to degrade imaging performance significantly. Electrical contact with seawater would be prevented by potting the integrated-circuit chip in a transparent plastic case. The electrical leads for supplying power to the chip and extracting the video signal would also be potted, though not necessarily in the same transparent plastic. The hydrostatic pressure would tend to compress the plastic case and the chip equally on all sides; there would be no need for great strength because there would be no need to hold back high pressure on one side against low pressure on the other side. A light source suitable for use with the camera could consist of light-emitting diodes (LEDs). 
Like integrated-circuit chips, LEDs can withstand very large hydrostatic pressures. If power-supply regulators or filter capacitors were needed, these could be attached in chip form directly onto the back of, and potted with, the imager chip. Because CMOS imagers dissipate little power, the potting would not result in overheating. To minimize the cost of the camera, a fixed lens could be fabricated as part of the plastic case. For improved optical performance at greater cost, an adjustable glass achromatic lens would be mounted in a reservoir filled with transparent oil and subject to the full hydrostatic pressure, and the reservoir would be mounted on the case to position the lens in front of the image sensor. The lens would be adjusted for focus by a motor inside the reservoir (oil-filled motors already exist).

  6. A method for correcting geometric distortion in video cameras

    NASA Astrophysics Data System (ADS)

    Dijak, J. T.

    A rapid semi-automatic method is described for obtaining the warping polynomial coefficients required to correct geometric distortion in video cameras. After imaging and digitizing a dense grid of 255 points, the cursor facilities of a Gould-DeAnza IP8500 image processor are used in conjunction with a FORTRAN program to identify all 255 points to subpixel accuracy. A second FORTRAN program is then used to solve for the optimum polynomial coefficients required to 'warp' the distorted grid image into an ideal, rectilinear image. Polynomial orders between 1 and 6 are accommodated, with execution times of less than 45 seconds. Typically, a grid image with average distortions on the order of 2 pixels is corrected to an average distortion on the order of 0.15 pixel.
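The least-squares solution for the warping coefficients can be sketched as follows (a first-order fit for brevity, and in Python rather than the paper's FORTRAN; the same normal-equations approach extends to order 6):

```python
import numpy as np

def fit_warp(distorted, ideal, order=1):
    """Least-squares polynomial warp mapping distorted -> ideal coordinates.
    For order 1 the basis terms are 1, y, x; higher orders add x*y, x^2, ..."""
    x, y = distorted[:, 0], distorted[:, 1]
    cols = [x**i * y**j for i in range(order + 1)
                        for j in range(order + 1 - i)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, ideal, rcond=None)
    return A, coeffs

# sanity check: exactly recover a known shift-and-scale distortion
d = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [2., 3.]])
ideal = 2.0 * d + 1.0
A, c = fit_warp(d, ideal)
print(np.allclose(A @ c, ideal))  # prints True
```

With real grid data the residual of `A @ c` against the ideal grid gives the remaining average distortion directly.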

  7. The Camera Is Not a Methodology: Towards a Framework for Understanding Young Children's Use of Video Cameras

    ERIC Educational Resources Information Center

    Bird, Jo; Colliver, Yeshe; Edwards, Susan

    2014-01-01

    Participatory research methods argue that young children should be enabled to contribute their perspectives on research seeking to understand their worldviews. Visual research methods, including the use of still and video cameras with young children have been viewed as particularly suited to this aim because cameras have been considered easy and…

  8. Photon-counting gamma camera based on columnar CsI(Tl) optically coupled to a back-illuminated CCD

    NASA Astrophysics Data System (ADS)

    Miller, Brian W.; Barber, H. Bradford; Barrett, Harrison H.; Chen, Liying; Taylor, Sean J.

    2007-03-01

    Recent advances have been made in a new class of CCD-based, single-photon-counting gamma-ray detectors which offer sub-100 μm intrinsic resolutions.1-7 These detectors show great promise in small-animal SPECT and molecular imaging and exist in a variety of configurations. Typically, a columnar CsI(Tl) scintillator or a radiography screen (Gd2O2S:Tb) is imaged onto the CCD. Gamma-ray interactions are seen as clusters of signal spread over multiple pixels. When the detector is operated in charge-integration mode, signal spread across pixels results in spatial-resolution degradation. However, if the detector is operated in photon-counting mode, the gamma-ray interaction position can be estimated using either Anger (centroid) estimation or maximum-likelihood position estimation, resulting in a substantial improvement in spatial resolution.2 Due to the low-light-level nature of the scintillation process, CCD-based gamma cameras implement an amplification stage in the CCD via electron multiplication (EMCCDs)8-10 or via an image intensifier prior to the optical path.1 We have applied ideas and techniques from previous systems to our high-resolution LumiSPECT detector.11,12 LumiSPECT is a dual-modality optical/SPECT small-animal imaging system which was originally designed to operate in charge-integration mode. It employs a cryogenically cooled, high-quantum-efficiency, back-illuminated large-format CCD and operates in single-photon-counting mode without any intermediate amplification process. Operating in photon-counting mode, the detector has an intrinsic spatial resolution of 64 μm, compared to 134 μm in integrating mode.
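The Anger (centroid) estimate mentioned above can be sketched as follows; this is an illustrative toy example, not the LumiSPECT code, and the maximum-likelihood variant is considerably more involved:

```python
import numpy as np

def anger_centroid(cluster):
    """Signal-weighted centroid (row, col) of a pixel cluster: the classic
    Anger estimate of the gamma-ray interaction position in pixel units."""
    total = cluster.sum()
    rows, cols = np.indices(cluster.shape)
    return (rows * cluster).sum() / total, (cols * cluster).sum() / total

# a symmetric 3x3 light splash centred on the middle pixel
cluster = np.array([[1., 2., 1.],
                    [2., 8., 2.],
                    [1., 2., 1.]])
r, c = anger_centroid(cluster)
print(float(r), float(c))  # prints 1.0 1.0
```

An asymmetric splash shifts the estimate smoothly between pixel centers, which is what buys the sub-pixel resolution.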

  9. ATR/OTR-SY Tank Camera Purge System and in Tank Color Video Imaging System

    SciTech Connect

    Werry, S.M.

    1995-06-06

    This procedure will document the satisfactory operation of the 101-SY tank Camera Purge System (CPS) and the 101-SY in-tank Color Camera Video Imaging System (CCVIS). Included in the CPS is the nitrogen purging system safety interlock, which shuts down all the color video imaging system electronics within the 101-SY tank vapor space during loss of nitrogen purge pressure.

  10. Frequency Identification of Vibration Signals Using Video Camera Image Data

    PubMed Central

    Jeng, Yih-Nen; Wu, Chia-Hung

    2012-01-01

    This study showed that an image data acquisition system connecting a high-speed camera or webcam to a notebook or personal computer (PC) can precisely capture the dominant modes of a vibration signal, but may introduce non-physical modes induced by insufficient frame rates. Using a simple model, the frequencies of these modes are properly predicted and excluded. Two experimental designs, using an LED light source and a vibration exciter, are proposed to demonstrate the performance. First, the original gray-level resolution of a video camera, for instance 0 to 255 levels, was enhanced by summing the gray-level data of all pixels in a small region around the point of interest. The image signal was further enhanced by attaching a white paper sheet marked with a black line to the surface of the vibrating system in operation, increasing the effective gray-level resolution. Experimental results showed that the Prosilica CV640C CMOS high-speed camera has a critical frequency of 60 Hz for inducing false modes, whereas that of the webcam is 7.8 Hz. Several factors were shown to partially suppress the non-physical modes, but none eliminates them completely. Two examples, whose prominent vibration modes are below the associated critical frequencies, are examined to demonstrate the performance of the proposed systems. In general, the experimental data show that non-contact image data acquisition systems are promising tools for collecting the low-frequency vibration signal of a system. PMID:23202026
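The false (aliased) modes induced by an insufficient frame rate can be predicted with the standard frequency-folding relation; this is a minimal sketch of that idea, not the authors' model:

```python
def aliased_frequency(f_true, f_sample):
    """Apparent frequency of a tone after sampling at f_sample:
    frequencies above Nyquist fold back into [0, f_sample/2]."""
    f = f_true % f_sample
    return min(f, f_sample - f)

# a 55 Hz vibration filmed by a 60 fps camera shows up as a false 5 Hz mode,
# while a 7 Hz vibration (below Nyquist) is captured at its true frequency
print(aliased_frequency(55.0, 60.0))  # prints 5.0
print(aliased_frequency(7.0, 60.0))   # prints 7.0
```

Comparing each spectral peak against this prediction is enough to flag which measured modes are artifacts of the frame rate.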

  11. On the Complexity of Digital Video Cameras in/as Research: Perspectives and Agencements

    ERIC Educational Resources Information Center

    Bangou, Francis

    2014-01-01

    The goal of this article is to consider the potential for digital video cameras to produce as part of a research agencement. Our reflection will be guided by the current literature on the use of video recordings in research, as well as by the rhizoanalysis of two vignettes. The first of these vignettes is associated with a short video clip shot by…

  13. An exploration of the potential use of a CCD camera for absorption spectroscopy in scattered light at the rainbow

    NASA Astrophysics Data System (ADS)

    Card, J. B. A.; Jones, A. R.

    2003-02-01

    The advent of the CCD camera has made it possible to record light intensity as a function of two dimensions. In this paper, we have explored the camera's potential to measure spectra in light scattered at the rainbow by recording the intensity as a function of angle and wavelength. To this end, white light from a xenon arc lamp was scattered from water sprays containing various concentrations of water-soluble food dyes. Comparisons were made with theoretical spectra calculated using Mie theory. Qualitative agreement was excellent; quantitative agreement was reasonable, but some discrepancies remain to be explained. Although the main rainbow is insensitive to the particle size distribution, an independent means of determining the particle sizes will be necessary if the concentration of the absorbing species is to be recovered accurately.

  14. Suppression of bright objects using a spatial light modulator when imaging with CCD- and image intensified cameras

    NASA Astrophysics Data System (ADS)

    Groeder, Torbjoern

    1993-01-01

    The operation of CCD (Charge Coupled Device) and intensified cameras is described, with emphasis on the problems caused by local overexposure of the image sensor. To reduce the amount of blooming in the image, a transmissive graphic Liquid Crystal Display (LCD) operating as a spatial light modulator was placed in front of the cameras, and the transmission of the display elements was dynamically controlled in real time from a Personal Computer (PC). By activating the LCD elements in front of strongly illuminated areas of the image sensor, improvement of the image quality was demonstrated. Various types of LCDs were studied to learn which would be best suited for this particular application, and this work may suggest a new application area for LCDs.

  15. Structural Dynamics Analysis and Research for FEA Modeling Method of a Light High Resolution CCD Camera

    NASA Astrophysics Data System (ADS)

    Sun, Jiwen; Wei, Ling; Fu, Danying

    2002-01-01

    The camera features high resolution and a wide swath. To ensure that its high optical precision survives the rigorous dynamic loads of launch, it must have high structural rigidity; therefore, a careful study of the dynamic behaviour of the camera structure was performed. A precise CAD model of the camera was built in Pro/E and an interference examination was performed on it to refine the structural design. The structural dynamic analysis of the camera was then accomplished, for the first time in China for such an instrument, by applying the structural analysis codes PATRAN and NASTRAN. The main research tasks include: 1) comparative modal analyses of the critical structure of the camera using 4-node and 10-node tetrahedral elements, respectively, to confirm the most reasonable general model; 2) modal analyses of the camera for several cases, yielding the inherent frequencies and mode shapes and confirming the rationality of the structural design; 3) static analysis of the camera under self-gravity and overloads, giving the corresponding deformation and stress distributions; and 4) response calculations for sinusoidal vibration, giving the response curves and the maximum acceleration responses at the corresponding frequencies. The software technique proved accurate and efficient. Based on the resulting sensitivities, the dynamic design and engineering optimization of the critical structure of the camera are discussed as fundamental technology for the design of forthcoming space optical instruments.

  16. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3 m square arena as they navigate singly indoors, guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is greater than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. It raises many questions for future research that we are now pursuing.

  17. Variable high-resolution color CCD camera system with online capability for professional photo studio application

    NASA Astrophysics Data System (ADS)

    Breitfelder, Stefan; Reichel, Frank R.; Gaertner, Ernst; Hacker, Erich J.; Cappellaro, Markus; Rudolf, Peter; Voelk, Ute

    1998-04-01

    Digital cameras are of increasing significance for professional applications in photo studios where fashion, portrait, product and catalog photographs or advertising photos of high quality have to be taken. The eyelike is a digital camera system which has been developed for such applications. It is capable of working online with high frame rates and images of full sensor size, and it provides a resolution that can be varied between 2048 by 2048 and 6144 by 6144 pixels at an RGB color depth of 12 bits per channel, with an exposure time variable from 1/60 s to 1 s. With an exposure time of 100 ms, digitization takes approx. 2 seconds for an image of 2048 by 2048 pixels (12 MByte), 8 seconds for an image of 4096 by 4096 pixels (48 MByte) and 40 seconds for an image of 6144 by 6144 pixels (108 MByte). The eyelike can be used in various configurations. Used as a camera body, most commercial lenses can be connected to the camera via existing lens adaptors. Alternatively, the eyelike can be used as a back on most commercial 4 in by 5 in view cameras. This paper describes the eyelike camera concept with its essential system components. The article finishes with a description of the software, which is needed to bring the high quality of the camera to the user.

  18. Implementation of a parallel-beam optical-CT apparatus for three-dimensional radiation dosimetry using a high-resolution CCD camera

    NASA Astrophysics Data System (ADS)

    Huang, Wen-Tzeng; Chen, Chin-Hsing; Hung, Chao-Nan; Tuan, Chiu-Ching; Chang, Yuan-Jen

    2015-06-01

    In this study, a charge-coupled device (CCD) camera with 2-megapixel (1920×1080-pixel) and 12-bit resolution was developed for optical computed tomography (optical CT). The signal-to-noise ratio (SNR) of our system was 30.12 dB, better than that of commercially available CCD cameras (25.31 dB). The 50% modulation transfer function (MTF50) of our 1920×1080-pixel camera gave a line width per picture height (LW/PH) of 745, which is 73% of the diffraction-limited resolution. Compared with a commercially available 1-megapixel CCD camera (1296×966-pixel) with LW/PH=358 and 46.6% of the diffraction-limited resolution, our camera system provided higher spatial resolution and better image quality. The NIPAM gel dosimeter was used to evaluate the optical CT with the 2-megapixel CCD. A clinical five-field irradiation treatment plan was generated using the Eclipse planning system (Varian Corp., Palo Alto, CA, USA). The gel phantom was irradiated using a 6-MV Varian Clinac iX linear accelerator (Varian). The measured NIPAM gel dose distributions and the calculated dose distributions, generated by the treatment planning software (TPS), were compared using the 3% dose-difference and 3 mm distance-to-agreement criteria. The gamma pass rate was as high as 98.2% when the 2-megapixel CCD camera was used in optical CT, but only 96.0% when a commercially available 1-megapixel CCD camera was used.
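The 3%/3 mm comparison can be sketched with a simplified one-dimensional global gamma analysis; this is illustrative only, since clinical gamma tools operate on 2-D/3-D dose grids with interpolation:

```python
import math

def gamma_pass_rate(measured, calculated, spacing_mm,
                    dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified global 1-D gamma analysis (3%/3 mm by default).
    Dose differences are taken relative to the calculated maximum; a point
    passes when its minimum combined dose/distance metric is <= 1."""
    d_max = max(calculated)
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dc in enumerate(calculated):
            dd = (dm - dc) / (dose_tol * d_max)
            dr = (i - j) * spacing_mm / dist_tol_mm
            best = min(best, math.hypot(dd, dr))
        passed += best <= 1.0
    return passed / len(measured)

# identical measured and calculated profiles pass at every point
profile = [0.0, 0.5, 1.0, 0.5, 0.0]
print(gamma_pass_rate(profile, profile, spacing_mm=1.0))  # prints 1.0
```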

  19. Liquid-crystal-display projector-based modulation transfer function measurements of charge-coupled-device video camera systems.

    PubMed

    Teipen, B T; MacFarlane, D L

    2000-02-01

    We demonstrate the ability to measure the system modulation transfer function (MTF) of both color and monochrome charge-coupled-device (CCD) video camera systems with a liquid-crystal-display (LCD) projector. Test matrices programmed to the LCD projector were chosen primarily to have a flat power spectral density (PSD) when averaged along one dimension. We explored several matrices and present results for a matrix produced with a random-number generator, a matrix of sequency-ordered Walsh functions, a pseudorandom Hadamard matrix, and a pseudorandom uniformly redundant array. All results are in agreement with expected filtering. The Walsh matrix and the Hadamard matrix show excellent agreement with the matrix from the random-number generator. We show that shift-variant effects between the LCD array and the CCD array can be kept small. This projector test method offers convenient measurement of the MTF of a low-cost video system. Such characterization is useful for an increasing number of machine vision applications and metrology applications. PMID:18337921
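The idea of recovering the MTF from a flat-PSD test pattern can be sketched as follows, assuming the estimate is the square root of the row-averaged output PSD (a simplification of the paper's method, with a synthetic blur standing in for the camera):

```python
import numpy as np

def mtf_from_flat_psd(rows):
    """MTF estimate from captured rows of a pattern with flat input PSD:
    the square root of the row-averaged output PSD, normalized at the
    lowest non-DC frequency bin."""
    centered = rows - rows.mean(axis=1, keepdims=True)
    psd = (np.abs(np.fft.rfft(centered, axis=1)) ** 2).mean(axis=0)
    mtf = np.sqrt(psd)
    return mtf / mtf[1]

# white-noise test pattern "imaged" through a 2-tap moving average,
# a known blur whose response falls to zero at the Nyquist frequency
rng = np.random.default_rng(1)
pattern = rng.standard_normal((200, 256))
blurred = 0.5 * (pattern + np.roll(pattern, 1, axis=1))
mtf = mtf_from_flat_psd(blurred)
print(bool(mtf[1] > mtf[-1]))  # prints True: response drops toward Nyquist
```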

  20. Dynamic imaging with a triggered and intensified CCD camera system in a high-intensity neutron beam

    NASA Astrophysics Data System (ADS)

    Vontobel, P.; Frei, G.; Brunner, J.; Gildemeister, A. E.; Engelhardt, M.

    2005-04-01

    When time-dependent processes within metallic structures are to be inspected and visualized, neutrons are well suited owing to their high penetration through Al, Ag, Ti or even steel. It then becomes possible to inspect the propagation, distribution and evaporation of organic liquids such as lubricants, fuel or water. The basic set-up of a suitable real-time system was implemented and tested at the radiography facility NEUTRA of PSI. The highest beam intensity there is 2×10⁷ cm⁻² s⁻¹, which makes it possible to observe sequences in a reasonable time and quality. The heart of the detection system is the MCP-intensified CCD camera PI-Max with a Peltier-cooled chip (1300×1340 pixels). The intensifier was used for both gating and image enhancement, whereas the information was accumulated over many single frames on the chip before readout. Although a 16-bit dynamic range is advertised by the camera manufacturer, the effective range must be less because of the inherent noise level of the intensifier. The results obtained should be seen as a starting point toward meeting the requirements of car manufacturers with respect to fuel injection, lubricant distribution, mechanical stability and operation control. Similar inspections will be possible for all devices with a repetitive operating principle. Here, we report on two measurements dealing with the lubricant distribution in a running motorcycle motor turning at 1200 rpm. We monitored the periodic stationary movements of the piston, valves and camshaft with a micro-channel-plate-intensified CCD camera system (PI-Max 1300RB, Princeton Instruments) triggered at exactly chosen time points.

  1. Acceptance/operational test procedure 241-AN-107 Video Camera System

    SciTech Connect

    Pedersen, L.T.

    1994-11-18

    This procedure will document the satisfactory operation of the 241-AN-107 Video Camera System. The camera assembly, including camera mast, pan-and-tilt unit, camera, and lights, will be installed in Tank 241-AN-107 to monitor activities during the Caustic Addition Project. The camera focus, zoom, and iris remote controls will be functionally tested. The resolution and color rendition of the camera will be verified using standard reference charts. The pan-and-tilt unit will be tested for required ranges of motion, and the camera lights will be functionally tested. The master control station equipment, including the monitor, VCRs, printer, character generator, and video micrometer will be set up and performance tested in accordance with original equipment manufacturer's specifications. The accuracy of the video micrometer to measure objects in the range of 0.25 inches to 67 inches will be verified. The gas drying distribution system will be tested to ensure that a drying gas can be flowed over the camera and lens in the event that condensation forms on these components. This test will be performed by attaching the gas input connector, located in the upper junction box, to a pressurized gas supply and verifying that the check valve, located in the camera housing, opens to exhaust the compressed gas. The 241-AN-107 camera system will also be tested to assure acceptable resolution of the camera imaging components utilizing the camera system lights.

  2. Compact pnCCD-based X-ray camera with high spatial and energy resolution: a color X-ray camera.

    PubMed

    Scharf, O; Ihle, S; Ordavo, I; Arkadiev, V; Bjeoumikhov, A; Bjeoumikhova, S; Buzanich, G; Gubzhokov, R; Günther, A; Hartmann, R; Kühbacher, M; Lang, M; Langhoff, N; Liebel, A; Radtke, M; Reinholz, U; Riesemeier, H; Soltau, H; Strüder, L; Thünemann, A F; Wedell, R

    2011-04-01

    For many applications there is a requirement for nondestructive analytical investigation of the elemental distribution in a sample. With the improvement of X-ray optics and spectroscopic X-ray imagers, full-field X-ray fluorescence (FF-XRF) methods are feasible. A new device for high-resolution X-ray imaging, an energy- and spatially-resolving X-ray camera, is presented. The basic idea behind this so-called "color X-ray camera" (CXC) is to combine an energy-dispersive array detector for X-rays, in this case a pnCCD, with polycapillary optics. Imaging is achieved using multiframe recording of the energy and the point of impact of single photons. The camera was tested using a laboratory 30 μm microfocus X-ray tube and synchrotron radiation from BESSY II at the BAMline facility. These experiments demonstrate the suitability of the camera for X-ray fluorescence analysis. The camera simultaneously records 69,696 spectra with an energy resolution of 152 eV for manganese Kα and a spatial resolution of 50 μm over an imaging area of 12.7 × 12.7 mm². It is sensitive to photons in the energy region between 3 and 40 keV, limited by a 50 μm beryllium window and the 450 μm sensitive thickness of the chip. An online preview of the sample is possible, as the software updates the sums of the counts for certain energy-channel ranges during the measurement and displays 2-D false-color maps as well as spectra of selected regions. The complete data cube of 264 × 264 spectra is saved for further qualitative and quantitative processing. PMID:21355541

  3. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and ImageJ software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by an MTS machine in a gradient from 0 N to 500 N, which simulated the double-feet standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. The digital 8-bit images were processed with ImageJ, a digital image processing program that can be freely downloaded from the National Institutes of Health. The procedure includes recognition of the digital markers, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and the micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The results of digital image measurement showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the relative error could reach 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system matched the clinical results. Digital image measurement of specimen deformation based on CCD cameras and ImageJ software has good prospects for application in biomechanical research, with the advantages of a simple optical setup, no contact, high precision, and no special requirements on the test environment.
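The sub-pixel marker position described above comes from a center-of-mass algorithm over pixel gray values. A minimal sketch of that step, assuming a single segmented marker per image (not the authors' ImageJ pipeline):

```python
import numpy as np

def marker_centroid(img, thresh):
    """Sub-pixel marker position as the gray-value-weighted center of mass.

    img: 2-D grayscale array containing one segmented marker; thresh: gray
    level separating the marker from the background. Illustrative sketch of
    the centroid step only; segmentation and tracking are omitted.
    """
    mask = img > thresh
    w = np.where(mask, img.astype(float), 0.0)   # weights = pixel gray values
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = w.sum()
    return (yy * w).sum() / total, (xx * w).sum() / total
```

Because the weighted average pools many pixels, the returned coordinates resolve displacements far smaller than one pixel, consistent with the ~0.018-pixel precision quoted above.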

  4. Performances of a solid streak camera based on conventional CCD with nanosecond time resolution

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Bai, Yonglin; Zhu, Bingli; Gou, Yongsheng; Xu, Peng; Bai, XiaoHong; Liu, Baiyu; Qin, Junjun

    2015-02-01

    Imaging systems with high temporal resolution are needed to study rapid physical phenomena ranging from shock waves, including extracorporeal shock waves used for surgery, to diagnostics of laser fusion and of fuel injection in internal combustion engines. However, conventional streak cameras use a vacuum tube, making them fragile, cumbersome and expensive. Here we report a CMOS streak camera project that consists in reproducing this streak camera functionality completely with a single CMOS chip. By changing the charge-transfer mode of the CMOS image sensor, fast photoelectric diagnostics of a single point with a linear CMOS sensor and high-speed line scanning with an array CMOS sensor can be achieved, respectively. A fast photoelectric diagnostics system has been designed and fabricated to investigate the feasibility of this method. Finally, the dynamic operation of the sensors is presented. Measurements show a sample time of 500 ps and a time resolution better than 2 ns.

  5. MISR Level 1A CCD Science data, all cameras (MIL1A_V1)

    NASA Technical Reports Server (NTRS)

    Diner, David J. (Principal Investigator)

    The Level 1A data are raw MISR data that have been decommutated, reformatted (the 12-bit Level 0 data are shifted to byte boundaries, with the square-root encoding reversed and the values converted to 16 bits), and annotated (e.g., with time information). These data are used by the Level 1B1 processing algorithm to generate calibrated radiances. The science data output preserves the spatial sampling rate of the Level 0 raw MISR CCD science data. CCD data are collected during routine science observations of the sunlit portion of the Earth. Each product represents one 'granule' of data. A 'granule' is defined to be the smallest unit of data required for MISR processing. Also included in the Level 1A product are pointers to calibration coefficient files provided for Level 1B processing. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].
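The square-root encoding mentioned above is a companding scheme: taking the square root before quantization allocates more code levels to low signals, and decoding reverses it by squaring. The sketch below uses assumed 16-bit-to-8-bit ranges for illustration; the actual MISR transfer curve and bit depths are defined by the instrument specification, not reproduced here.

```python
import numpy as np

def sqrt_encode(raw, in_max=65535, out_max=255):
    """Square-root companding: compress a wide-range DN into fewer bits.

    Generic illustration with assumed ranges, not the MISR transfer curve.
    """
    return np.rint(out_max * np.sqrt(raw / in_max)).astype(np.uint16)

def sqrt_decode(encoded, in_max=65535, out_max=255):
    """Reverse the square-root encoding by squaring, as Level 1A-style
    processing does before annotating and storing the data."""
    return np.rint((encoded / out_max) ** 2 * in_max).astype(np.uint32)
```

The round trip is lossy, but the relative quantization error stays roughly uniform across the dynamic range, which is the point of the scheme.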

  6. MISR Level 1A CCD Science data, all cameras (MIL1A_V2)

    NASA Technical Reports Server (NTRS)

    Diner, David J. (Principal Investigator)

    The Level 1A data are raw MISR data that have been decommutated, reformatted (the 12-bit Level 0 data are shifted to byte boundaries, with the square-root encoding reversed and the values converted to 16 bits), and annotated (e.g., with time information). These data are used by the Level 1B1 processing algorithm to generate calibrated radiances. The science data output preserves the spatial sampling rate of the Level 0 raw MISR CCD science data. CCD data are collected during routine science observations of the sunlit portion of the Earth. Each product represents one 'granule' of data. A 'granule' is defined to be the smallest unit of data required for MISR processing. Also included in the Level 1A product are pointers to calibration coefficient files provided for Level 1B processing. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].

  7. Fused Six-Camera Video of STS-134 Launch - Duration: 79 seconds.

    NASA Video Gallery

    Imaging experts funded by the Space Shuttle Program and located at NASA's Ames Research Center prepared this video by merging nearly 20,000 photographs taken by a set of six cameras capturing 250 i...

  8. Station Cameras Capture New Videos of Hurricane Katia - Duration: 5 minutes, 36 seconds.

    NASA Video Gallery

    Aboard the International Space Station, external cameras captured new video of Hurricane Katia as it moved northwest across the western Atlantic north of Puerto Rico at 10:35 a.m. EDT on September ...

  9. Infrared imaging spectrometry by the use of bundled chalcogenide glass fibers and a PtSi CCD camera

    NASA Astrophysics Data System (ADS)

    Saito, Mitsunori; Kikuchi, Katsuhiro; Tanaka, Chinari; Sone, Hiroshi; Morimoto, Shozo; Yamashita, Toshiharu T.; Nishii, Junji

    1999-10-01

    A coherent fiber bundle for infrared image transmission was prepared by arranging 8400 chalcogenide (As-S) glass fibers. The fiber bundle, 1 m in length, is transmissive in the infrared spectral region of 1-6 μm. A remote spectroscopic imaging system was constructed with the fiber bundle and an infrared PtSi CCD camera. The system was used for real-time observation (frame time: 1/60 s) of gas distributions. Infrared light from a SiC heater was delivered to a gas cell through a chalcogenide fiber, and the transmitted light was observed through the fiber bundle. A band-pass filter was used for the selection of gas species. A He-Ne laser of 3.4 μm wavelength was also used for the observation of hydrocarbon gases. Gases bursting from a nozzle were observed successfully by the remote imaging system.

  10. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    SciTech Connect

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near-infrared (NIR) sensitive sensor technologies for use in a high-speed shuttered/intensified digital video camera system for range-gated imaging at ''eye-safe'' wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible-range light for detection with generic S-20 photocathodes. The intensifiers are ''stripline''-geometry 18-mm-diameter microchannel-plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  11. Engineering task plan for flammable gas atmosphere mobile color video camera systems

    SciTech Connect

    Kohlman, E.H.

    1995-01-25

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and testing of the mobile video camera systems. The color video camera systems will be used to observe and record the activities within the vapor space of a tank on a limited-exposure basis. The units will be fully mobile and designed for operation in the single-shell flammable-gas-producing tanks. The objective of this task is to provide two mobile camera systems for use in flammable-gas-producing single-shell tanks (SSTs) for the Flammable Gas Tank Safety Program. The camera systems will provide observation, video recording, and monitoring of the activities that occur in the vapor space of the applicable tanks. The camera systems will be designed to be totally mobile, capable of deployment up to 6.1 meters into a 4 inch (minimum) riser.

  12. Using a Video Camera to Measure the Radius of the Earth

    ERIC Educational Resources Information Center

    Carroll, Joshua; Hughes, Stephen

    2013-01-01

    A simple but accurate method for measuring the Earth's radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of…

  14. Close infrared thermography using an intensified CCD camera: application in nondestructive high resolution evaluation of electrothermally actuated MEMS

    NASA Astrophysics Data System (ADS)

    Serio, B.; Hunsinger, J. J.; Conseil, F.; Derderian, P.; Collard, D.; Buchaillot, L.; Ravat, M. F.

    2005-06-01

    This communication describes an optical method for the thermal characterization of MEMS devices. The method is based on the use of an intensified CCD camera to record the thermal radiation emitted by the studied device in the spectral domain from 600 nm to about 850 nm. The camera consists of an intensifier coupled to a CCD sensor. The intensification allows very low signal levels to be amplified and detected. We used a standard optical microscope to image the device with sub-micron resolution. Since the thermal radiation in the near infrared is very weak at very small scales and low temperatures, typically 250 °C for thermal MEMS (Micro-Electro-Mechanical Systems), we used image integration to increase the signal-to-noise ratio. Knowing the emissivity of the imaged materials, the temperature is obtained from Planck's law. To evaluate the system performance, we made micro-thermographies of a micro-relay thermal actuator. This device is a 'U-shaped' Al/SiO2 bimorph cantilever micro-relay with a gold-to-gold electrical contact, designed for secured harsh-environment applications. The initial beam curvature resulting from residual stresses ensures a large gap between the contacts of the micro-relay. The current flow through the metallic layer heats the bimorph by the Joule effect, and the differential expansion provides the vertical displacement for contact. The experimental results are compared with FEM and analytical simulations. Good agreement was obtained between experimental results and simulations.
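The temperature-from-radiance step above follows Planck's law: given the emissivity, the measured spectral radiance can be inverted for temperature. A minimal single-wavelength sketch is shown below; the real measurement integrates over the 600-850 nm band and the camera's spectral response, which this inversion ignores.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength, temp, emissivity=1.0):
    """Spectral radiance L(lambda, T) in W m^-3 sr^-1, scaled by emissivity."""
    x = H * C / (wavelength * K * temp)
    return emissivity * 2 * H * C**2 / wavelength**5 / (math.exp(x) - 1)

def temp_from_radiance(wavelength, radiance, emissivity):
    """Invert Planck's law at a single wavelength, given the emissivity.

    Solving L = eps * (2hc^2/lambda^5) / (exp(hc/(lambda k T)) - 1) for T gives
    T = (hc/(lambda k)) / ln(1 + eps * 2hc^2 / (lambda^5 L)).
    """
    arg = 1.0 + emissivity * 2 * H * C**2 / (wavelength**5 * radiance)
    return H * C / (wavelength * K * math.log(arg))
```

The inversion is exact algebraically, so a round trip (radiance from a known temperature, then back) recovers the input temperature; real uncertainty comes from the assumed emissivity and the band integration.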

  15. High-sensitive radiography system utilizing a pulse x-ray generator and a night-vision CCD camera (MLX)

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Sagae, Michiaki; Tanaka, Etsuro; Mori, Hidezo; Kawai, Toshiaki; Inoue, Takashi; Ogawa, Akira; Sato, Shigehiro; Ichimaru, Toshio; Takayama, Kazuyoshi

    2007-01-01

    A high-sensitivity radiography system utilizing a kilohertz-range stroboscopic x-ray generator and a night-vision CCD camera (MLX) is described. The x-ray generator consists of the following major components: a main controller, a condenser unit with a Cockcroft-Walton circuit, and an x-ray tube unit in conjunction with a grid controller. The main condenser of about 500 nF in the unit is charged up to 100 kV by the circuit, and the electric charges in the condenser are discharged into the triode by the grid control circuit. The maximum tube current and the repetition rate are approximately 0.5 A and 50 kHz, respectively. The x-ray pulse width ranges from 0.01 to 1.0 ms, and the maximum number of shots is 32. At a charging voltage of 60 kV and a width of 1.0 ms, the x-ray intensity obtained without filtering was 6.04 μGy at 1.0 m per pulse. In radiography, an object is exposed by the pulse x-ray generator, and a radiogram is formed by an image intensifier. The image is captured by the CCD camera, and a stop-motion image is stored by a flash memory device using a trigger delay device. The image quality improved with increases in the x-ray duration, and single-shot radiography was performed with durations of less than 1.0 ms.

  16. Characterization of the luminance and shape of ash particles at Sakurajima volcano, Japan, using CCD camera images

    NASA Astrophysics Data System (ADS)

    Miwa, Takahiro; Shimano, Taketo; Nishimura, Takeshi

    2015-01-01

    We develop a new method for characterizing the properties of volcanic ash at the Sakurajima volcano, Japan, based on automatic processing of CCD camera images. Volcanic ash is studied in terms of both luminance and particle shape. A monochromatic CCD camera coupled with a stereomicroscope is used to acquire digital images through three filters that pass red, green, or blue light. On single ash particles, we measure the apparent luminance, corresponding to 256 tones for each color (red, green, and blue) for each pixel occupied by ash particles in the image, and the average and standard deviation of the luminance. The outline of each ash particle is captured from a digital image taken under transmitted light through a polarizing plate. Also, we define a new quasi-fractal dimension (D_qf) to quantify the complexity of the ash particle outlines. We examine two ash samples, each including about 1000 particles, which were erupted from the Showa crater of the Sakurajima volcano, Japan, on February 09, 2009 and January 13, 2010. The apparent luminance of each ash particle shows a lognormal distribution. The average luminance of the ash particles erupted in 2009 is higher than that of those erupted in 2010, which is in good agreement with the results obtained from component analysis under a binocular microscope (i.e., the number fraction of dark juvenile particles is lower for the 2009 sample). The standard deviations of apparent luminance have two peaks in the histogram, and the quasi-fractal dimensions show different frequency distributions between the two samples. These features are not recognized in the results of conventional qualitative classification criteria or in the sphericity of the particle outlines. Our method can characterize and distinguish ash samples, even for ash particles that have gradual property changes, and is complementary to component analysis. This method also enables the relatively fast and systematic analysis of ash samples that is required for petrologic monitoring of ongoing activity, such as at the Sakurajima volcano.
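The abstract does not give the exact form of the authors' quasi-fractal dimension D_qf; the classical analogue for quantifying outline complexity is the box-counting dimension, sketched below under that assumption.

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Box-counting fractal dimension of a 2-D outline.

    points: (N, 2) array of outline coordinates in [0, 1]^2; scales: box
    sizes. Standard box counting, shown as a stand-in for the paper's own
    D_qf definition, which is not given in the abstract.
    """
    counts = []
    for s in scales:
        # count the distinct grid boxes of side s that the outline touches
        boxes = np.unique(np.floor(points / s).astype(int), axis=0)
        counts.append(len(boxes))
    # slope of log N(s) versus log(1/s) estimates the dimension
    coeffs = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return coeffs[0]
```

A smooth outline gives a dimension near 1, while a more convoluted outline gives a larger value, which is the sense in which such a dimension quantifies particle-outline complexity.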

  17. Method for separating video camera motion from scene motion for constrained 3D displacement measurements

    NASA Astrophysics Data System (ADS)

    Gauthier, L. R.; Jansen, M. E.; Meyer, J. R.

    2014-09-01

    Camera motion is a potential problem when a video camera is used to perform dynamic displacement measurements. If the scene camera moves at the wrong time, the apparent motion of the object under study can easily be confused with the real motion of the object. In some cases it is practically impossible to prevent camera motion, as, for instance, when a camera is used outdoors in windy conditions. A method to address this challenge is described that provides an objective means to measure the displacement of an object of interest in the scene, even when the camera itself is moving in an unpredictable fashion at the same time. The main idea is to synchronously measure the motion of the camera and to use those data ex post facto to subtract out the apparent motion in the scene that is caused by the camera motion. The motion of the scene camera is measured by using a reference camera that is rigidly attached to the scene camera and oriented towards a stationary reference object, for instance on the ground, which is known to be stationary. It is necessary to calibrate the reference camera by simultaneously measuring the scene images and the reference images at times when it is known that the scene object is stationary and the camera is moving. These data are used to map camera-movement data to apparent scene-movement data in pixel space, and they are subsequently used to remove the camera movement from the scene measurements.
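The calibrate-then-subtract procedure above can be sketched with a linear model: frames where the scene object is known to be stationary are used to fit a map from reference-camera pixel displacement to apparent scene displacement, which is then subtracted from later measurements. The least-squares form and function names below are illustrative assumptions; the true mapping could be nonlinear depending on the geometry.

```python
import numpy as np

def fit_motion_map(ref_disp_cal, scene_disp_cal):
    """Fit a linear map from reference-camera displacement to apparent scene
    displacement, using calibration frames in which the scene object is
    stationary (so all apparent scene motion is camera motion).

    ref_disp_cal, scene_disp_cal: (N, 2) pixel displacements per frame.
    Returns a 2x2 matrix A such that scene ~= ref @ A.
    """
    A, _, _, _ = np.linalg.lstsq(ref_disp_cal, scene_disp_cal, rcond=None)
    return A

def remove_camera_motion(ref_disp, scene_disp, A):
    """Subtract the camera-induced apparent motion from scene measurements."""
    return scene_disp - ref_disp @ A
```

With noise-free synthetic data the fit recovers the map exactly and the corrected measurements equal the true object motion; real data would leave residuals set by tracking noise.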

  18. Content-adaptive high-resolution hyperspectral video acquisition with a hybrid camera system.

    PubMed

    Ma, Chenguang; Cao, Xun; Wu, Rihui; Dai, Qionghai

    2014-02-15

    We present a hybrid camera system that combines optical design with computational processing to achieve content-adaptive high-resolution hyperspectral video acquisition. In particular, we record two video streams: one high-spatial-resolution RGB video and one low-spatial-resolution hyperspectral video in which the recorded points are dynamically selected using a spatial light modulator (SLM). Then, through video-frame registration and a spatio-temporal spreading of the co-located spectral/RGB information, video with high spatial and spectral resolution is produced. The sampling patterns on the SLM are generated on the fly according to the scene content, which fully exploits the self-adaptivity of the hybrid camera system. With an experimental prototype, we demonstrate significantly improved accuracy and efficiency compared to the state of the art. PMID:24562246

  19. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  20. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Astrophysics Data System (ADS)

    Joiner, R. K.; Kobayashi, K.; Winebarger, A. R.; Champey, P. R.

    2014-12-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument which is currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6 nm) light emitted by the Sun's chromosphere to make measurements of the magnetic field in this region. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e-/DN, a noise level less than 25 e-, a dark current level of less than 10 e-/pixel/s, and a residual non-linearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.

  1. Performance Characterization of the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) CCD Cameras

    NASA Technical Reports Server (NTRS)

    Joiner, Reyann; Kobayashi, Ken; Winebarger, Amy; Champey, Patrick

    2014-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding rocket instrument which is currently being developed by NASA's Marshall Space Flight Center (MSFC) and the National Astronomical Observatory of Japan (NAOJ). The goal of this instrument is to observe and detect the Hanle effect in the scattered Lyman-Alpha UV (121.6nm) light emitted by the Sun's Chromosphere to make measurements of the magnetic field in this region. In order to make accurate measurements of this effect, the performance characteristics of the three on-board charge-coupled devices (CCDs) must meet certain requirements. These characteristics include: quantum efficiency, gain, dark current, noise, and linearity. Each of these must meet predetermined requirements in order to achieve satisfactory performance for the mission. The cameras must be able to operate with a gain of no greater than 2 e(-)/DN, a noise level less than 25e(-), a dark current level which is less than 10e(-)/pixel/s, and a residual nonlinearity of less than 1%. Determining these characteristics involves performing a series of tests with each of the cameras in a high vacuum environment. Here we present the methods and results of each of these performance tests for the CLASP flight cameras.
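Camera gain in e-/DN is commonly estimated with the photon transfer method: shot noise is Poissonian in electrons, so the variance of a flat-field signal in DN scales as mean/gain. A textbook sketch of that estimator is below; it is not the CLASP test procedure, whose details the abstract does not give.

```python
import numpy as np

def gain_from_photon_transfer(flat1, flat2):
    """Estimate camera gain (e-/DN) from a pair of identical flat-field frames.

    Shot noise gives var_DN = mean_DN / gain. Differencing two flats cancels
    fixed-pattern noise, leaving var(diff)/2 as the temporal variance.
    Read noise and dark current are neglected in this sketch.
    """
    mean_dn = 0.5 * (flat1.mean() + flat2.mean())
    var_dn = np.var(flat1.astype(float) - flat2.astype(float)) / 2.0
    return mean_dn / var_dn
```

A full photon transfer curve repeats this over many illumination levels and fits the linear region, which also yields the read-noise floor and full-well capacity.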

  2. The Study of Timing Relationships Which Arise When Using a Television CCD Camera Watec WAT-902H2 Supreme in Astronomical Investigations

    NASA Astrophysics Data System (ADS)

    Dragomiretsky, V. V.; Ryabov, A. V.; Koshkin, N. I.

    The present paper describes the study of the timing relationships which arise when using an analogue CCD camera, the Watec WAT-902H2 Supreme, in astronomical investigations, particularly in time-domain measurements of LEO satellites, which move fast against the stellar background.

  3. Lights! Camera! Action! Handling Your First Video Assignment.

    ERIC Educational Resources Information Center

    Thomas, Marjorie Bekaert

    1989-01-01

    The author discusses points to consider when hiring and working with a video production company to develop a video for human resources purposes. Questions to ask the consultants are included, as is information on the role of the company liaison and on how to avoid expensive, time-wasting pitfalls. (CH)

  4. Lights, Cameras, Pencils! Using Descriptive Video to Enhance Writing

    ERIC Educational Resources Information Center

    Hoffner, Helen; Baker, Eileen; Quinn, Kathleen Benson

    2008-01-01

    Students of various ages and abilities can increase their comprehension and build vocabulary with the help of a new technology, Descriptive Video. Descriptive Video (also known as described programming) was developed to give individuals with visual impairments access to visual media such as television programs and films. Described programs,…

  5. Feasibility study of transmission of OTV camera control information in the video vertical blanking interval

    NASA Technical Reports Server (NTRS)

    White, Preston A., III

    1994-01-01

    The Operational Television system at Kennedy Space Center operates hundreds of video cameras, many remotely controllable, in support of the operations at the center. This study was undertaken to determine if commercial NABTS (North American Basic Teletext System) teletext transmission in the vertical blanking interval of the genlock signals distributed to the cameras could be used to send remote control commands to the cameras and the associated pan and tilt platforms. Wavelength division multiplexed fiberoptic links are being installed in the OTV system to obtain RS-250 short-haul quality. It was demonstrated that the NABTS transmission could be sent over the fiberoptic cable plant without excessive video quality degradation and that video cameras could be controlled using NABTS transmissions over multimode fiberoptic paths as long as 1.2 km.

  6. Utilization of an Electron Multiplying CCD camera for applications in quantum information processing

    NASA Astrophysics Data System (ADS)

    Patel, Monika; Chen, Jian; Habif, Jonathan

    2013-03-01

    Electron Multiplying Charge-Coupled Device (EMCCD) cameras utilize an on-chip amplification process which boosts low-light signals above the readout noise floor. Although traditionally used for biological imaging, they have recently attracted interest for single-photon counting and entangled-state characterization in quantum information processing applications. In addition, they exhibit some photon-number-resolving capacity, which is attractive for several applications in optical continuous-variable computing, such as building a cubic phase gate. We characterize the Andor Luca-R EMCCD camera as an affordable tool for applications in optical quantum information. We present measurements of single-photon detection efficiency, dark-count probability, and photon-number-resolving capacity, and place quantitative bounds on the noise performance and detection efficiency of the EMCCD detector array. We find that the readout noise floor is a Gaussian distribution centered at 500 counts/pixel/frame at a high EM gain setting. We also characterize the trade-off between quantum efficiency and detector dark-count probability.
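Photon counting over a Gaussian readout-noise floor like the one characterized above amounts to choosing a count threshold: raise it until the probability that a dark pixel exceeds it falls below a target false-count rate. The sketch below assumes a purely Gaussian floor (ignoring EMCCD clock-induced charge) and uses the 500 counts/pixel/frame figure only as the example mean; the sigma and target rate are illustrative.

```python
import math

def threshold_for_dark_rate(noise_mean, noise_sigma, max_dark_prob):
    """Lowest count threshold T with a dark-exceedance probability below target.

    P(X > T) = 0.5 * erfc((T - mean) / (sigma * sqrt(2))) for Gaussian noise;
    we invert it with a simple search over integer thresholds.
    """
    t = int(noise_mean)
    while 0.5 * math.erfc((t - noise_mean) / (noise_sigma * math.sqrt(2))) > max_dark_prob:
        t += 1
    return t
```

This trade-off mirrors the quantum-efficiency-versus-dark-count-probability trade-off mentioned in the abstract: a higher threshold suppresses false counts but also discards real single-photon events near the noise floor.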

  7. Arbitrary viewpoint video synthesis from multiple uncalibrated cameras.

    PubMed

    Yaguchi, Satoshi; Saito, Hideo

    2004-02-01

    We propose a method for arbitrary view synthesis from an uncalibrated multiple-camera system, targeting large spaces such as soccer stadiums. In Projective Grid Space (PGS), a three-dimensional space defined by the epipolar geometry between two basis cameras in the camera system, we reconstruct three-dimensional shape models from silhouette images. Using the three-dimensional shape models reconstructed in the PGS, we obtain a dense map of point correspondences between reference images. The obtained correspondences allow synthesis of images from arbitrary viewpoints between the reference images. We also propose a method for merging the synthesized images with a virtual background scene in the PGS. We apply the proposed methods to image sequences taken by a multiple-camera system installed in a large concert hall. The synthesized image sequences of the virtual camera have sufficient quality to demonstrate the effectiveness of the proposed method. PMID:15369084

  8. ON RELATIVISTIC DISK SPECTROSCOPY IN COMPACT OBJECTS WITH X-RAY CCD CAMERAS

    SciTech Connect

    Miller, J. M.; Cackett, E. M.; D'Ai, A.; Bautz, M. W.; Nowak, M. A.; Bhattacharyya, S.; Burrows, D. N.; Kennea, J.; Fabian, A. C.; Reis, R. C.; Freyberg, M. J.; Haberl, F.; Strohmayer, T. E.; Tsujimoto, M.

    2010-12-01

    X-ray charge-coupled devices (CCDs) are the workhorse detectors of modern X-ray astronomy. Typically covering the 0.3-10.0 keV energy range, CCDs are able to detect photoelectric absorption edges and K shell lines from most abundant metals. New CCDs also offer resolutions of 30-50 (E/{Delta}E), which is sufficient to detect lines in hot plasmas and to resolve many lines shaped by dynamical processes in accretion flows. The spectral capabilities of X-ray CCDs have been particularly important in detecting relativistic emission lines from the inner disks around accreting neutron stars and black holes. One drawback of X-ray CCDs is that spectra can be distorted by photon 'pile-up', wherein two or more photons may be registered as a single event during one frame time. We have conducted a large number of simulations using a statistical model of photon pile-up to assess its impacts on relativistic disk line and continuum spectra from stellar-mass black holes and neutron stars. The simulations cover the range of current X-ray CCD spectrometers and operational modes typically used to observe neutron stars and black holes in X-ray binaries. Our results suggest that severe photon pile-up acts to falsely narrow emission lines, leading to falsely large disk radii and falsely low spin values. In contrast, our simulations suggest that disk continua affected by severe pile-up are measured to have falsely low flux values, leading to falsely small radii and falsely high spin values. The results of these simulations and existing data appear to suggest that relativistic disk spectroscopy is generally robust against pile-up when this effect is modest.
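
The pile-up effect described above can be illustrated with a toy Monte Carlo. This is a simplified stand-in for the authors' statistical model, assuming a monochromatic input line; the 6.4 keV line energy and the rates are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def pileup_events(rate_per_frame, n_frames=200_000, line_energy=6.4):
    """Toy pile-up model: all photons arriving at one pixel within one frame time
    are read out as a single event whose energy is the sum of the individual
    photon energies. The line energy (keV) is a hypothetical monochromatic input."""
    counts = rng.poisson(rate_per_frame, size=n_frames)
    # A piled-up event built from k photons is registered with energy k * E.
    return counts[counts > 0] * line_energy

faint = pileup_events(0.01)   # low rate: pile-up is negligible
bright = pileup_events(1.0)   # high rate: severe pile-up

frac_piled = np.mean(bright > 6.4)   # fraction of events built from >= 2 photons
print(round(float(frac_piled), 3))
```

At one photon per pixel per frame, roughly 40% of registered events are piled-up composites at the wrong energy, which is why the spectral distortions the abstract describes become severe for bright sources.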

  9. Evaluation of imaging performance of a taper optics CCD 'FReLoN' camera designed for medical imaging.

    PubMed

    Coan, Paola; Peterzol, Angela; Fiedler, Stefan; Ponchut, Cyril; Labiche, Jean Claude; Bravin, Alberto

    2006-05-01

    The purpose of this work was to assess the imaging performance of an indirect conversion detector (taper optics CCD 'FReLoN' camera) in terms of the modulation transfer function (MTF), normalized noise power spectrum (NNPS) and detective quantum efficiency (DQE). Measurements were made with a synchrotron radiation laminar beam at various monochromatic energies in the 20-51.5 keV range for a gadolinium-based fluorescent screen varying in thickness; data acquisition and analysis were made by adapting to this beam geometry protocols used for conventional cone beams. The pre-sampled MTFs of the systems were measured using an edge method. The NNPS of the systems were determined for a range of exposure levels by two-dimensional Fourier analysis of uniformly exposed radiographs. The DQEs were assessed from the measured MTF, NNPS, exposure and incoming number of photons. The MTF, for a given screen, was found to be almost energy independent and, for a given energy, higher for the thinnest screen. At 33 keV and for the 40 (100) microm screen, at 10% the MTF is 9.2 (8.6) line-pairs mm(-1). The NNPS was found to be different in the two analyzed directions in relation to frequency. Highest DQE values were found for the combination 100 microm and 25 keV (0.5); it was still equal to 0.4 at 51.5 keV (above the gadolinium K-edge). The DQE is limited by the phosphor screen conversion yield and by the CCD efficiency. At the end of the manuscript the results of the FReLoN characterization and those from a selected number of detectors presented in the literature are compared. PMID:16645252

  10. A unified framework for capturing facial images in video surveillance systems using cooperative camera system

    NASA Astrophysics Data System (ADS)

    Chan, Fai; Moon, Yiu-Sang; Chen, Jiansheng; Ma, Yiu-Kwan; Tsang, Wai-Hung; Fu, Kah-Kuen

    2008-04-01

    Low resolution and un-sharp facial images are always captured from surveillance videos because of long human-camera distance and human movements. Previous works addressed this problem by using an active camera to capture close-up facial images without considering human movements and mechanical delays of the active camera. In this paper, we proposed a unified framework to capture facial images in video surveillance systems by using one static and active camera in a cooperative manner. Human faces are first located by a skin-color based real-time face detection algorithm. A stereo camera model is also employed to approximate human face location and his/her velocity with respect to the active camera. Given the mechanical delays of the active camera, the position of a target face with a given delay can be estimated using a Human-Camera Synchronization Model. By controlling the active camera with corresponding amount of pan, tilt, and zoom, a clear close-up facial image of a moving human can be captured then. We built the proposed system in an 8.4-meter indoor corridor. Results show that the proposed stereo camera configuration can locate faces with average error of 3%. In addition, it is capable of capturing facial images of a walking human clearly in first instance in 90% of the test cases.
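
The delay-compensation idea behind this approach can be sketched as follows. The function name and all numbers are assumptions for illustration, not the authors' model: advance the target's lateral position by its velocity times the known mechanical delay, then convert the offset into a pan command.

```python
import math

def predicted_pan_angle(x_m, v_mps, dist_m, delay_s):
    """Predict where to point the active camera: advance the target's lateral
    position (m) by its velocity (m/s) times the mechanical delay (s), then
    convert the lateral offset at the given distance into a pan angle (degrees)."""
    x_future = x_m + v_mps * delay_s
    return math.degrees(math.atan2(x_future, dist_m))

# A face 1 m to the side at 4 m range, walking at 1.2 m/s, with a 0.5 s camera delay:
print(round(predicted_pan_angle(1.0, 1.2, 4.0, 0.5), 1))
```

Without the velocity term the camera would aim at where the face was when the command was issued, and the mechanical delay would leave the subject outside the zoomed-in field of view.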

  11. Experimental Comparison of the High-Speed Imaging Performance of an EM-CCD and sCMOS Camera in a Dynamic Live-Cell Imaging Test Case

    PubMed Central

    Beier, Hope T.; Ibey, Bennett L.

    2014-01-01

    The study of living cells may require advanced imaging techniques to track weak and rapidly changing signals. Fundamental to this need is the recent advancement in camera technology. Two camera types, specifically sCMOS and EM-CCD, promise both high signal-to-noise and high speed (>100 fps), leaving researchers with a critical decision when determining the best technology for their application. In this article, we compare two cameras using a live-cell imaging test case in which small changes in cellular fluorescence must be rapidly detected with high spatial resolution. The EM-CCD maintained an advantage of being able to acquire discernible images with a lower number of photons due to its EM-enhancement. However, if high-resolution images at speeds approaching or exceeding 1000 fps are desired, the flexibility of the full-frame imaging capabilities of sCMOS is superior. PMID:24404178
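
The trade-off the study describes can be made concrete with a standard SNR model. This is a textbook-style sketch, not the authors' analysis; the quantum efficiencies, read noise values, gain, and excess noise factor below are illustrative assumptions:

```python
import math

def snr_emccd(photons, qe=0.9, read_noise=30.0, em_gain=300.0, excess=math.sqrt(2)):
    """EM-CCD: EM gain suppresses effective read noise, but the stochastic gain
    multiplies shot noise by an excess noise factor (~sqrt(2) at high gain).
    All parameter values are illustrative, not measured camera specs."""
    s = qe * photons
    return s / math.sqrt((excess ** 2) * s + (read_noise / em_gain) ** 2)

def snr_scmos(photons, qe=0.8, read_noise=1.5):
    """sCMOS: no multiplication noise, but read noise enters at full strength."""
    s = qe * photons
    return s / math.sqrt(s + read_noise ** 2)

# The crossover: EM-CCD wins at very low light, sCMOS at higher signal levels.
print(snr_emccd(2) > snr_scmos(2), snr_emccd(500) < snr_scmos(500))
```

This matches the article's conclusion: the EM-enhancement pays off when only a handful of photons reach each pixel, while at higher signal levels the excess noise factor hands the advantage to sCMOS, whose full-frame readout also supports the higher frame rates.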

  12. The Importance of Camera Calibration and Distortion Correction to Obtain Measurements with Video Surveillance Systems

    NASA Astrophysics Data System (ADS)

    Cattaneo, C.; Mainetti, G.; Sala, R.

    2015-11-01

    Video surveillance systems are commonly used as important sources of information, and a large amount of metric information can also be obtained from the acquired images. However, several methodological issues must be considered in order to perform accurate measurements using images. The most important one is camera calibration, the estimation of the parameters defining the camera model. One of the most widely used calibration methods is Zhang's method, which estimates the linear parameters of the camera model. This method is popular because it requires a simple setup and allows cameras to be calibrated with a simple and fast procedure, but it does not consider lens distortions, which must be taken into account with the short focal lengths commonly used in video surveillance systems. In order to perform accurate measurements, the linear camera model and Zhang's method are extended to take nonlinear parameters into account and compensate for the distortion contribution. In this paper we first describe the pinhole camera model, which treats cameras as central projection systems. After a brief introduction to the camera calibration process and in particular Zhang's method, we describe the different types of lens distortion and the techniques used for distortion compensation. Finally, some numerical examples are shown to demonstrate the importance of distortion compensation for obtaining accurate measurements.
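
The distortion compensation step can be sketched with the first two radial terms of the common Brown-Conrady model. This is a generic sketch, not the paper's implementation; the coefficients and points below are hypothetical, and a few fixed-point iterations invert the forward model x_d = x_u (1 + k1 r^2 + k2 r^4):

```python
import numpy as np

def undistort_points(xy, k1, k2):
    """Compensate radial distortion using the first two Brown-Conrady coefficients.
    Points are in normalized image coordinates (origin at the principal point).
    Fixed-point iteration inverts x_d = x_u * (1 + k1*r^2 + k2*r^4)."""
    xy = np.asarray(xy, dtype=float)
    und = xy.copy()
    for _ in range(10):   # converges quickly for mild distortion
        r2 = np.sum(und ** 2, axis=1, keepdims=True)
        und = xy / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return und

# Hypothetical coefficients for a short-focal-length surveillance lens:
k1, k2 = -0.3, 0.08
distorted = np.array([[0.4, 0.2], [0.1, -0.5]])
corrected = undistort_points(distorted, k1, k2)
```

After correction, straight world lines map to straight image lines again, which is the precondition for using the linear pinhole model to take metric measurements from the footage.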

  13. Fast-neutron radiation effects in a silica-core optical fiber studied by a CCD-camera spectrometer

    SciTech Connect

    Griscom, D. L.; Gingerich, M. E.; Friebele, E. J.; Putnam, M.; Unruh, W.

    1994-02-20

    A simple CCD-camera spectrometer was deployed at the Los Alamos Spallation Radiation Effects Facility to characterize fast-neutron irradiation effects in several silica-based optical fibers over the wavelength range ~450-1100 nm. The experimental arrangement allowed optical loss spectra to be developed from remotely recovered frame grabs at various times during irradiation without it being necessary to resort to cutback methods. Data recorded for a pure-silica-core/F-doped-silica-clad fiber displayed a peculiar artifact, which is described and mathematically modeled in terms of leaky modes propagating in an optical cladding that is substantially less susceptible to radiation-induced optical attenuation than is the core. Evidence from optical time-domain reflectometry supports the postulate that mode leakage into the cladding may be a result of light scattering from the tracks of ions displaced by the 14-MeV neutrons. These results suggest that fibers with fluorine doping in the core, as well as in the cladding, would be relatively resistant to radiation-induced attenuation in the UV-visible spectral region.
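
The cutback-free loss measurement works because the same fiber serves as its own reference: induced attenuation follows from the ratio of transmitted intensity before and during irradiation at each wavelength. A minimal sketch, with hypothetical frame-grab intensities:

```python
import math

def induced_loss_db_per_m(i_before, i_during, fiber_length_m):
    """Radiation-induced attenuation without cutback: compare transmitted
    intensity at one wavelength before and during irradiation, using the
    same fiber as its own reference."""
    return 10.0 * math.log10(i_before / i_during) / fiber_length_m

# Hypothetical frame-grab intensities (arbitrary ADC units) for a 10 m sample:
print(round(induced_loss_db_per_m(2000.0, 500.0, 10.0), 3))
```

Applied pixel-by-pixel across the dispersed spectrum on the CCD, this yields a full induced-loss spectrum from each remotely recovered frame grab.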

  14. Performance of a fluorescent screen and CCD camera as a two-dimensional dosimetry system for dynamic treatment techniques.

    PubMed

    Boon, S N; van Luijk, P; Böhringer, T; Coray, A; Lomax, A; Pedroni, E; Schaffner, B; Schippers, J M

    2000-10-01

    A two-dimensional position-sensitive dosimetry system has been tested for different dosimetric applications in a radiation therapy facility with a scanning proton beam. The system consists of a scintillating (fluorescent) screen, mounted at the beam-exit side of a phantom and observed by a charge-coupled device (CCD) camera. The observed light distribution at the screen is equivalent to the two-dimensional (2D) dose distribution at the screen position. The dosimetric properties of the system, measured in a scanning proton beam, were found to be equal to those measured in a proton beam broadened by a scattering system. Measurements of the transversal dose distribution of a single pencil beam are consistent with dose measurements as well as with dose calculations in clinically relevant fields made with multiple pencil beams. Measurements of inhomogeneous dose distributions have been shown to be sufficiently accurate for the verification of dose calculation algorithms. The good sensitivity and sub-mm spatial resolution of the system allow for the detection of deviations of a few percent from the expected (intended or calculated) dose distribution. Its dosimetric properties and the immediate availability of the data make this device a useful tool in the quality control of scanning proton beams. PMID:11099186

  15. Spatial resolution limit study of a CCD camera and scintillator based neutron imaging system according to MTF determination and analysis.

    PubMed

    Kharfi, F; Denden, O; Bourenane, A; Bitam, T; Ali, A

    2012-01-01

    Spatial resolution limit is a very important parameter of an imaging system that should be taken into consideration before examination of any object. The objective of this work is to determine a neutron imaging system's response in terms of spatial resolution. The proposed procedure is based on establishing the Modulation Transfer Function (MTF). The imaging system under study is based on a high-sensitivity CCD neutron camera (2×10(-5) lx at f/1.4). The neutron beam used is from the horizontal beam port (H.6) of the Algerian Es-Salam research reactor. Our contribution to the MTF determination is an accurate edge identification method and a procedure for resolving the line spread function undersampling problem. These methods and procedures are integrated into a MATLAB code. The methods, procedures and approaches proposed in this work are applicable to any other neutron imaging system and allow one to judge the ability of a neutron imaging system to resolve the spatial (internal detail) properties of any object under examination. PMID:22014891

  16. Video geographic information system using mobile mapping in mobilephone camera

    NASA Astrophysics Data System (ADS)

    Kang, Jinsuk; Lee, Jae-Joon

    2013-12-01

    The aim of this paper is to develop core technologies such as automatic shape extraction from images (video), spatio-temporal data processing, and efficient modeling, and thereby make it inexpensive and fast to build and process huge 3D geographic data sets. Upgrading and maintaining the technologies is also easy thanks to the component-based system architecture. We therefore designed and implemented a video mobile GIS using a real-time database system, consisting of a real-time GIS engine, a middleware layer, and a mobile client.

  17. A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network

    NASA Astrophysics Data System (ADS)

    Li, Yiming; Bhanu, Bir

    Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and hand-off this object from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparison using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios of a large number of cameras and objects for in-depth understanding of strengths and weaknesses of these techniques.
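
One simple stand-in for the camera-selection criteria the chapter compares can be sketched as a utility maximization: each camera scores a tracked object by proximity and by how close the object sits to the camera's viewing axis, and the object is handed off to the highest-scoring camera. The scoring function and the scene layout below are hypothetical, not taken from the chapter:

```python
import math

def select_camera(obj_xy, cameras):
    """Hand off the object to the camera with the highest utility, where
    utility rewards proximity and penalizes off-axis viewing angles."""
    def utility(cam):
        dx, dy = obj_xy[0] - cam["x"], obj_xy[1] - cam["y"]
        dist = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx)
        # Smallest signed angle between the camera's pan direction and the target.
        off_axis = abs((bearing - cam["pan"] + math.pi) % (2 * math.pi) - math.pi)
        return 1.0 / (1.0 + dist) * max(0.0, math.cos(off_axis))
    return max(cameras, key=utility)["name"]

# Two hypothetical cameras facing each other along a 10 m corridor:
cams = [
    {"name": "cam_A", "x": 0.0, "y": 0.0, "pan": 0.0},
    {"name": "cam_B", "x": 10.0, "y": 0.0, "pan": math.pi},
]
print(select_camera((8.0, 1.0), cams))
```

The geometry-, statistics-, and game-theory-based approaches in the chapter differ mainly in how this utility is defined and negotiated among cameras, not in the basic select-and-hand-off loop.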

  18. BOREAS RSS-3 Imagery and Snapshots from a Helicopter-Mounted Video Camera

    NASA Technical Reports Server (NTRS)

    Walthall, Charles L.; Loechel, Sara; Nickeson, Jaime (Editor); Hall, Forrest G. (Editor)

    2000-01-01

    The BOREAS RSS-3 team collected helicopter-based video coverage of forested sites acquired during BOREAS as well as single-frame "snapshots" processed to still images. Helicopter data used in this analysis were collected during all three 1994 IFCs (24-May to 16-Jun, 19-Jul to 10-Aug, and 30-Aug to 19-Sep), at numerous tower and auxiliary sites in both the NSA and the SSA. The VHS-camera observations correspond to other coincident helicopter measurements. The field of view of the camera is unknown. The video tapes are in both VHS and Beta format. The still images are stored in JPEG format.

  19. Camera/Video Phones in Schools: Law and Practice

    ERIC Educational Resources Information Center

    Parry, Gareth

    2005-01-01

    The emergence of mobile phones with built-in digital cameras is creating legal and ethical concerns for school systems throughout the world. Users of such phones can instantly email, print or post pictures to other MMS1 phones or websites. Local authorities and schools in Britain, Europe, USA, Canada, Australia and elsewhere have introduced

  20. Passive millimeter-wave video camera for aviation applications

    NASA Astrophysics Data System (ADS)

    Fornaca, Steven W.; Shoucri, Merit; Yujiri, Larry

    1998-07-01

    Passive Millimeter Wave (PMMW) imaging technology offers significant safety benefits to world aviation. Made possible by recent technological breakthroughs, PMMW imaging sensors provide visual-like images of objects under low visibility conditions (e.g., fog, clouds, snow, sandstorms, and smoke) which blind visual and infrared sensors. TRW has developed an advanced, demonstrator version of a PMMW imaging camera that, when front-mounted on an aircraft, gives images of the forward scene at a rate and quality sufficient to enhance aircrew vision and situational awareness under low visibility conditions. Potential aviation uses for a PMMW camera are numerous and include: (1) Enhanced vision for autonomous take-off, landing, and surface operations in Category III weather on Category I and non-precision runways; (2) Enhanced situational awareness during initial and final approach, including Controlled Flight Into Terrain (CFIT) mitigation; (3) Ground traffic control in low visibility; (4) Enhanced airport security. TRW leads a consortium which began flight tests with the demonstration PMMW camera in September 1997. Flight testing will continue in 1998. We discuss the characteristics of PMMW images, the current state of the technology, the integration of the camera with other flight avionics to form an enhanced vision system, and other aviation applications.

  1. Temperature monitoring of Nd:YAG laser cladding (CW and PP) by advanced pyrometry and CCD-camera-based diagnostic tool

    NASA Astrophysics Data System (ADS)

    Doubenskaia, M.; Bertrand, Ph.; Smurov, Igor Y.

    2004-04-01

    A set of original pyrometers and a special diagnostic CCD camera were applied to the monitoring of Nd:YAG laser cladding (pulsed-periodic and continuous wave) with coaxial powder injection and to on-line measurement of the cladded layer temperature. The experiments were carried out in the course of developing wear-resistant coatings from various powder blends (WC-Co, CuSn, Mo, Stellite grade 12, etc.), varying process parameters such as laser power, cladding velocity, and powder feeding rate. Surface temperature distributions along the cladding seam and overall temperature maps were registered. The CCD-camera-based diagnostic tool was applied for: (1) monitoring the flux of hot particles and its instability; (2) measuring particle-in-flight size and velocity; (3) monitoring particle collision with the clad in the interaction zone.

  2. Lights! Camera! Action!: video projects in the classroom.

    PubMed

    Epstein, Carol Diane; Hovancsek, Marcella T; Dolan, Pamela L; Durner, Erin; La Rocco, Nicole; Preiszig, Patricia; Winnen, Caitlin

    2003-12-01

    We report on two classroom video projects intended to promote active student involvement in their classroom experience during a year-long medical-surgical nursing course. We implemented two types of projects, Nursing Grand Rounds and FPBTV. The projects are templates that can be applied to any nursing specialty and can be implemented without the use of video technology. During the course of several years, both projects have proven effective in encouraging students to promote pattern recognition of characteristic features of common illnesses, to develop teamwork strategies, and to practice their presentation skills in a safe environment among their peers. The projects appealed to students because they increased retention of information and immersed students in the experience of becoming experts about an illness or a family of medications. These projects have enabled students to become engaged and invested in their own learning in the classroom. PMID:14694997

  3. Laser Imaging Video Camera Sees Through Fire, Fog, Smoke

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.

  4. Observation of hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Tamaki, K.; Scientific Team Of Yk09-13 Leg 1

    2010-12-01

    Evaluating hydrothermal discharge and its diffusion process along the ocean ridge is necessary for understanding the balance of mass and flux in the ocean, the ecosystem around hydrothermal fields, and so on. However, it has been difficult to measure hydrothermal activity without the disturbance caused by the observation platform (submersible, ROV, AUV). We therefore wanted an observational method that captures hydrothermal discharge behavior as it is. DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and refresh rate that it can substitute for optical systems in turbid or dark water where optical systems fail. DIDSON operates at two frequencies, 1.8 MHz or 1.1 MHz, forming 96 beams spaced 0.3° apart or 48 beams spaced 0.6° apart, respectively. It images out to 12 m at 1.8 MHz and 40 m at 1.1 MHz. The transmit and receive beams are formed with acoustic lenses with rectangular apertures, made of polymethylpentene plastic and FC-70 liquid. This physical beam forming allows DIDSON to consume only 30 W of power. DIDSON updates its image at between 20 and 1 frames/s depending on the operating frequency and the maximum range imaged, and communicates with its host over Ethernet. The Institute of Industrial Science, University of Tokyo (IIS) recognized DIDSON's superior performance and has sought new ways to utilize it. Observation systems that IIS has developed based on DIDSON include a waterside surveillance system, an automatic fish-length measurement system, an automatic fish-counting system, and a diagnosis system for the deterioration of underwater structures. The next challenge is to develop a DIDSON-based method for observing hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as details inside the plume.
In October 2009, we conducted seafloor reconnaissance using the manned deep-sea submersible Shinkai 6500 in the Central Indian Ridge at 18-20°S, where hydrothermal plume signatures had previously been detected. DIDSON was mounted on top of Shinkai 6500 in order to obtain acoustic video images of hydrothermal plumes. Seven dives were conducted in this cruise, and acoustic video images of the hydrothermal plumes were captured in three of them. These are among the few acoustic video images of hydrothermal plumes obtained to date. Processing and analysis of the acoustic video image data are ongoing. We will report an overview of the acoustic video images of the hydrothermal plumes and discuss the possibility of DIDSON as an observation tool for seafloor hydrothermal activity.

  5. Procurement documentation for the beam characterization subsystem: pyrheliometer, digitizer, MODACS III, video camera. RADL item No. 3-4

    SciTech Connect

    Not Available

    1980-02-01

    Procurement documentation is given describing the basic features of the parts of a Beam Characterization System of a solar central receiver facility. Parts include pyrheliometers, a digitizer, components of the data acquisition and control system, and video cameras. The specification establishing the performance and acceptance requirements for the video cameras is given. (LEW)

  6. Nyquist Sampling Theorem: Understanding the Illusion of a Spinning Wheel Captured with a Video Camera

    ERIC Educational Resources Information Center

    Levesque, Luc

    2014-01-01

    Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…
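
The wagon-wheel illusion the article analyzes follows directly from frequency folding: a camera sampling at a fixed frame rate aliases any rotation whose spoke-pattern frequency exceeds half the frame rate. A minimal sketch of the folding arithmetic (the example numbers are illustrative):

```python
def apparent_rotation_hz(true_hz, frame_rate_hz, n_spokes=1):
    """Aliased rotation frequency of a spoked wheel filmed at a given frame rate:
    the spoke pattern repeats n_spokes times per turn, and sampling folds the
    pattern frequency into the range [-frame_rate/2, +frame_rate/2)."""
    pattern_hz = true_hz * n_spokes
    folded = (pattern_hz + frame_rate_hz / 2) % frame_rate_hz - frame_rate_hz / 2
    return folded / n_spokes

# A wheel spinning at 27 Hz filmed at 30 fps appears to rotate backwards at 3 Hz:
print(apparent_rotation_hz(27.0, 30.0))
```

A negative result means the wheel appears to spin backwards, and a rotation rate equal to the frame rate appears stationary, exactly the anomalies the Nyquist criterion predicts when sampling is too slow.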

  7. STS-29 Discovery, OV-103, MS Bagian uses video camera on forward flight deck

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Mission Specialist James P. Bagian points video camera out forward flight deck window W2 while freefloating above commanders station seat and controls. An unsecured seat belt drifts below Bagian's elbows. Bagian films Earth's surface while onboard Discovery, Orbiter Vehicle (OV) 103, during Mission STS-29.

  9. Digital Video Cameras for Brainstorming and Outlining: The Process and Potential

    ERIC Educational Resources Information Center

    Unger, John A.; Scullion, Vicki A.

    2013-01-01

    This "Voices from the Field" paper presents methods and participant-exemplar data for integrating digital video cameras into the writing process across postsecondary literacy contexts. The methods and participant data are part of an ongoing action-based research project systematically designed to bring research and theory into practice…

  10. Video content analysis on body-worn cameras for retrospective investigation

    NASA Astrophysics Data System (ADS)

    Bouma, Henri; Baan, Jan; ter Haar, Frank B.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Burghouts, Gertjan J.; Wijn, Remco; van den Broek, Sebastiaan P.; van Rest, Jeroen H. C.

    2015-10-01

    In the security domain, cameras are important to assess critical situations. Apart from fixed surveillance cameras we observe an increasing number of sensors on mobile platforms, such as drones, vehicles and persons. Mobile cameras allow rapid and local deployment, enabling many novel applications and effects, such as the reduction of violence between police and citizens. However, the increased use of bodycams also creates potential challenges. For example: how can end-users extract information from the abundance of video, how can the information be presented, and how can an officer retrieve information efficiently? Nevertheless, such video gives the opportunity to stimulate the professionals' memory, and support complete and accurate reporting. In this paper, we show how video content analysis (VCA) can address these challenges and seize these opportunities. To this end, we focus on methods for creating a complete summary of the video, which allows quick retrieval of relevant fragments. The content analysis for summarization consists of several components, such as stabilization, scene selection, motion estimation, localization, pedestrian tracking and action recognition in the video from a bodycam. The different components and visual representations of summaries are presented for retrospective investigation.

  11. Temporal evolution of thermocavitation bubbles using high speed video camera

    NASA Astrophysics Data System (ADS)

    Padilla-Martinez, J. P.; Aguilar, G.; Ramirez-San-Juan, J. C.; Ramos-García, R.

    2011-10-01

    In this work, we present a novel method of cavitation, thermocavitation, induced by CW low-power laser radiation in a highly absorbing solution of copper nitrate (Cu(NO3)2) dissolved in deionized water. The high absorption coefficient of the solution (α = 135 cm-1) produces an overheated region (~300 °C) followed by an explosive phase transition and, consequently, the formation of an expanding vapor bubble, which later collapses very rapidly, emitting intense acoustic shock waves. We study the dynamic behavior of bubbles formed in contact with a solid interface as a function of laser power using high-speed video recording at rates of ~10^5 fps. The bubble grows regularly without significant modification of its half-hemisphere shape until it reaches its maximum radius, but deforms in the final stage of the collapse, probably due to bubble adhesion to the surface. We also show that the maximum bubble radius and the shock-wave energy scale inversely with the beam intensity.
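
The role of the high absorption coefficient quoted above can be checked with Beer-Lambert arithmetic: α = 135 cm-1 confines the deposited laser energy to a thin layer near the window, which is what makes the localized superheating possible. A small sketch (the 100 µm depth in the example is an arbitrary choice):

```python
import math

def penetration_depth_um(alpha_per_cm):
    """Depth at which the beam intensity falls to 1/e, in micrometers."""
    return 1.0 / alpha_per_cm * 1e4   # cm -> um

def absorbed_fraction(alpha_per_cm, depth_um):
    """Beer-Lambert fraction of the beam absorbed within the first depth_um."""
    return 1.0 - math.exp(-alpha_per_cm * depth_um * 1e-4)

# With alpha = 135 cm^-1 (from the abstract), heating is confined near the window:
print(round(penetration_depth_um(135.0), 1), round(absorbed_fraction(135.0, 100.0), 2))
```

A 1/e depth of roughly 74 µm means about three quarters of the CW beam is absorbed within the first 100 µm of solution, concentrating the heating enough to drive the explosive phase transition.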

  12. Acceptance/operational test procedure 101-AW tank camera purge system and 101-AW video camera system

    SciTech Connect

    Castleberry, J.L.

    1994-09-19

    This procedure will document the satisfactory operation of the 101-AW Tank Camera Purge System (CPS) and the 101-AW Video Camera System. The safety interlock which shuts down all the electronics inside the 101-AW vapor space, during loss of purge pressure, will be in place and tested to ensure reliable performance. This procedure is separated into four sections. Section 6.1 is performed in the 306 building prior to delivery to the 200 East Tank Farms and involves leak checking all fittings on the 101-AW Purge Panel for leakage using a Snoop solution and resolving the leakage. Section 7.1 verifies that PR-1, the regulator which maintains a positive pressure within the volume (cameras and pneumatic lines), is properly set. In addition the green light (PRESSURIZED) (located on the Purge Control Panel) is verified to turn on above 10 in. w.g. and after the time delay (TDR) has timed out. Section 7.2 verifies that the purge cycle functions properly, the red light (PURGE ON) comes on, and that the correct flowrate is obtained to meet the requirements of the National Fire Protection Association. Section 7.3 verifies that the pan and tilt, camera, associated controls and components operate correctly. This section also verifies that the safety interlock system operates correctly during loss of purge pressure. During the loss of purge operation the illumination of the amber light (PURGE FAILED) will be verified.

  13. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Technical Reports Server (NTRS)

    Diner, Daniel B. (Inventor); Venema, Steven C. (Inventor)

    1991-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting element. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinate for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  14. Composite video and graphics display for multiple camera viewing system in robotics and teleoperation

    NASA Astrophysics Data System (ADS)

    Diner, Daniel B.; Venema, Steven C.

    1991-06-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  15. Composite video and graphics display for camera viewing systems in robotics and teleoperation

    NASA Astrophysics Data System (ADS)

    Diner, Daniel B.; Venema, Steven C.

    1993-01-01

    A system for real-time video image display for robotics or remote-vehicle teleoperation is described that has at least one robot arm or remotely operated vehicle controlled by an operator through hand-controllers, and one or more television cameras and optional lighting elements. The system has at least one television monitor for display of a television image from a selected camera and the ability to select one of the cameras for image display. Graphics are generated with icons of cameras and lighting elements for display surrounding the television image to provide the operator information on: the location and orientation of each camera and lighting element; the region of illumination of each lighting element; the viewed region and range of focus of each camera; which camera is currently selected for image display for each monitor; and when the controller coordinates for said robot arms or remotely operated vehicles have been transformed to correspond to coordinates of a selected or nonselected camera.

  16. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras.

    PubMed

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
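    The block-match step the abstract relies on can be illustrated with a minimal exhaustive search that minimises the sum of absolute differences (SAD) between frames. This is a generic sketch, not the authors' implementation; the block size and search window are arbitrary illustrative choices.

```python
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Exhaustive block matching: for each block in `cur`, find the
    displacement (dy, dx) within +/-`search` pixels that minimises the
    SAD against `ref`. Returns a per-block motion-vector field."""
    H, W = cur.shape
    mv = np.zeros((H // block, W // block, 2), dtype=int)
    for by in range(H // block):
        for bx in range(W // block):
            y0, x0 = by * block, bx * block
            blk = cur[y0:y0 + block, x0:x0 + block].astype(int)
            best, best_d = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = y0 + dy, x0 + dx
                    if y < 0 or x < 0 or y + block > H or x + block > W:
                        continue  # candidate block falls outside the frame
                    cand = ref[y:y + block, x:x + block].astype(int)
                    sad = np.abs(blk - cand).sum()
                    if best is None or sad < best:
                        best, best_d = sad, (dy, dx)
            mv[by, bx] = best_d
    return mv
```

    Restricting the search to a small window around each block is what gives the reported ~30% reduction in motion-estimation time relative to an unconstrained search.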

  17. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127

  18. A Refrigerated Web Camera for Photogrammetric Video Measurement inside Biomass Boilers and Combustion Analysis

    PubMed Central

    Porteiro, Jacobo; Riveiro, Belén; Granada, Enrique; Armesto, Julia; Eguía, Pablo; Collazo, Joaquín

    2011-01-01

    This paper describes a prototype instrumentation system for photogrammetric measuring of bed and ash layers, as well as for flying particle detection and pursuit using a single device (CCD) web camera. The system was designed to obtain images of the combustion process in the interior of a domestic boiler. It includes a cooling system, needed because of the high temperatures in the combustion chamber of the boiler. The cooling system was designed using CFD simulations to ensure effectiveness. This method allows more complete and real-time monitoring of the combustion process taking place inside a boiler. The information gained from this system may facilitate the optimisation of boiler processes. PMID:22319349

  19. Hardware-based smart camera for recovering high dynamic range video from multiple exposures

    NASA Astrophysics Data System (ADS)

    Lapray, Pierre-Jean; Heyrman, Barthélémy; Ginhac, Dominique

    2014-10-01

    In many applications such as video surveillance or defect detection, the perception of information related to a scene is limited in areas with strong contrasts. The high dynamic range (HDR) capture technique can deal with these limitations. The proposed method has the advantage of automatically selecting multiple exposure times to make outputs more visible than fixed exposure ones. A real-time hardware implementation of the HDR technique that shows more details both in dark and bright areas of a scene is an important line of research. For this purpose, we built a dedicated smart camera that performs both capturing and HDR video processing from three exposures. What is new in our work is shown through the following points: HDR video capture through multiple exposure control, HDR memory management, HDR frame generation, and representation under a hardware context. Our camera achieves a real-time HDR video output at 60 fps at 1.3 megapixels and demonstrates the efficiency of our technique through an experimental result. Applications of this HDR smart camera include the movie industry, the mass-consumer market, military, automotive industry, and surveillance.
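    The multi-exposure merge at the heart of such a pipeline can be sketched in software as a weighted radiance fusion. This is a simplified illustration assuming a linear sensor response and a triangular "hat" weighting; the paper's real-time FPGA implementation and exposure-control loop are not reproduced here.

```python
import numpy as np

def fuse_hdr(frames, exposure_times):
    """Merge differently exposed uint8 frames (linear response assumed)
    into a relative radiance map, de-emphasising under- and
    over-exposed pixels with a hat weight peaking at mid-grey."""
    acc = np.zeros(frames[0].shape, dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(frames, exposure_times):
        z = img.astype(float) / 255.0
        w = 1.0 - np.abs(2.0 * z - 1.0)   # hat weight: 1 at mid-grey, 0 at extremes
        w = np.maximum(w, 1e-3)           # keep a floor to avoid divide-by-zero
        acc += w * (z / t)                # per-frame radiance estimate
        wsum += w
    return acc / wsum
```

    Pixels saturated in the long exposure contribute almost nothing, so the short exposure recovers bright-area detail, which is the effect the camera demonstrates in hardware at 60 fps.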
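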

  20. Surgical video recording with a modified GoPro Hero 4 camera

    PubMed Central

    Lin, Lily Koo

    2016-01-01

    Background Surgical videography can provide analytical self-examination for the surgeon, teaching opportunities for trainees, and allow for surgical case presentations. This study examined whether a modified GoPro Hero 4 camera with a 25 mm lens could prove to be a cost-effective method of surgical videography with enough detail for oculoplastic and strabismus surgery. Method The stock lens mount and lens were removed from a GoPro Hero 4 camera, and the camera was refitted with a Peau Productions SuperMount and 25 mm lens. The modified GoPro Hero 4 camera was then fixed to an overhead surgical light. Results Camera settings were set to 1080p video resolution. The 25 mm lens allowed for nine times the magnification of the GoPro stock lens. There was no noticeable video distortion. The entire cost was less than 600 USD. Conclusion The adapted GoPro Hero 4 with a 25 mm lens allows for high-definition, cost-effective, portable video capture of oculoplastic and strabismus surgery. The 25 mm lens allows for detailed videography that can enhance surgical teaching and self-examination. PMID:26834455

  1. Video and acoustic camera techniques for studying fish under ice: a review and comparison

    SciTech Connect

    Mueller, Robert P.; Brown, Richard S.; Hop, Haakon H.; Moulton, Larry

    2006-09-05

    Researchers attempting to study the presence, abundance, size, and behavior of fish species in northern and arctic climates during winter face many challenges, including the presence of thick ice cover, snow cover, and, sometimes, extremely low temperatures. This paper describes and compares the use of video and acoustic cameras for determining fish presence and behavior in lakes, rivers, and streams with ice cover. Methods are provided for determining fish density and size, identifying species, and measuring swimming speed and successful applications of previous surveys of fish under the ice are described. These include drilling ice holes, selecting batteries and generators, deploying pan and tilt cameras, and using paired colored lasers to determine fish size and habitat associations. We also discuss use of infrared and white light to enhance image-capturing capabilities, deployment of digital recording systems and time-lapse techniques, and the use of imaging software. Data are presented from initial surveys with video and acoustic cameras in the Sagavanirktok River Delta, Alaska, during late winter 2004. These surveys represent the first known successful application of a dual-frequency identification sonar (DIDSON) acoustic camera under the ice that achieved fish detection and sizing at camera ranges up to 16 m. Feasibility tests of video and acoustic cameras for determining fish size and density at various turbidity levels are also presented. Comparisons are made of the different techniques in terms of suitability for achieving various fisheries research objectives. This information is intended to assist researchers in choosing the equipment that best meets their study needs.

  2. Design and fabrication of a CCD camera for use with relay optics in solar X-ray astronomy

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Configured as a subsystem of a sounding rocket experiment, a camera system was designed to record and transmit an X-ray image focused on a charge coupled device. The camera consists of a X-ray sensitive detector and the electronics for processing and transmitting image data. The design and operation of the camera are described. Schematics are included.

  3. A Novel Method to Reduce Time Investment When Processing Videos from Camera Trap Studies

    PubMed Central

    Swinnen, Kristijn R. R.; Reijniers, Jonas; Breno, Matteo; Leirs, Herwig

    2014-01-01

    Camera traps have proven very useful in ecological, conservation and behavioral research. Camera traps non-invasively record presence and behavior of animals in their natural environment. Since the introduction of digital cameras, large amounts of data can be stored. Unfortunately, processing protocols did not evolve as fast as the technical capabilities of the cameras. We used camera traps to record videos of Eurasian beavers (Castor fiber). However, a large number of recordings did not contain the target species, but instead empty recordings or other species (together non-target recordings), making the removal of these recordings unacceptably time consuming. In this paper we propose a method to partially eliminate non-target recordings without having to watch the recordings, in order to reduce workload. Discrimination between recordings of target species and non-target recordings was based on detecting variation (changes in pixel values from frame to frame) in the recordings. Because of the size of the target species, we supposed that recordings with the target species contain on average much more movements than non-target recordings. Two different filter methods were tested and compared. We show that a partial discrimination can be made between target and non-target recordings based on variation in pixel values and that environmental conditions and filter methods influence the amount of non-target recordings that can be identified and discarded. By allowing a loss of 5% to 20% of recordings containing the target species, in ideal circumstances, 53% to 76% of non-target recordings can be identified and discarded. We conclude that adding an extra processing step in the camera trap protocol can result in large time savings. Since we are convinced that the use of camera traps will become increasingly important in the future, this filter method can benefit many researchers, using it in different contexts across the globe, on both videos and photographs. PMID:24918777
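    A pixel-variation filter of the kind described can be sketched in a few lines: score each recording by the mean fraction of pixels that change between consecutive frames, and discard recordings below a threshold. The difference and score thresholds below are hypothetical placeholders, not the paper's calibrated values.

```python
import numpy as np

def motion_score(frames, diff_thresh=10):
    """Mean fraction of pixels whose value changes by more than
    `diff_thresh` grey levels between consecutive frames; a large
    animal moves many pixels, so a low score suggests an empty
    (non-target) recording."""
    scores = []
    for prev, cur in zip(frames[:-1], frames[1:]):
        d = np.abs(cur.astype(int) - prev.astype(int))
        scores.append((d > diff_thresh).mean())
    return float(np.mean(scores))

def is_target(frames, score_thresh=0.05):
    """Keep a recording for human review only if it shows enough motion."""
    return motion_score(frames) >= score_thresh
```

    Lowering `score_thresh` trades a smaller loss of target recordings against a smaller fraction of non-target recordings discarded, which is the 5-20% vs. 53-76% trade-off the authors report.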

  4. Using a video camera to measure the radius of the Earth

    NASA Astrophysics Data System (ADS)

    Carroll, Joshua; Hughes, Stephen

    2013-11-01

    A simple but accurate method for measuring the Earth’s radius using a video camera is described. A video camera was used to capture a shadow rising up the wall of a tall building at sunset. A free program called ImageJ was used to measure the time it took the shadow to rise a known distance up the building. The time, distance and length of the sidereal day were used to calculate the radius of the Earth. The radius was measured as 6394.3 ± 118 km, which is within 1.8% of the accepted average value of 6371 km and well within the experimental error. The experiment is suitable as a high school or university project and should produce a value for Earth’s radius within a few per cent at latitudes towards the equator, where at some times of the year the ecliptic is approximately normal to the horizon.
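    The calculation behind this experiment follows from simple geometry: sunlight disappears at height h when the Sun drops below the horizon seen from that height, so cos(theta) = R / (R + h), where theta is the angle the Earth rotates during the measured time. A minimal sketch of the computation:

```python
import math

def earth_radius(h_m, t_s, sidereal_day_s=86164.0905):
    """Estimate Earth's radius from the time `t_s` (seconds) a sunset
    shadow takes to rise a vertical distance `h_m` (metres).
    Geometry: cos(theta) = R / (R + h), with theta = 2*pi*t/T the
    rotation angle during the measurement."""
    theta = 2.0 * math.pi * t_s / sidereal_day_s
    return h_m * math.cos(theta) / (1.0 - math.cos(theta))
```

    This sketch assumes the ecliptic is normal to the horizon, as the abstract notes is approximately true towards the equator at some times of the year; elsewhere a latitude correction is needed.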

  5. A passive terahertz video camera based on lumped element kinetic inductance detectors

    NASA Astrophysics Data System (ADS)

    Rowe, Sam; Pascale, Enzo; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A. R.; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2016-03-01

    We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)—designed originally for far-infrared astronomy—as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ˜0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics.

  6. A digital underwater video camera system for aquatic research in regulated rivers

    USGS Publications Warehouse

    Martin, Benjamin M.; Irwin, Elise R.

    2010-01-01

    We designed a digital underwater video camera system to monitor nesting centrarchid behavior in the Tallapoosa River, Alabama, 20 km below a peaking hydropower dam with a highly variable flow regime. Major components of the system included a digital video recorder, multiple underwater cameras, and specially fabricated substrate stakes. The innovative design of the substrate stakes allowed us to effectively observe nesting redbreast sunfish Lepomis auritus in a highly regulated river. Substrate stakes, which were constructed for the specific substratum complex (i.e., sand, gravel, and cobble) identified at our study site, were able to withstand a discharge level of approximately 300 m³/s and allowed us to simultaneously record 10 active nests before and during water releases from the dam. We believe our technique will be valuable for other researchers that work in regulated rivers to quantify behavior of aquatic fauna in response to a discharge disturbance.

  7. A passive terahertz video camera based on lumped element kinetic inductance detectors.

    PubMed

    Rowe, Sam; Pascale, Enzo; Doyle, Simon; Dunscombe, Chris; Hargrave, Peter; Papageorgio, Andreas; Wood, Ken; Ade, Peter A R; Barry, Peter; Bideaud, Aurélien; Brien, Tom; Dodd, Chris; Grainger, William; House, Julian; Mauskopf, Philip; Moseley, Paul; Spencer, Locke; Sudiwala, Rashmi; Tucker, Carole; Walker, Ian

    2016-03-01

    We have developed a passive 350 GHz (850 μm) video-camera to demonstrate lumped element kinetic inductance detectors (LEKIDs)-designed originally for far-infrared astronomy-as an option for general purpose terrestrial terahertz imaging applications. The camera currently operates at a quasi-video frame rate of 2 Hz with a noise equivalent temperature difference per frame of ∼0.1 K, which is close to the background limit. The 152 element superconducting LEKID array is fabricated from a simple 40 nm aluminum film on a silicon dielectric substrate and is read out through a single microwave feedline with a cryogenic low noise amplifier and room temperature frequency domain multiplexing electronics. PMID:27036756

  8. Operation and maintenance manual for the high resolution stereoscopic video camera system (HRSVS) system 6230

    SciTech Connect

    Pardini, A.F., Westinghouse Hanford

    1996-07-16

    The High Resolution Stereoscopic Video Camera System (HRSVS), system 6230, is a stereoscopic camera system that will be used as an end effector on the LDUA to perform surveillance and inspection activities within Hanford waste tanks. It is attached to the LDUA by means of a Tool Interface Plate (TIP), which provides a feedthrough for all electrical and pneumatic utilities needed by the end effector to operate.

  9. A Large-panel Two-CCD Camera Coordinate System with an Alternate-Eight-Matrix Look-Up-Table Method

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Lu, An-Tsung; Hsu, Yuen-Chang; Tien, Chuen-Lin; Chen, Der-Chin; Chang, Nin-Chun

    2012-03-01

    This study proposed a novel positioning model, composing of a two-camera calibration system and an Alternate-Eight-Matrix (AEM) Look-Up-Table (LUT). Two video cameras were fixed on two sides of a large-size screen to solve the problem of field of view. The first to the fourth LUTs were used to compute the corresponding positions of specified regions on the screen captured by the camera on the right side. In these four LUTs, the coordinate mapping data of the target were stored in two matrixes, while the gray level threshold values of different positions were stored in other matrixes. Similarly, the fifth to the eighth LUTs were used to compute the corresponding positions of the specified regions on the screen captured by the camera on the left side. Experimental results showed that the proposed model can solve the problems of dead zones and non-uniform light fields, while achieving rapid and precise positioning results.
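    The LUT idea can be sketched as precomputed matrices that map each camera pixel directly to a screen coordinate, with a companion matrix of per-position grey-level thresholds. The mapping below is a deliberately simple linear placeholder (the paper's LUTs are built from calibration data), and the constant threshold stands in for the stored non-uniform light-field thresholds.

```python
import numpy as np

def build_luts(cam_shape, screen_w, screen_h, side):
    """Hypothetical LUT construction for one camera covering one half
    of a large screen: one matrix per axis mapping camera pixels to
    screen x/y, plus a per-pixel grey-level threshold matrix."""
    h, w = cam_shape
    ys, xs = np.mgrid[0:h, 0:w]
    half = screen_w / 2.0
    x0 = 0.0 if side == "left" else half      # each camera sees one half
    lut_x = x0 + xs * (half / w)              # camera column -> screen x
    lut_y = ys * (screen_h / h)               # camera row    -> screen y
    thresh = np.full(cam_shape, 128.0)        # placeholder threshold map
    return lut_x, lut_y, thresh

def locate(frame, lut_x, lut_y, thresh):
    """Return screen coordinates of the brightest above-threshold pixel,
    or None if nothing exceeds the local threshold."""
    mask = frame > thresh
    if not mask.any():
        return None
    idx = np.unravel_index(np.argmax(np.where(mask, frame, -1)), frame.shape)
    return lut_x[idx], lut_y[idx]
```

    Because positioning reduces to two table lookups per detection, the method stays fast regardless of how complicated the calibrated mapping is.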

  10. Evaluation of lens distortion errors using an underwater camera system for video-based motion analysis

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Fletcher, Lauren; Klute, Glenn K.

    1994-01-01

    Video-based motion analysis systems are widely employed to study human movement, using computers to capture, store, process, and analyze video data. This data can be collected in any environment where cameras can be located. One of the NASA facilities where human performance research is conducted is the Weightless Environment Training Facility (WETF), a pool of water which simulates zero-gravity with neutral buoyancy. Underwater video collection in the WETF poses some unique problems. This project evaluates the error caused by the lens distortion of the WETF cameras. A grid of points of known dimensions was constructed and videotaped using a video vault underwater system. Recorded images were played back on a VCR and a personal computer grabbed and stored the images on disk. These images were then digitized to give calculated coordinates for the grid points. Errors were calculated as the distance from the known coordinates of the points to the calculated coordinates. It was demonstrated that errors from lens distortion could be as high as 8 percent. By avoiding the outermost regions of a wide-angle lens, the error can be kept smaller.
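    The error computation described (distance from known grid coordinates to digitised coordinates, expressed as a percentage) is straightforward to sketch; the field-size normalisation here is an assumption about how the percentages were formed.

```python
import numpy as np

def distortion_errors(known_xy, measured_xy, field_size):
    """Euclidean distance between known grid-point coordinates and
    their digitised positions, as a percentage of the field size."""
    known = np.asarray(known_xy, dtype=float)
    meas = np.asarray(measured_xy, dtype=float)
    dist = np.hypot(*(known - meas).T)        # per-point distance
    return 100.0 * dist / field_size
```

    Points near the image edges would show the largest percentages, matching the recommendation to avoid the outermost regions of a wide-angle lens.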

  11. The design and implementation of a position measuring system for a remotely controlled video camera

    NASA Astrophysics Data System (ADS)

    Lloyd, Peter D.

    1989-06-01

    A position measuring system for a remotely controlled video camera was designed and built. The camera is intended to be used with the modified Advance Development Model of the AN/SAR-8 Infrared Search and Target Designation System (IRSTD) in use at the Naval Postgraduate School. The video data collected by the camera will be correlated with the infrared data from the IRSTD to develop a background data base that will be used in the development of signal processing algorithms. The measurement system uses two Hewlett Packard HEDS-6000 incremental optical encoders, two Motorola MC68705U3 microprocessors and two digital display devices to measure and present the camera's azimuth and elevation angles to an operator at a remote location. The azimuth can be measured over a range of 360 deg with a resolution of ±0.0213 deg and the elevation can be measured over 24 deg with a resolution of ±0.138 deg. The resolution is limited primarily by hysteresis, which is due to the backlash in the gears between the transducers and the axes of interest.
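    For an incremental encoder read in quadrature through a gear train, the angular resolution at the measured axis follows directly from the count rate per revolution. The counts-per-revolution and gear-ratio values in the test below are illustrative assumptions, not the actual AN/SAR-8 gearing.

```python
import math

def angular_resolution(counts_per_rev, quadrature=4, gear_ratio=1.0):
    """Smallest resolvable angle (degrees) at the measured axis:
    360 degrees divided by the effective counts per axis revolution
    (encoder counts x quadrature decoding x gear ratio)."""
    return 360.0 / (counts_per_rev * quadrature * gear_ratio)
```

    A ~8.45:1 gear ratio on a 500-count encoder with 4x quadrature decoding would reproduce the quoted ±0.0213 deg azimuth figure; backlash in that gearing is what introduces the hysteresis the abstract identifies as the limiting error.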

  12. Structural analysis of color video camera installation on tank 241AW101 (2 Volumes)

    SciTech Connect

    Strehlow, J.P.

    1994-08-24

    A video camera is planned to be installed on the radioactive storage tank 241AW101 at the DOE's Hanford Site in Richland, Washington. The camera will occupy the 20 inch port of the Multiport Flange riser which is to be installed on riser 5B of the 241AW101 (3,5,10). The objective of the project reported herein was to perform a seismic analysis and evaluation of the structural components of the camera for a postulated Design Basis Earthquake (DBE) per the reference Structural Design Specification (SDS) document (6). The detail of supporting engineering calculations is documented in URS/Blume Calculation No. 66481-01-CA-03 (1).

  13. Low-complexity camera digital signal imaging for video document projection system

    NASA Astrophysics Data System (ADS)

    Hsia, Shih-Chang; Tsai, Po-Shien

    2011-04-01

    We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
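    One of the listed low-complexity DSP stages, white balance, can be illustrated with the classic gray-world method: scale each channel so its mean matches the global mean. This is a generic simplification, not the authors' Verilog pipeline.

```python
import numpy as np

def gray_world_wb(rgb):
    """Gray-world automatic white balance: assume the scene averages to
    grey, so scale each channel's mean to the global mean."""
    img = rgb.astype(float)
    means = img.reshape(-1, 3).mean(axis=0)          # per-channel mean
    gains = means.mean() / np.maximum(means, 1e-6)   # channel gains
    return np.clip(img * gains, 0, 255).astype(np.uint8)
```

    Because the stage needs only three multipliers and running channel sums, it maps naturally onto the kind of cost-effective hardware core the paper targets.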

  14. Design and evaluation of controls for drift, video gain, and color balance in spaceborne facsimile cameras

    NASA Technical Reports Server (NTRS)

    Katzberg, S. J.; Kelly, W. L., IV; Rowland, C. W.; Burcher, E. E.

    1973-01-01

    The facsimile camera is an optical-mechanical scanning device which has become an attractive candidate as an imaging system for planetary landers and rovers. This paper presents electronic techniques which permit the acquisition and reconstruction of high quality images with this device, even under varying lighting conditions. These techniques include a control for low frequency noise and drift, an automatic gain control, a pulse-duration light modulation scheme, and a relative spectral gain control. Taken together, these techniques allow the reconstruction of radiometrically accurate and properly balanced color images from facsimile camera video data. These techniques have been incorporated into a facsimile camera and reproduction system, and experimental results are presented for each technique and for the complete system.

  15. A New Remote Sensing Filter Radiometer Employing a Fabry-Perot Etalon and a CCD Camera for Column Measurements of Methane in the Earth Atmosphere

    NASA Technical Reports Server (NTRS)

    Georgieva, E. M.; Huang, W.; Heaps, W. S.

    2012-01-01

    A portable remote sensing system for precision column measurements of methane has been developed, built and tested at NASA GSFC. The sensor covers the spectral range from 1.636 micrometers to 1.646 micrometers, employs an air-gapped Fabry-Perot filter and a CCD camera, and has the potential to operate from a variety of platforms. The detector is an XS-1.7-320 camera unit from Xenics Infrared Solutions, which incorporates an uncooled InGaAs detector array sensitive up to 1.7 micrometers. Custom software was developed in addition to the basic graphical user interface X-Control provided by the company to help save and process the data. The technique and setup can be used to measure other trace gases in the atmosphere with minimal changes to the etalon and the prefilter. In this paper we describe the calibration of the system using several different approaches.

  16. Video camera observation for assessing overland flow patterns during rainfall events

    NASA Astrophysics Data System (ADS)

    Silasari, Rasmiaditya; Oismüller, Markus; Blöschl, Günter

    2015-04-01

    Physically based hydrological models have been widely used in various studies to model overland flow propagation in cases such as flood inundation and dam break flow. The capability of such models to simulate the formation of overland flow by spatial and temporal discretization of the empirical equations makes it possible for hydrologists to trace overland flow generation both spatially and temporally across surface and subsurface domains. As the upscaling methods transforming hydrological process spatial patterns from the small observed scale to the larger catchment scale are still being progressively developed, physically based hydrological models become a convenient tool to assess the patterns and their behaviors crucial in determining the upscaling process. Related studies in the past successfully used these models as well as field observation data for model verification. The common observation data used for this verification are overland flow discharge during natural rainfall events and camera observations during synthetic events (staged field experiments), while the use of camera observations during natural events is hardly discussed in publications. This study advances in exploring the potential of video camera observations of overland flow generation during natural rainfall events to support physically based hydrological model verification and the assessment of overland flow spatial patterns. The study is conducted within a 64 ha catchment located at Petzenkirchen, Lower Austria, known as HOAL (Hydrological Open Air Laboratory). The catchment land cover is dominated by arable land (87%) with small portions (13%) of forest, pasture and paved surfaces. A 600 m stream runs in the southeast of the catchment, flowing southward, and is equipped with flumes and pressure transducers measuring water level at one-minute intervals from various inlets along the stream (i.e., drainages, surface runoff, springs), from which flow discharge is calculated. A video camera with 10x optical zoom is installed 7 m above the ground at the middle of the catchment, overlooking the west hillslope area of the stream. Images are taken every minute during daylight, while video recording is triggered by raindrop movements. The observed images and videos are analyzed in accordance with the overland flow signals captured by the assigned pressure transducers and the rainfall intensities measured by four rain gauges across the catchment. The results show that the video camera observations enable us to assess the spatial and temporal development of overland flow generation during natural events, thus showing potential for use in model verification as well as in spatial pattern analysis.
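    Converting the flume water levels into discharge typically uses a power-law rating curve of the form Q = a*(h - h0)^b. The coefficients below are placeholders for illustration, not the calibrated HOAL values.

```python
def rating_curve_q(h, h0=0.02, a=4.5, b=1.8):
    """Convert a flume water level `h` (m) to discharge (m^3/s) with a
    power-law rating curve; h0 is the level of zero flow. Coefficients
    are hypothetical, not calibrated values."""
    return 0.0 if h <= h0 else a * (h - h0) ** b
```

    With minutely level readings, applying such a curve per sample yields the discharge time series against which the camera-observed overland flow events are compared.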

  17. Specific Analysis of Web Camera and High Resolution Planetary Imaging

    NASA Astrophysics Data System (ADS)

    Park, Youngsik; Lee, Dongju; Jin, Ho; Han, Wonyong; Park, Jang-Hyun

    2006-12-01

    A web camera is usually used for video communication between PCs; it has a small sensing area and cannot take long exposures, so it is insufficient for most astronomical applications. However, a web camera is suitable for bright targets such as the planets and the Moon, which do not require long exposure times, so many amateur astronomers use web cameras for planetary imaging. We used a ToUcam, manufactured by Philips, for planetary imaging and the commercial program Registax for combining video frames. We then measured properties of the web camera, such as linearity and gain, that are commonly used in the analysis of CCD performance. Because the combining technique selects only high-quality frames from the video, this method can produce higher-resolution planetary images than a single shot taken with film, a digital camera, or a CCD. We describe a planetary observing method and a video frame combining method.
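    The select-and-combine step (in the spirit of what Registax automates) can be sketched as lucky imaging: rank frames by a sharpness metric and average only the best fraction to boost signal-to-noise. The Laplacian-variance metric and keep fraction below are common illustrative choices, not Registax's actual algorithm.

```python
import numpy as np

def stack_best_frames(frames, keep_frac=0.5):
    """Rank frames by sharpness (variance of a Laplacian-like second
    difference) and average the best `keep_frac` of them."""
    def sharpness(f):
        f = f.astype(float)
        lap = (f[:-2, 1:-1] + f[2:, 1:-1] + f[1:-1, :-2]
               + f[1:-1, 2:] - 4 * f[1:-1, 1:-1])
        return lap.var()
    ranked = sorted(frames, key=sharpness, reverse=True)
    n = max(1, int(len(ranked) * keep_frac))
    return np.mean([f.astype(float) for f in ranked[:n]], axis=0)
```

    Discarding frames blurred by atmospheric seeing before averaging is what lets a cheap video-rate sensor outperform a single long exposure on bright planets.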

  18. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyper-spectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. The new class of hyper-spectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time while simultaneously providing an operator the benefit of enhanced-discrimination-color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation that provides essential spectral content analysis, e.g., detection or classification. The second is presentation of the video to an operator that can offer the best display of the content depending on the performed task, e.g., providing spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel or they can utilize each other's results. The spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has been explored scarcely. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera we demonstrate several of its concepts of operation including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to the combined multi-frame and multiband processing.

  19. Real Time Speed Estimation of Moving Vehicles from Side View Images from an Uncalibrated Video Camera

    PubMed Central

    Doğan, Sedat; Temiz, Mahir Serhan; Külür, Sıtkı

    2010-01-01

In order to estimate the speed of a moving vehicle with side view camera images, velocity vectors of a sufficient number of reference points identified on the vehicle must be found using frame images. This procedure involves two main steps. In the first step, a sufficient number of points on the vehicle is selected, and these points must be accurately tracked on at least two successive video frames. In the second step, the velocity vectors of those points are computed from the displacement vectors of the tracked points and the elapsed time. The computed velocity vectors are defined in the video image coordinate system, and the displacement vectors are measured in pixel units. The magnitudes of the computed vectors in image space must then be transformed to object space to find their absolute values. This transformation requires image-to-object space information, which is obtained from the calibration and orientation parameters of the video frame images. This paper presents solutions to these problems for side view camera images. PMID:22399909
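The two-step computation described here can be sketched compactly, assuming the full calibration and orientation step is collapsed into a single metres-per-pixel scale factor; the function name and the numbers in the example are illustrative, not from the paper:

```python
import math

def speed_from_track(p1, p2, dt, meters_per_pixel):
    """Estimate object-space speed from a point tracked across two frames.

    p1, p2: (x, y) pixel coordinates of the same point in successive frames.
    dt: elapsed time between the frames in seconds (e.g. 1/30 for 30 fps).
    meters_per_pixel: scale factor standing in for the full image-to-object
    transformation derived from calibration and orientation parameters.
    """
    dx = (p2[0] - p1[0]) * meters_per_pixel   # displacement in metres
    dy = (p2[1] - p1[1]) * meters_per_pixel
    return math.hypot(dx, dy) / dt            # speed in m/s

# A point moving 15 px between frames at 30 fps with 0.02 m/px
# travels 0.3 m in 1/30 s, i.e. 9 m/s (32.4 km/h).
v = speed_from_track((100, 240), (115, 240), 1/30, 0.02)
```

In practice the scale varies across the image, which is why the paper derives it from calibration and orientation parameters rather than a constant.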

  20. Holographic combiner design to obtain uniform symbol brightness at head-up display (HUD) video camera

    NASA Astrophysics Data System (ADS)

    Battey, David E.; Melzer, James E.

    1989-03-01

    A typical head-up display (HUD) system incorporates a video camera for recording the HUD symbology and the scene outside the cockpit. When using a HUD video camera (HVC) with a zero-power holographic combiner, the brightness of the HUD symbology seen by the camera changes significantly as a function of vertical field angle because the holographic combiner's reflectance characteristics are angularly sensitive and optimized for the pilot's eye position. A holographic combiner design is presented that overcomes this problem while simultaneously maintaining high reflectance of the phosphor's light to the pilot and high visual transmittance. The combiner contains an additional holographic layer tuned to the blue emission of the P53 phosphor as viewed from the HVC, taking advantage of the HVC's high sensitivity in the blue. The reflectance of the additional hologram is tapered to achieve minimum brightness variation at the HVC. The response of the additional hologram as viewed by the pilot shifts towards the ultra-violet and is thus nearly invisible. Theoretical and measured performance of the combiner are presented.

  1. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced reference image and video quality solutions to detect and classify such faults.
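A minimal sketch of such an ordered diagnostic chain, assuming grayscale frames as NumPy arrays; the specific checks, thresholds, and their sequence are illustrative stand-ins, not the paper's actual algorithm. Ruling out download errors and exposure faults before judging focus reflects the ordering concern described above:

```python
import numpy as np

def laplacian_var(img):
    """Variance of a 3x3 Laplacian response; low values suggest defocus/blur."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:h - 2 + i, j:w - 2 + j]
    return out.var()

def diagnose(img):
    """Run sub-checks in a fixed order so that faults which mimic each
    other (blank downloads, exposure problems) are ruled out before the
    focus check is reached. Thresholds are illustrative."""
    if img.size == 0 or np.all(img == img.flat[0]):
        return "download_error"            # blank or constant frame
    mean = img.mean()
    if mean < 20:
        return "underexposed"
    if mean > 235:
        return "overexposed"
    if laplacian_var(img) < 50.0:
        return "defocus_or_blur"
    return "ok"
```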

  2. Falling-incident detection and throughput enhancement in a multi-camera video-surveillance system.

    PubMed

    Shieh, Wann-Yun; Huang, Ju-Chin

    2012-09-01

For most elderly, unpredictable falling incidents may occur at the corner of stairs or in a long corridor due to body frailty. If the rescue of a fallen elder who may have fainted is delayed, more serious injury can follow. Traditional security or video surveillance systems require caregivers to monitor a centralized screen continuously, or require the elder to wear sensors that detect falling incidents, which wastes manpower or inconveniences the elders. In this paper, we propose an automatic falling-detection algorithm and implement it in a multi-camera video surveillance system. The algorithm uses each camera to fetch images from the regions to be monitored. It then applies a falling-pattern recognition algorithm to determine whether a falling incident has occurred; if so, the system sends short messages to designated caregivers. The algorithm has been implemented on a DSP-based hardware acceleration board as a proof of functionality. Simulation results show that the accuracy of falling detection reaches at least 90% and that the throughput of a four-camera surveillance system can be improved by about 2.1 times. PMID:22154761
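The abstract does not specify the falling-pattern recognition itself; a common stand-in is a bounding-box aspect-ratio test on the tracked person, sketched here with illustrative parameters:

```python
def detect_fall(boxes, persist=5, ratio_thresh=1.0):
    """Flag a fall when the person's bounding box becomes wider than tall
    (w/h > ratio_thresh) for `persist` consecutive frames, which filters
    out momentary crouches. `boxes` is a sequence of (w, h) per frame.
    Returns the frame index where the fall is confirmed, else None.
    This heuristic is a stand-in for the paper's (unspecified) algorithm."""
    run = 0
    for i, (w, h) in enumerate(boxes):
        if h > 0 and w / h > ratio_thresh:
            run += 1
            if run >= persist:
                return i
        else:
            run = 0
    return None
```

A standing person (box taller than wide) never triggers; a sustained horizontal posture does.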

  3. A compact high-definition low-cost digital stereoscopic video camera for rapid robotic surgery development.

    PubMed

    Carlson, Jay; Kowalczuk, Jędrzej; Psota, Eric; Pérez, Lance C

    2012-01-01

    Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration. PMID:22356964

  4. VideoWeb Dataset for Multi-camera Activities and Non-verbal Communication

    NASA Astrophysics Data System (ADS)

    Denina, Giovanni; Bhanu, Bir; Nguyen, Hoang Thanh; Ding, Chong; Kamal, Ahmed; Ravishankar, Chinya; Roy-Chowdhury, Amit; Ivers, Allen; Varda, Brenda

    Human-activity recognition is one of the most challenging problems in computer vision. Researchers from around the world have tried to solve this problem and have come a long way in recognizing simple motions and atomic activities. As the computer vision community heads toward fully recognizing human activities, a challenging and labeled dataset is needed. To respond to that need, we collected a dataset of realistic scenarios in a multi-camera network environment (VideoWeb) involving multiple persons performing dozens of different repetitive and non-repetitive activities. This chapter describes the details of the dataset. We believe that this VideoWeb Activities dataset is unique and it is one of the most challenging datasets available today. The dataset is publicly available online at http://vwdata.ee.ucr.edu/ along with the data annotation.

  5. A semantic autonomous video surveillance system for dense camera networks in Smart Cities.

    PubMed

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network. PMID:23112607

  7. Acute gastroenteritis and video camera surveillance: a cruise ship case report.

    PubMed

    Diskin, Arthur L; Caro, Gina M; Dahl, Eilif

    2014-01-01

    A 'faecal accident' was discovered in front of a passenger cabin of a cruise ship. After proper cleaning of the area the passenger was approached, but denied having any gastrointestinal symptoms. However, when confronted with surveillance camera evidence, she admitted having the accident and even bringing the towel stained with diarrhoea back to the pool towels bin. She was isolated until the next port where she was disembarked. Acute gastroenteritis (AGE) caused by Norovirus is very contagious and easily transmitted from person to person on cruise ships. The main purpose of isolation is to avoid public vomiting and faecal accidents. To quickly identify and isolate contagious passengers and crew and ensure their compliance are key elements in outbreak prevention and control, but this is difficult if ill persons deny symptoms. All passenger ships visiting US ports now have surveillance video cameras, which under certain circumstances can assist in finding potential index cases for AGE outbreaks. PMID:24677123

  8. Accurate and real-time depth video acquisition using Kinect-stereo camera fusion

    NASA Astrophysics Data System (ADS)

    Williem; Tai, Yu-Wing; Park, In Kyu

    2014-04-01

    This paper presents a Kinect-stereo camera fusion system that significantly improves the accuracy of depth map acquisition. The typical Kinect depth map suffers from missing depth values and errors, resulting from a single Kinect input. To ameliorate such problems, the proposed system couples a Kinect with a stereo RGB camera to provide an additional disparity map. Kinect depth map and the disparity map are efficiently fused in real time by exploiting a spatiotemporal Markov random field framework on a graphics processing unit. An efficient temporal data cost is proposed to maintain the temporal coherency between frames. We demonstrate the performance of the proposed system on challenging real-world examples. Experimental results confirm that the proposed system is robust and accurate in depth video acquisition.
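The paper fuses the two depth sources by solving a spatiotemporal Markov random field on a GPU; the underlying data-combination idea can be illustrated with a much simpler per-pixel rule (hole filling plus a fixed confidence weight), which is only a sketch of the fusion concept, not the MRF method itself:

```python
def fuse_depth(kinect, stereo, w_k=0.7):
    """Per-pixel fusion sketch over flat depth lists (0 = missing value).

    Kinect holes are filled from the stereo estimate, stereo holes from
    Kinect, and pixels valid in both maps are blended with a fixed
    confidence weight w_k (illustrative; the paper optimizes this jointly
    over space and time instead)."""
    fused = []
    for dk, ds in zip(kinect, stereo):
        if dk == 0:          # Kinect hole: trust stereo alone
            fused.append(ds)
        elif ds == 0:        # no stereo match: keep Kinect
            fused.append(dk)
        else:
            fused.append(w_k * dk + (1 - w_k) * ds)
    return fused
```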

  9. Fresnel hologram generation using an HD resolution depth range video camera

    NASA Astrophysics Data System (ADS)

    Oi, Ryutaro; Mishina, Tomoyuki; Yamamoto, Kenji; Senoh, Takanori; Kurita, Taiichiro

    2010-02-01

Holography is considered an ideal 3D display method. We generated a hologram under white light. The infrared depth camera we used captures the depth information as well as color video of the scene with 20 mm accuracy at an object distance of 2 m. In this research, we developed a software converter to transform the HD resolution depth map into a hologram. In this conversion method, each elemental diffraction pattern on the hologram plane was calculated beforehand according to the object distance and the maximum diffraction angle determined by the reconstruction SLM device (a high resolution LCOS). The reconstructed 3D image was observed.
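For a single object point, the elemental diffraction pattern mentioned here is a Fresnel zone pattern. A sketch of its precomputation follows; the function name and parameter values (532 nm wavelength, 4.8 µm SLM pitch) are illustrative assumptions, and the real converter also limits each pattern's extent according to the maximum diffraction angle:

```python
import math

def fresnel_zone(size, pitch, z, wavelength):
    """Precompute the elemental (Fresnel zone) pattern contributed by one
    object point at distance z, sampled on an SLM with the given pixel
    pitch. Values are the real (cosine) part of the Fresnel kernel
    exp(i*pi*(x^2 + y^2) / (lambda*z))."""
    half = size // 2
    pattern = []
    for iy in range(-half, half):
        row = []
        for ix in range(-half, half):
            x, y = ix * pitch, iy * pitch
            phase = math.pi * (x * x + y * y) / (wavelength * z)
            row.append(math.cos(phase))
        pattern.append(row)
    return pattern

# The SLM pixel pitch p bounds the usable pattern radius through the
# maximum diffraction angle: sin(theta_max) = lambda / (2p).
theta_max = math.asin(532e-9 / (2 * 4.8e-6))
```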

  10. Modelling the spectral response of the Swift-XRT CCD camera: experience learnt from in-flight calibration

    NASA Astrophysics Data System (ADS)

    Godet, O.; Beardmore, A. P.; Abbey, A. F.; Osborne, J. P.; Cusumano, G.; Pagani, C.; Capalbi, M.; Perri, M.; Page, K. L.; Burrows, D. N.; Campana, S.; Hill, J. E.; Kennea, J. A.; Moretti, A.

    2009-02-01

Context: Since its launch in November 2004, Swift has revolutionised our understanding of gamma-ray bursts. The X-ray telescope (XRT), one of the three instruments on board Swift, has played a key role in providing essential positions, timing, and spectroscopy of more than 300 GRB afterglows to date. Although Swift was designed to observe GRB afterglows with power-law spectra, Swift is spending an increasing fraction of its time observing more traditional X-ray sources, which have more complex spectra. Aims: The aim of this paper is a detailed description of the CCD response model used to compute the XRT RMFs (redistribution matrix files), the changes implemented to it based on measurements of celestial and on-board calibration sources, and current caveats in the RMFs for the spectral analysis of XRT data. Methods: The RMFs are computed via Monte-Carlo simulations based on a physical model describing the interaction of photons within the silicon bulk of the CCD detector. Results: We show that the XRT spectral response calibration was complicated by various energy offsets in photon counting (PC) and windowed timing (WT) modes related to the way the CCD is operated in orbit (variation in temperature during observations, contamination by optical light from the sunlit Earth and increase in charge transfer inefficiency). We describe how these effects can be corrected for in the ground processing software. We show that the low-energy response, the redistribution in spectra of absorbed sources, and the modelling of the line profile have been significantly improved since launch by introducing empirical corrections in our code when it was not possible to use a physical description. We note that the increase in CTI became noticeable in June 2006 (i.e. 14 months after launch), but the evidence of a more serious degradation in spectroscopic performance (line broadening and change in the low-energy response) due to large charge traps (i.e. faults in the Si crystal) became more significant after March 2007. We describe efforts to handle such changes in the spectral response. Finally, we show that the commanded increase in the substrate voltage from 0 to 6 V on 2007 August 30 reduced the dark current, enabling the collection of useful science data at higher CCD temperature (up to -50 °C). We also briefly describe the plan to recalibrate the XRT response files at this new voltage. Conclusions: We show that the XRT spectral response is described well by the public response files for line and continuum spectra in the 0.3-10 keV band in both PC and WT modes.

  11. CCD camera systems and support electronics for a White Light Coronagraph and X-ray XUV solar telescope

    NASA Technical Reports Server (NTRS)

    Harrison, D. C.; Kubierschky, K.; Staples, M. H.; Carpenter, C. H.

    1980-01-01

    Two instruments, a White Light Coronagraph and an X-ray XUV telescope built into the same housing, share several electronic functions. Each instrument uses a CCD as an imaging detector, but due to different spectral requirements, each uses a different type. Hardware reduction, required by the stringent weight and volume allocations of the interplanetary mission, is made possible by the use of a microprocessor. Most instrument functions are software controlled with the end use circuits treated as peripherals to the microprocessor. The instruments are being developed for the International Solar Polar Mission.

  12. Experimental verification of a micrometeoroid damage in the pn-CCD camera system aboard XMM-Newton

    NASA Astrophysics Data System (ADS)

    Meidinger, Norbert; Aschenbach, Bernd; Braeuninger, Heinrich W.; Drolshagen, Gerhard; Englhauser, Jakob; Hartmann, Robert; Hartner, Gisela D.; Srama, Ralf; Strueder, Lothar; Stuebig, Martin; Truemper, Joachim E.

    2003-03-01

The pn-CCD is the focal plane detector of one of the three X-ray telescopes aboard the XMM-Newton observatory. During revolution #156, more than 30 individual bright pixels lit up out of approximately 150,000 pixels of the 6 cm × 6 cm detector area. The amount of leakage current generated in the pixels cannot be explained by the impact of single heavy ions, however. We suggest that a micrometeoroid scattered off the mirror surface under grazing incidence reached the focal plane detector and produced the bright pixels. We studied this proposal experimentally at the Heidelberg dust accelerator. Micron-sized iron particles were accelerated to speeds of the order of 5 km/s and impinged on the surface of an X-ray mirror under grazing incidence. Scatter products were found with detectors placed behind the mirror. They were analyzed by various methods to characterize their properties and the effects they produce in the pn-CCD. Micrometeoroid damage to semiconductor detectors in the focus of grazing-incidence optics may be of concern for future space projects with very large collecting areas and is proposed to be studied in detail.

  13. A simple method based on the application of a CCD camera as a sensor to detect low concentrations of barium sulfate in suspension.

    PubMed

    de Sena, Rodrigo Caciano; Soares, Matheus; Pereira, Maria Luiza Oliveira; da Silva, Rogério Cruz Domingues; do Rosário, Francisca Ferreira; da Silva, Joao Francisco Cajaiba

    2011-01-01

The development of a simple, rapid and low cost method based on video image analysis, aimed at the detection of low concentrations of precipitated barium sulfate, is described. The proposed system is basically composed of a webcam with a CCD sensor and a conventional dichroic lamp. For this purpose, software for processing and analyzing the digital images based on the RGB (Red, Green and Blue) color system was developed. The proposed method has shown very good repeatability and linearity and also presented higher sensitivity than the standard turbidimetric method. The developed method is presented as a simple alternative for future applications in the study of precipitation of inorganic salts and also for detecting the crystallization of organic compounds. PMID:22346607
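The kind of RGB-based measurement and linear calibration such a system relies on can be sketched as follows; the actual software's color processing is not specified in the abstract, so both functions are illustrative (scattering from suspended BaSO4 raises the average brightness of the imaged region, and a least-squares line maps that signal to concentration):

```python
def mean_intensity(pixels):
    """Average brightness of a region of interest given (R, G, B) tuples;
    higher turbidity from suspended precipitate increases this value."""
    return sum(sum(p) for p in pixels) / (3 * len(pixels))

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for the calibration
    curve relating intensity to known standard concentrations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx
```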

  14. Mounted Video Camera Captures Launch of STS-112, Shuttle Orbiter Atlantis

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A color video camera mounted to the top of the External Tank (ET) provided this spectacular never-before-seen view of the STS-112 mission as the Space Shuttle Orbiter Atlantis lifted off in the afternoon of October 7, 2002. The camera provided views as the orbiter began its ascent until it reached near-orbital speed, about 56 miles above the Earth, including a view of the front and belly of the orbiter, a portion of the Solid Rocket Booster, and ET. The video was downlinked during flight to several NASA data-receiving sites, offering the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. Atlantis carried the S1 Integrated Truss Structure and the Crew and Equipment Translation Aid (CETA) Cart. The CETA is the first of two human-powered carts that will ride along the International Space Station's railway providing a mobile work platform for future extravehicular activities by astronauts. Landing on October 18, 2002, the Orbiter Atlantis ended its 11-day mission.

  16. A stroboscopic technique for using CCD cameras in flow visualization systems for continuous viewing and stop action photography

    NASA Technical Reports Server (NTRS)

    Franke, John M.; Rhodes, David B.; Jones, Stephen B.; Dismond, Harriet R.

    1992-01-01

    A technique for synchronizing a pulse light source to charge coupled device cameras is presented. The technique permits the use of pulse light sources for continuous as well as stop action flow visualization. The technique has eliminated the need to provide separate lighting systems at facilities requiring continuous and stop action viewing or photography.

  17. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatically alerting major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
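The spike criterion used above (radiance more than two standard deviations above the series mean) is easy to express directly; this is only a minimal sketch of the detection rule, not the operational GOES processing chain:

```python
def radiance_spikes(series, nsigma=2.0):
    """Return indices where radiance exceeds the series mean by more
    than nsigma standard deviations, flagging short-burst effusive
    events such as lava fountaining or lava-lake overflows."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((v - mean) ** 2 for v in series) / n) ** 0.5
    return [i for i, v in enumerate(series) if v > mean + nsigma * std]
```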

  18. Calibration grooming and alignment for LDUA High Resolution Stereoscopic Video Camera System (HRSVS)

    SciTech Connect

    Pardini, A.F.

    1998-01-27

The High Resolution Stereoscopic Video Camera System (HRSVS) was designed by the Savannah River Technology Center (SRTC) to provide routine and troubleshooting views of tank interiors during characterization and remediation phases of underground storage tank (UST) processing. The HRSVS is a dual color camera system designed to provide stereo viewing of the interior of the tanks, including the tank wall, in a Class 1, Division 1, flammable atmosphere. The HRSVS was designed with a modular philosophy for easy maintenance and configuration modifications. During operation of the system with the LDUA, the control of the camera system will be performed by the LDUA supervisory data acquisition system (SDAS). Video and control status will be displayed on monitors within the LDUA control center. All control functions are accessible from the front panel of the control box located within the Operations Control Trailer (OCT). The LDUA will provide all positioning functions within the waste tank for the end effector. Various electronic measurement instruments will be used to perform CG and A activities. The instruments may include a digital volt meter, oscilloscope, signal generator, and other electronic repair equipment. None of these instruments will need to be calibrated beyond what comes from the manufacturer. During CG and A, a temperature indicating device will be used to measure the temperature of the outside of the HRSVS from initial startup until the temperature has stabilized. This device will not need to be in calibration during CG and A but will have to have a current calibration sticker from the Standards Laboratory during any acceptance testing.

  19. Visual fatigue modeling for stereoscopic video shot based on camera motion

    NASA Astrophysics Data System (ADS)

    Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

    2014-11-01

As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3D display technology. Causes of visual discomfort from stereoscopic video include conflicts between accommodation and convergence, excessive binocular parallax, and fast motion of objects. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and comfortable zone are analyzed. According to the human visual system (HVS), people only need to converge their eyes on specific objects for static cameras and backgrounds. Relative motion should be considered for different camera conditions, determining different factor coefficients and weights. Compared with the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented. The visual fatigue degree is predicted using a multiple linear regression method combined with subjective evaluation. Consequently, each factor can reflect the characteristics of the scene, and the total visual fatigue score can be computed according to the proposed algorithm. Compared with conventional algorithms which ignore the status of the camera, our approach exhibits reliable performance in terms of correlation with subjective test results.
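The multiple-linear-regression step can be sketched with NumPy; the factor scores and subjective ratings below are made-up placeholders standing in for the paper's data, and the three columns only loosely mirror the factors named above (spatial structure, motion scale, comfort zone):

```python
import numpy as np

# Hypothetical training data: rows are stereoscopic shots, columns are
# per-shot factor scores; y holds subjective fatigue ratings.
X = np.array([[0.2, 0.1, 0.0],
              [0.5, 0.4, 0.2],
              [0.8, 0.7, 0.6],
              [0.3, 0.9, 0.4]])
y = np.array([1.0, 2.1, 3.9, 2.8])

# Append a bias column and fit the weights by least squares.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def fatigue_score(factors):
    """Predicted fatigue degree for a new shot's factor vector."""
    return float(np.dot(coef[:-1], factors) + coef[-1])
```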

  20. Gain, Level, And Exposure Control For A Television Camera

    NASA Technical Reports Server (NTRS)

    Major, Geoffrey J.; Hetherington, Rolfe W.

    1992-01-01

An automatic-level-control/automatic-gain-control (ALC/AGC) system for a charge-coupled-device (CCD) color television camera prevents overloading in bright scenes by measuring the brightness of the scene from the red, green, and blue output signals and processing these into adjustments of the video amplifiers and the iris on the camera lens. The system is faster, does not distort video brightness signals, and is built with smaller components.

  1. Identifying predators and fates of grassland passerine nests using miniature video cameras

    USGS Publications Warehouse

    Pietz, P.J.; Granfors, D.A.

    2000-01-01

Nest fates, causes of nest failure, and identities of nest predators are difficult to determine for grassland passerines. We developed a miniature video-camera system for use in grasslands and deployed it at 69 nests of 10 passerine species in North Dakota during 1996-97. Abandonment rates were higher at nests where cameras were deployed. Cameras recorded for more than 1 day or night (22-116 hr) at 6 nests, 5 of which were depredated by ground squirrels or mice. For nests without cameras, estimated predation rates were lower for ground nests than aboveground nests (P = 0.055), but did not differ between open and covered nests (P = 0.74). Open and covered nests differed, however, when predation risk (estimated by initial-predation rate) was examined separately for day and night using camera-monitored nests; the frequency of initial predations that occurred during the day was higher for open nests than covered nests (P = 0.015). Thus, vulnerability of some nest types may depend on the relative importance of nocturnal and diurnal predators. Predation risk increased with nestling age from 0 to 8 days (P = 0.07). Up to 15% of fates assigned to camera-monitored nests were wrong when based solely on evidence that would have been available from periodic nest visits. There was no evidence of disturbance at nearly half the depredated nests, including all 5 depredated by large mammals. Overlap in types of sign left by different predator species, and variability of sign within species, suggests that evidence at nests is unreliable for identifying predators of grassland passerines.

  2. Television automatic video-line tester

    NASA Astrophysics Data System (ADS)

    Ge, Zhaoxiang; Tang, Dongsheng; Feng, Binghua

    1998-08-01

The linearity of the telescope video-line is an important characteristic of geodetic instruments and micrometer-telescopes. The 1-inch video-line tester, developed by the University of Shanghai for Science and Technology, has been adopted in related instrument criteria and national metering regulations. However, its optical and chemical readout with visual alignment introduces subjective error and cannot provide detailed data. In this paper, the authors put forward an improvement to the video-line tester: using a CCD TV camera, displaying and processing the CCD signal through a computer, and automating the test, with the advantages of objectivity, reliability, speed and reduced focusing error.

  3. The design and realization of a three-dimensional video system by means of a CCD array

    NASA Astrophysics Data System (ADS)

    Boizard, J. L.

    1985-12-01

Design features, principles, and initial tests of a prototype three-dimensional robot vision system based on a laser source and a CCD detector array are described. The use of a laser as a coherent illumination source permits the determination of relief with a single emitter, since the location of the source is a known quantity with low distortion. The CCD detector array furnishes an acceptable signal/noise ratio and, when wired to an appropriate signal processing system, furnishes real-time data on the return signals, i.e., the characteristic points of an object being scanned. Signal processing involves integration of 29 kB of data per 100 samples, with sampling occurring at a rate of 5 MHz (the CCDs) and yielding an image every 12 msec. Algorithms for filtering errors from the data stream are discussed.

  4. Optimizing Detection Rate and Characterization of Subtle Paroxysmal Neonatal Abnormal Facial Movements with Multi-Camera Video-Electroencephalogram Recordings.

    PubMed

    Pisani, Francesco; Pavlidis, Elena; Cattani, Luca; Ferrari, Gianluigi; Raheli, Riccardo; Spagnoli, Carlotta

    2016-06-01

Objectives We retrospectively analyzed the diagnostic accuracy for paroxysmal abnormal facial movements, comparing a one-camera versus a multi-camera approach. Background Polygraphic video-electroencephalogram (vEEG) recording is the current gold standard for brain monitoring in high-risk newborns, especially when neonatal seizures are suspected. One camera synchronized with the EEG is commonly used. Methods Since mid-June 2012, we have used multiple cameras, one of which points toward the newborns' faces. We evaluated vEEGs recorded in newborns between mid-June 2012 and the end of September 2014 and compared, for each recording, the diagnostic accuracies obtained with the one-camera and multi-camera approaches. Results We recorded 147 vEEGs from 87 newborns and found 73 episodes of paroxysmal abnormal facial movements in 18 vEEGs of 11 newborns with the multi-camera approach. With the single-camera approach, only 28.8% of these events were identified (21/73). Ten vEEGs that were positive with the multi-camera approach, containing 52 paroxysmal abnormal facial movements (52/73, 71.2%), would have been considered negative with the single-camera approach. Conclusions The use of one additional facial camera can significantly increase the diagnostic accuracy of vEEGs in the detection of paroxysmal abnormal facial movements in newborns. PMID:27111027

  5. Embedded FIR filter design for real-time refocusing using a standard plenoptic video camera

    NASA Astrophysics Data System (ADS)

    Hahne, Christopher; Aggoun, Amar

    2014-03-01

    A novel and low-cost embedded hardware architecture for real-time refocusing based on a standard plenoptic camera is presented in this study. The proposed layout design synthesizes refocusing slices directly from micro images by omitting the process for the commonly used sub-aperture extraction. Therefore, intellectual property cores, containing switch controlled Finite Impulse Response (FIR) filters, are developed and applied to the Field Programmable Gate Array (FPGA) XC6SLX45 from Xilinx. Enabling the hardware design to work economically, the FIR filters are composed of stored product as well as upsampling and interpolation techniques in order to achieve an ideal relation between image resolution, delay time, power consumption and the demand of logic gates. The video output is transmitted via High-Definition Multimedia Interface (HDMI) with a resolution of 720p at a frame rate of 60 fps conforming to the HD ready standard. Examples of the synthesized refocusing slices are presented.

  6. Plant iodine-131 uptake in relation to root concentration as measured in minirhizotron by video camera:

    SciTech Connect

    Moss, K.J.

    1990-09-01

    Glass viewing tubes (minirhizotrons) were placed in the soil beneath native perennial bunchgrass (Agropyron spicatum). The tubes provided access for observing and quantifying plant roots with a miniature video camera and soil moisture estimates by neutron hydroprobe. The radiotracer I-131 was delivered to the root zone at three depths with differing root concentrations. The plant was subsequently sampled and analyzed for I-131. Plant uptake was greater when I-131 was applied at soil depths with higher root concentrations. When I-131 was applied at soil depths with lower root concentrations, plant uptake was less. However, the relationship between root concentration and plant uptake was not a direct one. When I-131 was delivered to deeper soil depths with low root concentrations, the quantity of roots there appeared to be less effective in uptake than the same quantity of roots at shallow soil depths with high root concentration. 29 refs., 6 figs., 11 tabs.

  7. A two camera video imaging system with application to parafoil angle of attack measurements

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1991-01-01

    This paper describes the development of a two-camera, video imaging system for the determination of three-dimensional spatial coordinates from stereo images. This system successfully measured angle of attack at several span-wise locations for large-scale parafoils tested in the NASA Ames 80- by 120-Foot Wind Tunnel. Measurement uncertainty for angle of attack was less than 0.6 deg. The stereo ranging system was the primary source for angle of attack measurements since inclinometers sewn into the fabric ribs of the parafoils had unknown angle offsets acquired during installation. This paper includes discussions of the basic theory and operation of the stereo ranging system, system measurement uncertainty, experimental set-up, calibration results, and test results. Planned improvements and enhancements to the system are also discussed.
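The abstract does not give the ranging equations, but the depth recovery underlying such a two-camera system can be sketched with the standard rectified-stereo relation. The function name and the focal-length, baseline, and disparity values below are illustrative assumptions, not the NASA Ames setup:

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth of a feature from a rectified stereo pair: Z = f * B / d,
    where d is the horizontal disparity between the two camera images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: a marker seen with 50 px disparity by cameras with an 800 px
# focal length and a 0.5 m baseline lies at 800 * 0.5 / 50 = 8.0 m.
```

Angle of attack would then follow from the 3-D coordinates of two or more markers along a rib; the actual system additionally required camera calibration, discussed in the paper.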

  8. Dual-modality imaging in vivo with an NIR and gamma emitter using an intensified CCD camera and a conventional gamma camera

    NASA Astrophysics Data System (ADS)

    Houston, Jessica P.; Ke, Shi; Wang, Wei; Li, Chun; Sevick-Muraca, Eva M.

    2005-04-01

Fluorescence-enhanced optical imaging measurements and conventional gamma camera images of human M21 melanoma xenografts were acquired for a "dual-modality" molecular imaging study. The αvβ3 integrin cell surface receptors were imaged using a cyclic peptide, the cyclopentapeptide cyclo(lys-Arg-Gly-Asp-phe) [c(KRGDf)], a probe known to target the membrane receptor. The probe, dual-labeled with a radiotracer, 111In, for gamma scintigraphy as well as with a near-infrared dye, IRDye800, was injected into six nude mice at a dose equivalent to 90 mCi of 111In and 5 nanomoles of near-infrared (NIR) dye. A 15 min gamma scan and an 800 ms NIR-sensitive ICCD optical photograph were collected 24 hours after injection of the dual-labeled probe. The image quality of the nuclear and optical data was investigated, with the results showing similar target-to-background ratios (TBR) based on the origin of fluorescence and gamma emissions at the targeted tumor site. Furthermore, an analysis of SNR versus contrast showed greater sensitivity of optical over nuclear imaging for the subcutaneous tumor targets measured by surface regions of interest.

  9. A cooled CCD camera-based protocol provides an effective solution for in vitro monitoring of luciferase.

    PubMed

    Afshari, Amirali; Uhde-Stone, Claudia; Lu, Biao

    2015-03-13

The luciferase assay has become an increasingly important technique to monitor a wide range of biological processes. However, the mainstay protocols require a luminometer to acquire and process the data, limiting their application to specialized research labs. To overcome this limitation, we have developed an alternative protocol that utilizes a commonly available cooled charge-coupled device (CCCD), instead of a luminometer, for data acquisition and processing. By measuring activities of different luciferases, we characterized their substrate specificity, assay linearity, signal-to-noise levels, and fold-changes via CCCD. Next, we defined the assay parameters that are critical for appropriate use of the CCCD with different luciferases. To demonstrate the usefulness in cultured mammalian cells, we conducted a case study to examine NFκB gene activation in response to inflammatory signals in human embryonic kidney cells (HEK293 cells). We found that data collected by the CCCD camera were equivalent to those acquired by luminometer, thus validating the assay protocol. In comparison, the CCCD-based protocol is readily amenable to live-cell and high-throughput applications, offering fast simultaneous data acquisition and visual and quantitative data presentation. In conclusion, the CCCD-based protocol provides a useful alternative for monitoring luciferase reporters. The wide availability of CCCDs will enable more researchers to use luciferases to monitor and quantify biological processes. PMID:25677617

  10. A unified and efficient framework for court-net sports video analysis using 3D camera modeling

    NASA Astrophysics Data System (ADS)

    Han, Jungong; de With, Peter H. N.

    2007-01-01

The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type in order to arrive at a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes that are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e., players), which can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it on a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.

  11. A versatile digital video engine for safeguards and security applications

    SciTech Connect

    Hale, W.R.; Johnson, C.S.; DeKeyser, P.

    1996-08-01

The capture and storage of video images have been major engineering challenges for safeguards and security applications since the video camera provided a method to observe remote operations. The problems of designing reliable video cameras were solved in the early 1980s with the introduction of the CCD (charge-coupled device) camera. The first CCD cameras cost thousands of dollars but have now been replaced by cameras costing in the hundreds. The remaining problem of storing and viewing video images in both attended and unattended video surveillance systems and remote monitoring systems is being solved by sophisticated digital compression systems. One such system is the PC-104 three-card set, which is literally a "video engine" that can power video storage systems. The use of digital images in surveillance systems makes it possible to develop remote monitoring systems, portable video surveillance units, image review stations, and authenticated camera modules. This paper discusses the video card set and how it can be used in many applications.

  12. Search for Trans-Neptunian Objects: a new MIDAS context confronted with some results obtained with the UH 8k CCD Mosaic Camera

    NASA Astrophysics Data System (ADS)

    Rousselot, P.; Lombard, F.; Moreels, G.

    1998-09-01

We present the results obtained with a new program dedicated to the automatic detection of trans-Neptunian objects (TNOs) in standard sets of images of the same field of view. This program has the key advantage, compared to other similar software, of being designed for use with one of the main astronomical data-processing packages: the Munich Image Data Analysis System (MIDAS), developed by the European Southern Observatory (ESO). It is freely available from the World Wide Web server of the Observatory of Besançon (http://www.obs-besancon/www/publi/philippe/tno.html). The program has been tested with observational data collected with the UH 8k CCD Mosaic Camera, used during two nights, on October 25 and 26, 1997, at the prime focus of the CFH telescope (Mauna Kea, Hawaii). The purpose of these observations was to detect new TNOs, and a previous analysis, conducted by the classical method of blinking, had led to a first detection of a new TNO. This object appears close to the detection limit of the images (i.e., the 24th magnitude) and presents an unusual orbital inclination (i ≈ 33°). It has allowed the efficient and successful testing of the program on faint moving objects, demonstrating its ability to detect objects close to the sky background noise with a very limited number of false detections.

  13. The Automatically Triggered Video or Imaging Station (ATVIS): An Inexpensive Way to Catch Geomorphic Events on Camera

    NASA Astrophysics Data System (ADS)

    Wickert, A. D.

    2010-12-01

To understand how single events can affect landscape change, we must catch the landscape in the act. Direct observations are rare and often dangerous. While video is a good alternative, commercially available video systems for field installation cost ~$11,000, weigh ~100 pounds (45 kg), and shoot 640x480 pixel video at 4 frames per second. This is the same resolution as a cheap point-and-shoot camera, with a frame rate that is nearly an order of magnitude worse. To overcome these limitations of resolution, cost, and portability, I designed and built a new observation station. This system, called ATVIS (Automatically Triggered Video or Imaging Station), costs $450-500 and weighs about 15 pounds. It can take roughly 3 hours of 1280x720 pixel video, 6.5 hours of 640x480 video, or 98,000 1600x1200 pixel photos (one photo every 7 seconds for 8 days). The design calls for a simple Canon point-and-shoot camera fitted with custom firmware that allows 5V pulses through its USB cable to trigger it to take a picture or to initiate or stop video recording. These pulses are provided by a programmable microcontroller that can take input from either sensors or a data logger. The design is easily modifiable to a variety of camera and sensor types, and can also be used for continuous time-lapse imagery. We currently have prototypes set up at a gully near West Bijou Creek on the Colorado high plains and at tributaries to Marble Canyon in northern Arizona. Hopefully, a relatively inexpensive and portable system such as this will allow geomorphologists to supplement sensor networks with photo or video monitoring and allow them to see, and better quantify, the fantastic array of processes that modify landscapes as they unfold. Figure: camera station set up at Badger Canyon, Arizona. Inset: view into box; clockwise from bottom right: camera, microcontroller (blue), DC converter (red), solar charge controller, 12 V battery. Materials and installation assistance courtesy of Ron Griffiths and the USGS Grand Canyon Monitoring and Research Center.

  14. CCD TV focal plane guider development and comparison to SIRTF applications

    NASA Technical Reports Server (NTRS)

    Rank, David M.

    1989-01-01

It is expected that the SIRTF payload will use a CCD TV focal-plane fine-guidance sensor to provide acquisition of sources and tracking stability of the telescope. CCD TV cameras and guiders have been developed at Lick Observatory for several years, producing state-of-the-art CCD TV systems for internal use. NASA decided to provide additional support so that the limits of this technology could be established and a comparison between SIRTF requirements and practical systems could be put on a more quantitative basis. The results of work carried out at Lick Observatory to characterize present CCD autoguiding technology and relate it to SIRTF applications are presented. Two different designs of CCD camera were constructed, using virtual-phase and buried-channel CCD sensors. A simple autoguider was built and used on the KAO, Mt. Lemmon, and Mt. Hamilton telescopes. A video image processing system was also constructed in order to characterize the performance of the autoguider and CCD cameras.

  15. Optimal camera exposure for video surveillance systems by predictive control of shutter speed, aperture, and gain

    NASA Astrophysics Data System (ADS)

    Torres, Juan; Menéndez, José Manuel

    2015-02-01

This paper establishes a real-time auto-exposure method to guarantee that surveillance cameras in uncontrolled light conditions take advantage of their whole dynamic range while providing neither under- nor overexposed images. State-of-the-art auto-exposure methods base their control on the brightness of the image measured in a limited region where the foreground objects are mostly located. Unlike these methods, the proposed algorithm establishes a set of indicators, based on the image histogram, that define its shape and position. Furthermore, the location of the objects to be inspected is usually unknown in surveillance applications, so the whole image is monitored in this approach. To control the camera settings, we defined a parameters function (Ef) that depends linearly on the shutter speed and the electronic gain, and is inversely proportional to the square of the lens aperture diameter. When the current acquired image is not overexposed, our algorithm computes the value of Ef that would move the histogram to the maximum value that does not overexpose the capture. When the current acquired image is overexposed, it computes the value of Ef that would move the histogram to a value that does not underexpose the capture and remains close to the overexposed region. If the image is both under- and overexposed, the whole dynamic range of the camera is already in use, and a default value of Ef that does not overexpose the capture is selected. This decision follows the idea that underexposed images are preferable to overexposed ones, because the noise produced in the lower regions of the histogram can be removed in a post-processing step, while the saturated pixels of the higher regions cannot be recovered. The proposed algorithm was tested on a video surveillance camera placed at an outdoor parking lot surrounded by buildings and trees which produce moving shadows on the ground. During the daytime of seven days, the algorithm ran alternately with a representative auto-exposure algorithm from the recent literature. Besides the sunrises and the nightfalls, multiple weather conditions occurred which produced light changes in the scene: sunny hours that produced sharp shadows and highlights; cloud cover that softened the shadows; and cloudy and rainy hours that dimmed the scene. Several indicators were used to measure the performance of the algorithms, providing objective quality measures of the time the algorithms take to recover from under- or overexposure, the brightness stability, and the deviation from the optimal exposure. The results demonstrated that our algorithm reacts faster to all the light changes than the selected state-of-the-art algorithm. It is also capable of acquiring well-exposed images and keeping the brightness stable for longer. Summing up the results, we concluded that the proposed algorithm provides a fast and stable auto-exposure method that maintains an optimal exposure for video surveillance applications. Future work will involve the evaluation of this algorithm in robotics.
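The abstract specifies only the proportionalities of the parameters function Ef; a minimal sketch consistent with that description follows. The exact functional form, unit conventions, and the histogram-peak control rule below are assumptions for illustration, not the authors' implementation:

```python
def exposure_value(shutter_s, gain, aperture_d):
    # Ef as described in the abstract: linear in shutter speed and
    # electronic gain, inversely proportional to the squared lens
    # aperture diameter.
    return shutter_s * gain / aperture_d ** 2

def target_ef(shutter_s, gain, aperture_d, current_peak, target_peak):
    # Scale the current Ef by the factor needed to move the histogram
    # peak toward the target level (assumes an approximately linear
    # sensor response, a simplification).
    return exposure_value(shutter_s, gain, aperture_d) * (target_peak / current_peak)
```

Any combination of shutter, gain, and aperture achieving the returned Ef would then be chosen by the controller; the paper's actual selection policy among the three settings is not given in the abstract.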

  16. Surgeon point-of-view recording: Using a high-definition head-mounted video camera in the operating room

    PubMed Central

    Nair, Akshay Gopinathan; Kamal, Saurabh; Dave, Tarjani Vivek; Mishra, Kapil; Reddy, Harsha S; Rocca, David Della; Rocca, Robert C Della; Andron, Aleza; Jain, Vandana

    2015-01-01

Objective: To study the utility of a commercially available small, portable ultra-high definition (HD) camera (GoPro Hero 4) for intraoperative recording. Methods: A head mount was used to fix the camera on the operating surgeon's head. Due care was taken to protect the patient's identity. The recorded video was subsequently edited and used as a teaching tool. This retrospective, noncomparative study was conducted at three tertiary eye care centers. The surgeries recorded were ptosis correction, ectropion correction, dacryocystorhinostomy, angular dermoid excision, enucleation, blepharoplasty and lid tear repair surgery (one each). The recorded videos were reviewed, edited, and checked for clarity, resolution, and reproducibility. Results: The recorded videos were found to be of high quality, which allowed for zooming and clear visualization of the surgical anatomy. Minimal distortion is a drawback that can be effectively addressed during postproduction. The camera, owing to its light weight and small size, can be mounted on the surgeon's head, thus offering a unique surgeon point of view. In our experience, the results were of good quality and reproducible. Conclusions: A head-mounted ultra-HD video recording system is a cheap, high-quality, and unobtrusive technique to record surgery and can be a useful teaching tool in external facial and ophthalmic plastic surgery. PMID:26655001

  17. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled- device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

  18. Nyquist sampling theorem: understanding the illusion of a spinning wheel captured with a video camera

    NASA Astrophysics Data System (ADS)

Lévesque, Luc

    2014-11-01

Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the sampling time is chosen judiciously, then it is possible to accurately determine the frequency of a signal varying periodically with time. This paper is of educational value as it presents the principles of sampling during data acquisition. The concept of the Nyquist sampling theorem is usually introduced very briefly in the literature, with few practical examples to convey its importance during data acquisition. Through a series of carefully chosen examples, we attempt to present data sampling from the elementary conceptual idea and try to lead the reader naturally to the Nyquist sampling theorem, so that we may more clearly understand why a signal can be interpreted incorrectly during a data acquisition procedure in the case of undersampling.
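The spinning-wheel illusion the paper analyses can be reproduced numerically: sampling a rotation at too low a frame rate folds the true frequency into the Nyquist band. A small sketch (the example numbers are illustrative, not taken from the paper):

```python
def apparent_frequency(f_true_hz, f_sample_hz):
    """Frequency a video camera appears to record when sampling a periodic
    motion of f_true_hz at f_sample_hz frames per second. Negative values
    mean the motion seems to run backwards (the wagon-wheel effect)."""
    f = f_true_hz % f_sample_hz        # fold into [0, fs)
    if f > f_sample_hz / 2:
        f -= f_sample_hz               # fold into (-fs/2, fs/2]
    return f

# A wheel spinning at 22 Hz filmed at 24 fps appears to turn backwards
# at 2 Hz; a wheel at exactly 24 Hz appears stationary.
```

Only when the frame rate exceeds twice the rotation frequency does the apparent frequency equal the true one, which is the Nyquist criterion the paper builds toward.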

  19. Electrode temperature measurements of multi-phase AC arc by high-speed video camera

    NASA Astrophysics Data System (ADS)

    Tanaka, M.; Ikeba, T.; Liu, Y.; Matsuura, T.; Watanabe, T.

    2012-12-01

The multi-phase AC arc plasma has been applied in glass melting technology as a promising heat source. The electrode erosion of the multi-phase arc is one of the most important issues to be solved. In the present work, two-colour pyrometry using a high-speed video camera with band-pass filters was applied to the electrode temperature measurements. First, spectroscopic measurements of the electrode region in the multi-phase arc were conducted to select the appropriate wavelengths for the band-pass filters. Then, the electrode temperatures of 2-phase, 6-phase, and 12-phase arcs were measured. The electrode tip temperature and the molten area were evaluated from the obtained 2-dimensional temperature distributions. Results indicated that increasing the number of phases leads to a lower tip temperature and a larger molten area. Observation of the multiple arcs further revealed particular characteristics of the multi-phase arc; for example, the periodic arc motion plays an important role in the molten state of the electrode.
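Two-colour pyrometry, as named in the abstract, infers temperature from the ratio of intensities behind the two band-pass filters; under the Wien approximation and a grey-body assumption the emissivities cancel. A sketch of the standard relation (the wavelengths used below are illustrative, not the filter set selected in the paper):

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def two_colour_temperature(i1, i2, lam1_m, lam2_m):
    """Grey-body temperature (K) from intensities i1, i2 measured at
    wavelengths lam1_m, lam2_m (m), using Wien's approximation.
    Equal emissivities at both wavelengths are assumed and cancel:
    T = C2 * (1/lam2 - 1/lam1) / ln((i1/i2) * (lam1/lam2)^5)."""
    ratio = (i1 / i2) * (lam1_m / lam2_m) ** 5
    return C2 * (1.0 / lam2_m - 1.0 / lam1_m) / math.log(ratio)
```

Mapping this ratio pixel by pixel over the filtered high-speed frames yields the 2-dimensional temperature distributions from which the tip temperature and molten area were evaluated.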

  20. A practical guide to CCD astronomy.

    NASA Astrophysics Data System (ADS)

    Martinez, P.; Klotz, A.

    This book is an English translation of the French original "Guide pratique de l'astronomie CCD". High-performance CCD cameras have opened up a new window on the universe for amateur astronomers. This book provides a complete, self-contained guide to choosing and using CCD cameras. The book starts with an introduction to how a CCD camera works and just what determines its performance. The authors then show how to use a CCD camera and accurately calibrate the images obtained. A clear review is provided of the software available for visualizing, analyzing, and processing digital images. Finally, the reader is guided through a series of key areas in astronomy where one can make best use of CCD cameras.

  1. Modeling camera orientation and 3D structure from a sequence of images taken by a perambulating commercial video camera

    NASA Astrophysics Data System (ADS)

    M-Rouhani, Behrouz; Anderson, James A. D. W.

    1997-04-01

In this paper we report the degree of reliability of image sequences taken by off-the-shelf TV cameras for modeling camera rotation and reconstructing 3D structure using computer vision techniques. This is done in spite of the fact that computer vision systems usually use imaging devices that are specifically designed for machine vision. Our scenario consists of a static scene and a mobile camera moving through the scene. The scene is any long axial building dominated by features along the three principal orientations and with at least one wall containing prominent repetitive planar features such as doors, windows, bricks, etc. The camera is an ordinary commercial camcorder moving along the axial direction of the scene and is allowed to rotate freely within the range +/- 10 degrees in all directions. This makes it possible for the camera to be held by a walking non-professional cameraman with normal gait, or to be mounted on a mobile robot. The system has been tested successfully on sequences of images of a variety of structured, but fairly cluttered, scenes taken by different walking cameramen. The potential application areas of the system include medicine, robotics and photogrammetry.

  2. Study of secondary flow in centrifugal blood pumps using a flow visualization method with a high-speed video camera.

    PubMed

    Sakuma, I; Fukui, Y; Dohi, T

    1996-06-01

    Four pump models with different vane configurations were evaluated with flow visualization techniques using a high-speed video camera. These models also were evaluated through in vivo hemolysis tests using bovine blood. The impeller having the greatest fluid velocity relative to the impeller, the largest velocity variance, and the most irregular local flow patterns in the flow passage caused the most hemolysis. Even if the pumps were operated at almost the same speed (rpm) at the same output, the impeller showing more irregular flow patterns had a statistically greater rate of hemolysis. This fact confirms that the existence of local irregular flow patterns in a centrifugal blood pump deteriorates its hemolytic performance. Thus, to optimize the design of the pump, it is very important to examine the secondary flow patterns in the centrifugal blood pump in detail using flow visualization with a high-speed video camera. PMID:8817952

  3. Research on portable intelligent monitoring system based on video server

    NASA Astrophysics Data System (ADS)

    Song, Gui-cai; Na, Yan-xiang; Yang, Fei-yu; Cao, Shi-hao

    2011-08-01

The intelligent video surveillance system studied in this paper consists of CCD cameras, an infrared pyroelectric sensor, a stepping motor and a computer. The portable intelligent monitoring system is studied in depth from both the hardware and software aspects. CCD and CMOS image sensors are compared and analysed, with particular attention to the characteristics and performance indicators of the CCD. The structure and characteristics of the infrared pyroelectric sensor are investigated, and further methods to improve its performance and responsivity are put forward. On the software side, a moving-object detection algorithm controls the stepping motor so that the system can track the target and record video in real time. The intelligent video surveillance system uses the infrared pyroelectric sensor as an access switch to ensure the reliability of the monitoring-site safety system.

  4. Application of video-cameras for quality control and sampling optimisation of hydrological and erosion measurements in a catchment

    NASA Astrophysics Data System (ADS)

    Lora-Millán, Julio S.; Taguas, Encarnacion V.; Gomez, Jose A.; Perez, Rafael

    2014-05-01

Long-term soil erosion studies imply substantial effort, particularly when continuous measurements must be maintained. There are high costs associated with maintaining field equipment and with quality control of data collection. Energy supply and/or electronic failures, vandalism and burglary are common causes of gaps in datasets, reducing their reach in many cases. In this work, a system of three video cameras, a recorder and a transmission modem (3G technology) has been set up at a gauging station where rainfall, runoff flow and sediment concentration are monitored. The gauging station is located at the outlet of an olive orchard catchment of 6.4 ha. Rainfall is measured with an automatic rain gauge that records intensity at one-minute intervals. The discharge is measured by a critical-depth flume, where the water level is recorded by an ultrasonic sensor. When the water level rises to a predetermined threshold, the automatic sampler turns on and fills a bottle at intervals set by a program that depends on the antecedent precipitation. A data logger controls the instruments' functions and records the data. The purpose of the video camera system is to improve the quality of the dataset by i) visual analysis of the flow conditions in the flume, and ii) optimisation of the sampling programs. The cameras are positioned to record the flow at the approach section and the throat of the flume. To check the readings of the ultrasonic sensor, a third camera records the flow level against a measuring tape. The system is activated when the ultrasonic sensor detects a height threshold, equivalent to an electric intensity level; thus, the video cameras record only when there is sufficient flow. This simplifies post-processing and reduces the cost of downloading recordings. A preliminary verification analysis will be presented, as well as the main improvements to the sampling program.

  5. Determination of visible coordinates of the low-orbit space objects and their photometry by the CCD camera with the analogue output. Initial image processing

    NASA Astrophysics Data System (ADS)

    Shakun, L. S.; Koshkin, N. I.

    2014-06-01

    The number of artificial space objects in the low Earth orbit has been continuously increasing. That raises the requirements for the accuracy of measurement of their coordinates and for the precision of the prediction of their motion. The accuracy of the prediction can be improved if the actual current orientation of the non-spherical satellite is taken into account. In so doing, it becomes possible to directly determine the atmospheric density along the orbit. The problem solution is to regularly conduct the photometric surveillances of a large number of satellites and monitor the parameters of their rotation around the centre of mass. To do that, it is necessary to get and promptly process large video arrays, containing pictures of a satellite against the background stars. In the present paper, the method for the simultaneous measurement of coordinates and brightness of the low Earth orbit space objects against the background stars when they are tracked by telescope KT-50 with the mirror diameter of 50 cm and with video camera WAT-209H2 is considered. The problem of determination of the moments of exposures of images is examined in detail. The estimation of the accuracy of measuring both the apparent coordinates of stars and their photometry is given on the example of observation of the open star cluster. In the presented observations, the standard deviation of one position measured is 1σ, the accuracy of determination of the moment of exposure of images is better than 0.0001 s. The estimate of the standard deviation of one measurement of brightness is 0.1m. Some examples of the results of surveillances of satellites are also presented in the paper.

  6. Lori Losey - The Woman Behind the Video Camera - Duration: 3 minutes, 36 seconds.

    NASA Video Gallery

    The often-spectacular aerial video imagery of NASA flight research, airborne science missions and space satellite launches doesn't just happen. Much of it is the work of Lori Losey, senior video pr...

  7. HDR {sup 192}Ir source speed measurements using a high speed video camera

    SciTech Connect

    Fonseca, Gabriel P.; Rubo, Rodrigo A.; Sales, Camila P. de; Verhaegen, Frank

    2015-01-15

    Purpose: The dose delivered with an HDR {sup 192}Ir afterloader can be separated into a dwell component and a transit component resulting from the source movement. The transit component depends directly on the source speed profile, and the goal of this study is to measure accurate source speed profiles. Methods: A high speed video camera was used to record the movement of a {sup 192}Ir source (Nucletron, an Elekta company, Stockholm, Sweden) for interdwell distances of 0.25–5 cm with dwell times of 0.1, 1, and 2 s. Transit dose distributions were calculated using a Monte Carlo code simulating the source movement. Results: The source stops at each dwell position, oscillating around the desired position for up to (0.026 ± 0.005) s. The source speed profile shows variations between 0 and 81 cm/s, with an average speed of ∼33 cm/s for most of the interdwell distances. The source stops for up to (0.005 ± 0.001) s at nonprogrammed positions between two programmed dwell positions. The dwell time correction applied by the manufacturer compensates for the transit dose between the dwell positions, leading to a maximum overdose of 41 mGy for the considered cases, assuming an air-kerma strength of 48 000 U. The transit dose component is not uniformly distributed, leading to over- and underdoses that remain within 1.4% for commonly prescribed doses (3–10 Gy). Conclusions: The source maintains its speed even for the short interdwell distances. Dose variations due to the transit dose component are much lower than the prescribed treatment doses for brachytherapy, although the transit dose component should be evaluated individually for clinical cases.
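To put the reported speeds in perspective, the transit time between two dwell positions is simply the interdwell distance divided by the source speed. A sketch using the average speed quoted above (a simplification, since the measured profile varies between 0 and 81 cm/s):

```python
def transit_time_s(interdwell_cm, speed_cm_per_s=33.0):
    """Time the source spends in transit between two dwell positions,
    assuming a constant average speed. The transit dose component is
    proportional to this time for a given source strength."""
    return interdwell_cm / speed_cm_per_s

# The longest interdwell distance studied (5 cm) takes on the order of 0.15 s,
# comparable to the shortest programmed dwell time (0.1 s).
t = transit_time_s(5.0)
```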

  8. Lights, Camera, Action! A Guide to Using Video Production and Instruction in the Classroom.

    ERIC Educational Resources Information Center

    Limpus, Bruce

    This instructional guide offers practical ideas for incorporating video production in the classroom. Aspects of video production are presented sequentially. Strategies and suggestions are given for using video production to reinforce traditional subject content and provide interdisciplinary connections. The book is organized in two parts. After…

  9. Lights, Camera, Action: Advancing Learning, Research, and Program Evaluation through Video Production in Educational Leadership Preparation

    ERIC Educational Resources Information Center

    Friend, Jennifer; Militello, Matthew

    2015-01-01

    This article analyzes specific uses of digital video production in the field of educational leadership preparation, advancing a three-part framework that includes the use of video in (a) teaching and learning, (b) research methods, and (c) program evaluation and service to the profession. The first category within the framework examines videos…

  10. New fully electronic streak camera based on intensified picosecond tube

    NASA Astrophysics Data System (ADS)

    Imhoff, Claude; Eumurian, Gregoire M.; Pastre, Jean-Luc

    1993-10-01

    Thomson-CSF is introducing a new camera, the NUCAM TSN 906, especially designed for measurement of ultra-fast light phenomena. The camera features a temporal resolution of less than three picoseconds over a broad light spectrum through the use of an S20, S25, or S1 photocathode. It is designed for ease of use in industrial as well as laboratory environments. In previous camera generations, the image was either acquired on photographic film or through adaptation of an external video camera. With the TSN 906, electronic image acquisition is standard. NUCAM integrates an intensified image converter tube, a 512 x 512 pixel CCD sensor, image memory and a GPIB interface in the same housing. This original design produces a very compact, low-cost camera. The camera can be used locally, by displaying the image on a video monitor, or remote-controlled via a microcomputer fitted with an interface board and control software.

  11. CCD Memory

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliot, Tom; Norris, Dave; Vescelus, Fred

    1987-01-01

    CCD memory device yields over 6.4 × 10⁸ levels of information on a single chip. Charge-coupled device (CCD) demonstrated to operate as either read-only memory (ROM) or photon-programmable memory with capacity of 640,000 bits, with each bit capable of being weighted to more than 1,000 discrete analog levels. Larger memory capacities now possible using proposed approach in conjunction with CCDs now being fabricated, which yield over 4 × 10⁹ discrete levels of information on a single chip.

  12. On the use of Video Camera Systems in the Detection of Kuiper Belt Objects by Stellar Occultations

    NASA Astrophysics Data System (ADS)

    Subasinghe, Dilini

    2012-10-01

    Due to the distance between us and the Kuiper Belt, direct detection of Kuiper Belt Objects (KBOs) is not currently possible for objects less than 10 km in diameter. Indirect methods such as stellar occultations must be employed to probe these bodies remotely. The size and shape of a body, as well as information about its atmospheric properties and ring system (if any), can be collected through observations of stellar occultations. This method has been used previously with some success: Roques et al. (2006) detected 3 Trans-Neptunian objects; Schlichting et al. (2009) detected a single object in archival data. However, previous assessments of KBO occultation detection rates have been calculated only for telescopes; we extend this method to video camera systems. Building on Roques & Moncuquet (2000), we present a derivation that can be applied to any video camera system, taking into account camera specifications and diffraction effects. This allows a determination of the number of observable KBO occultations per night. Example calculations are presented for some of the automated meteor camera systems currently in use at the University of Western Ontario. The results of this project will allow us to refine and improve our own camera system, as well as allow others to enhance their systems for KBO detection. Roques, F., Doressoundiram, A., Dhillon, V., Marsh, T., Bickerton, S., Kavelaars, J. J., Moncuquet, M., Auvergne, M., Belskaya, I., Chevreton, M., Colas, F., Fernandez, A., Fitzsimmons, A., Lecacheux, J., Mousis, O., Pau, S., Peixinho, N., & Tozzi, G. P. (2006). The Astronomical Journal, 132(2), 819-822. Roques, F., & Moncuquet, M. (2000). Icarus, 147(2), 530-544. Schlichting, H. E., Ofek, E. O., Wenz, M., Sari, R., Gal-Yam, A., Livio, M., Nelan, E., & Zucker, S. (2009). Nature, 462(7275), 895-897.
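The diffraction effects such a derivation must account for are set by the Fresnel scale, F = sqrt(lambda*d/2): occultation features much smaller than F are washed out by diffraction. A sketch of that standard calculation with illustrative values (550 nm light, a typical Kuiper Belt distance of 40 AU):

```python
import math

AU_M = 1.496e11  # astronomical unit in metres

def fresnel_scale_m(wavelength_m, distance_m):
    """Fresnel scale F = sqrt(lambda * d / 2). An occulting KBO with
    radius comparable to F is near the diffraction-limited detection
    floor of the occultation method."""
    return math.sqrt(wavelength_m * distance_m / 2.0)

# Visible light at 40 AU gives a Fresnel scale of roughly 1.3 km,
# which is why sub-kilometre KBOs are so hard to detect this way.
F = fresnel_scale_m(550e-9, 40 * AU_M)
```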

  13. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, B.T.; Yates, G.J.

    1992-06-09

    An electronic method for eliminating artifacts in a video camera employing a charge coupled device (CCD) as an image sensor is disclosed. The method comprises the step of initializing the camera prior to normal readout and includes a first dump cycle period, for transferring radiation-generated charge into the horizontal register while the decaying image on the phosphor being imaged is integrated in the photosites, and a second dump cycle period, occurring after the phosphor image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers. Image charge is then transferred from the photosites to the vertical registers and read out in conventional fashion. The inventive method allows the video camera to be used in environments having high ionizing radiation content, and to capture images of events of very short duration occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers, and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites. 3 figs.

  14. Method for eliminating artifacts in CCD imagers

    DOEpatents

    Turko, Bojan T.; Yates, George J.

    1992-01-01

    An electronic method for eliminating artifacts in a video camera (10) employing a charge coupled device (CCD) (12) as an image sensor. The method comprises the step of initializing the camera (10) prior to normal read out and includes a first dump cycle period (76) for transferring radiation generated charge into the horizontal register (28) while the decaying image on the phosphor (39) being imaged is being integrated in the photosites, and a second dump cycle period (78), occurring after the phosphor (39) image has decayed, for rapidly dumping unwanted smear charge which has been generated in the vertical registers (32). Image charge is then transferred from the photosites (36) and (38) to the vertical registers (32) and read out in conventional fashion. The inventive method allows the video camera (10) to be used in environments having high ionizing radiation content, and to capture images of events of very short duration and occurring either within or outside the normal visual wavelength spectrum. Resultant images are free from ghost and smear phenomena caused by insufficient opacity of the registers (28) and (32), and are also free from random damage caused by ionization charges which exceed the charge limit capacity of the photosites (36) and (37).
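The patented sequence reads as a fixed order of operations: initialize, first dump cycle (flush the horizontal register), second dump cycle after phosphor decay (flush the vertical registers), then transfer image charge and read out. A toy sketch of that bookkeeping, assuming register contents can be represented as simple numbers:

```python
def artifact_free_readout(photosite_charge, smear_charge):
    """Toy model of the two-dump-cycle read-out sequence: unwanted charge
    is flushed from both registers before any image charge is transferred,
    so the value finally read out contains only the integrated image."""
    horizontal = smear_charge   # unwanted radiation/smear charge so far
    vertical = smear_charge
    # First dump cycle: flush the horizontal register while the
    # photosites keep integrating the decaying phosphor image.
    horizontal = 0.0
    # Second dump cycle (after phosphor decay): flush the vertical registers.
    vertical = 0.0
    # Transfer image charge from the photosites to the vertical
    # registers, then read out in conventional fashion.
    vertical += photosite_charge
    return vertical

signal = artifact_free_readout(photosite_charge=100.0, smear_charge=25.0)
# The read-out charge contains only the integrated image, no smear.
```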

  15. Advanced Video Data-Acquisition System For Flight Research

    NASA Technical Reports Server (NTRS)

    Miller, Geoffrey; Richwine, David M.; Hass, Neal E.

    1996-01-01

    Advanced video data-acquisition system (AVDAS) developed to satisfy variety of requirements for in-flight video documentation. Requirements range from providing images for visualization of airflows around fighter airplanes at high angles of attack to obtaining safety-of-flight documentation. F/A-18 AVDAS takes advantage of very capable systems like the NITE Hawk forward-looking infrared (FLIR) pod and recent video developments like miniature charge-coupled-device (CCD) color video cameras and other flight-qualified video hardware.

  16. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    SciTech Connect

    Ingram, S; Rao, A; Wendt, R; Castillo, R; Court, L; Yang, J; Beadle, B

    2014-06-01

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination.
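The triangulation step mentioned above (recovering 3D point locations from two camera views) can be illustrated with the classic midpoint method, which finds the point closest to both viewing rays. This is a generic sketch; the camera centres and ray directions below are hypothetical and unrelated to the study's endoscope data:

```python
def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation: find the points on two rays (origins o1, o2;
    direction vectors d1, d2) that minimize their mutual distance, then
    return the average of those two points as the 3D estimate."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    sub = lambda u, v: tuple(a - b for a, b in zip(u, v))
    w0 = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b            # zero only for parallel rays
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    p1 = tuple(o + s * x for o, x in zip(o1, d1))
    p2 = tuple(o + t * x for o, x in zip(o2, d2))
    return tuple((u + v) / 2.0 for u, v in zip(p1, p2))

# Two hypothetical camera centres both viewing the point (1, 1, 2):
p = triangulate_midpoint((0, 0, 0), (1, 1, 2), (2, 0, 0), (-1, 1, 2))
```

With noisy, non-intersecting rays (the realistic case the abstract alludes to), the midpoint is a reasonable compromise estimate of the 3D point.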

  18. Lights, Camera, Action! Learning about Management with Student-Produced Video Assignments

    ERIC Educational Resources Information Center

    Schultz, Patrick L.; Quinn, Andrew S.

    2014-01-01

    In this article, we present a proposal for fostering learning in the management classroom through the use of student-produced video assignments. We describe the potential for video technology to create active learning environments focused on problem solving, authentic and direct experiences, and interaction and collaboration to promote student…

  20. In-situ measurements of alloy oxidation/corrosion/erosion using a video camera and proximity sensor with microcomputer control

    NASA Technical Reports Server (NTRS)

    Deadmore, D. L.

    1984-01-01

    Two noncontacting and nondestructive, remotely controlled methods of measuring the progress of oxidation/corrosion/erosion of metal alloys exposed to flame test conditions are described. The external diameter of a sample under test in a flame was measured by a video camera width-measurement system. An eddy current proximity probe system, for measurements outside of the flame, was also developed and tested. The two techniques were applied to the measurement of the oxidation of 304 stainless steel at 910 C using a Mach 0.3 flame. The eddy current probe system yielded a recession rate of 0.41 mil of diameter loss per hour, and the video system gave 0.27.

  1. 241-AZ-101 Waste Tank Color Video Camera System Shop Acceptance Test Report

    SciTech Connect

    WERRY, S.M.

    2000-03-23

    This report includes shop acceptance test results. The test was performed prior to installation at tank AZ-101. Both the camera system and camera purge system were originally sought and procured as a part of initial waste retrieval project W-151.

  2. Hand-gesture extraction and recognition from the video sequence acquired by a dynamic camera using condensation algorithm

    NASA Astrophysics Data System (ADS)

    Luo, Dan; Ohya, Jun

    2009-01-01

    To achieve environments in which humans and mobile robots co-exist, technologies for recognizing hand gestures from video sequences acquired by a dynamic camera could be useful for human-to-robot interface systems. Most conventional hand-gesture technologies deal only with still-camera images. This paper proposes a very simple and stable method for extracting hand motion trajectories based on the Human-Following Local Coordinate System (HFLC System), which is obtained from the located human face and both hands. We then apply the Condensation algorithm to the extracted hand trajectories so that the hand motion is recognized. We demonstrate the effectiveness of the proposed method by conducting experiments on 35 kinds of sign-language-based hand gestures.
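The Condensation algorithm applied to the trajectories is a particle filter: propagate a weighted sample set through a motion model, reweight by the observation likelihood, and resample. A minimal 1-D sketch with illustrative Gaussian models (not the models used in the paper):

```python
import math
import random

def condensation_step(particles, observation, motion_noise=0.5, obs_noise=1.0):
    """One predict-weight-resample cycle of the Condensation (particle
    filter) algorithm for a 1-D state, e.g. one hand coordinate."""
    # Predict: diffuse each particle through a random-walk motion model.
    predicted = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: Gaussian observation likelihood around the measurement.
    weights = [math.exp(-((p - observation) ** 2) / (2 * obs_noise ** 2))
               for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw new particles in proportion to their weights.
    return random.choices(predicted, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-5.0, 5.0) for _ in range(200)]
for obs in [0.5, 0.8, 1.0, 1.2]:        # noisy 1-D hand-position measurements
    particles = condensation_step(particles, obs)
estimate = sum(particles) / len(particles)   # concentrates near the track
```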

  3. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras

    PubMed Central

    Troscianko, Jolyon; Rutz, Christian

    2015-01-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types—hooked stick tools—under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an ‘expanded’ foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  4. Studying complex decision making in natural settings: using a head-mounted video camera to study competitive orienteering.

    PubMed

    Omodei, M M; McLennan, J

    1994-12-01

    Head-mounted video recording is described as a potentially powerful method for studying decision making in natural settings. Most alternative data-collection procedures are intrusive and disruptive of the decision-making processes involved while conventional video-recording procedures are either impractical or impossible. As a severe test of the robustness of the methodology we studied the decision making of 6 experienced orienteers who carried a head-mounted light-weight video camera as they navigated, running as fast as possible, around a set of control points in a forest. Use of the Wilcoxon matched-pairs signed-ranks test indicated that compared with free recall, video-assisted recall evoked (a) significantly greater experiential immersion in the recall, (b) significantly more specific recollections of navigation-related thoughts and feelings, (c) significantly more realizations of map and terrain features and aspects of running speed which were not noticed at the time of actual competition, and (d) significantly greater insight into specific navigational errors and the intrusion of distracting thoughts into the decision-making process. Potential applications of the technique in (a) the environments of emergency services, (b) therapeutic contexts, (c) education and training, and (d) sports psychology are discussed. PMID:7870526

  5. Activity profiles and hook-tool use of New Caledonian crows recorded by bird-borne video cameras.

    PubMed

    Troscianko, Jolyon; Rutz, Christian

    2015-12-01

    New Caledonian crows are renowned for their unusually sophisticated tool behaviour. Despite decades of fieldwork, however, very little is known about how they make and use their foraging tools in the wild, which is largely owing to the difficulties in observing these shy forest birds. To obtain first estimates of activity budgets, as well as close-up observations of tool-assisted foraging, we equipped 19 wild crows with self-developed miniature video cameras, yielding more than 10 h of analysable video footage for 10 subjects. While only four crows used tools during recording sessions, they did so extensively: across all 10 birds, we conservatively estimate that tool-related behaviour occurred in 3% of total observation time, and accounted for 19% of all foraging behaviour. Our video-loggers provided first footage of crows manufacturing, and using, one of their most complex tool types--hooked stick tools--under completely natural foraging conditions. We recorded manufacture from live branches of paperbark (Melaleuca sp.) and another tree species (thought to be Acacia spirorbis), and deployment of tools in a range of contexts, including on the forest floor. Taken together, our video recordings reveal an 'expanded' foraging niche for hooked stick tools, and highlight more generally how crows routinely switch between tool- and bill-assisted foraging. PMID:26701755

  6. CCD imaging systems for DEIMOS

    NASA Astrophysics Data System (ADS)

    Wright, Christopher A.; Kibrick, Robert I.; Alcott, Barry; Gilmore, David K.; Pfister, Terry; Cowley, David J.

    2003-03-01

    The DEep Imaging Multi-Object Spectrograph (DEIMOS) images with an 8K x 8K science mosaic composed of eight 2K x 4K MIT/Lincoln Lab (MIT/LL) CCDs. It also incorporates two 1200 x 600 Orbit Semiconductor CCDs for active, closed-loop flexure compensation. The science mosaic CCD controller system reads out all eight science CCDs in 40 seconds while maintaining the low noise floor of the MIT/Lincoln Lab CCDs. The flexure compensation (FC) CCD controller reads out the FC CCDs several times per minute during science mosaic exposures. The science mosaic CCD controller and the FC CCD controller are located on the electronics ring of DEIMOS. Both the MIT/Lincoln Lab CCDs and the Orbit flexure compensation CCDs and their associated cabling and printed circuit boards are housed together in the same detector vessel that is approximately 10 feet away from the electronics ring. Each CCD controller has a modular hardware design and is based on the San Diego State University (SDSU) Generation 2 (SDSU-2) CCD controller. Provisions have been made to the SDSU-2 video board to accommodate external CCD preamplifiers that are located at the detector vessel. Additional circuitry has been incorporated in the CCD controllers to allow the readback of all clocks and bias voltages for up to eight CCDs, to allow up to 10 temperature monitor and control points of the mosaic, and to allow full-time monitoring of power supplies and proper power supply sequencing. Software control features of the CCD controllers are: software selection between multiple mosaic readout modes, readout speeds, selectable gains, ramped parallel clocks to eliminate spurious charge on the CCDs, constant temperature monitoring and control of each CCD within the mosaic, proper sequencing of the bias voltages of the CCD output MOSFETs, and anti-blooming operation of the science mosaic. We cover both the hardware and software highlights of both of these CCD controller systems as well as their respective performance.

  7. Lights, Camera: Learning! Findings from studies of video in formal and informal science education

    NASA Astrophysics Data System (ADS)

    Borland, J.

    2013-12-01

    As part of the panel, media researcher Jennifer Borland will highlight findings from a variety of studies of videos across the spectrum of formal to informal learning, including schools, museums, and viewers' homes. In her presentation, Borland will assert that the viewing context matters a great deal, but that there are some general take-aways that can be extrapolated to the use of educational video in a variety of settings. Borland has served as an evaluator on several video-related projects funded by NASA and the National Science Foundation, including: Data Visualization videos and Space Shows developed by the American Museum of Natural History, DragonflyTV, Earth: The Operators' Manual, The Music Instinct and Time Team America.

  8. Lights, camera, action…critique? Submit videos to AGU communications workshop

    NASA Astrophysics Data System (ADS)

    Viñas, Maria-José

    2011-08-01

    What does it take to create a science video that engages the audience and draws thousands of views on YouTube? Those interested in finding out should submit their research-related videos to AGU's Fall Meeting science film analysis workshop, led by oceanographer turned documentary director Randy Olson. Olson, writer-director of two films (Flock of Dodos: The Evolution-Intelligent Design Circus and Sizzle: A Global Warming Comedy) and author of the book Don't Be Such a Scientist: Talking Substance in an Age of Style, will provide constructive criticism on 10 selected video submissions, followed by moderated discussion with the audience. To submit your science video (5 minutes or shorter), post it on YouTube and send the link to the workshop coordinator, Maria-José Viñas (mjvinas@agu.org), with the following subject line: Video submission for Olson workshop. AGU will be accepting submissions from researchers and media officers of scientific institutions until 6:00 P.M. eastern time on Friday, 4 November. Those whose videos are selected to be screened will be notified by Friday, 18 November. All are welcome to attend the workshop at the Fall Meeting.

  9. Ground and aerial use of an infrared video camera with a mid-infrared filter (1.45 to 2.0 microns)

    NASA Astrophysics Data System (ADS)

    Everitt, J. H.; Escobar, D. E.; Nixon, P. R.; Hussey, M. A.; Blazquez, C. H.

    1986-01-01

    A black-and-white infrared (0.9 to 2.2 micron) video camera, filtered to record radiation within the 1.45 to 2.0 micron mid-infrared water absorption region, was evaluated in ground and aerial studies. Imagery of single leaves of seven plant species (four succulent; three nonsucculent) showed that succulent leaves were easily distinguishable from nonsucculent leaves. Spectrophotometric leaf reflectance measurements made over the 1.45 to 2.0 micron region confirmed the imagery results. Ground-based video recordings also showed that severely drought-stressed buffelgrass (Cenchrus ciliaris L.) plants were distinguishable from nonstressed and moderately stressed plants. Moreover, the camera provided airborne imagery that clearly differentiated between irrigated and nonirrigated grass plots. Because of the lower radiation intensity in the mid-infrared spectral region and the low sensitivity of the camera's tube, these video images were not as sharp as those obtained by visible or visible/near-infrared video cameras. Nevertheless, these results showed that a video camera with mid-infrared sensitivity has potential for use in remote sensing research and applications.

  10. Development of observation method for hydrothermal flows with acoustic video camera

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Asada, A.; Kinoshita, M.; Tamura, H.; Tamaki, K.

    2011-12-01

    DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a rapid enough refresh rate that it can substitute for an optical system in turbid or dark water where optical systems fail. The Institute of Industrial Science, University of Tokyo (IIS) recognized DIDSON's superior performance and set out to develop a new DIDSON-based observation method for hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as detail inside the plume. In October 2009, we conducted seafloor reconnaissance using the manned deep-sea submersible Shinkai 6500 on the Central Indian Ridge at 18–20° S, where hydrothermal plume signatures had previously been perceived. DIDSON was mounted on top of Shinkai 6500 in order to capture acoustic video images of hydrothermal plumes. Acoustic video images of the plumes were captured on three of seven dives; these are among the very few acoustic video images of hydrothermal plumes obtained to date. We could identify shadings inside the acoustic video images of the plumes. Silhouettes of the plumes varied from second to second, and the shadings inside them varied their shapes, too. These variations corresponded to internal structures and flows of the plumes. We are analyzing the acoustic video images in order to deduce information on the internal structures and flows of the plumes. In parallel, we are preparing a tank experiment to obtain acoustic video images of water flow at controlled flow rates; its purpose is to understand the relation between flow rate and acoustic video image quantitatively. Results from this experiment will support the aforementioned image analysis of the hydrothermal plume data from the Central Indian Ridge. We will report an overview of the image analysis and the tank experiments, and discuss the possibility of DIDSON as an observation tool for seafloor hydrothermal activity.

  11. Applying compressive sensing to TEM video: A substantial frame rate increase on any camera

    SciTech Connect

    Stevens, Andrew; Kovarik, Libor; Abellan, Patricia; Yuan, Xin; Carin, Lawrence; Browning, Nigel D.

    2015-08-13

    One of the main limitations of imaging at high spatial and temporal resolution during in-situ transmission electron microscopy (TEM) experiments is the frame rate of the camera being used to image the dynamic process. While the recent development of direct detectors has provided the hardware to achieve frame rates approaching 0.1 ms, the cameras are expensive and must replace existing detectors. In this paper, we examine the use of coded aperture compressive sensing (CS) methods to increase the frame rate of any camera with simple, low-cost hardware modifications. The coded aperture approach allows multiple sub-frames to be coded and integrated into a single camera frame during the acquisition process, and then extracted upon readout using statistical CS inversion. Here we describe the background of CS and statistical methods in depth and simulate the frame rates and efficiencies for in-situ TEM experiments. Depending on the resolution and signal/noise of the image, it should be possible to increase the speed of any camera by more than an order of magnitude using this approach.
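The coded-aperture acquisition described above can be written as a simple forward model: each temporal sub-frame is multiplied by a binary mask, and the masked sub-frames are summed into the single frame the detector actually reads out. A sketch under that assumption (random masks for illustration; the statistical CS inversion used for recovery is omitted):

```python
import random

def coded_exposure_frame(sub_frames, masks):
    """Forward model of coded-aperture temporal compressive sensing:
    y[i] = sum over sub-frames t of mask[t][i] * x[t][i]. Many temporal
    sub-frames are folded into one coded camera frame at readout."""
    n_pix = len(sub_frames[0])
    frame = [0.0] * n_pix
    for sub, mask in zip(sub_frames, masks):
        for i in range(n_pix):
            frame[i] += sub[i] * mask[i]
    return frame

random.seed(1)
T, N = 8, 16                                   # sub-frames per readout, pixels
subs = [[random.random() for _ in range(N)] for _ in range(T)]
masks = [[random.randint(0, 1) for _ in range(N)] for _ in range(T)]
y = coded_exposure_frame(subs, masks)          # what the detector records
```

Recovering the T sub-frames from y is the underdetermined inverse problem that the statistical CS inversion solves, giving the claimed order-of-magnitude frame-rate increase.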

  13. Internet Telepresence by Real-Time View-Dependent Image Generation with Omnidirectional Video Camera

    NASA Astrophysics Data System (ADS)

    Morita, Shinji; Yamazawa, Kazumasa; Yokoya, Naokazu

    2003-01-01

    This paper describes a new networked telepresence system which realizes virtual tours into a visualized dynamic real world without significant time delay. Our system is realized by the following three steps: (1) video-rate omnidirectional image acquisition, (2) transportation of an omnidirectional video stream via internet, and (3) real-time view-dependent perspective image generation from the omnidirectional video stream. Our system is applicable to real-time telepresence in the situation where the real world to be seen is far from an observation site, because the time delay from the change of the user's viewing direction to the change of displayed image is small and does not depend on the actual distance between the two sites. Moreover, multiple users can look around from a single viewpoint in a visualized dynamic real world in different directions at the same time. In experiments, we have proved that the proposed system is useful for internet telepresence.

  14. Compact all-CMOS spatiotemporal compressive sensing video camera with pixel-wise coded exposure.

    PubMed

    Zhang, Jie; Xiong, Tao; Tran, Trac; Chin, Sang; Etienne-Cummings, Ralph

    2016-04-18

    We present a low power all-CMOS implementation of temporal compressive sensing with pixel-wise coded exposure. This image sensor can increase video pixel resolution and frame rate simultaneously while reducing data readout speed. Compared to previous architectures, this system modulates pixel exposure at the individual photo-diode electronically without external optical components. Thus, the system provides a reduction in size and power compared to previous optics-based implementations. The prototype image sensor (127 × 90 pixels) can reconstruct 100 fps videos from coded images sampled at 5 fps. With a 20× reduction in readout speed, our CMOS image sensor only consumes 14μW to provide 100 fps videos. PMID:27137331
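The pixel-wise coded exposure idea can be sketched as a forward model (the sizes and the exposure-window length here are illustrative assumptions, not the sensor's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed sizes: 20 sub-frames per coded frame (100 fps recovered from 5 fps)
T, H, W = 20, 32, 32
video = rng.random((T, H, W))

# Pixel-wise coded exposure: each photodiode integrates over one contiguous
# window of L sub-frames starting at its own random offset.
L = 4
start = rng.integers(0, T - L + 1, size=(H, W))
t = np.arange(T)[:, None, None]
exposure_code = (t >= start) & (t < start + L)       # (T, H, W) boolean mask

coded_frame = (video * exposure_code).sum(axis=0)    # one readout per T steps
```

Because each pixel's on-window is staggered in time, one coded frame retains information about all T sub-frames, which the reconstruction then exploits.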

  15. Video imaging system and thermal mapping of the molten hearth in an electron beam melting furnace

    SciTech Connect

    Miszkiel, M.E.; Davis, R.A.; Van Den Avyle, J.A.

    1995-12-31

    This project was initiated to develop an enhanced video imaging system for the Liquid Metal Processing Laboratory Electron Beam Melting (EB) Furnace at Sandia and to use color video images to map the temperature distribution of the surface of the molten hearth. In a series of test melts, the color output of the video image was calibrated against temperatures measured by an optical pyrometer and CCD camera viewing port above the molten pool. To prevent potential metal vapor deposition onto line-of-sight optical surfaces above the pool, argon backfill was used along with a pinhole aperture to obtain the video image. The geometry of the optical port to the hearth set the limits for the focus lens and the CCD camera's field of view. Initial melts were completed with the pyrometer and pinhole aperture port in a fixed position. Using commercially available vacuum components, a second flange assembly was constructed to provide flexibility in choosing pyrometer target sights on the hearth and to adjust the field of view for the focus lens/CCD combination. RGB video images processed from the melts verified that red wavelength light captured with the video camera could be calibrated with the optical pyrometer target temperatures and used to generate temperature maps of the hearth surface. Two-color ratio thermal mapping using red and green video images, which has theoretical advantages, was less successful due to probable camera non-linearities in the red and green image intensities.
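The theoretical advantage of two-color ratio mapping is that, under the Wien approximation, a gray-body emissivity cancels in the red/green intensity ratio. A minimal sketch, with assumed channel wavelengths (620 nm and 540 nm are illustrative values, not the paper's calibration):

```python
import numpy as np

C2 = 1.4388e-2                    # second radiation constant, m*K
LAM_R, LAM_G = 620e-9, 540e-9     # assumed red/green channel wavelengths, m

def two_color_temperature(i_red, i_green):
    """Gray-body temperature (K) from the red/green intensity ratio using
    the Wien approximation; emissivity cancels in the ratio, which is the
    theoretical advantage of two-color thermal mapping."""
    r = i_red / i_green
    return C2 * (1 / LAM_G - 1 / LAM_R) / (np.log(r) + 5 * np.log(LAM_R / LAM_G))
```

Camera non-linearities in either channel distort the ratio r directly, which is consistent with the difficulty the authors report.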

  16. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation, and a test of this method.
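A sketch of the kind of dark-current adjustment described: a standard mean-variance (photon transfer) gain estimate from a pair of flat-field frames, with a pair of equal-exposure dark frames subtracted out. This illustrates the idea, not the article's exact equation:

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Photon-transfer gain estimate (e-/ADU) from two identical flat-field
    frames, with the dark-current signal and noise removed using two dark
    frames of the same exposure time."""
    signal = (flat1.mean() + flat2.mean()) - (dark1.mean() + dark2.mean())
    # Differencing paired frames cancels fixed-pattern noise; subtracting
    # the dark-pair variance removes read-noise and dark-shot-noise terms.
    noise_var = (np.subtract(flat1, flat2, dtype=float).var()
                 - np.subtract(dark1, dark2, dtype=float).var())
    return signal / noise_var
```

On a simulated sensor with Poisson photon and dark-current statistics, this recovers the true gain even when the dark signal is a sizable fraction of the flat-field signal.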

  17. Visual surveys can reveal rather different 'pictures' of fish densities: Comparison of trawl and video camera surveys in the Rockall Bank, NE Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    McIntyre, F. D.; Neat, F.; Collie, N.; Stewart, M.; Fernandes, P. G.

    2015-01-01

    Visual surveys allow non-invasive sampling of organisms in the marine environment, which is of particular importance in deep-sea habitats that are vulnerable to damage caused by destructive sampling devices such as bottom trawls. To enable visual surveying at depths greater than 200 m, we used a deep towed video camera system to survey large areas around the Rockall Bank in the North East Atlantic. The area of seabed sampled was similar to that sampled by a bottom trawl, enabling samples from the towed video camera system to be compared with trawl sampling to quantitatively assess the numerical density of deep-water fish populations. The two survey methods provided different results for certain fish taxa and comparable results for others. Fish that exhibited a detectable avoidance behaviour to the towed video camera system, such as the Chimaeridae, resulted in mean density estimates that were significantly lower (121 fish/km2) than those determined by trawl sampling (839 fish/km2). On the other hand, skates and rays showed no reaction to the lights in the towed body of the camera system, and mean density estimates of these were an order of magnitude higher (64 fish/km2) than the trawl (5 fish/km2). This is probably because these fish can pass under the footrope of the trawl due to their flat body shape lying close to the seabed but are easily detected by the benign towed video camera system. For other species, such as Molva sp., estimates of mean density were comparable between the two survey methods (towed camera, 62 fish/km2; trawl, 73 fish/km2). The towed video camera system presented here can be used as an alternative benign method for providing indices of abundance for species such as ling in areas closed to trawling, or for those fish that are poorly monitored by trawl surveying in any area, such as the skates and rays.
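The density estimates above come down to dividing counts by the seabed area swept by the camera's field of view; a trivial sketch with hypothetical values (the tow length and swath width below are not from the paper):

```python
def density_per_km2(n_fish, tow_length_km, swath_width_m):
    """Numerical density from a video transect: fish counted over the area
    swept by the camera's field of view (all input values hypothetical)."""
    swept_area_km2 = tow_length_km * (swath_width_m / 1000.0)
    return n_fish / swept_area_km2
```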

  18. Lights! Camera! Action! Producing Library Instruction Video Tutorials Using Camtasia Studio

    ERIC Educational Resources Information Center

    Charnigo, Laurie

    2009-01-01

    From Web guides to online tutorials, academic librarians are increasingly experimenting with many different technologies in order to meet the needs of today's growing distance education populations. In this article, the author discusses one librarian's experience using Camtasia Studio to create subject specific video tutorials. Benefits, as well…

  19. Evaluation of a 0.9- to 2.2-microns sensitive video camera with a mid-infrared filter (1.45- to 2.0-microns)

    NASA Astrophysics Data System (ADS)

    Everitt, J. H.; Escobar, D. E.; Nixon, P. R.; Blazquez, C. H.; Hussey, M. A.

    The application of 0.9- to 2.2-microns sensitive black and white IR video cameras to remote sensing is examined. Field and laboratory recordings of the upper and lower surface of peperomia leaves, succulent prickly pear, and buffelgrass are evaluated; the reflectance, phytomass, green weight, and water content for the samples were measured. The data reveal that 0.9- to 2.2-microns video cameras are effective tools for laboratory and field research; however, the resolution and image quality of the data are poor compared to visible and near-IR images.

  20. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor Active Pixel Sensor (CMOS), establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  1. Laboratory Test of CCD #1 in BOAO

    NASA Astrophysics Data System (ADS)

    Park, Byeong-Gon; Chun, Moo Young; Kim, Seung-Lee

    1995-12-01

    An introduction to the first CCD camera system in Bohyunsan Optical Astronomy Observatory (CCD#1) is presented. The CCD camera adopts the modular dewar design of IfA (Institute for Astronomy at Hawaii University) and the SDSU (San Diego State University) general purpose CCD controller. The user interface is based on the IfA design of an easy-to-use GUI program running on the NeXT workstation. The characteristics of CCD#1, including gain, charge transfer efficiency, rms read-out noise, linearity, and dynamic range, are tested and discussed. CCD#1 shows 6.4 electrons RON and a gain of 3.49 electrons per ADU, and the optimization resulted in a readout time of about 27 seconds, guaranteeing a charge transfer efficiency of 0.99999 in both directions. The linearity test shows that the non-linearity coefficient is 6e-7 in the range of 0 to 30,000 ADU.
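Read-out noise figures like the 6.4 electrons quoted above are commonly derived from a pair of bias frames; a sketch of that standard estimate (assumed to be comparable to, not identical with, the BOAO test procedure):

```python
import numpy as np

def read_noise_electrons(bias1, bias2, gain_e_per_adu):
    """RMS read-out noise in electrons from a pair of bias frames: the
    difference cancels fixed pattern, and sqrt(2) undoes the variance
    doubling introduced by subtracting two frames."""
    diff = np.subtract(bias1, bias2, dtype=float)
    return gain_e_per_adu * diff.std() / np.sqrt(2)
```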

  2. High-speed flow visualization with a new digital video camera

    NASA Astrophysics Data System (ADS)

    Volpe, Jason

    2005-11-01

    Scientific photography opened new vistas upon high-speed physics in the previous century. Now, high-speed digital cameras are becoming available to replace the older photographic technology with similar speed, resolution, and light sensitivity but vastly better utility and user-friendliness. Here we apply a Photron Fastcam APX-RS digital camera that is capable of megapixel image resolution at 3000 frames/sec up to 250,000 frames/sec at lower resolution. Frame exposure is separately adjustable down to 1 microsecond. Several of the "icons" of high-speed flow visualization are repeated here, including firecracker and gram-range explosions, popping a champagne cork, vortex rings, shock emergence from a shock tube, the splash of a milk drop, and the burst of a toy balloon. Many of these visualizations utilize traditional schlieren or shadowgraph optics to show shock wave propagation. Still frames and brief movies will be shown.

  3. A risk-based coverage model for video surveillance camera control optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua

    2015-12-01

    Visual surveillance systems for law enforcement or police case investigation are different from traditional applications, since they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the targets and events being monitored, and risk entropy is introduced to model the requirements that a police surveillance task places on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset field-of-view (FoV) positions of PTZ cameras.

  4. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm in a Charge-Coupled-Device (CCD) camera, based on the simplest concept that each pixel is a charge bucket whose charges come from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we allow each working component to be altered by at most one minimal step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data-storage savings are immense, and their order of magnitude is inversely proportional to the target's angular speed. We designed two new CCD camera components. Owing to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as dual Photon Detector (PD) analog circuitry for change detection, which predicts whether to skip a frame or go forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at the bucket (pixel) level: the charge-transport bias voltage either steers charge toward neighborhood buckets or, if not, sends it to the ground drainage. Since a snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, or the powerful WaveNet Wrapper, at the sensor level. We shall compare (i) pre-processing: FFT, thresholding of the significant Fourier-mode components, and inverse FFT to check the PSNR; and (ii) post-processing: image recovery done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), the SAH circuitry must determine, when selecting new frames, the degree of information (d.o.i.) K(t), which dictates the number of purely random linear sparse combinations of the measurement data à la [Φ]M,N: M(t) = K(t) Log N(t).
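The closing rule M(t) = K(t) Log N(t) and the purely random sparse matrix [Φ] can be sketched as follows (the sizes, sparsity, and Bernoulli matrix are illustrative assumptions, not the authors' hardware values):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 4096     # pixels per frame
K = 50       # degree of information (sparsity), K(t) in the abstract's notation

# Measurement count suggested by the M(t) = K(t) log N(t) rule
M = int(np.ceil(K * np.log(N)))

# Purely random sparse measurement matrix [Phi] (Bernoulli, ~10% nonzeros)
Phi = (rng.random((M, N)) < 0.1).astype(float)

x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.random(K)   # K-sparse frame
y = Phi @ x     # compressed readout: M values instead of N pixels
```

With these numbers, M ≈ 416 measurements stand in for 4096 pixel values, from which the K-sparse frame would be recovered by L1 minimization.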

  5. Study of recognizing multiple persons' complicated hand gestures from the video sequence acquired by a moving camera

    NASA Astrophysics Data System (ADS)

    Dan, Luo; Ohya, Jun

    2010-02-01

    Recognizing hand gestures from the video sequence acquired by a dynamic camera could be a useful interface between humans and mobile robots. We develop a state-based approach to extract and recognize hand gestures from moving-camera images. We improved the Human-Following Local Coordinate (HFLC) System, a very simple and stable method for extracting hand motion trajectories, which is obtained from the located human face, body parts, and the hand blob changing factor. The Condensation algorithm and a PCA-based algorithm were applied to recognize the extracted hand trajectories. In previous research, the Condensation-based method was applied only to one person's hand gestures. In this paper, we propose a principal component analysis (PCA) based approach to improve the recognition accuracy. For further improvement, temporal changes in the observed hand-area changing factor are utilized as new image features to be stored in the database after being analyzed by PCA. Every hand gesture trajectory in the database is classified as a one-hand gesture, a two-hand gesture, or a temporal change in the hand blob. We demonstrate the effectiveness of the proposed method by conducting experiments on 45 kinds of Japanese Sign Language and American Sign Language gestures obtained from 5 people. Our experimental results show that better recognition performance is obtained by the PCA-based approach than by the Condensation-based method.
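A generic sketch of PCA-based trajectory recognition, i.e. PCA projection plus nearest-neighbour matching (the authors' actual features and classifier details are not reproduced here):

```python
import numpy as np

def pca_fit(X, n_components):
    """PCA via SVD of mean-centred feature vectors (one row per gesture
    trajectory)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def nearest_gesture(X_train, labels, mean, components, query):
    """Project everything into the PCA subspace, then return the label of
    the nearest training trajectory."""
    proj = (X_train - mean) @ components.T
    q = (query - mean) @ components.T
    return labels[int(np.argmin(np.linalg.norm(proj - q, axis=1)))]
```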

  6. Design and analysis of filter-based optical systems for spectral responsivity estimation of digital video cameras

    NASA Astrophysics Data System (ADS)

    Chang, Gao-Wei; Jian, Hong-Da; Yeh, Zong-Mu; Cheng, Chin-Pao

    2004-02-01

    In this paper, a filter-based optical system with sophisticated filter selection is designed for estimating the spectral responsivities of digital video cameras. The filter consideration in the presence of noise is central to the optical system design, since the spectral filters primarily prescribe the structure of the perturbed system. A theoretical basis is presented to confirm that sophisticated filter selection can make this system as insensitive to noise as possible. Also, we propose a filter selection method based on the orthogonal-triangular (QR) decomposition with column pivoting (QRCP). To investigate the noise effects, we assess the estimation errors between the actual and estimated spectral responsivities, with different signal-to-noise ratio (SNR) levels of an eight-bit/channel camera. Simulation results indicate that the proposed method yields satisfactory estimation accuracy. That is, the filter-based optical system with the spectral filters selected by the QRCP-based method is much less sensitive to noise than those with filters from other selections.
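The QRCP selection idea, greedily picking the filter columns with the largest residual norms after projecting out the columns already chosen, can be sketched in a few lines (an illustration of the pivoting rule, not the paper's full method):

```python
import numpy as np

def qrcp_select(S, k):
    """Greedy pivot order of QR with column pivoting: repeatedly choose
    the filter spectrum (column of S) with the largest residual norm after
    projecting out the columns already chosen."""
    R = np.array(S, dtype=float)
    chosen = []
    for _ in range(k):
        j = int(np.argmax((R ** 2).sum(axis=0)))   # pivot: largest residual column
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)                    # deflate the chosen direction
        R[:, j] = 0.0                              # guard against re-selection
    return chosen
```

Columns that are nearly linear combinations of already-selected filters get small residuals and are skipped, which is what makes the resulting filter set well-conditioned and noise-insensitive.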

  7. The evolution of the scientific CCD

    NASA Astrophysics Data System (ADS)

    Blouke, M. M.

    2011-03-01

    There is little doubt that the Charge-Coupled Device (CCD) and its cousin the CMOS Active Pixel Sensor (APS) have completely revolutionized the imaging field. It is becoming more and more difficult to obtain film for cameras, and multimegapixel digital cameras are available for everything from cell phones to professional photography. This paper explores some of the origins of the CCD as a scientific sensor.

  8. DrugCam(®)-An intelligent video camera system to make safe cytotoxic drug preparations.

    PubMed

    Benizri, Frédéric; Dalifard, Benoit; Zemmour, Christophe; Henriquet, Maxime; Fougereau, Emmanuelle; Le Franc, Benoit

    2016-04-11

    DrugCam(®) is a new approach to controlling chemotherapy preparations: an intelligent video system that enables automatic verification during the critical stages of preparation, combined with a posteriori control through partial or total visualization of the video recording of the preparations. The assessment covered the recognition of anticancer drug vials (qualitative analysis) and of syringe volumes (quantitative analysis). The qualitative analysis was conducted on a total of 120 vials, with a sensitivity of 100% for 84.2% of the vials and of at least 97% for all vials tested. Accuracy was at least 98.5% for all vials. The quantitative analysis was assessed by detecting 10 measures of each graduation for the syringes. The identification error rate was 2.1% (244/11,640), and almost 94% of these errors were off by only one graduation. Only 3% (35/1164) of the graduations tested, i.e. 23/35 for volumes <0.13ml of 1ml syringes, presented a volume error outside the admissible limit of ±5% of a confidence band constructed for the estimated linear regression line of each syringe. In addition to the vial detection model, barcodes can also be read when they are present on vials. DrugCam(®) offers an innovative approach for controlling chemotherapy preparations and constitutes an optimized application of telepharmacy. PMID:26923317

  9. Introducing Contactless Blood Pressure Assessment Using a High Speed Video Camera.

    PubMed

    Jeong, In Cheol; Finkelstein, Joseph

    2016-04-01

    Recent studies demonstrated that blood pressure (BP) can be estimated using pulse transit time (PTT). For PTT calculation, photoplethysmogram (PPG) is usually used to detect a time lag in pulse wave propagation which is correlated with BP. Until now, PTT and PPG were registered using a set of body-worn sensors. In this study a new methodology is introduced allowing contactless registration of PTT and PPG using a high-speed camera, resulting in corresponding image-based PTT (iPTT) and image-based PPG (iPPG) generation. The iPTT value can potentially be utilized for blood pressure estimation; however, the extent of correlation between iPTT and BP is unknown. The goal of this preliminary feasibility study was to introduce the methodology for contactless generation of iPPG and iPTT and to make an initial estimation of the extent of correlation between iPTT and BP "in vivo." A short cycling exercise was used to generate BP changes in healthy adult volunteers in three consecutive visits. BP was measured by a verified BP monitor simultaneously with iPTT registration at three exercise points: rest, exercise peak, and recovery. iPPG was simultaneously registered at two body locations during the exercise using a high-speed camera at 420 frames per second. iPTT was calculated as the time lag between pulse waves obtained as two iPPGs registered from simultaneous recording of the head and palm areas. The average inter-person correlation between PTT and iPTT was 0.85 ± 0.08. The range of inter-person correlations between PTT and iPTT was from 0.70 to 0.95 (p < 0.05). The average inter-person coefficient of correlation between systolic BP and iPTT was -0.80 ± 0.12. The range of correlations between systolic BP and iPTT was from 0.632 to 0.960 with p < 0.05 for most of the participants. Preliminary data indicated that a high-speed camera can potentially be utilized for unobtrusive contactless monitoring of abrupt blood pressure changes in a variety of settings. The initial prototype system was able to successfully generate an approximation of pulse transit time and showed high intra-individual correlation between iPTT and BP. Further investigation of the proposed approach is warranted. PMID:26791993
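One common way to obtain a transit time from two such waveforms is the peak of their cross-correlation; a sketch under that assumption (not necessarily the authors' exact algorithm), using the study's 420 frames-per-second rate:

```python
import numpy as np

def pulse_transit_time(ppg_head, ppg_palm, fps=420.0):
    """Time lag (s) between two image-based PPG waveforms, found as the
    peak of their cross-correlation; fps matches the 420 frames/s camera
    used in the study."""
    a = ppg_head - np.mean(ppg_head)
    b = ppg_palm - np.mean(ppg_palm)
    xcorr = np.correlate(b, a, mode="full")
    lag_samples = int(np.argmax(xcorr)) - (len(a) - 1)
    return lag_samples / fps
```

A positive result means the palm waveform arrives later than the head waveform, as expected for a pulse propagating away from the heart along a longer arterial path.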

  10. Social Interactions of Juvenile Brown Boobies at Sea as Observed with Animal-Borne Video Cameras

    PubMed Central

    Yoda, Ken; Murakoshi, Miku; Tsutsui, Kota; Kohno, Hiroyoshi

    2011-01-01

    While social interactions play a crucial role in the development of young individuals, those of highly mobile juvenile birds in inaccessible environments are difficult to observe. In this study, we deployed miniaturised video recorders on juvenile brown boobies Sula leucogaster, which had been hand-fed beginning a few days after hatching, to examine how social interactions between tagged juveniles and other birds affected their flight and foraging behaviour. Juveniles flew longer with congeners, especially with adult birds, than solitarily. In addition, approximately 40% of foraging occurred close to aggregations of congeners and other species. Young seabirds voluntarily followed other birds, which may directly enhance their foraging success, improve their foraging and flying skills during the developmental stage, or both. PMID:21573196

  11. A simple, inexpensive video camera setup for the study of avian nest activity

    USGS Publications Warehouse

    Sabine, J.B.; Meyers, J.M.; Schweitzer, S.H.

    2005-01-01

    Time-lapse video photography has become a valuable tool for collecting data on avian nest activity and depredation; however, commercially available systems are expensive (>USA $4000/unit). We designed an inexpensive system to identify causes of nest failure of American Oystercatchers (Haematopus palliatus) and assessed its utility at Cumberland Island National Seashore, Georgia. We successfully identified raccoon (Procyon lotor), bobcat (Lynx rufus), American Crow (Corvus brachyrhynchos), and ghost crab (Ocypode quadrata) predation on oystercatcher nests. Other detected causes of nest failure included tidal overwash, horse trampling, abandonment, and human destruction. System failure rates were comparable with commercially available units. Our system's efficacy and low cost (<$800) provided useful data for the management and conservation of the American Oystercatcher.

  12. Measuring multivariate subjective image quality for still and video cameras and image processing system components

    NASA Astrophysics Data System (ADS)

    Nyman, Göte; Leisti, Tuomas; Lindroos, Paul; Radun, Jenni; Suomi, Sini; Virtanen, Toni; Olives, Jean-Luc; Oja, Joni; Vuori, Tero

    2008-01-01

    The subjective quality of an image is a non-linear product of several, simultaneously contributing subjective factors such as the experienced naturalness, colorfulness, lightness, and clarity. We have studied subjective image quality by using a hybrid qualitative/quantitative method in order to disclose relevant attributes to experienced image quality. We describe our approach in mapping the image quality attribute space in three cases: still studio image, video clips of a talking head and moving objects, and in the use of image processing pipes for 15 still image contents. Naive observers participated in three image quality research contexts in which they were asked to freely and spontaneously describe the quality of the presented test images. Standard viewing conditions were used. The data shows which attributes are most relevant for each test context, and how they differentiate between the selected image contents and processing systems. The role of non-HVS based image quality analysis is discussed.

  13. Development of CCD controller for scientific application

    NASA Astrophysics Data System (ADS)

    Khan, M. S.; Pathan, F. M.; Shah, U. V., Prof; Makwana, D. H., Prof; Anandarao, B. G., Prof

    2010-02-01

    Photoelectric equipment has wide applications, such as spectroscopy, temperature measurement in the infrared region, and astronomical research. A photoelectric transducer converts radiant energy into electrical energy. Two types of photoelectric transducers, namely the photo-multiplier tube (PMT) and the charge-coupled device (CCD), are used to convert radiant energy into an electrical signal. Nearly all modern instruments now use CCD technology. We have designed and developed a CCD camera controller using the Marconi camera chip CCD47-10, which has 1K × 1K pixels, for space applications only.

  14. Jellyfish Support High Energy Intake of Leatherback Sea Turtles (Dermochelys coriacea): Video Evidence from Animal-Borne Cameras

    PubMed Central

    Heaslip, Susan G.; Iverson, Sara J.; Bowen, W. Don; James, Michael C.

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08–3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83–100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ•d−1 but were as high as 167,797 kJ•d−1 corresponding to turtles consuming an average of 330 kg wet mass•d−1 (up to 840 kg•d−1) or approximately 261 (up to 664) jellyfish•d-1. Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass•d−1 equating to an average energy intake of 3–7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906
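The headline figures are internally consistent, as a quick arithmetic check shows (all numbers taken from the abstract):

```python
# Back-of-envelope check of the intake figures quoted in the abstract
mass_eaten_kg = 330          # average wet mass consumed per day
body_mass_kg = 455           # assumed average turtle body mass
energy_kj = 66018            # average daily energy intake

fraction_of_body_mass = mass_eaten_kg / body_mass_kg   # ~0.73, i.e. 73% per day
energy_density_kj_per_kg = energy_kj / mass_eaten_kg   # ~200 kJ per kg of jellyfish
```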

  15. Jellyfish support high energy intake of leatherback sea turtles (Dermochelys coriacea): video evidence from animal-borne cameras.

    PubMed

    Heaslip, Susan G; Iverson, Sara J; Bowen, W Don; James, Michael C

    2012-01-01

    The endangered leatherback turtle is a large, highly migratory marine predator that inexplicably relies upon a diet of low-energy gelatinous zooplankton. The location of these prey may be predictable at large oceanographic scales, given that leatherback turtles perform long distance migrations (1000s of km) from nesting beaches to high latitude foraging grounds. However, little is known about the profitability of this migration and foraging strategy. We used GPS location data and video from animal-borne cameras to examine how prey characteristics (i.e., prey size, prey type, prey encounter rate) correlate with the daytime foraging behavior of leatherbacks (n = 19) in shelf waters off Cape Breton Island, NS, Canada, during August and September. Video was recorded continuously, averaged 1:53 h per turtle (range 0:08-3:38 h), and documented a total of 601 prey captures. Lion's mane jellyfish (Cyanea capillata) was the dominant prey (83-100%), but moon jellyfish (Aurelia aurita) were also consumed. Turtles approached and attacked most jellyfish within the camera's field of view and appeared to consume prey completely. There was no significant relationship between encounter rate and dive duration (p = 0.74, linear mixed-effects models). Handling time increased with prey size regardless of prey species (p = 0.0001). Estimates of energy intake averaged 66,018 kJ • d(-1) but were as high as 167,797 kJ • d(-1) corresponding to turtles consuming an average of 330 kg wet mass • d(-1) (up to 840 kg • d(-1)) or approximately 261 (up to 664) jellyfish • d(-1). Assuming our turtles averaged 455 kg body mass, they consumed an average of 73% of their body mass • d(-1) equating to an average energy intake of 3-7 times their daily metabolic requirements, depending on estimates used. This study provides evidence that feeding tactics used by leatherbacks in Atlantic Canadian waters are highly profitable and our results are consistent with estimates of mass gain prior to southward migration. PMID:22438906

  16. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  17. Noise aliasing in interline-video-based fluoroscopy systems.

    PubMed

    Lai, H; Cunningham, A

    2002-03-01

    Video-based imaging systems for continuous (nonpulsed) x-ray fluoroscopy use a variety of video formats. Conventional video-camera systems may operate in either interlaced or progressive-scan modes, and CCD systems may operate in interline- or frame-transfer modes. A theoretical model of the image noise power spectrum corresponding to these formats is described. It is shown that with respect to frame-transfer or progressive-readout modes, interline or interlaced cameras operating in a frame-integration mode will result in a spectral shift of 25% of the total image noise power from low spatial frequencies to high. In a field-integration mode, noise power is doubled with most of the increase occurring at high spatial frequencies. The differences are due primarily to the effect of noise aliasing. In interline or interlaced formats, alternate lines are obtained with each video field resulting in a vertical sampling frequency for noise that is one half of the physical sampling frequency. The extent of noise aliasing is modified by differences in the statistical correlations between video fields in the different modes. The theoretical model is validated with experiments using an x-ray image intensifier and CCD-camera system. It is shown that different video modes affect the shape of the noise-power spectrum and therefore the detective quantum efficiency. While the effect on observer performance is not addressed, it is concluded that in order to minimize image noise at the critical mid-to-high spatial frequencies for a specified x-ray exposure, fluoroscopic systems should use only frame-transfer (CCD camera) or progressive-scan (conventional video) formats. PMID:11929012
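The halved vertical sampling frequency is the crux of the aliasing argument: sampling alternate lines preserves the per-sample noise power but folds spectral components above the field's Nyquist frequency down onto lower frequencies. A toy one-dimensional illustration (the shaping kernel is an arbitrary stand-in for the system MTF, not a model of the actual fluoroscopy chain):

```python
import numpy as np

rng = np.random.default_rng(0)

# One column of image noise: white read noise shaped by a mild vertical
# low-pass (a stand-in for the system's MTF).
noise = np.convolve(rng.standard_normal(4096),
                    [0.25, 0.5, 0.25], mode="same")

# Interline/interlaced readout: each field contains alternate lines only,
# so the vertical sampling frequency for noise is halved.
field = noise[::2]

# The per-sample noise power (variance) is essentially unchanged, but
# power above the field's Nyquist frequency is aliased to lower spatial
# frequencies -- the redistribution the theoretical model describes.
```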

  18. Bird-Borne Video-Cameras Show That Seabird Movement Patterns Relate to Previously Unrevealed Proximate Environment, Not Prey

    PubMed Central

    Tremblay, Yann; Thiebault, Andréa; Mullers, Ralf; Pistorius, Pierre

    2014-01-01

    The study of ecological and behavioral processes has been revolutionized in the last two decades with the rapid development of biologging science. Recently, using image-capturing devices, some pilot studies demonstrated the potential of understanding marine vertebrate movement patterns in relation to their proximate, as opposed to remotely sensed, environmental contexts. Here, using miniaturized video cameras and GPS tracking recorders simultaneously, we show for the first time that information on the immediate visual surroundings of a foraging seabird, the Cape gannet, is fundamental in understanding the origins of its movement patterns. We found that movement patterns were related to specific stimuli which were mostly other predators such as gannets, dolphins or fishing boats. Contrary to a widely accepted idea, our data suggest that foraging seabirds are not directly looking for prey. Instead, they search for indicators of the presence of prey, the latter being targeted at the very last moment and at a very small scale. We demonstrate that movement patterns of foraging seabirds can be heavily driven by processes unobservable with conventional methodology. Except perhaps for large scale processes, local-enhancement seems to be the only ruling mechanism; this has profound implications for ecosystem-based management of marine areas. PMID:24523892

  20. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R{sup 2}) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R{sup 2}=0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
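
    The gain from adding true color as a second regressor can be sketched with synthetic numbers (purely illustrative; the variable names and values are assumptions, not the study's data): a confounding variable that the univariate fit cannot account for is explained by the bivariate model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: the band ratio drives the upwelling signal, while
# "color" (a Gelbstoff proxy) adds variance that a univariate fit on the
# band ratio alone cannot explain.
n = 50
band_ratio = rng.uniform(0.5, 1.5, n)
true_color = rng.uniform(10.0, 60.0, n)
upwelling = 2.0 * band_ratio - 0.03 * true_color + rng.normal(0.0, 0.05, n)

def r_squared(predictors, y):
    """Coefficient of determination of an ordinary least-squares fit."""
    design = np.column_stack([np.ones(len(y)), predictors])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    residuals = y - design @ beta
    return 1.0 - residuals.var() / y.var()

r2_uni = r_squared(band_ratio, upwelling)
r2_bi = r_squared(np.column_stack([band_ratio, true_color]), upwelling)
print(f"univariate R^2 = {r2_uni:.2f}, bivariate R^2 = {r2_bi:.2f}")
```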

  1. Observation of the dynamic movement of fragmentations by high-speed camera and high-speed video

    NASA Astrophysics Data System (ADS)

    Suk, Chul-Gi; Ogata, Yuji; Wada, Yuji; Katsuyama, Kunihisa

    1995-05-01

    Blasting experiments using mortar concrete blocks and model concrete columns were carried out to obtain technical information on the fragmentation caused by blasting demolition. The mortar concrete blocks measured 1,000 X 1,000 X 1,000 mm, and six kinds of experimental blastings were carried out on them. In these experiments, precision detonators and No. 6 electric detonators with 10 cm detonating fuse were used, and the control of fragmentation was examined. The experiments made clear that the flying distance of the fragments can be controlled using a precise blasting system. Reinforced concrete model columns representative of typical apartment houses in Japan were then used. Each test column measured 800 X 800 X 2,400 mm and was buried 400 mm in the ground; the specified design strength of the concrete was 210 kgf/cm2. The columns were demolished by blasting with an internal charge of dynamite. The fragments were observed by two high-speed cameras operating at 500 and 2,000 FPS and a high-speed video camera at 400 FPS. In one experiment, with 330 g of explosive and a minimum resisting length of 0.32 m, fragment velocities of about 40 m/s were measured.
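
    The fragment velocities come from frame-to-frame displacements of tracked fragments. A minimal sketch of that calculation, with a hypothetical pixel-to-metre calibration and tracked centroid positions (the frame rate matches the 2,000 FPS camera; the other numbers are illustrative):

```python
# Fragment speed from successive high-speed frames: the displacement of a
# tracked fragment between frames divided by the inter-frame time.
fps = 2000.0
scale_m_per_px = 0.01                                 # metres per pixel (assumed)
positions_px = [(100.0, 200.0), (102.0, 204.0), (104.0, 208.0)]  # centroid per frame

dt = 1.0 / fps
speeds = []
for (x0, y0), (x1, y1) in zip(positions_px, positions_px[1:]):
    dist_m = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 * scale_m_per_px
    speeds.append(dist_m / dt)

mean_speed = sum(speeds) / len(speeds)
print(f"mean fragment speed: {mean_speed:.1f} m/s")
```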

  2. Linear CCD attitude measurement system based on the identification of the auxiliary array CCD

    NASA Astrophysics Data System (ADS)

    Hu, Yinghui; Yuan, Feng; Li, Kai; Wang, Yan

    2015-10-01

    To address the problem of high-precision attitude measurement of flying targets over a large space and a large field of view, and after comparing existing measurement methods, we propose a system in which two array CCDs assist three linear CCDs in identifying a multi-point cooperative target. This avoids the nonlinear system errors, numerous calibration parameters, and overly complicated constraints among camera positions that affect a nine-linear-CCD spectroscopic test system. Mathematical models of the binocular vision system and the three-linear-CCD test system are established. Three red LED lights, whose coordinates are measured in advance by a Coordinate Measuring Machine, form a cooperative triangular target; three blue LED lights are added along the sides of the triangle as auxiliaries so that the array CCDs can identify the three red LED points more easily, and each linear CCD camera is fitted with a red filter to block the blue LED points while also reducing stray light. The array CCDs measure and identify the spots and compute the spatial coordinates of the red LED points, while the linear CCDs measure the three red spots to solve the linear-CCD test system, from which 27 solutions can be drawn. Using the array-CCD coordinates to assist the linear CCDs achieves spot identification and solves the difficult problem of multi-target identification with linear CCDs. A special cylindrical lens system matched to the imaging characteristics of linear CCDs was developed using a telecentric optical design: the energy center of the spot position changes little in the direction perpendicular to the optical axis over the depth-of-convergence range, ensuring high-precision image quality. The complete test system improves the speed and precision of spatial-object attitude measurement.

  3. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Technical Reports Server (NTRS)

    Bolas, M. T.; Fisher, S. S.

    1990-01-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.
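
    The head-coupled control loop ultimately maps tracked head angles onto the platform's pan, tilt, and roll axes within mechanical limits. A minimal sketch of such a mapping (the axis pairing and limit values are illustrative assumptions, not the VIEW system's):

```python
# Hypothetical head-to-platform mapping for a head-coupled camera system:
# each tracked head angle drives the matching platform axis, clamped to
# the platform's mechanical range.
LIMITS = {"pan": (-170.0, 170.0), "tilt": (-60.0, 90.0), "roll": (-45.0, 45.0)}
HEAD_AXIS = {"pan": "yaw", "tilt": "pitch", "roll": "roll"}

def platform_command(head_deg):
    """Map a head orientation (degrees) to clamped pan/tilt/roll commands."""
    return {axis: max(lo, min(hi, head_deg[HEAD_AXIS[axis]]))
            for axis, (lo, hi) in LIMITS.items()}

# A head yaw beyond the pan range saturates at the mechanical limit.
cmd = platform_command({"yaw": 200.0, "pitch": -30.0, "roll": 10.0})
print(cmd)
```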

  4. Improvement in the light sensitivity of the ultrahigh-speed high-sensitivity CCD with a microlens array

    NASA Astrophysics Data System (ADS)

    Hayashida, T.; Yonai, J.; Kitamura, K.; Arai, T.; Kurita, T.; Tanioka, K.; Maruyama, H.; Etoh, T. Goji; Kitagawa, S.; Hatade, K.; Yamaguchi, T.; Takeuchi, H.; Iida, K.

    2008-02-01

    We are advancing the development of ultrahigh-speed, high-sensitivity CCDs for broadcast use that are capable of capturing smooth slow-motion videos in vivid colors even where lighting is limited, such as at professional baseball games played at night. We have already developed a 300,000 pixel, ultrahigh-speed CCD, and a single CCD color camera that has been used for sports broadcasts and science programs using this CCD. However, there are cases where even higher sensitivity is required, such as when using a telephoto lens during a baseball broadcast or a high-magnification microscope during science programs. This paper provides a summary of our experimental development aimed at further increasing the sensitivity of CCDs using the light-collecting effects of a microlens array.

  5. Cryostat and CCD for MEGARA at GTC

    NASA Astrophysics Data System (ADS)

    Castillo-Domínguez, E.; Ferrusca, D.; Tulloch, S.; Velázquez, M.; Carrasco, E.; Gallego, J.; Gil de Paz, A.; Sánchez, F. M.; Vílchez Medina, J. M.

    2012-09-01

    MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is the new integral field unit (IFU) and multi-object spectrograph (MOS) instrument for the GTC. The spectrograph subsystems include the pseudo-slit, the shutter, the collimator with a focusing mechanism, pupil elements on a volume phase holographic grating (VPH) wheel and the camera joined to the cryostat through the last lens, with a CCD detector inside. In this paper we describe the full preliminary design of the cryostat which will harbor the CCD detector for the spectrograph. The selected cryogenic device is an LN2 open-cycle cryostat which has been designed by the "Astronomical Instrumentation Lab for Millimeter Wavelengths" at INAOE. A complete description of the cryostat main body and CCD head is presented as well as all the vacuum and temperature sub-systems to operate it. The CCD is surrounded by a radiation shield to improve its performance and is placed in a custom made mechanical mounting which will allow physical adjustments for alignment with the spectrograph camera. The 4k x 4k pixel CCD231 is our selection for the cryogenically cooled detector of MEGARA. The characteristics of this CCD, the internal cryostat cabling and CCD controller hardware are discussed. Finally, static structural finite element modeling and thermal analysis results are shown to validate the cryostat model.

  6. Design of video interface conversion system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Heng; Wang, Xiang-jun

    2014-11-01

    This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller is used as the information interaction control unit between the FPGA and the PC. The system is able to encode/decode messages from the PC. Technologies including video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion, and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from the CCD camera into Low Voltage Differential Signaling (LVDS), which is then collected by the video processing unit through its Camera Link interface. The processed video signals are input to the system output board and displayed on the monitor. Current experiments show that the system achieves high-quality video conversion with a minimal board size.
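
    A core piece of such a decoding pipeline is the per-pixel color space conversion. As an illustration (the common approximate BT.601 studio-range coefficients, not the paper's actual FPGA implementation), here is a software model of the YCbCr-to-RGB step:

```python
# Software model of the BT.601 YCbCr -> RGB conversion, the kind of
# per-pixel arithmetic a video decoder's color space module performs.
def ycbcr601_to_rgb(y, cb, cr):
    """8-bit studio-range YCbCr (Y 16-235, Cb/Cr 16-240) to 8-bit RGB."""
    def clamp(v):
        return max(0, min(255, round(v)))
    c, d, e = y - 16, cb - 128, cr - 128
    r = 1.164 * c + 1.596 * e
    g = 1.164 * c - 0.392 * d - 0.813 * e
    b = 1.164 * c + 2.017 * d
    return clamp(r), clamp(g), clamp(b)

print(ycbcr601_to_rgb(235, 128, 128))  # reference white -> (255, 255, 255)
print(ycbcr601_to_rgb(16, 128, 128))   # reference black -> (0, 0, 0)
```

In an FPGA the same arithmetic would typically be done with fixed-point multipliers rather than floats, but the coefficients and clamping are the same.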

  7. Cameras in the Classroom.

    ERIC Educational Resources Information Center

    Steinman, Richard C.

    1993-01-01

    Describes the following uses for a video camera in the science classroom: video presentations, microscope work, taping and/or monitoring experiments, analyzing everyday phenomena, lesson enhancement, field trip alternative, and classroom management. (PR)

  8. X-ray beam profile measurements with CCD detectors

    NASA Astrophysics Data System (ADS)

    Attaelmanan, A.; Rindby, A.; Voglis, P.; Shermeat, A.

    1993-08-01

    A commercial CCD video camera has been used as a room temperature X-ray detector to measure microbeam X-ray beam profiles generated by conical glass capillaries from conventional X-ray tubes. Beam profiles were recorded for two capillaries of diameters 168 and 15 μm using a Cr tube operated at 15 kV, 5 mA. A standard framegrabber was used for image digitization and an image processing system was used for image quantification. Spatial distribution as well as angular divergence were determined by measuring the beam profile at various distances from the capillary exit cross section. Beam profiles from misaligned capillaries have been recorded and split event probability, spatial resolution and radiation damage are discussed.
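
    Estimating angular divergence from beam profiles recorded at several distances reduces to the slope of beam width versus distance. A sketch with illustrative numbers (not the paper's measured data):

```python
# Angular divergence from beam profiles measured at several distances
# from the capillary exit: the least-squares slope of width vs. distance
# gives the full-angle divergence, and 1 um/mm equals 1 mrad in the
# small-angle limit.
distance_mm = [5.0, 10.0, 20.0, 40.0]
fwhm_um = [180.0, 200.0, 240.0, 320.0]   # measured beam widths (FWHM)

n = len(distance_mm)
mx = sum(distance_mm) / n
my = sum(fwhm_um) / n
slope_um_per_mm = (sum((x - mx) * (y - my) for x, y in zip(distance_mm, fwhm_um))
                   / sum((x - mx) ** 2 for x in distance_mm))

divergence_mrad = slope_um_per_mm   # um/mm == mrad for small angles
print(f"full-angle divergence ~ {divergence_mrad:.1f} mrad")
```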

  9. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high performance, remote control CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned differently than in the previous generation, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4 Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized, reconfigurable, and modular in comparison with the first-generation solution, due to the application of open software solutions and an FPGA circuit, Altera Cyclone EP1C6, into which new algorithms were implemented. The camera system uses the following electronic components: the microcontroller CY7C68013a (8051 core) by Cypress, the image processor AD9826 by Analog Devices, the GigEth interface RTL8169s by Realtek, the memory SDRAM AT45DB642 by Atmel, and an ARM926EJ-S microprocessor AT91SAM9260 by ARM and Atmel as the CPU. Software solutions for the camera, its remote control, and image data acquisition are based entirely on open source platforms, using the ISI and V4L2 image interfaces, the AMBA AHB data bus, and the INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide angle observations of the sky in the research program Pi-of-the-Sky.

  10. The Dark Energy Survey CCD imager design

    SciTech Connect

    Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Guarino, V.; Kuk, K.; Kuhlmann, S.; Schultz, K.; Schmitt, R.L.; Stefanik, A.; /Fermilab /Ohio State U. /Argonne

    2008-06-01

    The Dark Energy Survey is planning to use a 3 sq. deg. camera that houses an approximately 0.5 m diameter focal plane of 62 2k x 4k CCDs. The camera vessel including the optical window cell, focal plate, focal plate mounts, cooling system and thermal controls is described. As part of the development of the mechanical and cooling design, a full scale prototype camera vessel has been constructed and is now being used for multi-CCD readout tests. Results from this prototype camera are described.

  11. Are traditional methods of determining nest predators and nest fates reliable? An experiment with Wood Thrushes (Hylocichla mustelina) using miniature video cameras

    USGS Publications Warehouse

    Williams, Gary E.; Wood, P.B.

    2002-01-01

    We used miniature infrared video cameras to monitor Wood Thrush (Hylocichla mustelina) nests during 1998-2000. We documented nest predators and examined whether evidence at nests can be used to predict predator identities and nest fates. Fifty-six nests were monitored; 26 failed, with 3 abandoned and 23 depredated. We predicted predator class (avian, mammalian, snake) prior to review of video footage and were incorrect 57% of the time. Birds and mammals were underrepresented whereas snakes were over-represented in our predictions. We documented ≥9 nest-predator species, with the southern flying squirrel (Glaucomys volans) taking the most nests (n = 8). During 2000, we predicted fate (fledge or fail) of 27 nests; 23 were classified correctly. Traditional methods of monitoring nests appear to be effective for classifying success or failure of nests, but ineffective at classifying nest predators.

  12. Individual camera identification using correlation of fixed pattern noise in image sensors.

    PubMed

    Kurosawa, Kenji; Kuroki, Kenro; Akiba, Norimitsu

    2009-05-01

    This paper presents results of experiments related to individual video camera identification using a correlation coefficient of fixed pattern noise (FPN) in image sensors. Five color charge-coupled device (CCD) modules of the same brand were examined. Images were captured using a 12-bit monochrome video capture board and stored in a personal computer. For each module, 100 frames were captured. They were integrated to obtain FPN. The results show that a specific CCD module was distinguished among the five modules by analyzing the normalized correlation coefficient. The temporal change of the correlation coefficient during several days had only a negligible effect on identifying the modules. Furthermore, a positive relation was found between the correlation coefficient of the same modules and the number of frames that were used for image integration. Consequently, precise individual camera identification is enhanced by acquisition of as many frames as possible. PMID:19302379
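
    The identification principle, averaging many frames to isolate the FPN and then correlating FPN estimates, can be sketched with a toy model (synthetic noise levels and sizes, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: each sensor has a fixed pattern (FPN); every captured frame
# is pattern + temporal noise. Averaging frames suppresses the temporal
# noise, and the normalized correlation of two FPN estimates is high for
# the same sensor and near zero for different sensors.
shape, n_frames = (64, 64), 100
fpn_a = rng.normal(0.0, 1.0, shape)   # sensor A's fixed pattern
fpn_b = rng.normal(0.0, 1.0, shape)   # sensor B's fixed pattern

def fpn_estimate(fpn):
    frames = fpn + rng.normal(0.0, 5.0, (n_frames,) + shape)
    return frames.mean(axis=0)

def ncc(x, y):
    x, y = x - x.mean(), y - y.mean()
    return float((x * y).sum() / np.sqrt((x * x).sum() * (y * y).sum()))

same = ncc(fpn_estimate(fpn_a), fpn_estimate(fpn_a))   # two captures, same sensor
diff = ncc(fpn_estimate(fpn_a), fpn_estimate(fpn_b))   # different sensors
print(f"same sensor r = {same:.2f}, different sensors r = {diff:.2f}")
```

With these noise levels the same-sensor correlation lands near 0.8 and rises as more frames are averaged, matching the paper's observation that more frames improve identification.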

  13. Air truth: operation of a remotely controlled infrared camera and ground-air video/data link during airborne lidar tests

    NASA Astrophysics Data System (ADS)

    Nemzek, Robert J.

    1999-10-01

    In early tests of an airborne lidar platform, we confirmed the utility of an elevated, ground-based infrared camera as a chemical plume diagnostic. For a series of lidar tests during the summer of 1998, we carried this concept a step further, by adding a digital data link to pass infrared camera video and other data streams to and from the aircraft in real time. In addition, the entire system had to be operated from a distance, for safety considerations. To achieve this goal under a restricted budget and significant time and effort constraints, we assembled a system using primarily off-the-shelf components and software requiring little customization. Remote system control was achieved by a set of radio modems, while the aircraft data link was effected via wireless ethernet connection. The system performed reliably throughout the test series.

  14. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick fully depleted and back illuminated with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles in the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs exactly in the same position so our guess is that the pattern is due to electrical fields. Finally, and just in two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
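
    Of the listed tests, the photon transfer curve is the one that yields the system gain: for a shot-noise-limited signal, the variance in ADU grows linearly with the mean, with slope 1/gain. A sketch with simulated Poisson data (the gain value and exposure levels are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

# Photon transfer curve sketch: variance(ADU) = mean(ADU) / gain for
# shot-noise-limited exposures, so the PTC slope recovers the gain (e-/ADU).
gain_true = 2.5   # electrons per ADU, the value we try to recover
means, variances = [], []
for exposure_e in [500, 1000, 2000, 5000, 10000, 20000]:
    electrons = rng.poisson(exposure_e, size=200_000)  # shot noise
    adu = electrons / gain_true                        # digitized signal
    means.append(adu.mean())
    variances.append(adu.var())

slope = np.polyfit(means, variances, 1)[0]   # d(var)/d(mean) = 1/gain
gain_est = 1.0 / slope
print(f"estimated gain = {gain_est:.2f} e-/ADU")
```

A real PTC also subtracts read noise and flat-field (fixed-pattern) contributions, usually by differencing pairs of flats; that step is omitted here.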

  15. Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic

    NASA Astrophysics Data System (ADS)

    Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Kondo, Yasushi

    2008-11-01

    Visualization of explosion phenomena is very important and essential to evaluate the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases. We must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken by the camera with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

  16. CCD Photometry of the Polar BY Cam

    NASA Astrophysics Data System (ADS)

    Jessop, H.; Chin, V.; Spear, G.

    1992-12-01

    BY Cam (=H0538+608), a very erratic member of the AM Herculis-type binaries also known as polars, was observed at the University of Arizona 40-inch telescope, with the Sonoma State University Astrolink CCD camera for six nights during November 1991. Two additional nights of CCD photometry were obtained during September 1992 at the Sonoma State Observatory, with the 25-cm Epoch Automated Telescope and SSU Astrolink CCD camera. These data comprise one of the most extensive sets of photometry acquired for this object. We will present the results of these observations, and discuss their relevance towards the further determination of some of the system's parameters. This work has been supported by a California State University Pre-Doctoral Award and Pre-Doctoral Summer Internship Award, and a Grant-In-Aid from the Sigma Xi Scientific Research Society.

  17. EL Sistema CCD de Tonantzintla. Pruebas Y Planes Futuros

    NASA Astrophysics Data System (ADS)

    Cardona, O.; Chavira, E.; Furenlid, L.; Iriarte, B.

    1987-05-01

    We present results of the laboratory tests of the CCD camera system recently acquired by INAOE, also the theoretical and observational performance of the instrument with the one meter telescope of UNAM. The system has a TI 4849 CCD with 390 × 584 pixels. We will present the future plans of its use in the new 2.1 m telescope at Cananea, Sonora.

  18. CCD Double Star Measures: Jack Jones Observatory Report #2

    NASA Astrophysics Data System (ADS)

    Jones, James L.

    2009-10-01

    This paper submits 44 CCD measurements of 41 multiple star systems for inclusion in the WDS. Observations were made during the calendar year 2008. Measurements were made using a CCD camera and an 11" Schmidt-Cassegrain telescope. Brief discussions of pertinent observations are included.
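
    Such measures reduce to converting pixel offsets between the components into a separation and a position angle using a plate scale. A sketch with hypothetical numbers (the plate scale and offsets are illustrative, not the paper's values, and assume the image axes are aligned with North up and East in +x):

```python
import math

# Double-star astrometry from CCD pixel coordinates: separation from the
# scaled pixel distance, position angle measured from North through East.
plate_scale = 0.5               # arcsec per pixel (hypothetical calibration)
dx_px, dy_px = 10.0, 10.0       # secondary minus primary; +x = East, +y = North

sep_arcsec = math.hypot(dx_px, dy_px) * plate_scale
pa_deg = math.degrees(math.atan2(dx_px, dy_px)) % 360.0  # 0-360 deg, N through E

print(f'separation = {sep_arcsec:.2f}", PA = {pa_deg:.1f} deg')
```

In practice the detector's orientation angle relative to North must itself be calibrated (e.g. from a drift scan or a reference pair) and added to the raw angle.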

  19. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  20. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  2. A video precipitation sensor for imaging and velocimetry of hydrometeors

    NASA Astrophysics Data System (ADS)

    Liu, X. C.; Gao, T. C.; Liu, L.

    2014-07-01

    A new method to determine the shape and fall velocity of hydrometeors by using a single CCD camera is proposed in this paper, and a prototype of a video precipitation sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with multi-mode fibre cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and the imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As the precipitation particles fall through the sampling volume, the CCD camera exposes twice in a single frame, which yields double-exposure images of the particles. The size and shape can be obtained from the particle images; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of the VPS is that the shape, size, and velocity of precipitation particles can be measured by only one planar array CCD sensor, which addresses the disadvantages of linear scan CCD disdrometers and impact disdrometers. Field measurements of rainfall demonstrate the VPS's capability to measure micro-physical properties of single particles and integral parameters of precipitation.
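
    The velocity step reduces to the displacement between the two exposures divided by the exposure interval. A sketch with illustrative calibration values (not the VPS's actual parameters):

```python
# Fall velocity from a double-exposure frame: the same particle appears
# twice in one frame, and its displacement over the known exposure
# interval gives the velocity.
pixel_pitch_mm = 0.1        # object-space size of one pixel (assumed)
interval_s = 0.002          # time between the two exposures (assumed)

first_y_px, second_y_px = 120.0, 180.0   # particle centroid in each exposure
fall_velocity = (second_y_px - first_y_px) * pixel_pitch_mm / 1000.0 / interval_s
print(f"fall velocity = {fall_velocity:.1f} m/s")
```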

  3. Identification of Prey Captures in Australian Fur Seals (Arctocephalus pusillus doriferus) Using Head-Mounted Accelerometers: Field Validation with Animal-Borne Video Cameras.

    PubMed

    Volpov, Beth L; Hoskins, Andrew J; Battaile, Brian C; Viviant, Morgane; Wheatley, Kathryn E; Marshall, Greg; Abernathy, Kyler; Arnould, John P Y

    2015-01-01

    This study investigated prey captures in free-ranging adult female Australian fur seals (Arctocephalus pusillus doriferus) using head-mounted 3-axis accelerometers and animal-borne video cameras. Acceleration data were used to identify individual attempted prey captures (APC), and video data were used to independently verify APC and prey types. Results demonstrated that head-mounted accelerometers could detect individual APC but were unable to distinguish among prey types (fish, cephalopod, stingray) or between successful captures and unsuccessful capture attempts. Mean detection rate (true positive rate) on individual animals in the testing subset ranged from 67-100%, and mean detection on the testing subset averaged across 4 animals ranged from 82-97%. Mean false positive (FP) rate ranged from 15-67% individually in the testing subset, and 26-59% averaged across 4 animals. Surge and sway had significantly greater detection rates, but conversely also greater FP rates, than heave. Video data also indicated that some head movements recorded by the accelerometers were unrelated to APC and that a peak in acceleration variance did not always equate to an individual prey item. The results of the present study indicate that head-mounted accelerometers provide a complementary tool for investigating foraging behaviour in pinnipeds, but that detection and FP correction factors need to be applied for reliable field application. PMID:26107647
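
    The scoring of accelerometer detections against the video "truth" can be sketched with toy data (the threshold and samples are illustrative, not the study's data or its exact classifier):

```python
# Score threshold-based capture detection against video-verified truth:
# flag samples whose surge acceleration exceeds a threshold, then compute
# the detection (true positive) rate and the false positive rate among
# detections.
surge = [0.1, 0.2, 0.1, 2.5, 3.0, 0.2, 0.1, 0.1, 2.8, 0.2, 1.5, 0.3]
video_truth = [0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0]   # 1 = capture seen on video

threshold = 1.0
detected = [1 if abs(a) > threshold else 0 for a in surge]

tp = sum(1 for d, t in zip(detected, video_truth) if d and t)
fp = sum(1 for d, t in zip(detected, video_truth) if d and not t)
detection_rate = tp / sum(video_truth)
false_positive_rate = fp / sum(detected)
print(f"detection rate = {detection_rate:.0%}, FP rate = {false_positive_rate:.0%}")
```

Here one head movement crosses the threshold without a capture on video, illustrating why a false-positive correction factor is needed before field application.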

  5. Study of design and control of remote manipulators. Part 4: Experiments in video camera positioning with regard to remote manipulation

    NASA Technical Reports Server (NTRS)

    Mackro, J.

    1973-01-01

    The results are presented of a study involving closed-circuit television as the means of providing the necessary task-to-operator feedback for efficient performance of the remote manipulation system. Experiments were performed to determine the remote video configuration that will result in the best overall system. Two categories of tests were conducted: those involving remote position (rate) control of the video system alone, and those in which closed-circuit TV was used together with manipulation of the objects themselves.

  6. Fast measurement of temporal noise of digital camera's photosensors

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.; Starikov, Sergey N.

    2015-10-01

    Currently photo- and video cameras are widespread parts of both scientific experimental setups and consumer applications. They are used in optics, radiophysics, astrophotography, chemistry, and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components. Temporal noise comprises the random component, while spatial noise comprises the pattern component. The spatial part is usually several times lower in magnitude than the temporal part, so to a first approximation spatial noise can be neglected. Earlier we proposed a modification of the automatic segmentation of non-uniform targets (ASNT) method for measurement of the temporal noise of photo- and video cameras. Only two frames are sufficient for noise measurement with the modified method. As a result, the proposed ASNT modification allows fast and accurate measurement of temporal noise. In this paper, we estimated the light and dark temporal noise of four cameras of different types using the modified ASNT method with only several frames. These cameras are: consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12 bit ADC), scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12 bit ADC), industrial camera PixeLink PLB781F (CMOS, 6.6 MP, 10 bit ADC) and video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8 bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. We measured the elapsed time for processing of the shots used for temporal noise estimation. The results demonstrate that the dependence of a camera's full temporal noise on signal value can be obtained quickly with the proposed ASNT modification.
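    The two-frame principle the abstract relies on can be sketched simply: in the difference of two frames of the same static scene the fixed (spatial) pattern cancels, leaving √2 times the temporal noise. A minimal illustration of that principle only, not the authors' full ASNT segmentation:

    ```python
    import numpy as np

    def temporal_noise_two_frame(frame_a, frame_b, n_bins=16):
        """Estimate temporal noise vs. signal level from two frames of the
        same static scene. Pattern noise cancels in the frame difference,
        whose std is sqrt(2) times the temporal noise (two-frame principle
        only; not the authors' ASNT segmentation)."""
        a = frame_a.astype(np.float64)
        b = frame_b.astype(np.float64)
        mean = 0.5 * (a + b)               # per-pixel signal estimate
        diff = a - b                       # fixed pattern cancels here
        edges = np.linspace(mean.min(), mean.max(), n_bins + 1)
        signal, noise = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (mean >= lo) & (mean < hi)
            if mask.sum() > 100:           # require enough pixels per bin
                signal.append(mean[mask].mean())
                noise.append(diff[mask].std() / np.sqrt(2.0))
        return np.array(signal), np.array(noise)
    ```

    For a photon-noise-limited sensor the returned noise scales as the square root of the signal, matching the Poisson-based fits the abstract reports.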

  7. Digital video.

    PubMed

    Johnson, Don; Johnson, Mike

    2004-04-01

    The process of digital capture, editing, and archiving video has become an important aspect of documenting arthroscopic surgery. Recording the arthroscopic findings before and after surgery is an essential part of the patient's medical record. The hardware and software have become more affordable, but the learning curve to master the software is steep. Digital video is captured at the time of arthroscopy to a hard disk, and written to a CD at the end of the operative procedure. The process of obtaining video of open procedures is more complex. Outside video of the procedure is recorded on digital tape with a digital video camera. The camera must be plugged into a computer to capture the video on the hard disk. Adobe Premiere software is used to edit the video and render the finished video to the hard drive. This finished video is burned onto a CD. We outline the choice of computer hardware and software for the manipulation of digital video. The techniques of backup and archiving the completed projects and files also are outlined. The uses of digital video for education and the formats that can be used in PowerPoint presentations are discussed. PMID:15123920

  8. Video endoscopy. Fundamentals and problems.

    PubMed

    Demling, L; Hagel, H J

    1985-09-01

    The video system represents a new endoscopic technique with major advantages, some of which point the way into the future. This system permits a large number of persons to participate directly in the examination. Documentation is more comprehensive and more reliable, and pathological processes can be observed with the aid of video tape recordings. It is to be expected that the optical elements of the video endoscope will become smaller, while the instruments will become longer. Since there is no loss of light with these endoscopes, it would appear possible that they will make the entire small bowel accessible to inspection. Compared with conventional standards, the colour quality on the video monitor screen, in particular in the red range, and of the video photograph still leaves something to be desired. User-friendly equipment with an automatic colour adaptation facility is required. The good thing about the future is, of course, that it comes slowly - and this applies to video endoscopy, too. Since July, 1984, our department has been acquiring experience with the video endoscope manufactured by the firm of Welch Allyn, New York, and, in the meantime, we have examined 97 patients with this system, 80 in the upper, 17 in the lower gastrointestinal tract. The heart of the video endoscope is a light-sensitive microprocessor silicon chip, roughly 4 × 4 mm in size, which acts like a miniature television camera. Properly, it is termed a charge coupled device chip (CCD chip). Utilizing the crystalline structure of the silicon chip, and its property for thermal oxidation, such electronic components as diodes, capacitors and resistors are integrated onto it.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:4054060

  9. What Does the Camera Communicate? An Inquiry into the Politics and Possibilities of Video Research on Learning

    ERIC Educational Resources Information Center

    Vossoughi, Shirin; Escudé, Meg

    2016-01-01

    This piece explores the politics and possibilities of video research on learning in educational settings. The authors (a research-practice team) argue that changing the stance of inquiry from "surveillance" to "relationship" is an ongoing and contingent practice that involves pedagogical, political, and ethical choices on the…

  10. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893

  11. Multi-station Video Orbits of Minor Meteor Showers

    NASA Astrophysics Data System (ADS)

    Madiedo, José M.; Trigo-Rodríguez, Josep M.

    2008-06-01

    During 2006 the SPanish Meteor Network (SPMN) set up three automated video stations in Andalusia to increase the atmospheric coverage of the already existing low-scan-rate all-sky CCD systems. Although initially conceived as complementary, the sensitive video cameras have been employed to set up an automatic meteor detection system that provides valuable real-time information on unusual meteor activity and remarkable fireball events. In fact, during 2006 SPMN video stations participated in the detection of two unexpected meteor outbursts: Orionids and Comae Berenicids. The three new SPMN stations guarantee almost continuous monitoring of meteor and fireball activity in Andalusia (Spain) and also increase the chance of future meteorite recoveries. A description of the main characteristics of these new observing video stations and some examples of the trajectory, radiant and orbital data obtained so far are presented here.

  12. Evaluation of the performance of the 576×384 Thomson CCD for astronomical use

    NASA Astrophysics Data System (ADS)

    Mellier, Y.; Cailloux, M.; Dupin, J. P.; Fort, B.; Lours, C.

    1986-03-01

    A slow-scan CCD camera system was built at the Toulouse Observatory and used to evaluate the performance of the 576×384 CCD from Thomson-CSF (new THX 31133) at low temperatures (150 K). The authors have emphasized the optimization of the most important parameters for astronomical applications and have compared the Thomson CCD with other CCDs, now currently used in astronomy.

  13. Compact laser radar and three-dimensional camera.

    PubMed

    Medina, Antonio; Gayá, Francisco; del Pozo, Francisco

    2006-04-01

    A novel three-dimensional (3D) camera is capable of providing high-precision 3D images in real time. The camera uses a diode laser to illuminate the scene, a shuttered solid-state charge-coupled device (CCD) sensor, and a simple phase detection technique based on the sensor shutter. The amplitude of the reflected signal carries the luminance information, while the phase of the signal carries range information. The system output is coded as a video signal. This camera offers significant advantages over existing technology. The precision in range is dependent only on phase shift and laser power, and theoretically is far superior to existing time-of-flight laser radar systems. Other advantages are reduced size and simplicity and compact and inexpensive construction. We built a prototype that produced high-resolution images in range (z) and in x-y. PMID:16604759
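    The range-from-phase relation the abstract describes is the standard phase-shift time-of-flight formula: the light travels out and back, so a phase shift φ at modulation frequency f corresponds to a range d = cφ/(4πf), and phase wrapping limits the unambiguous range to c/(2f). A hedged sketch (the paper's shutter-based detector is one specific realization; the modulation frequency used below is illustrative):

    ```python
    import math

    C = 299_792_458.0  # speed of light in vacuum, m/s

    def range_from_phase(phase_rad, mod_freq_hz):
        """Range from the measured phase shift of an amplitude-modulated
        laser signal: round trip, so d = c * phi / (4 * pi * f)."""
        return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

    def ambiguity_range(mod_freq_hz):
        """Maximum unambiguous range: the phase wraps every 2*pi."""
        return C / (2.0 * mod_freq_hz)
    ```

    At an assumed 20 MHz modulation, a phase shift of π corresponds to roughly 3.75 m, with an unambiguous range of roughly 7.49 m.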

  14. An electronic pan/tilt/zoom camera system

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steve; Martin, H. L.

    1992-01-01

    A small camera system is described for remote viewing applications that employs fisheye optics and electronic processing for providing pan, tilt, zoom, and rotational movements. The fisheye lens is designed to give a complete hemispherical FOV with significant peripheral distortion that is corrected with high-speed electronic circuitry. Flexible control of the viewing requirements is provided by a programmable transformation processor so that pan/tilt/rotation/zoom functions can be accomplished without mechanical movements. Images are presented that were taken with a prototype system using a CCD camera, and 5 frames/sec can be acquired from a 180-deg FOV. The image-transformation device can provide multiple images with different magnifications and pan/tilt/rotation sequences at frame rates compatible with conventional video devices. The system is of interest for object tracking, surveillance, and viewing in constrained environments that would otherwise require the use of several cameras.
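    The electronic pan/tilt idea can be illustrated with a simple lens model: for an equidistant ("f-theta") fisheye pointed at the zenith, each requested view direction maps to a fixed pixel, so panning and tilting reduce to resampling rather than mechanical motion. A minimal sketch under an ideal equidistant-projection assumption (the paper's transformation processor and actual lens model may differ):

    ```python
    import math

    def fisheye_pixel(pan_deg, tilt_deg, f_pix, cx, cy):
        """Pixel position of a view direction in an equidistant fisheye
        image whose optical axis points at tilt = 90 deg (zenith): the
        radius from the image center is r = f * theta, where theta is the
        angle off the optical axis, and pan sets the azimuth."""
        pan = math.radians(pan_deg)
        theta = math.radians(90.0 - tilt_deg)   # angle from the optical axis
        r = f_pix * theta                       # equidistant projection
        return cx + r * math.cos(pan), cy + r * math.sin(pan)
    ```

    An electronic pan/tilt/zoom then amounts to evaluating this mapping for every output pixel and resampling, which is what makes lookup-table or DSP implementations feasible at video rate.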

  15. A geometric comparison of video camera-captured raster data to vector-parented raster data generated by the X-Y digitizing table

    NASA Technical Reports Server (NTRS)

    Swalm, C.; Pelletier, R.; Rickman, D.; Gilmore, K.

    1989-01-01

    The relative accuracy of a georeferenced raster data set captured by the Megavision 1024XM system using the Videk Megaplus CCD cameras is compared to a georeferenced raster data set generated from vector lines manually digitized through the ELAS software package on a Summagraphics X-Y digitizer table. The study also investigates the amount of time necessary to fully complete the rasterization of the two data sets, evaluating individual areas such as time necessary to generate raw data, time necessary to edit raw data, time necessary to georeference raw data, and accuracy of georeferencing against a norm. Preliminary results exhibit a high level of agreement between areas of the vector-parented data and areas of the captured file data where sufficient control points were chosen. Maps of 1:20,000 scale were digitized into raster files of 5 meter resolution per pixel, and the overall RMS error was estimated at less than eight meters. Such approaches offer time- and labor-saving advantages as well as increasing the efficiency of project scheduling and enabling the digitization of new types of data.
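    The quoted figure of merit, overall RMS error against control points, is computed directly from the georeferencing residuals; a minimal sketch:

    ```python
    import math

    def rms_error(residuals_m):
        """Root-mean-square of control-point residuals, in meters."""
        return math.sqrt(sum(r * r for r in residuals_m) / len(residuals_m))
    ```

    An RMS under eight meters at 5 m/pixel resolution means the typical residual is under two pixels.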

  16. Overview of a hybrid underwater camera system

    NASA Astrophysics Data System (ADS)

    Church, Philip; Hou, Weilin; Fournier, Georges; Dalgleish, Fraser; Butler, Derek; Pari, Sergio; Jamieson, Michael; Pike, David

    2014-05-01

    The paper provides an overview of a Hybrid Underwater Camera (HUC) system combining sonar with a range-gated laser camera system. The sonar is the BlueView P900-45, operating at 900 kHz with a field of view of 45 degrees and a ranging capability of 60 m. The range-gated laser camera system is based on the third-generation LUCIE (Laser Underwater Camera Image Enhancer) sensor originally developed by Defence Research and Development Canada. LUCIE uses an eye-safe laser generating 1 ns pulses at a wavelength of 532 nm and at a rate of 25 kHz. An intensified CCD camera operates with a gating mechanism synchronized with the laser pulse. The gate opens to let the camera capture photons from a given range of interest and can be set from a minimum delay of 5 ns with increments of 200 ps. The output of the sensor is a 30 Hz video signal. Automatic ranging is achieved using a sonar altimeter. The BlueView sonar and LUCIE sensors are integrated with an underwater computer that controls the sensor parameters and displays the real-time data from the sonar and the laser camera. As an initial step for data integration, graphics overlays representing the laser camera field of view along with the gate position and width are overlaid on the sonar display. The HUC system can be manually handled by a diver and can also be controlled from a surface vessel through an umbilical cord. Recent test data obtained from the HUC system operated in a controlled underwater environment will be presented along with measured performance characteristics.
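    The gating arithmetic follows from the round-trip travel time in water: for a target at range d, the gate should open after t = 2nd/c, quantized to the sensor's 5 ns minimum delay and 200 ps increments. A sketch assuming a refractive index of about 1.33 for sea water (an assumption; the paper does not state the value used):

    ```python
    C = 299_792_458.0   # speed of light in vacuum, m/s
    N_WATER = 1.33      # assumed refractive index of sea water

    def gate_delay_ns(range_m):
        """Ideal round-trip gate delay, in ns, for a target at range_m in water."""
        return 2.0 * N_WATER * range_m / C * 1e9

    def gate_setting_ns(range_m, min_delay_ns=5.0, step_ns=0.2):
        """Quantize the ideal delay to the sensor's 5 ns minimum and 200 ps steps."""
        steps = max(0, round((gate_delay_ns(range_m) - min_delay_ns) / step_ns))
        return min_delay_ns + steps * step_ns
    ```

    A target 10 m away calls for a gate delay of about 88.7 ns, which quantizes to 88.8 ns with 200 ps steps.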

  17. CCDs and CCD controllers for the GTC Day One

    NASA Astrophysics Data System (ADS)

    Kohley, R.; Suárez-Valles, M.; Burley, G.; Cavaller-Marqués, L.; Vilela, R.; Casanova, A.; Tomás, A.

    2005-12-01

    The need of the GTC for acquisition cameras, wavefront sensors and scientific instrumentation operating in the optical wavelength range has led to the acquisition of various CCDs and CCD controllers. Due to stringent science, mechanical and ambient requirements the initial idea to employ a general purpose controller had been dropped in favor of two specialized systems, one for the acquisition and guiding purposes of the telescope and the commissioning camera and the other for scientific instrumentation. We describe the specifications, design and performance of the two different CCD camera systems based on laboratory test results.

  18. Progress in video immersion using Panospheric imaging

    NASA Astrophysics Data System (ADS)

    Bogner, Stephen L.; Southwell, David T.; Penzes, Steven G.; Brosinsky, Chris A.; Anderson, Ron; Hanna, Doug M.

    1998-09-01

    Having demonstrated significant technical and marketplace advantages over other modalities for video immersion, Panospheric™ Imaging (PI) continues to evolve rapidly. This paper reports on progress achieved since AeroSense 97. The first practical field deployment of the technology occurred in June-August 1997 during the NASA-CMU 'Atacama Desert Trek' activity, where the Nomad mobile robot was teleoperated via immersive Panospheric™ imagery from a distance of several thousand kilometers. Research using teleoperated vehicles at DRES has also verified the exceptional utility of the PI technology for achieving high levels of situational awareness, operator confidence, and mission effectiveness. Important performance enhancements have been achieved with the completion of the 4th Generation PI DSP-based array processor system. The system is now able to provide dynamic full video-rate generation of spatial and computational transformations, resulting in a programmable and fully interactive immersive video telepresence. A new multi-CCD camera architecture has been created to exploit the bandwidth of this processor, yielding a well-matched PI system with greatly improved resolution. While the initial commercial application for this technology is expected to be video teleconferencing, it also appears to have excellent potential for application in the 'Immersive Cockpit' concept. Additional progress is reported in the areas of Long Wave Infrared PI Imaging, Stereo PI concepts, PI-based Video-Servoing concepts, PI-based Video Navigation concepts, and Foveation concepts (to merge localized high-resolution views with immersive views).

  19. AXAF CCD Imaging Spectrometer (ACIS)

    NASA Astrophysics Data System (ADS)

    Garmire, G. P.

    1997-05-01

    The ACIS is an advanced X-ray camera for the AXAF scheduled to be launched in 1998. The camera is composed of two arrays of CCDs, one optimized for imaging using four CCDs abutted in a square array, and a linear array of six CCDs optimized for imaging the dispersed spectrum formed by the High and Medium Energy Transmission Grating Spectrometers. The imaging array is tipped with respect to the optical axis to better approximate the curved focal surface formed by the AXAF Wolter Type I optics. The spectroscopic array has a slight tilt to follow the Rowland circle of the grating focus. The CCD camera and electronics were built at the MIT Center for Space Research and Lincoln Laboratory. Much of the thermal and mechanical design as well as the power system were carried out at Lockheed-Martin in Denver, Colorado. The CCDs have been calibrated at MIT and at the BESSY synchrotron in Berlin, Germany. The entire flight instrument has been calibrated at the XRCF facility at Marshall Space Flight Center in Huntsville, Alabama. The anticipated instrument performance characteristics based on the calibration results will be presented. A few examples of possible observations will serve to illustrate the great scientific capabilities of the AXAF.

  20. The future scientific CCD

    NASA Technical Reports Server (NTRS)

    Janesick, J. R.; Elliott, T.; Collins, S.; Marsh, H.; Blouke, M. M.

    1984-01-01

    Since the first introduction of charge-coupled devices (CCDs) in 1970, CCDs have been considered for applications related to memories, logic circuits, and the detection of visible radiation. It is pointed out, however, that the mass market orientation of CCD development has left largely untapped the enormous potential of these devices for advanced scientific instrumentation. The present paper has, therefore, the objective to introduce the CCD characteristics to the scientific community, taking into account prospects for further improvement. Attention is given to evaluation criteria, a summary of current CCDs, CCD performance characteristics, absolute calibration tools, quantum efficiency, aspects of charge collection, charge transfer efficiency, read noise, and predictions regarding the characteristics of the next generation of silicon scientific CCD imagers.

  1. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  2. CCD Imaging of KIC 8462852

    NASA Astrophysics Data System (ADS)

    Lahey, Adam

    2016-06-01

    A particularly interesting star, KIC 8462852, recently became famous for its enigmatic dips in brightness. The interpretation broadcast by many popular media outlets was that the dips were caused by a megastructure built around the star by an intelligent civilization. The best scientific hypothesis relies on a natural phenomenon: the break-up of a comet orbiting the star. To further address this problem, we have measured the star for four months using BGSU’s 0.5m telescope and digital CCD camera, and we present the star’s brightness as a function of time. Using three very clear nights, we refined the brightness of four comparison stars which can be used by the local astronomical community to monitor the star’s brightness. These newly refined magnitudes should reduce the uncertainties in our brightness measurements; this error analysis is essential in determining the significance of any brightness deviations. An observed dip in brightness would confirm the comet hypothesis by establishing a cyclical pattern, or may serve as a basis for new understanding of variable stars. An additional element to the project involves creating CCD calibration images and a well-documented procedure for future use.
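    Monitoring the star against refined comparison stars is standard differential photometry: the target magnitude follows from its flux ratio to a comparison star of known magnitude. A minimal sketch of that relation (illustrative; not the authors' reduction pipeline):

    ```python
    import math

    def differential_magnitude(target_counts, comp_counts, comp_mag):
        """Target magnitude from the flux ratio to a comparison star of
        known magnitude: m_t = m_c - 2.5 * log10(F_t / F_c)."""
        return comp_mag - 2.5 * math.log10(target_counts / comp_counts)
    ```

    Refining the comparison-star magnitudes tightens comp_mag, and that uncertainty propagates directly into the target's light curve, which is why the refined values matter for judging the significance of any dip.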

  3. Concerning the Video Drift Method to Measure Double Stars

    NASA Astrophysics Data System (ADS)

    Nugent, Richard L.; Iverson, Ernest W.

    2015-05-01

    Classical methods to measure position angles and separations of double stars rely on just a few measurements, either from visual observations or photographic means. Visual and photographic CCD observations are subject to errors from the following sources: misalignments from eyepiece/camera/barlow lens/micrometer/focal reducers, systematic errors from uncorrected optical distortions, aberrations from the telescope system, camera tilt, and magnitude and color effects. Conventional video methods rely on calibration doubles and graphically calculating the east-west direction, plus careful choice of select video frames stacked for measurement. Atmospheric motion is one of the larger sources of error in any exposure/measurement method, and is on the order of 0.5-1.5 arcsec. Ideally, if a data set from a short video can be used to derive position angle and separation, with each data set self-calibrating independently of any calibration doubles or star catalogues, this would provide measurements of high systematic accuracy. These aims are achieved by the video drift method first proposed by the authors in 2011. This self-calibrating video method automatically analyzes thousands of measurements from a short video clip.
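    The self-calibration exploits the sidereal drift itself: with the telescope drive off, a star at declination δ crosses the sensor at 15.041·cos(δ) arcsec per second of time, and the drift direction fixes the east-west axis, so plate scale and orientation come from the video alone. A sketch of the plate-scale part (illustrative; the authors' software automates this over thousands of frames):

    ```python
    import math

    SIDEREAL_RATE = 15.041  # arcsec of apparent motion per second of time, at dec = 0

    def plate_scale_arcsec_per_pixel(dec_deg, drift_seconds, drift_pixels):
        """Plate scale from a timed drift: with the drive off, a star at
        declination dec_deg drifts 15.041*cos(dec) arcsec per second."""
        arcsec = SIDEREAL_RATE * math.cos(math.radians(dec_deg)) * drift_seconds
        return arcsec / drift_pixels
    ```

    Because the drift track's direction on the sensor simultaneously defines east-west, no calibration doubles or star catalogues are needed.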

  4. Event-Driven Random-Access-Windowing CCD Imaging System

    NASA Technical Reports Server (NTRS)

    Monacos, Steve; Portillo, Angel; Ortiz, Gerardo; Alexander, James; Lam, Raymond; Liu, William

    2004-01-01

    A charge-coupled-device (CCD) based high-speed imaging system, called a real-time, event-driven (RARE) camera, is undergoing development. This camera is capable of readout from multiple subwindows [also known as regions of interest (ROIs)] within the CCD field of view. Both the sizes and the locations of the ROIs can be controlled in real time and can be changed at the camera frame rate. The predecessor of this camera was described in "High-Frame-Rate CCD Camera Having Subwindow Capability" (NPO-30564), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 26. The architecture of the prior camera requires tight coupling between camera control logic and an external host computer that provides commands for camera operation and processes pixels from the camera. This tight coupling limits the attainable frame rate and functionality of the camera. The design of the present camera loosens this coupling to increase the achievable frame rate and functionality. From a host computer perspective, the readout operation in the prior camera was defined on a per-line basis; in this camera, it is defined on a per-ROI basis. In addition, the camera includes internal timing circuitry. This combination of features enables real-time, event-driven operation for adaptive control of the camera. Hence, this camera is well suited for applications requiring autonomous control of multiple ROIs to track multiple targets moving throughout the CCD field of view. Additionally, by eliminating the need for control intervention by the host computer during the pixel readout, the present design reduces ROI-readout times to attain higher frame rates. This camera (see figure) includes an imager card consisting of a commercial CCD imager and two signal-processor chips. The imager card converts transistor/transistor-logic (TTL)-level signals from a field programmable gate array (FPGA) controller card.
These signals are transmitted to the imager card via a low-voltage differential signaling (LVDS) cable assembly. The FPGA controller card is connected to the host computer via a standard peripheral component interface (PCI).

  5. On the development of new SPMN diurnal video systems for daylight fireball monitoring

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.

    2008-09-01

    Daylight fireball video monitoring. High-sensitivity video devices are commonly used for the study of the activity of meteor streams during the night. These provide useful data for the determination, for instance, of radiant, orbital and photometric parameters ([1] to [7]). With this aim, during 2006 three automated video stations supported by Universidad de Huelva were set up in Andalusia within the framework of the SPanish Meteor Network (SPMN). These are endowed with 8-9 high-sensitivity wide-field video cameras that achieve a meteor limiting magnitude of about +3. These stations have increased the coverage performed by the low-scan-rate all-sky CCD systems operated by the SPMN and, besides, achieve a time accuracy of about 0.01 s for determining the appearance of meteor and fireball events. Despite these nocturnal monitoring efforts, we realised the need to set up stations for daylight fireball detection. Such effort was also motivated by the two recent meteorite-dropping events of Villalbeto de la Peña [8,9] and Puerto Lápice [10]. Although the Villalbeto de la Peña event was casually videotaped and photographed, no direct pictures or videos were obtained for the Puerto Lápice event. Consequently, in order to perform continuous recording of daylight fireball events, we set up new automated systems based on CCD video cameras. However, the development of these video stations raises several issues with respect to nocturnal systems that must be properly solved in order to achieve optimal operation. The first of these video stations, also supported by the University of Huelva, was set up in Sevilla (Andalusia) during May 2007. Of course, fireball association is unequivocal only in those cases when two or more stations recorded the fireball, and when consequently the geocentric radiant is accurately determined.
With this aim, a second diurnal video station is being set up in Andalusia in the facilities of Centro Internacional de Estudios y Convenciones Ecológicas y Medioambientales (CIECEM, University of Huelva), in the environment of Doñana Natural Park (Huelva province). In this way, both stations, which are separated by a distance of 75 km, will work as a double video station system in order to provide trajectory and orbit information of major bolides and, thus, increase the chance of meteorite recovery in the Iberian Peninsula. The new diurnal SPMN video stations are endowed with different models of Mintron cameras (Mintron Enterprise Co., LTD). These are high-sensitivity devices that employ a colour 1/2" Sony interline transfer CCD image sensor. Aspherical lenses are attached to the video cameras in order to maximize image quality. However, the use of fast lenses is not a priority here: while most of our nocturnal cameras use f0.8 or f1.0 lenses in order to detect meteors as faint as magnitude +3, diurnal systems employ in most cases f1.4 to f2.0 lenses. Their focal length ranges from 3.8 to 12 mm to cover different atmospheric volumes. The cameras are arranged in such a way that the whole sky is monitored from every observing station. Figure 1. A daylight event recorded from Sevilla on May 26, 2008 at 4h30m05.4 ±0.1s UT. The way our diurnal video cameras work is similar to the operation of our nocturnal systems [1]. Thus, diurnal stations are automatically switched on and off at sunrise and sunset, respectively. The images taken at 25 fps and with a resolution of 720x576 pixels are continuously sent to PC computers through a video capture device. The computers run software (UFOCapture, by SonotaCo, Japan) that automatically registers meteor trails and stores the corresponding video frames on hard disk.
Besides, before the signal from the cameras reaches the computers, a video time inserter that employs a GPS device (KIWI-OSD, by PFD Systems) inserts time information on every video frame. This allows us to measure time precisely (to about 0.01 s) along the whole fireball path. [EPSC Abstracts, Vol. 3, EPSC2008-A-00319, European Planetary Science Congress 2008] However, one of the issues with respect to nocturnal observing stations is the high number of false detections as a consequence of several factors: higher activity of birds and insects, reflection of sunlight on planes and helicopters, etc. Sometimes some of these false events follow a pattern very similar to fireball trails, which makes the use of a second station absolutely necessary in order to discriminate between them. Another key issue is related to the passage of the Sun across the field of view of some of the cameras. In fact, special care is necessary here to avoid any damage to the CCD sensor. Besides, depending on atmospheric conditions (dust or moisture, for instance), the Sun may saturate most of the video frame. To solve this, our automated system determines which camera is pointing towards the Sun at a given moment and disconnects it. As the cameras are endowed with auto-iris lenses, disconnection means that the optics is fully closed and, so, the CCD sensor is protected. This, of course, means that while this happens the atmospheric volume covered by the corresponding camera is not monitored. It must also be taken into account that, in general, operating temperatures are higher for diurnal cameras. This results in higher thermal noise and, so, poses some difficulties for the detection software. To minimize this effect, it is necessary to employ CCD video cameras with a proper signal-to-noise ratio. Refrigeration of the CCD sensor with, for instance, a Peltier system, can also be considered.
The astrometric reduction procedure is also somewhat different for daytime events: it requires that reference objects be located within the field of view of every camera in order to calibrate the corresponding images. This is done by allowing every camera to capture distant buildings that, by means of said calibration, allow us to obtain the equatorial coordinates of the fireball along its path by measuring its corresponding X and Y positions on every video frame. Such calibration can be performed from star positions measured in nocturnal images taken with the same cameras. Once made, if the cameras are not moved, it is possible to estimate the equatorial coordinates of any future fireball event. We do not use any software for automatic astrometry of the images. This crucial step is made via direct measurement of the pixel positions, as in all our previous work. Then, from these astrometric measurements, our software estimates the atmospheric trajectory and radiant for each fireball ([10] to [13]). During 2007 and 2008 the SPMN has also set up other diurnal stations based on 1/3" progressive-scan CMOS sensors attached to modified wide-field lenses covering a 120x80 degree FOV. They are placed in Andalusia: El Arenosillo (Huelva), La Mayora (Málaga) and Murtas (Granada). They also have night sensitivity thanks to an infrared cut filter (ICR), which enables the camera to perform well in both high- and low-light conditions in colour, as well as to provide IR-sensitive black/white video at night. Conclusions First detections of daylight fireballs by CCD video cameras are being achieved in the SPMN framework. Future expansion and setup of new observing stations is currently being planned. The future establishment of additional diurnal SPMN stations will allow an increase in the number of daytime fireballs detected. This will also increase our chance of meteorite recovery.
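The calibration described above, which ties measured pixel (X, Y) positions to sky coordinates using reference objects, can be sketched as a simple linear plate solution fitted by least squares. The pixel and standard-coordinate values below are invented illustration data, not SPMN measurements, and the function name is ours:

```python
import numpy as np

# Hypothetical calibration data: pixel positions of reference objects and
# their known standard coordinates (xi, eta). These numbers are invented;
# real values come from star fields or surveyed distant buildings.
pix = np.array([[100.0, 120.0], [640.0, 90.0], [300.0, 500.0], [620.0, 430.0]])
std = pix * 1.0e-4  # pretend plate scale: xi = 1e-4 * x, eta = 1e-4 * y

# Fit the linear plate model  xi = a*x + b*y + c,  eta = d*x + e*y + f.
A = np.column_stack([pix[:, 0], pix[:, 1], np.ones(len(pix))])
coef_xi, *_ = np.linalg.lstsq(A, std[:, 0], rcond=None)
coef_eta, *_ = np.linalg.lstsq(A, std[:, 1], rcond=None)

def pixel_to_standard(x, y):
    """Map a measured fireball pixel position to standard coordinates."""
    v = np.array([x, y, 1.0])
    return float(v @ coef_xi), float(v @ coef_eta)

xi, eta = pixel_to_standard(400.0, 250.0)
```

As long as the cameras are not moved, the fitted coefficients can be reused for any future event, which is exactly the workflow the abstract describes.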

  6. Research of aerial camera focal plane micro-displacement measurement system based on Michelson interferometer

    NASA Astrophysics Data System (ADS)

    Wang, Shu-juan; Zhao, Yu-liang; Li, Shu-jun

    2014-09-01

The position of the aerial camera focal plane is critical to imaging quality. In order to correct aerial camera focal plane displacement introduced during maintenance, a new micro-displacement measuring system for the aerial camera focal plane based on a Michelson interferometer has been designed in this paper. It relies on the phase modulation principle and uses the interference effect to measure the micro-displacement of the focal plane. The system takes a He-Ne laser as the light source and uses the Michelson interference mechanism to produce interference fringes; as the aerial camera focal plane moves, the interference fringes change periodically, and the system records the period of this change to obtain the focal plane displacement. Taking a linear CCD and its driving system as the fringe pick-up tool, and relying on a frequency conversion and differentiating system, the system determines the moving direction of the focal plane. After data collection, filtering, amplification, threshold comparison and counting, the CCD video signals of the interference fringes are sent to the computer, processed automatically, and the focal plane micro-displacement results are output. As a result, the focal plane micro-displacement can be measured automatically by this system. Using a linear CCD as the fringe pick-up tool greatly improves the counting accuracy and almost eliminates manual counting error, improving the measurement accuracy of the system. The results of the experiments demonstrate that the focal plane displacement measurement accuracy is 0.2 nm, while laboratory and flight tests show that the aerial camera focal plane positioning is accurate and can satisfy the requirements of aerial camera imaging.
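The fringe-counting principle behind this kind of interferometric measurement reduces to a simple relation: each full fringe period corresponds to a mirror (here, focal-plane) displacement of half the laser wavelength. A minimal sketch; the 632.8 nm value is the standard He-Ne line, and the function name is ours:

```python
# Fringe-counting relation for a Michelson interferometer: a mirror
# displacement of half a wavelength shifts the pattern by one full fringe.
HE_NE_WAVELENGTH_NM = 632.8  # standard He-Ne laser line

def displacement_nm(fringe_count, direction=1):
    """Displacement for a counted number of fringe periods; `direction`
    (+1 or -1) comes from the direction-discrimination stage."""
    return direction * fringe_count * HE_NE_WAVELENGTH_NM / 2.0

d = displacement_nm(10)  # ten fringes of travel
```

The hardware described in the abstract (threshold comparison and counting of the linear-CCD signal) is essentially an electronic implementation of this count.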

  7. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components, including microchannel plate or proximity-focused diode image intensifiers, electrostatic image tubes, and electron-bombarded CCDs, affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.
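One common noise-and-sensitivity metric of the kind described can be sketched with the photon-transfer relation, which estimates system gain and signal-to-noise ratio from a stack of flat-field frames. The data below are synthetic and the analysis assumes a shot-noise-limited sensor; this is not the authors' evaluation software:

```python
import numpy as np

# Synthetic flat-field stack from a hypothetical shot-noise-limited camera:
# 50 frames of 64x64 pixels, Poisson-distributed around 1000 counts.
rng = np.random.default_rng(0)
frames = rng.poisson(lam=1000.0, size=(50, 64, 64)).astype(float)

# Photon-transfer estimates from temporal mean and variance per pixel.
mean_signal = frames.mean(axis=0).mean()
temporal_var = frames.var(axis=0).mean()
gain_estimate = mean_signal / temporal_var      # ~1 for pure Poisson counts
snr = mean_signal / np.sqrt(temporal_var)       # ~sqrt(signal) when shot-limited
```

Comparing such numbers across cameras is the kind of side-by-side characterization the abstract refers to.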

  8. BVR photometry and CCD spectroscopy of nova Del 2013

    NASA Astrophysics Data System (ADS)

    Santangelo, M. M. M.; Pasquini, M.

    2013-08-01

In the course of the CATS (Capannori Astronomical Transient Survey) project, M.M.M. Santangelo and M. Pasquini performed BVR photoelectric photometry and low-resolution CCD long-slit spectrometry of nova Delphini 2013. The measurements were made with an Optec SSP-5A single-channel photoelectric photometer (with a Hamamatsu R6358 photomultiplier tube), and with an SBIG SGS spectrometer + ST-7XME CCD camera attached to OAC's 0.30-m f/10 Schmidt-Cassegrain telescope.

  9. Application of a Two Camera Video Imaging System to Three-Dimensional Vortex Tracking in the 80- by 120-Foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1993-01-01

A description is presented of two enhancements for a two-camera video imaging system that increase the accuracy and efficiency of the system when applied to the determination of three-dimensional locations of points along a continuous line. These enhancements increase the utility of the system when extracting quantitative data from surface and off-body flow visualizations. The first enhancement utilizes epipolar geometry to resolve the stereo "correspondence" problem. This is the problem of determining, unambiguously, corresponding points in the stereo images of objects that do not have visible reference points. The second enhancement is a method to automatically identify and trace the core of a vortex in a digital image. This is accomplished by means of an adaptive template matching algorithm. The system was used to determine the trajectory of a vortex generated by the Leading-Edge eXtension (LEX) of a full-scale F/A-18 aircraft tested in the NASA Ames 80- by 120-Foot Wind Tunnel. The system accuracy for resolving the vortex trajectories is estimated to be +/-2 inches over a distance of 60 feet. Stereo images of some of the vortex trajectories are presented. The system was also used to determine the point where the LEX vortex "bursts". The vortex burst point locations are compared with those measured in small-scale tests and in flight and found to be in good agreement.
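Once corresponding points are matched across the two views, the three-dimensional location follows from standard linear (DLT) triangulation. The camera matrices and point below are arbitrary toy values, not the wind-tunnel geometry:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two 3x4 camera
    matrices and its normalized image coordinates in each view."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy pinhole cameras: identity pose and a unit baseline along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1 = X_true[:2] / X_true[2]                          # projection in camera 1
x2 = (X_true[:2] + np.array([-1.0, 0.0])) / X_true[2]  # projection in camera 2
X_rec = triangulate(P1, P2, x1, x2)
```

The epipolar constraint mentioned in the abstract is what makes the correspondence search one-dimensional before this triangulation step.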

  10. Television applications of interline-transfer CCD arrays

    NASA Technical Reports Server (NTRS)

    Hoagland, K. A.

    1976-01-01

The design features and characteristics of interline transfer (ILT) CCD arrays with 190 x 244 and 380 x 488 image elements are reviewed, with emphasis on optional operating modes and system application considerations. It was shown that the observed horizontal resolution for a TV system using an ILT image sensor can approach the aperture response limit determined by the photosensor site width, resulting in enhanced resolution for moving images. Preferred camera configurations and readout clocking modes for maximum resolution and low-light sensitivity are discussed, including a very low light level intensifier-CCD concept. Several camera designs utilizing ILT-CCD arrays are described. These cameras demonstrate feasibility in applications where small size, low-power/low-voltage operation, high sensitivity and extreme ruggedness are either desired or mandatory system requirements.

  11. High-resolution CCD imagers using area-array CCD's for sensing spectral components of an optical line image

    NASA Technical Reports Server (NTRS)

    Elabd, Hammam (Inventor); Kosonocky, Walter F. (Inventor)

    1987-01-01

    CCD imagers with a novel replicated-line-imager architecture are abutted to form an extended line sensor. The sensor is preceded by optics having a slit aperture and having an optical beam splitter or astigmatic lens for projecting multiple line images through an optical color-discriminating stripe filter to the CCD imagers. A very high resolution camera suitable for use in a satellite, for example, is thus provided. The replicated-line architecture of the imager comprises an area-array CCD, successive rows of which are illuminated by replications of the same line segment, as transmitted by respective color filter stripes. The charge packets formed by accumulation of photoresponsive charge in the area-array CCD are read out row by row. Each successive row of charge packets is then converted from parallel to serial format in a CCD line register and its amplitude sensed to generate a line of output signal.

  12. Design of a CCD controller optimized for mosaics

    NASA Astrophysics Data System (ADS)

    Leach, Robert W.

    1988-10-01

    A controller for operating Thomson-CSF CCDs in a 2 x N mosaic is described. It is designed around a monolithic Digital Signal Processor, a bank of digital-to-analog converters for clock generation, a simple video processor, and a fiber-optic serial data link communicating with an instrument control computer. The controller is compact, low power, low cost, fast, and easily programmable to generate waveforms of arbitrary timing whose voltages are also software controlled. Up to 16 CCDs can be efficiently controlled, and each CCD has its own set of clock drivers and a video processor, allowing customization of the readout of each CCD device.

  13. High-speed multicolour photometry with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Zhilyaev, B. E.; Reshetnyk, V. M.

    2012-11-01

We present the results of testing the commercial digital camera Nikon D90, with a CMOS sensor, for high-speed photometry with a small Celestron 11'' telescope at the Peak Terskol Observatory. The CMOS sensor allows photometry to be performed in 3 filters simultaneously, which gives a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system of CMOS sensors is close to the Johnson BVR system. The results of testing show that one can carry out photometric measurements with CMOS cameras for stars with V-magnitudes up to ≃14^{m} with a precision of 0.01^{m}. Stars with V-magnitudes up to ˜10 can be shot at 24 frames per second in video mode.
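The magnitudes quoted rest on the standard differential-photometry relation (Pogson's law), which converts a measured flux ratio between target and comparison star into a magnitude difference. A minimal sketch with made-up fluxes:

```python
import math

def differential_magnitude(flux_target, flux_ref, mag_ref):
    """Pogson's relation: magnitude of a target from its flux ratio
    to a comparison star of known magnitude."""
    return mag_ref - 2.5 * math.log10(flux_target / flux_ref)

# A target 100x fainter than a 9th-magnitude comparison star is mag 14.
m = differential_magnitude(10.0, 1000.0, 9.0)
```

With a Bayer sensor, the same relation is applied per colour channel, which is what makes simultaneous three-filter photometry possible.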

  14. Video-Level Monitor

    NASA Technical Reports Server (NTRS)

    Gregory, Ray W.

    1993-01-01

Video-level monitor developed to provide full-scene monitoring of video; it indicates the level of the brightest portion of the scene. The circuit design is nonspecific and can be inserted in any closed-circuit camera system utilizing RS170 or RS330 synchronization and standard CCTV video levels. The system is made of readily available, off-the-shelf components. Several units are in service.

  15. The use of video for air pollution source monitoring

    SciTech Connect

    Ferreira, F.; Camara, A.

    1999-07-01

The evaluation of air pollution impacts from single industrial emission sources is a complex environmental engineering problem. Recent developments in the multimedia technologies used by personal computers have improved the digitizing and processing of digital video sequences. This paper proposes a methodology where statistical analysis of both meteorological and air quality data, combined with digital video images, is used for monitoring air pollution sources. One of the objectives of this paper is to present the use of image processing algorithms in air pollution source monitoring. CCD amateur video cameras capture images that are further processed by computer. The use of video as a remote sensing system was implemented with the goal of determining particular parameters, either meteorological or related to air quality monitoring and the modeling of point sources. These parameters include the remote calculation of wind direction, wind speed, the stack gas outlet velocity, and the stack's effective emission height. The characteristics and behavior of a visible pollutant plume are also studied. Different sequences of relatively simple image processing operations are applied to the images gathered by the different cameras to segment the plume. The algorithms are selected depending on the atmospheric and lighting conditions. The developed system was applied to a 1,000 MW fuel power plant located at Setubal, Portugal. The methodology presented shows that digital video can be an inexpensive way to obtain useful air-pollution-related data for monitoring and modeling purposes.
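In its simplest form, the plume segmentation mentioned above is a threshold on pixel intensity; as the abstract notes, the operation sequence must be chosen per scene. A toy sketch on a synthetic frame, not the paper's algorithm:

```python
import numpy as np

# Toy grayscale frame: a bright plume (value 200) over darker sky (80).
frame = np.full((100, 100), 80, dtype=np.uint8)
frame[20:40, 30:70] = 200  # hypothetical plume region

mask = frame > 140          # fixed threshold; real use needs per-scene tuning
plume_area_px = int(mask.sum())
```

From such a mask, plume rise and spread can be tracked frame by frame, which is how geometric quantities like effective emission height become measurable from video.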

  16. CCD technology applied to laser cladding

    NASA Astrophysics Data System (ADS)

    Meriaudeau, Fabrice; Renier, Eric; Truchetet, Frederic

    1996-03-01

Power lasers are increasingly used in the aerospace and automobile industries; their widespread use in processes such as welding, drilling or coating, in order to perform surface treatments of materials, requires a better understanding. In order to control the quality of the process, many techniques have been developed, but most are based on post-mortem analysis of the samples and/or require a substantial financial investment. Welding, coating and other treatments involving material transformations are often controlled with a metallurgical analysis. We propose a new method, a new approach to the phenomena: we control the industrial process during the application. For this, we use information provided by two CCD cameras. One supplies information related to the intensity and geometry of the melted surface, the second about the shape of the powder distribution within the laser beam. Using data provided by post-mortem metallurgical analysis and correlating this information with the parameters measured by both CCDs, we create a data bank that represents the relation between the measured parameters and the quality of the coating. The combined information provided by the two CCD cameras allows us to optimize the industrial process. We are currently working on the real-time aspect of the application and expect an implementation of the system.

  17. Automatic processing method for astronomical CCD images

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Yang, Lei; Mao, Wei

    2002-12-01

Since several hundred CCD images are obtained with the CCD camera of the Lower Latitude Meridian Circle (LLMC) every observational night, it is essential to adopt an automatic processing method to find the initial position of each object in these images, to centre each detected object and to calculate its magnitude. In this paper several existing automatic search algorithms for objects in astronomical CCD images are reviewed. Our automatic searching algorithm is described, which includes five steps: background calculation, filtering, object detection, object identification, and defect elimination. Several existing two-dimensional centering algorithms are also reviewed, and our modified two-dimensional moment algorithm and an empirical formula for the centering threshold are presented. An algorithm for determining the magnitudes of objects is also presented. All these algorithms are programmed in the VC++ programming language. Finally, our method is tested with CCD images from the 1 m RCC telescope at Yunnan Observatory, and some preliminary results are given.
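The two-dimensional moment centering referred to above computes an intensity-weighted centroid of the detected object. A minimal sketch of the basic (unmodified) moment formula, written by us for illustration rather than taken from the LLMC code:

```python
import numpy as np

def moment_centroid(img):
    """First-order intensity moments give the (x, y) centroid."""
    img = np.asarray(img, dtype=float)
    y, x = np.indices(img.shape)
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total

# Symmetric test blob centred at column 2, row 1 of a 3x5 image.
img = np.zeros((3, 5))
img[1, 2] = 4.0
img[0, 2] = img[2, 2] = 1.0
img[1, 1] = img[1, 3] = 1.0
cx, cy = moment_centroid(img)
```

In practice a background estimate is subtracted and only pixels above a centering threshold contribute, which is where the paper's empirical threshold formula comes in.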

  18. Acquisition cameras and wavefront sensors for the GTC 10-m telescope

    NASA Astrophysics Data System (ADS)

    Kohley, Ralf; Suárez Valles, Marcos; Burley, Gregory S.; Cavaller Marqués, Lluis; Vilela, Rafael; Justribó, Tomás

    2004-09-01

The GTC Acquisition Cameras and Wavefront Sensors are based on a modular design with remote, low-profile and lightweight CCD heads and a compact CCD controller. The cameras employ E2V Technologies Peltier-cooled CCD47-20 and CCD39-01 detectors, which achieve 1 Hz and 200 Hz full-frame readouts, respectively. The CCD controller is a modified version of the Magellan CCD controller (Greg Burley, OCIW), which is linked to the GTC control system. We present the detailed design and first performance results of the cameras.

  19. A New Data System for the San Fernando Observatory Video Spectra-Spectroheliograph

    NASA Astrophysics Data System (ADS)

    Walton, S. R.; Chapman, G. A.

    1997-12-01

The San Fernando Observatory Video Spectra-Spectroheliograph (SFO VSSHG) has been used for observation of vector magnetic fields on the Sun for the last several years, and was described in Walton and Chapman (1996), Solar Phys. 166, 267. The current VSSHG camera is a commercial video format (512 by 480) CCD camera from which spectra are recorded on analog 3/4'' professional-grade videocassettes. Recently, commercial off-the-shelf hardware has become available which can equal the high speed and capacity of this system in a pure digital mode. We are developing a new data system for the VSSHG consisting of a 1024 x 1024 digital CCD camera capable of 15 frames per second, an Intel Pentium-II based personal computer with a fast-wide SCSI hard disk, and a DLT-7000 digital linear tape drive. This combination of off-the-shelf hardware, purchased for about $30,000, should achieve the data rate of 7.5 megabytes per second (MB/s) required for recording 5 frames per second from the CCD camera to the hard disk in real time, which is sufficient for the VSSHG. The DLT tape drive can record 35 gigabytes at a rate of 5 MB/s uncompressed, and a small amount of data compression should allow it to record spectra in real time as well. As of this writing, only the computer has been received, but preliminary tests show that its hard disk performs at speeds well over 10 MB/s with no special optimizations. We will take delivery of the camera soon, and hope to have the first images with the new camera early this winter. A detailed description of the data system and on-line processing algorithms will be presented.

  20. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier to the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first is intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height, and is important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which leads to an estimate of the distance between cameras, and is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
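The pixel-to-meter conversion enabled by tilt angle, focal length and camera height can be illustrated with a simple flat-ground pinhole model; this is a generic sketch of the geometry, not the paper's estimator:

```python
import math

def ground_distance(height_m, tilt_rad, focal_px, v_px):
    """Horizontal distance to the ground point imaged v_px below the
    principal point, for a camera at height_m tilted tilt_rad below
    the horizon (flat-ground pinhole model)."""
    ray_angle = tilt_rad + math.atan2(v_px, focal_px)
    return height_m / math.tan(ray_angle)

# A camera 5 m up, tilted 45 degrees: its principal ray hits the
# ground 5 m out from the point directly below the camera.
d = ground_distance(5.0, math.pi / 4, 1000.0, 0.0)
```

Inverting this relation for detected pedestrian foot positions is one way the estimated tilt, focal length and height translate pixel coordinates into metric ones.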

  1. Cone penetrometer deployed in situ video microscope for characterizing sub-surface soil properties

    SciTech Connect

    Lieberman, S.H.; Knowles, D.S.; Kertesz, J.

    1997-12-31

In this paper we report on the development and field testing of an in situ video microscope that has been integrated with a cone penetrometer probe in order to provide a real-time method for characterizing subsurface soil properties. The video microscope system consists of a miniature CCD color camera coupled with appropriate magnification and focusing optics to provide a field of view with a coverage of approximately 20 mm. The camera/optics system is mounted in a cone penetrometer probe so that the camera views the soil that is in contact with a sapphire window mounted on the side of the probe. The soil outside the window is illuminated by diffuse light provided through the window by an optical fiber illumination system connected to a white light source at the surface. The video signal from the camera is returned to the surface, where it can be displayed in real time on a video monitor, recorded on a video cassette recorder (VCR), and/or captured digitally with a frame grabber installed in a microcomputer system. In its highest resolution configuration, the in situ camera system has demonstrated a capability to resolve particle sizes as small as 10 μm. By using other lens systems to increase the magnification factor, smaller particles could be resolved; however, the field of view would be reduced. Initial field tests have demonstrated the ability of the camera system to provide real-time qualitative characterization of soil particle sizes. In situ video images also reveal information on the porosity of the soil matrix and the presence of water in the saturated zone. Current efforts are focused on the development of automated image processing techniques as a means of extracting quantitative information on soil particle size distributions. Data will be presented comparing results derived from digital images with conventional sieve/hydrometer analyses.
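Converting a measured particle extent from pixels to physical size only needs the imaged field of view. The ~20 mm coverage comes from the abstract; the sensor width in pixels is an assumed value for illustration (the system's highest-resolution 10 μm mode implies a finer scale than this example):

```python
# Pixel-to-size conversion for an in situ microscope image.
def particle_size_um(extent_px, fov_mm=20.0, sensor_px=640):
    """Particle size in micrometres from its extent in pixels, given the
    field of view in mm and sensor width in pixels (assumed values)."""
    um_per_px = fov_mm * 1000.0 / sensor_px
    return extent_px * um_per_px

s = particle_size_um(2)  # a 2-pixel particle at this scale
```

This scale factor is the first calibration any automated particle-size-distribution analysis of the frames would apply.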

  2. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access and global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities, equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill factor. 3D vision, which relies on stereo or on time-of-flight high-speed circuitry, will also benefit from scaled-down CMOS technologies, both because of their size and because of their higher speed.

  3. Upgrades to NDSF Vehicle Camera Systems and Development of a Prototype System for Migrating and Archiving Video Data in the National Deep Submergence Facility Archives at WHOI

    NASA Astrophysics Data System (ADS)

    Fornari, D.; Howland, J.; Lerner, S.; Gegg, S.; Walden, B.; Bowen, A.; Lamont, M.; Kelley, D.

    2003-12-01

In recent years, considerable effort has been made to improve the visual recording capabilities of Alvin and ROV Jason. This has culminated in the routine use of digital cameras, both internal and external, on these vehicles, which has greatly expanded the scientific recording capabilities of the NDSF. The UNOLS National Deep Submergence Facility (NDSF) archives maintained at Woods Hole Oceanographic Institution (WHOI) are the repository for the diverse suite of photographic still images (both 35mm and, recently, digital), video imagery, vehicle data and navigation, and near-bottom side-looking sonar data obtained by the facility vehicles. These data comprise a unique set of information from a wide range of seafloor environments over the more than 25 years of NDSF operations in support of science. Included in the holdings are Alvin data plus data from the tethered vehicles: ROV Jason, Argo II, and the DSL-120 side scan sonar. This information conservatively represents an outlay in facilities and science costs well in excess of $100 million. Several archive-related improvement issues have become evident over the past few years. The most critical are: 1. migration of and better access to the 35mm Alvin and Jason still images through digitization and proper cataloging with relevant meta-data; 2. assessing Alvin data logger data, migrating data on older media no longer in common use, and properly labeling and evaluating vehicle attitude and navigation data; 3. migrating older Alvin and Jason video data, especially data recorded on Hi-8 tape that is very susceptible to degradation on each replay, to newer digital format media such as DVD; 4. improving the capabilities of the NDSF archives to better serve the increasingly complex needs of the oceanographic community, including researchers involved in focused programs like Ridge2000 and MARGINS, where viable distributed databases in various disciplinary topics will form an important component of the data management structure. We report on an archiving effort to transfer video footage currently on Hi-8 and VHS tape to digital media (DVD). At the same time, frame-grab imagery at reasonable resolution (640x480) at 30 sec. intervals will be compiled, and the images will be integrated, as much as possible, with vehicle attitude/navigation data and provided to the user community in a web-browser format, as has already been done for the recent Jason and Alvin frame-grabbed imagery. The frame-grabbed images will be tagged with time, thereby permitting integration of vehicle attitude and navigation data once that is available. In order to prototype this system, we plan to utilize data from the East Pacific Rise and Juan de Fuca Ridge, which are field areas selected by the community as Ridge2000 Integrated Study Sites. There are over 500 Alvin dives in both these areas, and having frame-grabbed, synoptic views of the terrains covered during those dives will be invaluable for scientific and outreach use as part of Ridge2000. We plan to coordinate this activity with the Ridge2000 Data Management Office at LDEO.

  4. Characterizing the response of charge-couple device digital color cameras

    NASA Astrophysics Data System (ADS)

    Slavkovikj, Viktor; Hardeberg, Jon Yngve; Eichhorn, Alexander

    2012-03-01

The advance and rapid development of electronic imaging technology has led the way to the production of imaging sensors capable of acquiring good-quality digital images at high resolution. At the same time, the cost and size of imaging devices have been reduced. This has spurred increasing research interest in techniques that use images obtained from multiple camera arrays. The use of multi-camera arrays is attractive because it allows capturing multi-view images of dynamic scenes, enabling the creation of novel computer vision and computer graphics applications, as well as next-generation video and television systems. There are additional challenges when using a multi-camera array, however. Due to inconsistencies in the fabrication process of imaging sensors and filters, multi-camera arrays exhibit inter-camera color response variations. In this work we characterize and compare the response of two digital color cameras, which have a light sensor based on the charge-coupled device (CCD) array architecture. The results of the response characterization process can be used to model the cameras' responses, which is an important step when constructing a multi-camera array system.

  5. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1983-08-02

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid. 4 figs.

  6. Video flowmeter

    DOEpatents

    Lord, David E.; Carter, Gary W.; Petrini, Richard R.

    1983-01-01

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid (10) containing entrained particles (12) is formed and positioned by a rod optic lens assembly (31) on the raster area of a low-light level television camera (20). The particles (12) are illuminated by light transmitted through a bundle of glass fibers (32) surrounding the rod optic lens assembly (31). Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen (40). The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid (10).

  7. Video flowmeter

    DOEpatents

    Lord, D.E.; Carter, G.W.; Petrini, R.R.

    1981-06-10

    A video flowmeter is described that is capable of specifying flow nature and pattern and, at the same time, the quantitative value of the rate of volumetric flow. An image of a determinable volumetric region within a fluid containing entrained particles is formed and positioned by a rod optic lens assembly on the raster area of a low-light level television camera. The particles are illuminated by light transmitted through a bundle of glass fibers surrounding the rod optic lens assembly. Only particle images having speeds on the raster area below the raster line scanning speed may be used to form a video picture which is displayed on a video screen. The flowmeter is calibrated so that the locus of positions of origin of the video picture gives a determination of the volumetric flow rate of the fluid.

  8. Distributing digital video to multiple computers

    PubMed Central

    Murray, James A.

    2004-01-01

    Video is an effective teaching tool, and live video microscopy is especially helpful in teaching dissection techniques and the anatomy of small neural structures. Digital video equipment is more affordable now and allows easy conversion from older analog video devices. Here I describe a simple technique for bringing digital video from one camera to all of the computers in a single room. This technique allows students to view and record the video from a single camera on a microscope. PMID:23493464

  9. Megapixel imaging camera for expanded H⁻ beam measurements

    SciTech Connect

    Simmons, J.E.; Lillberg, J.W.; McKee, R.J.; Slice, R.W.; Torrez, J.H.; McCurnin, T.W.; Sanchez, P.G.

    1994-02-01

    A charge coupled device (CCD) imaging camera system has been developed as part of the Ground Test Accelerator project at the Los Alamos National Laboratory to measure the properties of a large diameter, neutral particle beam. The camera is designed to operate in the accelerator vacuum system for extended periods of time. It would normally be cooled to reduce dark current. The CCD contains 1024 × 1024 pixels with a pixel size of 19 × 19 μm², with four-phase parallel clocking and two-phase serial clocking. The serial clock rate is 2.5 × 10⁵ pixels per second. Clock sequence and timing are controlled by an external logic-word generator. The DC bias voltages are likewise located externally. The camera contains circuitry to generate the analog clocks for the CCD and also contains the output video signal amplifier. Reset switching noise is removed by an external signal processor that employs delay elements to provide noise suppression by the method of double-correlated sampling. The video signal is digitized to 12 bits in an analog to digital converter (ADC) module controlled by a central processor module. Both modules are located in a VME-type computer crate that communicates via Ethernet with a separate workstation where overall control is exercised and image processing occurs. Under cooled conditions the camera shows good linearity with a dynamic range of 2000 and with dark noise fluctuations of about ±1/2 ADC count. Full well capacity is about 5 × 10⁵ electron charges.
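
    The double-correlated sampling step mentioned above can be illustrated in a few lines: the processor samples the video waveform once at the reset level and once after charge transfer, and the difference cancels the reset switching noise common to both samples. The voltage values below are made up for illustration.

```python
# Double-correlated sampling sketch: both samples share the same reset
# pedestal (including its switching noise), so differencing removes it.

def correlated_double_sample(reset_level_v: float, signal_level_v: float) -> float:
    """Pixel value (volts of photo-signal) with reset noise cancelled."""
    return signal_level_v - reset_level_v

# A pixel whose reset pedestal is 0.35 V (noise included) and whose video
# level after charge transfer is the pedestal plus 0.12 V of signal:
pedestal = 0.35
signal = correlated_double_sample(pedestal, pedestal + 0.12)
print(round(signal, 3))  # 0.12
```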

  10. Panoramic video in video-mediated education

    NASA Astrophysics Data System (ADS)

    Ouglov, Andrei; Hjelsvold, Rune

    2004-12-01

    This paper discusses the use of panoramic video and its benefits in video-mediated education. A panoramic view is generated by covering the blackboard with two or more cameras and then stitching the captured videos together. This paper describes the properties and advantages of multi-camera, panoramic video compared to single-camera approaches. One important difference between panoramic video and regular video is that the former has a wider field of view (FOV). As a result, the blackboard covers a larger part of the video screen and the information density is increased. Most importantly, the size of the letters written on the blackboard is enlarged, which improves the student's ability to clearly read what is written on the blackboard. The panoramic view also allows students to focus their attention on different parts of the blackboard in the same way they would be able to in the classroom. This paper also discusses the results from a study among students in which a panoramic view was tested against single-camera views. The study indicates that the students preferred the panoramic view. The students also suggested improvements that could make panoramic video even more beneficial.

  11. Panoramic video in video-mediated education

    NASA Astrophysics Data System (ADS)

    Ouglov, Andrei; Hjelsvold, Rune

    2005-01-01

    This paper discusses the use of panoramic video and its benefits in video-mediated education. A panoramic view is generated by covering the blackboard with two or more cameras and then stitching the captured videos together. This paper describes the properties and advantages of multi-camera, panoramic video compared to single-camera approaches. One important difference between panoramic video and regular video is that the former has a wider field of view (FOV). As a result, the blackboard covers a larger part of the video screen and the information density is increased. Most importantly, the size of the letters written on the blackboard is enlarged, which improves the student's ability to clearly read what is written on the blackboard. The panoramic view also allows students to focus their attention on different parts of the blackboard in the same way they would be able to in the classroom. This paper also discusses the results from a study among students in which a panoramic view was tested against single-camera views. The study indicates that the students preferred the panoramic view. The students also suggested improvements that could make panoramic video even more beneficial.

  12. Extreme ultraviolet response of a Tektronix 1024 x 1024 CCD

    NASA Astrophysics Data System (ADS)

    Moses, Daniel J.; Hochedez, Jean-Francois E.; Howard, Russell A.; Au, Benjamin D.; Wang, Dennis; Blouke, Morley

    1992-08-01

    The goal of the detector development program for the Solar and Heliospheric Observatory (SOHO) EUV Imaging Telescope (EIT) is an Extreme UltraViolet (EUV) CCD (Charge Coupled Device) camera. The Naval Research Lab (NRL) SOHO CCD Group has developed a design for the EIT camera and is screening CCDs for flight application. Tektronix Inc. has fabricated 1024 x 1024 CCDs for the EIT program. As part of the CCD screening effort, the quantum efficiency (QE) of a prototype CCD has been measured in the NRL EUV laboratory over the wavelength range of 256 to 735 Angstroms. A simplified model has been applied to these QE measurements to illustrate the relevant physical processes that determine the performance of the detector.

  13. CCD image sensor induced error in PIV applications

    NASA Astrophysics Data System (ADS)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (˜0.1 pixels). This is the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a modeling for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, that can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.
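
    The abstract states that the readout bias error is captured by a calibration model with just two camera-specific constants, driven by the illumination difference between the two exposures. The paper's actual functional form is not reproduced in this abstract; the sketch below is a purely hypothetical two-constant model, meant only to show how such a calibration might be applied.

```python
import math

# Hypothetical two-constant readout-bias model (NOT the paper's model):
# bias grows with the log of the frame-to-frame intensity ratio and
# saturates at a camera-specific maximum b_max; k sets the slope.

def readout_bias_px(intensity_ratio: float, b_max: float, k: float) -> float:
    """Predicted particle-image position bias in pixels."""
    return b_max * math.tanh(k * math.log(intensity_ratio))

# Equal illumination in both exposures -> no bias error.
print(readout_bias_px(1.0, b_max=0.1, k=2.0))  # 0.0
# A 2:1 illumination difference -> a bias approaching the ~0.1 px scale.
print(round(readout_bias_px(2.0, b_max=0.1, k=2.0), 3))
```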

  14. CCD high-speed videography system with new concepts and techniques

    NASA Astrophysics Data System (ADS)

    Zheng, Zengrong; Zhao, Wenyi; Wu, Zhiqiang

    1997-05-01

    A novel CCD high speed videography system with brand-new concepts and techniques is developed by Zhejiang University recently. The system can send a series of short flash pulses to the moving object. All of the parameters, such as flash numbers, flash durations, flash intervals, flash intensities and flash colors, can be controlled according to needs by the computer. A series of moving object images frozen by flash pulses, carried information of moving object, are recorded by a CCD video camera, and result images are sent to a computer to be frozen, recognized and processed with special hardware and software. Obtained parameters can be displayed, output as remote controlling signals or written into CD. The highest videography frequency is 30,000 images per second. The shortest image freezing time is several microseconds. The system has been applied to wide fields of energy, chemistry, medicine, biological engineering, aero- dynamics, explosion, multi-phase flow, mechanics, vibration, athletic training, weapon development and national defense engineering. It can also be used in production streamline to carry out the online, real-time monitoring and controlling.

  15. Video-based beam position monitoring at CHESS

    NASA Astrophysics Data System (ADS)

    Revesz, Peter; Pauling, Alan; Krawczyk, Thomas; Kelly, Kevin J.

    2012-10-01

    CHESS has pioneered the development of X-ray Video Beam Position Monitors (VBPMs). Unlike traditional photoelectron beam position monitors that rely on photoelectrons generated by the fringe edges of the X-ray beam, with VBPMs we collect information from the whole cross-section of the X-ray beam. VBPMs can also give real-time shape/size information. We have developed three types of VBPMs: (1) VBPMs based on helium luminescence from the intense white X-ray beam. In this case the CCD camera views the luminescence from the side. (2) VBPMs based on luminescence of a thin (~50 micron) CVD diamond sheet as the white beam passes through it. The CCD camera is placed outside the beam line vacuum and views the diamond fluorescence through a viewport. (3) Scatter-based VBPMs. In this case the white X-ray beam passes through a thin graphite filter or Be window. The scattered X-rays create an image of the beam's footprint on an X-ray sensitive fluorescent screen using a slit placed outside the beam line vacuum. For all VBPMs we use relatively inexpensive 1.3-megapixel CCD cameras connected via USB to a Windows host for image acquisition and analysis. The VBPM host computers are networked and provide live images of the beam and streams of data about the beam position, profile and intensity to CHESS's signal logging system and to the CHESS operator. The operational use of VBPMs has shown a great advantage over the traditional BPMs by providing direct visual input for the CHESS operator. The VBPM precision in most cases is on the order of ~0.1 micron. On the down side, the data acquisition period (50-1000 ms) is inferior to that of the photoelectron-based BPMs. In the future, with the use of more expensive fast cameras, we will be able to create VBPMs working at the few-hundred-hertz scale.
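
    The per-frame reduction a VBPM host performs (camera image in, beam position out) can be sketched with an intensity-weighted centroid. This is a pure-Python illustration, not CHESS code; a real system would also subtract background and calibrate pixels to microns.

```python
# Minimal beam-position reduction: intensity-weighted centroid of a
# 2D image given as a list of rows of pixel values.

def beam_centroid(image):
    """Return the (x, y) intensity-weighted centroid of a 2D image."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

# A tiny synthetic beam spot centered on the middle pixel:
img = [
    [0, 1, 0],
    [1, 8, 1],
    [0, 1, 0],
]
print(beam_centroid(img))  # (1.0, 1.0)
```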

  16. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex cameras (DSLRs) have appeared. As a consequence we can buy cameras of higher and higher pixel counts, and mass production has resulted in a great reduction of prices. The CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the quantity and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these in their own circumstances.
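
    The final step of the double-star measurement described above can be sketched in a few lines, assuming the star centroids have already been measured in pixels and the plate scale and camera orientation are calibrated. The coordinate convention and numbers below are illustrative assumptions, not the author's procedure.

```python
import math

# Separation and position angle from pixel coordinates, assuming the
# frame is oriented with +y toward celestial north and +x toward east.

def separation_and_pa(primary_xy, secondary_xy, arcsec_per_px: float):
    """Return (separation in arcsec, position angle in degrees N through E)."""
    dx = secondary_xy[0] - primary_xy[0]   # east offset, px
    dy = secondary_xy[1] - primary_xy[1]   # north offset, px
    sep = math.hypot(dx, dy) * arcsec_per_px
    pa = math.degrees(math.atan2(dx, dy)) % 360.0
    return sep, pa

# A pair 3 px east and 4 px north apart at a 0.5 arcsec/px plate scale:
sep, pa = separation_and_pa((100.0, 100.0), (103.0, 104.0), 0.5)
print(round(sep, 2), round(pa, 1))  # 2.5 36.9
```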

  17. CCD Stability Monitor

    NASA Astrophysics Data System (ADS)

    Mack, Jennifer

    2009-07-01

    This program will verify that the low-frequency flat fielding, the photometry, and the geometric distortion are stable in time and across the field of view of the CCD arrays. A moderately crowded stellar field in the cluster 47 Tuc is observed with the HRC (at the cluster core) and WFC (6 arcmin West of the cluster core) using the full suite of broad and narrow band imaging filters. The positions and magnitudes of objects will be used to monitor local and large scale variations in the plate scale and the sensitivity of the detectors and to derive an independent measure of the detector CTE. The UV sensitivity for the SBC and HRC will be addressed in the UV contamination monitor program (11886, PI=Smith). One additional orbit obtained at the beginning of the cycle will allow a verification of the CCD gain ratios for WFC using gains 2.0, 1.4, 1.0, 0.5 and for HRC using gains 4.0 and 2.0. In addition, one subarray exposure with the WFC will allow a verification that photometry obtained in full-frame and in sub-array modes is repeatable to better than 1%. This test is important for the ACS Photometric Cross-Calibration program (11889, PI=Bohlin), which uses sub-array exposures.

  18. Readout electronics for the Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Castilla, Javier; Ballester, Otger; Cardiel, Laia; Chappa, Steve; de Vicente, Juan; Holm, Scott; Huffman, David; Kozlovsky, Mark; Martinez, Gustavo; Olsen, Jamieson; Shaw, Theresa; Stuermer, Walter

    2010-07-01

    The goal of the Dark Energy Survey (DES) is to measure the dark energy equation of state parameter with four complementary techniques: galaxy cluster counts, weak lensing, angular power spectrum and type Ia supernovae. DES will survey a 5000 sq. degree area of the sky in five filter bands using a new 3 deg² mosaic camera (DECam) mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam is a ~520 megapixel optical CCD camera that consists of 62 2k x 4k science sensors plus four 2k x 2k sensors for guiding. The CCDs, developed at the Lawrence Berkeley National Laboratory (LBNL) and packaged and tested at Fermilab, have been selected to obtain images efficiently at long wavelengths. A front-end electronics system has been developed specifically to perform the CCD readout. The system is based on Monsoon, an open source image acquisition system designed by the National Optical Astronomy Observatory (NOAO). The electronics consists mainly of three types of modules: Control, Acquisition and Clock boards. The system provides a total of 132 video channels, 396 bias levels and around 1000 clock channels in order to read out the full mosaic at 250 kpixel/s with 10 e- noise performance. System configuration and data acquisition are done by means of six 0.8 Gbps optical links. The production of the whole system is currently underway. The contribution will focus on the testing, calibration and general performance of the full system in a realistic environment.
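
    As a quick consistency check of the numbers quoted above: 520 megapixels split across 132 video channels at 250 kpixel/s per channel implies a full-mosaic readout time of roughly 16 seconds, assuming the channels run in parallel (an assumption; the abstract does not state the readout time).

```python
# Back-of-the-envelope readout time for a multi-channel mosaic camera,
# assuming all channels read out in parallel.

def readout_time_s(total_pixels: float, channels: int, rate_pix_s: float) -> float:
    """Seconds to read the full mosaic."""
    return total_pixels / channels / rate_pix_s

print(round(readout_time_s(520e6, 132, 250e3), 1))  # 15.8
```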

  19. Rail head wear measurements using the CCD photonic system

    NASA Astrophysics Data System (ADS)

    Popov, Dmitry V.; Titov, Evgeny V.; Mikhailov, Sergey S.

    1999-10-01

    At present there exist comprehensive studies in the field of railway track condition monitoring systems and the development of non-contact photonic systems based on digital CCD cameras, high-speed on-board computers and powerful software. Such systems allow preventive track maintenance work to be conducted in advance and the effects of vibration from wavy rail defects on a wheel set to be avoided. As a result, the safety of running and the durability of the permanent way and rolling stock are increased, and maintenance costs are reduced. The system developed consists of four special digital matrix CCD cameras and four laser stripe illuminators. An electronic interface for linking the computer with the cameras and contour-extraction models of the rail profile have been developed, and an analysis of the input-output ports has been carried out. Based on these algorithms, a cut-off method and a tangent method have been compared.

  20. Tests of commercial colour CMOS cameras for astronomical applications

    NASA Astrophysics Data System (ADS)

    Pokhvala, S. M.; Reshetnyk, V. M.; Zhilyaev, B. E.

    2013-12-01

    We present some results of testing commercial colour CMOS cameras for astronomical applications. Colour CMOS sensors make it possible to perform photometry in three filters simultaneously, which gives a great advantage compared with monochrome CCD detectors. The Bayer BGR colour system realized in colour CMOS sensors is close to the astronomical Johnson BVR system. The basic camera characteristics: read noise (e^{-}/pix), thermal noise (e^{-}/pix/sec) and electronic gain (e^{-}/ADU) for the commercial digital camera Canon 5D Mark III are presented. We give the same characteristics for the scientific high-performance cooled CCD camera system ALTA E47. The test results for the Canon 5D Mark III and the ALTA E47 CCD show that present-day commercial colour CMOS cameras can seriously compete with scientific CCD cameras in deep astronomical imaging.
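
    The electronic gain (e^{-}/ADU) quoted above is commonly measured with the photon-transfer method: for shot-noise-limited flat fields, the gain equals the mean signal divided by its variance, both in ADU. The sketch below demonstrates this generic technique on synthetic flats; it is not the authors' exact procedure, and the numbers are assumptions.

```python
import random

# Photon-transfer gain estimate from a matched pair of flat fields.
# Differencing the two flats cancels fixed-pattern noise; the variance of
# the difference is twice the per-frame shot-noise variance.

def photon_transfer_gain(flat1, flat2):
    """Estimate gain in e-/ADU from two flat frames (lists of ADU values)."""
    n = len(flat1)
    mean = (sum(flat1) + sum(flat2)) / (2 * n)
    diff = [a - b for a, b in zip(flat1, flat2)]
    md = sum(diff) / n
    var = sum((d - md) ** 2 for d in diff) / (2 * (n - 1))  # per-frame variance
    return mean / var

# Synthetic flats: true gain 2.0 e-/ADU, mean level 10000 e- (= 5000 ADU).
random.seed(1)
def flat(n=20000, electrons=10000.0, gain=2.0):
    return [random.gauss(electrons, electrons ** 0.5) / gain for _ in range(n)]

g = photon_transfer_gain(flat(), flat())
print(round(g, 1))  # ≈ 2.0
```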

  1. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  2. Characterization of the series 1000 camera system

    SciTech Connect

    Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L.

    2004-10-01

    The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  3. CCD technique for longitude/latitude astronomy

    NASA Astrophysics Data System (ADS)

    Damljanović, G.; Gerstbach, G.; de Biasi, M. S.; Pejović, N.

    2003-10-01

    We report on CCD (Charge Coupled Device) experiments with astrometric and geodetic instruments for longitude and latitude determinations. At the Technical University Vienna (TU Vienna), a mobile zenith camera "G1" was developed, based on the CCD MX916 (Starlight Xpress) and F=20 cm photo optics. With the Hipparcos/Tycho Catalogue, the first results show accuracy up to 0."5 for latitude/longitude. The PC-guided observations can be completed within 10 minutes. The camera G1 (near 4 kg) is used for astrogeodesy (geoid, Earth's crust, etc.). At the Belgrade Astronomical Observatory (AOB), the accuracy of (mean values of) latitude/longitude determinations can be a few 0."01 using zenith stars, the Tycho-2 Catalogue and an ST-8 of SBIG (Santa Barbara Instrument Group) with the zenith-telescope BLZ (D=11 cm, F=128.7 cm). The same equipment with the PIP instrument (D=20 cm and F=457.7 cm, Punta Indio PZT, near La Plata) yields slightly better accuracy than BLZ. Both instruments, BLZ and PIP, were in the list of the Bureau International de l'Heure (BIH). The mentioned instruments are well suited to semi- or fully automatic observations.

  4. Colorized linear CCD data acquisition system with automatic exposure control

    NASA Astrophysics Data System (ADS)

    Li, Xiaofan; Sui, Xiubao

    2014-11-01

    Colorized linear cameras deliver superb color fidelity at the fastest line rates in industrial inspection. Their RGB trilinear sensor eliminates image artifacts by placing a separate row of pixels for each color on a single sensor, and the advanced design minimizes the distance between rows to reduce artifacts due to synchronization. In this paper, a high-speed colorized linear CCD data acquisition system was designed to take advantage of the linear CCD sensor μpd3728. The hardware and software design of the system, based on an FPGA, is introduced and the design of the functional modules is described. The whole system is composed of a CCD driver module, a data buffering module, a data processing module and a computer interface module. The image data are transferred to the computer through a Camera Link interface. A new method is used to automatically adjust the exposure time of the linear CCD: the integration time is controlled by the program, is adjusted automatically for different illumination intensities under FPGA control, and responds quickly to brightness changes. The data acquisition system also offers programmable gains and offsets for each color, and image quality can be improved after calibration in the FPGA. The design has high expansibility and application value, and can be used in many application situations.
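
    The automatic exposure idea described above can be sketched as a simple feedback rule: scale the line integration time so that the observed brightness tracks a target level, clamped to the sensor's allowed range. The proportional update rule and the limits below are assumptions for illustration, not the paper's exact method.

```python
# Hypothetical auto-exposure update for a line-scan sensor: collected
# charge scales roughly linearly with integration time, so rescale the
# time by (target / measured) and clamp to the sensor's limits.

def adjust_integration_time(t_us: float, mean_level: float,
                            target: float = 128.0,
                            t_min: float = 10.0, t_max: float = 10000.0) -> float:
    """Return the next line integration time in microseconds."""
    if mean_level <= 0:
        return t_max                      # no light detected: open up fully
    t_next = t_us * target / mean_level   # proportional correction
    return max(t_min, min(t_max, t_next))

print(adjust_integration_time(100.0, 64.0))   # 200.0  (image too dark)
print(adjust_integration_time(100.0, 255.0))  # shorter time for a bright scene
```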

  5. Video Golf

    NASA Technical Reports Server (NTRS)

    1995-01-01

    George Nauck of ENCORE!!! invented and markets the Advanced Range Performance (ARPM) Video Golf System for measuring the result of a golf swing. After Nauck requested their assistance, Marshall Space Flight Center scientists suggested video and image processing/computing technology, and provided leads on commercial companies that dealt with the pertinent technologies. Nauck contracted with Applied Research Inc. to develop a prototype. The system employs an elevated camera, which sits behind the tee and follows the flight of the ball down range, catching the point of impact and subsequent roll. Instant replay of the video on a PC monitor at the tee allows measurement of the carry and roll. The unit measures distance and deviation from the target line, as well as distance from the target when one is selected. The information serves as an immediate basis for making adjustments or as a record of skill level progress for golfers.

  6. STIS-01 CCD Functional

    NASA Astrophysics Data System (ADS)

    Valenti, Jeff

    2001-07-01

    This activity measures the baseline performance and commandability of the CCD subsystem. Only primary amplifier D is used. Bias, Dark, and Flat Field exposures are taken in order to measure read noise, dark current, CTE, and gain. Numerous bias frames are taken to permit construction of "superbias" frames in which the effects of read noise have been rendered negligible. Dark exposures are made outside the SAA. Full frame and binned observations are made, with binning factors of 1x1 and 2x2. Finally, tungsten lamp exposures are taken through narrow slits to confirm the slit positions in the current database. All exposures are internals. This is a reincarnation of SM3A proposal 8502 with some unnecessary tests removed from the program.

  7. Based on line scan CCD print image detection system

    NASA Astrophysics Data System (ADS)

    Zhang, Lifeng; Xie, Kai; Li, Tong

    2015-12-01

    In this paper, a new method based on machine vision is proposed to address the shortcomings of traditional manual inspection of printed-matter quality. A line-array CCD camera is used for image acquisition, with a stepper motor driving the sampling. Through improvements to the driving circuit, images of different sizes or precisions can be acquired. For image processing, a standard image-registration algorithm is applied; because of the characteristics of line-scan CCD image acquisition, a rigid-body transformation is usually used in the registration, so as to achieve defect detection in the printed image.
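
    The rigid-body registration mentioned above (rotation plus translation, no scaling) can be estimated from matched reference/scan point pairs with a small least-squares fit. This is a generic 2D Kabsch-style sketch under that assumption, not the paper's implementation.

```python
import math

# Fit a 2D rigid transform (theta, tx, ty) mapping src points onto dst:
# dst ~= R(theta) @ src + t. Centroids remove the translation; the
# cross-covariance terms give the rotation angle.

def fit_rigid_2d(src, dst):
    """Return (theta_rad, tx, ty) for the best rigid map src -> dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    a = b = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        sx -= csx; sy -= csy; dx -= cdx; dy -= cdy
        a += sx * dx + sy * dy          # cos component
        b += sx * dy - sy * dx          # sin component
    theta = math.atan2(b, a)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

src = [(0, 0), (1, 0), (0, 1)]
dst = [(2, 1), (2, 2), (1, 1)]   # src rotated 90 degrees, then shifted
theta, tx, ty = fit_rigid_2d(src, dst)
print(round(math.degrees(theta), 1), round(tx, 1), round(ty, 1))  # 90.0 2.0 1.0
```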

  8. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

  9. ccdproc: CCD data reduction software

    NASA Astrophysics Data System (ADS)

    Craig, M. W.; Crawford, S. M.; Deil, Christoph; Gomez, Carlos; Günther, Hans Moritz; Heidt, Nathan; Horton, Anthony; Karr, Jennifer; Nelson, Stefan; Ninan, Joe Phillip; Pattnaik, Punyaslok; Rol, Evert; Schoenell, William; Seifert, Michael; Singh, Sourav; Sipocz, Brigitta; Stotts, Connor; Streicher, Ole; Tollerud, Erik; Walker, Nathan; ccdproc contributors

    2015-10-01

    Ccdproc is an affiliated package for Astropy for basic data reduction of CCD images. The ccdproc package provides many of the necessary tools for processing CCD images, built on a framework that provides error propagation and bad pixel tracking throughout the reduction process.
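
    The basic reduction that ccdproc automates can be written out by hand for a single frame: subtract bias and dark, then divide by a bias-corrected, normalized flat. Below is a pure-Python sketch on tiny 1-D synthetic "frames"; it illustrates the arithmetic only and is not the ccdproc API.

```python
# Classic CCD reduction for one frame:
#   science = (raw - bias - dark) / (flat_corr / mean(flat_corr))

def reduce_frame(raw, bias, dark, flat):
    """Bias/dark-subtract raw and flat-field it; all inputs same length."""
    flat_corr = [f - b for f, b in zip(flat, bias)]   # bias-correct the flat
    mean_flat = sum(flat_corr) / len(flat_corr)       # normalization factor
    return [(r - b - d) * mean_flat / f
            for r, b, d, f in zip(raw, bias, dark, flat_corr)]

raw  = [110.0, 210.0, 310.0]   # raw science pixels
bias = [10.0, 10.0, 10.0]      # bias frame
dark = [0.0, 0.0, 0.0]         # dark frame (already bias-subtracted)
flat = [60.0, 60.0, 60.0]      # flat frame (bias still included)
print(reduce_frame(raw, bias, dark, flat))  # [100.0, 200.0, 300.0]
```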

  10. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera being designed through the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology) combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera together with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real-time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  11. First Carlsberg Meridian Telescope (CMT) CCD Catalogue.

    NASA Astrophysics Data System (ADS)

    Bélizon, F.; Muiños, J. L.; Vallejo, M.; Evans, D. W.; Irwin, M.; Helmer, L.

    2003-11-01

    The Carlsberg Meridian Telescope (CMT) is a telescope owned by Copenhagen University Observatory (CUO). It was installed at the Spanish observatory of El Roque de los Muchachos on the island of La Palma (Canary Islands) in 1984. It is operated jointly by the CUO, the Institute of Astronomy, Cambridge (IoA) and the Real Instituto y Observatorio de la Armada of Spain (ROA) in the framework of an international agreement. From 1984 to 1998 the instrument was equipped with a moving-slit micrometer, and from its observations a series of 11 catalogues was published, `Carlsberg Meridian Catalogue La Palma (CMC No 1-11)'. Since 1997, the telescope has been controlled remotely via the Internet, with the three institutions sharing this remote control in periods of approximately three months. In 1998, the CMT was upgraded by installing as its sensor a commercial Spectrasource CCD camera, as a test of the possibility of performing meridian transits observed in drift-scan mode. Once this was shown to be possible, in 1999 a second CCD camera, built in the CUO workshop and with better performance, was installed. The Spectrasource camera was loaned to ROA by CUO and is now installed in the San Fernando Automatic Meridian Circle in San Juan (CMASF). In 1999, observations were started for a sky survey from -3deg to +30deg in declination. In July 2002, a first release of the survey was published, with the positions of the observed stars in the band between -3deg and +3deg in declination. This oral communication will present this first release of the survey.

  12. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.
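
    The backside bias needed to fully deplete the substrate, as described above, follows the standard one-sided step-junction relation V_dep = q·N_d·d²/(2·ε_Si). The doping density and thickness below are typical of high-resistivity scientific CCDs and are assumptions for illustration, not values from the patent.

```python
# Full-depletion voltage of a uniformly doped silicon substrate.

Q = 1.602e-19               # elementary charge, C
EPS_SI = 11.7 * 8.854e-12   # permittivity of silicon, F/m

def full_depletion_voltage(donor_density_cm3: float, thickness_um: float) -> float:
    """V_dep = q * N_d * d^2 / (2 * eps_Si), in volts."""
    n_m3 = donor_density_cm3 * 1e6   # cm^-3 -> m^-3
    d_m = thickness_um * 1e-6        # um -> m
    return Q * n_m3 * d_m ** 2 / (2 * EPS_SI)

# A ~1e12 cm^-3 high-resistivity substrate, 300 um thick, needs ~70 V.
print(round(full_depletion_voltage(1e12, 300.0), 1))  # 69.6
```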

  13. System for control of cooled CCD and image data processing for plasma spectroscopy

    SciTech Connect

    Mimura, M.; Kakeda, T.; Inoko, A.

    1995-12-31

    A spectroscopic measurement system with spatial resolution is important for plasma studies. This is especially true for measurements of a plasma without axial symmetry, like the LHD plasma. Several years ago, we developed an imaging spectroscopy system using a CCD camera and an image-memory board of a personal computer. It was very powerful for studying plasma-gas interaction phenomena. In that system, however, an ordinary CCD was used, so the dark-current noise of the CCD prevented the measurement of faint spectral lines. Recently, cooled CCD systems have become available for high-sensitivity measurements, but such systems are still very expensive. The cooled CCD itself, as an element, can be purchased cheaply, because amateur astronomers have begun to use it to take pictures of celestial bodies. So we developed an imaging spectroscopy system using such a cheap cooled CCD for plasma experiments.
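
    The motivation for cooling stated above can be made quantitative with the common rule of thumb that CCD dark current roughly halves for every ~6 °C of cooling. The rule, the halving interval, and the example rates below are generic assumptions, not figures from this paper.

```python
# Rule-of-thumb dark-current scaling: halves for every ~6 C of cooling.

def dark_current(rate_at_ref: float, ref_c: float, t_c: float,
                 halving_c: float = 6.0) -> float:
    """Dark current (e-/pix/s) at temperature t_c, scaled from a reference."""
    return rate_at_ref * 2.0 ** ((t_c - ref_c) / halving_c)

# 100 e-/pix/s at 20 C drops to about 1 e-/pix/s when cooled to -20 C.
print(round(dark_current(100.0, 20.0, -20.0), 2))
```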

  14. Measurement of marine picoplankton cell size by using a cooled, charge-coupled device camera with image-analyzed fluorescence microscopy

    SciTech Connect

    Viles, C.L.; Sieracki, M.E.

    1992-02-01

    Accurate measurement of the biomass and size distribution of picoplankton cells (0.2 to 2.0 μm) is paramount in characterizing their contribution to the oceanic food web and global biogeochemical cycling. Image-analyzed fluorescence microscopy, usually based on video camera technology, allows detailed measurements of individual cells to be taken. The application of an imaging system employing a cooled, slow-scan charge-coupled device (CCD) camera to automated counting and sizing of individual picoplankton cells from natural marine samples is described. A slow-scan CCD-based camera was compared to a video camera and was superior for detecting and sizing very small, dim particles such as fluorochrome-stained bacteria. Several edge detection methods for accurately measuring picoplankton cells were evaluated. Standard fluorescent microspheres and a Sargasso Sea surface water picoplankton population were used in the evaluation. Global thresholding was inappropriate for these samples. Methods used previously in image analysis of nanoplankton cells (2 to 20 μm) also did not work well with the smaller picoplankton cells. A method combining an edge detector and an adaptive edge strength operator worked best for rapidly generating accurate cell sizes. A complete sample analysis of more than 1,000 cells averages about 50 min and yields size, shape, and fluorescence data for each cell. With this system, the entire size range of picoplankton can be counted and measured.

  15. Video monitoring system for car seat

    NASA Technical Reports Server (NTRS)

    Elrod, Susan Vinz (Inventor); Dabney, Richard W. (Inventor)

    2004-01-01

    A video monitoring system for use with a child car seat has video camera(s) mounted in the car seat. The video images are wirelessly transmitted to a remote receiver/display encased in a portable housing that can be removably mounted in the vehicle in which the car seat is installed.

  16. CCD imager with photodetector bias introduced via the CCD register

    NASA Technical Reports Server (NTRS)

    Kosonocky, Walter F. (Inventor)

    1986-01-01

    An infrared charge-coupled-device (IR-CCD) imager uses an array of Schottky-barrier diodes (SBD's) as photosensing elements and uses a charge-coupled-device (CCD) for arranging charge samples supplied in parallel from the array of SBD's into a succession of serially supplied output signal samples. Its sensitivity to infrared (IR) is improved by placing bias charges on the Schottky barrier diodes. Bias charges are transported to the Schottky barrier diodes by a CCD also used for charge sample read-out.

  17. SPAS color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.
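
    Two of the listed performance checks, signal-to-noise ratio and white balance, can be expressed numerically. A hedged sketch using standard textbook definitions, not formulas taken from the report:

    ```python
    import numpy as np

    def snr_db(frame):
        """SNR of a nominally flat (uniformly lit) frame, in dB:
        mean signal level over the standard deviation of the noise."""
        frame = frame.astype(float)
        return 20.0 * np.log10(frame.mean() / frame.std())

    def white_balance_gains(r_mean, g_mean, b_mean):
        """Per-channel gains that equalize a gray reference patch to the
        green channel -- a simple, common white-balance correction."""
        return g_mean / r_mean, 1.0, g_mean / b_mean

    # Flat field at 128 counts with Gaussian noise of sigma = 2.
    rng = np.random.default_rng(1)
    flat = 128.0 + rng.normal(0.0, 2.0, size=(480, 640))
    snr = snr_db(flat)          # approximately 36 dB for mean 128, sigma 2
    gains = white_balance_gains(120.0, 128.0, 140.0)
    ```

    Vibration, temperature, and gray-scale rendition tests would of course be measured on hardware rather than simulated.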

  18. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  20. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD'), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
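
    The correlated double sampler and 8-bit converter in this signal chain can be sketched as a simple numerical model. This is an illustrative simulation of the general CDS technique, not the patent's circuit; all names and values are assumptions:

    ```python
    import numpy as np

    def correlated_double_sample(reset_levels, signal_levels):
        """CDS: subtract each pixel's reset (reference) sample from its
        signal sample, cancelling reset (kTC) noise common to both."""
        return signal_levels - reset_levels

    def quantize_8bit(volts, full_scale=1.0):
        """8-bit ADC model: map [0, full_scale] volts onto codes 0..255."""
        codes = np.clip(np.round(volts / full_scale * 255.0), 0, 255)
        return codes.astype(np.uint8)

    # Reset noise appears identically in both samples, so CDS removes it
    # and only the true pixel signal survives quantization.
    rng = np.random.default_rng(2)
    reset_noise = rng.normal(0.0, 0.05, size=5)
    reset = 0.2 + reset_noise
    signal = 0.2 + reset_noise + np.array([0.1, 0.3, 0.5, 0.7, 0.9])
    video = correlated_double_sample(reset, signal)
    codes = quantize_8bit(video)
    ```

    Note how the per-pixel reset noise drops out entirely in `video`, which is the point of sampling each pixel twice.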